The present disclosure relates to a refrigerator that automatically opens and closes a door and a control method of the refrigerator.
A refrigerator is an electronic device capable of storing objects in a low-temperature storage compartment (or storage space). The refrigerator may include one or more doors depending on the number or structure of storage compartments. A door included in the refrigerator is used when loading or unloading objects into or from the storage compartment, and when closed, the door prevents cold air from leaking out of the storage compartment and maintains a constant temperature inside the storage compartment.
An opening and closing operation of a door of such a refrigerator is conventionally performed via separate manipulation involving a user's physical contact. However, as functions of refrigerators become smarter, refrigerators have been proposed that are capable of opening and closing a door based on a user's intention without any separate manipulation by the user.
A refrigerator according to an embodiment of the present disclosure may include a main body, at least one door rotatably coupled to a front of the main body, a door driving unit configured to automatically open and close the at least one door, an image obtaining sensor configured to obtain image data of a user located in the vicinity of the refrigerator, and at least one processor. According to an embodiment of the present disclosure, the at least one processor may be configured to detect a gaze point where a gaze of the user stays for a preset time period or longer by using the obtained image data of the user, determine a door corresponding to the detected gaze point among the at least one door, and control the door driving unit to open the determined door.
A control method of a refrigerator according to an embodiment of the present disclosure may include obtaining image data of a user located in a vicinity of the refrigerator by using an image obtaining sensor included in the refrigerator, detecting a gaze point where a gaze of the user stays for a preset time period or longer by using the obtained image data of the user, determining a door corresponding to the detected gaze point of the user among at least one door included in the refrigerator, and opening the determined door by controlling the door driving unit included in the refrigerator.
Terms used in the present disclosure will be briefly described, and then an embodiment of the present disclosure will be described in detail.
As the terms used in the present disclosure, general terms that are currently widely used are selected by taking functions according to an embodiment of the present disclosure into account, but the terms may be changed according to the intention of one of ordinary skill in the art, precedent cases, advent of new technologies, or the like. Furthermore, specific terms may be arbitrarily selected by the applicant, and in this case, the meaning of the selected terms will be described in detail in the detailed description of an embodiment of the present disclosure. Thus, the terms used in the present disclosure should be defined not by simple appellations thereof but based on the meaning of the terms together with the overall description of the present disclosure.
Throughout the specification, when a part “includes” or “comprises” an element, unless there is a particular description contrary thereto, it is understood that the part may further include other elements, not excluding the other elements. In addition, terms such as “portion”, “module”, etc., described in the specification refer to a unit for processing at least one function or operation and may be implemented as hardware or software, or a combination of hardware and software.
Embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings so that they may be easily implemented by one of ordinary skill in the art. However, an embodiment of the present disclosure may be implemented in different forms and should not be construed as being limited to the embodiments set forth herein. In addition, parts not related to the description are omitted from the drawings to clearly describe an embodiment of the present disclosure, and like reference numerals denote like elements throughout.
According to an embodiment of the present disclosure, there may be provided a refrigerator and control method thereof for automatically opening and closing a door as desired by a user, based on eye tracking for the user.
According to an embodiment of the present disclosure, there may be provided a refrigerator and control method thereof for automatically opening and closing a door as desired by the user, based on the user's location and tracking of the user's eyes.
A refrigerator 100 according to an embodiment of the present disclosure may include a main body 101, first to fourth doors 102 to 105, an image obtainer 106, and a sensor 107. The number of doors included in the refrigerator 100 is not limited to that shown in
The refrigerator 100 shown in
The main body 101 of the refrigerator 100 may have a box shape and have an open front. The main body 101 of the refrigerator 100 may include an insulating material. Opening and closing operations of the first to fourth doors 102 to 105 are performed to open or close storage compartments provided in the main body 101. To prevent cold air from leaking out of the storage compartments provided in the main body 101 in a state in which the first to fourth doors 102 to 105 are closed, a filler and/or a sealing member may be included between the main body 101 and each of the first to fourth doors 102 to 105. The filler or sealing member may be composed of a rubber material. The storage compartments may also be provided in the first to fourth doors 102 to 105. The storage compartments provided in the first to fourth doors 102 to 105 may be configured in the form of door shelves.
The image obtainer 106 may be used to track a gaze 120 (e.g., a gaze direction) of a user 110. The image obtainer 106 may be configured as an image obtaining sensor for tracking the gaze 120 (e.g., gaze direction) of the user 110. The refrigerator 100 may track the gaze 120 of the user 110 based on image data obtained using the image obtainer 106 to determine a door the user 110 desires to open, automatically open the determined door, and close the opened door.
In order to determine which of the first to fourth doors 102 to 105 is to be opened based on a gaze point of the user 110 detected by tracking the gaze 120 of the user 110, the refrigerator 100 may have two-dimensional (2D) plane coordinate values (x, y) for the front of the main body 101. The gaze point of the user 110 detected by tracking the gaze 120 of the user 110 is a point (or a location) on the front of the refrigerator 100 at which the user's gaze stays for a preset time period or longer, and may be represented by 2D plane coordinate values (x,y). The gaze point of the user may be referred to as a viewpoint. The preset time period may be stored in the refrigerator 100 when the refrigerator 100 is manufactured, or may be set by the user 110.
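As an illustration, the dwell-based gaze-point detection described above may be sketched as follows; the sample format, dwell radius, and specific preset time period are hypothetical values chosen for the example, not values specified by the present disclosure.

```python
import math

DWELL_RADIUS_CM = 5.0     # hypothetical tolerance for "staying" at a point
PRESET_DWELL_SEC = 1.5    # hypothetical preset time period

def detect_gaze_point(samples):
    """Return the (x, y) point where the gaze stayed within DWELL_RADIUS_CM
    of the newest sample for at least PRESET_DWELL_SEC, or None.

    `samples` is a chronological list of (timestamp_sec, x_cm, y_cm) tuples
    expressed in the 2D plane coordinates of the refrigerator front.
    """
    start = 0
    for end in range(len(samples)):
        # Drop older samples that are too far from the newest one.
        while start < end and math.dist(samples[start][1:], samples[end][1:]) > DWELL_RADIUS_CM:
            start += 1
        if samples[end][0] - samples[start][0] >= PRESET_DWELL_SEC:
            window = samples[start:end + 1]
            # Report the mean position of the dwell window as the gaze point.
            return (sum(s[1] for s in window) / len(window),
                    sum(s[2] for s in window) / len(window))
    return None
```

The refrigerator would run such a check continuously on the stream of gaze estimates and treat a non-None result as the detected gaze point.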
The refrigerator 100 may detect a distance value between the user 110 and the refrigerator 100 by using the sensor 107. The refrigerator 100 may determine, based on the detected distance value, the user's intention to open at least one door of the refrigerator 100. The distance value between the user 110 and the refrigerator 100 may represent a distance value between a current location of the user 110 and the front of the refrigerator 100.
The sensor 107 is used to detect a distance value 130 between the user 110 and the refrigerator 100. When a value sensed by the sensor 107 indicates that the distance between the front of the refrigerator 100 and the current location of the user 110 satisfies a preset distance condition, the refrigerator 100 may track the gaze of the user 110 by activating an operation of the image obtainer 106. The refrigerator 100 may determine a door to be opened based on a result of the tracking. The refrigerator 100 may open the determined door and close the opened door. The preset distance condition may refer to a condition under which it is determined that the user 110 has the intention to open a door of the refrigerator 100, while a collision between the door of the refrigerator 100 and the body of the user 110 is prevented when the door is opened or closed. For example, the preset distance condition may be that the distance value between the front of the refrigerator 100 and the current location of the user 110 is greater than or equal to 30 cm and less than 60 cm, but is not limited thereto.
According to an embodiment of the present disclosure, the refrigerator 100 may determine a door that the user 110 desires to open based on the result of tracking the eyes of the user 110 by using the image obtainer 106. After opening the determined door, the refrigerator 100 may close the opened door when a certain time period has elapsed. Therefore, according to an embodiment of the present disclosure, the door of the refrigerator 100 may be opened automatically according to a gaze of the user 110, which may increase convenience for the user 110. In particular, a door of the refrigerator 100 may be opened automatically even if the user 110 does not pull a handle to open the door of the refrigerator 100, and thus may be useful for users with weak hand grip strength.
Referring to
The image obtainer 106 obtains image data of the user 110 located near the refrigerator 100 in order to track a gaze of the user 110. The image obtainer 106 may be referred to as an image obtaining sensor. The image obtainer 106 may be mounted in at least one location on the front of the refrigerator 100. For example, the image obtainer 106 may be mounted on a central frame of the refrigerator 100 as shown in
For example, referring to 301 of
Although
401 of
The image obtainer 106 may obtain image data of the user 110 located near the refrigerator 100 or the door of the refrigerator 100 in various ways. For example, when the image obtainer 106 is configured to perform eye tracking by using an infrared tracking method, the image obtainer 106 may include a camera and an infrared or near-infrared lighting device. The infrared or near-infrared lighting device may be located in close proximity to the camera. When the image obtainer 106 includes the camera and the infrared or near-infrared lighting device, the image obtainer 106 may obtain black-and-white image data of the user 110, captured by using the camera, while emitting infrared or near-infrared light toward the user 110.
For example, when the image obtainer 106 performs eye tracking by using a red green blue (RGB) camera tracking method, the image obtainer 106 may include an RGB camera and an infrared or near-infrared lighting device. When the image obtainer 106 includes the camera and the infrared or near-infrared lighting device, the image obtainer 106 may obtain color image data of the user 110, captured by using the RGB camera, while emitting infrared or near-infrared light toward the user 110.
The image obtainer 106 may further include a tilt controller that recognizes a moving object and automatically performs tilt control of the camera within a preset angle range according to a direction in which the object moves. Alternatively, the image obtainer 106 may consist of only a camera. When the image obtainer 106 consists of only a camera, the camera itself may have the eye-tracking function described above. Accordingly, the image obtainer 106 may be referred to as a camera.
When image data of the user 110 obtained by the image obtainer 106 is received, the processor 210 detects a gaze vector of the user 110. In order to detect the gaze vector of the user 110, the processor 210 detects a face region of the user 110 in the received image data of the user 110 and recognizes contours of a face from the detected face region of the user 110 to detect an orientation of the face, a shape of eyes, and a shape of eyeballs. To detect a shape of eyeballs from the received image data of the user 110, the processor 210 may use a face recognition program.
When the image obtainer 106 is configured to use an infrared tracking method, the processor 210 may detect eyeball shapes 510 and 520 as shown in
When the image obtainer 106 uses an RGB camera, the processor 210 detects an eyeball shape 530 as shown in
When the gaze vector of the user 110 is detected, the processor 210 detects 2D plane coordinate values of the front of the refrigerator 100 based on the detected gaze vector. The detected 2D plane coordinate values of the front of the refrigerator 100 may be referred to as a gaze point of the user 110.
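The conversion from a detected gaze vector to 2D plane coordinate values on the refrigerator front may be sketched, for example, as a ray-plane intersection; the coordinate convention below (z axis pointing out of the refrigerator front, front plane at z = 0) is an assumption made for illustration.

```python
def gaze_point_on_front(eye_pos, gaze_vec):
    """Intersect a gaze ray with the refrigerator front plane z = 0.

    `eye_pos` is the eye location (x, y, z), with the z axis pointing out
    of the refrigerator front; `gaze_vec` is the gaze direction. Returns
    the 2D plane coordinates (x, y) of the gaze point, or None when the
    gaze does not point toward the front.
    """
    ex, ey, ez = eye_pos
    vx, vy, vz = gaze_vec
    if vz >= 0:              # gaze runs parallel to or away from the plane
        return None
    t = -ez / vz             # ray parameter at which z reaches 0
    return (ex + t * vx, ey + t * vy)
```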
According to an embodiment of the present disclosure, the processor 210 may divide the front of the refrigerator 100 into a plurality of virtual areas and define the plurality of virtual areas as gaze areas. For example, based on the number of doors of the refrigerator 100 or a structure of the refrigerator 100, the processor 210 may divide the front of the refrigerator 100 into six areas, four areas, three areas, or two areas. Moreover, the processor 210 may divide the front of the refrigerator 100 into a plurality of virtual areas based on an input by the user 110. For example, the processor 210 may divide the front of the refrigerator 100 into six areas, four areas, three areas, or two areas according to an input by the user 110.
According to an embodiment of the present disclosure, the processor 210 may divide the entire front of the refrigerator 100 into a plurality of areas and define all of the plurality of areas as gaze areas. Furthermore, the processor 210 may define only a portion of the front of the refrigerator 100 as gaze areas. An operation in which the processor 210 defines a portion of the front of the refrigerator as gaze areas is described later with reference to
The processor 210 may previously have 2D plane coordinate values for each of the first to sixth gaze areas 601 to 606 shown in
The 2D plane coordinate values of the first to sixth gaze areas 601 to 606 may be stored in a memory 1020 provided separately from the processor 210. When the 2D plane coordinate values of the first to sixth gaze areas 601 to 606 are stored in the memory 1020, the processor 210 may read and use the 2D plane coordinate values of the first to sixth gaze areas 601 to 606 stored in the memory 1020. The 2D plane coordinate values of a gaze area of the refrigerator 100 may be set differently depending on the number of doors of the refrigerator 100. For example, when a gaze point detected based on eye tracking for the user 110 is detected as a point 607 having 2D plane coordinate values (x1+i, y1+j), the processor 210 may detect the first gaze area 601 including the gaze point 607 of the user 110 as a gaze area corresponding to the gaze point 607.
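The lookup of a gaze area from stored 2D plane coordinate values may be sketched as follows; the coordinate ranges below are hypothetical example values, and the actual ranges would depend on the refrigerator model and door layout.

```python
# Hypothetical (x_min, x_max, y_min, y_max) ranges, in cm, for the six
# gaze areas 601 to 606; real values depend on the door layout.
GAZE_AREAS = {
    601: (0, 40, 90, 180),   # over the first door
    602: (40, 50, 90, 180),  # boundary between the first and second doors
    603: (50, 90, 90, 180),  # over the second door
    604: (0, 40, 0, 90),     # over the third door
    605: (40, 50, 0, 90),    # boundary between the third and fourth doors
    606: (50, 90, 0, 90),    # over the fourth door
}

def find_gaze_area(gaze_point):
    """Return the identifier of the gaze area containing `gaze_point`,
    or None when the point lies outside every defined area."""
    x, y = gaze_point
    for area_id, (x0, x1, y0, y1) in GAZE_AREAS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return area_id
    return None
```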
As described above, the processor 210 tracks a gaze of the user 110 based on the image data of the user 110 received from the image obtainer 106. The processor 210 may use an eye tracking program or an eye tracking application to detect a gaze area including a tracked gaze point of the user 110. The eye tracking program or eye tracking application may be stored in the memory 1020 provided separately from the processor 210 and used by the processor 210. Detecting the gaze direction of the user 110 may include converting the coordinate system of the camera included in the image obtainer 106, which captures images of the pupils of the user 110, into the coordinate system of the gaze point of the user 110 on the refrigerator 100.
When coordinate values for a gaze area of the refrigerator 100 are detected, the processor 210 may detect, based on the detected coordinate values, a gaze area including the gaze point of the user 110, and determine a door among the first to fourth doors 102 to 105 based on the detected gaze area. For example, when the gaze point of the user 110 is detected to be included in the first gaze area 601, the processor 210 determines the first door 102 as the door to be opened. When the gaze point of the user 110 is detected to be included in the second gaze area 602, because the second gaze area 602 is a boundary area between the first door 102 and the second door 103, the processor 210 determines the first door 102 and the second door 103 adjacent to the boundary area as the doors to be opened. When the gaze point of the user 110 is detected to be included in the third gaze area 603, the processor 210 determines the second door 103 as the door to be opened. When the gaze point of the user 110 is detected to be included in the fourth gaze area 604, the processor 210 determines the third door 104 as the door to be opened. When the gaze point of the user 110 is detected to be included in the fifth gaze area 605, because the fifth gaze area 605 is a boundary area between the third door 104 and the fourth door 105, the processor 210 determines the third door 104 and the fourth door 105 adjacent to the boundary area as the doors to be opened. When the gaze point of the user 110 is detected to be included in the sixth gaze area 606, the processor 210 determines the fourth door 105 as the door to be opened.
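The determination above, including the boundary areas that select two adjacent doors, may be summarized as a simple lookup table; the list format is an assumed representation for illustration.

```python
# Doors determined for each gaze area; the boundary areas 602 and 605
# select both adjacent doors, as described above.
AREA_TO_DOORS = {
    601: [102],
    602: [102, 103],
    603: [103],
    604: [104],
    605: [104, 105],
    606: [105],
}

def doors_for_gaze_area(area_id):
    """Return the list of doors to open for a detected gaze area."""
    return AREA_TO_DOORS.get(area_id, [])
```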
When the processor 210 determines the door to be opened, the processor 210 transmits a control signal for opening the determined door to the door driving unit 220. Transmitting, by the processor 210, a control signal for opening the determined door to the door driving unit 220 means that the door driving unit 220 is controlled by the processor 210. Therefore, in the following description, the transmission of a control signal for opening the door from the processor 210 to the door driving unit 220 may indicate that the processor 210 controls the door driving unit 220 to open the door. Additionally, the transmission of a control signal for closing the door from the processor 210 to the door driving unit 220 may indicate that the processor 210 controls the door driving unit 220 to close the door. The door driving unit 220 shown in
When receiving a control signal for opening a determined door from the processor 210, the door driving unit 220 opens the corresponding door. When the door to be opened is the first door 102, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the first motor driver 221 and the first motor 225 and deactivating operations of the second to fourth motor drivers 222 to 224 and the second to fourth motors 226 to 228. When the door to be opened is the second door 103, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the second motor driver 222 and the second motor 226 and deactivating operations of the first, third, and fourth motor drivers 221, 223, and 224 and the first, third, and fourth motors 225, 227, and 228. When the door to be opened is the third door 104, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the third motor driver 223 and the third motor 227 and deactivating operations of the first, second, and fourth motor drivers 221, 222, and 224 and the first, second, and fourth motors 225, 226, and 228. When the door to be opened is the fourth door 105, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the fourth motor driver 224 and the fourth motor 228 and deactivating operations of the first to third motor drivers 221 to 223 and the first to third motors 225 to 227.
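The activate/deactivate pattern above may be condensed into a small helper. The pairing of doors to motor drivers and motors follows the reference numerals in the description, but the dictionary format of the signal is an assumed stand-in for the actual control signal.

```python
# Motor driver / motor pairs for each door, using the reference numerals
# from the description.
DRIVE_PAIRS = {102: (221, 225), 103: (222, 226), 104: (223, 227), 105: (224, 228)}

def open_door_signal(door_id):
    """Describe a control signal that activates only the motor driver and
    motor of `door_id` and deactivates all other driver/motor pairs."""
    return {
        "activate": DRIVE_PAIRS[door_id],
        "deactivate": [pair for d, pair in DRIVE_PAIRS.items() if d != door_id],
    }
```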
The processor 210 may detect an area where a gaze point of the user 110 stays for a preset time period or longer as a gaze area including the gaze point. The preset time period may be set when manufacturing the refrigerator 100, or may be set by the user 110. When receiving the control signal for opening the door from the processor 210, the door driving unit 220 may vary the opening speed of the door. For example, when the opening of the first door 102 is detected by a door switch included in the first motor driver 221, the door driving unit 220 rotates the first motor 225 at a high speed for a certain time period or up to a certain angle. After rotating the first motor 225 at the high speed for the certain time period or up to the certain angle, the door driving unit 220 may reduce the number of rotations of the first motor 225 to rotate the first motor 225 at a lower speed. When the first motor 225 rotates at a high speed, the opening speed of the first door 102 is fast, and when the first motor 225 rotates at a low speed, the opening speed of the first door 102 is slow. The above-described operational relationship among the first door 102, the first motor driver 221, and the first motor 225 applies equally to the operational relationships among the second to fourth doors 103 to 105, the second to fourth motor drivers 222 to 224, and the second to fourth motors 226 to 228.
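The two-stage opening speed profile may be sketched as follows; the angle threshold and motor speeds are hypothetical example values, not values from the disclosure.

```python
FAST_ANGLE_DEG = 60.0   # hypothetical angle up to which the door opens fast
FAST_SPEED_RPM = 30.0   # hypothetical high motor speed
SLOW_SPEED_RPM = 10.0   # hypothetical reduced motor speed

def opening_speed(current_angle_deg):
    """Motor speed while opening: high up to a certain angle, then reduced
    so the door decelerates as it approaches fully open."""
    return FAST_SPEED_RPM if current_angle_deg < FAST_ANGLE_DEG else SLOW_SPEED_RPM
```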
When a preset time period has elapsed since opening of the door, the processor 210 may transmit a control signal for closing the opened door to the door driving unit 220. For example, when the opened door is the first door 102, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the first motor driver 221 and the first motor 225 and deactivating operations of the second to fourth motor drivers 222 to 224 and the second to fourth motors 226 to 228. When the opened door is the second door 103, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the second motor driver 222 and the second motor 226 and deactivating operations of the first, third, and fourth motor drivers 221, 223, and 224 and the first, third, and fourth motors 225, 227, and 228. When the opened door is the third door 104, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the third motor driver 223 and the third motor 227 and deactivating operations of the first, second, and fourth motor drivers 221, 222, and 224 and the first, second, and fourth motors 225, 226, and 228. When the opened door is the fourth door 105, the control signal transmitted from the processor 210 to the door driving unit 220 may represent a control signal for activating operations of the fourth motor driver 224 and the fourth motor 228 and deactivating operations of the first to third motor drivers 221 to 223 and the first to third motors 225 to 227. Transmitting, by the processor 210, a control signal for closing the opened door to the door driving unit 220 may indicate that the door driving unit 220 is controlled by the processor 210 to close the opened door.
The door driving unit 220 may control a rotation speed of a motor differently depending on an angle at which a door is opened. For example, while the first door 102 closes from an opening angle of 90 degrees down to 30 degrees, the first motor 225 may be driven at a low speed, and once the opening angle of the first door 102 is within 30 degrees, the driving speed of the first motor 225 may be controlled to be high. The above-described operational relationship among the first door 102, the first motor driver 221, and the first motor 225 applies equally to the operational relationships among the second to fourth doors 103 to 105, the second to fourth motor drivers 222 to 224, and the second to fourth motors 226 to 228.
Referring to
The sensor 107 measures a distance value between a current location of the user 110 and the refrigerator 100 and transmits the measured distance value to the processor 820. To achieve this, the sensor 107 may include, but is not limited to, an ultrasonic sensor, a radio frequency (RF) sensor (e.g., a radio detection and ranging (RADAR) sensor), an infrared sensor, or a proximity sensor. The sensor 107 serves to detect the user 110 approaching the refrigerator 100 and may be referred to as a proximity sensor.
The processor 820 may determine the door to be opened, as shown in Table 900 of
According to an embodiment of the present disclosure, when it is determined that the gaze point of the user 110 stays in area 1 911 for the preset time period or longer and then stays in area 2 912 for the preset time period or longer, the processor 820 may transmit a control signal for sequentially opening the first door 840 and the second door 850 to the door driving unit 830. In this case, after the first door 840 is opened, the second door 850 may also be opened sequentially. In this sequential opening, the second door 850 is an as-yet-unopened door distinct from the first door 840, and area 2 912 corresponds to a gaze point different from that for area 1 911.
An operation in which the processor 820 tracks a gaze of the user 110 based on the image data of the user 110 received from the image obtainer 106 may be performed in the same manner as the operation of the processor 210 described with reference to
When opening or closing a door, the processor 820 may use an output value of the sensor 107 to prevent the user 110 from colliding with the door. For example, when it is determined, based on the output value of the sensor 107, that a first distance value between the user 110 and the refrigerator 100 is greater than or equal to a first threshold value (e.g., 30 cm) and less than a second threshold value (e.g., 60 cm), the processor 820 may transmit a control signal for opening the determined door to the door driving unit 830. After opening the determined door, when it is determined, based on the output value of the sensor 107, that a second distance value between the refrigerator 100 and the user 110 is greater than or equal to the second threshold value, the processor 820 may transmit a control signal for closing the opened door to the door driving unit 830. When a certain time period has elapsed since the opening of the determined door, the processor 820 may activate an operation of the sensor 107 to detect the second distance value between the refrigerator 100 and the user 110. The first distance value represents a distance value between the refrigerator 100 and the user 110 before the door is opened, and the second distance value represents a distance value between the refrigerator 100 and the user 110 after the door is opened.
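The distance-gated open and close decisions above may be sketched as a minimal controller, assuming the example thresholds of 30 cm and 60 cm from the description; the action strings are stand-ins for the control signals sent to the door driving unit.

```python
FIRST_THRESHOLD_CM = 30.0    # example first threshold from the description
SECOND_THRESHOLD_CM = 60.0   # example second threshold from the description

class DoorController:
    """Minimal sketch of the distance-gated open/close decisions."""

    def __init__(self):
        self.door_open = False

    def on_distance(self, distance_cm):
        """Return the action to take ('open', 'close', or None) for the
        latest sensed distance between the user and the refrigerator."""
        if not self.door_open:
            # Open only when the user is within the safe range before opening.
            if FIRST_THRESHOLD_CM <= distance_cm < SECOND_THRESHOLD_CM:
                self.door_open = True
                return "open"
        elif distance_cm >= SECOND_THRESHOLD_CM:
            # Close once the user has moved away after the door was opened.
            self.door_open = False
            return "close"
        return None
```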
According to an embodiment of the present disclosure, when the distance value between the user 110 and the refrigerator 100 is determined to be less than the first threshold value (e.g., 30 cm) based on the output value of the sensor 107, the processor 820 may not transmit a control signal for opening the door to the door driving unit 830, because the user 110 and the door could collide when the door is opened. When the processor 820 does not transmit a control signal to the door driving unit 830 to open the door, this indicates that the processor 820 does not control the door driving unit 830 to open the door. In addition, when the distance value between the user 110 and the refrigerator 100 is less than the first threshold value (e.g., 30 cm), the processor 820 may output a message guiding the user 110 to move away from the refrigerator 100 to the first threshold distance or more. In this case, the message may be output via the output unit 1030 shown in
After opening the determined door, when the second distance value between the refrigerator 100 and the user 110 is less than the second threshold value, the processor 820 may not transmit a control signal for closing the opened door to the door driving unit 830 until the second distance value between the refrigerator 100 and the user 110 is greater than or equal to the second threshold value. This means that the processor 820 does not control the door driving unit 830 to close the opened door.
The processor 820 may activate the operation of the image obtainer 106 when the first distance value, detected using a sensing value (or output value) received from the sensor 107, is determined to be less than the second threshold value in a state in which the determined door is closed. Additionally, the processor 820 may deactivate the operation of the image obtainer 106 when the output value of the sensor 107 is greater than or equal to the second threshold value after the determined door has been opened and closed. In this way, the processor 820 may efficiently manage the resources of the refrigerator 100 by activating the operation of the image obtainer 106 when the user 110 approaches the refrigerator 100 within a threshold distance, and deactivating the operation of the image obtainer 106 when the user 110 moves away from the refrigerator 100 by the threshold distance or more.
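The resource-management rule above may be sketched as follows, again assuming the 60 cm second threshold; the boolean state stands in for powering the image obtainer on and off.

```python
SECOND_THRESHOLD_CM = 60.0   # example second threshold from the description

class ImageObtainerPower:
    """Sketch of the rule above: the image obtainer is activated when the
    user comes within the second threshold while the door is closed, and
    deactivated when the user moves away again."""

    def __init__(self):
        self.active = False

    def update(self, distance_cm, door_closed):
        if door_closed and distance_cm < SECOND_THRESHOLD_CM:
            self.active = True      # user approached: start eye tracking
        elif distance_cm >= SECOND_THRESHOLD_CM:
            self.active = False     # user moved away: save resources
        return self.active
```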
The memory 1020 may store 2D plane coordinate values of the refrigerator 100, programs such as an eye tracking program or an eye tracking application for the user 110, and programs and data for performing operations of the processor 1010.
The memory 1020 may include volatile memory devices, such as high-speed random access memory (RAM), or non-volatile memory devices, such as read-only memory (ROM), magnetic disk storage devices, and flash memory devices. For example, the memory 1020 may include, as semiconductor memory devices, a Secure Digital (SD) memory card, an SD High Capacity (SDHC) memory card, a mini SD memory card, a mini SDHC memory card, a Trans-Flash (TF) memory card, a micro SD memory card, a micro SDHC memory card, a memory stick, a Compact Flash (CF) memory card, a Multi-Media Card (MMC), an MMC micro card, an eXtreme Digital (XD) card, etc. The memory 1020 may be referred to as a storage. In addition, the memory 1020 may include a network-attached storage device accessed via a network.
The output unit 1030 includes a display and/or an audio output unit, and may output guidance information necessary to open or close a door of the refrigerator 100.
Additionally, the output unit 1030 may output a message guiding the user 110 to move away from the refrigerator 100 to the first threshold distance or more, as described with reference to
The communication unit 1040 may communicate with an external device of the refrigerator 100 in a wired or wireless manner. The external device is a device having a communication channel established with the refrigerator 100. The external device may be at least one of a home server, another server connected to the home server, other home appliances in the home, or a mobile terminal of the user 110. The communication unit 1040 may perform data communication according to standards for the home server.
The communication unit 1040 may transmit and receive data related to remote control over a network, and transmit and receive information related to operations of the other home appliances, and the like. Furthermore, the communication unit 1040 may receive information about a life pattern of the user 110 from a server and utilize it for the operation of the refrigerator 100. Additionally, the communication unit 1040 may perform data communication with the mobile terminal of the user 110 as well as with the home server or remote control in the home.
The communication unit 1040 may include, for example, a short-range communication module, a wired communication module, and a mobile communication module.
The short-range communication module may be a module for short-range communication within a predetermined distance. Short-range communication technologies may include, but are not limited to, wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra-wideband (UWB), Infrared Data Association (IrDA), Bluetooth Low Energy (BLE), and near field communication (NFC).
The wired communication module refers to a module for communication using electrical signals or optical signals. Wired communication technologies may include twisted-pair cables, coaxial cables, optical fiber cables, and Ethernet cables, but are not limited thereto.
The mobile communication module may transmit and receive wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include voice call signals, video call signals, or various forms of data according to transmission and reception of text/multimedia messages.
The gaze vector detector 1210 detects a gaze vector of the user 110 based on image data of the user obtained by the image obtainer 106, as described with reference to
In operation S1310, the refrigerator 100 obtains image data of the user 110 located in a vicinity of the refrigerator 100 by using the image obtainer 106. The refrigerator 100 may include at least one door as described with reference to
According to an embodiment of the present disclosure, the refrigerator 100 may activate an operation of the image obtainer 106 when a distance value between the user 110 and the refrigerator 100, measured by the sensor 107 (or an output value of the sensor 107), is less than a second threshold value (e.g., 60 cm) in a state in which the door is closed. In other words, the image obtainer 106 may remain deactivated, and may be activated when the user 110 approaches the vicinity of the refrigerator 100, to obtain an image of an eyeball in order to track a gaze of the user 110.
In operation S1320, the refrigerator 100 detects a gaze point of the user 110 by using the obtained image data of the user 110, as described with reference to
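The gaze-point detection of operation S1320 can be sketched as follows. The disclosure does not specify an implementation, so the class name, the dwell radius, and the preset dwell time below are illustrative assumptions only.

```python
DWELL_TIME_S = 1.0  # "preset time period" a gaze must stay on one point (assumed value)

class GazeTracker:
    """Accumulates gaze samples and reports a gaze point once the gaze has
    stayed within a small radius for DWELL_TIME_S or longer."""

    def __init__(self, radius=0.05):
        self.radius = radius  # how far the gaze may drift and still "stay" (assumed)
        self.anchor = None    # point the gaze is currently dwelling on
        self.dwell = 0.0      # seconds spent near the anchor so far

    def feed(self, point, dt):
        """Feed one gaze sample (normalized x, y) observed for dt seconds.
        Returns the dwelled-on point once the preset time is reached, else None."""
        if self.anchor is None or self._far(point):
            # Gaze moved to a new spot: restart the dwell timer there.
            self.anchor, self.dwell = point, 0.0
        self.dwell += dt
        return self.anchor if self.dwell >= DWELL_TIME_S else None

    def _far(self, p):
        ax, ay = self.anchor
        return abs(p[0] - ax) > self.radius or abs(p[1] - ay) > self.radius
```

Under this sketch, samples drawn from successive image frames are fed in, and a gaze point is reported only after the gaze has remained near one spot for the preset time, as operation S1320 requires.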
In operation S1330, the refrigerator 100 determines a door corresponding to the detected gaze point of the user 110. For example, as shown in
In operation S1340, the refrigerator 100 controls the door driving unit 220 or 830 to open the determined door. To control the door driving unit 220 or 830, a control signal is transmitted from the processor 210, 820, or 1010 of the refrigerator 100 to the door driving unit 220 or 830, as described with reference to
In operation S1410, the refrigerator 100 obtains image data of the user 110 located in a vicinity of the refrigerator 100 by using the image obtainer 106. The refrigerator 100 may include at least one door as described with reference to
According to an embodiment of the present disclosure, the refrigerator 100 may activate an operation of the image obtainer 106 when a distance value between the user 110 and the refrigerator 100, measured by the sensor 107 (or an output value of the sensor 107), is less than the second threshold value (e.g., 60 cm) in a state in which the door is closed. In other words, the image obtainer 106 may remain deactivated, and may be activated when the user 110 approaches the vicinity of the refrigerator 100, to obtain an image of an eyeball in order to track a gaze of the user 110.
In operation S1420, the refrigerator 100 detects a gaze point of the user 110 by using the obtained image data of the user 110, as described with reference to
In operation S1430, the refrigerator 100 detects a gaze area of the refrigerator 100, which includes the detected gaze point of the user 110, as described with reference to
The gaze area according to the present disclosure may be a virtual area obtained by arbitrarily dividing the front of the refrigerator 100. The gaze area may be a single gaze area or a plurality of gaze areas. The gaze area may be an area obtained by dividing the entire front of the refrigerator 100, or dividing a partial region of the front of the refrigerator 100 (for example, a mid-height region). As shown in
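One way to realize the virtual gaze areas of operation S1430 is sketched below: the front of the refrigerator 100 is split into an equal grid of rectangles, one per door, and the area containing the detected gaze point is reported. The function names and the equal-split layout are assumptions; as noted above, the disclosure permits any division, including only a partial (e.g., mid-height) region.

```python
def divide_front(doors, cols, rows, width=1.0, height=1.0):
    """Map each door name to one cell of a cols x rows grid over the front face.
    Doors are assigned cells left to right, top to bottom."""
    areas = {}
    cell_w, cell_h = width / cols, height / rows
    for i, door in enumerate(doors):
        c, r = i % cols, i // cols
        areas[door] = (c * cell_w, r * cell_h, (c + 1) * cell_w, (r + 1) * cell_h)
    return areas

def gaze_area_for(point, areas):
    """Return the name of the gaze area containing the gaze point, or None."""
    x, y = point
    for door, (x0, y0, x1, y1) in areas.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return door
    return None
```

For a four-door refrigerator in a two-by-two arrangement, `divide_front(["UL", "UR", "LL", "LR"], cols=2, rows=2)` would yield one quadrant per door, so that a gaze point in the upper-right quadrant maps to the upper-right door.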
In operation S1440, the refrigerator 100 determines a door corresponding to the detected gaze area as described with reference to
In operation S1450, the refrigerator 100 controls the door driving unit 220 to open the determined door. To this end, the processor 210 of the refrigerator 100 transmits a control signal to the door driving unit 220. The control signal transmitted to the door driving unit 220 is as described with reference to
In operation S1510 of
In operation S1550, the refrigerator 100 guides information about the determined door. For example, the refrigerator 100 may guide information about the determined door via a display included in the output unit 1030, as shown in
In operation S1610, the refrigerator 100 may obtain image data of the user 110. In operation S1620, the refrigerator 100 may detect a gaze point of the user 110 by using the obtained image data of the user 110. In operation S1630, the refrigerator 100 may detect a gaze area including the detected gaze point of the user 110. In operation S1640, the refrigerator 100 may determine a door corresponding to the detected gaze area. In operation S1650, the refrigerator 100 may transmit a control signal for opening the determined door to the door driving unit 220 or 830. Because operations S1610 to S1650 of
In operation S1660, when a certain time period has elapsed since the opening of the door, the refrigerator 100 controls the door driving unit 220 or 830 to close the opened door. To this end, the processor 210, 820, or 1010 transmits a control signal for closing the opened door to the door driving unit 220 or 830. The certain time period may be a time sufficient for the user 110 to put objects (e.g., food, ingredients, containers, etc.) into or take them out of the refrigerator 100. The certain time period may be a time preset at the factory for the refrigerator 100, or a time set by the user 110. The user 110 may change the certain time period by using the display included in the output unit 1030 of the refrigerator 100 or an application on a mobile terminal.
Moreover, according to an embodiment of the present disclosure, when the user 110 is detected by the sensor 107 (e.g., a proximity sensor) even after the certain time period has elapsed, the refrigerator 100 may not transmit a control signal for closing the door to the door driving unit 220 or 830, in order to prevent the user 110 from colliding with the door. Not transmitting the control signal for closing the door to the door driving unit 220 or 830 indicates that the processor 210, 820, or 1010 of the refrigerator 100 does not control the door driving unit 220 or 830 to close the door.
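The auto-close behavior described above reduces to a simple decision: close only after the hold-open time has elapsed and only while the proximity sensor no longer detects the user. A minimal sketch follows; the function name, parameters, and the hold-open value are illustrative, not taken from the disclosure.

```python
AUTO_CLOSE_S = 10.0  # factory-preset or user-set hold-open time (assumed value)

def should_close(open_elapsed_s, user_detected):
    """Return True when the opened door should be driven closed."""
    if open_elapsed_s < AUTO_CLOSE_S:
        return False  # the user may still be loading or unloading objects
    if user_detected:
        return False  # user still in front of the door: suppress closing
    return True
```

A control loop would call `should_close` on each proximity reading and, on True, transmit the closing control signal to the door driving unit.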
In operation S1710, the refrigerator 100 receives a sensing value (or an output value) from the sensor 107. In operation S1711, the refrigerator 100 detects a distance value between the refrigerator 100 and the user 110 based on the received sensing value.
In operation S1712, when the detected distance value is determined to be greater than or equal to a first threshold value d1 and less than a second threshold value d2, in operation S1713, the refrigerator 100 may obtain image data of the user 110 located in a vicinity of the refrigerator 100 or a door of the refrigerator 100. In this case, the detected distance value in operation S1712 may be defined as a first distance value because it is before a door is opened or closed.
In operation S1714, the refrigerator 100 may detect a gaze point of the user 110 by using the obtained image data of the user 110. In operation S1715, the refrigerator 100 may detect a gaze area including the detected gaze point of the user 110. In operation S1716, the refrigerator 100 may determine a door corresponding to the detected gaze area. In operation S1717, the refrigerator 100 controls the door driving unit 220 or 830 to open the determined door. To this end, the processor 210, 820, or 1010 may transmit a control signal to the door driving unit 220 or 830. Because operations S1714 to S1717 respectively correspond to operations S1420 to S1450 of
After transmitting the control signal for opening the determined door to the door driving unit 220 or 830, the refrigerator 100 returns to operation S1710 to determine a relationship between the distance value between the refrigerator 100 and the user 110 and the first threshold value d1 and the second threshold value d2 based on a sensing value (or output value) received from the sensor 107. In this case, the distance value may be defined as a second distance value because it is after the door is opened. In operations S1718 and S1719, when the refrigerator 100 determines that the distance value (second distance value) between the refrigerator 100 and the user 110 is greater than or equal to the second threshold value d2 and that the door of the refrigerator 100 is opened, in operation S1720, the refrigerator 100 controls the door driving unit 220 or 830 to close the opened door and ends a door opening and closing operation. The processor 210, 820, or 1010 of the refrigerator 100 controls the door driving unit 220 or 830 to close the opened door. In order to control the door driving unit 220 or 830 to close the opened door, the processor 210, 820, or 1010 transmits a control signal for closing the opened door to the door driving unit 220 or 830.
In operation S1718, when the second distance value between the refrigerator 100 and the user 110 is determined not to be greater than or equal to the second threshold value d2, in operation S1721, the refrigerator 100 determines whether the distance value (second distance value) between the refrigerator 100 and the user 110 is less than the first threshold value d1. In operation S1721, when the distance value (second distance value) between the refrigerator 100 and the user 110 is determined to be less than the first threshold value d1, in operation S1723, the refrigerator 100 stops the door opening and closing operation and returns to operation S1710. By doing so, a collision between the door of the refrigerator 100 and the user 110, due to the door opening and closing operation, may be prevented.
In operation S1721, when the distance value (second distance value) between the refrigerator 100 and the user 110 is determined not to be less than the first threshold value d1, the refrigerator 100 returns to operation S1710 and monitors a sensing value (or output value) received from the sensor 107.
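The threshold logic of operations S1710 to S1723 can be summarized as a small state machine over the proximity reading and the door state. In the sketch below, a first distance in [d1, d2) with the door closed triggers gaze tracking and opening; with the door open, a second distance at or beyond d2 closes it, while a distance below d1 stops door motion to avoid a collision. The threshold values and action labels are illustrative assumptions.

```python
D1 = 0.30  # first threshold value d1, in meters (assumed value)
D2 = 0.60  # second threshold value d2, in meters (assumed value)

def next_action(distance, door_open):
    """Map one proximity-sensor reading to an action for the door driving unit."""
    if not door_open:
        if D1 <= distance < D2:
            return "track_gaze_and_open"  # S1712 -> S1713..S1717
        return "monitor"                  # keep reading the sensor (S1710)
    if distance >= D2:
        return "close"                    # user walked away (S1718 -> S1720)
    if distance < D1:
        return "stop_motion"              # too close: avoid collision (S1721 -> S1723)
    return "monitor"                      # d1 <= distance < d2: keep monitoring
```

After each action the controller returns to reading the sensor, mirroring the return to operation S1710 in the flowchart.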
The distance value between the refrigerator 100 and the user 110 described with reference to
The flowchart of
In operation S1810, the refrigerator 100 may guide a gaze area where the user 110 may gaze to open a door by using the output unit 1030. For example, an LED light device may be located close to the first area 911 and the second area 912 shown in
In operation S1820, the refrigerator 100 may obtain image data of the user 110 located in a vicinity of the refrigerator 100 or a door of the refrigerator 100. In operation S1830, the refrigerator 100 may detect a gaze point of the user 110 by using the obtained image data of the user 110. In operation S1840, the refrigerator 100 may detect a gaze area including the detected gaze point of the user 110. In operation S1850, the refrigerator 100 may determine a door corresponding to the detected gaze area. In operation S1860, the refrigerator 100 may control the door driving unit 220 or 830 to open the determined door. To control the door driving unit 220 or 830 to open the determined door, the processor 210, 820, or 1010 of the refrigerator 100 may transmit a control signal to the door driving unit 220 or 830. Because operations S1820 to S1860 of
In addition, each of the flowcharts illustrated in
Although each of the flowcharts of
Although each of the flowcharts of
According to an embodiment of the present disclosure, the refrigerator 100 may automatically open a door of the refrigerator 100 as desired by the user 110, based on eye tracking for the user 110, thereby improving convenience for the user 110 when opening and closing the door of the refrigerator 100.
According to an embodiment of the present disclosure, the refrigerator 100 may open and close a door as desired by the user 110 at a more precise time, based on eye tracking for the user 110 and a location of the user 110, thereby preventing incorrect opening and closing of the door of the refrigerator 100 and preventing a collision between the door of the refrigerator 100 and the user 110.
According to an embodiment of the present disclosure, the refrigerator 100 may guide an area where the user 110 can gaze to open the at least one door 102, 103, 104, 105, 840, or 850, thereby allowing the user 110 to more accurately determine a door to be opened.
The refrigerator 100 according to an embodiment of the present disclosure may include the main body 101, the at least one door 102, 103, 104, 105, 840, or 850 rotatably coupled to the front of the main body 101, the door driving unit 220 or 830 for automatically opening and closing the at least one door 102, 103, 104, 105, 840, or 850, the image obtaining sensor 106 for obtaining image data of the user 110 located in a vicinity of the refrigerator 100, and the at least one processor 210, 820, or 1010 configured to detect a gaze point where a gaze of the user 110 stays for a preset time period or longer by using the image data of the user 110, determine a door corresponding to the detected gaze point among the at least one door 102, 103, 104, 105, 840, or 850, and control the door driving unit 220 or 830 to open the determined door.
According to an embodiment of the present disclosure, the refrigerator 100 may include the proximity sensor 107 that detects the approach of the user 110 to the refrigerator 100, and the at least one processor 210, 820, or 1010 may be configured to, in a state in which the at least one door 102, 103, 104, 105, 840, or 850 is closed, detect a first distance value between the refrigerator 100 and the user 110 by using an output value of the proximity sensor 107, and when the detected first distance value is determined to be greater than or equal to a first threshold value and less than a second threshold value, control the door driving unit 220 or 830 to open the determined door.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to, when the detected first distance value is less than the second threshold value in a state in which the at least one door 102, 103, 104, 105, 840, or 850 is closed, activate an operation of the image obtaining sensor 106 to obtain image data of the user 110 located in a vicinity of the at least one door 102, 103, 104, 105, 840, or 850.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to, after opening the determined door, detect a second distance value between the refrigerator 100 and the user 110 by using an output value of the proximity sensor 107, and when the detected second distance value is greater than or equal to the second threshold value, control the door driving unit 220 or 830 to close the opened door.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to, when a certain time period has elapsed since the opening of the determined door, control the proximity sensor 107 to detect a second distance value between the refrigerator 100 and the user 110.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to deactivate the operation of the image obtaining sensor 106 when the detected second distance value is greater than or equal to the second threshold value after opening and closing the determined door.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to divide the front of the refrigerator 100 into a plurality of virtual gaze areas based on a number of doors of the refrigerator 100 or a structure of the refrigerator 100, and detect an area where a gaze point stays for a preset time period or longer among the plurality of virtual gaze areas as a gaze area of the front of the refrigerator 100 including the gaze point.
According to an embodiment of the present disclosure, the refrigerator 100 may include the output unit 1030 that displays information guiding the user 110 about the plurality of gaze areas where the user 110 can gaze to open the at least one door 102, 103, 104, 105, 840, or 850.
According to an embodiment of the present disclosure, the output unit 1030 may include a lighting device that emits light.
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to, when the detected gaze point is a boundary area between the at least one door 102, 103, 104, 105, 840, or 850, control the door driving unit 220 or 830 to open all doors adjacent to the boundary area.
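The boundary-area behavior above can be sketched by testing the gaze point against each door's rectangle expanded by a small margin, so that a point on a shared boundary selects every adjacent door. The function, the margin, and the rectangle representation are illustrative assumptions.

```python
def doors_to_open(point, door_bounds, margin=0.03):
    """Return every door whose front-face rectangle lies within `margin` of the
    gaze point; a point on a boundary therefore selects all adjacent doors.
    door_bounds maps door name -> (x0, y0, x1, y1) in normalized coordinates."""
    x, y = point
    hits = []
    for door, (x0, y0, x1, y1) in door_bounds.items():
        if x0 - margin <= x <= x1 + margin and y0 - margin <= y <= y1 + margin:
            hits.append(door)
    return hits
```

For two side-by-side doors meeting at x = 0.5, a gaze point at that seam returns both doors, while a point well inside one door returns only that door.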
According to an embodiment of the present disclosure, the at least one processor 210, 820, or 1010 may be configured to, after opening the determined door, when another gaze point is detected by using the image data of the user 110 obtained by the image obtaining sensor 106, determine another door corresponding to the detected other gaze point among unopened doors, and control the door driving unit 220 or 830 to open the determined other door.
A control method of the refrigerator 100 according to an embodiment of the present disclosure may include obtaining image data of the user 110 located in a vicinity of the refrigerator 100 by using the image obtaining sensor 106 (S1310, S1410, S1510, S1610, or S1820), detecting a gaze point where a gaze of the user 110 stays for a preset time period or longer by using the obtained image data of the user 110 (S1320, S1420, S1520, S1620, or S1830), determining a door corresponding to the detected gaze point among at least one door 102, 103, 104, 105, 840, or 850 included in the refrigerator 100 (S1330, S1440, S1540, S1640, or S1850), and opening the determined door by controlling the door driving unit 220 or 830 included in the refrigerator 100 (S1340, S1450, S1560, S1650, or S1860).
According to an embodiment of the present disclosure, the opening of the determined door (S1340, S1450, S1560, S1650, S1860) may include, in a state in which the at least one door 102, 103, 104, 105, 840, or 850 is closed, detecting a first distance value between the refrigerator 100 and the user 110 by using an output value of the proximity sensor 107 included in the refrigerator 100 (S1711), and when the detected first distance value is determined to be greater than or equal to a first threshold value and less than a second threshold value, controlling the door driving unit 220 or 830 to open the determined door (S1717).
According to an embodiment of the present disclosure, the obtaining of the image data of the user 110 (S1310, S1410, S1510, S1610, or S1820) may include, when the detected first distance value is less than the second threshold value in a state in which the at least one door 102, 103, 104, 105, 840, or 850 is closed, obtaining the image data of the user 110 located in the vicinity of the at least one door 102, 103, 104, 105, 840, or 850 by activating an operation of the image obtaining sensor 106.
According to an embodiment of the present disclosure, the control method of the refrigerator may include, after the opening of the determined door, detecting a second distance value between the refrigerator 100 and the user 110 by using an output value of the proximity sensor 107, and when the detected second distance value is greater than or equal to the second threshold value, controlling the door driving unit 220 or 830 to close the opened door (S1720).
According to an embodiment of the present disclosure, the control method of the refrigerator may include deactivating the operation of the image obtaining sensor 106 when the detected second distance value is greater than or equal to the second threshold value after opening and closing the determined door.
According to an embodiment of the present disclosure, the control method of the refrigerator may include dividing the front of the refrigerator 100 into a plurality of virtual gaze areas based on a number of doors of the refrigerator 100 or a structure of the refrigerator 100, and detecting an area where a gaze point stays for a preset time period or longer among the plurality of virtual gaze areas as a gaze area of the front of the refrigerator 100 including the gaze point.
According to an embodiment of the present disclosure, the control method of the refrigerator may include displaying, via the output unit 1030, information for guiding the user 110 about the plurality of gaze areas where the user 110 can gaze to open the at least one door 102, 103, 104, 105, 840, or 850.
According to an embodiment of the present disclosure, the opening of the determined door may include, when the detected gaze point is a boundary area between the at least one door 102, 103, 104, 105, 840, or 850, controlling the door driving unit 220 or 830 to open all doors adjacent to the boundary area.
According to an embodiment of the present disclosure, the control method of the refrigerator 100 may include, after the opening of the determined door, when another gaze point is detected by using the image data of the user 110 obtained by the image obtaining sensor 106, determining another door corresponding to the detected other gaze point among unopened doors, and controlling the door driving unit 220 or 830 to open the determined other door.
A machine-readable storage medium may be provided in the form of a non-transitory storage medium. In this regard, the term ‘non-transitory’ only means that the storage medium does not include a signal (e.g., an electromagnetic wave) and is a tangible device, and the term does not differentiate between a case where data is semi-permanently stored in the storage medium and a case where the data is temporarily stored therein. For example, the ‘non-transitory storage medium’ may include a buffer in which data is temporarily stored.
According to an embodiment, methods according to various embodiments of the disclosure may be provided as included in a computer program product. The computer program product may be traded, as a product, between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc-ROM (CD-ROM)) or distributed (e.g., downloaded or uploaded) online via an application store or directly between two user devices (e.g., smartphones). For online distribution, at least a part of the computer program product (e.g., a downloadable app) may be at least temporarily stored or transiently generated in a machine-readable storage medium, such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
Number | Date | Country | Kind |
---|---|---|---|
10-2021-0137141 | Oct 2021 | KR | national |
This application is a continuation of International Application PCT/KR2022/013850, filed Sep. 16, 2022, which is incorporated herein by reference in its entirety, and claims foreign priority to Korean application 10-2021-0137141, filed Oct. 15, 2021, which is incorporated herein by reference in its entirety.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/KR2022/013850 | Sep 2022 | WO |
Child | 18599565 | US |