The present subject matter relates generally to refrigerator appliances, and more particularly to systems and methods for monitoring the status of one or more doors of such refrigerator appliances.
Refrigerator appliances generally include a cabinet that defines a chilled chamber. A wide variety of food items may be stored within the chilled chamber. The low temperature of the chilled chamber relative to ambient atmosphere assists with increasing a shelf life of the food items stored within the chilled chamber.
In order to maintain the chilled chamber below ambient temperature, the cabinet is thermally insulated and the chilled chamber is selectively sealingly enclosed by a thermally insulated door. The door is movable to an open position which permits access to the chilled chamber, e.g., for loading items into the chilled chamber or taking items out of the chilled chamber. When the door is in the open position, the chilled chamber is exposed to relatively warm and/or humid air and such exposure, particularly for a prolonged period of time such as when the door is inadvertently left open and unattended, may be detrimental to the food items stored therein and may result in excessive energy consumption by the refrigerator appliance. Thus, some refrigerator appliances include a door alarm or door open notification. Such alarms, however, may be unhelpful or annoying when the door is intentionally left open, such as when loading a large amount of groceries at one time.
Accordingly, a refrigerator appliance with improved door alarms would be useful. More particularly, a refrigerator appliance that is capable of identifying an intentional door opening, and methods of identifying intentional refrigerator door openings, would be useful.
Aspects and advantages of the invention will be set forth in part in the following description, or may be apparent from the description, or may be learned through practice of the invention.
In an exemplary embodiment, a method of operating a refrigerator appliance is provided. The refrigerator appliance includes a cabinet defining a food storage chamber with a door movably coupled to the cabinet. The door is movable between a closed position where the food storage chamber is at least partially enclosed by the door and an open position where the door permits access to the food storage chamber. The refrigerator appliance also includes a sensor operable to detect a user presence. The method includes detecting an opening of the door and obtaining an input with the sensor after detecting the opening of the door. The method also includes determining that the opening of the door was intentional and temporarily disabling a door alarm of the refrigerator appliance based on the determination that the opening of the door was intentional.
In another exemplary embodiment, a refrigerator appliance is provided. The refrigerator appliance includes a cabinet defining a food storage chamber with a door movably coupled to the cabinet. The door is movable between a closed position where the food storage chamber is at least partially enclosed by the door and an open position where the door permits access to the food storage chamber. The refrigerator appliance also includes a sensor operable to detect a user presence and a controller. The controller is operable for detecting an opening of the door and obtaining an input with the sensor after detecting the opening of the door. The controller is also operable for determining that the opening of the door was intentional and temporarily disabling a door alarm of the refrigerator appliance based on the determination that the opening of the door was intentional.
In still another exemplary embodiment, a method of operating a refrigerator appliance is provided. The method includes receiving an intentional door open input. The method also includes disabling a door alarm of the refrigerator appliance in response to the intentional door open input. The method further includes automatically re-enabling the door alarm after disabling the door alarm.
These and other features, aspects and advantages of the present invention will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures.
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
Directional terms such as “left,” “right,” “front,” “back,” “top,” and “bottom” are used herein with reference to the perspective of a user standing in front of the refrigerator appliance 100 to access the refrigerator and/or items stored therein, e.g., a user stands in front of the refrigerator to open the doors and reaches into the food storage chamber(s) to access items therein. Terms such as “inner” and “outer” refer to relative directions with respect to the interior and exterior of the refrigerator appliance, and in particular the food storage chamber(s) defined therein. For example, “inner” or “inward” refers to the direction towards the interior of the refrigerator appliance.
Refrigerator appliance 100 includes a cabinet or housing 120 defining an upper fresh food chamber 122 (
Refrigerator doors 128 are each rotatably hinged to an edge of housing 120 for accessing fresh food chamber 122. As may be seen in
Operation of the refrigerator appliance 100 can be regulated by a controller 134 that is operatively coupled to a user interface panel 136. User interface panel 136 provides selections for user manipulation of the operation of refrigerator appliance 100, such as, e.g., temperature selections, selection of automatic or manual override humidity control (as described in more detail below), etc. In some embodiments, user interface panel 136 may be proximate a dispenser assembly 132. In response to programming and/or user manipulation of the user interface panel 136, the controller 134 regulates operation of various components of the refrigerator appliance 100.
The controller 134 may include a memory and one or more microprocessors, CPUs or the like, such as general or special purpose microprocessors operable to execute programming instructions or micro-control code associated with operation of refrigerator appliance 100. The memory may represent random access memory such as DRAM, or read only memory such as ROM or FLASH. In one embodiment, the processor executes programming instructions stored in memory. The memory may be a separate component from the processor or may be included onboard within the processor. It should be noted that controllers 134 as disclosed herein are capable of and may be operable to perform any methods and associated method steps as disclosed herein.
The controller 134 may be positioned in a variety of locations throughout refrigerator appliance 100. In the illustrated embodiment, the controller 134 may be located within the door 128. In such an embodiment, input/output (“I/O”) signals may be routed between the controller and various operational components of refrigerator appliance 100. In one embodiment, the user interface panel 136 may represent a general purpose I/O (“GPIO”) device or functional block. In one embodiment, the user interface panel 136 may include input components, such as one or more of a variety of electrical, mechanical or electro-mechanical input devices including rotary dials, push buttons, and touch pads. The user interface panel 136 may include a display component, such as a digital or analog display device designed to provide operational feedback to a user. For example, the user interface panel 136 may include a touchscreen providing both input and display functionality. The user interface panel 136 may be in communication with the controller via one or more signal lines or shared communication busses.
As will be described in more detail below, refrigerator appliance 100 may further include features that are generally configured to detect the presence and, in some embodiments, identity of a user. More specifically, such features may include one or more sensors, e.g., cameras 192 and/or 196 (see, e.g.,
As shown schematically in
Although a single camera 192 is illustrated in
In some embodiments, it may be desirable to activate the photo camera or cameras 192 for limited time durations and only in response to certain triggers. For example, the IR camera, e.g., second camera 196, may be always on and may serve as a proximity sensor, such that the photo camera(s) 192 are only activated after the IR camera 196 detects motion at the front of the refrigerator appliance 100. In additional embodiments, the activation of the first camera(s) 192 may be in response to a door opening, such as detecting that the door was opened using a door switch. In this manner, privacy concerns related to obtaining images of the user of the refrigerator appliance 100 may be mitigated. According to exemplary embodiments, camera assembly 190 may be used to facilitate a user detection and/or identification process for refrigerator appliance 100. As such, each camera 192 may be positioned at the front opening 148 to fresh food chamber 122 to monitor one or more doors 128 and/or 130 and adjoining areas, such as while food items are being added to or removed from fresh food chamber 122 and/or freezer chamber 124.
It should be appreciated that according to alternative embodiments, camera assembly 190 may include any suitable number, type, size, and configuration of camera(s) 192 for obtaining images of any suitable areas or regions within or around refrigerator appliance 100. In addition, it should be appreciated that each camera 192 may include features for adjusting the field of view and/or orientation.
It should be appreciated that the images obtained by camera assembly 190 may vary in number, frequency, angle, resolution, detail, etc. in order to improve the clarity of the particular regions surrounding or within refrigerator appliance 100. In addition, according to exemplary embodiments, controller 134 may be configured for illuminating the chilled chamber (e.g., one or both of fresh food chamber 122 and freezer chamber 124) using one or more light sources prior to obtaining images. Notably, controller 134 of refrigerator appliance 100 (or any other suitable dedicated controller) may be communicatively coupled to camera assembly 190 and may be programmed or configured for analyzing the images obtained by camera assembly 190, e.g., in order to detect and/or identify a user proximate to the refrigerator appliance 100, as described in more detail below.
In general, controller 134 may be operably coupled to camera assembly 190 for analyzing one or more images obtained by camera assembly 190 to extract useful information regarding objects or people within the field of view of the one or more cameras 192 and/or 196. In this regard, for example, images obtained by camera assembly 190 may be used to extract a facial image or other identifying information related to one or more users. Notably, this analysis may be performed locally (e.g., on controller 134) or may be transmitted to a remote server (e.g., in the “cloud,” as those of ordinary skill in the art will recognize as referring to a remote server or database in a distributed computing environment including at least one remote computing device) for analysis. Such analysis is intended to facilitate user detection, e.g., by identifying a user accessing the refrigerator appliance, such as adding or removing food items to or from the fresh food chamber 122 and/or freezer chamber 124.
Specifically, according to an exemplary embodiment as illustrated in
Notably, camera assembly 190 may obtain images upon any suitable trigger, such as a time-based imaging schedule where camera assembly 190 periodically images and monitors the field of view, e.g., in and/or in front of the refrigerator appliance 100. According to still other embodiments, camera assembly 190 may periodically take low-resolution images until motion (such as opening of one or more doors 128 or 130) is detected (e.g., via image differentiation of low-resolution images), at which time one or more high-resolution images may be obtained. According to still other embodiments, refrigerator appliance 100 may include one or more motion sensors (e.g., optical, acoustic, electromagnetic, etc.) that are triggered when an object or user moves into or through the area in front of the refrigerator appliance 100, and camera assembly 190 may be operably coupled to such motion sensors to obtain images of the object 182 during such movement.
According to still other embodiments, refrigerator appliance 100 may include a door switch that detects when refrigerator door 128 is opened, at which point camera assembly 190 may begin obtaining one or more images. According to exemplary embodiments, the image may be obtained continuously or periodically while doors 128 and/or 130 are open. In this regard, obtaining one or more images may include determining that a door of the refrigerator appliance is open and capturing images at a set frame rate while the door is open.
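The door-switch-triggered capture described above can be sketched as a simple polling loop. This is an illustrative sketch only; `door_is_open` and `capture_frame` are hypothetical stand-ins for the door switch and camera interfaces, which the specification does not define at the code level.

```python
import time

def capture_while_door_open(door_is_open, capture_frame,
                            frame_rate_hz=2.0, max_frames=100):
    """Capture images at a set frame rate for as long as the door stays open.

    door_is_open: callable returning True while the door switch reports open.
    capture_frame: callable returning one image from the camera assembly.
    max_frames: safety cap so an unattended open door cannot fill storage.
    """
    interval = 1.0 / frame_rate_hz
    frames = []
    while door_is_open() and len(frames) < max_frames:
        frames.append(capture_frame())
        time.sleep(interval)  # wait one frame period before the next capture
    return frames
```

The same loop serves the IR-triggered variant: substitute a proximity-sensor callable for `door_is_open`.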
It should be appreciated that the images obtained by camera assembly 190 may vary in number, frequency, angle, resolution, detail, etc. in order to improve the clarity thereof. In addition, according to exemplary embodiments, controller 134 may be configured for illuminating a refrigerator light (not shown) while obtaining the image or images. Other suitable imaging triggers are possible and within the scope of the present subject matter.
Using the teachings disclosed herein, one of skill in the art will understand that the present subject matter can be used with other types of refrigerators such as a refrigerator/freezer combination, side-by-side, bottom mount, compact, and any other style or model of refrigerator appliance. Accordingly, other configurations of refrigerator appliance 100 could be provided, it being understood that the configurations shown in the accompanying FIGS. and the description set forth herein are by way of example for illustrative purposes only.
Turning now to
The refrigerator appliance 10 may be in communication with the remote user interface device 1000 through various possible communication connections and interfaces. The refrigerator appliance 10 and the remote user interface device 1000 may be paired in wireless communication, e.g., connected to the same wireless network. The refrigerator appliance 10 may communicate with the remote user interface device 1000 via short-range radio such as BLUETOOTH® or any other suitable wireless network having a layer protocol architecture. As used herein, “short-range” may include ranges less than about ten meters and up to about one hundred meters. For example, the wireless network may be adapted for short-wavelength ultra-high frequency (UHF) communications in a band between 2.4 GHz and 2.485 GHz (e.g., according to the IEEE 802.15.1 standard). In particular, BLUETOOTH® Low Energy, e.g., BLUETOOTH® Version 4.0 or higher, may advantageously provide short-range wireless communication between the refrigerator appliance 10 and the remote user interface device 1000. For example, BLUETOOTH® Low Energy may advantageously minimize the power consumed by the exemplary methods and devices described herein due to the low power networking protocol of BLUETOOTH® Low Energy.
The remote user interface device 1000 is “remote” at least in that it is spaced apart from and not physically connected to the refrigerator appliance 10, e.g., the remote user interface device 1000 is a separate, stand-alone device from the refrigerator appliance 10 which communicates with the refrigerator appliance 10 wirelessly. Any suitable device separate from the refrigerator appliance 10 that is configured to provide and/or receive communications, information, data, or commands from a user may serve as the remote user interface device 1000, such as a smartphone (e.g., as illustrated in
The remote user interface device 1000 may include a memory for storing and retrieving programming instructions. Thus, the remote user interface device 1000 may provide a remote user interface which may be an additional user interface to the user interface panel 136. For example, the remote user interface device 1000 may be a smartphone operable to store and run applications, also known as “apps,” and the remote user interface may be provided as a smartphone app.
As mentioned above, the refrigerator appliance 10 may also be configured to communicate wirelessly with a network 1100. The network 1100 may be, e.g., a cloud-based data storage system including one or more remote databases and/or remote servers, which may be collectively referred to as “the cloud.” For example, the refrigerator appliance 10 may communicate with the cloud 1100 over the Internet, which the refrigerator appliance 10 may access via WI-FI®, such as from a WI-FI® access point in a user's home.
Now that the construction and configuration of refrigerator appliance 100 have been presented according to an exemplary embodiment of the present subject matter, exemplary methods for operating a refrigerator appliance, such as refrigerator appliance 100, are provided. In this regard, for example, controller 134 may be configured for implementing one or more of the following exemplary methods. However, it should be appreciated that the exemplary methods are discussed herein only to describe exemplary aspects of the present subject matter, and are not intended to be limiting.
Turning now to
Also by way of example, the refrigerator may further include a sensor operable to detect a user presence. In some embodiments, the sensor may be or include a camera assembly positioned and configured for monitoring the food storage chamber and an area in front of the cabinet that is contiguous with the food storage chamber, such as the camera assemblies described above with respect to
As shown in
In some embodiments, the method may also include, and/or the refrigerator appliance may further be configured for, detecting or identifying one or more users, e.g., based on one or more images. In some embodiments, detection of the user(s) may be accomplished with the camera assembly 190. For example, the refrigerator appliance may include a camera, and the step of obtaining an input with the sensor may include capturing an image with the camera. Such embodiments may further include detecting the user(s) based on the image captured by the camera. In some embodiments, the operation of the camera may be tied to the door opening, e.g., the camera may be operable and configured to capture an image each time the door is opened and/or each time the door is closed after detecting a door opening. The structure and operation of cameras are understood by those of ordinary skill in the art and, as such, the camera is not illustrated or described in further detail herein for the sake of brevity and clarity. In such embodiments, the controller 134 of the refrigerator appliance 100 may be configured for image-based processing, e.g., to detect a user based on an image of the user, e.g., a photograph taken with the camera(s) 192 of the camera assembly 190. For example, the controller 134 may be configured to identify the user by comparison of the image to a stored image of a known or previously-identified user. For example, controller 134 of refrigerator appliance 100 (or any other suitable dedicated controller) may be communicatively coupled to camera assembly 190 and may be programmed or configured for analyzing the images obtained by camera assembly 190, e.g., in order to detect a user accessing refrigerator appliance 100, such as food items therein.
In some exemplary embodiments, the method 400 may include analyzing one or more images, e.g., such image(s) may be an embodiment of the input that was obtained at step 420, to detect a user. It should be appreciated that this analysis may utilize any suitable image analysis techniques, image decomposition, image segmentation, image processing, etc. This analysis may be performed entirely by controller 134, may be offloaded to a remote server (e.g., in the cloud 1100) for analysis, may be analyzed with user assistance (e.g., via user interface panel 136), or may be analyzed in any other suitable manner. According to exemplary embodiments of the present subject matter, the analysis may include a machine learning image recognition process.
According to exemplary embodiments, this image analysis may use any suitable image processing technique, image recognition process, etc. As used herein, the terms “image analysis” and the like may be used generally to refer to any suitable method of observation, analysis, image decomposition, feature extraction, image classification, etc. of one or more images, videos, or other visual representations of an object. As explained in more detail below, this image analysis may include the implementation of image processing techniques, image recognition techniques, or any suitable combination thereof. In this regard, the image analysis may use any suitable image analysis software or algorithm to constantly or periodically monitor refrigerator appliance 100 and/or a proximate and contiguous area in front of the fresh food chamber 122 and/or freezer chamber 124. It should be appreciated that this image analysis or processing may be performed locally (e.g., by controller 134) or remotely (e.g., by offloading image data to a remote server or network, e.g., in the cloud).
Specifically, the analysis of the one or more images may include implementation of an image processing algorithm. As used herein, the terms “image processing” and the like are generally intended to refer to any suitable methods or algorithms for analyzing images that do not rely on artificial intelligence or machine learning techniques (e.g., in contrast to the machine learning image recognition processes described below). For example, the image processing algorithm may rely on image differentiation, e.g., such as a pixel-by-pixel comparison of two sequential images. This comparison may help identify substantial differences between the sequentially obtained images, e.g., to identify movement, the presence of a particular object, the existence of a certain condition, etc. For example, one or more reference images may be obtained when a particular condition exists, and these reference images may be stored for future comparison with images obtained during appliance operation. Similarities and/or differences between the reference image and the obtained image may be used to extract useful information for improving appliance performance. For example, image differentiation may be used to determine when a pixel level motion metric passes a predetermined motion threshold.
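The pixel-by-pixel differentiation and motion-threshold test described above can be illustrated with a minimal sketch. The per-pixel delta and the motion threshold values here are arbitrary assumptions for illustration; the specification does not prescribe particular values.

```python
def motion_metric(prev, curr):
    """Fraction of pixels whose intensity changed substantially between frames.

    prev, curr: equal-size 2-D lists of grayscale intensities (0-255).
    """
    PIXEL_DELTA = 25  # assumed per-pixel change counted as "changed"
    changed = total = 0
    for row_a, row_b in zip(prev, curr):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > PIXEL_DELTA:
                changed += 1
    return changed / total if total else 0.0

def motion_detected(prev, curr, motion_threshold=0.05):
    """True when the pixel-level motion metric passes the predetermined threshold."""
    return motion_metric(prev, curr) > motion_threshold
```

The same comparison works against a stored reference image in place of `prev`, matching the reference-image variant described above.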
The processing algorithm may further include measures for isolating or eliminating noise in the image comparison, e.g., due to image resolution, data transmission errors, inconsistent lighting, or other imaging errors. By eliminating such noise, the image processing algorithms may improve accurate object detection, avoid erroneous object detection, and isolate the important object, region, or pattern within an image. In addition, or alternatively, the image processing algorithms may use other suitable techniques for recognizing or identifying particular items or objects, such as edge matching, divide-and-conquer searching, greyscale matching, histograms of receptive field responses, or another suitable routine (e.g., executed at the controller 134 based on one or more captured images from one or more cameras). Other image processing techniques are possible and within the scope of the present subject matter.
In addition to the image processing techniques described above, the image analysis may include utilizing artificial intelligence (“AI”), such as a machine learning image recognition process, a neural network classification module, any other suitable AI technique, and/or any other suitable image analysis techniques, examples of which will be described in more detail below. Moreover, each of the exemplary image analysis or evaluation processes described below may be used independently, collectively, or interchangeably to extract detailed information regarding the images being analyzed to facilitate performance of one or more methods described herein or to otherwise improve appliance operation. According to exemplary embodiments, any suitable number and combination of image processing, image recognition, or other image analysis techniques may be used to obtain an accurate analysis of the obtained images.
In this regard, the image recognition process may use any suitable artificial intelligence technique, e.g., any suitable machine learning or deep learning technique. According to an exemplary embodiment, the image recognition process may include the implementation of a form of image recognition called region based convolutional neural network (“R-CNN”) image recognition. Generally speaking, R-CNN may include taking an input image and extracting region proposals that include a potential object or region of an image. In this regard, a “region proposal” may be one or more regions in an image that could belong to a particular object or may include adjacent regions that share common pixel characteristics. A convolutional neural network is then used to compute features from the region proposals and the extracted features will then be used to determine a classification for each particular region.
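The "region proposal" idea of grouping adjacent pixels that share common characteristics can be illustrated with a simple connected-components sketch over a binary foreground mask. To be clear, this is not an R-CNN implementation; it only demonstrates the grouping step that produces candidate regions, under the assumption that a foreground mask has already been computed (e.g., by the image differentiation described earlier).

```python
def region_proposals(mask):
    """Group adjacent foreground pixels (4-connectivity) into candidate regions.

    mask: 2-D list of 0/1 values. Returns a list of sets of (row, col)
    coordinates, one set per connected region of foreground pixels.
    """
    rows = len(mask)
    cols = len(mask[0]) if mask else 0
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                stack, region = [(r, c)], set()
                while stack:  # flood-fill one connected region
                    y, x = stack.pop()
                    if ((y, x) in seen
                            or not (0 <= y < rows and 0 <= x < cols)
                            or not mask[y][x]):
                        continue
                    seen.add((y, x))
                    region.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
                regions.append(region)
    return regions
```

In a full R-CNN pipeline, each such region would then be passed to a convolutional network for feature extraction and classification.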
According to still other embodiments, an image segmentation process may be used along with the R-CNN image recognition. In general, image segmentation creates a pixel-based mask for each object in an image and provides a more detailed or granular understanding of the various objects within a given image. In this regard, instead of processing an entire image—i.e., a large collection of pixels, many of which might not contain useful information—image segmentation may involve dividing an image into segments (e.g., into groups of pixels containing similar attributes) that may be analyzed independently or in parallel to obtain a more detailed representation of the object or objects in an image. This may be referred to herein as “mask R-CNN” and the like, as opposed to a regular R-CNN architecture. For example, mask R-CNN may be based on fast R-CNN, which is slightly different than R-CNN. For example, fast R-CNN first applies a convolutional neural network (“CNN”) to the entire image and then maps the region proposals onto the resulting conv5 feature map, rather than first splitting the image into region proposals and processing each separately. In addition, according to exemplary embodiments, standard CNN may be used to obtain, identify, or detect any other qualitative or quantitative data related to one or more objects or regions within the one or more images. In addition, a K-means algorithm may be used.
According to still other embodiments, the image recognition process may use any other suitable neural network process while remaining within the scope of the present subject matter. For example, the step of analyzing the one or more images may include using a deep belief network (“DBN”) image recognition process. A DBN image recognition process may generally include stacking many individual unsupervised networks that use each network's hidden layer as the input for the next layer. According to still other embodiments, the step of analyzing one or more images may include the implementation of a deep neural network (“DNN”) image recognition process, which generally includes the use of a neural network (computing systems inspired by the biological neural networks) with multiple layers between input and output. Other suitable image recognition processes, neural network processes, artificial intelligence analysis techniques, and combinations of the above described or other known methods may be used while remaining within the scope of the present subject matter.
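The DNN structure described above—a neural network with multiple layers between input and output—can be illustrated with a minimal forward pass. This sketch uses arbitrary fully connected layers with sigmoid activations purely to show the layered structure; it is not a trained recognition model.

```python
import math

def dense_forward(x, layers):
    """Forward pass through a stack of fully connected layers.

    x: input vector (list of floats).
    layers: list of (weights, biases) pairs, where weights is a list of
    rows (one row per output unit). Each layer's output feeds the next
    layer's input--the "multiple layers between input and output" that
    characterize a deep neural network.
    """
    for weights, biases in layers:
        x = [1.0 / (1.0 + math.exp(-(sum(w * v for w, v in zip(row, x)) + b)))
             for row, b in zip(weights, biases)]  # sigmoid(W.x + b)
    return x
```

In a DBN, by contrast, each such layer would first be pretrained as an unsupervised network whose hidden layer becomes the next layer's input.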
In addition, it should be appreciated that various transfer learning techniques may be used, but use of such techniques is not required. If transfer learning is used, a neural network architecture such as VGG16, VGG19, or ResNet50 may be pretrained with a public dataset, and then the last layer may be retrained with an appliance-specific dataset. In addition, or alternatively, the image recognition process may include detection of certain conditions based on comparison of initial conditions, may rely on image subtraction techniques, image stacking techniques, image concatenation, etc. For example, the subtracted image may be used to train a neural network with multiple classes for future comparison and image classification.
It should be appreciated that the machine learning image recognition models may be actively trained by the appliance with new images, may be supplied with training data from the manufacturer or from another remote source, or may be trained in any other suitable manner. For example, according to exemplary embodiments, this image recognition process relies at least in part on a neural network trained with a plurality of images of the appliance in different configurations, experiencing different conditions, or being interacted with in different manners. This training data may be stored locally or remotely and may be communicated to a remote server for training other appliances and models.
It should be appreciated that image processing and machine learning image recognition processes may be used together to facilitate improved image analysis, object detection, or to extract other useful qualitative or quantitative data or information from the one or more images that may be used to improve the operation or performance of the appliance. Indeed, the methods described herein may use any or all of these techniques interchangeably to improve image analysis process and facilitate improved appliance performance and consumer satisfaction. The image processing algorithms and machine learning image recognition processes described herein are only exemplary and are not intended to limit the scope of the present subject matter in any manner.
Method 400 may also include a step 430 of determining that the opening of the door was intentional. For example, the input, e.g., image, may be analyzed to determine that a user is present in front of the refrigerator appliance 100. Thus, it may be determined that the opening of the door was intentional based on the input obtained at step 420 because the user is present at the refrigerator appliance, e.g., loading or unloading the refrigerator appliance. As another example, a user input may be received which indicates that the door opening was intentional, and it may thereby be determined that the door opening was intentional based on the user input.
In some embodiments, the analysis of the input and the determination that the door opening was intentional may be performed using intentional door opening detection software. The intentional door opening detection software may be built by a remote server, e.g., in the cloud, and may further be updated and/or rebuilt with additional inputs at subsequent door openings. For example, the intentional door opening detection software may be trained using one or more user inputs. Thus, in some embodiments, e.g., at initial or prior intentional door opening events, the determination that the opening of the door was intentional may include receiving a user input that indicates the opening of the door was intentional. Such user input may include an intentional door opening mode selection, e.g., prior to the door opening, or a manual deactivation of the door alarm, e.g., after detecting the door opening and activating the door alarm.
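The two user inputs described above can serve as supervision signals for training. A minimal sketch, assuming hypothetical flag names: either signal yields a positive "intentional" label for the event, while an event with neither signal yields no label and cannot be used for supervised training.

```python
# Sketch of deriving a training label from the user inputs described
# above; both parameter names are illustrative assumptions.

def label_from_user_input(mode_selected_before_open: bool,
                          alarm_manually_dismissed: bool):
    """Return 1 (intentional) when either user signal is present, else
    None (no label; the event is unusable for supervised training)."""
    if mode_selected_before_open or alarm_manually_dismissed:
        return 1
    return None


print(label_from_user_input(True, False))   # mode selected -> 1
print(label_from_user_input(False, False))  # no signal -> None
```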
When the refrigerator appliance receives such user input(s) and thus determines that the door opening was intentional, the refrigerator appliance may then gather data, e.g., obtain input with the sensor, such as images obtained with one or more cameras, and the gathered data may be used to rebuild or update the intentional door opening detection software. For example, the intentional door opening detection software may be built by a remote server, e.g., in the cloud, and downloaded by the refrigerator appliance, such as transmitted from the remote server and received by the refrigerator appliance. Then, at a subsequent intentional door opening (which may be determined automatically, e.g., by analyzing sensor input such as camera images, and/or based on manual user input), additional data may be gathered and sent to the cloud, such as transmitted from the refrigerator appliance and received by the remote server. The remote server may then use the additional data to update and/or rebuild the intentional door opening detection software. The updated intentional door opening detection software may then be transmitted to, e.g., re-downloaded by, the refrigerator appliance. Accordingly, the intentional door opening detection software may be continuously updated, and its accuracy may be continuously improved, with additional data. In particular, the remote server may be in communication with numerous refrigerator appliances, may receive data from multiple such refrigerator appliances, and may update the intentional door opening detection software based on all the data from the multiple refrigerator appliances.
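The gather/upload/rebuild/re-download cycle described above can be sketched with two toy classes. `RemoteServer` and `Refrigerator` are hypothetical stand-ins, and the "model" here is just a version counter; a real system would move images over a network and actually retrain the detection software.

```python
# Hedged sketch of the cloud update cycle described above. All class
# and method names are illustrative assumptions.

class RemoteServer:
    """Pools data from many appliances and 'rebuilds' the software."""

    def __init__(self):
        self.training_data = []   # inputs pooled across appliances
        self.model_version = 1

    def receive(self, inputs):
        # Additional data gathered at a subsequent intentional opening.
        self.training_data.extend(inputs)
        self.model_version += 1   # stands in for rebuilding the software

    def latest_model(self):
        return {"version": self.model_version,
                "n_samples": len(self.training_data)}


class Refrigerator:
    def __init__(self, server):
        self.server = server
        self.model = server.latest_model()   # initial download

    def on_intentional_opening(self, sensor_inputs):
        # Upload newly gathered data, then re-download the rebuilt model.
        self.server.receive(sensor_inputs)
        self.model = self.server.latest_model()


server = RemoteServer()
fridge_a = Refrigerator(server)
fridge_b = Refrigerator(server)   # the server pools data across appliances
fridge_a.on_intentional_opening(["img1", "img2"])
fridge_b.on_intentional_opening(["img3"])
print(fridge_b.model)  # version advanced by both appliances' uploads
```

Note how one appliance's upload benefits every appliance that later re-downloads the model, matching the multi-appliance aggregation described above.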
Thus, in some embodiments, method 400 may also include transmitting the input obtained from the sensor at step 420 to a remote server from the refrigerator appliance after receiving the user input. In such embodiments, method 400 may further include building, by the remote server, intentional door opening detection software based on the input obtained from the sensor. The intentional door opening detection software may then be transmitted from the remote server to the refrigerator appliance.
In some embodiments, the method 400 may include downloading intentional door opening detection software from a remote server prior to detecting the opening of the door. In such embodiments, step 430 of determining that the opening of the door was intentional may include analyzing the input obtained from the sensor with the previously downloaded intentional door opening detection software.
Further embodiments may include both initially downloading the intentional door opening detection software from the remote server prior to detecting the opening of the door, followed by uploading the input obtained at step 420, e.g., transmitting the input obtained from the sensor at step 420, to the remote server from the refrigerator appliance after determining that the door opening was intentional (by analyzing the input locally and/or by receiving a user input indicating that the door opening is or was intentional). Thus, the intentional door opening detection software may then be updated or rebuilt by the remote server, and the updated or rebuilt intentional door opening detection software may be downloaded by the refrigerator appliance for use in a subsequent door opening.
In at least some embodiments, the intentional door opening mode may also include gathering data (e.g., input from the sensor operable to detect a user presence) that will be transmitted to the remote server, e.g., in the cloud, to build or update intentional door opening software.
In addition to training, e.g., updating, the intentional door opening detection software, other settings or parameters of the refrigerator appliance may also be adjusted or updated. For example, the refrigerator appliance may be pre-programmed, e.g., at manufacture, with a default door alarm time, e.g., where the door alarm activates after the door has been open for the default door alarm time. In some embodiments, the input obtained with the sensor may include an identification of a particular user. The refrigerator appliance may gather data over time during prior door openings at which the same user was identified, where such data may include a user identity and a door open time, as well as a door alarm status, etc. When such data indicates that the particular user has a greater historical incidence of disabling the door alarm and/or a history of intentionally opening the door for an extended period of time, such as a length of time greater than the pre-programmed default door alarm time, the door alarm time parameter may be updated with a longer door alarm delay time before the door alarm is activated in response to detecting and/or identifying the particular user. Additionally, the door alarm may also or instead be disabled temporarily in response to detecting the particular user, e.g., until the door closing is detected.
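The per-user alarm adjustment described above can be sketched briefly. The default delay, the field names, and the "habitual long opener" threshold (more than half of the user's openings exceeding the default) are all assumptions introduced for illustration.

```python
# Illustrative sketch of per-user door alarm delay adjustment. The
# default value and the majority threshold are assumptions.

DEFAULT_ALARM_DELAY_S = 60  # hypothetical factory-default alarm time


def alarm_delay_for_user(history, default=DEFAULT_ALARM_DELAY_S):
    """history: past door-open durations (seconds) for an identified user.

    Users who routinely exceed the default alarm time get a longer
    delay, so the alarm is not triggered by their typical usage."""
    if not history:
        return default
    long_openings = [t for t in history if t > default]
    if len(long_openings) >= len(history) / 2:
        # Extend the delay to cover this user's longest prior opening.
        return max(long_openings)
    return default


print(alarm_delay_for_user([30, 45]))        # short openings -> 60
print(alarm_delay_for_user([90, 120, 40]))   # habitual long opener -> 120
```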
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
3996434 | Griffin | Dec 1976 | A |
4528558 | Steers | Jul 1985 | A |
4691195 | Sigelman | Sep 1987 | A |
4894643 | Thompson | Jan 1990 | A |
5063372 | Gillett | Nov 1991 | A |
6401466 | Olsen | Jun 2002 | B1 |
20160091243 | Beier | Mar 2016 | A1 |
20160169576 | Takaki | Jun 2016 | A1 |
20160358508 | Cheatham, III | Dec 2016 | A1 |
20200110532 | Mani | Apr 2020 | A1 |
20200327601 | Kim | Oct 2020 | A1 |
20210131011 | Park et al. | May 2021 | A1 |
20210180857 | Thayyullathil | Jun 2021 | A1 |
20210331328 | Kim | Oct 2021 | A1 |
20220330072 | Zeng | Oct 2022 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
208504845 | Feb 2019 | CN |
110553449 | Dec 2019 | CN |
102014004704 | Aug 2015 | DE |
201911048248 | Dec 2019 | IN |
20120126449 | Nov 2012 | KR |
Number | Date | Country |
---|---|---|
20230258398 A1 | Aug 2023 | US |