The present disclosure generally relates to a sensor system for a food preparation environment and, more specifically, to a system that provides interactive feedback to assist in recipe preparation.
According to one aspect of the present disclosure, a motion analysis apparatus for food preparation includes at least one motion detection device configured to capture motion data. A controller is in communication with the motion detection device. The controller is configured to compare the motion data to a food preparation movement and initiate a motion instruction for a first step of a recipe in response to a comparison of the motion data to the food preparation movement.
According to another aspect of the present disclosure, a method for providing a food preparation instruction is disclosed. The method includes accessing a recipe in a food preparation application and capturing motion data in response to a first step of the recipe. The method further includes comparing the motion data to a food preparation movement defined by a first motion profile for the first step of the recipe. In response to a comparison of the motion data to the food preparation movement, the method outputs a motion instruction for the first step.
According to yet another aspect of the present disclosure, a motion analysis apparatus for a handheld appliance is disclosed. The apparatus includes at least one motion detection device configured to capture motion data in connection with the handheld appliance. A controller is in communication with the motion detection device and is configured to access a recipe from a food preparation application and access a motion profile defining a food preparation movement based on a step of the recipe. The controller further receives an operational setting configured to control the handheld appliance based on the step of the recipe and compares the motion data based on movement of the handheld appliance to the food preparation movement. In response to the comparison of the motion data to the food preparation movement, the controller initiates a motion instruction for the step of the recipe.
These and other features, advantages, and objects of the present disclosure will be further understood and appreciated by those skilled in the art by reference to the following specification, claims, and appended drawings.
In the drawings:
The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles described herein.
The present illustrated embodiments reside primarily in combinations of method steps and apparatus components related to a monitoring device and feedback system. Accordingly, the apparatus components and method steps have been represented, where appropriate, by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein. Further, like numerals in the description and drawings represent like elements.
For purposes of description herein, the terms “upper,” “lower,” “right,” “left,” “rear,” “front,” “vertical,” “horizontal,” and derivatives thereof shall relate to the disclosure as oriented in
The terms “including,” “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises a . . . ” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Referring generally to
As depicted in
In the examples discussed herein, the monitoring device 12 may be coupled to the user 18 or the handheld implement 15, such that the movement of the monitoring device 12 is indicative of the movement of the user 18 and/or the handheld implement 15. In this configuration, a controller 50 (
In general, the monitoring system 10 may provide for visual, audible, and/or tactile cues that lead the user 18 through a recipe accessed via a food preparation application (e.g., a software application or app). Depending on the capability of the monitoring device 12, the display device 16, and an appliance 30, the food preparation application may be displayed on a corresponding display screen 140 (
In operation, the system 10 may monitor the motion data or sensor data captured by the monitoring device 12 in reference to a food preparation movement. The food preparation movement may correspond to a motion profile that may change for each step of a recipe. Examples of motion profiles may include a kneading motion, a stirring motion, a whipping motion, a rolling motion, and various movements that may be associated with one or more steps of the recipe accessed via the food preparation application. Such movements may be defined based on experimental data providing ranges of motion data corresponding to the preferred or best techniques for each step of the recipe. Based on the sensor data reported by the monitoring device 12, the monitoring system may provide feedback in a number of ways. For example, in some instances, feedback and instructions may be communicated to the user 18 via one or more display devices 16 (e.g., tablet, computer, smartphone, smart home hub, etc.). In some cases, the feedback or instructions may be communicated directly from the monitoring device 12.
For example, in response to the comparison of the motion data to the predefined motion profile, a controller 50 or processor of the monitoring device 12 may provide feedback to the user 18 indicating whether the detected motion matches the predefined movement or motion profile for the recipe instruction. An exemplary block diagram of the monitoring device is discussed later in reference to
In some cases, a recipe may include a number of steps that require one or more movements of the user 18, the handheld implement 15, and/or settings of the appliance(s) 30 to complete the preparation of a food product. In order to guide or orchestrate each of the steps of the recipe, the system 10 may monitor the sensor data captured by the monitoring device 12. Based on the sensor data, the system 10 may provide feedback to the user 18 regarding a movement or technique associated with each of the preparation steps. For example, in a stirring step of a recipe, the system 10 may detect that the motion of a user is too abrupt or changes too rapidly. Based on this detection, the system 10 may output an instruction to the user 18 to slow or elongate the movements associated with the monitoring device 12. Similarly, in a whipping step, the system 10 may detect that the movement of the user is less than a desired rate or intensity. In response to such a condition, the system 10 may instruct the user to increase a rate of motion. In this way, the system 10 may provide for feedback and coaching of the user 18 via interactive instructions communicated from the monitoring device 12, the display device 16, and/or the appliance 30.
In some implementations, the controller 50 of the system 10 on which the food preparation application is operating may automatically proceed through steps of the recipe in response to the sensor data detected via the monitoring device 12. For example, the controller 50 may begin a recipe by initiating or displaying a motion instruction for a first step of a recipe. Once initiated, the controller 50 may monitor the sensor or motion data identified by the monitoring device 12 and compare the sensor data to a food preparation movement associated with the first step of the recipe. More specifically, the controller 50 may access a first motion profile defining the food preparation movement based on the first step of the recipe. The step and associated information may be stored in a local memory or the remote database 54. Based on the sensor data, the controller 50 may monitor an accumulation (e.g., summation, integration of a numeric model, etc.) of the food preparation movement (e.g., motion data that matches the first motion profile) and determine a completion of the first step of the recipe in response to the accumulation of the food preparation movement meeting or exceeding a predefined accumulated motion total associated with the first step. Accordingly, the system 10 monitors the sensor data for motion (e.g., acceleration, velocity, range, variation, etc.) and compares the motion of the user 18 to motion profiles for various recipe steps.
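The accumulation-based completion check described above may be illustrated with a brief sketch. The following example is illustrative only and does not form part of the disclosed apparatus; the function names, the time-based accumulation rule, and the numeric thresholds are hypothetical assumptions rather than values defined by the disclosure.

```python
# Illustrative only: accumulate motion that matches the active motion profile and
# flag step completion once a predefined total is reached. Names, thresholds, and
# units are hypothetical.

def accumulate_step_motion(samples, matches_profile, sample_period_s, required_total_s):
    """Sum the time spent moving in conformance with the step's motion profile.

    samples: iterable of per-sample motion readings (e.g., peak acceleration magnitudes)
    matches_profile: callable returning True when a sample conforms to the profile
    sample_period_s: time between samples, in seconds
    required_total_s: accumulated conforming motion needed to complete the step
    """
    accumulated = 0.0
    for sample in samples:
        if matches_profile(sample):
            accumulated += sample_period_s
        if accumulated >= required_total_s:
            return True, accumulated   # step complete
    return False, accumulated          # step still in progress


if __name__ == "__main__":
    # Fabricated data: 1 kHz samples; the step requires 3 s of conforming motion.
    import random
    random.seed(0)
    fake_samples = [random.uniform(0.0, 3.0) for _ in range(5000)]  # peak |a| in g
    done, total = accumulate_step_motion(
        fake_samples,
        matches_profile=lambda a: 0.5 <= a <= 2.0,  # hypothetical conformance band
        sample_period_s=0.001,
        required_total_s=3.0,
    )
    print(done, round(total, 3))
```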
As discussed herein, each of the motion profiles may be defined based on experimental results captured with the monitoring device 12 or similar motion detection devices (e.g., accelerometers, gyroscopes, magnetometers, etc.). In operation, the motion data is captured in terms of linear and/or rotational motion. In order to define the motion profiles for each of the food preparation movements (e.g., kneading, stirring, whipping, rolling, etc.), the motion data may be measured and recorded in terms of linear acceleration (e.g., meters per second squared (m/s²) or G-forces (g)) and rotational rate (e.g., degrees per second (°/s)). For example, in order to define a whipping motion for a particular recipe step, the motion data of the monitoring device 12 may be measured or simulated with a kinematic model of the user 18 having an ideal or model movement, which may be measured from one or more professional chefs or skilled food preparation professionals. Based on the experimental motion data, ranges of acceleration for each of the translational axes 58a and rotational axes 58b defined relative to the coordinate system 58 may be identified for each of the food preparation movements.
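One possible data representation of such experimentally derived ranges is sketched below. The structure is purely illustrative; the field names, axis labels, and numeric ranges are hypothetical placeholders and are not values taken from the disclosure.

```python
# Illustrative only: a motion profile expressed as per-axis ranges derived from
# experimental or modeled data. All names and values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class AxisRange:
    low: float   # minimum expected peak magnitude
    high: float  # maximum expected peak magnitude

@dataclass(frozen=True)
class MotionProfile:
    name: str
    linear_g: dict           # per translational axis: peak linear acceleration range (g)
    rotational_dps: dict     # per rotational axis: peak angular rate range (deg/s)
    frequency_hz: AxisRange  # expected repetition rate of the periodic movement

WHIPPING_EXAMPLE = MotionProfile(
    name="whipping",
    linear_g={"x": AxisRange(0.8, 2.5), "y": AxisRange(0.8, 2.5), "z": AxisRange(0.2, 1.0)},
    rotational_dps={"roll": AxisRange(50, 300), "pitch": AxisRange(20, 150), "yaw": AxisRange(0, 80)},
    frequency_hz=AxisRange(2.0, 4.5),
)
```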
In operation, each of the food preparation movements may be defined both quantitatively and qualitatively, and the system 10 may monitor the motion data for each aspect separately to identify the conformance of the movement generated by the user 18 based on the information reported by the monitoring device 12. For example, the rate of movement of a whisk held by the user 18 or directly connected to the monitoring device 12 may be compared to corresponding target rates to identify whether the movement of the user 18 undertaking the food preparation task is at a target intensity or within a target acceleration range. That is, the controller 50 may process the motion data to identify the average peak acceleration and frequency of periodic movements (e.g., back-and-forth repetitive movements) of the monitoring device 12 in connection with the user 18 and/or the handheld implement 15. Based on the motion data from the monitoring device 12, the controller 50 may determine and discriminate between occurrences of a whipping motion, a stirring motion, or various other movements of the whisk as defined for different food preparation movements according to a recipe step. The stirring motion may be defined by the recipe step as a linear stirring primarily extending along one axis, a rotational stirring creating acceleration in multiple axes, or a beating motion that may correspond to similar motion in multiple axes but that is more abrupt (e.g., including greater changes in acceleration direction) than rotational stirring. In each case, the controller 50 may determine whether the food preparation motion conforms by comparing the motion data captured by the monitoring device 12 to the motion profile defined in the recipe step.
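As one non-limiting illustration of discriminating among these movement types, the sketch below classifies a window of acceleration samples by how strongly the motion is concentrated along a single axis and by how abrupt it is. The feature choices and thresholds are assumptions made for illustration and are not defined by the disclosure.

```python
# Illustrative only: coarse discrimination among linear stirring, rotational
# stirring, and a more abrupt beating motion from windowed acceleration
# statistics. Thresholds are hypothetical.
import statistics

def classify_motion(ax, ay, az, sample_period_s):
    """ax, ay, az: lists of linear acceleration samples (g) over one analysis window."""
    energy = [statistics.pvariance(axis) for axis in (ax, ay, az)]
    dominant_ratio = max(energy) / max(sum(energy), 1e-9)

    def mean_abs_jerk(a):
        # Mean change in acceleration per second: a crude "abruptness" measure.
        return statistics.fmean(
            abs(a[i + 1] - a[i]) / sample_period_s for i in range(len(a) - 1)
        )

    jerk = statistics.fmean(mean_abs_jerk(axis) for axis in (ax, ay, az))

    if dominant_ratio > 0.8:
        return "linear stirring"      # energy concentrated along one axis
    if jerk > 50.0:                   # hypothetical abruptness threshold (g/s)
        return "beating"
    return "rotational stirring"
```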
The determination of whether the motion data detected by the monitoring device 12 conforms to the motion profile defined in the recipe step may be accomplished based on a variety of comparisons. For example, the quantitative assessment of the motion data may be identified based on a comparison of the linear acceleration to a target acceleration rate to determine whether the movement of the user 18 conforms to the motion profile for the recipe step. More specifically, the linear acceleration may be identified along a motion path in reference to the translational axes 58a of the coordinate system 58. The motion path may be compared to the target motion profile based on the peak acceleration in each of the translational axes 58a, which may be monitored as an average of the peak acceleration of the repetitive movement detected over a predetermined time (e.g., 2, 5, or 8 seconds). The average peak acceleration may be compared to a predetermined value of the average peak acceleration defined in the motion profile for the recipe step. Based on the comparison between the motion data detected for the user 18 by the monitoring device 12 and the predetermined values stored for the motion profile, the controller 50 may determine the extent to which the average peak acceleration or the acceleration range of the motion data matches the target motion profile.
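A minimal sketch of such a comparison, assuming a fixed observation window and a simple per-window peak average, is shown below. The peak-finding approach and the tolerance band are illustrative assumptions only.

```python
# Illustrative only: average the per-window peak acceleration of a repetitive
# movement and compare it to the profile's target value within a tolerance band.

def average_peak_acceleration(samples, window_size):
    """samples: acceleration magnitudes along one translational axis (g);
    window_size: samples per window (e.g., roughly one repetition period)."""
    peaks = [
        max(samples[i:i + window_size])
        for i in range(0, len(samples) - window_size + 1, window_size)
    ]
    return sum(peaks) / len(peaks) if peaks else 0.0

def conforms_to_target(avg_peak, target_peak, tolerance=0.15):
    """True when the averaged peak lies within a hypothetical ±15% band of the target."""
    return abs(avg_peak - target_peak) <= tolerance * target_peak
```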
The level of conformance of the motion of the user 18 to the predetermined motion profile may be identified in response to the average peak acceleration matching the target acceleration defined by the motion profile. In cases wherein the average peak acceleration is less than or greater than the target, the controller 50 may control the system 10 to output an instruction to increase or decrease the intensity of the motion. Additionally, the controller 50 may calculate the percent difference between the target acceleration and the acceleration reported by the monitoring device 12. The percent difference may be communicated to the user 18 via the display devices 16 in the form of an instruction (e.g., “slow down, your motion exceeds the target intensity or acceleration by X %”). Similarly, the duration of the recipe step may be modified by the controller 50 in response to the difference in the intensity of the motion of the user 18 over time compared to the target rate or intensity identified in the motion profile. For example, if the user 18 achieves an average rate of acceleration that is 20% less than the target average acceleration, the controller 50 may extend the time for the recipe step by a corresponding amount (e.g., an increase from 5 minutes to 6 minutes for a recipe step). In this way, the controller 50 of the system 10 may monitor the motion data reported by the monitoring device 12 to identify whether the motion quantitatively conforms to the motion profile associated with the recipe step.
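Purely as an illustration of the percent-difference feedback and the proportional time extension described above, the following sketch computes both quantities; the proportional-scaling rule and any rounding to whole minutes are assumptions, not requirements of the disclosure.

```python
# Illustrative only: percent difference between detected and target intensity and
# a proportional adjustment of the step duration, mirroring the 5-minute to
# 6-minute example in the text. The scaling rule is an assumption.

def percent_difference(measured, target):
    return (measured - target) / target * 100.0

def adjusted_step_duration(base_duration_s, measured_avg_accel, target_avg_accel):
    """Stretch (or shorten) the step in inverse proportion to the achieved intensity."""
    if measured_avg_accel <= 0:
        return base_duration_s
    return base_duration_s * (target_avg_accel / measured_avg_accel)

# Example: motion 20% below target extends a 300 s step to 375 s; a coarser rule
# could round to whole minutes (5 min -> 6 min) as in the example above.
print(percent_difference(0.8, 1.0))             # -20.0
print(adjusted_step_duration(300.0, 0.8, 1.0))  # 375.0
```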
In addition to the quantitative assessment, the controller 50 may monitor the motion data from the monitoring device 12 to qualitatively assess the motion data relative to the motion profile. Continuing with the same example, the whipping motion may be defined as a linear stirring motion, a rotational stirring motion, or a beating motion. In order to qualitatively assess the movement of the user 18 during the recipe step, the controller 50 may compare the motion data to the motion profile based on the variations of the linear or rotational motion. For example, a linear stirring motion may be defined by the motion profile of the recipe as extending along a line or arc, which may correspond to acceleration data that occurs repetitively along a line or vector measured among the translational axes 58a. Accordingly, the controller 50 may monitor the motion data detected by the monitoring device 12 to determine the vector along which the acceleration occurs and the conformance to that vector over time. More specifically, the controller 50 may detect a recurring, alternating direction of the linear acceleration of the user 18 to identify conformance with a linear path for the linear stirring motion. The conformance of the acceleration to the linear path may similarly be communicated by the system 10 to the user 18 via the display devices 16. Conversely, movement that does not conform to the linear path may vary among multiple axes or may include components of rotational acceleration that are not indicative of a linear stirring motion.
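A simplified version of such a linear-path conformance check is sketched below: it estimates a dominant direction, measures how much of the signal energy lies along that direction, and counts back-and-forth sign alternations. The direction estimate and the numeric criteria are illustrative assumptions (a more rigorous implementation might use principal component analysis).

```python
# Illustrative only: simplified conformance check for a linear stirring motion.
# Projects acceleration onto an estimated dominant direction, checks how much of
# the signal energy lies along that line, and counts back-and-forth alternations.
import math

def linear_path_conformance(accel_xyz):
    """accel_xyz: list of (ax, ay, az) tuples for one analysis window."""
    # Rough dominant direction from the mean absolute component on each axis.
    weights = [sum(abs(sample[i]) for sample in accel_xyz) for i in range(3)]
    norm = math.sqrt(sum(w * w for w in weights)) or 1.0
    direction = [w / norm for w in weights]

    # Fraction of total energy explained by the projection onto that direction.
    proj = [sum(sample[i] * direction[i] for i in range(3)) for sample in accel_xyz]
    energy_total = sum(sum(c * c for c in sample) for sample in accel_xyz) or 1.0
    linearity = sum(p * p for p in proj) / energy_total

    # Recurring, alternating direction of the projected acceleration indicates
    # back-and-forth strokes along the line.
    alternations = sum(1 for a, b in zip(proj, proj[1:]) if a * b < 0)

    return linearity > 0.85 and alternations >= 4  # hypothetical criteria
```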
The detection of such non-conforming movements may be identified by the controller 50 and communicated to the user 18 to correct the movement associated with the current recipe step. For example, in response to the motion data failing to conform to a linear path, the controller 50 may output instructions to the user 18 via the display devices 16 instructing the user to, “Adjust motion to a linear path. The current motion includes more circular or rotational motion than desired.” In addition to the text or audible instructions, detailed figures may be displayed demonstrating the linear path in relation to a hand of a representative user completing the movement. In some examples, an example video clip or animated graphic (e.g., GIF) of the target movement associated with the predetermined motion profile for the recipe step may be displayed on one or more of the display devices 16. In this way, the controller 50 of the system 10 may monitor the motion data to determine whether movement of the user 18 qualitatively conforms to the model technique instructed for each recipe step. As discussed throughout the disclosure, the motion profiles or predefined movements associated with each of the recipe steps may be defined based on experimental data providing ranges of motion data (average acceleration, range of acceleration, linear versus rotational motion, jerk/abruptness or rate of change of direction of the acceleration, frequency, stroke length, etc.) corresponding to the preferred or best techniques for each step of the recipe. Based on the sensor data reported by the monitoring device 12, the monitoring system may provide feedback to the user in a number of ways to assist the user 18 in conforming to the recipe steps.
The monitoring of the sensor data captured by the monitoring device 12 may be applied in a variety of ways as discussed in the following exemplary use cases. For example, the sensor data may be applied by the system 10 to provide feedback to the user 18, track the progress of completion of each recipe step, and determine or estimate a quality of the finished food product produced with the recipe based on the performance of the user 18 relative to the predefined movements and steps of the recipe. In this way, the system 10 may provide tracking and feedback for each step of the recipe as well as make qualitative assessments of the movement identified in the sensor data. Based on the motion or sensor data, the system 10 may be operable to distinguish effective movements that correspond to an ideal or model food preparation technique (e.g., stirring, whipping, kneading, etc.) from movements that are not associated with the preparation techniques and motion profile associated with the pending step of the recipe. Based on this comparative assessment, the system 10 may monitor the rate of completion or time necessary to complete each step of the recipe and provide instructions as to how the technique of the user 18 can be improved to conform to the motion profile.
The quality assessment may further be applied by the controller 50 to adjust instructions or automated steps associated with one or more of the appliances 30. For example, based on the performance of the user 18 relative to the target or predefined motion profiles, the system 10 may estimate a quality of the resulting food product (e.g., a product that is less airy or fluffy as a result of poor whipping form). The quality assessment may further be evaluated to adjust recipe instructions or settings of one or more of the appliances 30. In some cases, the quality assessment may be implemented to adjust the temperature of a refrigerator 30d for storage or a baking temperature of the oven 30c in order to optimize the resulting quality of the food product. Accordingly, the disclosure provides for a wide range of corrective instructions and tracking operations that are applied by the system 10 to improve the interactive operation with the food preparation application.
As previously discussed, the movement detected via the monitoring device 12 during the first step of the recipe may be tracked by the controller 50 to detect whether the accumulated motion that conforms to the motion profile for the food preparation motion of the first step is sufficient to indicate that the first step is complete. For example, if the first step is a whipping step, the controller 50 may track the detected movement associated with an effective whipping technique based on the predefined motion profile to determine a rate at which the whipping is completed. Based on the determination, a duration of the whipping step may be increased or decreased. Once the accumulated motion associated with effective whipping is determined by the controller 50 to be complete, the controller 50 may identify the completion of the first step of the recipe and move forward to a second step of the recipe. Upon completion of the first step, the controller 50 may output an instruction identifying the second step to the user 18 to indicate the next step required to complete the cooking operation as defined by the recipe.
In addition to providing instructions for manual user operations, the system 10 may similarly be applied to control one or more appliance(s) 30 to complete various preheating, temperature adjustment, or, more generally, setting adjustments required to complete one or more of the steps of the recipe. For example, one or more of the steps controlled by the system 10 may communicate a control instruction to one or more appliance(s) 30. In this regard, the communication interface(s) 40 of the system 10 may provide for communication between the controller 50 and the one or more appliance(s) 30. In some cases, the controller 50 may adjust or control an oven 30c or a burner setting of a range 30b. For example, based on a recipe step, the controller 50 may automatically set the temperatures and cooking cycles needed for cooking the recipe with a cooktop 30a, freestanding range 30b, oven 30c, or similar heating devices or appliances. Similarly, the controller 50 may communicate a temperature setting to a refrigerator 30d, which may adjust the temperature of the refrigerator 30d to support a pending cooling action required for the recipe. In addition to or as a trigger to select the step associated with the automatic control of the appliance(s) 30, the controller 50 may monitor the sensor data for movements of the user 18 indicative of one or more gestures (e.g., a page turn motion, pointing motion, a swipe, a twist of the hand or wrist, etc.) to activate the automated control of the appliance(s) 30. Accordingly, the motion data may be monitored by the controller 50 of the system 10 to determine the completion of a step in the recipe and/or the system may monitor the sensor data for various cues or gestures to trigger the activation of one or more recipe steps.
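One way such an appliance control instruction might be packaged is sketched below. The message format, field names, and identifiers (e.g., "oven_30c") are hypothetical; the disclosure does not define a particular appliance protocol, so this is only an assumption-laden illustration of forwarding a recipe step's operational settings over a generic interface.

```python
# Illustrative only: a recipe step carries operational settings that the
# controller forwards to appliances over a generic communication interface.
# The message format and identifiers are hypothetical.
import json

def build_appliance_command(appliance_id, setting, value, unit):
    """Package an operational setting (e.g., an oven temperature) for transmission."""
    return json.dumps({
        "appliance": appliance_id,   # e.g., "oven_30c" (placeholder identifier)
        "setting": setting,          # e.g., "bake_temperature"
        "value": value,
        "unit": unit,
    })

def send_step_settings(step, transmit):
    """Send every appliance setting attached to a recipe step.

    step: dict with an optional "appliance_settings" list
    transmit: callable that delivers the serialized command (e.g., over Wi-Fi)
    """
    for s in step.get("appliance_settings", []):
        transmit(build_appliance_command(s["appliance"], s["setting"], s["value"], s["unit"]))

# Example usage with a stand-in transmit function that simply prints the payload.
send_step_settings(
    {"appliance_settings": [
        {"appliance": "oven_30c", "setting": "bake_temperature", "value": 175, "unit": "C"},
        {"appliance": "refrigerator_30d", "setting": "target_temperature", "value": 3, "unit": "C"},
    ]},
    transmit=print,
)
```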
As shown in
Referring to
In some cases, the controller 50 or a control module/processor in communication with the controller 50 may be incorporated in one or more of the appliance(s) 30. In such cases, the operation (e.g., heating, cooling, control settings, etc.) of the appliance(s) 30 may be controlled by the controller 50, and notifications from the system 10 may be communicated via one or more user interfaces (e.g., digital displays, audible indicators, etc.) of the appliance(s) 30. In this configuration, the appliance(s) 30 may be controlled by the system 10 to output instructions, notifications, or automated controls associated with the recipes of the food preparation application. Accordingly, the monitoring system 10 may be configured to control or receive motion data and inputs from a variety of sensors 22, IMUs 22a, appliance(s) 30, and handheld implements 15 to support the food preparation operation. In cases where the system 10 implements various connected devices 52 as demonstrated in
Still referring to
In various examples, the first communication interface 40a may correspond to a first communication protocol that communicatively couples with the second communication interface 40b via one or more of the connected devices 52. In such implementations, the first communication interface 40a may utilize Bluetooth®, Bluetooth® Low Energy (BLE), Thread, Ultra-Wideband, Z-Wave®, ZigBee®, or similar communication protocols. The second communication interface 40b may correspond to a different wireless communication protocol than the first communication interface 40a including, but not limited to, global system for mobile communication (GSM), general packet radio services (GPRS), code division multiple access (CDMA), enhanced data GSM environment (EDGE), fourth-generation (4G) wireless, fifth-generation (5G) wireless, Wi-Fi, world interoperability for microwave access (WiMAX), local area network (LAN), Ethernet®, etc. Though discussed as implementing different wireless communication protocols, the first and second communication interfaces 40a, 40b may alternatively be implemented as a single, common communication interface and protocol.
As discussed herein, the IMU 22a may include a plurality of inertial sensors configured to monitor the orientation and inertial states of the monitoring device 12 in a coordinate system 58 comprising a plurality of translational axes 58a and rotational axes 58b. In this configuration, the IMU 22a may be operable to measure and communicate a direction and magnitude of movement along and/or about each of the axes 58a/58b and communicate such information to the processing unit of the monitoring device 12. In some embodiments, the IMU 22a may further comprise a magnetometer or magnetic field detector. The magnetometer may correspond to one or more magnetometers. Magnetometers may be configured to detect magnetic fields and may be utilized to orient the monitoring device 12 to provide a bearing or azimuthal direction of a magnetic pole relative to the monitoring device 12. In this configuration, a magnetometer may be operable to measure the azimuthal direction of the monitoring device 12 and communicate the directional data to the processing unit.
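As a purely illustrative aside, one compact way to represent a single report from such an IMU, combining the translational axes, the rotational axes, and an optional magnetometer heading, is sketched below; the field names and units are assumptions rather than a format defined by the disclosure.

```python
# Illustrative only: a minimal container for one IMU report with three
# translational axes, three rotational axes, and an optional magnetometer
# heading. Units and field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImuSample:
    ax: float  # linear acceleration, x axis (g)
    ay: float  # linear acceleration, y axis (g)
    az: float  # linear acceleration, z axis (g)
    gx: float  # angular rate about x (deg/s)
    gy: float  # angular rate about y (deg/s)
    gz: float  # angular rate about z (deg/s)
    heading_deg: Optional[float] = None  # azimuth from magnetometer, if present
    timestamp_s: float = 0.0
```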
As discussed herein, the IMU 22a of the monitoring device 12 may correspond to various forms of accelerometers, gyroscopes, and/or magnetometers, which may be implemented separately or as integrated circuits. As non-limiting examples, the IMU 22a may correspond to accelerometers, gyroscopes, and magnetometers based on Micro-Electro-Mechanical Systems (MEMS) technology, Fiber Optic Gyros (FOG), and/or Ring Laser Gyros (RLG). Some exemplary devices that may be suitable for at least some applications as discussed herein may include devices from a variety of manufacturers including: Analog Devices, Inc.; ST Microelectronics, N.V.; Kionix, Inc.; etc. It will be appreciated that IMUs 22a may vary in quality, grade, performance, and/or durability across manufacturers and product lines.
As further discussed in reference to
Referring now to
Though various exemplary applications are discussed, the monitoring system 10 may generally implement the monitoring device 12 in a food preparation environment to provide an interactive cooking experience via one or more computerized display devices 16 (e.g. tablet, computer, smartphone) in communication with the monitoring device 12. As elaborated in various detailed examples, the system 10 provides feedback to a user 18 as well as control of one or more appliance(s) 30. In this way, the system 10 may provide for an assisted cooking experience that not only improves the resulting food products but also instructs the user 18 to improve cooking and food preparation techniques through interactive instructions.
Referring now to
Once the user 18 has received the instruction 70, the controller 50 may activate the monitoring device 12 to compare the movement of the user 18 detected with the IMU 22a to a predefined motion profile. As previously discussed, in order to monitor a manual food preparation step, such as whipping the eggs 72, the controller 50 of the system 10 may compare the sensor or motion data recorded by the IMU 22a to the motion profile accessed via the remote database 54. The motion profile may comprise acceleration data ranges as well as periodic cycles of the translational and/or rotational motion data for each of the axes 58a/58b that correspond to a targeted whipping operation. If the movement of the user 18 conforms to the motion profile, the food preparation application may respond by confirming that the motion detected by the IMU 22a conforms to the proper technique for the whipping operation. If the motion detected by the IMU 22a differs, the controller 50 may communicate an instruction 70 to the user 18 to adjust speed, adjust a length or direction of stroke, or adjust a technique (e.g., a wrist rotation). As shown in
Based on the instructions 70, including audible cues, visual cues and tips, and tactile feedback, the user 18 may continue to complete each step of the recipe as guided by the food preparation application. Throughout the various steps of the recipe, the controller 50 of the system 10 may identify various user inputs to various connected devices 52 (e.g., the monitoring device 12, the display device 16, an appliance 30, etc.). The controller 50 may selectively increment through each step of a recipe and access future steps and corresponding data (e.g., motion profiles, instructions, etc.) via the remote database 54. In the specific example of whipping the eggs 72, the controller 50 of the system 10 may identify that the whipping process is complete based on the sensor data reported from the IMU 22a. For example, the controller 50 may monitor the sensor or motion data identified by the monitoring device 12 and compare the sensor data to the motion profile for the whipping movement. Based on the sensor data, the controller 50 may monitor an accumulation of the food preparation movement (e.g., motion data that matches the whipping motion profile) and determine a completion of the first step of the recipe in response to the accumulation of the food preparation movement meeting or exceeding a predefined accumulated motion total associated with the whipping step.
Once the movement total meets or exceeds the motion total indicated for the recipe, the controller 50 may prompt the user 18 to activate the next step via the display device 16 or the monitoring device 12 or automatically increment the recipe to the next step in the food preparation application. As previously discussed, the user input may be received via a touchscreen or button of the user interface of the display device 16, the monitoring device 12, or other connected devices 52. Alternatively, the controller 50 may be configured to detect a gesture (e.g. a page turn motion, pointing motion, a swipe, etc.) to indicate that the recipe should continue to the next step. In this way, the system 10 may provide for automatic and/or intuitive operation of the food preparation program by the user 18.
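As a non-limiting illustration of such gesture-based step advancement, the sketch below flags a swipe-like spike on one axis and, when detected, increments the recipe index; the thresholds and the single-axis heuristic are hypothetical assumptions rather than a gesture recognizer defined by the disclosure.

```python
# Illustrative only: a crude swipe detector that advances the recipe when a
# single pronounced acceleration spike is seen along one axis with little
# activity on the others. Thresholds are hypothetical placeholders.

def detect_swipe(ax_window, ay_window, az_window, spike_g=1.5, quiet_g=0.4):
    """Return True when the window looks like a one-axis swipe gesture."""
    peak_x = max(abs(a) for a in ax_window)
    peak_other = max(max(abs(a) for a in ay_window), max(abs(a) for a in az_window))
    return peak_x >= spike_g and peak_other <= quiet_g

def maybe_advance_step(recipe_steps, current_index, gesture_detected):
    """Increment the recipe step on a detected gesture, clamped to the last step."""
    if gesture_detected and current_index < len(recipe_steps) - 1:
        return current_index + 1
    return current_index
```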
Referring now to
In addition to communicating the operating instructions or control commands to the handheld implement 15 (i.e., the handheld appliance in this example), the controller 50 may monitor the sensor data captured by the implement tracking device 12b to determine whether the device is activated (e.g., by vibrations identified via the IMU 22a) and to compare the movement of the user 18 to a motion profile for the corresponding blending step of the recipe. Similar to the manual whipping operation discussed previously, the controller 50 may compare the sensor data to the motion profile for the blending movement. Based on the sensor data, the controller 50 may monitor an accumulation of the food preparation movement (e.g., motion data that matches the blending motion profile) and determine a completion of the blending step of the recipe in response to the accumulation of the food preparation movement meeting or exceeding a predefined accumulated motion total associated with the blending step. In this way, the activation of the appliance may be detected or controlled by the monitoring device 12, and the movement of the user 18 may be monitored. As previously discussed, the movement of the user 18 may be compared by the controller 50 to the motion profile in order to provide corrective instructions and detect an accumulated or effective amount of blending applied. Once the detected blending level meets or exceeds a total defined for the step of the recipe, the controller 50 may determine that the blending step is completed and move forward to the next step in the recipe and/or provide the user 18 with instructions to move to the next step in the recipe.
In some cases, the sensors (e.g., the IMU 22a) of the monitoring device 12 may be configured to detect a quality of a substance or the mixture 80 being blended or mixed by the handheld implement 15. For example, when coupled to the handheld implement 15, the vibrations measured by the IMU 22a of the monitoring device 12 may differ significantly when the blender or mixer initially mixes ingredients as compared to the vibrations measured once the ingredients have been mixed together into a homogenous or evenly distributed mixture. That is, the initial mixing of various batters or mixtures for food preparation may include drastic variations in vibrations associated with the operation of churning blades or mixing heads as a result of passing through substances that are unevenly distributed or unevenly mixed. By comparison, once the mixture 80 of ingredients reaches an even consistency or evenly mixed combination, the variations in vibration diminish significantly. For example, thoroughly mixed substances generally provide for consistent vibrations when compared to ingredients that are unevenly distributed. Similarly, the vibrations associated with substances that change states (e.g., melting into a liquid or fluffing from a liquid during whipping) also result in changes in vibrational frequency and intensity that may be reported by the monitoring device 12 and detected by the controller 50. In response to the changes in the vibration or force response of the operation of the handheld implement 15 (e.g., the blender, mixer, etc.) or appliance 30, the controller 50 may detect a state of the substance being processed. Based on the response, the controller 50 may detect whether the blending or mixing operation is complete.
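The variance-settling behavior described above might be detected, for example, with a simple windowed-variance check as sketched below; the window sizes, the variance threshold, and the stability criterion are hypothetical assumptions for illustration only.

```python
# Illustrative only: track the variance of the vibration signal in successive
# windows and declare the mixture evenly blended once that variance stays below
# a threshold for several consecutive windows. All numbers are hypothetical.
import statistics

def mixture_is_homogeneous(vibration_windows, variance_threshold=0.05, stable_windows=3):
    """vibration_windows: list of lists of vibration samples (e.g., |accel| in g)."""
    consecutive_stable = 0
    for window in vibration_windows:
        if statistics.pvariance(window) < variance_threshold:
            consecutive_stable += 1
            if consecutive_stable >= stable_windows:
                return True   # vibrations have settled; blending appears complete
        else:
            consecutive_stable = 0
    return False
```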
Additionally, in some cases, the controller 50 may detect whether the consistency of the mixture 80 is too thin or too thick. For example, if a mixture does not include adequate moisture, the intensity of the vibrations reported by the monitoring device 12 may not decrease over time at an expected rate. More specifically, if the variation in vibrational intensity does not decrease at a predetermined rate associated with the recipe (e.g., the intended consistency of the mixture 80) and the mixing utility (e.g., the handheld implement 15), the controller 50 may infer that the mixture is too thick. In response to such a determination, the controller 50 may output an instruction 82 to the user 18 indicating that more moisture (e.g., milk, water, oil, etc.) should be added to the recipe. Additionally or alternatively, the controller 50 may include in the instruction 82 that the user should review the ingredients added and the previous recipe steps to ensure that ingredients were added in appropriate proportions as instructed. In this way, the sensor data detected in relation to the operation of the handheld implement 15 or appliance 30 may be processed by the controller 50 to determine a quality of the mixture 80 or food product associated with or processed in the current step of the recipe. Though discussed in reference to the mixture being too thick, it shall be understood that the same attributes of the vibrations reported by the monitoring device 12 could conversely be applied to identify whether too much liquid is included in the mixture 80.
Referring now to
The quality assessment may further be applied by the controller 50 to adjust instructions or automated steps associated with one or more of the appliances 30. For example, the quality assessment may further be evaluated to adjust recipe instructions or settings of one or more of the appliances 30. In some cases, the quality assessment may be implemented to adjust the temperature of a refrigerator for storage or a baking temperature of the oven 30c in order to optimize the resulting quality of the food product. For example, in response to a dough being processed or kneaded greater than or less than a target level, the controller 50 may adjust a recipe instruction or automated command to the oven 30c to increase or decrease a baking temperature or time. That is, the controller 50 may infer the consistency of the dough based on the sensor data reported by the monitoring device 12 and adjust the baking time or baking temperature to bake the dough for a different duration and/or temperature identified according to the consistency. In this way, the sensor data may be applied by the controller 50 to adjust the recipe instructions and/or automated commands for the appliances 30 to optimize the resulting quality of the food product.
Still referring to
Referring now to
In the example shown in
Referring now to
Once the recipe is selected or accessed in step 104, the recipe may be communicated to the user 18 by the system 10 (106). As discussed in various examples, the recipe instruction may be communicated to the user 18 via the display device 16, the monitoring device 12, or other connected devices 52. Once the instruction is communicated, the routine 100 may continue to activate the monitoring device 12 to capture sensor data indicative of the movement of the user 18 (108). Based on the recipe instruction, the controller 50 may access a predefined motion profile identifying a target range of multi-axis translational and rotational acceleration associated with the pending step of the recipe. More specifically, the controller 50 may access a first motion profile defining the food preparation movement based on the first step of the recipe. The step and associated information may be stored in a local memory or a remote database 54.
Based on the sensor data, the controller 50 may determine if the sensor data indicates that the motion of the user 18 is within a target range (110). If the motion is not within the target range, the controller 50 may update or output a control instruction from the display device 16, the monitoring device 12, or various connected devices 52 to provide a corrective motion instruction to the user 18 (112). Based on the response of the user 18 to the corrective motion instruction as detected from the sensor data, the controller 50 may update the recipe instruction. For example, the controller 50 may increase a time for a whipping instruction, instruct additional rolling or kneading, or provide updated technique or movement instructions based on the motion identified by the monitoring device 12 (114). Following the updating of the recipe instruction, the routine 100 may return to step 108 to monitor the sensor data.
In step 110, if the motion is within the target range, the controller 50 may monitor the time or an accumulation of a food preparation movement (e.g., motion data that matches the first motion profile) within the target range (116). Based on the accumulated motion identified from the sensor data, the controller 50 may determine if the accumulated motion meets or exceeds a target accumulated motion for the recipe step to determine a recipe step completion (118). If the total motion within the target range is less than the target accumulated motion for the recipe step, the controller 50 may update the recipe instruction to indicate a remaining time, steps, or modified techniques necessary to complete the recipe step and return to step 116 to monitor the sensor data (120). If the accumulated motion is greater than or equal to the total for the recipe instruction in step 118, the controller 50 may continue to determine if the recipe is complete or if the recipe includes further instructions (122). If the recipe includes further instructions, the routine 100 may move to the next recipe step (124). If the recipe step analyzed in step 122 is the final step, the controller 50 may end the routine for the food preparation application (126). Though described in the routine 100 as an automated recipe step initiation, the user 18 may be prompted or otherwise use one of the connected devices 52 to manually select a recipe step as discussed in various examples.
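The overall flow of routine 100 may be summarized, purely for illustration, by the loop sketched below; the helper callables (read_window, within_target, issue_instruction) are hypothetical stand-ins for the monitoring, comparison, and feedback operations described above rather than functions defined by the disclosure.

```python
# Illustrative only: a compact control loop paralleling routine 100 above
# (monitor motion -> corrective instruction or accumulate -> step complete ->
# next step). Helper callables are hypothetical stand-ins.

def run_recipe(recipe_steps, read_window, within_target, issue_instruction, sample_window_s=1.0):
    """recipe_steps: list of dicts, each with a 'name' and a 'required_motion_s' total."""
    for step in recipe_steps:                        # steps 122/124: iterate the recipe
        accumulated = 0.0
        issue_instruction(f"Begin: {step['name']}")  # step 106: communicate the instruction
        while accumulated < step["required_motion_s"]:
            window = read_window()                   # step 108: capture sensor data
            if within_target(window, step):          # step 110: compare to the motion profile
                accumulated += sample_window_s       # step 116: accumulate conforming motion
            else:
                issue_instruction(                   # steps 112/114: corrective feedback
                    f"Adjust technique for {step['name']}"
                )
        issue_instruction(f"Completed: {step['name']}")  # step 118 satisfied
    issue_instruction("Recipe complete")             # step 126: end of the routine
```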
Referring now to
The memory 134 may correspond to a dedicated memory (e.g., RAM, ROM, Flash, etc.) and may further include a non-transient, computer-readable medium, such as electronic memory or other storage medium such as a hard disk, optical disk, flash drive, etc., for use with the processing unit 132. For example, an operating system may be stored in the storage medium and provide for the monitoring device 12 to control and manage system resources, enable functions of application software (e.g., the food preparation application), and interface application programs with other software and functions of the monitoring device 12. The I/O interface 136 may correspond to a variety of interfaces configured to provide communication to and from the processing unit 132. For example, the I/O interface 136 may correspond to one or more of a variety of I/O interfaces including, but not limited to, PCIe ports, SATA interface ports, Intel® QuickPath Interconnect® (QPI) ports, USB 2.0, USB 3.0, or Thunderbolt™ interfaces (10 Gbps), FireWire, FireWire 800, Express Card, serial memory interfaces, etc.
As previously discussed, the functionality or extent of operating features of the monitoring device 12 may vary for different applications. In some cases, the monitoring device 12 may correspond to a wearable smart device (e.g., smart watch), which may include a display screen 140, speaker 142, microphone 144, and/or a user interface 146 (e.g., pushbuttons or touchscreen of display screen 140). Accordingly, the monitoring device 12 may be implemented by the system 10 to varying extents based on the features available in the monitoring device 12. That is, in some instances, the monitoring device 12 may passively detect and communicate the sensor data to the system 10, while in other cases, the monitoring device 12 may additionally provide one or more audible, visual, and/or tactile alerts as well as receive user input to control the operation or provide feedback to the food preparation application operated on the system 10 via the user interface 146. Accordingly, the monitoring device 12 may be flexibly implemented in the system 10 to suit a desired application.
In various applications, the monitoring device 12 may comprise a wireless communication circuit 150. The communication circuit 150 provides the operating functionality of the communication interface 40 as previously discussed. That is, the wireless communication circuit 150 may provide for communication via the first communication interface 40a and/or the second communication interface 40b via one or more circuits. The first communication interface 40a may provide for a first communication protocol including, but not limited to, Bluetooth®, Bluetooth® Low Energy (BLE), Thread, Ultra-Wideband, Z-Wave®, ZigBee®, or similar communication protocols. The second communication interface 40b may provide for a second communication protocol including, but not limited to, global system for mobile communication (GSM), general packet radio services (GPRS), code division multiple access (CDMA), enhanced data GSM environment (EDGE), fourth-generation (4G) wireless, fifth-generation (5G) wireless, Wi-Fi, world interoperability for microwave access (WiMAX), local area network (LAN), Ethernet®, etc. Though discussed as implementing different wireless communication protocols, the first and second communication interfaces 40a, 40b may alternatively be implemented as a single, common interface and protocol.
Accordingly, the systems and methods discussed herein provide for the monitoring of the sensor data captured by a monitoring device, which may be applied in a variety of ways. In various cases, the sensor data may track the progress of steps of a recipe completed by a user as well as track the movement or vibrational feedback of an appliance or handheld implement used to prepare a food product. In this way, the system may provide feedback to the user, track the progress of completion of each recipe step, and/or determine or estimate a quality of the finished food product produced with the recipe based on the performance of the user relative to the predefined movements and steps of the recipe. The system may thus provide tracking and feedback for each step of the recipe as well as make qualitative assessments of the movement identified in the sensor data. Accordingly, the disclosure may be applied in a variety of ways to assess and improve a food preparation or cooking operation based on the sensor data and methods discussed herein.
As discussed herein, the controllers or processors of the disclosed system may correspond to devices that perform computer-implemented processes and apparatuses for practicing those processes. Implementations also may be embodied in the form of a computer program product having computer program code containing instructions embodied in non-transitory and/or tangible media, such as hard drives, USB (universal serial bus) drives, or any other machine-readable storage medium, where the computer program code is loaded into and executed by a computer, which provides for the configuration of a special-purpose control device, controller, or computer that implements the disclosed subject matter. Implementations also may be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer/controller, or transmitted over some transmission medium or communication interface, wherein, when the computer program code is loaded into and executed by a controller, the controller becomes an apparatus for practicing implementations of the disclosed subject matter. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits. Implementations of the controller may be implemented using hardware that may include a processor, for example, circuits such as an ASIC (Application Specific Integrated Circuit), portions or circuits of individual processor cores, entire processor cores, individual processors, programmable hardware devices such as field programmable gate arrays (FPGAs), and/or larger portions of systems that include multiple processors or multiple controllers that may operate in coordination.
In some implementations, the disclosure provides for a motion analysis apparatus for food preparation comprising at least one motion detection device configured to capture motion data, wherein the motion detection device comprises at least one of a linear acceleration sensor and an angular rate sensor; and a controller in communication with the motion detection device, wherein the controller is configured to: compare the motion data to a food preparation movement; and initiate a motion instruction for a first step of a recipe in response to a comparison of the motion data to the food preparation movement.
In various implementations, the systems and methods described in the application may comprise one or more of the following features or steps alone or in combination:
In some implementations, a method for providing a food preparation instruction comprising: accessing a recipe in a food preparation application; capturing motion data in response to a first step of the recipe; comparing the motion data to a food preparation movement defined by a first motion profile for the first step of the recipe; and outputting a motion instruction for the first step in response to a comparison of the motion data to the food preparation movement.
In various implementations, the systems and methods described in the application may comprise one or more of the following features or steps alone or in combination:
In some implementations, a motion analysis apparatus for a handheld appliance comprising: at least one motion detection device configured to capture motion data in connection with the handheld appliance; and a controller in communication with the motion detection device, wherein the controller is configured to: access a recipe from a food preparation application for a food product; access a motion profile defining a food preparation movement based on a step of the recipe; receive an operational setting configured to control the handheld appliance based on the step of the recipe; compare the motion data based on movement of the handheld appliance to the food preparation movement; and initiate a motion instruction for the step of the recipe in response to the comparison of the motion data to the food preparation movement.
In various implementations, the systems and methods described in the application may comprise one or more of the following features or steps alone or in combination:
It will be understood by one having ordinary skill in the art that construction of the described disclosure and other components is not limited to any specific material. Other exemplary embodiments of the disclosure disclosed herein may be formed from a wide variety of materials, unless described otherwise herein.
For purposes of this disclosure, the term “coupled” (in all of its forms, couple, coupling, coupled, etc.) generally means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components. Such joining may be permanent in nature or may be removable or releasable in nature unless otherwise stated.
It is also important to note that the construction and arrangement of the elements of the disclosure as shown in the exemplary embodiments is illustrative only. Although only a few embodiments of the present innovations have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts or elements shown as multiple parts may be integrally formed, the operation of the interfaces may be reversed or otherwise varied, the length or width of the structures and/or members or connector or other elements of the system may be varied, the nature or number of adjustment positions provided between the elements may be varied. It should be noted that the elements and/or assemblies of the system may be constructed from any of a wide variety of materials that provide sufficient strength or durability, in any of a wide variety of colors, textures, and combinations. Accordingly, all such modifications are intended to be included within the scope of the present innovations. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the desired and other exemplary embodiments without departing from the spirit of the present innovations.
It will be understood that any described processes or steps within described processes may be combined with other disclosed processes or steps to form structures within the scope of the present disclosure. The exemplary structures and processes disclosed herein are for illustrative purposes and are not to be construed as limiting.
This application claims priority under 35 U.S.C. § 119(e) and the benefit of U.S. Provisional Application No. 63/262,819 entitled SENSOR SYSTEM AND METHOD FOR ASSISTED FOOD PREPARATION, filed on Oct. 21, 2021, by Pratyaksh Rohatgi et al., the entire disclosure of which is incorporated herein by reference.