The embodiments herein relate to diet monitoring systems and more particularly to a personalized diet monitoring system for a user. The present application is based on, and claims priority from, Indian Application Number 5637/CHE/2013 filed on 6 Dec. 2013, the disclosure of which is hereby incorporated by reference herein.
Generally, health problems are often caused by eating excessive quantities of food. For example, by consuming too many calories, people become obese. Similarly, by consuming excessive quantities of saturated fats, cholesterol levels in the body increase. Other health problems are caused by insufficient fiber content in the diet. Hence, people's diets should be monitored constantly, and people should be reminded of their food intake details. For example, people with a chronic disease such as diabetes must be reminded to take food at regular intervals. Currently, existing fitness and food consumption tracking applications require people to remember and manually enter the details of the food into the application for every meal.
Some existing systems measure the food consumption information of a user by using different devices associated with the user's articles. For example, some existing systems measure food consumption by using cameras associated with wearable devices to take pictures of the food being consumed by the user. However, in such systems, the user must manually trigger the devices to provide the input pictures of the food being consumed. In other existing systems, the imaging devices must be focused towards a food source manually. As a result, human intervention is required every time the user consumes food, which is a cumbersome process.
Thus, there remains a need for a system and method for automatically capturing the food consumption information of a user.
The principal object of the embodiments herein is to provide a system and method for capturing food consumption information of a user by automatically triggering one or more input means. The input means can be automatically triggered by detecting one or more food consumption actions of the user.
Another object of the embodiments herein is to automatically trigger one or more imaging members to capture a plurality of pictures of the food being consumed by the user, when a food consumption action is detected.
Yet another object of the embodiments herein is to automatically trigger one or more voice input means to capture voice data relating to the food being consumed by the user, when a food consumption action is detected.
Yet another object of the embodiments herein is to automatically trigger one or more scanning members to capture code data relating to the food being consumed by the user, when a food consumption action is detected.
Yet another object of the embodiments herein is to automatically identify information relating to the food being consumed by a user, based on the user's history information and the user's personal preferences, when a food consumption action is detected.
Yet another object of the embodiments herein is to generate one or more recommendations relating to the food being consumed by a user.
Accordingly, the embodiments herein provide a method for capturing food consumption information of a subject. The method comprises detecting one or more food consumption actions of the subject. If a food consumption action is detected, the method further comprises automatically triggering one or more input means to capture information relating to the food being consumed by the subject.
Accordingly, the invention provides an electronic device for capturing food consumption information of a subject, the electronic device comprising an integrated circuit. Further, the integrated circuit comprises a processor and a memory. The memory comprises computer program code within the integrated circuit. The memory and the computer program code, with the processor, cause the device to detect one or more food consumption actions of the subject. If a food consumption action is detected, the electronic device is further configured to automatically trigger one or more input means to capture information relating to the food being consumed by the subject.
Accordingly, the invention provides a computer program product comprising computer executable program code recorded on a computer readable non-transitory storage medium, the computer executable program code, when executed, causing actions including detecting one or more food consumption actions of a subject. If a food consumption action is detected, the computer executable program code, when executed, causes further actions including automatically triggering one or more input means to capture information relating to the food being consumed by the subject.
These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The embodiments herein achieve a method and system for capturing food consumption information of a subject. The subject can be a user whose diet is to be monitored. In an embodiment, the method includes automatically triggering one or more input means to capture information relating to the food being consumed by the subject (user). In an embodiment, the food includes, but is not limited to, solid food, liquid nourishment (such as beverages), medicine, and water. The input means can be automatically triggered by detecting one or more food consumption actions of the user. For example, the input means can be, but is not limited to, an imaging member, a voice input means, the user's historic information, the user's personalized preferences, a scanning member, and so on. In an embodiment, the input means can be wearable or non-wearable members associated with the user's articles. Further, the method includes generating one or more recommendations relating to the captured food information. Furthermore, the method includes providing the generated recommendations to the user and/or to a guardian of the user.
Unlike conventional systems, the disclosed method and system do not require any manual intervention and can automatically trigger one or more input means to capture the food consumption information of the user. The input means can be, but is not limited to, an imaging member, a scanning member, and a voice recognition member. In existing systems, only an imaging member is used to capture the food consumption of the user, whereas the proposed method uses different input means to capture the food consumption information. Further, in the proposed method, a plurality of input means work together to detect the food consumption information. Thus, the proposed method enhances the user experience while consuming the food.
Referring now to the drawings, and more particularly to
The display member 101 can be configured to allow the user to provide user's personalized food data. Further, the display member 101 can be configured to provide the recommendations relating to the food being consumed by the user. For example, the recommendation output can be audio, visual, text, voice, photo, light, vibration, ring tone or essentially any other type of output.
The imaging member 102 captures one or more pictures of the food being consumed by the user. For example, the imaging member 102 can be a camera. In an embodiment, the imaging member 102 automatically captures the pictures of the food located in the vicinity of the user.
The scanning member 103 scans the code (for example, an RFID tag or a bar code) available on the food material being consumed by the user. In an embodiment, the scanning member 103 automatically scans the code available on the food located in the vicinity of the user. The food located in the vicinity of the user can be the food items consumed by the user daily. The voice recognition member 104 captures the voice input relating to the food consumed by the user.
In an embodiment, the controlling member 105 can be configured to automatically trigger the input means to capture information relating to the food being consumed by the user. The input means can be automatically triggered by detecting one or more food consumption actions of the user. For example, when the user starts consuming the food, the controlling member 105 automatically triggers an imaging member, such as a camera, to capture a plurality of pictures of the food. Further, each input means is capable of performing one or more functions such as, but not limited to, eating pattern recognition, human motion recognition, facial recognition, gesture recognition, food recognition, voice recognition, bar code recognition, and so on.
The controlling member 105 can be configured to identify information related to the food being consumed by the user. For example, if the user is consuming liquid nourishment, the controlling member 105 identifies information, such as protein, calorie, nutrient, fat, sugar, carbohydrates, and so on, related to the liquid nourishment. In an embodiment, while displaying the data, quantitative values may be associated with the constituents, and can be measured in any suitable units such as teaspoons or grams. Furthermore, the controlling member 105 can be configured to generate and provide recommendations to the user, relating to the identified food information. In an embodiment, the controlling member 105 can be configured to provide recommendation about the food located in the vicinity of the user.
Further, the controlling member 105 can be configured to generate the recommendation to the user relating to the food consumed or the food located within the vicinity of the user. In an embodiment, the controlling member 105 can be configured to dynamically switch the profile associated with the user by detecting a pattern in the food consumption actions of the user.
In an embodiment, the communication interface member 106 provides various communication channels between the electronic device 100 and other devices connected to the system. The communication channel can be a wireless channel such as, but not limited to, Bluetooth, Wi-Fi, and the like. For example, after the recommendations relating to the food being consumed by the user are generated in the electronic device 100, the controlling member 105 provides the generated recommendation to a guardian or caretaker of the user through the communication interface member 106. In an embodiment, the communication interface member 106 can be configured to provide the necessary communication channels to correlate the captured information with the food item descriptive table available in an online database.
In an embodiment, the storage member 107 stores the user's food history data and user's personalized preferences. In an embodiment, the storage member 107 stores the food item descriptive table which includes the details of all available food, for example, constituent data (such as calorie, proteins and the like), pictures, and videos of each food item.
At step 202, the method 200 includes determining whether any food consumption action is detected. The method 200 allows one or more sensors associated with the user's articles to detect the food consumption action. The different sensors include an accelerometer, an inclinometer, a motion sensor, a sound sensor, a smell sensor, a blood pressure sensor, a heart rate sensor, an EEG sensor, an ECG sensor, an EMG sensor, an electrochemical sensor, a gastric activity sensor, a GPS sensor, a location sensor, an image sensor, an optical sensor, a piezoelectric sensor, a respiration sensor, a strain gauge, an electrogoniometer, a chewing sensor, a swallow sensor, a temperature sensor, and a pressure sensor. For example, when the user performs actions such as moving a hand with a spoon towards his/her mouth; rolling of the hand, wrist, or arm; acceleration or inclination of the lower or upper arm; bending of the shoulder, elbow, wrist, or finger joints; or movement of the jaws, the sensors or imaging members associated with the user's articles, such as a wrist watch or mug, determine that a food consumption action is detected.
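By way of a non-limiting illustration, the following Python sketch shows one way such a detection could be realized from wrist-worn accelerometer samples; the sample structure, threshold values, and function names are hypothetical assumptions and not part of the disclosure.

```python
# Non-limiting sketch: detecting a hand-to-mouth (food consumption) gesture
# from wrist-worn accelerometer samples. Thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class AccelSample:
    x: float  # lateral acceleration (m/s^2)
    y: float  # forward acceleration (m/s^2)
    z: float  # vertical acceleration (m/s^2)

def is_food_consumption_action(samples: list,
                               lift_threshold: float = 3.0,
                               min_lift_samples: int = 5) -> bool:
    """Return True when a sustained upward wrist motion (e.g., a spoon
    moving toward the mouth) is observed in the sampled window."""
    lifted = sum(1 for s in samples if s.z > lift_threshold)
    return lifted >= min_lift_samples

# Example: a window in which the wrist is raised for most samples.
window = [AccelSample(0.1, 0.2, 3.5) for _ in range(6)]
print(is_food_consumption_action(window))  # True
```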
At step 203, the method 200 includes automatically triggering the input means to capture information relating to the food if the food consumption action is detected. The method 200 allows the controlling member 105 to automatically trigger one or more input means. The input means can be, but is not limited to, an imaging member (such as a camera), a voice input means, the user's historic information, the user's personalized preferences, a scanning member (such as an RFID/bar code scanner), and so on. For example, when the food consumption action is detected, the voice input means is automatically triggered and prompts the user to provide the input. Further, the voice input means captures voice commands provided by the user. In an embodiment, the method 200 allows the user to select a method of providing input from among the variety of input means available.
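As a non-limiting sketch of this triggering step, the controlling member can be modeled as a registry that invokes every registered input means once a food consumption action is detected; the class and method names below are illustrative assumptions only.

```python
# Non-limiting sketch: a controlling member that automatically triggers all
# registered input means when a food consumption action is detected.
from typing import Callable

class ControllingMember:
    def __init__(self):
        self._input_means: list = []

    def register_input_means(self, capture: Callable[[], dict]) -> None:
        """Register an input means as a zero-argument capture callback."""
        self._input_means.append(capture)

    def on_food_consumption_detected(self) -> list:
        # Trigger every registered input means and collect what it captured.
        return [capture() for capture in self._input_means]

controller = ControllingMember()
controller.register_input_means(lambda: {"source": "camera", "pictures": 3})
controller.register_input_means(lambda: {"source": "voice", "command": "apple"})
print(controller.on_food_consumption_detected())
```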
At step 204, the method 200 includes identifying the food type by correlating the captured food information with the food item descriptive table. The food item descriptive table can be configured to store all available food details, for example, the constituent data of each food (such as calories, proteins, and the like), pictures, videos, and so on. The food item descriptive table can be an online database providing the details of the food. The method 200 allows the communication interface member 106 to provide the necessary communication channels to correlate the captured information with the food item descriptive table. The database may also contain real-time user location, body mass index (BMI) history, medical history, risk factors associated with various diseases and medical conditions such as obesity and diabetes, demographic diversity, availability of food resources to the user at various times of the day, relevant epidemiological parameters, and so on. In an embodiment, the food item descriptive table can be stored in the storage member 107. In an embodiment, the food item descriptive table can be associated with a server. In an embodiment, any suitable communication channel can be used to provide communication between the electronic device 100 and the server. For example, the communication channel can be, but is not limited to, a wireless network, a wireline network, a public network such as the Internet, a private network, a general packet radio service (GPRS) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a cellular network, a public switched telephone network (PSTN), a personal area network, and the like. The method 200 allows the controlling member 105 to correlate the captured food information with the information available in the food item descriptive table. For example, the imaging member, such as a camera, captures a plurality of pictures of the food being consumed by the user, and the controlling member 105 correlates the captured food pictures with the online food item descriptive table to identify the food type. In an embodiment, the method 200 allows the electronic device 100 to identify the location of the user by using suitable techniques such as GPS or by receiving manual or voice inputs from the user. The electronic device 100 may store a record of the time and location at which the user consumes the food item for each and every occurrence. For example, the user sometimes consumes food items outdoors, such as at restaurants, hotels, and so on. The method 200 allows the electronic device 100 to provide the location and time details in the food history.
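A minimal sketch of this correlation step follows, assuming a hypothetical in-memory food item descriptive table and a simple feature-overlap score; a deployed system would instead query the online database over the communication channels described above.

```python
# Non-limiting sketch: correlating captured picture features with entries in
# a food item descriptive table to identify the food type. The feature
# representation and table contents are hypothetical.
from typing import Optional

FOOD_ITEM_DESCRIPTIVE_TABLE = {
    "apple":  {"color": "red",    "shape": "round",  "texture": "smooth"},
    "burger": {"color": "brown",  "shape": "round",  "texture": "layered"},
    "juice":  {"color": "orange", "shape": "liquid", "texture": "liquid"},
}

def identify_food_type(captured_features: dict) -> Optional[str]:
    """Return the table entry that matches the most captured features."""
    best_match, best_score = None, 0
    for food, features in FOOD_ITEM_DESCRIPTIVE_TABLE.items():
        score = sum(1 for key, value in captured_features.items()
                    if features.get(key) == value)
        if score > best_score:
            best_match, best_score = food, score
    return best_match

print(identify_food_type({"color": "red", "shape": "round"}))  # apple
```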
At step 205, the method 200 includes computing the food constituents' data. The food constituents' data includes information about the food being consumed by the user, for example, constituents such as calories, protein, fat, carbohydrates, amino acids, and so on present in the food. The method 200 allows the controlling member 105 to compute the food constituents' data by matching the identified food information with the food item descriptive table.
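The constituent computation can be sketched as scaling per-100-gram values from the food item descriptive table by the captured quantity; the nutritional figures and field names below are illustrative assumptions only.

```python
# Non-limiting sketch: computing a food's constituent data by scaling the
# per-100-gram values stored in the food item descriptive table by the
# captured quantity. All numbers are illustrative.
PER_100G = {
    "cheese burger": {"calories": 303.0, "protein_g": 15.0, "fat_g": 14.0,
                      "carbohydrates_g": 28.0},
}

def compute_constituents(food_type: str, quantity_g: float) -> dict:
    base = PER_100G[food_type]
    factor = quantity_g / 100.0
    return {name: round(value * factor, 1) for name, value in base.items()}

print(compute_constituents("cheese burger", 250.0))
# {'calories': 757.5, 'protein_g': 37.5, 'fat_g': 35.0, 'carbohydrates_g': 70.0}
```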
At step 206, the method 200 includes generating a recommendation related to the computed food constituents' data and food quantity. The method 200 allows the controlling member 105 to generate the recommendation by analyzing the pictures of the user's mouth and the captured pictures of the food reachable to the user's mouth. In an embodiment, recommendations related to the user's health, such as exercise plans, the absence of food activity by the user, and so on, are generated and suggested to the user. For example, when the current values of the user's periodic nutritional parameters reach their maximum or sufficient levels, a recommendation indicating the same is generated for the user. In an embodiment, the recommendations can be based on the user's historic information. For example, if the user consumes a medicine before his/her meal every day, a recommendation is generated if the user forgets to take the medicine before the meal. The recommendation can also be generated based on the food identification data and the user's personalized food preferences. For example, if the identified food contains one or more items to which the user is allergic or intolerant, or which the user dislikes, a recommendation is generated indicating that the user should not eat the food; the method 200 concludes that the user should not eat the food item and recommends accordingly. In an embodiment, the recommendations may include a diet quality score of the user for a day, week, or month based on nutrition or guardian advice.
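A non-limiting sketch of rule-based recommendation generation follows, assuming hypothetical allergy and daily-calorie rules; an actual embodiment may apply any combination of the rules described above.

```python
# Non-limiting sketch: generating a recommendation from computed constituent
# data and the user's personalized preferences. Rules and thresholds are
# hypothetical.
def generate_recommendation(food_type: str, constituents: dict,
                            allergies: set,
                            calories_today: float,
                            daily_calorie_limit: float = 2000.0) -> str:
    if food_type in allergies:
        return f"Do not eat {food_type}: it is in your allergy list."
    if calories_today + constituents.get("calories", 0.0) > daily_calorie_limit:
        return (f"Eating {food_type} would exceed your daily calorie "
                f"limit of {daily_calorie_limit:.0f} kcal.")
    return f"{food_type} fits within today's nutritional targets."

print(generate_recommendation("cheese burger",
                              {"calories": 757.5},
                              allergies={"peanuts"},
                              calories_today=1500.0))
# Eating cheese burger would exceed your daily calorie limit of 2000 kcal.
```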
At step 207, the method 200 includes providing the recommendation to the user and/or to a guardian of the user. The method 200 allows the display member 101 to display the recommendation to the user on the user's electronic device 100. In an embodiment, the recommendation can be displayed on the wearable or non-wearable input members, such as a wrist watch, spectacles, and so on, placed in the proximity of the user. In an embodiment, the recommendation can be shared as a notification in the user's social network based on the user's preference.
In an embodiment, the recommendation can be provided before, during, or after the food consumption of the user. In an embodiment, the recommendations can be notifications such as an electronic mail (email), a push notification, an instant message, or a text message (such as a short message service (SMS) text or a multimedia messaging service (MMS) text), and so on. For example, an image or video can be sent as an MMS to show the junk food consumed by the user.
For example, the recommendation can be provided using text, voice, a photo, a video, light, vibration, or a ring tone.
In an embodiment, the recommendation relates to exercise, food wastage, illness, obesity, or dietary restrictions.
The various actions, acts, blocks, steps, and the like in method 200 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions, acts, blocks, steps, and the like may be omitted, added, modified, skipped, and the like without departing from the scope of the invention.
In an embodiment, the input means can be a manual input, such as voice. When the food consumption action is detected, the voice input means associated with the voice recognition member 104 is automatically triggered to capture voice data relating to the food being consumed by the user. For example, when the user is taking food, the voice recorder associated with the user's wrist watch is triggered automatically and captures the voice input provided by the user (the name or other description of the food item).
In an embodiment, the input means can be a scanning member, such as an RFID scanner or a barcode scanner. When the food consumption action is detected, the scanner associated with the scanning member 103 is triggered automatically to capture the coded data, such as a Universal Product Code (UPC), an RFID tag, and the like, relating to the food being consumed by the user. For example, when the user starts consuming snacks from a pack, the scanning member is triggered automatically and captures the code data from the pack.
In an embodiment, the food consumption information of the user can be captured by identifying the user's history information and the user's personalized preferences. For example, the user configures his/her personalized preferences in the electronic device 100 and sets the breakfast food item as a burger. Hence, whenever food consumption is detected at breakfast time, the food item is captured as a burger.
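This preference-based capture can be sketched as a simple time-of-day lookup; the meal-time boundaries and preference entries below are hypothetical assumptions.

```python
# Non-limiting sketch: inferring the consumed food item from the user's
# personalized preferences when no other input means is available.
from datetime import time
from typing import Optional

MEAL_PREFERENCES = {"breakfast": "burger", "lunch": "rice", "dinner": "soup"}

def infer_food_from_preferences(now: time) -> Optional[str]:
    if time(6, 0) <= now < time(11, 0):
        return MEAL_PREFERENCES.get("breakfast")
    if time(11, 0) <= now < time(16, 0):
        return MEAL_PREFERENCES.get("lunch")
    if time(18, 0) <= now < time(22, 0):
        return MEAL_PREFERENCES.get("dinner")
    return None

print(infer_food_from_preferences(time(8, 30)))  # burger
```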
At step 702, the method 700 includes determining whether any food item is detected in the hands of the user. For example, when the user is buying groceries, the input members associated with the user's necklace monitor for food items in the hands of the user.
At step 703, the method 700 includes identifying the food type by correlating the captured food information with the food item descriptive table stored in the storage member 107. In an embodiment, the food item descriptive table can be associated with a server. In an embodiment, the food item descriptive table can be an online database. In an embodiment, the identification can be performed by analyzing the food's shape, color, texture, and volume, or by analyzing the food's packaging, and so on. For example, if the food type is identified as a liquid based on its color, the exact food item is then identified from the other characteristics captured in the picture. In an embodiment, the identified food data may include details such as, but not limited to, the origin of the food, for example, the geographic location in which the food was grown, manufactured, prepared, and packaged. This information can be collected from the food item descriptive table. For example, when the camera in the wrist watch of the user captures a plurality of pictures of a food item, the controlling member 105 correlates the captured images with the images stored in the food item descriptive table.
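A minimal sketch of the two-stage identification described above follows: the food is first classified by its state (solid or liquid), and the exact item is then selected from the matching table entries; the attribute values are hypothetical.

```python
# Non-limiting sketch: narrowing the food item descriptive table first by
# the coarse state of the food, then by a finer visual character (color).
from typing import Optional

TABLE = {
    "orange juice": {"state": "liquid", "color": "orange"},
    "milk":         {"state": "liquid", "color": "white"},
    "apple":        {"state": "solid",  "color": "red"},
}

def identify_exact_item(state: str, color: str) -> Optional[str]:
    candidates = {name: attrs for name, attrs in TABLE.items()
                  if attrs["state"] == state}
    for name, attrs in candidates.items():
        if attrs["color"] == color:
            return name
    return None

print(identify_exact_item("liquid", "orange"))  # orange juice
```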
At step 704, the method 700 includes computing the food constituents' data. The method allows the controlling member 105 to compute the food constituents' data by matching the food type with the information present in the food item descriptive table. For example, if the food identified is a cheese burger, the constituents' data is computed based on the information available for the cheese burger in the food item descriptive table.
At step 705, the method 700 includes generating a recommendation relating to the food, considering the user's personalized preferences and the user's past food history information. At step 706, the method 700 includes providing the recommendations to the user. For example, suppose the user consumes an apple daily after his/her meal. The controlling member 105 then monitors for the food item apple in the vicinity of the user. When the food item 'apple' is detected within the vicinity of the user, a recommendation is generated and provided to the user indicating that an apple is available within the vicinity of the user.
In an embodiment, the food identification data can be developed by correlating the voice data obtained from the user with the food item descriptive table. For example, the voice recorder associated with the electronic device 100 of the user captures voice commands of the user. Further, the food identification data can be developed by correlating this voice data with the data in the food descriptive table.
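A non-limiting sketch of this voice correlation follows, using standard-library fuzzy string matching of the transcript against hypothetical food names; a deployed system may use a full voice recognition pipeline instead.

```python
# Non-limiting sketch: correlating a captured voice transcript with food
# names in the food item descriptive table via fuzzy matching.
import difflib
from typing import Optional

FOOD_NAMES = ["cheese burger", "orange juice", "apple", "rice"]

def identify_food_from_voice(transcript: str) -> Optional[str]:
    matches = difflib.get_close_matches(transcript.lower(), FOOD_NAMES,
                                        n=1, cutoff=0.6)
    return matches[0] if matches else None

print(identify_food_from_voice("cheez burger"))  # cheese burger
```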
In an embodiment, the food identification data can be developed by correlating the code data obtained with the data available in the food item descriptive table. For example, the Barcode scanning member associated with the mug captures barcode printed on the food packet present in the proximity of the user. Further, the food identification data can be developed by correlating this captured code data with the data in the food descriptive table.
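This code correlation can be sketched as a lookup of the scanned code in the food item descriptive table; the code values below are made up for illustration.

```python
# Non-limiting sketch: resolving a scanned UPC or RFID code to a food item
# through the food item descriptive table. The code values are illustrative.
from typing import Optional

CODE_TO_FOOD = {
    "012345678905": "potato chips",  # UPC-A printed on the pack
    "RFID:4F2A":    "frozen pizza",  # RFID tag on the package
}

def identify_food_from_code(code: str) -> Optional[str]:
    return CODE_TO_FOOD.get(code)

print(identify_food_from_code("012345678905"))  # potato chips
```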
In an embodiment, the constituent's data of the identified food can be computed by matching the data with pre-stored data available in the electronic device 100. In an embodiment, the constituent's data of the identified food can be computed by matching the data with an online database.
At step 1203, the method 1200 includes frequently monitoring the food consumption of the user. For example, the cameras associated with the user's articles capture pictures of the food being consumed by the user at regular intervals of time until the food consumption action is completed. At step 1204, the method 1200 includes storing the feedback associated with the food being consumed by the user. The method 1200 allows the storage member 107 to store the details of the food, such as the type of food, the quantity of food consumed by the user, the quantity of food wasted, and so on. For example, if the user has finished eating a meal, the user's food history information is updated, including a record of the leftover or wasted food. In an embodiment, the feedback information can also be stored in the food item descriptive table.
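A minimal sketch of the feedback storage at step 1204 follows, assuming a hypothetical record structure for the user's food history.

```python
# Non-limiting sketch: storing feedback for a finished meal, including the
# leftover quantity, in the user's food history. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FoodFeedback:
    food_type: str
    consumed_g: float
    wasted_g: float
    timestamp: datetime = field(default_factory=datetime.now)

food_history: list = []

def store_feedback(food_type: str, served_g: float, leftover_g: float) -> None:
    food_history.append(FoodFeedback(food_type, served_g - leftover_g,
                                     leftover_g))

store_feedback("apple", served_g=180.0, leftover_g=40.0)
print(food_history[-1].wasted_g)  # 40.0
```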
At step 1205, the method 1200 includes detecting the consumption of the same food by the user the next time. At step 1206, the method 1200 includes identifying the feedback associated with the food being consumed. For example, when the user is consuming an apple, the controlling member 105 identifies whether any feedback was associated with the apple the previous time the user consumed one. At step 1207, the method 1200 includes determining whether any feedback is associated with the food being consumed by the user. At step 1208, the method 1200 includes generating the recommendation in response to determining that there is feedback associated with the food. For example, if the user previously wasted two pieces of an apple, the controlling member 105 generates a recommendation indicating the previous wastage of the same food. At step 1207, the method 1200 repeats from step 1205 if it is determined that no feedback is associated with the food. Further, the method 1200 includes providing the generated recommendation to the user. In an embodiment, the recommendations include the frequency of food consumption, meal reminders, eating limitations, exercise recommendations, food log recommendations, food wastage notifications, restrictions, medical conditions, food intake histories, and the like. The user may accept or reject the recommendation provided by the system. For example, the recommendations may include nutrition advice for a diabetic user to take a particular amount of insulin at a particular time, based on the user's personalized food data.
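A non-limiting sketch of steps 1205 through 1208 follows, assuming the hypothetical feedback records shown; when no feedback is found, the method simply continues monitoring.

```python
# Non-limiting sketch: when the same food is detected again (step 1205),
# stored feedback is looked up and a wastage recommendation is generated
# (steps 1206-1208). The record structure is hypothetical.
from typing import NamedTuple, Optional

class FeedbackRecord(NamedTuple):
    food_type: str
    wasted_g: float

history = [FeedbackRecord("apple", 40.0), FeedbackRecord("rice", 0.0)]

def recommend_from_feedback(food_type: str,
                            records: list) -> Optional[str]:
    wasted = [r for r in records
              if r.food_type == food_type and r.wasted_g > 0]
    if not wasted:
        return None  # no feedback found: continue monitoring (step 1207)
    return (f"Last time you wasted {wasted[-1].wasted_g:.0f} g of "
            f"{food_type}; consider taking a smaller portion.")

print(recommend_from_feedback("apple", history))
```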
The various actions, acts, blocks, steps, and the like in method 1200 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions, acts, blocks, steps, and the like may be omitted, added, modified, skipped, and the like without departing from the scope of the invention.
In an embodiment, the user may assign rankings to the food items relative to each other.
The algorithm comprising the instructions and codes required for the implementation is stored in either the memory unit 1604 or the storage 1605, or both. At the time of execution, the instructions may be fetched from the corresponding memory 1604 and/or storage 1605, and executed by the processing unit 1601.
In the case of hardware implementations, various networking devices 1607 or external I/O devices 1606 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit.
The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. The elements shown in
The embodiments disclosed herein specify a method and system for capturing food consumption information of a user by automatically triggering one or more input means. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Number | Date | Country | Kind
---|---|---|---
5637/CHE/2013 | Dec 2013 | IN | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/KR2014/011972 | 12/5/2014 | WO | 00