The present application relates generally to nutrition monitoring and, more particularly, to methods and systems for determining the nutritional content of food items using computer devices.
In accordance with one or more embodiments, a computer-implemented method is disclosed for determining nutrition information of a nutrition item for a user of a computer device. The computer device has a camera and is capable of communicating with a computer server system over a communications network. The method comprises the steps, performed by the computer server system, of: (a) receiving a photograph of a nutrition item from the user computer device over the communications network, the photograph being captured by the camera of the user computer device; (b) analyzing the photograph and extracting a set of features from the photograph using a nutrition item feature recognition engine, and determining a nutrition item type of the nutrition item based at least in part on one or more of the extracted features; (c) determining a quantity, volume, or mass of the nutrition item in the photograph; (d) determining nutritional content information of the nutrition item based on the nutrition item type and the quantity, volume, or mass of the nutrition item; and (e) transmitting the nutritional content information to the user computer device over the communications network to be displayed to the user.
In accordance with one or more further embodiments, a computer system comprises at least one processor, memory associated with the at least one processor, and a program supported in the memory for determining nutrition information of a nutrition item. The program contains a plurality of instructions which, when executed by the at least one processor, cause the at least one processor to: (a) receive a photograph of a nutrition item from a user computer device over a communications network, the photograph being captured by a camera of the user computer device; (b) analyze the photograph and extract a set of features from the photograph using a nutrition item feature recognition engine, and determine a nutrition item type of the nutrition item based at least in part on one or more of the extracted features; (c) determine a quantity, volume, or mass of the nutrition item in the photograph; (d) determine nutritional content information of the nutrition item based on the nutrition item type and the quantity, volume, or mass of the nutrition item; and (e) transmit the nutritional content information to the user computer device over the communications network to be displayed to the user.
In accordance with one or more further embodiments, a computer program product residing on a non-transitory computer readable medium is disclosed. The computer program product has a plurality of instructions stored thereon for determining nutrition information of a nutrition item which, when executed by a computer processor, cause the computer processor to: (a) receive a photograph of a nutrition item from a user computer device over a communications network, the photograph being captured by a camera of the user computer device; (b) analyze the photograph and extract a set of features from the photograph using a nutrition item feature recognition engine, and determine a nutrition item type of the nutrition item based at least in part on one or more of the extracted features; (c) determine a quantity, volume, or mass of the nutrition item in the photograph; (d) determine nutritional content information of the nutrition item based on the nutrition item type and the quantity, volume, or mass of the nutrition item; and (e) transmit the nutritional content information to the user computer device over the communications network to be displayed to the user.
As used herein, the term ‘nutrition item’ or ‘nutrition object’ refers to a specific real-world object that provides nutrition to people when consumed, e.g., a single apple or a glass of orange juice.
The term ‘nutrition item type’ or ‘nutrition object type’ refers to a group of related nutrition items whose nutritional content scales generally linearly with a nutrition item's quantity, volume, or mass, e.g., all Honeycrisp apples or all Hamlin orange juice.
Various embodiments disclosed herein relate to computer-implemented methods and systems that enable users, using a smartphone or other computer device, to quickly and easily determine the nutritional content of nutrition items they are contemplating consuming, and optionally to maintain a log of consumed nutrition items and associated nutrition information.
The client devices 102 operated by users can comprise any computing device that can communicate with the computer server system 100 including, without limitation, smartphones (e.g., the Apple iPhone and Android-based smartphones), wearable computer devices (e.g., smart watches and smart glasses), personal computers (including desktop, notebook, and tablet computers), smart TVs, cell phones, and personal digital assistants. The client devices include operating systems (e.g., Android, Apple iOS, and Windows Phone OS, among others) on which applications run. The operating systems allow programmers to create applications or apps that provide particular functionality to the devices. As discussed below, users can install an app on their client devices enabling them to quickly and conveniently determine the nutritional content of nutrition items.
A representative client device 102 includes at least one computer processor and a storage medium readable by the processor for storing applications and data. The client device also includes input/output devices such as a camera, one or more speakers for acoustic output, a microphone for acoustic input, and a display for visual output, e.g., an LCD or LED display, which may have touch screen input capabilities. The client device 102 also includes a communication module or network interface to communicate with the computer server system 100 or other devices via network 104.
The system 100 uses various engines to determine nutrition item information from photos of nutrition items received from users. In accordance with one or more embodiments, the engines can include generative adversarial networks and deep convolutional neural networks leveraging residual blocks, inception modules, and region proposal layers. For example, a feature recognition engine discussed below can be pre-trained using semi-supervised generative adversarial networks. Additionally, boosted trees can also be leveraged. These models are trained on data including manually annotated nutrition item photos, user behaviors, interaction patterns, engineered features, and user-specific data.
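By way of a non-limiting illustration, the following minimal Python sketch shows one way a convolutional feature recognition engine of the kind described above might be structured, including a residual block; the class names, layer sizes, and feature count are hypothetical and are not drawn from the disclosed embodiments.

```python
# Illustrative sketch only: a small convolutional classifier with a single
# residual block, loosely in the spirit of the feature recognition engines
# described above. All names (FeatureRecognizer, NUM_FEATURES) are hypothetical.
import torch
import torch.nn as nn

NUM_FEATURES = 32  # hypothetical number of human-understandable feature tags

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        # Residual (skip) connection: output = activation(F(x) + x)
        return self.relu(self.conv2(self.relu(self.conv1(x))) + x)

class FeatureRecognizer(nn.Module):
    def __init__(self):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.block = ResidualBlock(16)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(16, NUM_FEATURES)

    def forward(self, photo):
        x = self.block(self.stem(photo))
        x = self.pool(x).flatten(1)
        # Independent sigmoid scores: a photo can exhibit several features
        # at once (e.g., both "red" and "liquid").
        return torch.sigmoid(self.head(x))

model = FeatureRecognizer()
scores = model(torch.randn(1, 3, 224, 224))  # one 224x224 RGB photo
```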
At step 202, the system 100 receives a photo containing one or more nutrition items from the user. The user can use an app downloaded on his or her smartphone or other client device to take and transmit the photo. The user can also optionally identify a meal associated with the nutrition item.
At step 204, the system 100 analyzes the photo and extracts a set of suggested features from the photo using a nutrition item feature recognition engine. The feature recognition engine extracts features from the photo that are human-understandable. Features include, but are not limited to, commonly understood attributes (e.g., liquid, red) and higher-level groupings (e.g., bread, meat).
The suggested features are transmitted to the user. As shown in the screenshot 500 of FIG. 5, the user can select one of the suggested features, which selection is received by the system 100 at step 206.
At step 208, the system combines the extracted features (from step 204) and the user-selected feature (in this example, a banana from step 206) to form a search query returning a plurality of nutrition item types associated with the selected feature. The nutrition item types are transmitted to the user.
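By way of a non-limiting illustration, the following sketch shows one possible form such a search query could take, ranking candidate nutrition item types by their overlap with the extracted features; the table contents and function name are hypothetical.

```python
# Illustrative sketch only: forming a search query over a hypothetical
# nutrition item type table from the extracted and user-selected features.
NUTRITION_ITEM_TYPES = [
    {"name": "Banana, raw", "features": {"banana", "fruit", "yellow"}},
    {"name": "Banana bread", "features": {"banana", "bread", "baked"}},
    {"name": "Honeycrisp apple", "features": {"apple", "fruit", "red"}},
]

def query_item_types(selected_feature, extracted_features):
    """Return item types matching the user-selected feature, ranked by how
    many of the automatically extracted features they also match."""
    candidates = [t for t in NUTRITION_ITEM_TYPES
                  if selected_feature in t["features"]]
    return sorted(candidates,
                  key=lambda t: len(t["features"] & set(extracted_features)),
                  reverse=True)

# e.g., the banana example from step 208:
results = query_item_types("banana", ["yellow", "fruit"])
```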
The user selects one of the banana products, which selection is received by the system 100 at step 210.
At step 212, the system receives a user input of quantity, volume, or mass of the selected banana product. As shown in the exemplary screenshot 700 of FIG. 7, the user can input the quantity, volume, or mass using a scrollable wheel 702.
At step 214, the system 100 combines the user-selected nutrition item type and quantity, volume, or mass to determine the nutritional content of a nutrition item in the photo. The app can dynamically display the nutritional content information at 704 as the user scrolls the wheel 702.
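By way of a non-limiting illustration, and recalling that nutritional content scales generally linearly with a nutrition item's quantity, volume, or mass within a nutrition item type (see the definition above), the combination at step 214 could be computed as in the following sketch; the per-100-gram values shown are approximate placeholders.

```python
# Illustrative sketch only: since nutritional content scales generally
# linearly with quantity within a nutrition item type, content for an
# arbitrary serving can be derived from per-reference-quantity values.
# The numbers below are approximate placeholders.
PER_100G = {"Banana, raw": {"calories": 89.0, "protein_g": 1.1,
                            "carbs_g": 22.8, "fat_g": 0.3}}

def nutritional_content(item_type, grams):
    scale = grams / 100.0
    return {nutrient: value * scale
            for nutrient, value in PER_100G[item_type].items()}

print(nutritional_content("Banana, raw", 118.0))  # one medium banana, approx.
```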
The user can select the “Add” button 706 to add the banana serving nutritional information to his or her daily nutrition log as shown in the exemplary screenshot 800 of FIG. 8.
Steps 902, 904, 906, 908, and 910 in FIG. 9 generally correspond to steps 202, 204, 206, 208, and 210, respectively, described above.
At step 912, the system 100 combines the selected nutrition item type and photo to form a composite object. A composite object is a collection of feature vectors that form the inputs for various engines. Feature vectors are defined as n-dimensional vectors of variables that numerically describe the input photo. Feature vectors can be generated through a variety of filters (e.g., deep neural network layer extraction), transformations, augmentations, and perturbations to the photo.
At step 914, the system extracts a second set of features from the composite object. Feature vectors in step 914 include, but are not limited to, nutritional object segmentations, bounding box estimations, voxel data (which may be provided through hardware, e.g., multiple-camera data), nutrition object type pairwise probabilities, and specific user probabilities.
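By way of a non-limiting illustration, the following sketch shows one way second-stage feature vectors such as those listed above might be assembled into a single n-dimensional vector; all field names are hypothetical.

```python
# Illustrative sketch only: assembling a composite object as a single
# numeric feature vector from photo-derived quantities such as those
# listed above. All field names are hypothetical.
import numpy as np

def composite_feature_vector(segmentation_mask, bounding_box, type_probs):
    """Concatenate second-stage features into one n-dimensional vector:
    the segmented area (pixel count), the bounding box geometry, and the
    nutrition-item-type probabilities."""
    area = float(np.count_nonzero(segmentation_mask))
    x0, y0, x1, y1 = bounding_box
    box = [x1 - x0, y1 - y0, (x1 - x0) * (y1 - y0)]  # width, height, area
    return np.concatenate([[area], box, type_probs])

vec = composite_feature_vector(
    segmentation_mask=np.ones((224, 224)),
    bounding_box=(40, 60, 180, 200),
    type_probs=np.array([0.9, 0.1]),
)
```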
At step 916, the system estimates a quantity, volume, or mass from the second set of extracted features.
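By way of a non-limiting illustration, the estimation at step 916 could be performed with boosted trees, one of the model families mentioned above, as in the following sketch; the training data here is synthetic solely so the example runs.

```python
# Illustrative sketch only: estimating mass (grams) from composite feature
# vectors with boosted trees, one of the model families mentioned above.
# The training data is random noise purely to make the example runnable.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X_train = rng.random((200, 6))          # composite feature vectors
y_train = 50 + 200 * X_train[:, 0]      # synthetic mass labels in grams

mass_model = GradientBoostingRegressor().fit(X_train, y_train)
estimated_grams = mass_model.predict(X_train[:1])[0]
```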
At step 918, the system combines the user selected nutrition item type and estimated quantity, volume, or mass to determine the nutritional content of a nutrition item in the photo, and transmits the information to the user.
At step 1002, the system 100 receives a photo containing one or more nutrition items from the user.
At step 1004, the system extracts a set of features from the photo and matches the pictured nutrition item to one or more specific nutrition item types. This step may use other information about the user, such as location or previous logging history, to determine the most likely specific nutrition item types.
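By way of a non-limiting illustration, the following sketch shows one way visual match scores could be re-weighted by a user's previous logging history; the frequency prior used here is a simple hypothetical weighting.

```python
# Illustrative sketch only: re-ranking visually matched item types using
# user context (e.g., logging history), as described above. The prior
# weighting scheme is hypothetical.
def rerank(visual_scores, history_counts):
    """Multiply each visual match score by a simple frequency prior derived
    from how often the user has logged that item type before."""
    total = sum(history_counts.values()) or 1
    posterior = {item: score * (1 + history_counts.get(item, 0) / total)
                 for item, score in visual_scores.items()}
    # Normalize so the scores again sum to one.
    z = sum(posterior.values())
    return {item: s / z for item, s in posterior.items()}

ranked = rerank({"Banana, raw": 0.6, "Plantain, raw": 0.4},
                {"Banana, raw": 12, "Plantain, raw": 1})
```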
At step 1006, the system receives a user input of quantity, volume, or mass.
At step 1008, the system combines the nutrition item type and user-provided quantity, volume, or mass to determine the nutritional content of a nutrition item in the photo. The nutrition information is transmitted to the user.
At step 1102, the system receives a photo containing one or more nutrition items from the user.
At step 1104, the system extracts a set of features from the photo and matches the pictured nutrition item to a nutrition item type.
At step 1106, the system combines the nutrition item type and photo to form a composite object.
At step 1108, the system extracts a second set of features from the composite object.
At step 1110, the system estimates a quantity, volume, or mass from the second set of extracted features.
At step 1112, the system combines the nutrition item type and estimated quantity, volume, or mass to determine the nutritional content of a nutrition item in the photo. The information is transmitted to the user.
The methods, operations, modules, and systems described herein for determining nutrition item information may be implemented in one or more computer programs executing on a programmable computer system (which can be part of the server computer system 100).
Each computer program can be a set of instructions or program code in a code module resident in the random access memory of the computer system. Until required by the computer system, the set of instructions may be stored in the mass storage device or on another computer system and downloaded via the Internet or other network.
Having thus described several illustrative embodiments, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to form a part of this disclosure, and are intended to be within the spirit and scope of this disclosure. While some examples presented herein involve specific combinations of functions or structural elements, it should be understood that those functions and elements may be combined in other ways according to the present disclosure to accomplish the same or different objectives. In particular, acts, elements, and features discussed in connection with one embodiment are not intended to be excluded from similar or other roles in other embodiments.
Additionally, elements and components described herein may be further divided into additional components or joined together to form fewer components for performing the same functions. For example, the computer system may comprise one or more physical machines, or virtual machines running on one or more physical machines. In addition, the computer system may comprise a cluster of computers or numerous distributed computers that are connected by the Internet or another network.
Accordingly, the foregoing description and attached drawings are by way of example only, and are not intended to be limiting.
This application claims priority from U.S. Provisional Patent Application No. 62/425,410 filed on Nov. 22, 2016 entitled COMPUTER-IMPLEMENTED METHODS AND SYSTEMS FOR DETERMINING THE NUTRITIONAL CONTENT OF NUTRITION ITEMS, which is hereby incorporated by reference.