This application claims the benefit of Japanese Priority Patent Application JP 2013-039355 filed Feb. 28, 2013, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an information processing device and a storage medium.
Recently, devices that assist dietary lifestyle management are being proposed.
For example, PTL 1 below discloses technology that reduces the user workload of recording meal content for efficient management. Specifically, if a food image is sent together with time and date information from a personal client to a center server, an advisor (expert) at the center server analyzes the image of food, and inputs and sends advice.
Also, PTL 2 below discloses technology that calculates calorie intake and meal chewing time on the basis of a captured image of a dish captured by a wireless portable client, and manages the calorie intake and meal chewing time of the dish in real-time during the meal.
PTL 1: JP 2003-85289A
PTL 2: JP 2010-33326A
However, with the above PTL 1, it is difficult to display advice in real-time regarding food that a user is about to eat.
On the other hand, with the above PTL 2, although a warning is displayed in real-time regarding excessive calorie intake or insufficient meal chewing time, the calculated calorie intake is the total calories for one meal (dish), and the calories per ingredient of the food are not calculated.
Accordingly, the present disclosure proposes a new and improved information processing device and storage medium capable of presenting an indicator depending on the type of food.
According to an embodiment of the present disclosure, there is provided an information processing apparatus including: circuitry configured to obtain a captured image of food; transmit the captured image of food; receive, from a data providing device, at least one indication of at least one ingredient included within the food of the captured image; and initiate a displaying of the at least one indication to a user, in association with the food of the captured image.
According to another embodiment of the present disclosure, there is provided a method including: obtaining a captured image of a food; transmitting the captured image; receiving, from a data providing device, at least one indication of at least one ingredient included within the food of the captured image; and displaying the at least one indication to a user, in association with the food of the captured image.
According to another embodiment of the present disclosure, there is provided a non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to perform a method, the method including: obtaining a captured image of a food; transmitting the captured image; receiving, from a data providing device, at least one indication of at least one ingredient included within the food of the captured image; and displaying the at least one indication to a user, in association with the food of the captured image.
According to another embodiment of the present disclosure, there is provided a data providing device including: an image obtaining unit configured to obtain a captured image of food; a type distinguishing unit configured to distinguish at least one ingredient included within the food of the captured image; an indicator generating unit configured to generate at least one indication in relation to the at least one ingredient; and a display data providing unit configured to provide the generated at least one indication to be displayed in association with the food of the captured image, wherein at least one of the image obtaining unit, the type distinguishing unit, the indicator generating unit, and the display data providing unit is implemented via a processor.
According to another embodiment of the present disclosure, there is provided a data providing method including: obtaining a captured image of food; distinguishing at least one ingredient included within the food of the captured image; generating at least one indication in relation to the at least one ingredient; and providing the generated at least one indication to be displayed in association with the food of the captured image.
According to the present disclosure as described in embodiments, it becomes possible to present an indicator depending on the type of food.
Hereinafter, embodiments of the present disclosure will be described in detail and with reference to the attached drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Hereinafter, the description will proceed in the following order.
1. Summary of display control process according to embodiments of present disclosure
2. Basic configuration and operational process of HMD
2-1. Basic configuration of HMD
2-2. Operational process of HMD
3. Screen display examples
3-1. Indicator display
3-2. Suitable/unsuitable display
3-3. Display of calculated indicator based on accumulated indicator
3-4. Display of preparation method-dependent indicator
4. Conclusion
First, a display control process according to an embodiment of the present disclosure will be summarized with reference to
Also, the HMD 1 is configured such that, in the worn state, a pair of display units 2 for the left eye and the right eye are placed immediately in front of either eye of the user 8, or in other words at the locations where the lenses of ordinary eyeglasses are positioned. A captured image of a real space captured with an image capture lens 3a, for example, is displayed on the display units 2. The display units 2 may also be transparent, and by having the HMD 1 put the display units 2 in a see-through state, or in other words a transparent or semi-transparent state, ordinary activities are not impaired even if the user 8 wears the HMD 1 continuously like eyeglasses.
Also, as illustrated in
Also, although only illustrated on the left eye side in
Note that the external appearance of the HMD 1 illustrated in
Also, although the image capture lens 3a and the light emitter 4a that provides illumination are placed facing forward on the side of the right eye in the example illustrated in
It is also acceptable to provide a single earphone speaker 5a to be worn in only one ear, rather than left and right stereo speakers. Likewise, only one of the microphones 6a and 6b may be provided.
Furthermore, a configuration not equipped with the microphones 6a and 6b or the earphone speakers 5a is also conceivable. A configuration not provided with the light emitter 4a is also conceivable.
The above thus describes an external configuration of the HMD 1 illustrated in
Herein, with the technology described in the above PTL 2 as a device that assists dietary lifestyle, the total calories of one meal (dish) are calculated. However, a user is not strictly limited to eating an entire dish, and in addition, cases in which a user prefers to eat only specific ingredients from a dish are also anticipated. Also, since calories and nutritional components differ by ingredient, presenting indicators such as the calories and nutritional components per ingredient greatly improves the utility of technology that assists dietary lifestyle.
Furthermore, in cases in which improvements in dietary lifestyle are demanded due to problems of lifestyle-related diseases or the like, the intake amounts and numerical values of calories, fat, sugar, purines, cholesterol, and the like become a concern. To improve his or her dietary lifestyle, a user must regularly take care to recognize preferred and non-preferred food substances. For example, persons at risk of hyperlipidemia, persons with high total cholesterol values, persons with high LDL cholesterol (bad cholesterol) values, and the like must pay attention to cholesterol.
In this case, preferred food substances may include food substances with low cholesterol and food substances high in unsaturated fatty acids that reduce cholesterol. Food substances with low cholesterol include egg whites, tofu, lean tuna, chicken breast, natto, clams, milk, spinach, potatoes, and strawberries, for example. Meanwhile, food substances high in unsaturated fatty acids that reduce cholesterol include blue-backed fish (such as mackerel, saury, yellowtail, sardines, and tuna), and vegetable oils (such as olive oil, safflower oil, canola oil, and sesame oil). In addition, food substances that help to reduce cholesterol include broccoli, Brussels sprouts, greens, bell peppers, lotus root, burdock root, dried strips of daikon radish, natto, mushrooms, and seaweed, and these may be said to be preferable food substances.
On the other hand, non-preferred food substances may include food substances with high cholesterol and food substances high in saturated fatty acids that increase cholesterol. Food substances with high cholesterol include egg yolks, chicken eggs, broiled eel, chicken liver, beef tongue, quail eggs, conger eel, raw sea urchin, smelt, beef liver, pork liver, beef ribs, beef giblets, pork shoulder, chicken thighs, chicken wings, and gizzards, for example. Also, food substances high in saturated fatty acids that increase cholesterol include fatty meat such as rib and loin meat, chicken skin, bacon, cheese, dairy cream, butter, lard, and Western confectionery using large amounts of butter and dairy cream, for example.
However, there is a large amount of information on such food substances, and it is difficult for a user to continually ingest only preferred food substances: in some cases the user may forget during a meal, or a food substance may unexpectedly turn out to be non-preferred.
Accordingly, focusing on the above circumstances led to the creation of a display control system according to embodiments of the present disclosure. A display control system according to embodiments of the present disclosure is able to present an indicator depending on the type of food.
Specifically, with the HMD 1 (information processing device) illustrated in
As an exemplary indicator display, an image P1 that includes calorie displays for each ingredient (leeks, bean sprouts, and pork liver) may be displayed on the display units 2, as illustrated in
In addition, the HMD 1 may determine, according to the distinguishing of each ingredient in a captured image, whether or not that ingredient is preferable for the user, and display the determination result on the display units 2. For example, the HMD 1 conducts display control to display an image that recommends eating at a position corresponding to the above food substances with low cholesterol or the above food substances high in unsaturated fatty acids that reduce cholesterol. In addition, the HMD 1 conducts display control to display an image that forbids eating at a position corresponding to the above food substances with high cholesterol or the above food substances high in saturated fatty acids that increase cholesterol, or outputs a warning sound.
The above thus summarizes a display control process according to an embodiment. Next, a basic configuration and operational process of an HMD 1 (information processing device) that conducts a display control process according to an embodiment will be described with reference to
<2-1. Basic Configuration of HMD>
(Main Controller 10)
The main controller 10 is made up of a microcontroller equipped with a central processing unit (CPU), read-only memory (ROM), random access memory (RAM), non-volatile memory, and an interface unit, for example, and controls the respective components of the HMD 1.
Also, as illustrated in
The type distinguishing unit 10a distinguishes types of food in a captured image, and supplies distinguished results to the indicator generator 10c and the recommendation determination unit 10d. Specifically, the type distinguishing unit 10a distinguishes the type of each ingredient included in food. For example, from a captured image capturing the dish 30 of stir-fried liver and leeks (also called stir-fried leeks with liver) illustrated in
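By way of a non-limiting illustration, such feature-based type distinguishing may be pictured as matching each segmented image region against stored reference features, as in the following Python sketch. The feature names, reference values, and the Region structure are assumptions for illustration only, and do not represent the actual data for distinguishing ingredients.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A segmented region of the captured image (hypothetical form)."""
    features: dict   # e.g. {"hue": 0.30, "elongation": 5.0}
    position: tuple  # (x, y) center of the region in image coordinates

# Stand-in for the stored "data for distinguishing ingredients":
# reference feature values per ingredient type (values illustrative only).
REFERENCE_FEATURES = {
    "leek":        {"hue": 0.30, "elongation": 5.0},
    "bean sprout": {"hue": 0.15, "elongation": 6.0},
    "pork liver":  {"hue": 0.02, "elongation": 1.5},
}

def distinguish_type(region: Region) -> str:
    """Return the ingredient type whose reference features are closest
    to the color/shape features extracted for the region."""
    def sq_distance(ref: dict) -> float:
        return sum((region.features[k] - ref[k]) ** 2 for k in ref)
    return min(REFERENCE_FEATURES, key=lambda t: sq_distance(REFERENCE_FEATURES[t]))
```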
The preparation method distinguishing unit 10b distinguishes a preparation method of food in a captured image (such as stir-fried, grilled, boiled, fried, steamed, raw, or dressed), and supplies distinguished results to the indicator generator 10c. Preparation methods may be distinguished on the basis of a captured image analysis result from the captured image analyzer 13, smell data sensed by a smell sensor (not illustrated), or thermal image data acquired by a thermal image sensor (not illustrated). Specifically, the preparation method distinguishing unit 10b is able to distinguish preparation methods by using a dish's color (such as the browning color) or shininess (oil shininess) features extracted from a photograph, and data for distinguishing preparation methods that is stored in the storage unit 22. For example, from a captured image capturing the dish 30 of stir-fried liver and leeks illustrated in
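A minimal sketch of this preparation-method distinguishing, assuming the browning-color and oil-shininess features mentioned above are normalized to the range 0 to 1; the thresholds and the subset of methods covered are illustrative only:

```python
def distinguish_preparation(browning: float, oil_shine: float) -> str:
    """Map browning-color and oil-shininess features (each assumed
    normalized to 0..1) to a preparation method; thresholds illustrative."""
    if browning > 0.6 and oil_shine > 0.5:
        return "stir-fried"
    if oil_shine > 0.7:
        return "fried"
    if browning > 0.6:
        return "grilled"
    return "raw"
```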
The indicator generator 10c generates an indicator depending on a type of food distinguished by the type distinguishing unit 10a. In the present specification, an indicator refers to a numerical value of calories, vitamins, fat, protein, carbohydrates, calcium, magnesium, dietary fiber, potassium, iron, retinol, sugar, salt, purines, or cholesterol, for example. The indicator generator 10c references data for generating indicators that is included in the storage unit 22, and according to the type of an ingredient, extracts indicators included in that ingredient. In the data for generating indicators, types of ingredients and indicators for those ingredients are associated. The indicator generator 10c may also generate values for indicators included in an ingredient according to an amount (mass) of that ingredient estimated by image analysis.
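The lookup described here may be pictured as follows, assuming per-100-g indicator values keyed by ingredient type and scaled by the mass estimated from image analysis; the table values are placeholders rather than real nutrition data:

```python
# Stand-in for the "data for generating indicators": indicator values
# per 100 g of each ingredient type (values illustrative only).
INDICATOR_TABLE = {
    "leek":        {"calories_kcal": 34,  "cholesterol_mg": 0},
    "bean sprout": {"calories_kcal": 14,  "cholesterol_mg": 0},
    "pork liver":  {"calories_kcal": 128, "cholesterol_mg": 250},
}

def generate_indicator(ingredient: str, estimated_mass_g: float) -> dict:
    """Scale the per-100-g indicator values by the mass estimated from
    image analysis, as described for the indicator generator 10c."""
    per_100g = INDICATOR_TABLE[ingredient]
    return {name: value * estimated_mass_g / 100.0
            for name, value in per_100g.items()}

# e.g. generate_indicator("pork liver", 50.0)
# -> {"calories_kcal": 64.0, "cholesterol_mg": 125.0}
```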
Also, since indicators change according to preparation method in some cases depending on the nutrient properties, the indicator generator 10c may also re-generate an indicator according to a preparation method distinguished by the preparation method distinguishing unit 10b. Specifically, the indicator generator 10c is able to re-generate an indicator by referencing data that associates preparation methods with the changes in the respective indicators.
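Such re-generation may be sketched as applying a per-method change factor to a base indicator; the factor values below are assumptions standing in for the stored data on indicator changes:

```python
# Illustrative change factors per preparation method (assumed values).
PREPARATION_FACTORS = {
    "raw": 1.00,
    "boiled": 0.95,      # e.g. some nutrients lost to the cooking water
    "stir-fried": 1.20,  # e.g. absorbed oil raises the calorie count
    "fried": 1.60,
}

def regenerate_indicator(base_calories: float, method: str) -> float:
    """Adjust a calorie indicator for the distinguished preparation method."""
    return base_calories * PREPARATION_FACTORS.get(method, 1.0)
```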
Furthermore, the indicator generator 10c may also generate a specific indicator according to a user's medical information (including disease history and medication history), health information (including current physical condition information), genetic information, predisposition information (including allergy information), or the like, and a type of food distinguished by the type distinguishing unit 10a. A specific indicator refers to an indicator that indicates a component warranting particular attention on the basis of a user's medical information or the like, for example. For example, on the basis of a user's medical information or health information, the indicator generator 10c generates an indicator indicating cholesterol or an indicator indicating salt content, rather than an indicator indicating calories. The above medical information, health information, genetic information, predisposition information, and the like may be extracted from the storage unit 22, or acquired from a designated server via the communication unit 21. Also, in the case in which the HMD 1 is provided with a biological sensor that detects a user's biological information (such as blood pressure, body temperature, pulse, or brain waves), the indicator generator 10c is able to use information detected from the biological sensor as current health information. Furthermore, a user's biological information may be acquired via the communication unit 21 of the HMD 1 from a communication unit in a user-owned biological information detection device (not illustrated) separate from the HMD 1, and may be used as current health information.
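Choosing which indicator to present from such information may be pictured as follows; the flag and indicator names are hypothetical:

```python
def select_specific_indicator(medical_flags: set) -> str:
    """Pick the indicator warranting particular attention for a user,
    based on medical/health information (flag names are assumptions)."""
    if "high_cholesterol" in medical_flags:
        return "cholesterol_mg"
    if "hypertension" in medical_flags:
        return "salt_g"
    return "calories_kcal"  # default when nothing specific applies
```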
The recommendation determination unit 10d determines whether or not respective ingredients are suitable for a user, on the basis of the types of respective ingredients distinguished by the type distinguishing unit 10a. The question of suitable or unsuitable may be determined on the basis of data on ingredients generally considered suitable/unsuitable, or determined on the basis of a user's medical information, health information, or the like. Ingredients generally considered suitable may include ingredients that warm the body, for example. Also, in cases such as where a user has a lifestyle-related disease or must pay attention to cholesterol intake as discussed earlier, suitable food substances may include food substances with low cholesterol and food substances high in unsaturated fatty acids that reduce cholesterol. On the other hand, unsuitable food substances may include food substances with high cholesterol and food substances high in saturated fatty acids that increase cholesterol. Also, the recommendation determination unit 10d supplies determination results to the output data processor 16.
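A minimal sketch of this determination for a user who must pay attention to cholesterol, with watch lists abbreviated from the food substances enumerated earlier:

```python
# Abbreviated watch lists drawn from the food substances listed above.
SUITABLE = {"egg white", "tofu", "bean sprout", "spinach", "mackerel"}
UNSUITABLE = {"egg yolk", "pork liver", "beef liver", "chicken skin", "bacon"}

def determine_recommendation(ingredient: str) -> str:
    """Return 'suitable', 'unsuitable', or 'neutral' for a user who
    must pay attention to cholesterol intake."""
    if ingredient in SUITABLE:
        return "suitable"
    if ingredient in UNSUITABLE:
        return "unsuitable"
    return "neutral"
```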
The accumulation controller 10e applies control to accumulate indicators generated by the indicator generator 10c in the storage unit 22. More specifically, the accumulation controller 10e applies control to accumulate indicators for ingredients eaten by a user from among the indicators generated by the indicator generator 10c.
The calculation unit 10f calculates a new indicator value on the basis of an indicator accumulated in the storage unit 22 and an indicator currently generated by the indicator generator 10c. For example, the calculation unit 10f is able to calculate a total intake indicator for a designated period by adding an indicator for ingredients currently being ingested to the indicators accumulated in the storage unit 22. Also, the calculation unit 10f is able to calculate a remaining future available intake indicator by subtracting both the indicator accumulated in the storage unit 22 for a designated period and an indicator for ingredients currently being ingested from an ideal total intake indicator for that period. The calculation unit 10f supplies the newly calculated indicators to the output data processor 16.
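The two calculations reduce to simple addition and subtraction over accumulated values, as the following sketch shows (all quantities assumed to be in the same unit, e.g. kcal):

```python
def total_intake(accumulated: float, current: float) -> float:
    """Total intake indicator for a designated period: the accumulated
    indicators plus the indicator currently being ingested."""
    return accumulated + current

def remaining_intake(ideal_total: float, accumulated: float, current: float) -> float:
    """Remaining future available intake for the period: the ideal total
    minus what has been and is currently being ingested."""
    return ideal_total - accumulated - current

# e.g. with a 2000 kcal daily target, 1400 kcal accumulated, and a
# 150 kcal serving in progress: remaining_intake(2000, 1400, 150) -> 450.0
```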
(Image Capture Unit)
The image capture unit 3 includes a lens subsystem made up of the image capture lens 3a, a diaphragm, a zoom lens, a focus lens, and the like, a driving subsystem that causes the lens subsystem to conduct focus operations and zoom operations, a solid-state image sensor array that generates an image capture signal by photoelectric conversion of captured light obtained with the lens subsystem, and the like. The solid-state image sensor array may be realized by a charge-coupled device (CCD) sensor array or a complementary metal-oxide-semiconductor (CMOS) sensor array, for example.
(Image Capture Controller)
The image capture controller 11 controls operations of the image capture unit 3 and the image capture signal processor 12 on the basis of instructions from the main controller 10. For example, the image capture controller 11 controls the switching on/off of the operations of the image capture unit 3 and the image capture signal processor 12. The image capture controller 11 is also configured to apply control (motor control) causing the image capture unit 3 to execute operations such as autofocus, automatic exposure adjustment, diaphragm adjustment, and zooming. The image capture controller 11 is also equipped with a timing generator, and controls signal processing operations with timing signals generated by the timing generator for the solid-state image sensors as well as the sample and hold/AGC circuit and video A/D converter of the image capture signal processor 12. In addition, this timing control enables variable control of the image capture frame rate.
Furthermore, the image capture controller 11 controls image capture sensitivity and signal processing in the solid-state image sensors and the image capture signal processor 12. For example, as image capture sensitivity control, the image capture controller 11 is able to conduct gain control of signals read out from the solid-state image sensors, set the black level, control various coefficients for image capture signal processing at the digital data stage, control the correction magnitude in a shake correction process, and the like.
(Image Capture Signal Processor)
The image capture signal processor 12 is equipped with a sample and hold/automatic gain control (AGC) circuit that applies gain control and waveform shaping to signals obtained by the solid-state image sensors of the image capture unit 3, and a video analog/digital (A/D) converter. Thus, the image capture signal processor 12 obtains an image capture signal as digital data. The image capture signal processor 12 also conducts white balance processing, luma processing, chroma signal processing, shake correction processing, and the like on an image capture signal.
(Captured Image Analyzer)
The captured image analyzer 13 is an example of a configuration for acquiring external information. Specifically, the captured image analyzer 13 analyzes image data (a captured image) that has been captured by the image capture unit 3 and processed by the image capture signal processor 12, and obtains information on an image included in the image data.
Specifically, the captured image analyzer 13 conducts analysis such as point detection, line/edge detection, and area segmentation on image data, for example, and outputs analysis results to the type distinguishing unit 10a and the preparation method distinguishing unit 10b of the main controller 10.
(Illumination Unit, Illumination Controller)
The illumination unit 4 includes the light emitter 4a illustrated in
(Audio Input Unit, Audio Signal Processor)
The audio input unit 6 includes the microphones 6a and 6b illustrated in
(Output Data Processor)
The output data processor 16 includes functions that process data for output from the display units 2 or the audio output unit 5, and is formed from a video processor, a digital signal processor, a D/A converter, and the like, for example. Specifically, the output data processor 16 generates display image data, and conducts luma level adjustment, color correction, contrast adjustment, sharpness (edge enhancement) adjustment, and the like on the generated display image data. The output data processor 16 may also generate an indicator display image on the basis of an indicator depending on a type of food generated by the indicator generator 10c of the main controller 10, and may also generate a display image of a new indicator on the basis of a new indicator calculated by the calculation unit 10f. Also, the output data processor 16 may generate a display image indicating whether or not something is suitable, on the basis of a recommendation determination result depending on a type of food determined by the recommendation determination unit 10d. The output data processor 16 supplies processed display image data to the display controller 17.
The output data processor 16 also generates audio signal data, and conducts volume adjustment, sound quality adjustment, acoustic effects, and the like on the generated audio signal data. The output data processor 16 may also generate audio signal data announcing whether or not something is suitable, on the basis of a recommendation determination result depending on a type of food determined by the recommendation determination unit 10d of the main controller 10. The output data processor 16 supplies processed audio signal data to the audio controller 18.
Note that the output data processor 16 may also generate driving signal data for producing vibration from a vibration notification unit (not illustrated) formed by a driving motor or the like. The output data processor 16 generates a driving signal announcing whether or not something is suitable, on the basis of a recommendation determination result depending on a type of food determined by the recommendation determination unit 10d of the main controller 10.
(Display Controller)
The display controller 17, according to control from the main controller 10, conducts driving control for displaying display image data supplied from the output data processor 16 on the display units 2. The display controller 17 may be made up of a pixel driving circuit for causing display in display units 2 realized as liquid crystal displays, for example. The display controller 17 is also able to control the transparency of each pixel of the display units 2, and put the display units 2 in a see-through state (transparent state or semi-transparent state).
Specifically, a display controller 17 according to an embodiment controls the display units 2 to display an image generated by the output data processor 16 on the basis of an indicator depending on a type of food generated by the indicator generator 10c. In addition, a display controller 17 according to an embodiment may also control the display units 2 to display an image generated by the output data processor 16 on the basis of a recommendation result (suitable or not) per type of food determined by the recommendation determination unit 10d. At this point, the display controller 17 may also apply control to display an image of an indicator or recommendation result in correspondence with the position of each ingredient in the food. Also, the display controller 17 may also display an indicator or recommendation result near an ingredient that a user is about to eat, and move the display position of the image of the indicator or recommendation result according to the positional movement of the ingredient during eating.
In addition, a display controller 17 according to an embodiment may also control the display units 2 to display an image generated by the output data processor 16 on the basis of a new indicator calculated by the calculation unit 10f.
In addition, a display controller 17 according to an embodiment displays a captured image on the display units 2 in real-time, and additionally superimposes an image illustrating indicators, recommendation results, or the like in correspondence with the positions of respective ingredients in the captured image being displayed. Alternatively, the display controller 17 may apply control to put the display units 2 in a see-through state (without displaying a captured image), and display an image illustrating indicators, recommendation results, or the like in correspondence with the positions of ingredients existing in a real space.
(Display Units)
The display units 2, according to control from the display controller 17, display a captured image, or an image illustrating indicators, recommendation results, or the like for respective ingredients.
(Audio Controller)
The audio controller 18, according to control from the main controller 10, applies control to output audio signal data supplied from the output data processor 16 from the audio output unit 5. More specifically, the audio controller 18 applies control to announce an indicator generated by the indicator generator 10c, announce an indicator newly calculated by the calculation unit 10f, or announce a suitable/unsuitable ingredient determined by the recommendation determination unit 10d.
(Audio Output Unit)
The audio output unit 5 includes the pair of earphone speakers 5a illustrated in
(Storage Unit)
The storage unit 22 is a member that records or plays back data with respect to a designated recording medium. The storage unit 22 is realized by a hard disk drive (HDD), for example. Obviously, various media such as flash memory or other solid-state memory, a memory card housing solid-state memory, an optical disc, a magneto-optical disc, and holographic memory are conceivable as the recording medium, and it is sufficient to configure the storage unit 22 to be able to execute recording and playback in accordance with the implemented recording medium.
Also, a storage unit 22 according to an embodiment stores data for distinguishing ingredients that is used by the type distinguishing unit 10a, data for distinguishing preparation methods that is used by the preparation method distinguishing unit 10b, data for generating indicators that is used by the indicator generator 10c, and data for determining recommendations that is used by the recommendation determination unit 10d. Also, the storage unit 22 stores a user's medical information, health information, genetic information, predisposition information, and the like. Furthermore, the storage unit 22 stores indicators whose accumulation is controlled by the accumulation controller 10e.
(Communication Unit)
The communication unit 21 sends and receives data to and from external equipment. The communication unit 21 communicates wirelessly with external equipment directly or via a network access point, according to a scheme such as a wireless local area network (LAN), Wi-Fi (Wireless Fidelity, registered trademark), infrared communication, or Bluetooth (registered trademark).
The above thus describes in detail an internal configuration of an HMD 1 according to an embodiment. Note that although the audio output unit 5, audio input unit 6, audio signal processor 15, and audio controller 18 are illustrated as an audio-related configuration, it is not strictly necessary to provide all of the above. Also, although the communication unit 21 is illustrated as part of the configuration of the HMD 1, it is not strictly necessary to provide the communication unit 21.
According to the above configuration, an HMD 1 according to an embodiment is able to display indicators in real-time on the display units 2 in accordance with respective ingredients of food in a captured image captured by the image capture unit 3, and assist the dietary lifestyle of the user 8. Next, an operational process of an HMD 1 according to an embodiment will be described.
<2-2. Operational Process of HMD>
An HMD 1 according to an embodiment is worn by the user 8, and applies control to display indicators for respective ingredients in real-time while the user is eating. An indicator display process by such an HMD 1 will be specifically described hereinafter with reference to
(2-2-1. Indicator Display Process)
Next, in step S106, the type distinguishing unit 10a of the HMD 1 distinguishes a per-ingredient type of food in the image, on the basis of a captured image of food captured by the image capture unit 3. Specifically, the type distinguishing unit 10a distinguishes the types of respective ingredients on the basis of color and shape features of respective objects extracted from an image. The type distinguishing unit 10a outputs distinguished results to the indicator generator 10c.
Subsequently, in step S109, the indicator generator 10c generates indicators for respective ingredients, according to the types of respective ingredients distinguished by the type distinguishing unit 10a. Specifically, the indicator generator 10c extracts the designated indicator associated with a distinguished type of ingredient from the data for generating indicators that is stored in the storage unit 22, and generates it as the indicator for that ingredient. Note that the indicator generator 10c may also generate an indicator depending on a size or amount of the relevant ingredient, which is estimated on the basis of a captured image. The indicator generator 10c supplies a generated indicator to the output data processor 16.
Next, in step S112, the display controller 17 controls the display units 2 to display an image including indicators for respective ingredients supplied from the output data processor 16. For example, as illustrated in
Subsequently, in the case where the user gives display rejection instructions (S115/Yes), in step S118 the HMD 1 applies control to hide the indicators and display the food normally. Note that the normal display control for food may be a transparency control for the display units 2. Also, display rejection instructions from a user are given by voice input via the audio input unit 6 or by gesture input captured via the image capture unit 3, for example.
Next, in the case where the user gives display instructions for another indicator (S121/Yes), in step S124 the HMD 1 applies control to display another indicator. For example, the HMD 1 applies control to display a cholesterol display for respective ingredients as another indicator, at positions corresponding to the respective ingredients.
(2-2-2. Gaze-Dependent Indicator Display Process)
Although the indicator display process described above with reference to
Next, in step S136, the HMD 1 determines whether or not an eating advisor mode is set. In the example illustrated in
Subsequently, in the case where the eating advisor mode is not set (S136/No), in step S139 the HMD 1 applies control to display food normally.
On the other hand, in the case where the eating advisor mode is set (S136/Yes), in step S142 the HMD 1 conducts user gaze extraction (acquisition of gaze input information). Specifically, on the basis of an eye image captured by an image capture lens (not illustrated) installed at a position able to capture a user's eye while being worn, the captured image analyzer 13 tracks pupil movement, and outputs a tracking result to the main controller 10. The main controller 10 then extracts the orientation of the user's gaze on the basis of the pupil movement tracking result.
Next, in step S145, the main controller 10 focuses on an ingredient at the end of the user's gaze, on the basis of the orientation of the user's gaze and a captured image of food. In other words, the main controller 10 selects an ingredient that the user is looking at (a specific object) as a target from among food (multiple objects) in a captured image.
Subsequently, in step S148, the type distinguishing unit 10a distinguishes the type of the ingredient (a specific object) selected as a target.
Subsequently, in step S151, the indicator generator 10c generates an indicator, such as a calorie count, for example, depending on the distinguished type of ingredient.
Then, in step S154, the display controller 17 controls the display units 2 to display an image including an indicator for the ingredient being focused on that is supplied from the output data processor 16. In this way, an HMD 1 according to an embodiment is able to apply control to display an indicator for an ingredient that the user is looking at.
Note that in the case where the user gives display instructions for another indicator (S157/Yes), in step S160 the HMD 1 applies control to display another indicator for the ingredient being focused on. For example, the HMD 1 displays, on the display units 2, a numerical cholesterol value for the ingredient being focused on as another indicator.
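As a minimal illustration of the target selection in step S145, assuming the captured image analyzer yields an image position per ingredient and that the extracted gaze orientation has been projected to a point on the captured image (names hypothetical):

```python
import math

def select_target(gaze_point: tuple, ingredient_positions: dict) -> str:
    """Return the ingredient whose image position is closest to the
    point where the user's gaze falls on the captured image."""
    return min(ingredient_positions,
               key=lambda name: math.dist(gaze_point, ingredient_positions[name]))

# e.g. select_target((120, 80), {"leek": (100, 90), "pork liver": (200, 60)})
# -> "leek"
```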
(2-2-3. Upper Limit-Dependent Indicator Display Process)
Although the respective indicator display processes described above with reference to
Next, in step S206, the HMD 1 displays indicators for respective ingredients. Specifically, the HMD 1 executes the process illustrated from S103 to S112 of
Subsequently, in step S209, the main controller 10 of the HMD 1 recognizes an indicator for one mouthful of an ingredient eaten by the user. Specifically, on the basis of a captured image, the main controller 10 identifies an ingredient conveyed to the user's mouth by chopsticks, a spoon, a fork, or the like, and recognizes the indicator for that ingredient. Herein, an indicator for one mouthful (an additive value) is expressed as AEj.
Next, in step S212, the calculation unit 10f of the main controller 10 calculates the indicator value (the current value of AE) obtained by adding the one-mouthful value AEj to the accumulated AE (equal to AEt), and supplies the calculated result to the output data processor 16. Also, the calculation unit 10f may calculate the proportion (Q %) of the current value versus a preset intake upper limit value for a designated period. The intake upper limit value is an upper limit value on calorie intake in one day, an upper limit value on calorie intake in one week, an upper limit value on cholesterol in one day, or the like, for example. Such an upper limit value may also be set on the basis of a user's medical information and health information.
Subsequently, in step S215, the display controller 17 controls the display units 2 to display an image including the current value of AE (AE+AEj), or the proportion (Q %) of the current value versus the upper limit value, that is supplied from the output data processor 16. Thus, the user is able to recognize the current value (AE+AEj) or the proportion (Q %) of the current value versus the upper limit value for an indicator ingested up to the present, and respond, for example, by refraining from eating more.
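The arithmetic in steps S212 and S215 amounts to the following; variable names are chosen to match the notation above:

```python
def current_value(AEt: float, AEj: float) -> float:
    """Step S212: accumulate the one-mouthful indicator AEj onto AE (= AEt)."""
    return AEt + AEj

def proportion_of_limit(AE: float, upper_limit: float) -> float:
    """Step S215: Q %, the proportion of the current value of AE versus
    the preset intake upper limit value for the designated period."""
    return 100.0 * AE / upper_limit

# e.g. with a 2000 kcal daily upper limit, 1750 kcal accumulated, and a
# 50 kcal mouthful: proportion_of_limit(current_value(1750, 50), 2000) -> 90.0
```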
Subsequently, in step S218, the main controller 10 determines whether or not the user is continuing to eat. The main controller 10 determines that eating continues in the case where an action, such as the user scooping the next ingredient with a spoon, is extracted on the basis of a captured image captured by the image capture lens 3a, for example.
Next, in the case where eating does not continue and the meal has ended (S218/No), in step S221 the main controller 10 takes the AE (AE+AEj) calculated in the above S212 as the accumulated value AEt up to the present in the designated period, which is then saved in the storage unit 22 and displayed on the display units 2.
Subsequently, in the case where the user continues to eat (S218/Yes), in step S224 the main controller 10 determines whether or not the Q % displayed in the above S215 (the proportion of the current value versus the upper limit value) is 90% or greater.
Next, in the case of being below 90% (S224/No), in step S227 the main controller 10 displays the Q % displayed in the above S215 normally.
On the other hand, in the case of being 90% or greater (S224/Yes), in step S230 the main controller 10 determines whether or not the Q % displayed in the above S215 is 100+α% or greater. In other words, the main controller 10 determines whether or not the current value of AE has exceeded the upper limit value plus α.
Subsequently, in the case of being below 100+α% (S230/No), in step S236 the main controller 10 instructs the display controller 17 or the audio controller 18 to produce a warning display from the display units 2 or a warning announcement from the audio output unit 5. Thus, in the case where the current value of AE is between 90% and 100+α%, the HMD 1 issues a warning to the user, and is able to prompt the user to pay attention to his or her intake of a designated indicator (calories or cholesterol, for example).
On the other hand, in the case of being 100+α% or greater (S230/Yes), in step S233 the main controller 10 instructs the display controller 17 or the audio controller 18 to produce a stop display from the display units 2 or a stop announcement from the audio output unit 5. A stop notification has a higher alert level than a warning notification. For example, the main controller 10 may cause the display units 2 to display “STOP EATING” in large letters, or cause the audio output unit 5 to output a warning sound until the user stops eating.
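The branching in steps S224 through S233 can be summarized as a three-level notification policy; α (the tolerance above the upper limit) is kept as a parameter since its value is not specified:

```python
def notification_level(q_percent: float, alpha: float) -> str:
    """Steps S224-S233: choose a notification level from Q %.
    alpha is the tolerance above the upper limit (value unspecified)."""
    if q_percent < 90.0:
        return "normal"   # S227: display Q % normally
    if q_percent < 100.0 + alpha:
        return "warning"  # S236: warning display or announcement
    return "stop"         # S233: stop display, e.g. "STOP EATING"
```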
Subsequently, in step S239, the main controller 10 determines whether or not the user has eaten again. The main controller 10 determines that the user has eaten again in the case where an action, such as the user conveying a mouthful of an ingredient to his or her mouth, is extracted on the basis of a captured image captured by the image capture lens 3a, for example. In the case of eating again (S239/Yes), the main controller 10 again conducts the process illustrated in the above S209, and in the case of not eating again (S239/No), the process ends.
The above thus specifically describes an indicator display process according to an embodiment with reference to
Next, screen display examples according to an embodiment will be described with reference to
<3-1. Indicator Display>
First, a display example of indicators for respective ingredients will be described with reference to
Specifically, the display controller 17 displays an image 40 indicating that a captured food is being recognized, like on the display screen P2 illustrated in
Subsequently, in the case of no retry instructions, the main controller 10 distinguishes the types of respective ingredients in a captured image with the type distinguishing unit 10a, and displays, on the display units 2, indicators for respective ingredients generated by the indicator generator 10c according to the distinguished types. Specifically, the main controller 10 displays an indicator image 33a indicating the calories and masses of respective ingredients, like on the display screen P5 illustrated in
Also,
Also, an indicator table according to an embodiment is not limited to the indicator table illustrating calories and masses for respective ingredients illustrated in
Furthermore, a main controller 10 according to an embodiment is capable of displaying an indicator for an ingredient that a user is about to eat near that ingredient, and also moving the display position of the indicator according to the positional movement of the ingredient during eating. Herein,
Like on the display screen P9 illustrated in
Furthermore, as an ingredient of an eating target comes closer in conjunction with the user's eating actions, a display controller 17 according to an embodiment likewise moves the display position of the image 32d illustrating the indicator according to the movement of the ingredient, like on the display screen P10 illustrated in
<3-2. Suitability/Unsuitability Display>
The above thus describes indicator screen display examples in detail and with reference to
The display controller 17 then applies control to display an image 44a indicating that pork liver is an unsuitable ingredient, and an image 44b indicating that bean sprouts are a suitable ingredient, like on the display screen P11 illustrated in
<3-3. Display of Calculated Indicator Based on Accumulated Indicator>
Next, the display of a calculated indicator by an HMD 1 according to an embodiment will be described. As discussed above, the main controller 10 of an HMD 1 according to an embodiment includes an accumulation controller 10e and a calculation unit 10f, and the accumulation controller 10e accumulates indicators. Also, the calculation unit 10f calculates a new indicator value based on an accumulated indicator and an indicator currently generated by the indicator generator 10c. The new indicator value is a total intake indicator for a designated period or a remaining future available intake indicator, for example. Subsequently, the display controller 17 applies control to display the calculated new indicator. Hereinafter, the display of a calculated indicator will be specifically described with reference to
Subsequently, if the user starts eating, the indicator generator 10c of the main controller 10, on the basis of a captured image captured by the image capture unit 3, generates a calorie count corresponding to (one mouthful of) an ingredient eaten by the user, which is supplied to the accumulation controller 10e. The accumulation controller 10e accumulates the calorie count of one mouthful eaten by the user in the storage unit 22. Next, the calculation unit 10f subtracts the calorie count accumulated in the storage unit 22 since the start of eating, as well as a calorie count currently generated by the indicator generator 10c (the currently ingested calorie count), from the calorie count of the food, and calculates a remaining calorie count. The calculation unit 10f supplies the remaining calorie count calculated in this way to the output data processor 16. The display controller 17 then applies control to display an image 36a that illustrates the remaining calorie count supplied from the output data processor 16 as a bar enabling comparison with the total calorie count of the food, like on the display screen P14 illustrated in
In the example described with reference to
<3-4. Display of Preparation Method-Dependent Indicators>
Next, the display of preparation method-dependent indicators according to an HMD 1 of an embodiment will be described. As discussed above, the main controller 10 of an HMD 1 according to an embodiment includes a preparation method distinguishing unit 10b; the preparation method distinguishing unit 10b distinguishes the preparation method of a food, and the indicator generator 10c re-generates indicators for respective ingredients according to the distinguished preparation method. Thus, it is possible to display indicators that also account for changes due to the preparation method. Hereinafter, the display of preparation method-dependent indicators will be specifically described with reference to
As discussed above, with an HMD 1 according to an embodiment, it is possible to present an indicator depending on a type of food in real-time while a user is eating.
Also, the HMD 1 may also provide a suitability/unsuitability display for respective ingredients included in the food.
Also, the HMD 1 may also present an indicator that is newly calculated on the basis of an accumulated indicator.
Furthermore, the HMD 1 may also re-generate and present an indicator depending on the dish preparation method.
The foregoing thus describes embodiments of the present technology in detail and with reference to the attached drawings. However, the present disclosure is not limited to such examples. It is clear to persons ordinarily skilled in the technical field of the present disclosure that various modifications or alterations may occur insofar as they are within the scope of the technical ideas stated in the claims, and it is to be understood that such modifications or alterations obviously belong to the technical scope of the present disclosure.
For example, it is possible to create a computer program for causing hardware such as a CPU, ROM, and RAM built into the HMD 1 to exhibit the functionality of the HMD 1 discussed earlier. A computer-readable storage medium made to store such a computer program is also provided.
Also, in the above respective embodiments, although an HMD 1 is used as an example of an information processing device, an information processing device according to an embodiment is not limited to an HMD 1, and may also be a display control system formed from a smartphone and an eyeglasses-style display, for example. The smartphone (information processing device) is connectable to the eyeglasses-style display in a wired or wireless manner, and is able to transmit and receive data.
Herein, the eyeglasses-style display includes a wearing unit having a frame structure that wraps halfway around the back of the head from either side of the head, and is worn by a user by being placed on the pinna of either ear, similarly to the HMD 1 illustrated in
Also, the eyeglasses-style display is provided with an image capture lens for capturing the user's gaze direction while in the worn state, similarly to the HMD 1 illustrated in
The smartphone (information processing device) includes functions similar to the main controller 10, and distinguishes respective ingredients of food from a captured image, and generates an image illustrating indicators for distinguished ingredients. Additionally, the smartphone (information processing device) transmits a generated image to the eyeglasses-style display, and an image illustrating indicators for respective ingredients is displayed on the display units of the eyeglasses-style display.
Application is also conceivable to an eyeglasses-style device that, although similar in shape to an eyeglasses-style display, does not include display functions. In this case, food is captured by a camera, provided on the eyeglasses-style device, that captures the wearer's (the user's) gaze direction, and a captured image is transmitted to the smartphone (information processing device). Subsequently, the smartphone (information processing device) generates an image illustrating indicators for respective ingredients of the food depicted in the captured image, which is displayed on a display of the smartphone.
Furthermore, although the foregoing embodiments described the type distinguishing unit 10a distinguishing types of respective ingredients and the preparation method distinguishing unit 10b distinguishing a preparation method on the basis of a captured image analysis result from the captured image analyzer 13 of the HMD 1, such a captured image analyzing process may also be conducted in the cloud. The HMD 1 sends a captured image of a dish to the cloud via the communication unit 21, receives a result that has been analyzed in the cloud (on an analysis server, for example), and on the basis thereof, conducts various distinguishing with the type distinguishing unit 10a and the preparation method distinguishing unit 10b.
Additionally, the present technology may also be configured as below.
Priority application: JP 2013-039355, filed February 2013 (Japan, national).
PCT filing document: PCT/JP2014/000431, filed January 28, 2014 (WO).