The present disclosure relates to a heating cooker that carries out heating control based on a result of image recognition, a method for controlling the heating cooker, and a heating cooking system.
A microwave oven, a typical example of a heating cooker, usually requires the user to input a heating time before cooking starts.
A large number of techniques have been developed for setting the heating time automatically. For instance, Patent Literature 1 discloses a technique that analyzes an image of a heat target object shot before cooking, thereby selecting a cooking method.
Patent Literature 1: Japanese Utility Model No. 3036671
Such a conventional technique has a drawback in that character recognition cannot be carried out on the shot image depending on the state inside the heating chamber (e.g. when the interior of the heating chamber is dim due to poor lighting).
The present disclosure addresses the foregoing problem. One aspect of the present disclosure is a heating cooker that includes a heating chamber for accommodating a heat target object, an imaging section for shooting an image of the interior of the heating chamber, and a controller for setting a shooting condition for shooting the image of the interior of the heating chamber and then carrying out image recognition on the image. The controller analyzes the image of the interior of the heating chamber to recognize a state inside the heating chamber, and then changes the shooting condition in response to the state inside the heating chamber.
According to this aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A first aspect of the heating cooker of the present disclosure shows that the heating cooker includes a heating chamber for accommodating a heat target object, a shooting section for shooting an image of the interior of the heating chamber, and a controller for setting a shooting condition for shooting the image of the interior of the heating chamber and then carrying out image recognition on the image. The controller analyzes the image of the interior of the heating chamber to recognize the state inside the heating chamber, and then changes the shooting condition in response to the state inside the heating chamber before shooting.
According to this first aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
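A minimal sketch of this control flow is given below in Python. The camera and recognizer objects, their shoot(), apply(), and recognize() methods, and the brightness thresholds are assumptions for illustration only; the disclosure does not prescribe any particular implementation.

```python
# Illustrative sketch (not the disclosed implementation): the controller looks at a
# first image, judges the state inside the chamber from its mean brightness, picks a
# shooting condition, and shoots again before running recognition.

def mean_brightness(image):
    """image: 2-D list of 0-255 grey levels of the interior of the heating chamber."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def condition_for(state):
    # Hypothetical mapping from the recognized chamber state to a shooting condition.
    return {
        "too_dark": {"lighting_lux": 500, "exposure": "long"},
        "halation": {"lighting_lux": 100, "exposure": "short"},
        "normal":   {"lighting_lux": 200, "exposure": "auto"},
    }[state]

def recognize_with_adjustment(camera, recognizer):
    preview = camera.shoot()                   # first image of the interior
    level = mean_brightness(preview)
    state = "too_dark" if level < 40 else ("halation" if level > 220 else "normal")
    camera.apply(condition_for(state))         # change the shooting condition per the state
    image = camera.shoot()                     # shoot again under the new condition
    return recognizer.recognize(image)         # e.g. character recognition of a label
```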
A second aspect of the heating cooker of the present disclosure shows that the heating cooker described in the first aspect further includes a lighting device that lights the interior of the heating chamber. The shooting condition contains a setting for the shooting section during the shooting and a setting for the lighting device during the lighting. According to this second aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A third aspect of the heating cooker of the present disclosure shows that the controller of the heating cooker in the first aspect specifies a target region for the image recognition in the image of the interior of the heating chamber, and carries out the image recognition on the target region in the image shot again after the shooting condition has been changed.
According to this third aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A fourth aspect of the heating cooker of the present disclosure shows that the shooting condition includes a sharpness of the image. To be more specific, the controller sets the sharpness of the image shot for the image recognition to a higher level than the sharpness of the image shot for specifying the target region. According to this fourth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A fifth aspect of the heating cooker of the present disclosure shows that the controller of the heating cooker in the first aspect changes the shooting condition and shoots the image again until the image recognition has failed a given number of times.
According to this fifth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A sixth aspect of the heating cooker of the present disclosure shows that the image recognition in the first aspect includes a character recognition and an object recognition. The controller sets the shooting condition for shooting the image for the object recognition differently from the shooting condition for shooting the image for the character recognition.
According to this sixth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A seventh aspect of the heating cooker of the present disclosure focuses on a control method. The method includes the steps of: shooting an image of the interior of the heating chamber that accommodates a heat target object; analyzing the image to recognize a state inside the heating chamber; changing the shooting condition in response to the state inside the heating chamber; and carrying out an image recognition on the image shot based on the changed shooting condition.
According to this seventh aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
An eighth aspect of the heating cooker of the present disclosure focuses on the control method. The method in the seventh aspect further includes the step of specifying a target region for the image recognition in the image of the interior of the heating chamber that accommodates the heat target object. In the step of carrying out the image recognition, the image recognition is carried out on the target region in the image shot based on the changed shooting condition.
According to this eighth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A ninth aspect of the heating cooker of the present disclosure focuses on the control method in the seventh aspect. In the step of changing the shooting condition, the shooting condition is changed until the image recognition has failed a given number of times.
According to this ninth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
A tenth aspect of the heating cooker of the present disclosure focuses on the control method. The image recognition in the seventh aspect includes a character recognition and an object recognition. In the step of changing the shooting condition, the shooting condition for shooting the image for the object recognition is set differently from that for shooting the image for the character recognition.
According to this tenth aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
An eleventh aspect of the present disclosure shows a heating cooking system that includes a heating cooker and an information processing device. The heating cooker includes a heating chamber for accommodating a heat target object, a shooting section for shooting an image of the interior of the heating chamber, a controller for setting a shooting condition for shooting the image of the interior of the heating chamber and then carrying out an image recognition on the image, and a communicator for communicating the image shot by the shooting section and the shooting condition set by the controller. The information processing device includes a communication section that communicates the image and the shooting condition, and a control section that analyzes the image received by the communication section to recognize a state inside the heating chamber, sets the shooting condition in response to the state inside the heating chamber, and transmits the set shooting condition from the communication section.
According to this eleventh aspect, the heating cooker, which carries out the image recognition on the image of the interior of the heating chamber, can increase the accuracy of the image recognition.
The exemplary embodiments of the present disclosure are demonstrated hereinafter with reference to the accompanying drawings.
Display section 102 displays information (e.g. heating time) related to the operation of heating cooker 100a. Menu selecting section 103 has multiple buttons, such as a start button, to be used by users to input details of setting (e.g. heating time). Menu selecting section 103 can be formed of a touch panel.
Lighting 105 is placed deep in an opening provided on a lateral wall of heating chamber 101 for lighting an interior of heating chamber 101. Camera 106 is placed deep in an opening provided on a ceiling of heating chamber 101 for shooting inside heating chamber 101.
A control circuit (not shown) controls lighting 105 and camera 106, thereby allowing heating cooker 100a to obtain information such as a heating degree of heat target object 107a in a form of images.
Camera 106 is disposed close to the left end of the ceiling of heating chamber 101; nevertheless, as long as camera 106 can shoot the entire inside of heating chamber 101, it can be disposed at another place (e.g. on the lateral wall of heating chamber 101). Camera 106 placed at the center of the ceiling can shoot heat target object 107a from directly above. Camera 106 can include multiple shooting sections.
Controller 200 includes shoot control section 204, lighting control section 208, heating control section 212, internal state control section 213, state analyzing section 215, image processing section 216, and recognizing section 217.
In this first embodiment, controller 200 is formed of a microcomputer. The present disclosure is not limited to this instance; nevertheless, use of a programmable microprocessor allows the content to be processed to be changed with ease, and increases the degree of freedom in design.
In order to increase the processing speed, the structural elements of controller 200 can be formed of a logic circuit. These structural elements can be physically formed of a single element or of multiple elements. In the case of using multiple elements, each of the structural elements can correspond to one of the multiple elements; in this case, each of the multiple elements can be regarded as working as one of the structural elements.
Shooting section 201 is equivalent to camera 106 described above.
Shoot setting section 202 is included in menu selecting section 103 and is used for setting a shoot. Shoot information administration section 203 includes nonvolatile memory for storing the content set by shoot setting section 202.
Shoot control section 204 controls shooting section 201 in response to the set content stored in shoot information administration section 203, which stores the images shot by shooting section 201.
Lighting device 205 is equivalent to lighting 105 described above.
Lighting setting section 206 is included in menu selecting section 103 and is used for setting the lighting. Lighting information administration section 207 includes a nonvolatile memory for storing the content set by lighting setting section 206. Lighting control section 208 controls lighting device 205 in response to the set content stored in lighting information administration section 207.
Heating section 209 is formed of a magnetron and other components for heating heat target object 107a. Heating setting section 210 is included in menu selecting section 103 and is used for setting details of the heating.
Heating information administration section 211 includes a nonvolatile memory for storing the details set by heating setting section 210. Heating control section 212 controls heating section 209 in response to the set details stored in heating information administration section 211.
Internal state control section 213 controls shoot control section 204 and lighting control section 208 in response to an internal state such as whether or not door 104 is open or whether or not heating section 209 is in operation, and transmits necessary information to heating control section 212.
Internal state administration section 214 includes a nonvolatile memory for storing the information about the state inside heating chamber 101 (hereinafter simply referred to as the ‘internal state’), and transmits the information about the internal state to internal state control section 213 in response to a change in the internal state. The internal state includes information such as whether or not heat target object 107a is present in heating chamber 101 and whether or not door 104 is closed.
State analyzing section 215 analyzes the internal state based on the information supplied from internal state administration section 214, and then determines whether or not the image recognition can be implemented. The state in which the image recognition can be implemented refers to a state in which heat target object 107a is present in heating chamber 101 and door 104 is closed. The image recognition includes the character recognition, the barcode recognition, and the object recognition.
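A hedged sketch of this readiness check is shown below; the field names of the internal state are assumptions for illustration, while the two conditions themselves (heat target object present, door closed) follow the description above.

```python
from dataclasses import dataclass

@dataclass
class InternalState:
    # Field names are illustrative; the disclosure names only the conditions themselves.
    door_closed: bool
    object_present: bool

def recognition_ready(state: InternalState) -> bool:
    # State analyzing section 215 allows the image recognition only when a heat
    # target object is present in heating chamber 101 and door 104 is closed.
    return state.object_present and state.door_closed
```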
In the case where the image recognition is implementable in the internal state, image processing section 216 analyzes the image shot by shooting section 201 to specify the region that possibly includes characters regarding heating information. Hereinafter this specified region is referred to as a target region to be used for the image recognition such as the character recognition.
Recognizing section 217 carries out the image recognition on the target region. When recognizing section 217 carries out the image recognition normally, heating control section 212 controls heating section 209 in response to the information obtained through the image recognition.
When heat target object 107a, for example a lunch box (refer to
The items for controlling hardware include a location and a focus of shooting section 201.
In the instance discussed above, set-value table 601 lists two candidates for item ‘sharpness’; nevertheless, table 601 can list four states (e.g. maximum, medium, minimum, and OFF). Item ‘contrast’ is likewise not limited to the above instance.
Tables 601 and 701 are controlled by shoot information administration section 203.
Item ‘brightness (Lux)’ is set to any one of the following six states, viz. ‘0’, ‘50’, ‘100’, ‘200’, ‘500’, and ‘1000’. These six states correspond to ‘0x0000’, ‘0x0032’, ‘0x0064’, ‘0x00C8’, ‘0x01F4’, and ‘0x03E8’, respectively.
In the instance discussed above, set-value table 901 lists six states for item ‘brightness’; nevertheless, the number of states can be other than six. Set-value table 901 does not necessarily include predetermined states; instead, a user can input any value into table 901.
Set-value table 901 is administered by lighting information administration section 207.
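For illustration, the set-value tables could be held as simple lookup tables, as in the sketch below. The brightness codes follow the values listed above, while the register codes for ‘sharpness’ are assumptions, not values from the disclosure.

```python
# Sketch of set-value tables 601 and 901 as lookup tables.

SHARPNESS = {"ON": 0x01, "OFF": 0x00}   # table 601: two candidates (codes assumed)

BRIGHTNESS_LUX = {                      # table 901: brightness in lux -> set value
    0: 0x0000,
    50: 0x0032,
    100: 0x0064,
    200: 0x00C8,
    500: 0x01F4,
    1000: 0x03E8,
}

def brightness_code(lux: int) -> int:
    """Return the set value for a requested brightness, e.g. brightness_code(200) == 0x00C8."""
    return BRIGHTNESS_LUX[lux]
```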
In step S1002, when state analyzing section 215 determines that the internal state is not yet ready for the image recognition, then the process returns to step S1001. On the other hand, when state analyzing section 215 determines that the internal state is ready for the image recognition, the process moves to step S1003, where shoot control section 204 changes the shooting condition if necessary, and shooting section 201 shoots the image inside heating chamber 101.
In step S1004, image processing section 216 specifies the target region in the shot image. In step S1005, recognizing section 217 carries out the character recognition on the specified target region.
In step S1006, when recognizing section 217 determines that either the target region specification or the character recognition has not been carried out normally, the process moves to step S1007, where internal state control section 213 urges the user to input a cooking condition manually. When state analyzing section 215 recognizes the input via menu selecting section 103, the cooking condition is changed in response to the input.
In step S1006, when recognizing section 217 determines that the character recognition has been carried out normally, the cooking condition corresponding to the recognition result is input automatically.
When the input of the cooking condition is finished in step S1006 or step S1007, the process moves to step S1008, where internal state control section 213 urges the user to start cooking by pressing the start button or by another operation. When state analyzing section 215 recognizes that the start button has been pressed, heating control section 212 controls heating section 209 such that heating section 209 starts cooking following the set cooking condition.
In step S1009, internal state control section 213 continues monitoring the internal state until heat target object 107a reaches a given state. When heat target object 107a reaches the given state, heating control section 212 prompts heating section 209 to stop cooking.
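The flow of steps S1001 through S1009 can be summarized in Python as below. The hardware objects (camera, recognizer, heater, ui), their methods, and the label format parsed by parse_cooking_condition are hypothetical stand-ins for the sections of controller 200; only the step structure is taken from the description above.

```python
import re

def parse_cooking_condition(text):
    # Hypothetical parser for a label such as "500W 2min" (the label format is an assumption).
    watts = int(re.search(r"(\d+)\s*W", text).group(1))
    minutes = float(re.search(r"(\d+(?:\.\d+)?)\s*min", text).group(1))
    return {"watts": watts, "seconds": int(minutes * 60)}

def first_embodiment_flow(camera, recognizer, heater, ui):
    while not ui.recognition_ready():              # S1001-S1002: wait until door 104 is closed
        pass                                       # and heat target object 107a is present

    image = camera.shoot()                         # S1003: shoot, changing the condition if needed
    region = recognizer.find_label_region(image)   # S1004: specify the target region
    text = recognizer.read_characters(image, region) if region else None  # S1005

    if text is None:                               # S1006 not carried out normally
        condition = ui.ask_manual_input()          # S1007: the user inputs the cooking condition
    else:
        condition = parse_cooking_condition(text)  # S1006: condition set automatically

    ui.wait_for_start_button()                     # S1008: the user presses the start button
    heater.start(condition)
    while not heater.target_state_reached():       # S1009: heat until the given state is reached
        pass
    heater.stop()
```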
In the instance discussed above, the character recognition is carried out on food label 400. This character recognition can be replaced with object recognition of heat target object 107a, recognition of individual food items in heat target object 107a, or barcode recognition of food label 400.
A heating time and a heating wattage are the typical target information for the character recognition; nevertheless, the target information for the character recognition can be other characters printed on food label 400, such as an expiry date, the type of contents of the lunch box, a product name, calories, and a price.
In this first embodiment, the character recognition starts upon closing door 104; nevertheless, the character recognition can start upon opening door 104 or upon pressing the start button.
In this first embodiment, the settings in the set-item tables can be changed for specifying the target region and for the image recognition. Heating cooker 100a can be structured such that display section 102 displays the image of the interior of heating chamber 101 when the inside of heating chamber 101 is shot.
Hereinafter, the set items for shooting section 201 (set-item table 501) and the set items for lighting device 205 (set-item table 801) are collectively referred to as shooting condition.
Heating cooker 100b in accordance with the second embodiment is demonstrated hereinafter.
Heating cooker 100b has almost the same structure as that of heating cooker 100a in accordance with the first embodiment.
The method for controlling the lightings in accordance with this second embodiment basically follows the same processes as those of the first embodiment.
In step S1003, the lighting is controlled in response to the location of heat target object 107a so that halation is prevented in the shot image.
This second embodiment shows that controlling the lighting in response to the location of heat target object 107a prevents halation. As a result, the accuracy of the image recognition can be improved.
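Such lighting control might, for instance, select which of lightings 105a to 105d to turn on from the position of heat target object 107a in the preview image. The sketch below assumes a left/right layout of the four lightings that is not specified in the disclosure; it is illustration only.

```python
def select_lightings(target_center_x: float, image_width: int) -> list:
    """Turn on the lightings on the side away from heat target object 107a so that
    specular reflection (halation) on food label 400 is avoided."""
    on_left_half = target_center_x < image_width / 2
    # Assumed layout: 105a and 105b on the left wall, 105c and 105d on the right wall.
    return ["105c", "105d"] if on_left_half else ["105a", "105b"]

def apply_lighting(lighting_device, target_center_x: float, image_width: int) -> None:
    selected = select_lightings(target_center_x, image_width)
    for lamp in ("105a", "105b", "105c", "105d"):
        lighting_device.set(lamp, on=(lamp in selected))
```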
Heating cooker 100c in accordance with the third embodiment is demonstrated hereinafter.
Heating cooker 100c has a structure similar to that of heating cooker 100a in accordance with the first embodiment.
In this third embodiment, the character recognition is carried out on food label 400 attached to heat target object 107a.
The operation of heating cooker 100c in accordance with the third embodiment is demonstrated hereinafter with reference to a flowchart.
This flowchart has the same steps as those of the first embodiment except for the steps described below.
In step S1002, when state analyzing section 215 determines that the internal state is ready for the image recognition, the process moves to step S1501, where shoot control section 204 puts item ‘sharpness’ listed in set-item table 501 into the OFF state, and shooting section 201 then shoots the image of the interior of heating chamber 101.
In step S1502, image processing section 216 specifies a target region, which possibly contains the characters related to heating information, from the shot image.
In step S1503, shoot control section 204 puts item ‘sharpness’ in ON state and then shoots the image inside heating chamber 101 again.
The image shot in step S1503 undergoes the character recognition in step S1005 with respect to the target region specified in step S1502. Use of the image shot with item ‘sharpness’ staying in ON state will achieve the character recognition with a higher accuracy.
In this third embodiment, item ‘sharpness’ is put in OFF state in order to specify the target region, and item ‘sharpness’ is put in ON state in order to carry out the character recognition; nevertheless the third embodiment is not limited to this instance.
Other software processes, such as noise reduction, contrast, and white balance, can be used for specifying the target region and carrying out the character recognition. Multiple software processes can be combined. At least one hardware process can be added to the software process.
A change in the setting can be omitted in step S1501 or step S1503 to shorten a process time.
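The two-pass shooting of steps S1501 to S1503 can be sketched as follows, with hypothetical camera, image-processing, and recognizer objects; only the ordering (sharpness OFF to specify the target region, sharpness ON to read it) is taken from the description above.

```python
def two_pass_character_recognition(camera, image_processor, recognizer):
    camera.set_sharpness(False)                        # S1501: item 'sharpness' OFF
    rough = camera.shoot()                             # shoot the interior of heating chamber 101
    region = image_processor.find_label_region(rough)  # S1502: specify the target region

    camera.set_sharpness(True)                         # S1503: item 'sharpness' ON
    sharp = camera.shoot()                             # shoot the interior again
    return recognizer.read_characters(sharp, region)   # S1005: recognize within the target region
```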
Heating cooker 100d in accordance with the fourth embodiment is demonstrated hereinafter.
Heating cooker 100d has a structure similar to that of heating cooker 100a in accordance with the first embodiment.
In this fourth embodiment, a target region for the character recognition is extracted from food label 400.
In step S1006, when recognizing section 217 determines that either the target region specification or the character recognition has not been carried out normally, the process moves on to step S1701, where recognizing section 217 determines whether or not the character recognition has failed a given number of times.
Until the character recognition has failed the given number of times, internal state control section 213 changes the shooting condition in step S1702 in response to the cause of the failure.
In step S1003, the inside of heating chamber 101 is shot based on the changed shooting condition. In step S1701, when recognizing section 217 determines that the character recognition has failed the given number of times, the process moves on to step S1007 discussed above.
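A sketch of this retry loop is given below; the maximum number of failures and the way the next shooting condition is chosen from the cause of the failure are assumptions, since the disclosure only states that the condition is changed until the recognition has failed a given number of times.

```python
MAX_FAILURES = 3  # the "given number of times" is not specified; 3 is only an example

def recognize_with_retries(camera, image_processor, recognizer):
    for _ in range(MAX_FAILURES):
        image = camera.shoot()                              # S1003
        region = image_processor.find_label_region(image)   # S1004
        text = recognizer.read_characters(image, region) if region else None  # S1005
        if text is not None:                                # S1006: carried out normally
            return text
        camera.apply(next_condition(region))                # S1702: change the shooting condition
    return None                                             # fall back to manual input (S1007)

def next_condition(region):
    # Hypothetical policy: raise the brightness when no region was found,
    # otherwise lower it to reduce glare on food label 400.
    return {"lighting_lux": 500} if region is None else {"lighting_lux": 100}
```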
In this fourth embodiment, in step S1006, a determination by recognizing section 217 that either the target region specification or the character recognition has not been carried out normally prompts internal state control section 213 to change the shooting condition. Alternatively, when image processing section 216 fails in specifying the target region or when recognizing section 217 fails in the character recognition, the shooting condition can be changed each time such a failure occurs, before the next shooting.
Here is a more specific instance:
Heat target object 107a sometimes does not fit into the shooting frame depending on its location or the situation during the shooting, or food label 400 cannot be read well due to reflection of the lighting. Depending on the locations of shooting section 201 and lighting device 205, the focus and the iris cannot always be controlled appropriately over the entire space in heating chamber 101.
This fourth embodiment proposes an idea to overcome the problem discussed above, viz. state analyzing section 215 recognizes the location of the target region in the image, and then changes the shooting condition in response to the location.
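As an illustration of changing the condition from the location of the target region, a sketch such as the following could be used; the edge margin and the zoom parameters are assumptions, not values from the disclosure.

```python
def condition_from_region_location(region, image_width: int, image_height: int) -> dict:
    """region: (x0, y0, x1, y1) bounding box of the target region in the shot image."""
    x0, y0, x1, y1 = region
    margin = 0.05  # assumed: treat the outer 5 % of the frame as "near the edge"
    near_edge = (x0 < margin * image_width or y0 < margin * image_height or
                 x1 > (1 - margin) * image_width or y1 > (1 - margin) * image_height)
    if near_edge:
        # The label may be partly cut off: zoom out and re-center before the next shot.
        return {"zoom": "wide", "center_on": ((x0 + x1) / 2, (y0 + y1) / 2)}
    return {"zoom": "keep"}
```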
The heating cooker in accordance with the fifth embodiment is demonstrated hereinafter.
Heating cooker 100e has a structure similar to that of heating cooker 100a in accordance with the first embodiment.
In step S2101, the recognition mode is set to the character recognition.
In step S2102, state analyzing section 215 recognizes the internal state of heating cooker 100e.
In step S2103, when state analyzing section 215 determines that door 104 is closed, state analyzing section 215 determines which of the character recognition and the object recognition should be used, in accordance with the recognition mode.
Since the recognition mode is initially set to the character recognition, internal state control section 213 changes the setting to that for the character recognition in step S2105. Then shoot control section 204 prompts shooting section 201 to shoot the image inside heating chamber 101.
In step S2106, image processing section 216 specifies a target region for the character recognition. In step S2107, recognizing section 217 carries out the character recognition on the target region.
In step S2108, when recognizing section 217 determines that the character recognition has been carried out normally, the cooking details corresponding to the result of the character recognition are input automatically. In step S2108, if recognizing section 217 determines that either the target region specification or the character recognition has not been carried out normally, the process moves on to step S2109, where the object recognition is selected as the recognition mode. Then the process returns to step S2102.
In step S2102, state analyzing section 215 recognizes the internal state of heating cooker 100e. In step S2103, when state analyzing section 215 determines that door 104 is closed, state analyzing section 215 confirms the recognition mode in step S2104. Then the process moves on to step S2110.
This time, since the recognition mode is set to the object recognition, in step S2110 internal state control section 213 changes the setting to that for the object recognition. Then shoot control section 204 prompts shooting section 201 to shoot the image inside heating chamber 101.
In step S2111, image processing section 216 specifies a target region for the object recognition. In step S2112, recognizing section 217 carries out the object recognition on the target region.
In step S2113, when recognizing section 217 determines that either the target region specification or the object recognition has not been carried out normally, internal state control section 213 urges the user to input the setting manually. When state analyzing section 215 recognizes the input done through menu selecting section 103, the setting following the input is implemented.
In step S2113, when recognizing section 217 determines that the object recognition is carried out normally, the cooking details in response to the object recognition are input automatically.
When the input of the setting is finished in step S2113 or step S2114, the process moves on to step S2115, where internal state control section 213 urges the user to start cooking by pressing the start button or by another action. When state analyzing section 215 recognizes that the start button has been pressed, heating control section 212 controls heating section 209 such that it starts cooking in response to the set cooking details.
In step S2116, internal state control section 213 continues monitoring the internal state until heat target object 107b reaches a given state. When heat target object 107b reaches the given state, heating control section 212 finishes the cooking.
As discussed above, in this fifth embodiment the shooting settings for the character recognition are prepared differently from those for the object recognition, so that a more accurate image recognition can be achieved.
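This mode switch can be sketched as a short loop over the two prepared shooting conditions, as below; the condition values themselves are placeholders and not values from the disclosure.

```python
CHARACTER_CONDITION = {"sharpness": "ON", "lighting_lux": 500}   # assumed values
OBJECT_CONDITION    = {"sharpness": "OFF", "lighting_lux": 200}  # assumed values

def recognize_with_mode_switch(camera, image_processor, recognizer):
    for mode, condition in (("character", CHARACTER_CONDITION),
                            ("object", OBJECT_CONDITION)):
        camera.apply(condition)                                  # S2105 / S2110
        image = camera.shoot()
        region = image_processor.find_region(image, mode)        # S2106 / S2111
        result = recognizer.recognize(image, region, mode) if region else None  # S2107 / S2112
        if result is not None:                                   # S2108 / S2113: carried out normally
            return mode, result
    return None, None                                            # fall back to manual input (S2114)
```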
The heating cooking system in accordance with the sixth embodiment is demonstrated hereinafter.
This sixth embodiment includes heating cooker 100f, which has a structure similar to that of heating cooker 100a in accordance with the first embodiment, and an information processing device.
Heating cooker 100f differs from heating cooker 100a in the presence of a communicator, and in the absence of functions of state analyzing section 215, image processing section 216, and recognizing section 217 in controller 200. The communicator connects, via a network, heating cooker 100f to an information processing device such as a portable terminal or an external server.
In this sixth embodiment, the information processing device implements the functions of state analyzing section 215, image processing section 216, and recognizing section 217.
The information processing device thus includes a communicating section and a control section in order to implement the foregoing functions. The communicating section of the information processing device receives the image of the interior of heating chamber 101 from the communicator of heating cooker 100f. The control section of the information processing device analyzes the received image, thereby recognizing the state inside heating chamber 101, and sets the shooting condition in response to the state inside heating chamber 101. The communicating section of the information processing device then transmits the set shooting condition to the communicator of heating cooker 100f.
The image recognition in this sixth embodiment is carried out by the external information processing device, viz. the heating cooking system is formed of heating cooker 100f and the information processing device.
In this heating cooking system, the information processing device can include not only the functions of state analyzing section 215, image processing section 216, and recognizing section 217, but also functions of other structural elements included in controller 200.
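A minimal sketch of the exchange between heating cooker 100f and the information processing device is given below. The JSON message layout, the HTTP-independent helper functions, and the analysis performed on the device side are assumptions; the disclosure only requires that the image is sent out and that a shooting condition is returned.

```python
import base64, json

def build_image_message(image_bytes: bytes, current_condition: dict) -> str:
    # Cooker side: the communicator sends the shot image and the current condition.
    return json.dumps({
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "shooting_condition": current_condition,
    })

def handle_image_message(message: str) -> str:
    # Information processing device side: analyze the image, recognize the state
    # inside heating chamber 101, and return the shooting condition to apply.
    payload = json.loads(message)
    image_bytes = base64.b64decode(payload["image"])
    state = analyze_chamber_state(image_bytes)
    return json.dumps({"shooting_condition": condition_for_state(state)})

def analyze_chamber_state(image_bytes: bytes) -> str:
    # Placeholder analysis: judge darkness from the average byte value of a raw
    # greyscale image (a real device would decode and inspect the actual image).
    return "too_dark" if sum(image_bytes) / max(len(image_bytes), 1) < 40 else "normal"

def condition_for_state(state: str) -> dict:
    return {"too_dark": {"lighting_lux": 1000}, "normal": {"lighting_lux": 200}}[state]
```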
A display section provided to the portable terminal can display the image of the interior of the heating chamber. Here is another instance, where the display section of the portable terminal can display a shape, characters, and a barcode of an object recognized by the portable terminal or the external server. A display section provided to the heating cooker can display the shape, characters, and the barcode of the object recognized.
The present disclosure is useful for professional heating cookers used in convenience stores and catering businesses. It is also useful for home-use heating cookers.
100a, 100b, 100c, 100d, 100e, 100f heating cooker
101 heating chamber
102 display section
103 menu selecting section
104 door
105, 105a, 105b, 105c, 105d lighting
106 camera
107a, 107b heat target object
200 controller
201 shooting section
202 shoot setting section
203 shoot information administration section
204 shoot control section
205 lighting device
206 lighting setting section
207 lighting information administration section
208 lighting control section
209 heating section
210 heating setting section
211 heating information administration section
212 heating control section
213 internal state control section
214 internal state administration section
215 state analyzing section
216 image processing section
217 recognizing section
301, 302, 401 target region
400 food label
Priority application: JP 2015-231284, filed November 2015 (national).
International filing: PCT/JP2016/004918, filed November 18, 2016 (WO).