The present technology relates to an information processing apparatus, an information processing method, and a program, and particularly relates to an information processing apparatus, an information processing method, and a program capable of appropriately estimating an internal temperature of a food material being heated.
When a lump of meat such as steak meat is cooked in a frying pan, it is very difficult to accurately control the degree of cooking inside the food material. A skilled chef has the skill of pressing the meat with fingers during heating to judge the degree of cooking of the inside, but even such a chef cannot always make a perfect judgement for a food material having a large individual difference, and thus cannot always cook as intended.
Some recent cooking utensils have a function of performing temperature control by heating meat with a thermometer probe inserted into the meat. However, this is not preferable because it poses a hygiene risk and causes leakage of meat juices. Not limited to meat, there is a high need for a technique for measuring the temperature inside a food material in a non-destructive manner during cooking, and such a technique is strongly demanded by amateur cooks and by professional cooks who pursue an accurate finish.
Under such a background, many techniques for estimating an internal temperature of a food material during heating have been proposed. For example, Patent Document 1 proposes a technique in which sensors for measuring a shape and a surface temperature of an object are arranged on a top surface and a side surface in a cooker, a three-dimensional model is configured on the basis of the shape of the object, and an internal temperature of the object is estimated by thermal conduction analysis by a boundary element method.
The technique of Patent Document 1 does not assume measurement (estimation) of a temperature on the bottom surface side, which cannot be directly detected by a temperature sensor. Therefore, the technique described in Patent Document 1 is not suitable for cooking in a frying pan and the like.
The present technology has been made in view of such a situation, and an object thereof is to appropriately estimate an internal temperature of a food material being heated.
An information processing apparatus according to one aspect of the present technology, includes: a construction unit that constructs a three-dimensional model representing a shape and a temperature distribution of a cooking object on the basis of sensor data acquired by sensors that measure states of a cooking utensil and the cooking object; and an internal temperature estimation unit that estimates an internal temperature of the cooking object by performing thermal conduction analysis based on the three-dimensional model.
In one aspect of the present technology, a three-dimensional model representing a shape and a temperature distribution of a cooking object is constructed on the basis of sensor data acquired by sensors that measure states of a cooking utensil and the cooking object, and an internal temperature of the cooking object is estimated by performing thermal conduction analysis based on the three-dimensional model.
<Outline of Present Technology>
The present technology supports appropriate control of a heating state of a cooking object in a cooking method in which the cooking object is brought into contact with a heating medium and heated by thermal conduction.
Hereinafter, a mode for carrying out the present technology will be described. Note that the description will be given in the following order.
<<1. Cooking Assistance System>>
The cooking assistance system of the present technology is used, for example, in a scene where a cook U1 cooks steak meat as a cooking object 12 using a frying pan as a cooking utensil 11.
The cooking assistance system includes a heating device 1, a stereo camera 2, a thermographic camera 3, a processor module 4, a network device 5, a server 6, an information terminal 7, and an air conditioner 8.
The heating device 1 includes a cooking stove, an IH cooker, or the like for heating the cooking utensil 11. The heating device 1 is installed in a work area where the cooking object 12 is cooked in the cooking utensil 11.
A camera sensor including the stereo camera 2 and the thermographic camera 3 is installed, for example, above the work area as a position where the work area including the cooking utensil 11 and the cooking object 12 can be seen. In a general kitchen environment, for example, the camera sensor is attached near a ventilation opening installed above the heating device 1.
The stereo camera 2 images the work area and acquires a visible image including depth information. The thermographic camera 3 images the work area and acquires a thermal image. The camera sensor is connected with the processor module 4 installed in a predetermined place such as in the same kitchen environment by a high-speed interface, and transmits and receives data to and from the processor module 4 in real time.
The processor module 4 is connected with the server 6 via the network device 5. The processor module 4 performs information processing in cooperation with the server 6 to estimate an internal temperature of the cooking object 12. Furthermore, the processor module 4 automatically adjusts heating power of the heating device 1, automatically adjusts air conditioning by the air conditioner 8, and presents information to the cook U1 by the information terminal 7 according to a heating state of the cooking object 12.
As indicated by broken lines, data is transmitted and received between the processor module 4 and each of the heating device 1, the network device 5, the information terminal 7, and the air conditioner 8 by, for example, wireless communication.
The server 6 is a server on an intranet or the Internet.
The information terminal 7 includes a smartphone, a tablet terminal, or the like having a display such as a liquid crystal display (LCD). The information terminal 7 placed near the cook U1 or the like detects an operation of the cook U1 and receives an input of information. The information terminal 7 performs information presentation and the like to the cook U1 according to control by the processor module 4.
The air conditioner 8 adjusts air conditioning of the kitchen environment according to control by the processor module 4.
<<2. Configuration of Each Device>>
The cooking assistance system includes a sensor unit 21, an information processing apparatus 22, and an effector unit 23.
Note that the processing unit illustrated in
The sensor unit 21 includes a temperature sensor 31, a distance sensor 32, and an image sensor 33. For example, the temperature sensor 31, the distance sensor 32, and the image sensor 33 are constituted by the camera sensor including the stereo camera 2 and the thermographic camera 3.
The temperature sensor 31 is a sensor that measures a surface temperature distribution of an object. The distance sensor 32 is a sensor that measures a three-dimensional shape of an object. The image sensor 33 is a sensor that captures an image of an object in a visible light region.
Each sensor of the sensor unit 21 is handled separately as a logical function, but is not necessarily configured by three corresponding physical devices. Hereinafter, these three sensors are appropriately collectively referred to as a basic sensor group.
Each sensor of the basic sensor group measures a state of an object in a non-contact and non-destructive manner, and transmits a measurement result as time-series data to the information processing apparatus 22. An internal parameter and an external parameter of each sensor are calculated by so-called camera calibration, and pixels between the sensors can be associated with each other by coordinate transformation. In other words, data measured by the basic sensor group can be expressed by a common three-dimensional coordinate system (world coordinates) in the information processing apparatus 22.
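For illustration, the following is a minimal sketch of this cross-sensor association under a pinhole camera model, assuming an intrinsic matrix K and camera-to-world extrinsics (R, t) for each sensor obtained by the calibration; the function names are hypothetical.

import numpy as np

def pixel_to_world(u, v, depth_m, K, R, t):
    """Back-project pixel (u, v) with metric depth into world coordinates."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray in camera frame
    p_cam = ray * depth_m                           # scale ray by measured depth
    return R @ p_cam + t                            # camera frame -> world frame

def world_to_pixel(p_world, K, R, t):
    """Project a world point into another sensor's image plane."""
    p_cam = R.T @ (np.asarray(p_world) - t)         # world frame -> camera frame
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                         # perspective division

With such transforms, a depth pixel of the stereo camera 2 can, for example, be mapped into the thermal image of the thermographic camera 3 to associate a temperature with each surface point.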
The information processing apparatus 22 includes a calculation unit 41 and a storage unit 42. For example, the information processing apparatus 22 is configured by the processor module 4.
The calculation unit 41 includes, for example, a general-purpose calculator such as a CPU, a GPU, or a DSP, or a dedicated calculator specialized for AI-related processing or the like. The calculation unit 41 estimates a three-dimensional shape, a surface temperature, and a thermal conduction characteristic of the cooking object 12 on the basis of information measured by the sensor unit 21 and known information held in the storage unit 42, and estimates an internal temperature of the cooking object 12 by thermal conduction analysis using a three-dimensional model.
The storage unit 42 includes a storage device such as a memory and a storage. The storage unit 42 holds known information such as a database representing thermal conduction characteristics of the cooking object 12.
Note that the information processing apparatus 22 may be configured by a combination of the local processor module 4 and the server 6 on the network.
The effector unit 23 is a peripheral device controlled by the information processing apparatus 22 to control a cooking state. Note that the effector unit 23 also includes an input device that is not included in the sensor unit 21 and is related to information input by a cook, an information terminal used by a user in a remote place away from the work area, and the like.
The effector unit 23 includes, for example, a UI device 51, the heating device 1, and the air conditioner 8. The UI device 51 includes the information terminal 7 in
As illustrated, the information processing apparatus 22 includes a sensor data input unit 101, a position and shape recognition unit 102, a surface temperature extraction unit 103, a process situation recognition unit 104, a thermal conduction characteristic estimation unit 105, an internal temperature estimation unit 106, and an effector control unit 107.
The sensor data input unit 101 receives sensor data transmitted from the sensor unit 21, and outputs the sensor data to the position and shape recognition unit 102, the surface temperature extraction unit 103, the process situation recognition unit 104, and the thermal conduction characteristic estimation unit 105.
The position and shape recognition unit 102 recognizes a position and a shape of the cooking object 12 on the basis of the sensor data supplied from the sensor data input unit 101. For example, the position and shape recognition unit 102 detects shielding of a field of view due to cooking work of a cook.
Furthermore, the position and shape recognition unit 102 performs detection and contour extraction of the cooking utensil 11 and the cooking object 12. The position and shape recognition unit 102 performs shape recognition of the cooking object 12 and construction of a three-dimensional model thereof. The three-dimensional model of the cooking object 12 is a model representing a shape and a temperature distribution of the cooking object 12. A recognition result by the position and shape recognition unit 102 is supplied to the surface temperature extraction unit 103, the process situation recognition unit 104, the thermal conduction characteristic estimation unit 105, and the internal temperature estimation unit 106.
The surface temperature extraction unit 103 extracts surface temperatures of the cooking object 12 and the heating medium on the basis of the sensor data supplied from sensor data input unit 101 and contour information of the cooking object 12 and the heating medium supplied from the position and shape recognition unit 102. An extraction result of the surface temperatures by the surface temperature extraction unit 103 is supplied to the process situation recognition unit 104, the thermal conduction characteristic estimation unit 105, and the internal temperature estimation unit 106.
The process situation recognition unit 104 recognizes a situation of a cooking process on the basis of the sensor data supplied from the sensor data input unit 101, the position and shape of the cooking object 12 supplied from the position and shape recognition unit 102, and the extraction result of the surface temperatures supplied from the surface temperature extraction unit 103. Specifically, the process situation recognition unit 104 detects input, removal, a shape change, a position/posture change, and the like of the cooking object 12. A recognition result by the process situation recognition unit 104 is supplied to the thermal conduction characteristic estimation unit 105 and the internal temperature estimation unit 106.
The thermal conduction characteristic estimation unit 105 estimates a thermal conduction characteristic of the cooking object 12 on the basis of the sensor data supplied from the sensor data input unit 101 and the position and shape of the cooking object 12 supplied from the position and shape recognition unit 102 according to the situation of the cooking process recognized by the process situation recognition unit 104. Specifically, the thermal conduction characteristic estimation unit 105 estimates a physical property value of the cooking object 12.
Furthermore, the thermal conduction characteristic estimation unit 105 estimates contact thermal resistance between the cooking object 12 and the heating medium on the basis of the contour information of the cooking object 12 supplied from the position and shape recognition unit 102 and the extraction result of the surface temperatures by the surface temperature extraction unit 103. Information indicating the contact thermal resistance estimated by the thermal conduction characteristic estimation unit 105 is supplied to the internal temperature estimation unit 106.
The internal temperature estimation unit 106 estimates a temperature of a portion to be heated of the cooking object 12 on the basis of the information supplied from the thermal conduction characteristic estimation unit 105. Then, according to the situation of the cooking process recognized by the process situation recognition unit 104, the internal temperature estimation unit 106 estimates an internal temperature of the cooking object 12 on the basis of the three-dimensional model of the cooking object 12 in which the temperature of the portion to be heated has been set. An estimation result of the internal temperature by the internal temperature estimation unit 106 is supplied to the effector control unit 107.
The effector control unit 107 controls the effector unit 23 on the basis of the estimation result of the internal temperature by the internal temperature estimation unit 106. The effector control unit 107 controls the heating device 1, controls information presentation to a cook, and the like.
<<3. Operation of Information Processing Apparatus>>
<Overall Processing>
Processing of the information processing apparatus 22 having the above configuration will be described with reference to a flowchart of
The flowchart of
In step S1, the sensor data input unit 101 receives an input of sensor data from each sensor of the sensor unit 21. A thermal image representing a surface temperature is input from the temperature sensor 31, a depth image as depth information is input from the distance sensor 32, and an RGB image as a visible image is input from the image sensor 33.
Each sensor data is time-series data captured in real time, and is input to the sensor data input unit 101 at an arbitrary timing. In order to simplify the description, it is assumed below that all pieces of sensor data are input in synchronization at a constant cycle.
Note that a series of processing described with reference to the flowchart of
In step S2, the position and shape recognition unit 102 performs position and shape recognition processing. In the position and shape recognition processing, cooking work of a cook is detected, and a position, a contour, and a shape of the cooking object 12 are recognized. Furthermore, a three-dimensional model of the cooking object 12 is constructed on the basis of the recognition result of the position and shape of the cooking object 12. Details of the position and shape recognition processing will be described later with reference to a flowchart of
In step S3, the surface temperature extraction unit 103 performs surface temperature extraction processing. In the surface temperature extraction processing, surface temperatures of the cooking object 12 and the heating medium are extracted. Details of the surface temperature extraction processing will be described later with reference to a flowchart of
In step S4, the process situation recognition unit 104 recognizes a situation of a cooking process on the basis of the information obtained before the processing in step S4. Details of recognition of the situation of the cooking process will be described later.
In step S5, the thermal conduction characteristic estimation unit 105 performs thermal conduction characteristic estimation processing. In the thermal conduction characteristic estimation processing, a thermal conduction characteristic of the cooking object 12 is estimated. Details of the thermal conduction characteristic estimation processing will be described later with reference to a flowchart of
In step S6, the internal temperature estimation unit 106 performs internal temperature estimation processing. In the internal temperature estimation processing, a temperature of the portion to be heated of the cooking object 12 is estimated, and an internal temperature is estimated on the basis of an estimation result of the temperature of the portion to be heated. Details of the internal temperature estimation processing will be described later with reference to a flowchart of
In step S7, the effector control unit 107 controls the effector unit 23 on the basis of an estimation result of the internal temperature of the cooking object 12 by the internal temperature estimation unit 106. An example of control of the effector unit 23 will be described later.
After the control of the effector unit 23 is performed in step S7, the process ends. Every time the sensor data is input, the above series of processing is executed.
<Position and Shape Recognition Processing>
Here, the position and shape recognition processing performed in step S2 of
(2-1) Detection of Shielding of Field of View by Operation of Cook
In step S21, the position and shape recognition unit 102 detects shielding of a field of view of the camera sensor due to the cooking work of the cook.
When the cook is operating the cooking object 12, human fingers, tongs, and the like may enter the field of view of the camera sensor and shield the object whose position and shape are to be recognized. The object includes the cooking object 12 such as steak meat and the cooking utensil 11 such as a frying pan. If subsequent processing is performed in such a state, there is a case where accuracy of the processing is reduced.
The position and shape recognition unit 102 detects that a hand of the cook is inserted into the field of view (within an imaging range) of the camera sensor by image recognition for an RGB image and a depth image, and recognizes a position of the hand. In a case where an important object is shielded by the cooking work of the cook, subsequent processing related to recognition of the object is skipped. By skipping the processing in and after step S22, it is possible to prevent a decrease in the accuracy of the processing.
(2-2) Detection and Contour Extraction of Cooking Utensil 11 and Cooking Object 12
In step S22, the position and shape recognition unit 102 detects, by image processing, whether the cooking utensil 11 and the cooking object 12 are present in the field of view. In a case where the cooking utensil 11 and the cooking object 12 are present in the field of view, the position and shape recognition unit 102 extracts contours of the cooking utensil 11 and the cooking object 12 on an image of the basic sensor group. Extracting the contour means identifying the contour of the object.
Various methods are conceivable as a method of contour extraction using an image processing technology based on sensor data acquired by the basic sensor group. Even in a case where any method is used, if a contour can be specified on an image acquired by a certain sensor, the contour can be specified on an image acquired by another sensor by coordinate conversion between the sensors.
The contour extraction of the cooking utensil 11 is performed by, for example, the following methods.
(A) Object detection of the cooking utensil 11 and contour extraction of the cooking utensil 11 are performed using a database (inference model) generated by machine learning. When an RGB image is used as input information, information indicating a contour of the cooking utensil 11 is obtained as output information from the database.
(B) In a case where known information such as a three-dimensional model representing a three-dimensional shape of the cooking utensil 11 is prepared in the position and shape recognition unit 102, a contour of the cooking utensil 11 is extracted using the known information. For example, presence/absence and a position and posture of the cooking utensil 11 are detected by registration processing of the three-dimensional model with respect to a scene point cloud generated on the basis of a depth image. The contour of the cooking utensil 11 on an RGB image is extracted on the basis of the detected position and posture and the three-dimensional model.
(C) In a case where a three-dimensional shape of the cooking utensil 11 and a marker such as a characteristic pattern embedded in a specific region of the cooking utensil 11 are prepared as known information in the position and shape recognition unit 102, presence/absence and a position and posture of the marker on an RGB image are detected using the known information. On the basis of the position and posture of the marker, a position and posture of the cooking utensil 11 are identified, and a contour thereof on the RGB image is extracted.
After detecting the presence/absence of the cooking utensil 11 and extracting the contour thereof by the above-described method, the position and shape recognition unit 102 detects presence/absence of the cooking object 12 and extracts a contour thereof by setting the inside of the contour of the cooking utensil 11 as a region of interest.
The cooking object 12 that can be handled by the methods proposed by the present technology is assumed to be a solid object whose overall shape and whose shape of the portion to be heated can be specified. For example, solid objects such as lump meat, fish (whole or fillet), pancake, and omelet are treated as the cooking object 12. A plurality of cooking objects 12 may be put on the cooking utensil 11, or cooking objects 12 of different types may be put thereon together.
Detection of the presence/absence of the cooking object 12 and extraction of the contour thereof are performed by, for example, the following methods.
(A) Object detection of the cooking object 12 and contour extraction of the cooking object 12 are performed using a database generated by machine learning. When an RGB image is used as input information, information indicating a contour of the cooking object 12 is obtained as output information from the database.
(B) In a case where the three-dimensional shape of the cooking utensil 11 is known and the position and posture of the cooking utensil 11 are detected by the method (B) or (C) described above for the cooking utensil 11, a depth image showing only the cooking utensil 11 is generated. By performing background difference processing on a depth image actually captured by the stereo camera 2, with the depth image showing only the cooking utensil 11 as a background, a depth image of the cooking object 12 is extracted as a foreground. In this way, a contour of the cooking object 12 on the depth image is extracted.
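As an illustration of this background difference processing, the following is a minimal sketch, assuming metric depth images viewed from the overhead stereo camera 2 (the cooking object lies on the utensil and is therefore closer to the camera than the utensil surface behind it); the names and the threshold value are hypothetical.

import numpy as np

def extract_object_mask(depth_scene, depth_pan_only, threshold_m=0.005):
    """Foreground extraction of the cooking object by depth background difference.

    depth_pan_only: synthetic depth image showing only the cooking utensil 11,
    rendered from its known three-dimensional model, position, and posture.
    """
    valid = (depth_scene > 0) & (depth_pan_only > 0)           # ignore depth holes
    return valid & (depth_pan_only - depth_scene > threshold_m)  # closer = foreground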
(2-3) Shape Recognition and Three-Dimensional Model Construction of Cooking Object 12
Returning to the description of
In a case where the cooking object 12 is put in the cooking utensil 11, the position and shape recognition unit 102 constructs a three-dimensional model on the basis of shape information representing a shape of the cooking object 12.
Furthermore, in a case where any of the shape, volume, and a position and posture of the cooking object 12 is changed in a cooking process, the position and shape recognition unit 102 reconstructs the three-dimensional model.
Since the contour of the cooking object 12 on the depth image is extracted by the processing of step S22, it is possible to create point cloud data of the cooking object 12. The point cloud data represents a three-dimensional shape of the cooking object 12 with respect to an exposed surface where the distance sensor 32 can detect a distance. Note that a three-dimensional shape and a position and posture of the cooking utensil 11 (in particular, a heating surface in contact with the cooking object 12) serving as a heating medium are recognized. The three-dimensional shape and the position and posture of the cooking utensil 11 serve as a reference of the three-dimensional model of the cooking object 12.
As illustrated in A of
As indicated by a broken line in B of
As illustrated in C of
As indicated by densely painted voxels in A of
As indicated by thinly painted voxels in B of
The position and shape recognition unit 102 sets a set of voxels determined as the components as a shape structure of the three-dimensional model of the cooking object 12, thereby constructing the three-dimensional model of the cooking object 12.
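As one possible illustration of such a voxel construction, the following is a minimal sketch that fills each occupied column of voxels from the heating surface up to the measured exposed surface, assuming the heating surface lies at a known height z_pan in world coordinates and the hidden bottom side rests flat on it; all names are hypothetical.

import numpy as np

def build_voxel_model(surface_points, voxel=0.005, z_pan=0.0):
    """Construct a voxel set approximating the cooking object 12.

    surface_points: (N, 3) world-coordinate point cloud of the exposed surface.
    """
    cols = {}
    for x, y, z in surface_points:
        key = (int(np.floor(x / voxel)), int(np.floor(y / voxel)))
        cols[key] = max(cols.get(key, z_pan), z)          # highest point per column
    voxels = set()
    for (i, j), z_top in cols.items():
        n = max(int(np.ceil((z_top - z_pan) / voxel)), 1)
        voxels.update((i, j, k) for k in range(n))        # fill column upward from pan
    return voxels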
The construction procedure of the three-dimensional model as described above is an example. On the basis of the shape information of the cooking object 12 measured by the distance sensor 32, the three-dimensional model of the cooking object 12 is constructed so as to have an expression form suitable for thermal conduction analysis. Accuracy of the contour extraction performed in step S22 and the construction of the three-dimensional model performed in step S23 is determined on the basis of estimation accuracy of an internal temperature required as a result of the thermal conduction analysis.
After the three-dimensional model of the cooking object 12 is constructed in step S23, the processing returns to step S2 in
<Surface Temperature Extraction Processing>
The surface temperature extraction processing performed in step S3 of
(3-1) Extraction of Surface Temperature of Cooking Object 12
In step S31, the surface temperature extraction unit 103 extracts a surface temperature of the cooking object 12 on the basis of the thermal image acquired by the temperature sensor 31, and maps the surface temperature on the three-dimensional model of the cooking object 12 constructed by the position and shape recognition unit 102. Extracting the temperature means detecting the temperature.
On a surface of the cooking object 12, a position where temperature can be extracted on the basis of the thermal image is referred to as a “temperature extraction point”. The temperature extraction point is represented in three-dimensional coordinates.
Furthermore, a position where temperature is defined on the three-dimensional model is referred to as a “temperature definition point”. The temperature definition point is set according to a method of constructing the three-dimensional model. For example, the temperature definition point is set at a vertex or a center point of each voxel.
The surface temperature extraction unit 103 determines temperature at the temperature definition point on the basis of a temperature value at the temperature extraction point in the vicinity of the temperature definition point. For example, the surface temperature extraction unit 103 sets a region within a certain distance from the temperature definition point as a neighboring region, and determines a temperature at the temperature extraction point closest to the temperature definition point as the temperature at the temperature definition point.
The temperature at the temperature definition point may also be determined using general sampling processing, such as applying filter processing to the thermal image in a case where there is a lot of noise at the temperature extraction points, or linearly interpolating the temperature at the temperature definition point in a case where the temperature gradient is large in the vicinity of the temperature definition point.
For example, a temperature definition point of a voxel indicated by hatching in
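A minimal sketch of this nearest-neighbor mapping follows, assuming the temperature extraction points and temperature definition points are given in the common world coordinate system; the radius value is a hypothetical example.

import numpy as np
from scipy.spatial import cKDTree

def map_surface_temperatures(def_points, ext_points, ext_temps, radius=0.01):
    """Assign each temperature definition point the temperature of the nearest
    temperature extraction point, if one lies within the neighboring region."""
    tree = cKDTree(ext_points)            # spatial index of extraction points
    dist, idx = tree.query(def_points)    # nearest extraction point per definition point
    return np.where(dist <= radius, ext_temps[idx], np.nan)  # NaN: temperature undefined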
(3-2) Extraction of Surface Temperature of Heating Medium
In step S32, the surface temperature extraction unit 103 extracts a surface temperature of the heating medium on the basis of the thermal image acquired by the temperature sensor 31.
By the position and shape recognition processing in step S2 of
The surface temperature extraction unit 103 extracts the surface temperature of the heating medium on the basis of a temperature in a region of the heating medium as indicated by hatching in A of the figure.
As shown in the thermal image in A of the figure, the surface temperature of the heating medium is not necessarily uniform over its entire region.
Therefore, as shown in B of the figure, a region of the heating medium in the vicinity of the contour of the cooking object 12 is set as a neighboring region.
The surface temperature extraction unit 103 extracts the average temperature over the neighboring region within the region of the heating medium as the surface temperature Theat of the heating medium.
After the surface temperature Theat of the heating medium is calculated in step S32, the process returns to step S3 in
<Situation Recognition of Cooking Process>
Details of recognition of the situation of the cooking process performed in step S4 of
Various recognition methods are conceivable as a situation recognition method using an image recognition technology based on sensor data acquired by the basic sensor group. An example will be described below.
According to the processing in steps S2 and S3, the number of cooking objects 12 put in the cooking utensil 11, and their positions, contours, shapes, and surface temperatures are recognized. Furthermore, a timing at which cooking work is performed by a cook is also recognized as auxiliary information. The auxiliary information is information for assisting the recognition of the situation of the cooking process. The situations to be recognized include (A) input of the cooking object 12, (B) removal of the cooking object 12, (C) a position and posture change of the cooking object 12, and (D) a shape change of the cooking object 12.
Occurrence of the input or removal of the cooking object 12 ((A) or (B) described above) is recognized in a case where the number of cooking objects 12 is changed before and after the cooking work by the cook is performed.
In a case where a weight sensor is provided as a sensor constituting the sensor unit 21, occurrence of the input or removal of the cooking object 12 may be recognized in response to detection of a discontinuous weight change by the weight sensor. In a case where there is a plurality of cooking objects 12, it is necessary to identify each cooking object 12 as the same individual over time and to appropriately maintain its association with the three-dimensional model.
Occurrence of the position and posture change of the cooking object 12 ((C) described above) is recognized on the basis of a change in position, contour, shape, surface temperature, surface image, and the like of the cooking object 12 regardless of presence or absence of the number change. In particular, it is recognized that a portion (portion to be heated) of the cooking object 12 in contact with the heating medium has greatly changed, such as turning over of meat or fish being cooked.
In an example in which steak meat is turned over, there may be no significant change in the position, contour, shape, and the like of the steak meat. Thus, while referring to these pieces of information, it is preferable to determine the change in position and posture of the steak meat on the basis of the change in surface temperature of the cooking object 12, as described later.
The shape of the cooking object 12 may change in the cooking process. For example, pancake or hamburger steak serving as the cooking object 12 expands by heating. In a case where the posture or shape of the cooking object 12 changes in the cooking process and a deviation from the three-dimensional model increases, it is necessary to reconstruct the three-dimensional model. Reconstruction of the three-dimensional model will be described later.
<Thermal Conduction Characteristic Estimation Processing>
The thermal conduction characteristic estimation processing performed in step S5 of
Generation of Thermal Conduction Characteristic
In step S41, the thermal conduction characteristic estimation unit 105 determines whether or not an input of the cooking object 12 has been detected on the basis of the recognition result of the process situation by the process situation recognition unit 104.
In a case where it is determined in step S41 that the input of the cooking object 12 has been detected, in step S42, the thermal conduction characteristic estimation unit 105 estimates a thermal conduction characteristic of the cooking object 12 input to the cooking utensil 11. The thermal conduction characteristic is a parameter required for thermal conduction analysis and includes, for example, thermal conductivity, specific heat, density, and a thermal diffusion coefficient of an object. The thermal diffusion coefficient is calculated on the basis of the thermal conductivity, specific heat, and density of the object.
Specifically, the thermal conduction characteristic estimation unit 105 specifies a food material characteristic indicating a type, a portion, quality, and the like of the food material as the cooking object 12, and obtains a thermal conduction characteristic corresponding to the food material characteristic by using various known measurement data. The food material characteristic is specified, for example, by methods described below.
(A) A cook selects recipe data using a UI function of the effector unit 23. For example, when cooking work is performed according to navigation by an application installed in the information terminal 7, the cook selects recipe data of a dish to be made.
The thermal conduction characteristic estimation unit 105 specifies a food material characteristic by directly acquiring food material characteristic information included in the recipe data selected by the cook or by acquiring food material characteristic information from a database.
(B) As auxiliary means of the method (A) described above, a cook directly inputs a food material characteristic using the UI function. For example, in a case where there is a difference between the food material characteristic of the food material presented by the recipe data and a food material characteristic of a food material actually used for cooking, the cook inputs information of the food material characteristic that is not held by the recipe data. A type of food material may be set using a button of a main body of the cooking utensil 11 such as a microwave oven.
(C) The thermal conduction characteristic estimation unit 105 specifies a food material characteristic of the cooking object 12 by image recognition based on information acquired by the sensor unit 21, such as an RGB image in which the cooking object 12 appears. For a food material characteristic in which an individual difference of a food material appears, such as the fat content of meat, image recognition of an image in which the actual cooking object 12 appears is effective.
The thermal conduction characteristic estimation unit 105 can specify volume of the input cooking object 12 on the basis of the three-dimensional model of the cooking object 12. In a case where the sensor unit 21 includes a weight sensor and can individually measure weight of the cooking object 12, the thermal conduction characteristic estimation unit 105 specifies density on the basis of the volume and weight of the cooking object 12. In a case where the food material characteristic is specified by the method (C) described above, it is possible to narrow more probable candidates for the food material characteristic by setting the density as known information.
The thermal conduction characteristic of the cooking object 12 is estimated on the basis of the food material characteristic specified as described above.
Update of Thermal Conduction Characteristic
On the other hand, in a case where it is determined in step S41 that the input of the cooking object 12 has not been detected, the process proceeds to step S43.
In step S43, the thermal conduction characteristic estimation unit 105 determines whether or not a shape change of the cooking object 12 has been detected on the basis of the recognition result of the process situation by the process situation recognition unit 104.
In a case where it is determined in step S43 that the shape change of the cooking object 12 has been detected, in step S44, the thermal conduction characteristic estimation unit 105 updates the thermal conduction characteristic of the cooking object 12.
The thermal conduction characteristic of the cooking object 12 generally changes in a heating process. Therefore, it is desirable that the thermal conduction characteristic is updated as needed not only at the time of input but also after the input. The thermal conduction characteristic estimation unit 105 repeatedly updates the thermal conduction characteristic in the cooking process such as the heating process.
There is also a food material whose density greatly changes in the heating process, such as a pancake. Such a change in density affects a thermal conduction characteristic. In a case where volume of the cooking object 12 greatly changes in the heating process, the thermal conduction characteristic estimation unit 105 updates an estimated value of density on the basis of a state of the cooking object 12.
Moisture of meat or fish is lost by heating. Since specific heat of moisture is high, a moisture content of the cooking object 12 has a great influence on the thermal conduction characteristic. Therefore, it is also useful to detect a change in moisture content.
For example, as described in Patent Document 4, in a case where it is assumed that a weight change of the cooking object 12 is due to moisture evaporation, the thermal conduction characteristic estimation unit 105 can estimate a moisture content on the basis of the weight change of the cooking object 12.
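A minimal sketch of such an estimate follows, attributing the whole weight change to moisture evaporation; the initial moisture ratio parameter is a hypothetical example value and not taken from the source.

def estimate_moisture_ratio(initial_weight, current_weight, initial_ratio=0.7):
    """Estimate the current moisture content ratio of the cooking object,
    assuming all weight loss is due to moisture evaporation.
    initial_ratio: fraction of the initial weight that is water (assumed)."""
    water = max(initial_weight * initial_ratio - (initial_weight - current_weight), 0.0)
    return water / current_weight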
Furthermore, as another method, the moisture content may be detected using a database constructed by machine learning so as to input the RGB image acquired by the image sensor 33 and output the food material characteristic and the moisture content of the cooking object 12. In this case, it is possible to expect estimation accuracy equivalent to that of a skilled chef who visually judges a state of the cooking object 12.
Not limited to the RGB image, the database may be constructed by machine learning using a thermal image obtained by imaging the cooking object 12 (surface temperature of the cooking object 12), an internal temperature estimated in subsequent processing, and the like.
In a case where the sensor unit 21 is provided with a near-infrared spectrometer, a change in moisture content may be obtained on the basis of an infrared radiation spectrum of the cooking object 12 measured by the near-infrared spectrometer. In this manner, the moisture content may be directly measured.
Estimation of Contact Thermal Resistance
On the other hand, in a case where it is determined in step S43 that the shape change of the cooking object 12 has not been detected, the processing proceeds to step S45.
In step S45, the thermal conduction characteristic estimation unit 105 determines whether or not a posture change of the cooking object 12 has been detected on the basis of the recognition result of the process situation by the process situation recognition unit 104.
In a case where it is determined in step S45 that the posture change of the cooking object 12 has been detected, in step S46, the thermal conduction characteristic estimation unit 105 estimates contact thermal resistance between the cooking object 12 and the heating medium.
As illustrated in an upper stage of
On the other hand, as shown in a lower stage of
The temperature of the cooking object 12 and the temperature of the heating medium in the vicinity of the cooking object 12 are extracted by the processing in steps S2 and S3 in
For example, in a case where the conditions defined by the following formulas (1) and (2) are satisfied, the process situation recognition unit 104 determines that the cooking object 12 has been turned over.
[Math. 1]
Tafter − Tbefore > Tflip (1)

[Math. 2]
Theat − Tafter < Tgap (2)
Tbefore represents a surface temperature of the cooking object 12 before a posture change, and Tafter represents a surface temperature of the cooking object 12 after the posture change. Theat represents a temperature of the heating medium in the vicinity of the cooking object 12 after the posture change.
Furthermore, Tflip represents a threshold of a temperature difference at which it is determined that the cooking object has been flipped over, and Tgap represents a threshold of a temperature difference of a contact surface at which it is determined that the cooking object 12 has been in contact with the heating medium for a sufficient time.
Satisfying the condition defined by the formula (1) means that a change in surface temperature of the cooking object 12 is larger than the threshold before and after the posture change, that is, the surface of the cooking object 12 exposed by turning over is sufficiently heated.
Furthermore, satisfying the condition defined by the formula (2) means that a difference between the temperature of the heating medium and the surface temperature of the cooking object 12 after the posture change is smaller than the threshold, that is, the surface of the cooking object 12 exposed by turning over is sufficiently heated to a temperature close to the temperature of the heating medium.
When the conditions defined by the formulas (1) and (2) are satisfied and it is determined as being immediately after the portion to be heated of the cooking object 12, which has been sufficiently heated, is exposed to the surface, the surface temperature Tafter can be regarded as a temperature equal to the temperature of the portion to be heated in contact with the heating medium having the temperature Theat.
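A minimal sketch of this turning-over judgement based on formulas (1) and (2) follows; the threshold values (in degrees C) are hypothetical examples.

def detect_turn_over(t_before, t_after, t_heat, t_flip=30.0, t_gap=10.0):
    """Judge turning-over of the cooking object 12 by formulas (1) and (2).

    t_flip and t_gap correspond to the thresholds Tflip and Tgap; the
    default numeric values here are hypothetical examples.
    """
    exposed_surface_is_hot = (t_after - t_before) > t_flip   # formula (1)
    close_to_heating_medium = (t_heat - t_after) < t_gap     # formula (2)
    return exposed_surface_is_hot and close_to_heating_medium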
As illustrated in the thermal image in the lower stage of
Contact thermal resistance Rcontact on the contact surface between the heating medium and the cooking object 12 is defined by the following formula (3).
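From the definitions given below, formula (3) is presumably the standard definition of contact thermal resistance: the temperature drop across the contact surface divided by the heat flow rate.

[Math. 3]
Rcontact = (Theat − Tafter) / Q (3)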
Q represents a heat flow rate transferred from the heating medium to the cooking object 12. In a case of steady thermal conduction, the heat flow rate Q is expressed by the following formula (4).
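Given the definitions that follow, formula (4) is presumably Fourier's law for one-dimensional steady conduction across the contact surface.

[Math. 4]
Q = −k·A·∂T/∂z (4)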
A represents an area of the contact surface on which thermal conduction occurs, and k represents thermal conductivity of the cooking object 12. z represents a direction in which heat is transferred. Here, z represents a vertically upward direction from the contact surface. T is a temperature of the cooking object 12 and is represented by a function of z.
The area A is obtained on the basis of the contour of the cooking object 12 extracted by the position and shape recognition processing in step S2 of
Therefore, when ∂T/∂z is determined, the heat flow rate Q is estimated by the formula (4). In a situation where the temperature of the heating medium is stable and heating has continued for a sufficient time, the temperature gradient inside the cooking object 12 can be expected to be relatively monotonic from the contact surface toward the center portion (along the z direction).
In a case where a linear temperature gradient occurs, ∂T/∂z can be approximated to a value expressed by the following formula (5).
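From the definitions of L and Tcenter that follow, formula (5) can plausibly be reconstructed as the linear approximation of the gradient between the contact surface and the center portion.

[Math. 5]
∂T/∂z ≈ (Tcenter − Tafter) / (L/2) (5)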
L represents thickness of the cooking object 12 in the z direction, and is obtained on the basis of the three-dimensional model constructed by the processing of step S2. Tcenter represents a temperature of the center portion of the cooking object 12, which is located above the contact surface by a distance L/2.
Although the exact value of the temperature Tcenter of the center portion is unknown, in a case where the formula (1) holds, Tcenter and Tbefore can be approximated as the same value. Assuming a state in which the surface exposed before the posture change has not yet been directly heated and its temperature is close to normal temperature (the temperature before the start of heating), the temperature of the center portion is considered to remain at a similar temperature.
Therefore, the contact thermal resistance Rcontact is approximately obtained by the following formula (6).
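Substituting formulas (4) and (5) into formula (3) with Tcenter ≈ Tbefore, formula (6) can plausibly be reconstructed as follows.

[Math. 6]
Rcontact ≈ (Theat − Tafter)·L / (2kA·(Tafter − Tbefore)) (6)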
The contact thermal resistance Rcontact thus obtained is used for estimation of internal temperature. After the contact thermal resistance is estimated in step S46, the processing returns to step S5 in
Similarly, after the thermal conduction characteristic is obtained in step S42, after the thermal conduction characteristic is updated in step S44, or in a case where it is determined in step S45 that the posture change of the cooking object 12 has not been detected, the processing returns to step S5 in
<Internal Temperature Estimation Processing>
The internal temperature estimation processing performed in step S6 of
(6-1) Estimation of Temperature of Portion to be Heated
In step S61, the internal temperature estimation unit 106 estimates a temperature of the portion to be heated in contact with the heating medium, and maps the temperature on the three-dimensional model of the cooking object 12. For example, the temperature of the portion to be heated is mapped to a temperature definition point of a voxel indicated with dots in
In the formula (6), the surface temperature Tafter is replaced with a temperature Tbottom of the portion to be heated (bottom surface), and the surface temperature Tbefore is replaced with a temperature Ttop of the exposed surface (surface), whereby the following formula (7) representing the temperature Tbottom of the portion to be heated is obtained.
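Solving the resulting relation for Tbottom, formulas (7) and (8) can plausibly be reconstructed as follows.

[Math. 7]
Tbottom = (Theat + r·Ttop) / (1 + r) (7)

[Math. 8]
r = 2kA·Rcontact / L (8)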
As shown in the formula (8), r is a dimensionless constant. The contact thermal resistance Rcontact is caused by roughness, hardness, pressing pressure, and the like of the contact surface. It is considered that the contact thermal resistance Rcontact is not rapidly changed in a state in which oil and the like serving as the heating medium is uniformly interposed on the contact surface and the portion to be heated of the cooking object 12 is heated to some extent.
Therefore, by treating the contact thermal resistance Rcontact calculated once as a constant, the temperature Tbottom of the portion to be heated of the cooking object 12 can be estimated on the basis of the temperature Theat of the heating medium, the temperature Ttop of the exposed surface of the cooking object 12, and the constant r obtained by the known parameter.
In a case where the contact thermal resistance Rcontact is unknown, the internal temperature estimation unit 106 regards the contact thermal resistance Rcontact=0 and maps the temperature Theat of the heating medium as the temperature Tbottom of the portion to be heated of the cooking object 12.
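A minimal sketch combining formulas (7) and (8) with the fallback for unknown contact thermal resistance follows; the function name and parameter names are hypothetical.

def estimate_bottom_temperature(t_heat, t_top, k, area, thickness, r_contact=None):
    """Estimate the temperature Tbottom of the portion to be heated.

    Implements formula (7) with the dimensionless constant r of formula (8),
    r = 2*k*area*r_contact/thickness. When the contact thermal resistance is
    unknown, Rcontact = 0 is assumed and Theat is used as-is.
    """
    if not r_contact:                      # unknown (None) or zero resistance
        return t_heat
    r = 2.0 * k * area * r_contact / thickness
    return (t_heat + r * t_top) / (1.0 + r)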
(6-2) Estimation of Internal Temperature
In step S62, the internal temperature estimation unit 106 estimates an internal temperature of the cooking object 12. By the processing up to the previous stage, the temperatures of the surface portion and the portion to be heated of the cooking object 12 are mapped on the three-dimensional model.
Here, the inside of the cooking object 12 is a portion corresponding to a region of the three-dimensional model to which temperature is not mapped. The internal temperature is estimated by a different method for each of the following conditions.
(A) In an initial state in which the cooking object 12 is put into the cooking utensil 11 and heating is started, it is assumed that the internal temperature of the cooking object 12 is uniform. That is, the internal temperature is considered to be a temperature close to the surface temperature measured by the temperature sensor 31.
In this case, the internal temperature estimation unit 106 obtains an average value of the surface temperature of the cooking object 12, and maps the average value as the internal temperature in a voxel corresponding to the inside of the cooking object 12. This corresponds to an initial condition of thermal conduction analysis.
As described above, when the three-dimensional model of the cooking object 12 is first constructed, a representative value of the surface temperature of the cooking object 12 is estimated as the internal temperature. The representative value of the surface temperature of the cooking object 12 includes a value obtained on the basis of the surface temperature of the cooking object 12, such as an average value or a median value of the surface temperature of the cooking object 12.
(B) In a case where the change in position and posture or shape of the cooking object 12 is recognized in step S4, the three-dimensional model is reconstructed.
As illustrated in an upper stage of
As illustrated in a lower stage of
In a case where the posture and shape of the cooking object 12 change, creation of voxels of the cooking object 12 and extraction of a temperature at a temperature definition point in a surface portion of the three-dimensional model indicated with dots are performed in a manner similar to the processing in steps S2 and S3.
The bottom surface temperature in a case where the posture and the shape of the cooking object 12 change is specified by the processing of step S61 and mapped to a voxel indicated with dots.
For other voxels excluding the surface portion and the portion to be heated among the voxels constituting the three-dimensional model, ideally, it is desirable to reproduce a temperature distribution estimated before the reconstruction in order to continue the thermal conduction analysis.
However, it is generally difficult to specify how the posture and shape of the cooking object 12 have changed before and after cooking work by a cook. Therefore, it is not easy to associate the temperature distribution of the three-dimensional model in voxel units and map the temperature distribution from the three-dimensional model before the reconstruction to the three-dimensional model after the reconstruction.
Therefore, the temperature at the temperature definition point of the three-dimensional model corresponding to the inside of the cooking object 12 is mapped by the following method.
First, on the basis of the temperature distribution before the reconstruction, the internal temperature estimation unit 106 obtains internal energy Uall of the cooking object 12 (total amount of heat held by the cooking object 12) on the basis of the following formula (9).
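From the symbol definitions that follow, formula (9) presumably sums the heat held at every temperature definition point; the per-point mass m (the mass assigned to one voxel) is an assumption introduced here for a uniform grid.

[Math. 9]
Uall = Σi m·c·(Ti − T0) (9)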
c represents specific heat, and Ti represents temperature. T0 represents reference temperature. A subscript i represents a temperature definition point. The sum of the internal energy at all the temperature definition points of the three-dimensional model before the reconstruction is obtained by the formula (9).
Note that the specific heat c is generally not uniform over an entire food material and has temperature dependency, but is treated as a constant here. In a case where an accurate value of the specific heat c including dependency on a portion of the food material and temperature is obtained in step S5, calculation may be performed using the accurate value of the specific heat c instead of being treated as the constant.
Next, as shown in the following formula (10), the internal temperature estimation unit 106 obtains internal energy Ubound of a portion where temperature can be specified in the reconstructed three-dimensional model.
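Under the same assumption of a per-point mass m, formula (10) is presumably the corresponding sum over the temperature definition points whose temperatures are specified.

[Math. 10]
Ubound = Σj m·c·(Tj − T0) (10)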
A subscript j represents a temperature definition point. The sum of internal energy at temperature definition points in a region corresponding to the surface portion and the portion to be heated from which the temperature is extracted in the reconstructed three-dimensional model is obtained by the formula (10).
The internal temperature estimation unit 106 determines a temperature Tbulk by the following formula (11), where Nbulk is the total number of temperature definition points whose temperatures are not specified in the reconstructed three-dimensional model.
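Distributing the remaining internal energy Uall − Ubound uniformly over the Nbulk unspecified points, formula (11) can plausibly be reconstructed as follows.

[Math. 11]
Tbulk = T0 + (Uall − Ubound) / (Nbulk·m·c) (11)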
The internal temperature estimation unit 106 maps the temperature Tbulk as a temperature value at a temperature definition point whose temperature is not specified in the reconstructed three-dimensional model.
As described above, when the change in position and posture or shape of the cooking object 12 has been recognized, the three-dimensional model is reconstructed so that the sum of the internal energy is conserved before and after the reconstruction, and the temperature Tbulk is estimated as the internal temperature on the basis of the reconstructed three-dimensional model. Therefore, it is possible to substantially maintain the estimation accuracy of the internal temperature before and after the reconstruction of the three-dimensional model.
Using the three-dimensional model configured by the method (A) or (B) described above, the internal temperature estimation unit 106 performs thermal conduction analysis by numerical analysis such as a finite element method. A thermal conduction model, which is a mathematical model of thermal conduction, is expressed by a three-dimensional non-steady thermal conduction equation as in the following formula (12).
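Given the symbol definitions that follow, formula (12) is presumably the standard three-dimensional non-steady thermal conduction (diffusion) equation.

[Math. 12]
∂T/∂t = κ·(∂²T/∂x² + ∂²T/∂y² + ∂²T/∂z²) (12)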
κ represents a thermal diffusion coefficient, and t represents time. T(x, y, z, t) represents a temperature of the cooking object 12 expressed as a function of time and space. Hereinafter, the arguments x, y, and z representing spatial coordinates are omitted. The thermal conduction model is illustrated in
As an initial condition of the non-steady thermal conduction equation, a temperature distribution T(0) at time t=0 is given. Time t=0 is time when the cooking object 12 is put in the cooking utensil 11. In the method (A) described above, a temperature distribution mapped on the three-dimensional model corresponds to the temperature distribution T(0) under the initial condition. Note that the reconstruction of the three-dimensional model performed in the method (B) described above is performed at timing of time t>0, and substantially means resetting of the initial condition.
As a boundary condition at time t > 0, a temperature distribution Ttop(t) of the surface temperature measured by the temperature sensor 31 and a temperature distribution Tbottom(t) of the bottom surface temperature estimated in step S61 are given.
The temperature inside the cooking object 12, which is not bound as a boundary condition, is obtained by numerical calculation based on a governing equation obtained by discretizing the formula (12).
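As an illustration, the following is a minimal sketch of one explicit finite-difference update of formula (12) on a voxel grid, assuming grid spacing dx, a time step dt satisfying the stability condition, and boundary voxels (surface and portion to be heated) held at measured or estimated values; all names are hypothetical.

import numpy as np

def heat_step(T, kappa, dx, dt, boundary_mask, boundary_values):
    """One explicit finite-difference update of formula (12) on a voxel grid.

    T: 3D temperature array whose outer layer (surface and portion to be
    heated) is fixed via boundary_mask/boundary_values. Stability of this
    explicit scheme requires dt <= dx**2 / (6 * kappa).
    """
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0)
           + np.roll(T, 1, 1) + np.roll(T, -1, 1)
           + np.roll(T, 1, 2) + np.roll(T, -1, 2) - 6.0 * T) / dx**2
    T_new = T + dt * kappa * lap                           # discretized dT/dt = k*lap(T)
    T_new[boundary_mask] = boundary_values[boundary_mask]  # re-impose boundary condition
    return T_new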
Incidentally, in order to appropriately control a finish of the cooking object 12, it is important to predict a change in the internal temperature of the cooking object 12 in a residual heat process after heating is stopped. In particular, in a case where volume of the cooking object 12 is large, a center temperature of the cooking object 12 greatly changes in the residual heat process.
The internal temperature estimation unit 106 predicts a temperature change of the cooking object 12 in the residual heat process by applying the thermal conduction model described above. For example, in a case where heating is stopped at time t = tstop, the internal temperature estimation unit 106 predicts the time change of the temperature distribution Ttop(t) of the surface temperature and of the temperature distribution Tbottom(t) of the bottom surface temperature after time t = tstop, uses the predicted time change as a boundary condition, and executes similar numerical analysis ahead of real time.
First, a method of predicting the temperature distribution Ttop(t) is considered. As described above, since the temperature of the exposed surface of the cooking object 12 is always measured by the temperature sensor 31, the temperature distribution Ttop(t) is predicted on the basis of time-series data of measurement values until heating is stopped.
As shown in
Next, a method of predicting the temperature distribution Tbottom(t) will be considered. The temperature distribution Tbottom(t) is predicted on the basis of a prediction result of a temperature change of the heating medium after the heating is stopped.
As shown in
Therefore, in order to accurately predict the temperature change of the heating medium, a thermal conduction characteristic of the cooking utensil 11 to be actually used is required. The thermal conduction characteristic of the cooking utensil 11 is determined by physical property values related to thermal conduction, such as the material, specific heat, volume, and surface area of the cooking utensil 11. However, a method of estimating the thermal conduction characteristic of each cooking utensil 11 to be actually used and predicting the temperature change of the heating medium on that basis is not realistic, because it lacks generality.
In order to predict the temperature change of the heating medium, for example, it is an effective method to perform calibration for the cooking utensil 11 to be actually used as an initial setting.
For example, the cook places only the cooking utensil 11 to be actually used on a stove, heats it to a sufficiently high temperature, then turns off the heat and leaves the utensil to cool naturally. By measuring the temperature of the cooking utensil 11 during this cooling process with the temperature sensor 31, a curve representing the temperature change in the cooling process can be obtained.
The internal temperature estimation unit 106 holds, as a characteristic value of the cooking utensil 11, the slope of the temperature decrease of the cooking utensil 11, which is determined according to the difference between the temperature of the cooking utensil 11 and the room temperature. After the calibration, the internal temperature estimation unit 106 can predict the temperature change of the cooking utensil 11 on the basis of the measured temperature of the cooking utensil 11 and the measured room temperature at the time when the heating is stopped in the cooking process.
From the predicted temperature change of the cooking utensil 11, the internal temperature estimation unit 106 can then predict the time change of the temperature distribution Tbottom(t) using formula (8) described above.
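A cooling-rate slope that depends on the difference between the utensil temperature and the room temperature is consistent with Newton's law of cooling, dT/dt = -k(T - Troom). Below is a minimal sketch, under that assumption, of fitting the constant k from a calibration cooling curve and predicting the utensil temperature after heating stops; the mapping from utensil temperature to Tbottom(t) via formula (8) is not reproduced here.

```python
import numpy as np

def fit_cooling_constant(t, T_utensil, T_room):
    """Fit Newton's law of cooling dT/dt = -k (T - T_room) to a calibration curve.

    Integrated form: T(t) = T_room + (T(0) - T_room) * exp(-k t), so k is the
    negative slope of log(T - T_room) against t.
    """
    slope, _ = np.polyfit(t, np.log(np.asarray(T_utensil) - T_room), 1)
    return -slope

def predict_utensil_temperature(T_stop, T_room, k, t):
    """Predict the utensil temperature t seconds after heating is stopped."""
    return T_room + (T_stop - T_room) * np.exp(-k * t)

# Calibration: temperatures of the empty utensil measured while cooling (illustrative).
t_cal = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
T_cal = np.array([220.0, 168.2, 129.8, 101.3, 80.2])
k = fit_cooling_constant(t_cal, T_cal, T_room=20.0)

# Cooking: heating stopped with the utensil at 180 deg C in a 22 deg C kitchen.
print(f"utensil temperature 90 s after stop: "
      f"{predict_utensil_temperature(180.0, 22.0, k, 90.0):.1f} deg C")
```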
After the internal temperature of the cooking object 12 is estimated in step S62, the process returns to step S6 of the flowchart described above, and the subsequent processing is performed.
<Effector Control>
Details of the control of the effector unit 23 performed in step S7 of the flowchart described above will be described.
(A) Present Information about Heating Situation
The effector control unit 107 communicates with the information terminal 7 to present information about the heating situation to the cook.
For example, the effector control unit 107 displays the surface temperature and the internal temperature of the cooking object 12 in real time from a free viewpoint on the basis of the three-dimensional model. The surface temperature and the internal temperature are displayed using CG, in a manner similar to the on-screen display of a thermal conduction analysis result by a computer aided engineering (CAE) tool installed in a PC.
There is a strong need to visualize not only the temperature distribution at a certain time but also the heat flow (temperature gradient in a time direction). The heat flow may be made visible in the same manner as a vector field is visualized with a CAE tool. By presenting, in addition to the current temperature distribution, the speed at which heat passes through the food material, the cook can predict how the cooking object 12 will change. The cooking assistance system thus assists the cook in performing appropriate heating control for obtaining an ideal finish.
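As one hedged illustration of such a heat-flow visualization, the vector field can be derived from a temperature field by Fourier's law, q = -k∇T, and overlaid on the temperature distribution. The synthetic field, grid, conductivity value, and use of numpy/matplotlib are assumptions of this sketch, not components named by the present technology.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative 2D temperature field for a cross section of the cooking object:
# hot at the bottom (pan side), cooler at the exposed top surface.
z, x = np.meshgrid(np.linspace(0.0, 0.02, 30), np.linspace(0.0, 0.10, 50),
                   indexing="ij")
T = 160.0 - 4000.0 * z + 10.0 * np.sin(40.0 * x)    # synthetic field [deg C]

# Fourier's law: heat flux q = -k * grad(T). k is an assumed thermal conductivity.
k = 0.45                                            # W/(m K), illustrative for meat
dT_dz, dT_dx = np.gradient(T, z[:, 0], x[0, :])
qz, qx = -k * dT_dz, -k * dT_dx

fig, ax = plt.subplots()
ax.contourf(x, z, T, levels=20)                                   # temperature map
ax.quiver(x[::3, ::5], z[::3, ::5], qx[::3, ::5], qz[::3, ::5])   # heat-flow vectors
ax.set_xlabel("x [m]")
ax.set_ylabel("z [m]")
plt.show()
```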
Although the automatic adjustment function described in (B) below is expected to evolve in the future, there are still many cases where human skill is superior. Cooking in a work area may therefore proceed while a skilled chef at a remote location views the presentation on the information terminal 7 and provides judgement and guidance. The information terminal 7 constituting the cooking assistance system is connected to the information processing apparatus 22 via an intranet or the Internet.
(B) Automatic Adjustment of Heating Power
The effector control unit 107 communicates with the heating device 1 and automatically adjusts the heating power on the basis of the estimation result of the internal temperature of the cooking object 12.
Furthermore, as described later, a function of automatically stopping the heating at a timing at which the center temperature of the cooking object 12 can be expected to reach a target temperature through a simulation of the residual heat process is also useful.
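A minimal sketch of such an automatic-stop decision follows, reusing the simulate_residual_heat function sketched earlier: at each control tick, the peak center temperature predicted under the assumption that heating stops now is compared against the target. The names, threshold, and effector call are illustrative assumptions.

```python
def should_stop_heating(T_profile, target_center_temp, kappa, dz, dt,
                        predict_top, predict_bottom, horizon_steps=3000):
    """Return True if stopping heating now would carry the center temperature
    to the target through residual heat alone (peak of the simulated curve)."""
    center = simulate_residual_heat(T_profile, kappa, dz, dt, horizon_steps,
                                    predict_top, predict_bottom)
    return center.max() >= target_center_temp

# Inside the real-time control loop (illustrative):
# if should_stop_heating(T, 57.0, kappa, dz, dt, predict_top, predict_bottom):
#     heating_device.stop()   # hypothetical effector call, not an API named here
```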
(C) Air Conditioning Control
Environmental conditions such as the temperature and humidity of the cooking environment can greatly affect the finish of a dish. In cooking meat or fish, an operation of bringing the temperature of the food material back to normal temperature is performed as preliminary preparation. Unevenness in the temperature of the food material at the start of heating directly leads to unevenness in the degree of cooking, and it is therefore important to keep the normal temperature (the temperature of the room) constant.
Even in the residual heat process, the change in the center temperature varies depending on the temperature of the place where the meat is left. Air conditioning control by a dedicated device specialized for the cooking environment is considered effective for accurately controlling the heating conditions and achieving reproducibility of a dish.
As feedback control based on the temperature information measured by the thermographic camera 3, the effector control unit 107 controls the operation settings of the air conditioner 8 so that the periphery of the cooking object 12 is kept under appropriate conditions.
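The control law is not specified here; below is a minimal sketch assuming simple hysteresis (on/off) feedback on the ambient temperature around the cooking object. The AirConditioner-style effector interface in the usage comment is hypothetical.

```python
def air_conditioning_feedback(ambient_temp, setpoint, deadband=0.5):
    """Hysteresis (on/off) feedback: returns 'cool', 'heat', or 'hold'.

    ambient_temp : temperature near the cooking object measured by the
                   thermographic camera or an auxiliary thermometer [deg C]
    setpoint     : target ambient temperature [deg C]
    deadband     : half-width of the no-action band [deg C]
    """
    if ambient_temp > setpoint + deadband:
        return "cool"
    if ambient_temp < setpoint - deadband:
        return "heat"
    return "hold"

# Illustrative control tick (air_conditioner is a hypothetical effector object):
# action = air_conditioning_feedback(ambient_temp=26.3, setpoint=22.0)
# air_conditioner.set_mode(action)
```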
Data measured by an auxiliary sensor outside the basic sensor group, including a sensor built into another device such as the heating device 1, may also be used. For example, a thermometer or a hygrometer installed at a position from which the vicinity of the cooking object 12 can be measured is provided as the auxiliary sensor.
A fragrance sensor may also be provided as the auxiliary sensor. In the case of making a dish whose fragrance is important, removing excess environmental odors by air conditioning is essential for controlling the finish.
As described above, the effector control unit 107 controls at least one of the heating device 1, the information terminal 7, or the air conditioner 8, for example.
(D) Storage of Recognition Data
Sensor data measured by the sensor unit 21 and information estimated by the information processing apparatus 22 may be stored in the storage unit 42 for post-analysis. That is, applications of the estimation results by the information processing apparatus 22 are not limited to heating control in real time. The physical configuration of the storage unit 42 is arbitrary; for example, the storage unit 42 may be included in the processor module 4 or provided in the server 6.
This storage is particularly effective in a case where the sensor unit 21 includes a temperature sensor capable of accurately measuring the internal temperature of the cooking object 12. For example, a cooking thermometer using a thermocouple probe is provided as a component of the cooking utensil 11. The information processing apparatus 22 acquires the data measured by the cooking thermometer and stores it, together with the other sensor data, in association with the information of the estimation results. Referring to these pieces of information as ground-truth information for the estimation of the internal temperature is useful for developing technical methods that improve the estimation accuracy of the internal temperature.
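As a hedged illustration of such associated storage, a per-time-step record might bundle the estimate with its ground truth; the field names and the JSON-lines format are assumptions of this sketch only.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EstimationRecord:
    timestamp: float               # measurement time [s since epoch]
    estimated_center_temp: float   # estimate by the information processing apparatus
    probe_temp: float              # ground truth from the thermocouple thermometer
    surface_temp: float            # surface temperature from the thermal image
    bottom_temp_estimate: float    # estimated bottom surface temperature

def append_record(path, record):
    """Append one record as a JSON line for later post-analysis."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

append_record("estimation_log.jsonl",
              EstimationRecord(time.time(), 54.2, 53.8, 78.5, 152.0))
```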
Furthermore, storing the RGB image acquired by the image sensor 33 together with the estimation result of the thermal conduction characteristic of the cooking object 12 by the thermal conduction characteristic estimation unit 105 is useful both for verifying the validity of the estimation result and for technological development aimed at improving accuracy.
With the above processing, in the cooking assistance system, the temperature of the portion to be heated of the cooking object 12, which is important for thermal conduction analysis, is accurately obtained on the basis of the thermal image acquired by the temperature sensor 31 without destroying the cooking object 12.
By using the accurately obtained temperature of the portion to be heated for estimation of the internal temperature, the cooking assistance system can estimate the internal temperature with high accuracy.
Furthermore, in the cooking assistance system, exposure of the portion to be heated of the cooking object 12 is recognized on the basis of the sensor data acquired by the sensor unit 21, and the contact thermal resistance between the portion to be heated and the heating medium is estimated on the basis of the thermal image acquired immediately after the exposure.
By using the contact thermal resistance for the estimation of the temperature of the portion to be heated, the cooking assistance system can improve estimation accuracy of the temperature of the portion to be heated and can estimate the internal temperature with high accuracy.
In the cooking assistance system, the three-dimensional model of the cooking object 12 is constructed on the basis of the sensor data acquired by the distance sensor 32, and temperature information to be the boundary condition and the initial condition of the thermal conduction analysis is mapped to the three-dimensional model.
By performing the thermal conduction analysis on the basis of the three-dimensional model reflecting the actual shape of the cooking object 12, the cooking assistance system can estimate the internal temperature with high accuracy. Furthermore, the cooking assistance system can also simulate a temperature change after heating is stopped using the same three-dimensional model.
<<4. Others>>
The series of processing described above can be executed by hardware or software. In a case where the series of processing is executed by the software, a program constituting the software is installed from a program recording medium to a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
A central processing unit (CPU) 201, a read only memory (ROM) 202, and a random access memory (RAM) 203 are mutually connected by a bus 204.
Moreover, an input/output interface 205 is connected to the bus 204. An input unit 206 including a keyboard, a mouse, and the like, and an output unit 207 including a display, a speaker, and the like are connected to the input/output interface 205.
Furthermore, a storage unit 208 including a hard disk, a nonvolatile memory, or the like, a communication unit 209 including a network interface or the like, and a drive 210 that drives a removable medium 211 are connected to the input/output interface 205.
In the computer configured as described above, for example, the CPU 201 loads a program stored in the storage unit 208 into the RAM 203 via the input/output interface 205 and the bus 204 and executes the program, whereby the above-described series of processing is performed.
The program executed by the CPU 201 is provided, for example, by being recorded in the removable medium 211 or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and is installed in the storage unit 208.
Note that the program executed by the computer may be a program in which the processing is performed in time series in the order described in the present specification, or may be a program in which the processing is performed in parallel or at a necessary timing, such as when a call is made.
Note that, in the present specification, the system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device housing a plurality of modules in one housing are both systems.
The effects described in the present specification are merely examples and are not limited, and there may be other effects.
An embodiment of the present technology is not limited to the above-described embodiment, and various modifications can be made without departing from the scope of the present technology.
For example, the present technology can be configured as cloud computing in which one function is shared and jointly processed by a plurality of devices via a network.
Furthermore, each step described in the above-described flowcharts can be executed by one device or shared and executed by a plurality of devices.
Moreover, in a case where one step includes a plurality of processing, the plurality of processing included in the one step can be executed by one device or shared and executed by a plurality of devices.
The present technology can have the following configurations.
(1)
An information processing apparatus including:
(2)
The information processing apparatus according to (1), in which
(3)
The information processing apparatus according to (1) or (2), in which
(4)
The information processing apparatus according to any one of (1) to (3), further including:
(5)
The information processing apparatus according to (4), in which
(6)
The information processing apparatus according to (4) or (5), in which
(7)
The information processing apparatus according to any one of (4) to (6), further including:
(8)
The information processing apparatus according to (7), in which
(9)
The information processing apparatus according to (7) or (8), in which
(10)
The information processing apparatus according to (7) or (8), in which
(11)
The information processing apparatus according to (10), in which
(12)
The information processing apparatus according to (10) or (11), further including:
(13)
The information processing apparatus according to any one of (1) to (12), in which
(14)
The information processing apparatus according to any one of (1) to (13), in which
(15)
The information processing apparatus according to any one of (5) to (12), in which
(16)
The information processing apparatus according to any one of (1) to (15), further including:
(17)
The information processing apparatus according to (16), in which
(18)
An information processing method including:
(19)
A program for causing a computer to execute processing including:
| Number | Date | Country | Kind |
|---|---|---|---|
| 2020-161384 | Sep 2020 | JP | national |

| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/JP2021/033273 | 9/10/2021 | WO | |