The present disclosure relates to an occupant temperature estimating device, an occupant state detection device, an occupant temperature estimating method, and an occupant temperature estimating system.
Conventionally, in a vehicle, there is a known technique of estimating a temperature of a body part of an occupant in a vehicle interior and performing various controls, such as air conditioning, using the estimated temperature. Note that the temperature of the body part of the occupant here is a surface temperature of the body part of the occupant.
As a technique of estimating the temperature of the body part of the occupant in the vehicle interior, there is a technique of performing image processing on a temperature image such as an infrared image acquired from a sensor that detects a temperature in the vehicle interior, and estimating the temperature of the body part of the occupant on the basis of an infrared intensity (for example, Patent Literature 1).
In the vehicle interior, there is a heat source other than the body part of the occupant. Therefore, when the temperature of the body part of the occupant is estimated simply on the basis of the temperature in the temperature image, there is a problem that a temperature due to a heat source other than the body part of the occupant may be erroneously estimated as the temperature of the body part of the occupant. In particular, in a temperature image having a low resolution, erroneous estimation as described above is likely to occur.
The present disclosure has been made in order to solve the above problem, and an object of the present disclosure is to provide an occupant temperature estimating device capable of enhancing estimation accuracy of a temperature of a body part of an occupant based on a temperature image as compared with a conventional temperature estimating technique based on the temperature image.
An occupant temperature estimating device according to the present disclosure includes: a temperature image acquiring unit that acquires a temperature image which is obtained by imaging a vehicle interior and whose pixels each have temperature information; a binarization processing unit that sets at least one temperature candidate region in a target region from which a temperature of a body part of an occupant present in the vehicle interior is to be estimated, by binarizing pixels in the target region on the basis of the temperature information possessed by the pixels in the target region, the target region being included in a region of the temperature image acquired by the temperature image acquiring unit; a candidate region temperature calculating unit that calculates a region temperature for the temperature candidate region in the target region on the basis of the temperature information possessed by a pixel in the temperature candidate region; and a temperature estimating unit that determines a temperature region from among the at least one temperature candidate region on the basis of a separation degree indicating how much the temperature information possessed by the pixel in the temperature candidate region stands out from the temperature information possessed by a pixel in a region other than the temperature candidate region in the target region, the separation degree being calculated for the temperature candidate region in the target region, and estimates that the temperature of the body part of the occupant is the region temperature for the temperature region.
According to the present disclosure, estimation accuracy of a temperature of a body part of an occupant based on a temperature image can be enhanced as compared with a conventional temperature estimating technique based on the temperature image.
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to the drawings.
The occupant temperature estimating system 100 estimates a temperature of a body part of an occupant present in a vehicle interior on the basis of a temperature image obtained by imaging the vehicle interior.
In the first embodiment, the temperature of the body part of the occupant is a surface temperature of the body part of the occupant. The body part of the occupant is specifically a hand or a face of the occupant.
The occupant is assumed to be a driver in the following first embodiment, as an example.
The occupant temperature estimating system 100 includes a sensor 1 and an occupant temperature estimating device 2.
The sensor 1 is, for example, an infrared array sensor.
The sensor 1 is mounted on a vehicle, and acquires a temperature image by imaging a vehicle interior. The sensor 1 is disposed at a position where the sensor 1 can image a region including a hand and a face of an occupant in the vehicle interior. That is, an imaging region of the sensor 1 includes the region including the face and the hand of the occupant.
In the first embodiment, the temperature image imaged by the sensor 1 may have a medium resolution. Specifically, the number of pixels of the temperature image only needs to be, for example, about 100×100 pixels or less. Therefore, a relatively inexpensive sensor such as a thermopile can be used as the sensor 1. The sensor 1 may be shared with a so-called “Driver Monitoring System (DMS)”.
Each pixel of the temperature image imaged by the sensor 1 has temperature information. The temperature information is represented by a numerical value.
The occupant temperature estimating device 2 estimates a temperature of the hand or the face of the occupant on the basis of the temperature image imaged by the sensor 1.
In the following first embodiment, as an example, it is assumed that the occupant temperature estimating device 2 estimates the temperatures of the hand and the face of the occupant on the basis of the temperature image. Note that this is merely an example, and the occupant temperature estimating device 2 may estimate the temperature of at least one of the hand and the face of the occupant. The temperature of the hand of the occupant estimated by the occupant temperature estimating device 2 may be the temperature of one hand of the occupant or the temperatures of both hands of the occupant.
In the temperature image imaged by the sensor 1, for example, a driver 201 gripping a steering wheel 202 and a window 203 of the vehicle are imaged.
The occupant temperature estimating device 2 estimates the temperatures of the hand and the face of the occupant on the basis of such a temperature image.
In the first embodiment, regions to be subjected to estimation of the temperatures of the hand and the face of the occupant in a region of the temperature image are set in advance. In the following first embodiment, a region to be subjected to estimation of the temperature of the hand of the occupant on the temperature image is referred to as "hand target region". A region to be subjected to estimation of the temperature of the face of the occupant on the temperature image is referred to as "face target region".
The hand target region is set in advance depending on the disposition position and the angle of view of the sensor 1. The hand target region is, for example, a region of the temperature image imaged by the sensor 1 that is set by assuming where a hand of a driver of a standard physique will be imaged when the driver sits at a standard position and drives.
The face target region is set in advance depending on the disposition position and the angle of view of the sensor 1. The face target region is, for example, a region of the temperature image imaged by the sensor 1 that is set by assuming where a face of a driver of a standard physique will be imaged when the driver sits at a standard position and drives. In the first embodiment, the hand target region and the face target region are also collectively referred to simply as "target region". Note that the target region has a form of a temperature image, and each pixel in the target region has temperature information.
In the temperature image, a hand target region 204 and a face target region 205 are set.
In the temperature image, a region where the hand of the driver 201 is imaged and a region where the face of the driver 201 is imaged have high temperatures.
Therefore, by extracting a high temperature region from each of the hand target region 204 and the face target region 205, the occupant temperature estimating device 2 can estimate that the temperature of the hand or the face of the driver 201 is the temperature of the extracted high temperature region.
However, there is a heat source other than the hand or the face of the driver 201 in the vehicle interior. In the temperature image, a high temperature region due to the heat source other than the hand or the face of the driver 201 is generated.
For example, when the driver 201 changes the position where the steering wheel 202 is gripped, the portion of the steering wheel 202 which has been gripped by the driver 201 until immediately before the change may retain heat. In this case, that portion may appear in the hand target region 204 as a high temperature region (a second heat source 204b) that is not the hand of the driver 201.
For example, when the window 203 is heated by sunlight or the like and is hot, the window 203 may appear in the face target region 205 as a high temperature region (a fourth heat source 205b) that is not the face of the driver 201.
As described above, presence of a heat source serving as noise, such as the second heat source 204b or the fourth heat source 205b, in the hand target region and the face target region may generate a high temperature region.
In this case, if the occupant temperature estimating device 2 simply extracts the temperature of the high temperature region in the target region and estimates that the temperature of the hand or the face of the occupant is the extracted temperature, the occupant temperature estimating device 2 may erroneously estimate the temperature of a portion that is not the hand of the occupant as the temperature of the hand of the occupant, or the temperature of a portion that is not the face of the occupant as the temperature of the face of the occupant.
In the conventional technique as described above, since it is not considered that a high temperature region due to a heat source other than the body part of the occupant may be generated on the temperature image, the temperature of the body part of the occupant may be erroneously estimated.
In particular, when the resolution of the temperature image is not high, for example, medium or less, it is difficult to distinguish the body part of the occupant from other parts on the temperature image. Therefore, when the temperature of the high temperature region in the target region is simply extracted from the temperature image and thereby the temperature of the body part of the occupant is estimated, there is a high possibility that the temperature of the body part of the occupant may be erroneously estimated.
Meanwhile, by estimating the temperatures of the hand and the face of the occupant from the temperature image in consideration of a possibility of generation of a high temperature region due to a heat source other than the hand and the face of the occupant on the temperature image, the occupant temperature estimating device 2 according to the first embodiment prevents erroneous estimation of the temperatures of the hand and the face of the occupant, and more accurately estimates the temperatures of the hand and the face of the occupant from the temperature image.
The occupant temperature estimating device 2 will be described in detail.
The occupant temperature estimating device 2 includes a temperature image acquiring unit 21, an estimation processing unit 22, a reliability estimating unit 23, and an estimation result determining unit 24.
The estimation processing unit 22 includes a binarization processing unit 221, a candidate region temperature calculating unit 222, and a temperature estimating unit 223.
The binarization processing unit 221 includes a target region extracting unit 2211.
The temperature image acquiring unit 21 acquires a temperature image from the sensor 1. As described above, the temperature image is a temperature image obtained by imaging a vehicle interior by the sensor 1, and is a temperature image in which each pixel has temperature information.
The temperature image acquiring unit 21 outputs the acquired temperature image to the estimation processing unit 22.
The estimation processing unit 22 estimates the temperatures of the hand and the face of the occupant on the basis of the temperature image acquired by the temperature image acquiring unit 21.
By binarizing pixels in a target region in the region of the temperature image acquired by the temperature image acquiring unit 21, in other words, pixels in the hand target region and the face target region, on the basis of temperature information possessed by the pixels, the binarization processing unit 221 of the estimation processing unit 22 sets one or more temperature candidate regions in the target region. The temperature candidate region is a region within the target region and is a candidate region whose temperature is estimated to be the temperature of the hand of the occupant or the temperature of the face of the occupant.
In the first embodiment, processing of setting one or more temperature candidate regions in the target region by binarizing pixels on the basis of temperature information possessed by the pixels in the target region, performed by the binarization processing unit 221, is also referred to as “binarization processing”.
The binarization processing unit 221 will be described in detail.
First, the target region extracting unit 2211 of the binarization processing unit 221 extracts a target region from the region of the temperature image acquired by the temperature image acquiring unit 21. Specifically, the target region extracting unit 2211 extracts the hand target region and the face target region out of the region of the temperature image.
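As a non-limiting illustration, the preset target regions can be held as fixed pixel rectangles and sliced out of the temperature image. The following Python sketch assumes this representation; the concrete coordinates and names are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

# Preset target regions as (rows, cols) pixel rectangles, determined in
# advance from the disposition position and the angle of view of the sensor 1.
# The concrete coordinates here are illustrative assumptions.
HAND_TARGET_REGION = (slice(60, 100), slice(10, 60))
FACE_TARGET_REGION = (slice(10, 45), slice(40, 80))

def extract_target_regions(temperature_image: np.ndarray):
    """Extract the hand target region and the face target region from a
    temperature image whose pixels each hold a temperature value."""
    hand_target = temperature_image[HAND_TARGET_REGION]
    face_target = temperature_image[FACE_TARGET_REGION]
    return hand_target, face_target
```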
The binarization processing unit 221 performs binarization processing on the target region extracted by the target region extracting unit 2211.
Details of the binarization processing by the binarization processing unit 221 will be described below.
First, on the basis of the target region extracted by the target region extracting unit 2211, the binarization processing unit 221 creates an image (hereinafter, referred to as “first binary image”) in which pixels in the target region are classified into a high temperature region and a low temperature region.
Specifically, the binarization processing unit 221 performs Otsu's binarization on the target region (first Otsu's binarization). Since Otsu's binarization is a known image processing technique, a detailed description thereof is omitted.
By performing first Otsu's binarization on the target region, the binarization processing unit 221 classifies pixels in the target region into the high temperature region and the low temperature region depending on whether or not a temperature corresponding to a pixel is equal to or higher than a threshold (hereinafter, referred to as “temperature determination threshold”), and thereby creates a first binary image.
The binarization processing unit 221 classifies pixels whose corresponding temperatures are equal to or higher than the temperature determination threshold into the high temperature region, and classifies pixels whose corresponding temperatures are lower than the temperature determination threshold into the low temperature region.
Here, the high temperature region and the low temperature region classified by first Otsu's binarization are referred to as “first high temperature region” and “first low temperature region”, respectively.
A pixel classified into the first high temperature region by performing first Otsu's binarization by the binarization processing unit 221 is referred to as “class 1 (first)”. A pixel classified into the first low temperature region by performing first Otsu's binarization by the binarization processing unit 221 is referred to as “class 0 (first)”.
The binarization processing unit 221 sets a pixel value of class 1 (first) to “1”, and sets a pixel value of class 0 (first) to “0” in the created first binary image.
The binarization processing unit 221 further masks the region of the pixel classified into “class 0 (first)” in first Otsu's binarization in the target region, in other words, the region of the pixel in the first low temperature region, and performs Otsu's binarization (second Otsu's binarization) on the region of the pixel classified into “class 1 (first)”, in other words, the first high temperature region.
By performing second Otsu's binarization on the first high temperature region in the target region, the binarization processing unit 221 creates an image (hereinafter, referred to as “second binary image”) in which pixels in the first high temperature region are classified into a high temperature region and a low temperature region depending on whether or not a temperature corresponding to a pixel is equal to or higher than a temperature determination threshold. Note that the temperature determination threshold in first Otsu's binarization and the temperature determination threshold in second Otsu's binarization are different from each other.
Specifically, the binarization processing unit 221 classifies pixels whose corresponding temperatures are equal to or higher than the temperature determination threshold into a second high temperature region, and classifies pixels whose corresponding temperatures are lower than the temperature determination threshold into a second low temperature region. Here, the binarization processing unit 221 also classifies the masked pixel in the first low temperature region among pixels in the target region into the low temperature region.
The high temperature region and the low temperature region classified by second Otsu's binarization are referred to as “second high temperature region” and “second low temperature region”, respectively.
A pixel classified into the second high temperature region by performing second Otsu's binarization by the binarization processing unit 221 is referred to as “class 1 (second)”. A pixel classified into the second low temperature region by performing second Otsu's binarization by the binarization processing unit 221 is referred to as “class 0 (second)”.
The binarization processing unit 221 sets a pixel value of class 1 (second) to “1”, and sets a pixel value of class 0 (second) to “0” in the created second binary image.
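A minimal sketch of this two-stage binarization in Python/NumPy follows, with a hand-rolled Otsu threshold; it illustrates the masking of the first low temperature region before the second pass, and is a sketch under these assumptions rather than the exact implementation of the disclosure.

```python
import numpy as np

def otsu_threshold(values: np.ndarray, nbins: int = 64) -> float:
    """Return the threshold that maximizes the inter-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)                        # pixel counts of the low class
    w1 = w0[-1] - w0                            # pixel counts of the high class
    cum = np.cumsum(hist * centers)
    mu0 = cum / np.maximum(w0, 1)               # mean of the low class
    mu1 = (cum[-1] - cum) / np.maximum(w1, 1)   # mean of the high class
    sigma_b2 = w0 * w1 * (mu0 - mu1) ** 2       # inter-class variance (unnormalized)
    return float(centers[np.argmax(sigma_b2)])

def two_stage_binarization(target_region: np.ndarray) -> np.ndarray:
    """First Otsu's binarization on the whole target region, then second
    Otsu's binarization restricted to the first high temperature region
    (the first low temperature region is masked out of the second pass)."""
    t1 = otsu_threshold(target_region.ravel())
    first_high = target_region >= t1                 # class 1 (first)
    t2 = otsu_threshold(target_region[first_high])   # second pass on unmasked pixels
    return first_high & (target_region >= t2)        # class 1 (second)
```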
When the second binary image is created by performing Otsu's binarization on the target region twice as described above, the binarization processing unit 221 groups consecutive pixels of class 1 (second), in other words, adjacent pixels of class 1 (second) in the second binary image, and thereby sets one region. Specifically, when a certain pixel (hereinafter, referred to as “pixel of interest”) among pixels of class 1 (second) is set as a center and there is a pixel of class 1 (second) (hereinafter, referred to as “connected pixel”) in four neighboring pixels that are in vertical and horizontal contact with the pixel of interest, the binarization processing unit 221 groups the pixels by connecting the connected pixel to the pixel of interest. In the first embodiment, a method of connecting a connected pixel of class 1 (second) among four neighboring pixels that are in vertical and horizontal contact with the pixel of interest to the pixel of interest and thereby grouping the pixels is referred to as “four-connection”.
The binarization processing unit 221 sets a region formed by connecting the pixel of interest to the connected pixel by four-connection in the second binary image as a temperature candidate region. One or more temperature candidate regions may be set.
The binarization processing unit 221 assigns a region label to the set temperature candidate region. When there is a plurality of temperature candidate regions, the binarization processing unit 221 assigns different region labels to the respective temperature candidate regions.
In the first embodiment, the second binary image after the binarization processing unit 221 sets the temperature candidate region and assigns the region label to the set temperature candidate region is also referred to as “label image”.
The binarization processing unit 221 assigns, for example, a region label of “0” to the second low temperature region on the label image.
Then, the binarization processing unit 221 sets, as a temperature candidate region in the target region, a temperature candidate region within the target region corresponding to the temperature candidate region set on the label image.
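Grouping consecutive class 1 (second) pixels by four-connection corresponds to 4-connected component labeling. Assuming SciPy is available, scipy.ndimage.label with a cross-shaped structuring element performs this grouping and region-label assignment, as sketched below.

```python
import numpy as np
from scipy import ndimage

# Cross-shaped structuring element: only the four pixels in vertical and
# horizontal contact with the pixel of interest are connected (four-connection).
FOUR_CONNECTION = np.array([[0, 1, 0],
                            [1, 1, 1],
                            [0, 1, 0]])

def label_temperature_candidates(second_binary: np.ndarray):
    """Assign region labels 1..n to four-connected groups of class 1 (second)
    pixels; the second low temperature region keeps the region label 0."""
    label_image, n_regions = ndimage.label(second_binary, structure=FOUR_CONNECTION)
    return label_image, n_regions
```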
A concept of the binarization processing performed by the binarization processing unit 221 will be specifically described with reference to the drawings.
Note that, in the following description, the binarization processing will be described in detail by assuming that, as an example, the binarization processing unit 221 performs the binarization processing on the hand target region. The binarization processing unit 221 also performs the binarization processing on the face target region by a method similar to that performed on the hand target region.
The binarization processing unit 221 also performs Otsu's binarization on the face target region extracted by the target region extracting unit 2211 by a similar method to Otsu's binarization performed on the hand target region.
First, the binarization processing unit 221 performs first Otsu's binarization on the hand target region (see a reference sign 601).
As a result, the binarization processing unit 221 classifies pixels in the hand target region into the first high temperature region and the first low temperature region depending on whether or not a temperature corresponding to a pixel is equal to or higher than the temperature determination threshold, and thereby creates a first binary image (see a reference sign 602).
In the first binary image indicated by a reference sign 602, pixels of class 1 (first) have a pixel value of "1", and pixels of class 0 (first) have a pixel value of "0".
The binarization processing unit 221 further masks the region of the pixel in the first low temperature region in the hand target region, and then performs second Otsu's binarization on the first high temperature region (see a reference sign 603).
As a result, the binarization processing unit 221 creates a second binary image (see a reference sign 604).
In the second binary image indicated by a reference sign 604, pixels of class 1 (second) have a pixel value of "1", and pixels of class 0 (second) have a pixel value of "0".
When performing Otsu's binarization twice on the hand target region and thereby creating the second binary image, the binarization processing unit 221 performs four-connection in the second binary image and thereby sets a temperature candidate region. Then, the binarization processing unit 221 assigns a region label to the set temperature candidate region.
For example, the binarization processing unit 221 assigns region labels such as "1", "2", and "3" to the set temperature candidate regions.
Meanwhile, the binarization processing unit 221 assigns, for example, a region label of “0” to the second low temperature region.
The binarization processing unit 221 sets, as a temperature candidate region in the hand target region, a temperature candidate region within the hand target region corresponding to the temperature candidate region set on the label image.
When performing the binarization processing as described above, the binarization processing unit 221 outputs the hand target region after the temperature candidate region is set (hereinafter, referred to as "candidate region post-setting temperature image", see a reference sign 702) and the label image to the candidate region temperature calculating unit 222 and the temperature estimating unit 223 in the estimation processing unit 22.
The candidate region temperature calculating unit 222 calculates, on the basis of temperature information possessed by pixels in a temperature candidate region in the target region, a region temperature for the temperature candidate region.
Specifically, first, the candidate region temperature calculating unit 222 classifies, on the basis of the candidate region post-setting temperature image and the label image output from the binarization processing unit 221, temperature candidate regions in the candidate region post-setting temperature image into regions of the respective region labels assigned to the temperature candidate regions.
Then, the candidate region temperature calculating unit 222 calculates region temperatures for the respective temperature candidate regions classified in accordance with the region label.
Specifically, for example, the candidate region temperature calculating unit 222 calculates a median value of the temperature information possessed by the pixels in the temperature candidate region, and uses the calculated median value as the region temperature of the temperature candidate region.
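For instance, the per-region median can be computed directly from the label image; a short sketch follows (the array names follow the previous sketches and are assumptions).

```python
import numpy as np

def region_temperatures(target_region: np.ndarray,
                        label_image: np.ndarray,
                        n_regions: int) -> dict:
    """Region temperature of each temperature candidate region: the median of
    the temperature information possessed by the pixels of that region."""
    return {label: float(np.median(target_region[label_image == label]))
            for label in range(1, n_regions + 1)}
```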
A concept of an example of a method in which the candidate region temperature calculating unit 222 calculates, on the basis of temperature information possessed by pixels in a temperature candidate region in the target region, a region temperature for the temperature candidate region will be described with reference to the drawings.
In the following description, as an example, the candidate region temperature calculating unit 222 calculates the region temperature on the basis of the temperature information possessed by pixels in the temperature candidate region in the hand target region, but the candidate region temperature calculating unit 222 also calculates a region temperature for the face target region by a method similar to the method performed for the hand target region.
The candidate region temperature calculating unit 222 classifies the temperature candidate regions in the candidate region post-setting temperature image into a temperature candidate region with the region label "1" (see a reference sign 801), a temperature candidate region with the region label "2", and a temperature candidate region with the region label "3".
The candidate region temperature calculating unit 222 calculates a median value of temperature information possessed by pixels in the temperature candidate region with the region label “1”, a median value of temperature information possessed by pixels in the temperature candidate region with the region label “2”, and a median value of temperature information possessed by pixels in the temperature candidate region with the region label “3”.
The candidate region temperature calculating unit 222 sets the region temperature of the temperature candidate region with the region label "1" to 34.1°C, sets the region temperature of the temperature candidate region with the region label "2" to 33.6°C, and sets the region temperature of the temperature candidate region with the region label "3" to 33.7°C.
The candidate region temperature calculating unit 222 outputs information in which the temperature candidate region and the region temperature are associated with each other (hereinafter, referred to as “region temperature information”) to the temperature estimating unit 223 in the estimation processing unit 22.
The temperature estimating unit 223 calculates a separation degree, determines one temperature region from among temperature candidate regions in the target region set by the binarization processing unit 221 on the basis of the calculated separation degree, and estimates that the temperature of the hand or the face of the occupant is the region temperature for the one temperature region.
In the first embodiment, the “separation degree” is a degree indicating how much temperature information possessed by pixels in the temperature candidate region in the target region stands out from temperature information possessed by pixels in a region other than the temperature candidate region in the target region.
The temperature estimating unit 223 first calculates the separation degree on the basis of the following formula (1).
Separation degree (%) = inter-class variance (σb²) / total variance (σt²) × 100 (1)
In formula (1), class 1 (foreground) refers to a temperature candidate region in the entire region of the candidate region post-setting temperature image (target region), and class 2 (background) refers to a region other than the temperature candidate region in the candidate region post-setting temperature image (target region). The inter-class variance (σb²) is the variance between class 1 and class 2, and the total variance (σt²) is the variance of the entire target region.
The temperature estimating unit 223 calculates the separation degree for each temperature candidate region.
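A sketch of the separation degree of formula (1) for one candidate region follows, treating the candidate pixels as class 1 (foreground) and the remaining target-region pixels as class 2 (background); the function and variable names are illustrative assumptions.

```python
import numpy as np

def separation_degree(target_region: np.ndarray,
                      candidate_mask: np.ndarray) -> float:
    """Separation degree (%) of formula (1): inter-class variance between
    class 1 (foreground: the candidate region) and class 2 (background: the
    rest of the target region), divided by the total variance of the target
    region and expressed as a percentage."""
    foreground = target_region[candidate_mask]
    background = target_region[~candidate_mask]
    mu = target_region.mean()
    sigma_b2 = (background.size * (background.mean() - mu) ** 2
                + foreground.size * (foreground.mean() - mu) ** 2) / target_region.size
    sigma_t2 = target_region.var()
    return 100.0 * sigma_b2 / sigma_t2

# Using label_image, n_regions, and region_temps from the previous sketches,
# the temperature region is the candidate with the largest separation degree,
# and the estimated hand (or face) temperature is its region temperature:
# degrees = {label: separation_degree(target_region, label_image == label)
#            for label in range(1, n_regions + 1)}
# temperature_region = max(degrees, key=degrees.get)
# estimated_temperature = region_temps[temperature_region]
```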
The following description will be given, as an example, by assuming that the temperature estimating unit 223 calculates the separation degree for each temperature candidate region for the hand target region. However, the temperature estimating unit 223 also calculates the separation degree for the face target region by a method similar to the method performed for the hand target region.
First, the temperature estimating unit 223 creates class 1 (foreground) for each temperature candidate region to which a region label is assigned on the basis of the candidate region post-setting temperature image and the label image output from the binarization processing unit 221 (see reference signs 1001, 1002, and 1003).
Specifically, the temperature estimating unit 223 extracts a temperature candidate region in the candidate region post-setting temperature image in accordance with the region label assigned to the temperature candidate region, and thereby creates class 1 (foreground).
The temperature estimating unit 223 creates class 2 (background).
Specifically, the temperature estimating unit 223 extracts a region other than the temperature candidate region on the basis of the candidate region post-setting temperature image output from the binarization processing unit 221, and thereby creates class 2 (background) (see a reference sign 1004).
Then, the temperature estimating unit 223 calculates a separation degree for each temperature candidate region using the above formula (1).
After calculating the separation degree as described above, the temperature estimating unit 223 determines one temperature region from among the temperature candidate regions on the basis of the calculated separation degree.
For example, the temperature estimating unit 223 determines that a temperature candidate region having the largest calculated separation degree is the temperature region.
Then, the temperature estimating unit 223 estimates that the temperature of the hand or the face of the occupant is a region temperature for the determined temperature region.
The temperature estimating unit 223 may determine the region temperature for the determined temperature region from the region temperature information output from the candidate region temperature calculating unit 222.
The temperature estimating unit 223 outputs information regarding the estimated temperatures of the hand and the face of the occupant to the reliability estimating unit 23. The information regarding the estimated temperatures of the hand and the face of the occupant includes, in addition to the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223, information regarding the temperature region in the target region and a calculated separation degree of the temperature region.
The reliability estimating unit 23 estimates a reliability of the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 on the basis of the information regarding the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223.
For example, the reliability estimating unit 23 estimates the reliability using a learned model in machine learning (hereinafter, referred to as “machine learning model”).
The machine learning model 231 is a learned model that receives, as inputs, the region temperature of the temperature region, the separation degree of the temperature region, the area of a circumscribed rectangle circumscribing the temperature region in the target region, position information of the circumscribed rectangle in the target region, the vertical length of the circumscribed rectangle, and the horizontal length of the circumscribed rectangle, and outputs a reliability. The reliability is represented by, for example, a numerical value of 0 to 1.
The machine learning model 231 corresponding to the hand of the occupant and the machine learning model 231 corresponding to the face of the occupant are created in advance. The machine learning model 231 is constituted by, for example, a Bayesian model or a neural network.
In the first embodiment, the position of the circumscribed rectangle is the position of a point at an upper left end of the circumscribed rectangle on the target region when an upper left of the target region is used as an origin. The position of the circumscribed rectangle is represented by the coordinates of the position of the point at the upper left end of the circumscribed rectangle. The position information of the circumscribed rectangle includes the X coordinate and the Y coordinate of the point at the upper left end of the circumscribed rectangle.
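The rectangle-derived inputs can be computed from the binary mask of the temperature region; a sketch follows, using the coordinate convention above (origin at the upper left of the target region). The function name is an illustrative assumption.

```python
import numpy as np

def circumscribed_rectangle(region_mask: np.ndarray):
    """Area, upper-left position (X, Y), vertical length, and horizontal
    length of the rectangle circumscribing the temperature region, with the
    origin at the upper left of the target region."""
    ys, xs = np.nonzero(region_mask)
    x, y = int(xs.min()), int(ys.min())      # point at the upper left end
    width = int(xs.max()) - x + 1            # horizontal length
    height = int(ys.max()) - y + 1           # vertical length
    return width * height, (x, y), height, width
```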
The machine learning model 231 is created in advance by, for example, a learning device (not illustrated).
The learning device acquires, for example, a temperature image imaged when the vehicle is experimentally caused to travel and the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2 mounted on the vehicle. Then, from the temperature image acquired at the time of the experiment, the region temperature, the separation degree, the area of the circumscribed rectangle, the position information of the circumscribed rectangle, the vertical length of the circumscribed rectangle, and the horizontal length of the circumscribed rectangle are calculated. Note that the region temperature is the temperature of the hand or the face of the occupant estimated by the occupant temperature estimating device 2 at the time of the experiment. In addition, the learning device calculates an error between the temperature of the hand of the occupant acquired at the time of the experiment and the actual temperature of the hand of the occupant at the time of the experiment, and an error between the temperature of the face of the occupant acquired at the time of the experiment and the actual temperature of the face of the occupant at the time of the experiment. Note that the actual temperatures of the hand and the face of the occupant are manually input by, for example, an administrator or the like. The learning device uses the calculated error as teacher data.
The learning device causes the machine learning model 231 to perform learning by so-called supervised learning using the acquired region temperature, the acquired separation degree, the acquired area of the circumscribed rectangle, the acquired position information of the circumscribed rectangle, the acquired vertical length of the circumscribed rectangle, the acquired horizontal length of the circumscribed rectangle, and the errors as learning data.
Here, it is assumed that the reliability estimating unit 23 estimates the reliability of the temperature of the hand of the occupant estimated by the temperature estimating unit 223. At this time, the reliability estimating unit 23 inputs, to the machine learning model 231 for estimating the reliability of the temperature of the hand of the occupant, the temperature of the hand of the occupant estimated by the temperature estimating unit 223 (in other words, the region temperature of the temperature region determined by the temperature estimating unit 223), the separation degree of the temperature region calculated by the temperature estimating unit 223, the area of the circumscribed rectangle circumscribing the temperature region in the target region, the position information of the circumscribed rectangle in the target region, and the vertical and horizontal lengths of the circumscribed rectangle. The reliability estimating unit 23 then uses the reliability output from the machine learning model 231 as the reliability of the temperature of the hand of the occupant estimated by the temperature estimating unit 223.
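A sketch of this inference step follows, assuming the machine learning model 231 is exposed as a regressor with a scikit-learn-style predict() method; the function name and feature ordering are illustrative assumptions.

```python
import numpy as np

def estimate_reliability(model, region_temp, separation, rect_area,
                         rect_x, rect_y, rect_height, rect_width):
    """Build the input vector of the machine learning model 231 (the position
    information contributes the X and Y coordinates) and return a reliability
    clipped to the range 0 to 1. `model` is assumed to be any regressor with
    a scikit-learn-style predict() method."""
    features = np.array([[region_temp, separation, rect_area,
                          rect_x, rect_y, rect_height, rect_width]])
    return float(np.clip(model.predict(features)[0], 0.0, 1.0))
```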
In the first embodiment, the reliability estimating unit 23 may estimate the reliability of the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 in accordance with a calculation rule set in advance.
The calculation rule is set in advance. The calculation rule can be set appropriately, but a calculation rule based on the region temperature of the temperature region, the separation degree of the temperature region, the area of the circumscribed rectangle circumscribing the temperature region in the target region, the position information of the circumscribed rectangle in the target region, the vertical length of the circumscribed rectangle, and the horizontal length of the circumscribed rectangle is used.
Specifically, the calculation rule has, for example, the following contents.
<Calculation Rule>
The reliability is a result obtained by integrating an evaluation value of the region temperature of the temperature region (first evaluation value), an evaluation value of the separation degree of the temperature region (second evaluation value), an evaluation value of the area of the circumscribed rectangle circumscribing the temperature region in the target region (third evaluation value), an evaluation value of the position information of the circumscribed rectangle in the target region (fourth evaluation value), an evaluation value of the vertical length of the circumscribed rectangle (fifth evaluation value), and an evaluation value of the horizontal length of the circumscribed rectangle (sixth evaluation value).
The first evaluation value to the sixth evaluation value are calculated, for example, as follows; a code sketch of this calculation rule is given after the list.
First evaluation value: calculated to be “1” when the region temperature of the temperature region is equal to or higher than a threshold (first threshold), and calculated to be “0.5” when the region temperature is lower than the first threshold
Second evaluation value: calculated to be “1” when the separation degree of the temperature region is equal to or higher than a threshold (second threshold), and calculated to be “0.5” when the separation degree is lower than the second threshold
Third evaluation value: calculated to be "1" when the area of the circumscribed rectangle is equal to or larger than a threshold (third threshold), and calculated to be "0.5" when the area of the circumscribed rectangle is smaller than the third threshold
Fourth evaluation value: calculated to be "1" when the position of the circumscribed rectangle is within a predetermined region, and calculated to be "0.5" when the position of the circumscribed rectangle is not within the predetermined region
Fifth evaluation value: calculated to be "1" when the vertical length of the circumscribed rectangle is equal to or larger than a threshold (fifth threshold), and calculated to be "0.5" when the vertical length of the circumscribed rectangle is smaller than the fifth threshold
Sixth evaluation value: calculated to be "1" when the horizontal length of the circumscribed rectangle is equal to or larger than a threshold (sixth threshold), and calculated to be "0.5" when the horizontal length of the circumscribed rectangle is smaller than the sixth threshold
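The following is a minimal sketch of this calculation rule. "Integrating" the six evaluation values is assumed here to mean taking their product, and the threshold container and parameter names are illustrative assumptions.

```python
def rule_based_reliability(region_temp, separation, rect_area, rect_pos,
                           rect_height, rect_width, thresholds, allowed_region):
    """Calculate the first to sixth evaluation values and integrate them.
    'Integrating' is assumed here to mean taking the product, so the result
    lies in (0, 1]. `thresholds` and `allowed_region` are illustrative."""
    def score(condition):
        return 1.0 if condition else 0.5

    x, y = rect_pos
    x0, y0, x1, y1 = allowed_region                    # predetermined region
    evaluations = [
        score(region_temp >= thresholds["first"]),     # first evaluation value
        score(separation >= thresholds["second"]),     # second evaluation value
        score(rect_area >= thresholds["third"]),       # third evaluation value
        score(x0 <= x <= x1 and y0 <= y <= y1),        # fourth evaluation value
        score(rect_height >= thresholds["fifth"]),     # fifth evaluation value
        score(rect_width >= thresholds["sixth"]),      # sixth evaluation value
    ]
    reliability = 1.0
    for value in evaluations:
        reliability *= value
    return reliability
```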
The reliability estimating unit 23 outputs information regarding the estimated reliability to the estimation result determining unit 24. The reliability estimating unit 23 outputs the information regarding the temperatures of the hand and the face estimated by the temperature estimating unit 223 to the estimation result determining unit 24 in association with the estimated reliability.
The estimation result determining unit 24 determines whether or not to adopt the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 by comparing the reliability estimated by the reliability estimating unit 23 with a threshold set in advance (hereinafter, referred to as “reliability determination threshold”).
For example, when the reliability is equal to or higher than the reliability determination threshold, the estimation result determining unit 24 determines that the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 are reliable, and outputs information regarding the temperatures of the hand and the face of the occupant.
Note that the reliability determination thresholds set for the hand of the occupant and the face of the occupant may be different from each other.
For example, when determining that only one of the temperature of the hand of the occupant and the temperature of the face of the occupant is reliable, the estimation result determining unit 24 may output only the information regarding the temperature determined to be reliable. In addition, for example, when determining that at least one of the temperature of the hand of the occupant and the temperature of the face of the occupant is unreliable, the estimation result determining unit 24 may determine that neither the temperature of the hand of the occupant nor the temperature of the face of the occupant is reliable, and the estimation result determining unit 24 may determine not to output information regarding either of the temperatures.
Examples of an output destination of the information regarding the temperatures of the hand and the face of the occupant by the estimation result determining unit 24 include a wakefulness level detecting unit 4 and a sensible temperature detecting unit 5, which will be described later.
For example, the estimation result determining unit 24 may store information regarding the temperature of the hand or the face of the occupant determined to be reliable in a storage unit (not illustrated).
An operation of the occupant temperature estimating device 2 according to the first embodiment will be described.
The temperature image acquiring unit 21 acquires a temperature image from the sensor 1 (step ST1301).
The temperature image acquiring unit 21 outputs the acquired temperature image to the estimation processing unit 22.
The binarization processing unit 221 of the estimation processing unit 22 sets one or more temperature candidate regions in the target region by binarizing pixels in the target region in the region of the temperature image acquired by the temperature image acquiring unit 21 in step ST1301, in other words, in the hand target region and the face target region on the basis of temperature information possessed by the pixels (step ST1302).
The binarization processing unit 221 outputs the candidate region post-setting temperature image of the target region after the temperature candidate region is set and the label image to the candidate region temperature calculating unit 222 and the temperature estimating unit 223 in the estimation processing unit 22.
The candidate region temperature calculating unit 222 calculates, on the basis of temperature information possessed by pixels in a temperature candidate region in the target region, a region temperature for the temperature candidate region (step ST1303).
The candidate region temperature calculating unit 222 outputs region temperature information in which the temperature candidate region and the region temperature are associated with each other to the temperature estimating unit 223 in the estimation processing unit 22.
The temperature estimating unit 223 calculates a separation degree, determines one temperature region from among temperature candidate regions set by the binarization processing unit 221 in step ST1302 on the basis of the calculated separation degree, and estimates that the temperature of the hand or the face of the occupant is a region temperature for the one temperature region (step ST1304).
The temperature estimating unit 223 outputs information regarding the estimated temperatures of the hand and the face of the occupant to the reliability estimating unit 23.
The reliability estimating unit 23 estimates a reliability of the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 on the basis of the information regarding the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 in step ST1304 (step ST1305).
The reliability estimating unit 23 outputs information regarding the estimated reliability to the estimation result determining unit 24. The reliability estimating unit 23 outputs the information regarding the temperatures of the hand and the face estimated by the temperature estimating unit 223 to the estimation result determining unit 24 in association with the estimated reliability.
The estimation result determining unit 24 determines whether or not to adopt the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 by comparing the reliability estimated by the reliability estimating unit 23 in step ST1305 with the reliability determination threshold (step ST1306).
When determining to adopt the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223, the estimation result determining unit 24 outputs the information regarding the temperatures of the hand and the face of the occupant.
As described above, the occupant temperature estimating device 2 according to the first embodiment sets a temperature candidate region in the target region by binarizing pixels in the target region in the region of the acquired temperature image on the basis of temperature information possessed by the pixels, and calculates a region temperature for the temperature candidate region. Then, the occupant temperature estimating device 2 determines a temperature region from among the temperature candidate regions on the basis of the separation degrees calculated for the temperature candidate regions in the target region, and estimates that the temperature of the hand or the face of the occupant is the region temperature for the determined temperature region.
As a result, the occupant temperature estimating device 2 can enhance estimation accuracy of the temperatures of the hand and the face of the occupant based on the temperature image as compared with a conventional temperature estimating technique based on the temperature image.
In addition, the occupant temperature estimating device 2 can accurately estimate the temperatures of the hand and the face of the occupant even from a temperature image having a medium or lower resolution. Therefore, the occupant temperature estimating system 100 can make the sensor 1 used to estimate the temperatures of the hand and the face of the occupant relatively inexpensive. In addition, in the occupant temperature estimating system 100, for example, it is not necessary to newly dispose a highly accurate sensor in order to enhance the estimation accuracy of the temperatures of the hand and the face of the occupant, and for example, it is possible to accurately estimate the temperatures of the hand and the face of the occupant by using an existing sensor having a medium or lower resolution, such as a sensor used in a driver monitoring system.
In addition, the occupant temperature estimating device 2 according to the first embodiment estimates the reliability of the estimated temperatures of the hand and the face of the occupant, and determines whether or not to adopt the estimated temperatures of the hand and the face of the occupant by comparing the estimated reliability with the reliability determination threshold.
Therefore, the occupant temperature estimating device 2 can further enhance the estimation accuracy of the temperatures of the hand and the face of the occupant based on the temperature image, and can prevent an estimation result with low reliability among the estimation results of the temperatures of the hand and the face of the occupant from being used in another device or the like.
In the first embodiment described above, the occupant temperature estimating device 2 can determine whether or not the estimated temperatures of the hand and the face of the occupant are reliable in consideration of the situation of the occupant determined on the basis of a camera image imaged by a camera, which will be described in detail below.
This modification is implemented as an occupant temperature estimating system 100a including an occupant temperature estimating device 2a.
The occupant temperature estimating system 100a includes a camera 3.
The camera 3 is, for example, a visible light camera or an infrared camera, and is mounted on a vehicle. The camera 3 images a vehicle interior and thereby acquires a camera image. The camera 3 is disposed at a position where the camera 3 can image a region of a hand or a face of an occupant in the vehicle interior. That is, an imaging region of the camera 3 includes a region including the face or the hand of the occupant. Note that the imaging regions of the camera 3 and the sensor 1 do not have to be the same as each other. The camera 3 may be shared with a so-called driver monitoring system.
The camera 3 outputs the camera image to the occupant temperature estimating device 2a.
The occupant temperature estimating device 2a is different from the occupant temperature estimating device 2 in that the occupant temperature estimating device 2a includes a camera image acquiring unit 25 and a situation detecting unit 26.
The camera image acquiring unit 25 acquires a camera image obtained by imaging the occupant in the vehicle interior by the camera 3.
The camera image acquiring unit 25 outputs the acquired camera image to the situation detecting unit 26.
The situation detecting unit 26 detects the situation of the occupant on the basis of the camera image acquired by the camera image acquiring unit 25. In the first embodiment, the situation of the occupant detected by the situation detecting unit 26 is a situation that is assumed to hinder detection of the temperatures of the hand and the face of the occupant.
Specifically, for example, the situation detecting unit 26 detects a region of the occupant's hair or a face direction angle of the occupant in the temperature image acquired by the temperature image acquiring unit 21 on the basis of a region of the occupant's hair or a face direction angle of the occupant in the camera image acquired by the camera image acquiring unit 25.
Note that in the occupant temperature estimating device 2a, the temperature image acquiring unit 21 outputs the temperature image to the estimation processing unit 22 and the situation detecting unit 26.
The situation detecting unit 26 may detect the region of the occupant's hair or the face direction angle of the occupant in the camera image using a known image processing technique.
The disposition position and the angle of view of the camera 3 and the disposition position and the angle of view of the sensor 1 are known in advance. Therefore, when detecting the region of the occupant's hair in the camera image or the face direction angle of the occupant with respect to the camera 3, the situation detecting unit 26 can detect the region of the occupant's hair in the temperature image or the face direction angle of the occupant with respect to the sensor 1. Note that, as the face direction angle of the occupant with respect to the sensor 1 is larger, it is further assumed that the face of the occupant is not directed to the sensor 1, in other words, detection of the temperature of the face of the occupant is hindered.
The situation detecting unit 26 outputs information regarding the detected situation of the occupant, specifically, for example, information regarding the region of the occupant's hair in the temperature image or information regarding the face direction angle of the occupant to the estimation result determining unit 24.
When the size of the region of the occupant's hair detected by the situation detecting unit 26 is equal to or larger than a threshold set in advance (hereinafter, referred to as "hair region determination threshold"), or when the face direction angle of the occupant detected by the situation detecting unit 26 is equal to or larger than a threshold set in advance (hereinafter, referred to as "face direction angle determination threshold"), the estimation result determining unit 24 does not adopt the temperature of the face of the occupant estimated by the temperature estimating unit 223.
As the hair region determination threshold, a region size is set above which the temperature of the face cannot be sufficiently detected in the temperature image. In addition, as the face direction angle determination threshold, a face direction angle is set above which the temperature of the face cannot be sufficiently detected in the temperature image.
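The adoption decision reduces to two threshold comparisons; a minimal sketch follows (the function and parameter names are illustrative assumptions).

```python
def adopt_face_temperature(hair_region_size, face_direction_angle,
                           hair_region_threshold, face_angle_threshold):
    """Return False (do not adopt the estimated face temperature) when the
    detected hair region is equal to or larger than the hair region
    determination threshold, or the face direction angle is equal to or
    larger than the face direction angle determination threshold."""
    if hair_region_size >= hair_region_threshold:
        return False
    if abs(face_direction_angle) >= face_angle_threshold:
        return False
    return True
```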
For example, in a camera image 1501a, the face of the occupant is imaged without being hidden by the occupant's hair. Meanwhile, in a camera image 1501b, most of the face of the occupant is hidden by the occupant's hair.
As described above, for example, when most of the face of the occupant is hidden by the occupant's hair, the temperature of the face portion of the occupant is not detected on the temperature image. Therefore, there is a possibility that the occupant temperature estimating device 2 cannot determine the face region of the occupant as a high temperature region on the basis of the temperature image. That is, the occupant temperature estimating device 2 may erroneously estimate the temperature of the face of the occupant.
Therefore, the occupant temperature estimating device 2a includes the situation detecting unit 26, and the estimation result determining unit 24 does not adopt the temperature of the face of the occupant estimated by the temperature estimating unit 223, for example, when the size of the region of the occupant's hair detected by the situation detecting unit 26 is equal to or larger than the hair region determination threshold.
As a result, the occupant temperature estimating device 2a can prevent erroneous estimation of the temperature of the face of the occupant.
Since specific operations in steps ST1601 to ST1605 are similar to the specific operations in steps ST1301 to ST1305 described above, duplicate description thereof is omitted.
The camera image acquiring unit 25 acquires a camera image obtained by imaging an occupant in the vehicle interior by the camera 3 (step ST1606).
The camera image acquiring unit 25 outputs the acquired camera image to the situation detecting unit 26.
The situation detecting unit 26 detects the situation of the occupant on the basis of the camera image acquired by the camera image acquiring unit 25 in step ST1606 (step ST1607).
The situation detecting unit 26 outputs information regarding the detected situation of the occupant, specifically, for example, information regarding the region of the occupant's hair in the temperature image or information regarding the face direction angle of the occupant to the estimation result determining unit 24.
When the size of the region of the occupant's hair detected by the situation detecting unit 26 in step ST1607 is equal to or larger than the hair region determination threshold which is set in advance, or when the face direction angle of the occupant detected by the situation detecting unit 26 is equal to or larger than the face direction angle determination threshold, the estimation result determining unit 24 does not adopt the temperature of the face of the occupant estimated by the temperature estimating unit 223 (step ST1608).
In the above description, in the occupant temperature estimating device 2a, the estimation result determining unit 24 does not adopt the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 depending on a detection result of the situation of the occupant detected by the situation detecting unit 26. However, the present disclosure is not limited to this. In the occupant temperature estimating device 2a, for example, the reliability estimating unit 23 may estimate that the reliability of the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 is low depending on a detection result of the situation of the occupant detected by the situation detecting unit 26. Specifically, for example, when the size of the region of the occupant's hair detected by the situation detecting unit 26 is equal to or larger than the hair region determination threshold which is set in advance, or when the face direction angle of the occupant detected by the situation detecting unit 26 is equal to or larger than the face direction angle determination threshold, the reliability estimating unit 23 sets the reliability of the temperatures of the hand and the face of the occupant estimated by the temperature estimating unit 223 to a value lower than the reliability determination threshold.
When the reliability estimating unit 23 sets the reliability of the estimated temperatures of the hand and the face of the occupant to a value lower than the reliability determination threshold, the estimation result determining unit 24 determines that the temperatures are unreliable, so that the temperatures are prevented from being output.
The information regarding the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2 according to the first embodiment is used to detect a wakefulness level of the occupant or a sensible temperature of the occupant in an occupant state detection device.
Note that, in the following description, components similar to those already described are denoted by the same reference numerals, and duplicate description thereof is omitted.
The occupant state detection device 101 detects a wakefulness level of the occupant by using information regarding the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2. In addition, the occupant state detection device 101 detects a sensible temperature of the occupant by using information regarding the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2.
The occupant state detection device 101 is mounted, for example, on a vehicle.
The occupant state detection device 101 includes the occupant temperature estimating device 2, the wakefulness level detecting unit 4, and the sensible temperature detecting unit 5.
The wakefulness level detecting unit 4 detects the wakefulness level of the occupant on the basis of the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2.
The wakefulness level detecting unit 4 detects the wakefulness level of the occupant on the basis of a difference between the temperature of the hand of the occupant and the temperature of the face of the occupant, and detects the wakefulness level as, for example, high or low. When the difference between the temperature of the hand of the occupant and the temperature of the face of the occupant is equal to or lower than a threshold, the wakefulness level detecting unit 4 detects that the wakefulness level of the occupant is low. Note that the above-described detection method is merely an example, and the wakefulness level detecting unit 4 may detect the wakefulness level of the occupant using a known technique of detecting the wakefulness level from the temperatures of the hand and the face.
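A minimal Python sketch of this rule follows; the sign convention of the difference and the threshold value are assumptions, since the disclosure leaves the concrete method to a known technique.

```python
def detect_wakefulness_level(hand_temp_c: float, face_temp_c: float,
                             diff_threshold_c: float = 2.0) -> str:
    """A small face-hand temperature difference is taken to indicate a low
    wakefulness level; the face-minus-hand convention and the 2.0 degC
    threshold are illustrative assumptions."""
    if face_temp_c - hand_temp_c <= diff_threshold_c:
        return "low"
    return "high"
```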
The wakefulness level detecting unit 4 outputs information regarding the detected wakefulness level to, for example, a warning system (not illustrated) or an automatic driving system (not illustrated).
The warning system warns the occupant of the vehicle on the basis of, for example, the wakefulness level detected by the wakefulness level detecting unit 4. Specifically, for example, when it is detected that the wakefulness level is low, the warning system outputs a warning sound from a sound output device mounted on the vehicle, such as a speaker.
The automatic driving system switches driving control of the vehicle to automatic driving control on the basis of, for example, the wakefulness level detected by the wakefulness level detecting unit 4.
Therefore, a driving support function for the occupant is implemented.
The sensible temperature detecting unit 5 detects the sensible temperature of the occupant on the basis of the temperatures of the hand and the face of the occupant. The sensible temperature detecting unit 5 may detect the sensible temperature of the occupant using a known technique of detecting the sensible temperature from the temperatures of the hand and the face.
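Purely to illustrate the interface of the sensible temperature detecting unit 5, a hypothetical calculation might look as follows; the linear form and the weights are invented placeholders, since the disclosure only refers to a known technique based on the temperatures of the hand and the face.

```python
def detect_sensible_temperature(hand_temp_c: float, face_temp_c: float) -> float:
    """Hypothetical sensible temperature as a weighted combination of the two
    skin temperatures; the 0.6/0.4 weights are assumptions, not disclosed values."""
    return 0.6 * face_temp_c + 0.4 * hand_temp_c
```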
The sensible temperature detecting unit 5 outputs information regarding the detected sensible temperature to, for example, an air conditioning system (not illustrated).
The air conditioning system controls an air conditioner (not illustrated) mounted on the vehicle on the basis of, for example, the sensible temperature detected by the sensible temperature detecting unit 5. As a result, air conditioning control comfortable for the occupant is implemented.
An operation of the occupant state detection device 101 according to the first embodiment will be described.
Note that, in the occupant state detection device 101, before the operation described below is performed, the occupant temperature estimating device 2 has estimated the temperatures of the hand and the face of the occupant and output information regarding the estimated temperatures.
When the estimation result determining unit 24 of the occupant temperature estimating device 2 outputs information regarding the temperatures of the hand and the face of the occupant, the wakefulness level detecting unit 4 detects the wakefulness level of the occupant on the basis of the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2 (step ST1801).
The wakefulness level detecting unit 4 outputs information regarding the detected wakefulness level to, for example, the warning system or the automatic driving system.
The sensible temperature detecting unit 5 detects the sensible temperature of the occupant on the basis of the temperatures of the hand and the face of the occupant (step ST1802).
The sensible temperature detecting unit 5 outputs information regarding the detected sensible temperature to, for example, the air conditioning system.
Note that, in the above-described operation, the processing in step ST1801 and the processing in step ST1802 may be performed in reverse order or in parallel.
As described above, the occupant state detection device 101 according to the first embodiment can detect the wakefulness level of the occupant on the basis of the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2. For example, a driving support function for the occupant is implemented in the warning system or the automatic driving system on the basis of the wakefulness level of the occupant.
In addition, the occupant state detection device 101 according to the first embodiment can detect the sensible temperature of the occupant on the basis of the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2. Comfortable air conditioning control for the occupant is implemented, for example, in the air conditioning system on the basis of the sensible temperature of the occupant.
In the first embodiment, the functions of the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, and the estimation result determining unit 24 are implemented by a processing circuit 1901. That is, the occupant temperature estimating device 2 includes the processing circuit 1901 for estimating temperatures of the hand and the face of the occupant in the vehicle interior on the basis of the temperature image.
The processing circuit 1901 may be dedicated hardware, or may be the CPU 1904 that executes a program stored in the memory 1905.
When the processing circuit 1901 is dedicated hardware, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination thereof corresponds to the processing circuit 1901.
When the processing circuit 1901 is the CPU 1904, the functions of the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, and the estimation result determining unit 24 are implemented by software, firmware, or a combination of software and firmware. Software or firmware is described as a program and stored in the memory 1905. By reading and executing the program stored in the memory 1905, the processing circuit 1901 executes the functions of the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, and the estimation result determining unit 24. That is, the occupant temperature estimating device 2 includes the memory 1905 for storing a program that causes the above-described processes in steps ST1301 to ST1306 to be executed as a result.
Note that some of the functions of the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, and the estimation result determining unit 24 may be implemented by dedicated hardware, and some of the functions may be implemented by software or firmware. For example, the function of the temperature image acquiring unit 21 can be implemented by the processing circuit 1901 as dedicated hardware, and the functions of the estimation processing unit 22, the reliability estimating unit 23, and the estimation result determining unit 24 can be implemented by reading and executing a program stored in the memory 1905 by the processing circuit 1901.
The occupant temperature estimating device 2 includes an input interface device 1902 and an output interface device 1903 that perform wired communication or wireless communication with a device such as the sensor 1.
Note that the hardware configuration of the occupant temperature estimating device 2a is similar to the hardware configuration of the occupant temperature estimating device 2.
The functions of the camera image acquiring unit 25 and the situation detecting unit 26 are implemented by the processing circuit 1901. That is, the occupant temperature estimating device 2a includes the processing circuit 1901 for estimating the temperatures of the hand and the face of the occupant in the vehicle interior on the basis of the temperature image and for performing control not to adopt the estimated temperature of the hand or the face of the occupant depending on a situation of the occupant.
By reading and executing the program stored in the memory 1905, the processing circuit 1901 implements the functions of the camera image acquiring unit 25 and the situation detecting unit 26. That is, the occupant temperature estimating device 2a includes the memory 1905 for storing a program that causes the above-described processes in steps ST1606 and ST1607 to be executed as a result.
The occupant temperature estimating device 2a includes the input interface device 1902 and the output interface device 1903 that perform wired communication or wireless communication with a device such as the sensor 1 or the camera 3.
The hardware configuration of the occupant state detection device 101 according to the first embodiment is similar to the hardware configuration of the occupant temperature estimating device 2.
The functions of the wakefulness level detecting unit 4 and the sensible temperature detecting unit 5 are implemented by the processing circuit 1901. That is, the occupant state detection device 101 includes the processing circuit 1901 for performing control to detect the wakefulness level or the sensible temperature of the occupant on the basis of the temperatures of the hand and the face of the occupant estimated by the occupant temperature estimating device 2.
By reading and executing the program stored in the memory 1905, the processing circuit 1901 implements the functions of the wakefulness level detecting unit 4 and the sensible temperature detecting unit 5. That is, the occupant state detection device 101 includes the memory 1905 for storing a program that causes the above-described processes in steps ST1801 and ST1802 to be executed as a result.
The occupant state detection device 101 includes the input interface device 1902 and the output interface device 1903 that perform wired communication or wireless communication with a device such as the sensor 1 or an air conditioner.
In the first embodiment described above, in the occupant temperature estimating device 2, 2a, the binarization processing unit 221 performs Otsu's binarization twice, but this is merely an example, and the binarization processing unit 221 may perform Otsu's binarization only once.
Note that the binarization processing unit 221 can set a temperature candidate region more accurately by performing Otsu's binarization twice.
As described above, since the temperature image is an image having a medium or lower resolution, a boundary between the occupant's hand and a portion other than the occupant's hand or a boundary between the occupant's face and a portion other than the occupant's face is blurred on the temperature image. Therefore, when Otsu's binarization is performed only once, the boundary is not clear, and a relatively large region including the hand or the face of the occupant is set as a temperature candidate region.
By performing Otsu's binarization twice, the binarization processing unit 221 can further narrow down the temperature candidate region. Therefore, the binarization processing unit 221 can set a temperature candidate region which includes relatively central portions of the hand and the face of the occupant and which is more separated from the surroundings. As a result, when determining a temperature region, the temperature estimating unit 223 can determine that a temperature candidate region which includes relatively central portions of the hand and the face of the occupant and which is more separated is the temperature region, and thereby the estimation accuracy of the temperatures of the hand and the face of the occupant can be enhanced.
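One way to realize the two-pass binarization described above is sketched below in Python, assuming the Otsu implementation provided by scikit-image; the disclosure itself does not prescribe a particular library or implementation.

```python
import numpy as np
from skimage.filters import threshold_otsu  # standard Otsu threshold implementation

def narrow_candidate_region(target_region: np.ndarray) -> np.ndarray:
    """Two-pass Otsu binarization: the second pass re-thresholds only the
    pixels kept by the first pass, shrinking the candidate region toward
    the central portion of the hand or the face."""
    t1 = threshold_otsu(target_region)
    first_mask = target_region > t1                  # coarse candidate; blurred boundary still included
    t2 = threshold_otsu(target_region[first_mask])   # threshold computed only over the kept pixels
    return target_region > t2                        # tighter candidate, better separated from the surroundings
```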
In the first embodiment described above, the binarization processing unit 221 creates a binary image in which pixels in the target region are binarized by performing Otsu's binarization, but this is merely an example. The binarization processing unit 221 may binarize pixels in the target region by a method other than Otsu's binarization. For example, the binarization processing unit 221 may binarize pixels in the target region using another known image processing means.
In the first embodiment described above, the occupant temperature estimating device 2, 2a includes the reliability estimating unit 23 and the estimation result determining unit 24; however, the reliability estimating unit 23 and the estimation result determining unit 24 are not necessarily included.
In the first embodiment described above, the occupant temperature estimating device 2, 2a is an in-vehicle device mounted on a vehicle, and the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, the estimation result determining unit 24, the camera image acquiring unit 25, and the situation detecting unit 26 are included in the occupant temperature estimating device 2, 2a. However, the configuration is not limited to this; some of the temperature image acquiring unit 21, the estimation processing unit 22, the reliability estimating unit 23, the estimation result determining unit 24, the camera image acquiring unit 25, and the situation detecting unit 26 may be mounted on an in-vehicle device of the vehicle, and the others may be included in a server connected to the in-vehicle device via a network. In this manner, the in-vehicle device and the server may constitute a system.
In the first embodiment described above, the occupant state detection device 101 includes the occupant temperature estimating device 2, the wakefulness level detecting unit 4, and the sensible temperature detecting unit 5, but this is merely an example. For example, any one of the occupant temperature estimating device 2, the wakefulness level detecting unit 4, and the sensible temperature detecting unit 5 may be disposed outside the occupant state detection device 101.
In the first embodiment described above, the body part of the occupant is the hand or the face, but this is merely an example. The occupant temperature estimating device 2, 2a may estimate the temperature of a body part of the occupant other than the hand and the face. The occupant temperature estimating device 2, 2a only needs to estimate the temperature of at least one of the hand and the face of the occupant.
In the first embodiment described above, the occupant is assumed to be a driver of the vehicle, but this is merely an example. The occupant may be, for example, an occupant other than the driver, such as an occupant seated in a front passenger seat.
As described above, according to the first embodiment, the occupant temperature estimating device 2, 2a includes: the temperature image acquiring unit 21 that acquires a temperature image which is obtained by imaging a vehicle interior and whose pixels each have temperature information; the binarization processing unit 221 that sets at least one temperature candidate region in a target region from which a temperature of a body part of an occupant present in the vehicle interior is to be estimated, by binarizing pixels in the target region on the basis of the temperature information possessed by the pixels in the target region, the target region being included in a region of the temperature image acquired by the temperature image acquiring unit 21; the candidate region temperature calculating unit 222 that calculates a region temperature for the temperature candidate region in the target region on the basis of the temperature information possessed by a pixel in the temperature candidate region; and the temperature estimating unit 223 that determines a temperature region from among the at least one temperature candidate region on the basis of a separation degree indicating how much the temperature information possessed by the pixel in the temperature candidate region stands out from the temperature information possessed by a pixel in a region other than the temperature candidate region in the target region, the separation degree being calculated for the temperature candidate region in the target region, and estimates that the temperature of the body part of the occupant is the region temperature for the temperature region.
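A condensed Python sketch of this overall flow is given below. The disclosure does not fix a formula for the separation degree or for the region temperature; the normalized mean gap and the in-region mean used here are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage  # for labeling connected candidate regions

def estimate_body_part_temperature(target_region: np.ndarray,
                                   candidate_mask: np.ndarray) -> float:
    """Determine the temperature region among the candidate regions and
    return its region temperature, following the flow summarized above."""
    labels, num_regions = ndimage.label(candidate_mask)
    best_temp, best_sep = float("nan"), -np.inf
    for k in range(1, num_regions + 1):
        inside = target_region[labels == k]
        outside = target_region[labels != k]
        # Assumed separation degree: gap between the in-region and
        # out-of-region mean temperatures, normalized by the spread
        # of the out-of-region temperatures.
        sep = (inside.mean() - outside.mean()) / (outside.std() + 1e-9)
        if sep > best_sep:
            best_sep, best_temp = sep, float(inside.mean())
    return best_temp  # region temperature of the most separated candidate
```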
Therefore, the occupant temperature estimating device 2, 2a can enhance estimation accuracy of the temperatures of the hand and the face of the occupant based on the temperature image as compared with a conventional temperature estimating technique based on the temperature image.
In addition, the occupant temperature estimating device 2, 2a can accurately estimate the temperatures of the hand and the face of the occupant even from a temperature image having a medium or lower resolution.
In addition, according to the first embodiment, the occupant temperature estimating device 2, 2a includes: the reliability estimating unit 23 that estimates reliability of the temperature of the body part of the occupant estimated by the temperature estimating unit 223; and the estimation result determining unit 24 that determines whether or not to adopt the temperature of the body part of the occupant estimated by the temperature estimating unit 223 by comparing the reliability estimated by the reliability estimating unit 23 with the reliability determination threshold.
Therefore, the occupant temperature estimating device 2, 2a can further enhance the estimation accuracy of the temperatures of the hand and the face of the occupant based on the temperature image, and can prevent an estimation result with a low reliability among the estimation results of the temperatures of the hand and the face of the occupant from being used in another device or the like.
Note that, in the present disclosure, any component in the embodiment can be modified, or any component in the embodiment can be omitted.
The occupant temperature estimating device according to the present disclosure can enhance estimation accuracy of the temperatures of the hand and the face of the occupant based on the temperature image as compared with a conventional temperature estimating technique based on the temperature image.
1: sensor, 2, 2a: occupant temperature estimating device, 3: camera, 4: wakefulness level detecting unit, 5: sensible temperature detecting unit, 21: temperature image acquiring unit, 22: estimation processing unit, 221: binarization processing unit, 2211: target region extracting unit, 222: candidate region temperature calculating unit, 223: temperature estimating unit, 23: reliability estimating unit, 24: estimation result determining unit, 25: camera image acquiring unit, 26: situation detecting unit, 100: occupant temperature estimating system, 101: occupant state detection device, 231: machine learning model, 1901: processing circuit, 1902: input interface device, 1903: output interface device, 1904: CPU, 1905: memory
Filing Document: PCT/JP2020/045315
Filing Date: 12/4/2020
Country: WO