This application is a National Stage Entry of PCT/JP2019/003752 filed on Feb. 1, 2019, the contents of which are incorporated herein by reference in their entirety.
The present invention relates to an alertness estimation apparatus and an alertness estimation method for estimating the alertness of a human, and further relates to a computer-readable recording medium in which a program for realizing the apparatus and method is recorded.
In recent years, the working-age population has decreased due to the declining birthrate and aging population, and the labor shortage is worsening. Under such circumstances, attempts to replace some of the work that humans have performed up to now with robots or AI (Artificial Intelligence) are increasing. However, jobs that require intellectual labor are difficult to replace with robots or AI. For this reason, it will be essential for humans to maintain and improve the productivity of intellectual labor in the future.
Incidentally, unlike machines, humans may feel drowsy during work (enter a state of low arousal) due to lack of sleep or a hangover. Lack of sleep may also be caused by a disease such as sleep apnea syndrome. Naturally, in such a situation, it becomes difficult for the worker to maintain normal productivity.
If the wakefulness of a worker is detected and the detected wakefulness is presented to the worker, the worker may be prompted to think about the cause of the lack of sleep and given an opportunity to review his or her lifestyle. In this way, a decrease in the alertness of the worker during work may be suppressed. It is therefore important to accurately detect the wakefulness of humans.
For example, Patent Document 1 discloses an apparatus for detecting the wakefulness of a human. Specifically, the apparatus disclosed in Patent Document 1 measures the blinking time from a face image, and further measures the heart rate variability with a heart rate sensor. Then, the apparatus disclosed in Patent Document 1 determines that the human shows a sign of falling asleep when the measured blinking time is extended and the start of an increase in the HF component is observed in the measured heart rate variability.
The apparatus disclosed in Patent Document 1 can detect the wakefulness of a human, but for that purpose, each human must be equipped with a sensor for detecting the heart rate. As a result, there is a problem in that the cost of introducing the apparatus disclosed in Patent Document 1 is too high for a company or the like.
On the other hand, it is conceivable that the above problem can be solved by introducing, into the apparatus disclosed in Patent Document 1, a technique for detecting a heart rate from a face image of a human. However, in this case, in addition to the process of measuring the blinking time from the face image, the apparatus needs to constantly perform the process of detecting the heart rate from the face image, which increases the processing load on the apparatus. Further, since the apparatus disclosed in Patent Document 1 is realized by a computer used by employees or the like for business or a computer prepared for another purpose, the increase in the processing load is a problem that cannot be overlooked.
An example object of the present invention is to provide an alertness estimation apparatus, an alertness estimation method, and a computer-readable recording medium that solve the aforementioned problems and detect the wakefulness of a human without increasing the introduction cost and the processing load.
To achieve the aforementioned example object, an alertness estimation apparatus according to an example aspect of the present invention includes:
Furthermore, to achieve the aforementioned example object, an alertness estimation method according to an example aspect of the present invention includes:
Moreover, to achieve the aforementioned example object, a computer-readable recording medium according to an example aspect of the present invention has recorded therein a program including instructions that cause a computer to execute:
As described above, according to the present invention, it is possible to detect the wakefulness of a human without increasing the introduction cost and the processing load.
The following describes an alertness estimation apparatus, an alertness estimation method, and a program according to an example embodiment of the present invention with reference to
[Apparatus Configuration]
First, a configuration of the alertness estimation apparatus according to the present example embodiment will be described using
The alertness estimation apparatus 10 according to the present example embodiment shown in
The image extraction unit 11 extracts an image of a portion including the eyes from image data of a face image of a human 30. The first alertness estimation unit 12 estimates a first alertness based on the extracted image. The pulse wave detection unit 13 determines whether or not the estimated first alertness satisfies a set condition, and detects a pulse wave of the human from the face image when the first alertness satisfies the set condition. The second alertness estimation unit 14 estimates a second alertness of the human 30 based on the detected pulse wave.
As described above, in the present example embodiment, the pulse wave detection process from the face image and the alertness estimation process from the pulse wave are executed only when the alertness estimated from the state of the eye portion satisfies the set condition. Therefore, an increase in the processing load of the apparatus is suppressed. Further, in the present example embodiment, it is not necessary for each human to wear a sensor in order to detect the pulse wave. Therefore, according to the present example embodiment, it is possible to detect the wakefulness of humans without increasing the introduction cost and the processing load.
Subsequently, the configuration of the alertness estimation apparatus 10 and the functions of each unit in the present example embodiment will be described more specifically.
As shown in
In the present example embodiment, the image extraction unit 11 first receives the image data output by the imaging apparatus 20, and extracts a feature value from the received image data each time. Subsequently, the image extraction unit 11 compares the extracted feature value with feature values of the eyes and other facial parts registered in advance. The image extraction unit 11 thereby specifies the region of the eyes and the region around the eyes in the image data, and extracts an image of the specified regions.
In the present example embodiment, the first alertness estimation unit 12 detects the state (open or closed) of the eyelids from the image extracted by the image extraction unit 11 (hereinafter referred to as the “extracted image”). The first alertness estimation unit 12 determines whether or not the human 30 is awake according to the detected state, and sets the determination result as the first alertness. Therefore, the first alertness is either a value “1” indicating that the human 30 is awake or a value “0” indicating that the human 30 is sleeping.
Specifically, the first alertness estimation unit 12 acquires the extracted images for a set period, and for each extracted image, calculates the degree of opening and closing of the eyelids in the extracted image (0.0 for the closed state to 1.0 for the open state). The first alertness estimation unit 12 then calculates, for example, the time during which the eyelids are closed, based on the time series of the calculated degrees of opening and closing. Subsequently, the first alertness estimation unit 12 determines that the human 30 is awake when the time during which the eyelids are closed is equal to or less than a threshold value. On the other hand, the first alertness estimation unit 12 determines that the human 30 is sleeping when the time during which the eyelids are closed exceeds the threshold value.
Then, when the first alertness estimation unit 12 determines that the human 30 is awake, the first alertness estimation unit 12 sets the first alertness, indicating that the human 30 is awake, to “1”. Further, when the first alertness estimation unit 12 determines that the human 30 is sleeping, the first alertness estimation unit 12 sets the first alertness, indicating that the human 30 is sleeping, to “0 (zero)”.
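The eyelid-based decision described above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: the per-frame eye-openness values are assumed to come from an upstream eyelid detector, and the frame interval and threshold values are hypothetical parameters.

```python
# Illustrative sketch of the first-alertness decision (awake = 1, sleeping = 0).
# The openness series (0.0 = closed, 1.0 = open) and all parameter values are
# assumptions for illustration, not values specified by the embodiment.

def estimate_first_alertness(openness_series, frame_interval_s=0.1,
                             closed_threshold=0.2, max_closed_time_s=2.0):
    """Return 1 (awake) or 0 (sleeping) from per-frame eye-openness values."""
    # Total time within the set period during which the eyelids were closed.
    closed_time = sum(frame_interval_s
                      for v in openness_series if v < closed_threshold)
    # Awake if the eyelid-closed time does not exceed the threshold.
    return 1 if closed_time <= max_closed_time_s else 0
```

For example, with a 0.1-second frame interval, a series containing one second of closed frames would still be judged awake, while three seconds of closed frames would not.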
When the first alertness estimation unit 12 determines that the human 30 is not awake, that is, when the first alertness is “0”, the pulse wave detection unit 13 determines that the first alertness satisfies the set condition. Then, the pulse wave detection unit 13 detects the pulse wave of the human 30 from the face image taken by the imaging apparatus 20. In the present example embodiment, the pulse wave is detected by detecting a change in the brightness (G value) of the green component in the face image. This technique takes advantage of the property that hemoglobin absorbs green light.
Specifically, the pulse wave detection unit 13 calculates, for each piece of image data output during the set period, the average of the brightness values of the green components of all the pixels in the image. Then, the pulse wave detection unit 13 acquires the time-series change of the brightness value of the green component by using the calculated average values. Next, the pulse wave detection unit 13 performs a process of removing noise caused by movement of the body or head, changes in facial expression, changes in the surrounding light environment, and so on from the acquired time-series change of the brightness value. The pulse wave detection unit 13 sets the time-series change of the brightness value after this processing as the pulse wave.
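The acquisition of the raw brightness time series might look like the following sketch. The assumption that each face-region frame arrives as an H x W x 3 RGB array is illustrative; the embodiment does not specify an input format.

```python
import numpy as np

# Illustrative sketch: average the green (G) channel of each face-region frame
# to obtain the raw brightness time series described above. Frames are assumed
# to be H x W x 3 RGB arrays; this input format is an assumption.

def green_brightness_series(frames):
    """Return the per-frame mean green-component brightness as a 1-D array."""
    return np.asarray([frame[:, :, 1].mean() for frame in frames])
```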
As the noise removal processing method, the following method is desirable. First, the pulse wave detection unit 13 frequency-converts the signal of the time-series change in the brightness value, including noise, for a set period (for example, 60 seconds), in units of windows shorter than the set period (for example, 4 seconds). The pulse wave detection unit 13 calculates, for each window, the frequency at which the power is maximized (the frequency corresponding to the pulse component). Then, the pulse wave detection unit 13 extracts a signal of the time-series change of the frequency corresponding to the pulse component by using the calculation results.
Next, the pulse wave detection unit 13 calculates statistical values (for example, the average value, standard deviation, maximum value, and minimum value) from the extracted time-series change signal of the frequency. The pulse wave detection unit 13 uses the calculated statistical values to specify the frequency range in which the time-series change signal of the frequency is distributed. After that, the pulse wave detection unit 13 applies a bandpass filter that passes only signals in that frequency range to the signal indicating the time-series change of the brightness value, including the noise, for the set period (60 seconds). As a result, noise unrelated to the pulse component is removed.
In the noise removal processing method described above, the frequency value corresponding to the pulse, calculated from the signal of the time-series change of the brightness value including noise, contains an error. However, in the above-mentioned noise removal processing method, the error is suppressed, in terms of the frequency range in which the time-series change signal is distributed, by calculating the statistical values over the set period (60 seconds). Therefore, with the above-mentioned noise removal processing method, it is possible to effectively obtain the time-series change of the brightness value after noise removal, that is, the pulse wave.
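The noise-removal steps above (per-window dominant frequency, statistics of those frequencies, band-pass to their range) can be sketched as follows. The sampling rate, the window length, the mean ± 2σ pass band with a fixed margin, and the use of an ideal FFT-mask band-pass in place of an unspecified filter design are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the noise-removal method described above.
# Parameter values (fs, window_s, margin_hz) are assumptions for illustration.

def denoise_pulse(signal, fs=30.0, window_s=4.0, margin_hz=0.3):
    """Band-pass `signal` to the range of per-window dominant frequencies."""
    signal = np.asarray(signal, dtype=float)
    win = int(window_s * fs)
    peaks = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        seg = seg - seg.mean()                        # remove the DC offset
        spec = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        peaks.append(freqs[np.argmax(spec[1:]) + 1])  # dominant non-DC bin
    # Statistics of the per-window pulse frequencies give the pass band.
    mean, sd = float(np.mean(peaks)), float(np.std(peaks))
    lo = max(0.1, mean - 2.0 * sd - margin_hz)
    hi = mean + 2.0 * sd + margin_hz
    # Ideal band-pass: zero out spectral bins outside [lo, hi].
    spec = np.fft.rfft(signal)
    f = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))
```

Applied to a 60-second signal containing a pulse-like component plus a far higher-frequency disturbance, the sketch keeps the pulse-band component and discards the rest.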
In the present example embodiment, the second alertness estimation unit 14 determines whether or not the human 30 is awake according to the time-series change detected as the pulse wave by the pulse wave detection unit 13, and sets the determination result as the second alertness. Specifically, the second alertness estimation unit 14 determines whether or not the human 30 is awake by using a neural network trained using the training data shown in
Further, examples of the neural network used in the present example embodiment include a convolutional neural network and the like. In
The second alertness estimation unit 14 inputs a new pulse wave time-series signal other than the training data to the neural network. As a result, the neural network outputs “0” or “1” as the determination result of the second alertness. Then, the second alertness estimation unit 14 determines that the human 30 is awake when “1” is output and determines that the human 30 is asleep when “0” is output.
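Structurally, the kind of network described above might be sketched as below. The weights here are random placeholders (the sketch is untrained) and the layer sizes are assumptions; it only illustrates the data flow from a pulse-wave time series, through 1-D convolution and pooling, to a binary awake/sleeping output.

```python
import numpy as np

# Structural, untrained sketch of a 1-D convolutional classifier of the kind
# described above. Random weights and layer sizes are illustrative
# placeholders, not a trained second-alertness estimation model.

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid 1-D convolution of signal x with each kernel (one per row)."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    return windows @ kernels.T        # shape: (len(x) - k + 1, n_kernels)

def second_alertness(pulse_wave, n_kernels=4, kernel_len=8):
    """Return 1 (awake) or 0 (sleeping) from a pulse-wave time series."""
    kernels = rng.standard_normal((n_kernels, kernel_len))
    w_out = rng.standard_normal(n_kernels)
    feats = np.maximum(conv1d(np.asarray(pulse_wave, float), kernels), 0.0)
    pooled = feats.mean(axis=0)       # global average pooling over time
    logit = pooled @ w_out            # linear read-out
    return 1 if 1.0 / (1.0 + np.exp(-logit)) >= 0.5 else 0
```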
As shown in
[Apparatus Operations]
Next, the operations of the alertness estimation apparatus 10 according to the present example embodiment will be described using
As shown in
Next, the first alertness estimation unit 12 estimates the first alertness based on the image extracted in step A1 (step A2). Specifically, the first alertness estimation unit 12 detects the state of the eyes from the image extracted in step A1, determines whether or not the human 30 is awake according to the detected state, and sets the determination result as the first alertness.
Next, the pulse wave detection unit 13 determines whether or not the first alertness estimated in step A2 satisfies the set condition (step A3). Specifically, the pulse wave detection unit 13 determines whether or not the first alertness indicates that the human 30 is sleeping.
As a result of the determination in step A3, when the first alertness does not satisfy the set condition, that is, when the first alertness indicates that the human 30 is awake, the pulse wave detection unit 13 outputs the first alertness to an external device (step A7). As a result, the first alertness is displayed on the screen of the external device. Examples of the external device include a display device, a terminal device, and the like.
On the other hand, as a result of the determination in step A3, when the first alertness satisfies the set condition, that is, when the first alertness indicates that the human 30 is sleeping, the pulse wave detection unit 13 detects the pulse wave of the human 30 from the face image of the image data acquired in step A1 (step A4).
Next, the second alertness estimation unit 14 estimates the second alertness of the human 30 based on the pulse wave detected in step A4 (step A5). Specifically, in the present example embodiment, the second alertness estimation unit 14 determines whether or not the human 30 is awake according to the time-series change of the brightness value detected by the pulse wave detection unit 13, and sets the determination result as the second alertness.
After that, the second alertness estimation unit 14 outputs the second alertness estimated in step A5 to an external device (step A6). As a result, the second alertness is displayed on the screen of the external device.
Further, after the execution of step A6 or A7, step A1 is executed again, and steps A1 to A7 are repeatedly executed. As a result, the latest first alertness or second alertness is always displayed on the screen of the external device.
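The loop of steps A1 to A7 can be summarized as the following sketch, in which each unit is passed in as a placeholder callable; the function names are hypothetical and stand in for the units described above.

```python
# Illustrative sketch of steps A1-A7. The callables are placeholders for the
# image extraction, first/second alertness estimation, and pulse wave units.

def alertness_loop(frames, extract_eyes, first_alertness,
                   detect_pulse, second_alertness):
    """Return the alertness reported for each frame."""
    results = []
    for frame in frames:                          # A1: acquire image data
        eyes = extract_eyes(frame)                # A1: extract eye region
        a1 = first_alertness(eyes)                # A2: first alertness
        if a1 != 0:                               # A3: set condition not met
            results.append(("first", a1))         # A7: output first alertness
        else:
            pulse = detect_pulse(frame)           # A4: detect pulse wave
            a2 = second_alertness(pulse)          # A5: second alertness
            results.append(("second", a2))        # A6: output second alertness
    return results
```

In an actual apparatus the loop would run continuously on the camera stream rather than over a finite frame list.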
Hereinafter, modified examples 1 to 4 in the example embodiment will be described.
Modification 1:
In Modification 1, in the training of the first alertness estimation model used in the first alertness estimation unit 12, a multi-step alertness assigned to each face image, for example, a five-step alertness (1.0: not sleepy at all, 0.75: slightly sleepy, 0.5: sleepy, 0.25: quite sleepy, 0.0: very sleepy), is used as the first alertness serving as the correct answer data. In the training of the first alertness estimation model, the time during which the eyes are closed or the number of blinks is input as the state of the eyes, and the model is trained so that its output matches the above correct answer data.
Further, in the present Modification 1, the output of the first alertness estimation unit 12 is not a binary value (0 or 1) but a multi-value from 0 to 1. Therefore, the pulse wave detection unit 13 operates depending on whether or not the first alertness is below a threshold (the set condition): when the first alertness is equal to or above the threshold, the pulse wave detection unit 13 does not operate, and when the first alertness is below the threshold, the pulse wave detection unit 13 operates.
Modification 2:
In Modification 2, in the training of the second alertness estimation model used in the second alertness estimation unit 14, behavior information is used as the second alertness serving as the correct answer data. The behavior information is obtained, for example, by asking the subject which of four stages (1.0: awake, 0.66: light sleep, 0.33: normal sleep, 0.0: deep sleep) his or her behavior corresponds to.
Further, in the present Modification 2, the output of the second alertness estimation unit 14 is not a binary value (0 or 1) but a multi-value from 0 to 1. In the present Modification 2, the multi-valued output may also be converted into a binary value (awake or sleeping) by the second alertness estimation unit 14 before being output.
Modification 3:
In Modification 3, the second alertness estimation unit 14 determines whether or not the human 30 is awake by using a learning model, and sets the determination result as the second alertness. The learning model is constructed by machine learning the relationship between the brain activity of the human 30 and the pulse wave of the human 30.
In Modification 3, the learning model can be constructed, for example, by training a convolutional neural network using the training data shown in
In the example of
As described above, according to the present example embodiment, the second alertness is estimated by the pulse wave only after the first alertness estimated from the state of eyes is determined to be “sleeping”. Therefore, the increase in the processing load in the alertness estimation apparatus 10 is suppressed. Further, since the pulse wave is detected from the face image as in the first alertness, it is not necessary to prepare a new sensor for detecting the pulse wave, so that the increase in the introduction cost is suppressed.
[Program]
It is sufficient for the program according to the present example embodiment to be a program that causes a computer to execute steps A1 to A7 shown in
Furthermore, the program according to the present example embodiment may be executed by a computer system constructed with a plurality of computers. In this case, for example, each computer may function as one of the image extraction unit 11, the first alertness estimation unit 12, the pulse wave detection unit 13, and the second alertness estimation unit 14.
Here, using
As shown in
The CPU 111 carries out various types of calculation by deploying the program (codes) according to the present example embodiment stored in the storage device 113 to the main memory 112, and executing the codes in a predetermined order. The main memory 112 is typically a volatile storage device, such as a DRAM (dynamic random-access memory). Also, the program according to the present example embodiment is provided in a state where it is stored in a computer-readable recording medium 120. Note that the program according to the present example embodiment may be distributed over the Internet connected via the communication interface 117.
Also, specific examples of the storage device 113 include a hard disk drive and a semiconductor storage device, such as a flash memory. The input interface 114 mediates data transmission between the CPU 111 and an input apparatus 118, such as a keyboard and a mouse. The display controller 115 is connected to a display apparatus 119, and controls display on the display apparatus 119.
The data reader/writer 116 mediates data transmission between the CPU 111 and the recording medium 120, reads out the program from the recording medium 120, and writes the result of processing in the computer 110 to the recording medium 120. The communication interface 117 mediates data transmission between the CPU 111 and another computer.
Specific examples of the recording medium 120 include a general-purpose semiconductor storage device such as a CF (CompactFlash®) card or an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, and an optical recording medium such as a CD-ROM (Compact Disc Read-Only Memory).
Note that the alertness estimation apparatus 10 according to the present example embodiment can also be realized by using items of hardware that respectively correspond to the components, rather than the computer in which the program is installed. Furthermore, a part of the alertness estimation apparatus 10 may be realized by the program, and the remaining part of the alertness estimation apparatus 10 may be realized by hardware.
A part or an entirety of the above-described example embodiment can be represented by (Supplementary Note 1) to (Supplementary Note 21) described below, but is not limited to the description below.
(Supplementary Note 1)
An alertness estimation apparatus, including:
(Supplementary Note 2)
The alertness estimation apparatus according to Supplementary Note 1, wherein
(Supplementary Note 3)
The alertness estimation apparatus according to Supplementary Note 2, wherein
(Supplementary Note 4)
The alertness estimation apparatus according to Supplementary Note 1 or 2, wherein
(Supplementary Note 5)
The alertness estimation apparatus according to Supplementary Note 4, wherein
(Supplementary Note 6)
The alertness estimation apparatus according to Supplementary Note 4, wherein
(Supplementary Note 7)
The alertness estimation apparatus according to any of Supplementary Notes 1 to 6, wherein
(Supplementary Note 8)
An alertness estimation method, including:
(Supplementary Note 9)
The alertness estimation method according to Supplementary Note 8, wherein
(Supplementary Note 10)
The alertness estimation method according to Supplementary Note 9, wherein
(Supplementary Note 11)
The alertness estimation method according to Supplementary Note 8 or 9, wherein
(Supplementary Note 12)
The alertness estimation method according to Supplementary Note 11, wherein
(Supplementary Note 13)
The alertness estimation method according to Supplementary Note 12, wherein
(Supplementary Note 14)
The alertness estimation method according to any of Supplementary Notes 8 to 13, wherein
(Supplementary Note 15)
A computer-readable recording medium in which a program is recorded, the program including instructions that cause a computer to carry out:
(Supplementary Note 16)
The computer-readable recording medium according to Supplementary Note 15, wherein
(Supplementary Note 17)
The computer-readable recording medium according to Supplementary Note 16, wherein
(Supplementary Note 18)
The computer-readable recording medium according to Supplementary Note 15 or 16, wherein
(Supplementary Note 19)
The computer-readable recording medium according to Supplementary Note 18, wherein
(Supplementary Note 20)
The computer-readable recording medium according to Supplementary Note 18, wherein
(Supplementary Note 21)
The computer-readable recording medium according to any of Supplementary Notes 15 to 20, wherein
Although the invention of the present application has been described above with reference to the example embodiment, the invention of the present application is not limited to the above-described example embodiment. Various changes that can be understood by a person skilled in the art within the scope of the invention of the present application can be made to the configuration and the details of the invention of the present application.
As described above, according to the present invention, it is possible to detect the wakefulness of a human without increasing the introduction cost and the processing load. The present invention is useful for systems in which detection of the wakefulness of humans is required, for example, a vehicle driving support system, a computer system for business use, and the like.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/003752 | 2/1/2019 | WO |

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2020/157989 | 8/6/2020 | WO | A
Number | Name | Date | Kind |
---|---|---|---
5859921 | Suzuki | Jan 1999 | A |
6717518 | Pirim | Apr 2004 | B1 |
10467488 | Sicconi | Nov 2019 | B2 |
20080037837 | Noguchi | Feb 2008 | A1 |
20100090839 | Omi | Apr 2010 | A1 |
20110216181 | Yoda | Sep 2011 | A1 |
20140210978 | Gunaratne | Jul 2014 | A1 |
20160120477 | Takahashi | May 2016 | A1 |
20160374606 | Shikii et al. | Dec 2016 | A1 |
20170020432 | Kusukame et al. | Jan 2017 | A1 |
20170325750 | Tanabe | Nov 2017 | A1 |
20180116597 | Yu | May 2018 | A1 |
20180176741 | Cremer | Jun 2018 | A1 |
20180202823 | Maekawa | Jul 2018 | A1 |
20180279892 | Qi | Oct 2018 | A1 |
20190227547 | Sugahara | Jul 2019 | A1 |
Number | Date | Country |
---|---|---
2007-257043 | Oct 2007 | JP |
2009-018091 | Jan 2009 | JP |
2009-028239 | Feb 2009 | JP |
2012-164040 | Aug 2012 | JP |
2013-081708 | May 2013 | JP |
2017-012730 | Jan 2017 | JP |
2017-164526 | Sep 2017 | JP |
Entry
---
International Search Report for PCT Application No. PCT/JP2019/003752, dated Apr. 23, 2019.
English translation of Written Opinion for PCT Application No. PCT/JP2019/003752, dated Apr. 23, 2019.
Number | Date | Country
---|---|---
20220313132 A1 | Oct 2022 | US