The present invention relates to an eye state detecting method and an eye state detecting system, and particularly relates to an eye state detecting method and an eye state detecting system which can determine an eye state via an image with a low resolution and a small determining range.
More and more electronic apparatuses provide a function for detecting an eye opening state or an eye closing state. Such a function can remind the user that his eyes are closed, to avoid the eyes being closed at an improper time (e.g., while taking a picture). The user can also control the electronic apparatus by opening or closing his eyes. Such an electronic apparatus needs a detecting apparatus to detect the eye opening and the eye closing. One of the detecting methods is capturing images via an image sensor and detecting whether the user's eye is open or closed based on features of the images.
However, images with a high resolution or a large determining range are needed if the image features are to be determined properly; thus the cost of the electronic apparatus rises, or more computing loading is needed, which causes higher power consumption. Conversely, if images with a low resolution are applied for detecting, it is hard to identify the image features, and thus hard to detect whether the user's eye is open or closed.
One objective of the present invention is to provide a detecting method that can use an image with a low resolution to determine the eye state.
Another objective of the present invention is to provide a detecting system that can use an image with a low resolution to determine the eye state.
One embodiment of the present invention discloses an eye state detecting method, applied to an electronic apparatus with an image sensor, which comprises: (a) acquiring a detecting image via the image sensor; (b) defining a face range on the detecting image; (c) defining a determining range on the face range; and (d) determining if the determining range comprises an open eye image or a closed eye image.
Another embodiment of the present invention discloses an eye state detecting system, comprising: a control circuit; an image sensor, wherein the control circuit controls the image sensor to capture a detecting image; and a computing circuit, configured to define a face range on the detecting image, to define a determining range on the face range, and to determine if the determining range comprises an open eye image or a closed eye image.
In view of the above-mentioned embodiments, the eye state of the user can be determined without detailed image features and without an image covering a large range; thus the prior art issues that an image with a high resolution is needed for determining the user's eye state, and that large computing loading causes high power consumption, can be solved.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
In the following descriptions, several embodiments are provided to explain the concept of the present invention. Please note that the devices in the following embodiments, for example, the unit, the module or the system, can be implemented by hardware (e.g., a circuit) or hardware with firmware (e.g., programs written into a microprocessor).
In this embodiment, the detecting range DR is smaller than a maximum detecting range MDR, and the location thereof is pre-defined. In one embodiment, a possible location of the user's eye is pre-defined, and the detecting range DR is decided based on the possible location.
In the embodiment of
Step 401
Decide a detecting range according to a possible location of a user's eye.
Step 403
Capture a detecting image via the detecting range of the step 401.
Step 405
Determine whether the user's eye is in an opening state or in a closing state according to a brightness of the detecting image.
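For illustration, a minimal Python sketch of steps 401-405 is given below, assuming a grayscale sensor frame; the detecting-range coordinates, the helper names, and the threshold TH are assumptions of this sketch, not values from the disclosure.

```python
import numpy as np

# Assumed pre-defined detecting range (x, y, width, height) around the
# possible eye location; the concrete values are illustrative only.
DR = (120, 80, 64, 32)
TH = 90.0  # empirically chosen brightness threshold (assumption)

def capture_detecting_image(frame: np.ndarray, dr=DR) -> np.ndarray:
    """Steps 401/403: crop the detecting range out of the full sensor frame."""
    x, y, w, h = dr
    return frame[y:y + h, x:x + w]

def eye_is_open(detecting_image: np.ndarray, th: float = TH) -> bool:
    """Step 405: decide the eye state from the overall brightness.

    An open eye exposes the dark pupil and iris, so the mean brightness of
    the detecting range tends to be lower than when the eye is closed (skin).
    """
    return float(detecting_image.mean()) < th
```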
Another embodiment of the present invention is disclosed below, which determines whether the user's eye is in an opening state or in a closing state according to a brightness variation tendency. One of the determining rules is: while the user's eye is open, the darkest part of the image is always a part of the eye, and the peripheral region of the darkest part is also a part of the eye, thus has a dark image as well. Accordingly, the brightness variation tendency for the peripheral region of the darkest part is gentle while the user's eye is open. On the contrary, while the user's eye is closed, the darkest part of the image is always a region that is not skin (e.g., the eyelashes), and the peripheral region of the darkest part is skin in such a case. Accordingly, the peripheral region of the darkest part in this case has a brighter image. Therefore, the brightness variation tendency for the peripheral region of the darkest part is sharp while the user's eye is closed. Please note that the following embodiments can be implemented with the embodiment illustrated in
In one embodiment, the standard image line is an N-th image line of the detecting image. In such a case, brightness sum differences between the brightness sum of the standard image line and the brightness sums of each one of the image lines from an N+1-th image line to an N+K-th image line of the detecting image are computed. Furthermore, brightness sum differences between the brightness sum of the standard image line and the brightness sums of each one of the image lines from an N−1-th image line to an N−K-th image line of the detecting image are computed. K is a positive integer larger than or equal to 1.
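A compact sketch of this selection and differencing step is given below; the array layout and function names are assumptions of the sketch.

```python
import numpy as np

def brightness_sums(img: np.ndarray) -> np.ndarray:
    """One brightness sum per image line; here a line is a pixel row."""
    return img.sum(axis=1)

def standard_line(sums: np.ndarray) -> int:
    """The standard image line: the line with the lowest brightness sum."""
    return int(np.argmin(sums))

def sum_differences(sums: np.ndarray, n: int, k: int) -> float:
    """Differences between lines N±1..N±K and the standard line N.

    Assumes the window n-k..n+k lies inside the image.
    """
    window = list(range(n - k, n)) + list(range(n + 1, n + k + 1))
    return float(sum(sums[i] - sums[n] for i in window))
```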
Such an embodiment is explained via the example below:
Table 1 illustrates brightness sums of different pixel rows while an eye is open and while an eye is closed. a_x indicates the brightness sum of the x-th pixel row; for example, a9 indicates the brightness sum of the 9-th pixel row, and a15 indicates the brightness sum of the 15-th pixel row. In this example, the pixel row with the lowest brightness while the eye is open is the 12-th row, which has a brightness sum of 2542 (a12). If the above-mentioned K is 3, brightness sum differences between the brightness sum of the 12-th image row and the brightness sums of each one of the image rows from the 9-th image row to the 11-th image row of the detecting image are computed. Also, brightness sum differences between the brightness sum of the 12-th image row and the brightness sums of each one of the image rows from the 13-th image row to the 15-th image row of the detecting image are computed. Such operations are depicted in Equation (1):
Brightness sum difference = (a9 − a12) + (a10 − a12) + (a11 − a12) + (a13 − a12) + (a14 − a12) + (a15 − a12)   Equation (1): Eye open
Similarly, the pixel row with the lowest brightness while the eye is closed is the 16-th row, which has a brightness sum of 2643 (a16). If the above-mentioned K is 3, brightness sum differences between the brightness sum of the 16-th image row and the brightness sums of each one of the image rows from the 13-th image row to the 15-th image row of the detecting image are computed. Also, brightness sum differences between the brightness sum of the 16-th image row and the brightness sums of each one of the image rows from the 17-th image row to the 19-th image row of the detecting image are computed. Such operations are depicted in Equation (2):
Brightness sum difference = (a13 − a16) + (a14 − a16) + (a15 − a16) + (a17 − a16) + (a18 − a16) + (a19 − a16)   Equation (2): Eye closed
Based on Equation (1), the brightness sum difference while the eye is open is:
(4035−2542)+(3514−2542)+(2813−2542)+(2669−2542)+(2645−2542)+(2835−2542)=3259.
Based on Equation (2), the brightness sum difference while the eye is closed is:
(3772−2643)+(3226−2643)+(2703−2643)+(2878−2643)+(3365−2643)+(3745−2643)=3831.
The above-mentioned Equation (1) and Equation (2) can be regarded as cost functions. New cost functions can be acquired if the concept of absolute values is applied to Equations (1) and (2), whereby Equations (3) and (4) are acquired.
Brightness sum difference = |a9 − a10| + |a10 − a11| + |a11 − a12| + |a13 − a12| + |a14 − a13| + |a15 − a14|   Equation (3): Eye open
Brightness sum difference = |a13 − a14| + |a14 − a15| + |a15 − a16| + |a17 − a16| + |a18 − a17| + |a19 − a18|   Equation (4): Eye closed
Based on Equation (3), the brightness sum difference while the eye is open is:
|4035−3514|+|3514−2813|+|2813−2542|+|2669−2542|+|2669−2645|+|2835−2645|=1834.
Based on Equation (4), the brightness sum difference while the eye is closed is:
|3772−3226|+|3226−2703|+|2703−2643|+|2878−2643|+|3365−2878|+|3745−3365|=2231.
In view of the above-mentioned examples, the brightness sum difference while the eye is closed is larger than the brightness sum difference while the eye is open. That is, the brightness of the peripheral part of the darkest part of the detecting image varies more sharply while the eye is closed than while the eye is open. Therefore, the brightness variation of the peripheral part of the darkest part of the detecting image can be applied to determine if the user's eye is in an opening state or in a closing state.
Please note that although the pixel row is applied as an example to explain the embodiment in
Step 601
Capture a detecting image. Such a step can apply the detecting range in
Step 603
Compute a brightness sum for each of a plurality of image lines of the detecting image in a particular direction, for example, pixel rows or pixel columns.
Step 605
Apply the image line whose brightness sum has the lowest value as a standard image line.
Step 607
Compute a brightness sum difference between the brightness sum of the standard image line and brightness sums for at least two of the image lines.
Step 609
Decide a brightness variation tendency according to the brightness sum difference.
Step 611
Determine whether a user's eye is in an opening state or in a closing state according to the brightness variation tendency.
Please note that the above-mentioned steps 603-609 can be combined into a step of "computing a brightness variation tendency for a peripheral part of a darkest part of the detecting image". However, such a step can also be formed by steps other than steps 603-609.
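Chaining steps 601-611 gives a sketch like the following, under the assumption that a large absolute-difference cost (a sharp tendency) indicates a closed eye; the threshold TH and the window size K are illustrative values, not part of the disclosure.

```python
import numpy as np

TH = 2000.0  # assumed threshold separating "gentle" from "sharp" tendencies
K = 3        # neighbouring lines on each side of the standard line

def eye_is_open(detecting_image: np.ndarray, k: int = K, th: float = TH) -> bool:
    # Step 603: one brightness sum per image line (pixel rows here).
    sums = detecting_image.sum(axis=1).astype(float)
    # Step 605: the standard image line is the darkest one.
    n = int(np.argmin(sums))
    lo, hi = max(n - k, 0), min(n + k, len(sums) - 1)
    # Steps 607-609: absolute adjacent differences give the variation tendency.
    cost = float(np.abs(np.diff(sums[lo:hi + 1])).sum())
    # Step 611: a gentle tendency (small cost) means the eye is open.
    return cost < th
```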
If the eye state detecting system 700 applies the embodiment illustrated in
Other operations for the eye state detecting system 700 are described in above-mentioned embodiments, thus are omitted for brevity here.
The above-mentioned embodiments firstly decide a detecting range according to a possible location of a user's eye, and then determine whether the user's eye is in an opening state or in a closing state according to a brightness variation tendency of the image. In the following embodiments, the face range is firstly determined, and then a determining range on the face range is decided. After that, whether the user's eye is in an opening state or in a closing state is determined according to the image in the determining range. Detailed steps are explained in the following descriptions.
Please refer to
The above-mentioned embodiment applies a smaller determining range CR, and computing over the whole image is not needed, such that the computing loading can be decreased. In one embodiment, if it is determined that the detecting image SI does not comprise a face image, the step of defining the determining range CR and the step of computing whether the determining range CR comprises an open eye image or a closed eye image can be skipped. In this way, the computing loading can be further decreased. Various methods can be applied to define the determining range CR. In one embodiment (but not limited thereto), a possible location of the eye is determined, and then the determining range CR is defined according to the possible location, as sketched below.
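For example, since eyes usually lie in the upper part of a face, the determining range CR could be carved out of the face range as follows; the fractions are illustrative assumptions, not values from the disclosure.

```python
def determining_range(face_range, top_frac=0.25, height_frac=0.3):
    """Place the determining range CR over the likely eye region of the face.

    face_range: (x, y, width, height) of the face range on the detecting
    image. The fractions are assumed, tunable values.
    """
    x, y, w, h = face_range
    return (x, y + int(top_frac * h), w, int(height_frac * h))
```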
Also, the step 905 extracts features from the module building data, and the step 907 builds a module corresponding to the features extracted in the step 905. For example, at least one image comprising a face image is input in the step 901. The step 905 then extracts features of the face image, and the step 907 builds a face image feature module corresponding to the face image features extracted in the step 905. In this way, it can be known which features should exist if an image comprises a face image. Besides, in the step 909, the detecting image to be determined is input. The step 911 performs a pre-process similar to that of the step 903. The step 913 extracts features from the detecting image. The step 915 determines if the detecting image comprises a face image, an open eye image or a closed eye image according to which one of the determining modules the features of the detecting image meet. After that, it can be determined whether the detecting image comprises a face image, an open eye image or a closed eye image.
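As an illustration of this module-building flow, the sketch below uses scikit-learn's AdaBoostClassifier and an assumed placeholder feature extractor extract_features; it is a sketch of the flow, not the disclosed implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def extract_features(image: np.ndarray) -> np.ndarray:
    # Placeholder feature extractor (assumption); a real system could use
    # Gabor or Haar features here, as mentioned below. Images are assumed
    # to share the same size.
    return image.astype(float).ravel()

# Steps 901-907: build a determining module from labelled module-building data.
def build_module(images, labels):
    X = np.array([extract_features(img) for img in images])
    clf = AdaBoostClassifier(n_estimators=50)
    clf.fit(X, labels)  # e.g., labels: 0 = closed eye image, 1 = open eye image
    return clf

# Steps 909-915: classify a detecting image with the built module.
def classify(module, detecting_image):
    return module.predict([extract_features(detecting_image)])[0]
```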
Various conventional algorithms can be applied in the step 905 or the step 913 to extract image features, for example, the Gabor algorithm or the Haar algorithm. Similarly, various conventional algorithms can be applied to determine which one of the determining modules the detecting image meets (i.e., to classify the detecting image), for example, the AdaBoost algorithm. It will be appreciated that the present invention is not limited to the above-mentioned algorithms.
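For reference, OpenCV ships pre-trained cascades that combine Haar features with AdaBoost; a sketch of applying them to find the face range and eye regions might look as follows. The cascade file names are OpenCV's stock cascades, while the overall flow is an assumption, not the disclosed system.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_regions(gray_image):
    """Return eye rectangles found inside each detected face range."""
    eyes = []
    faces = face_cascade.detectMultiScale(gray_image, scaleFactor=1.1,
                                          minNeighbors=5)
    for (x, y, w, h) in faces:
        face_range = gray_image[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_range):
            eyes.append((x + ex, y + ey, ew, eh))
    return eyes
```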
The embodiments illustrated in
Based upon the embodiments illustrated in
Step 1001
Acquire a detecting image via the image sensor (e.g., SI in
Step 1003
Define a face range on the detecting image (e.g., Fr in
Step 1005
Define a determining range on the face range (e.g., CR in
Step 1007
Determine if the determining range comprises an open eye image or a closed eye image.
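Putting steps 1001-1007 together, a minimal end-to-end sketch could look as follows; the face detector, the determining-range fractions, and the brightness-tendency threshold are all assumptions carried over from the earlier sketches, and the input is assumed to be a grayscale array.

```python
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_eye_state(detecting_image: np.ndarray, th: float = 2000.0, k: int = 3):
    """Return 'open', 'closed', or None if no face range is found."""
    # Step 1003: define a face range on the detecting image.
    faces = face_cascade.detectMultiScale(detecting_image, 1.1, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Step 1005: define a determining range on the face range (assumed fractions).
    cr = detecting_image[y + h // 4: y + h // 2, x: x + w]
    # Step 1007: decide via the brightness variation tendency around the
    # darkest image line (see the earlier embodiment).
    sums = cr.sum(axis=1).astype(float)
    n = int(np.argmin(sums))
    lo, hi = max(n - k, 0), min(n + k, len(sums) - 1)
    cost = float(np.abs(np.diff(sums[lo:hi + 1])).sum())
    return "open" if cost < th else "closed"
```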
In one embodiment, the methods illustrated in
In view of the above-mentioned embodiments, the eye state of the user can be determined without detailed image features and without an image covering a large range; thus the prior art issues that an image with a high resolution is needed for determining the user's eye state, and that large computing loading causes high power consumption, can be solved.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
104121917 | Jul 2015 | TW | national |
105117315 | Jun 2016 | TW | national |
This application is a continuation of the applicant's earlier application, Ser. No. 15/985,632, filed May 21, 2018, which is a continuation of the applicant's earlier application, Ser. No. 15/199,965, filed Jun. 30, 2016. The entire contents thereof are incorporated herein by reference.