Driver monitoring systems are increasingly being included in vehicles to warn a driver when the driver is detected to be in a non-alert state.
According to one exemplary embodiment, a driver alertness detection system includes an imaging unit configured to image an area in a vehicle compartment where a driver's head is located; an image processing unit configured to receive the image from the imaging unit and to determine positions of the driver's head and eyes; and a warning unit configured to determine, based on the determined positions of the driver's head and eyes as output by the image processing unit, whether the driver is in an alert state or a non-alert state, and to output a warning to the driver when the driver is determined to be in the non-alert state.
According to another exemplary embodiment, a method of detecting driver alertness includes imaging an area in a vehicle compartment where a driver's head is located, to obtain imaging data; determining positions of the driver's head and eyes based on the imaging data; based on the determined positions of the driver's head and eyes, determining whether the driver is in an alert state or a non-alert state; and outputting a warning to the driver when the driver is determined to be in the non-alert state.
According to yet another exemplary embodiment, a non-transitory computer readable medium stores computer program code that, when executed by a computer, causes the computer to perform the functions of imaging an area in a vehicle compartment where a driver's head is located, to obtain imaging data; determining positions of the driver's head and eyes based on the imaging data; determining, based on the determined positions of the driver's head and eyes, whether the driver is in an alert state or a non-alert state; and outputting a warning to the driver when the driver is determined to be in the non-alert state.
These and other features, aspects, and advantages of the present invention will become apparent from the following description and the accompanying exemplary embodiments shown in the drawings, which are briefly described below.
The incidence of single-vehicle and multi-vehicle accidents continues to increase due to driver distraction from increasingly sophisticated communication and entertainment devices. Drivers are increasingly distracted by cell phones, which can function as a variety of devices such as instant messengers, cameras, GPS navigation systems, internet web browsers, movie players, and spreadsheet analysis tools. The allure of these applications tempts both novice and experienced drivers away from actively comprehending their lane position, distance to other vehicles, traffic signals, and roadway boundaries.
According to an exemplary embodiment, a driver alertness system is provided to monitor the alertness of a driver and issue a warning if it is determined that the driver has been distracted for a sufficient period.
The instantaneous positional reports of the driver's head and eye vector are aggregated over short time durations and used by a driver alertness algorithm to determine where the driver's attention is focused. If the driver's attention is not adequately focused upon a calibrated forward-looking region, a warning signal will be given to the driver. The warning signal is one that can be conveyed to the driver regardless of where the driver's attention is directed. A wide variety of audio, visual, or tactile warning signals may be issued, including an in-dash buzzer or a vibration sent through the driver's seat by a motorized cam operating on the seat pan.
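As a rough illustration of this aggregation step, the Python sketch below keeps a short rolling window of per-frame head-pose reports and raises a warning when most of the window falls outside a calibrated forward-looking cone. The window length, angular cone, and distraction ratio are hypothetical values for illustration; the text does not specify them.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class PoseSample:
    yaw_deg: float    # head rotation left/right, 0 = facing forward
    pitch_deg: float  # head pitch up/down, 0 = level


class AttentionAggregator:
    """Rolling-window aggregation of per-frame head/eye pose reports."""

    def __init__(self, window_frames=30, yaw_limit=15.0, pitch_limit=10.0,
                 distraction_ratio=0.8):
        # The angular cone and ratio stand in for the calibrated
        # forward-looking region; real values would come from calibration.
        self.window = deque(maxlen=window_frames)
        self.yaw_limit = yaw_limit
        self.pitch_limit = pitch_limit
        self.distraction_ratio = distraction_ratio

    def _is_forward(self, s: PoseSample) -> bool:
        return abs(s.yaw_deg) <= self.yaw_limit and abs(s.pitch_deg) <= self.pitch_limit

    def update(self, sample: PoseSample) -> bool:
        """Add one frame's report; return True when a warning should fire."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history to judge yet
        off = sum(not self._is_forward(s) for s in self.window)
        return off / len(self.window) >= self.distraction_ratio
```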
According to an exemplary embodiment, the driver alertness algorithm includes two independent detection modules operating on image data, such as a 752-column×480-row 10-bit intensity image. The first detection module discriminates background structures from the driver's silhouette and computes head rotation (turning left or right) and tilt/roll (touching ear to shoulder). The second detection module identifies head pitch, control points from discernible facial structures (eyes, nostrils, and lips), and a matrix of inter-point metrics. Both modules derive a pseudo-depth scale from image intensity gradients using a Gouraud shading model or another suitable shading model.
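The pseudo-depth step can be loosely illustrated as follows: under a Gouraud-style model, image intensity varies roughly linearly with surface orientation, so integrating the intensity gradients yields a relative depth scale. This sketch is a deliberate simplification (real shape-from-shading solves a regularized inverse problem) and is not the patented computation.

```python
import numpy as np


def pseudo_depth(gray: np.ndarray) -> np.ndarray:
    """Relative (non-metric) depth cue from intensity gradients.

    `gray` is a single-channel frame, e.g. the 752x480 10-bit intensity
    image mentioned above. The cumulative sums are a naive stand-in for a
    proper Poisson/least-squares gradient integration.
    """
    img = gray.astype(np.float64) / max(float(gray.max()), 1.0)
    gy, gx = np.gradient(img)                       # intensity gradients
    depth = (np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0)) / 2.0
    return (depth - depth.min()) / max(float(np.ptp(depth)), 1e-9)


# e.g. on a synthetic 480-row x 752-column 10-bit frame:
frame = np.random.randint(0, 1024, size=(480, 752)).astype(np.uint16)
relative_depth = pseudo_depth(frame)
```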
Alignment to the coordinate grid, using a principal axis transform, normalizes facial geometry and simplifies the search for related facial features. Control points on the driver's eyes, nostrils, and lips are rotated to be collinear, reducing the likelihood the system will confuse an eye with a nostril. Without the principal axis transformation, the matrix of inter-point metrics could contain corrupted measurements from mismatched facial control points.
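A minimal sketch of such a principal axis transform, assuming 2-D control points and estimating the dominant axis from the covariance eigenvectors (function and parameter names are illustrative, not taken from the claims):

```python
import numpy as np


def principal_axis_align(points: np.ndarray):
    """Rotate (N, 2) control points so their principal axis lies on the x-axis.

    Returns the centered, aligned points plus the 2x2 rotation matrix
    (the analogue of rotation matrix 1052 in the flow described below).
    """
    centered = points - points.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(centered.T))
    major = vecs[:, np.argmax(vals)]           # dominant principal axis
    theta = np.arctan2(major[1], major[0])     # its angle from the x-axis
    c, s = np.cos(-theta), np.sin(-theta)      # rotate by -theta to align
    rot = np.array([[c, -s], [s, c]])
    return centered @ rot.T, rot


# After alignment, paired features such as the two pupils become (nearly)
# collinear along the x-axis, which keeps eye/nostril matching unambiguous.
```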
The algorithm detects the head and outlines the boundary of the head in step 1045. The head is then rotated about the principal axis to align it with the camera's coordinate system in step 1050, whereby a rotation matrix 1052 is created. The head symmetry is then analyzed to determine the orientation of the driver's head relative to a forward-facing orientation in step 1055.
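One plausible form of the symmetry analysis (an assumption for illustration, not the claimed method) is to compare the head region against its own mirror image: a forward-facing head is nearly bilaterally symmetric, and yaw rotation breaks that symmetry.

```python
import numpy as np


def symmetry_score(head: np.ndarray) -> float:
    """Score bilateral symmetry of an aligned grayscale head crop in [0, 1].

    `head` is the region inside the detected head boundary after the
    principal-axis rotation of step 1050; 1.0 means perfectly mirror-
    symmetric, consistent with a forward-facing orientation.
    """
    h, w = head.shape
    left = head[:, : w // 2].astype(np.float64)
    right = np.fliplr(head[:, w - w // 2:]).astype(np.float64)
    diff = np.abs(left - right).mean() / max(float(head.max()), 1.0)
    return 1.0 - diff
```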
The algorithm also detects facial control points from the captured image in step 1060. The facial control points are then rotated to match the rotation of the head about the principal axis in step 1065 (using the rotation matrix 1052). The facial control points are then analyzed using inter-point metrics to determine the orientation of the driver's face relative to a forward-facing orientation in step 1070. Inter-point metrics, or the relationships between the control points, may comprise a set of vectors connecting control points (e.g., vectors between any combination of pupils, nostrils, corners of the mouth, or other suitable control points).
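Such a matrix of inter-point metrics can be held as an N × N array of displacement vectors; a brief sketch, with shapes and naming assumed for illustration:

```python
import numpy as np


def inter_point_metrics(points: np.ndarray) -> np.ndarray:
    """(N, N, 2) matrix of vectors between every pair of (N, 2) control points.

    Entry [i, j] is the vector from point i to point j, so the distance and
    orientation between any pair (pupil to nostril, pupil to mouth corner,
    ...) can be read off directly.
    """
    return points[None, :, :] - points[:, None, :]


# Pairwise distances follow from the vector matrix:
# distances = np.linalg.norm(inter_point_metrics(points), axis=-1)
```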
In steps 1055 and 1070, inter-frame metrics may be used to determine changes in the position of the head over time (i.e., between image frames). As the head of the driver moves, the control points change positions, and the vectors between the control points shift correspondingly.
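An inter-frame metric can then be as simple as the aggregate shift of the inter-point vector matrix between consecutive frames; the mean-norm summary below is one plausible choice, not one mandated by the text.

```python
import numpy as np


def inter_frame_shift(prev_pts: np.ndarray, curr_pts: np.ndarray) -> float:
    """Mean shift of the inter-point vectors between two frames.

    A near-zero value means the facial geometry (and hence the head pose)
    is stable; a large value indicates the head turned or tilted.
    """
    prev_vecs = prev_pts[None, :, :] - prev_pts[:, None, :]
    curr_vecs = curr_pts[None, :, :] - curr_pts[:, None, :]
    return float(np.linalg.norm(curr_vecs - prev_vecs, axis=-1).mean())
```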
Using the position of the head and facial control points, the algorithm classifies the attention state of the driver in step 1075. A warning status is then assigned based on the duration and magnitude of the driver's deviation from a forward-facing orientation in step 1080. The warning status may be used to activate a visual, audio, or tactile warning signal to the driver. The warning status and captured image may also be output to a graphical user interface.
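The duration-and-magnitude rule of step 1080 might look like the following state tracker; the escalation levels and numeric thresholds are invented for illustration.

```python
class WarningTracker:
    """Assign a warning status from how long and how far the driver
    deviates from a forward-facing orientation (step 1080)."""

    def __init__(self, frame_dt: float = 1 / 30, cone_deg: float = 15.0):
        self.frame_dt = frame_dt    # seconds per frame
        self.cone_deg = cone_deg    # forward-facing tolerance
        self.off_time = 0.0         # accumulated off-road time

    def update(self, deviation_deg: float) -> str:
        if deviation_deg <= self.cone_deg:
            self.off_time = 0.0     # reset once attention returns forward
            return "NONE"
        self.off_time += self.frame_dt
        # Larger deviations escalate sooner by weighting the elapsed time.
        effective = self.off_time * (deviation_deg / self.cone_deg)
        if effective > 2.0:
            return "WARN"           # drive the audio/visual/tactile signal
        return "CAUTION" if effective > 1.0 else "NONE"
```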
The driver alertness algorithm determines a driver's attention state based on driver characteristics (head position and eye vector), not on what the vehicle is doing relative to roadway lane markers, as in lane departure warning systems. The driver alertness algorithm therefore has the advantage of warning an inattentive driver before the vehicle deviates from roadway boundaries. The driver alertness algorithm uses interior vision sensing, which is more robust than exterior sensing during nighttime and high-dynamic-range conditions (e.g., going through a tunnel with oncoming vehicle headlights).
The present disclosure has been described with reference to example embodiments; however, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the disclosed subject matter. For example, although different example embodiments may have been described as including one or more features providing one or more benefits, it is contemplated that the described features may be interchanged with one another or, alternatively, be combined with one another in the described example embodiments or in other alternative embodiments. Because the technology of the present disclosure is relatively complex, not all changes in the technology are foreseeable. The present disclosure described with reference to the exemplary embodiments is manifestly intended to be as broad as possible. For example, unless specifically otherwise noted, exemplary embodiments reciting a single particular element also encompass a plurality of such particular elements.
Exemplary embodiments may include program products comprising computer or machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. For example, the driver monitoring system may be computer driven. Exemplary embodiments illustrated in the methods of the figures may be controlled by program products comprising computer or machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such computer or machine-readable media can be any available media which can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such computer or machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of computer or machine-readable media. Computer or machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions. Software implementations of the present invention could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
It is also important to note that the construction and arrangement of the elements of the system as shown in the preferred and other exemplary embodiments is illustrative only. Although only a certain number of embodiments have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts, or elements shown as multiple parts may be integrally formed; the operation of the assemblies may be reversed or otherwise varied; the length or width of the structures and/or members or connectors or other elements of the system may be varied; and the nature or number of adjustment or attachment positions provided between the elements may be varied. It should be noted that the elements and/or assemblies of the system may be constructed from any of a wide variety of materials that provide sufficient strength or durability. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the preferred and other exemplary embodiments without departing from the spirit of the present subject matter.
This application claims priority from Provisional Application 61/467,849, filed Mar. 25, 2011, incorporated herein by reference in its entirety. The present disclosure relates generally to the field of driver monitoring systems. More specifically, the present disclosure relates to a system and method for determining the alertness of a driver using image processing to assess the head position and eye vector of the driver.
Number | Date | Country
---|---|---
20120242819 A1 | Sep 2012 | US
Number | Date | Country
---|---|---
61467849 | Mar 2011 | US