This invention relates to driver condition detecting devices and driver condition informing devices, and particularly to driver condition detecting devices for detecting inattentive driving or discursive driving of a driver, and driver condition informing devices for informing the driver of inattentive driving or discursive driving.
Across all age groups, motor vehicle accidents are attributed most often to inattentive driving and discursive driving (refer to the Japanese government statistics home page, http://www.e-stat.go.jp/). Against this background, technologies advocated for detecting conditions of drivers are those for detecting wakefulness, especially sleepiness or doze states (see PTLs 1 to 3). There is, however, a drawback that detecting sleepiness or doze states alone cannot sufficiently reduce vehicle accidents.
Therefore, an object of the invention is to provide driver condition detecting devices for detecting inattentive driving or discursive driving from the movement of the head and the movement of the eyeball of the driver during driving.
According to one aspect of the invention to achieve the above mentioned object, there is provided a driver condition detecting device, comprising: a camera unit photographing a face of a driver; a head detector detecting a movement of a head of the driver from an image photographed by the camera unit; an eyeball detector detecting a movement of an eyeball of the driver from the image photographed by the camera unit; a vestibule oculogyral reflex movement detector detecting a vestibule oculogyral reflex movement based on the movement of the head detected by the head detector and the movement of the eyeball detected by the eyeball detector; a semicircular canal-cervical reflex movement detector detecting a semicircular canal-cervical reflex movement based on the movement of the head detected by the head detector and the movement of the eyeball detected by the eyeball detector; a front face determining unit determining whether or not the head and the eyeball of the driver face forward based on the movement of the head detected by the head detector and the movement of the eyeball detected by the eyeball detector; and an inattentive driving determining unit determining inattentive driving, wherein said inattentive driving determining unit actuates the vestibule oculogyral reflex movement detector and the semicircular canal-cervical reflex movement detector upon the front face determining unit determining that the head and the eyeball of the driver are not facing forward, and determines inattentive driving upon neither the vestibule oculogyral reflex movement nor the semicircular canal-cervical reflex movement being detected.
Preferably, the driver condition detecting device further comprises an angle-of-convergence detector detecting an angle of convergence of the eyeball from the image photographed by the camera unit, wherein the inattentive driving determining unit determines inattentive driving upon the angle of convergence detected by the angle-of-convergence detector lying above a threshold when the front face determining unit determines the head and the eyeball of the driver as facing forward.
According to another aspect of the invention, the driver condition detecting device comprises: a camera unit photographing a face of a driver; a head detector detecting a movement of a head of the driver from an image photographed by the camera unit; an eyeball detector detecting a movement of an eyeball of the driver from the image photographed by the camera unit; an angle-of-convergence detector detecting an angle of convergence of the eyeball from the image photographed by the camera unit; a front face determining unit determining whether or not the head and the eyeball of the driver face forward based on the movement of the head detected by the head detector and the movement of the eyeball detected by the eyeball detector; and an inattentive driving determining unit determining inattentive driving upon the angle of convergence detected by the angle-of-convergence detector lying above a threshold when the front face determining unit determines the head and the eyeball of the driver as facing forward.
Preferably, the driver condition detecting device comprises a discursive condition determining unit determining a discursive condition upon an angular velocity distribution of the eyeball per unit time lying within a predetermined range.
Preferably, the driver condition detecting device further comprises an informing unit informing the driver of the inattentive driving when a determination as the inattentive driving by the inattentive driving determining unit continues over an inattentive driving determining period.
Preferably, the driver condition detecting device further comprises a speed detector detecting a speed of a vehicle, wherein the informing unit sets the inattentive driving determining period shorter as the detected speed increases.
According to the first aspect of the invention, when the front face determining unit determines that the head and the eyeball of the driver are not facing forward, the inattentive driving determining unit determines inattentive driving if neither the vestibule oculogyral reflex movement nor the semicircular canal-cervical reflex movement is detected. Inattentive driving can thus be determined based on the movements of the head and the eyeball. Furthermore, because the vestibule oculogyral reflex movement and the semicircular canal-cervical reflex movement are detected when the line of sight faces forward even though the head faces rightward, leftward, upward, or downward, such a case is not determined as inattentive driving, allowing inattentive driving to be determined accurately.
Furthermore, since the inattentive driving determining unit determines inattentive driving when the angle of convergence lies above a threshold even though the front face determining unit determines that the head and the eyeball of the driver face forward, inattentive driving can be determined even in the case where the head and the eyeball of the driver face forward but the angle of convergence becomes large, i.e., the driver is looking at a nearby point.
Furthermore, besides inattentive driving, a discursive condition can be determined.
While the driver looks at various places during driving, the informing unit according to the invention informs the driver of inattentive driving only when the determination as inattentive driving by the inattentive driving determining unit continues over the inattentive driving determining period. This prevents informing the driver merely for glancing at a mirror to confirm the rear or at the instrument panel to confirm the driving speed, reducing the wasted information presented to the driver.
According to the invention, the informing unit sets the inattentive driving determining period shorter as the detected speed increases. It follows that when the vehicle travels at low speed, the movement variation of the head is relatively large, so the inattentive driving determining period is set long, and when the vehicle travels at high speed, the movement variation of the head is relatively small, so the inattentive driving determining period is set short, allowing the device to provide the driver with less wasted information.
Hereinafter, a head up display (hereafter referred to as HUD) in which a driver condition detecting device of the present invention is installed, is described with reference to
The camera 3 is, as shown in
The display 4 and the aspheric mirror 5, accommodated in a HUD housing 9, are housed in the instrument panel 8. The aspheric mirror 5 is held with its inclination adjustable, the inclination being adjusted by the motor 6. The display 4 is configured of, for example, a known liquid crystal display device, and emits display light toward the aspheric mirror 5. The aspheric mirror 5 reflects the display light emitted from the display 4 toward a windshield 13 of the vehicle.
The display light reflected by the aspheric mirror 5 travels along a display light path 10, is reflected by the windshield 13, and reaches an eye point 11 of the driver 2. This allows the driver 2 to look at a virtual image 12, outside the windshield 13, corresponding to the display image displayed on the display 4.
The CPU 7 is provided with a driver detector 71 detecting a position of an eyeball of the driver 2, a rotation angle of the eyeball (movement of the eyeball), and a rotation angle of the head (movement of the head) by image-processing an image photographed by the camera 3, and a HUD controller 72 for controlling the display 4 and the aspheric mirror 5 in accordance with a detection result from the driver detector 71.
The driver detector 71 is provided with a face detector 71A for deriving a face area from the image photographed by the camera 3 using known face detecting technology, an eyeball detector 71B for deriving an eyeball area from the face area derived by the face detector 71A using known eyeball detecting technology, an eyeball position calculating unit 71C for calculating an eyeball position (eye point 11) based on the eyeball area derived by the eyeball detector 71B, a line-of-sight detector 71D for detecting a line of sight by deriving an iris area from the eyeball area derived by the eyeball detector 71B, an eyeball rotation angle calculating unit 71E as an eyeball detector for calculating a rotation angle of the eyeball based on the iris area derived by the line-of-sight detector 71D, and a head rotation angle calculating unit 71F as a head detector for calculating a rotation angle of the head from the face area derived by the face detector 71A.
The HUD controller 72 is provided with a display light path calculating unit 72A for calculating the display light path 10 from the eyeball position calculated by the eyeball position calculating unit 71C, a display image position controller 72B for inclining the aspheric mirror 5, by controlling the motor 6, so that the display light is reflected along the display light path 10 calculated by the display light path calculating unit 72A, a driver condition determining unit 72C for determining the condition of the driver 2 from the eyeball rotation angle and the head rotation angle, and an image display unit 72D for controlling the display 4.
Then, an operation of the HUD configured as mentioned above is generally described. At first, the CPU 7 serves as the driver detector 71, performing the driver detection process when an ignition button is switched on. The driver detection process comprises image-processing the image photographed by the camera 3, repeating a face detection of the driver 2, an eyeball detection, a line-of-sight detection, a calculation of the head rotation angle, a calculation of the eye point, a calculation of the eyeball rotation angle, and the like, and storing the data into a memory.
The CPU 7 also serves as the display image position controller 72B, performing the virtual image display position control process for controlling the aspheric mirror 5 so that the virtual image 12 can be seen at the eye point 11 of the driver 2 calculated by the driver detection process. Furthermore, the CPU 7 serves as the driver condition determining unit 72C, performing the driver condition determination process for determining inattentive driving based on the head rotation angle and the eyeball rotation angle calculated by the driver detection process.
Next, the operation of the HUD 1 generally described above is described in detail. Firstly, with reference to
Then, the step S3 of driver detection process is described in detail with reference to the
The CPU 7, after detecting the face (Y in step S32), also serves as the eyeball detector 71B in parallel with the calculation of the head rotation angle, performing the eyeball detection process for deriving the eyeball area from the image photographed by the camera 3 (step S34). If the eyeball is not detected (N in step S35), the CPU 7 returns to step S32 and again performs face detection. When the face is detected, the CPU 7 repeats the eyeball detection, and when the eyeball is detected (Y in step S35), calculates the eye point 11 based on the derived eyeball area (step S36), stores it into the memory, and returns to step S35, repeating the eyeball detection and the calculation of the eye point so as to update the memory. Note that the CPU 7 performs, in parallel with the driver detection process, the virtual image display position control process of controlling the position of the virtual image 12 by controlling the position of the aspheric mirror 5 based on the calculated eye point 11.
Herein, before the driver detection process is described further, the virtual image display position control process mentioned above is described with reference to
However, because the visible angle of the virtual image 12 is limited to a very narrow range, it is impossible for a general HUD 1 (in which a sufficient sight distance is secured) to be designed to cover the whole range of presence of the eye point 11 as long as the position of the virtual image 12 is fixed. It is thus generally necessary for the driver 2 to adjust the vertical position of the virtual image 12 manually so that the virtual image 12 can be seen at the eye point 11 of the driver 2.
In this embodiment, the eye point 11 of the driver 2 is calculated automatically by the driver detection process, and the virtual image 12 is moved automatically to the optimal position, so that the virtual image 12 can always be seen regardless of body type or driving pose. Extreme poses inadequate for driving are excluded.
Then, the virtual image position control process is described in detail. In
Herein, the general position of the eye point 11 is described. Positions of those represented by A in
A correspondence table of the elevation angle a of the eyeball viewed from the camera 3 and the representative point A0 of the eye point 11 corresponding to the elevation angle a is thus stored in a not-shown memory. When calculating the eye point in step S36, the CPU 7 obtains the elevation angle a of the eyeball, derives from the correspondence table the representative point A0 corresponding to the obtained elevation angle a, and stores it in the memory as the eye point 11.
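The table lookup in step S36 can be sketched as follows; the bin boundaries and the representative points A0 are illustrative assumptions, not values from the embodiment.

```python
# Sketch of the elevation-angle-to-eye-point lookup (step S36).
# Bin boundaries and representative points A0 are hypothetical examples.
ELEVATION_TABLE = [
    # (min_deg, max_deg, representative point A0 as (height_mm, depth_mm))
    (-10.0, -3.0, (1180.0, 640.0)),
    (-3.0, 3.0, (1230.0, 700.0)),
    (3.0, 10.0, (1280.0, 760.0)),
]

def eye_point_from_elevation(elevation_deg):
    """Return the representative eye point A0 for an elevation angle, or None."""
    for lo, hi, a0 in ELEVATION_TABLE:
        if lo <= elevation_deg < hi:
            return a0
    return None  # outside the designed range of presence of the eye point
```

A binned lookup keeps the control of the aspheric mirror 5 stable, since small fluctuations of the measured elevation angle within a bin map to the same representative point.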
Then, in the virtual image display position process, as shown in
Now, a control of the motor 6 in the above mentioned step S43 is described in detail. As shown in
In the
Then, as shown in
In practice, by adequately designing the visible range of the virtual image 12 and the camera 3 so that the virtual image 12 remains visible from the two farthest points of the range of presence of the eye point, the position of the virtual image 12 can be kept within the usual visible range using only two-dimensional information.
Alternatively, the eye point 11 can be specified using means for deriving distance information on the eyeball position, as represented by PTLs such as Japanese Patent Application Laid-Open Nos. 2011-149942 and 2008-09162.
Furthermore, the CPU 7 in the example shown in
Note that the display position or the size of the HUD 1 is expressed using equations (1) and (2) below (see
[Math. 1]
S=((L−L0)/L)Heye (1)
S=(L0/L)S0 (2)
where S in equation (1) is the display height of the virtual image 12, S in equation (2) is the display size of the virtual image 12, Heye is the height of the eye point 11, L0 is the distance from the virtual image 12 to the eye point 11, L is the distance from the display position to the eye point 11, and S0 is the size at the face displaying the virtual image 12.
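Under the variable definitions above, equations (1) and (2) can be evaluated as follows; the function names and the numeric values in the test are purely illustrative assumptions.

```python
def virtual_image_height(l, l0, h_eye):
    """Equation (1): display height S of the virtual image,
    S = ((L - L0) / L) * Heye."""
    return ((l - l0) / l) * h_eye

def virtual_image_size(l, l0, s0):
    """Equation (2): display size S of the virtual image scaled from
    the display-face size S0, S = (L0 / L) * S0."""
    return (l0 / l) * s0
```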
Next, returning to the driver detection process shown in FIG. 4, the operation after the detection of the eyeball is described. After detecting the eyeball, the CPU 7 performs line-of-sight detection (step S37) in parallel with the calculation of the eye point. If the line of sight is not detected (N in step S38), the CPU 7 returns to step S35, and when the eyeball is detected in step S35, again performs the line-of-sight detection. When, by repeating this process, the line of sight is detected (Y in step S38), the CPU 7 calculates the eyeball rotation angle (line-of-sight direction) at that time and stores it into the memory (step S39). While the line of sight can be detected, the CPU 7 continuously calculates the eyeball rotation angle to update the memory.
The CPU 7, using the head rotation angle and the eyeball rotation angle thus calculated, performs the driver condition determination process. This driver condition determination process is described with reference to
Note that in the analysis for the head rotation angle and the eyeball rotation angle, as shown in
Herein, a method for calculating the angular velocity ω from the head rotation angle and the eyeball rotation angle is described with reference to
[Math. 2]
ω=θAV/t (3)
where θAV is the angle of movement from the vector a calculated last time to the vector b newly detected, and t is the time required for the movement from the vector a calculated last time to the vector b newly detected.
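Equation (3) can be implemented, for instance, as follows, taking the previously calculated vector a, the newly detected vector b, and the elapsed time t; the vector names follow the description above, and the angle is returned in degrees per second.

```python
import math

def angular_velocity(a, b, t):
    """Equation (3): angular velocity ω = θAV / t, where θAV is the angle
    between the previously calculated vector a and the newly detected
    vector b (both 3-component direction vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    # Clamp against floating-point drift before taking the arccosine.
    cos_theta = max(-1.0, min(1.0, dot / (na * nb)))
    theta_av = math.degrees(math.acos(cos_theta))  # θAV in degrees
    return theta_av / t  # deg/sec
```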
Next, a method for determining inattentive driving of the driver 2 from the head rotation angle and the eyeball rotation angle is described with reference to
The CPU 7, using the analysis result in step S51, detects the initial condition, the vestibule oculogyral reflex movement, the semicircular canal-cervical reflex movement, or the angle of convergence of the eyeball of the driver 2, and thus determines the driver condition. After analyzing the driver condition, as shown in
When determining the driver 2 as not near the initial condition, i.e., the head and the eyeball not facing forward (N in step S52), the CPU 7 serves as the vestibule oculogyral reflex movement detecting means, determining whether or not the vestibule oculogyral reflex movement can be detected from the analysis result of the driver's condition (step S53). The vestibule oculogyral reflex movement, as shown in
When the angular velocities of the head and the eyeball oppose each other in the yaw direction, the CPU 7 detects the vestibule oculogyral reflex movement. When the vestibule oculogyral reflex movement is detected (Y in step S53), the CPU 7 terminates without informing. On the other hand, if the vestibule oculogyral reflex movement is not detected (N in step S53), the CPU 7 serves as semicircular canal-cervical reflex movement detecting means, determining whether or not the semicircular canal-cervical reflex movement can be detected from the analysis result of the driver's condition (step S54).
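A minimal sketch of the opposing-angular-velocity check in step S53 follows; the `min_speed` noise floor is an assumed parameter, not a value from the embodiment.

```python
def vestibulo_ocular_reflex_detected(head_yaw_velocity, eye_yaw_velocity,
                                     min_speed=2.0):
    """Detect the vestibule oculogyral reflex movement: the head and the
    eyeball rotate in opposite directions in the yaw direction.
    Velocities are in deg/sec; min_speed is a hypothetical noise floor
    to avoid triggering on sensor jitter."""
    opposing = head_yaw_velocity * eye_yaw_velocity < 0  # opposite signs
    fast_enough = (abs(head_yaw_velocity) >= min_speed
                   and abs(eye_yaw_velocity) >= min_speed)
    return opposing and fast_enough
```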
The semicircular canal-cervical reflex movement, as shown in
On the other hand, when determining the driver 2 as near the initial condition, i.e., the head and the eyeball facing forward (Y in step S52), the CPU 7 serves as angle-of-convergence detecting means, calculating the angle of convergence (step S56). Herein, before the calculation of the angle of convergence is described, the convergence and divergence movement is described. The convergence and divergence movement is divided into convergence movement (driver's condition D14 in TABLE 1) in which, as shown in
Therefore, if the calculated angle of convergence lies above the threshold (N in step S57), the CPU 7 serves as inattentive driving determining means and informing means, determining inattentive driving and calling the driver 2 to attention (step S58). On the other hand, if the calculated angle of convergence lies below the threshold (Y in step S57), the CPU 7 terminates without informing. Note that the angle of convergence θc at this time can be expressed by equation (4) from the inner product of the line-of-sight vector for the right eye and the line-of-sight vector for the left eye, as shown in
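The convergence angle θc from the inner product of the two line-of-sight vectors can be sketched as follows; the vector representation (3-component gaze directions) is an assumption for illustration.

```python
import math

def convergence_angle(right_gaze, left_gaze):
    """Angle of convergence θc (degrees) between the right-eye and
    left-eye line-of-sight vectors, from their inner product."""
    dot = sum(r * l for r, l in zip(right_gaze, left_gaze))
    nr = math.sqrt(sum(r * r for r in right_gaze))
    nl = math.sqrt(sum(l * l for l in left_gaze))
    # Clamp the normalized inner product before the arccosine.
    cos_tc = max(-1.0, min(1.0, dot / (nr * nl)))
    return math.degrees(math.acos(cos_tc))
```

When the driver looks far ahead the two gaze vectors are nearly parallel and θc is near zero; looking at a nearby point makes the vectors cross and θc grows, which is the quantity compared against the threshold in step S57.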
According to the above mentioned embodiment, when the driver 2 is not in the initial condition, that is, when the head movement and the eyeball movement indicate that the driver is not facing forward, the driver 2 is determined as looking forward if the vestibule oculogyral reflex movement or the semicircular canal-cervical reflex movement is detected from the head movement and the eyeball movement; otherwise (when neither the vestibule oculogyral reflex movement nor the semicircular canal-cervical reflex movement is detected), the driver 2 is determined as being in inattentive driving and is called to attention. Inattentive driving is therefore determined on the basis of the movements of the head and the eyeball. In addition, since the vestibule oculogyral reflex movement and the semicircular canal-cervical reflex movement are detected when the line of sight faces forward even though the head faces rightward, leftward, upward, or downward, such a case is not determined as inattentive driving, and inattentive driving is determined correctly.
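Steps S52 to S58 of the driver condition determination described above can be sketched as a single decision function; the reflex-detection results and the convergence threshold are taken as inputs, and the threshold value is an illustrative assumption.

```python
def determine_inattentive(facing_forward, vor_detected, canal_cervical_detected,
                          convergence_angle_deg, convergence_threshold_deg=10.0):
    """Return True when the driver should be determined as inattentive.

    facing_forward: result of the front face determination (step S52).
    vor_detected / canal_cervical_detected: results of steps S53 / S54.
    convergence_angle_deg: result of step S56; the default threshold of
    10 degrees is a hypothetical value, not from the embodiment.
    """
    if facing_forward:
        # Head and eyeball face forward: inattentive only when the angle
        # of convergence exceeds the threshold (looking at a nearby point).
        return convergence_angle_deg > convergence_threshold_deg
    # Not facing forward: inattentive unless either reflex movement shows
    # that the line of sight is actually kept forward.
    return not (vor_detected or canal_cervical_detected)
```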
Furthermore, according to the above mentioned embodiment, even when the driver 2 is determined as being in the initial condition with the head and the eyeball facing forward, the driver 2 is determined as being in inattentive driving, looking at a nearby point ahead, when the angle of convergence lies above the threshold, and may be called to attention.
Meanwhile, as shown in
The driver 2, during driving, needs to check not only the forward view but also the rearview mirrors (room mirror and right and left side mirrors) and the instrument panel for confirming the car condition and the car speed, in order to obtain information necessary for driving. In the aforementioned first embodiment, the driver 2 is determined as being in inattentive driving in such a case and is called to attention. If such erroneous determinations increase, the driver 2 comes to distrust the device, rendering it useless. Therefore, the sight range at which the driver 2 should look during driving is divided as shown in
It may be thought that because the variation of the head movement while driving at low speed is relatively large, the inattentive driving determining period is made long, and that because the variation of the head movement while driving at high speed is relatively small, the inattentive driving determining period is made short. Described in detail, the CPU 7 serves as speed detecting means, detecting the car speed from the output of a running sensor, and as the detected speed becomes higher, the inattentive driving determining period is set shorter. This makes it possible to reduce wasted calls for attention to the driver 2. Note that though only periods for boarding, low speed, and high speed are described above, a period for middle speed may also be set.
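The speed-dependent setting of the inattentive driving determining period might look like the following; the speed bands and period values are assumptions for illustration only.

```python
def inattentive_determining_period(speed_kmh):
    """Return the inattentive driving determining period (seconds).
    The period is set shorter as the detected speed increases;
    band boundaries and period values are hypothetical."""
    if speed_kmh < 1.0:      # boarding / stopped
        return 5.0
    if speed_kmh < 30.0:     # low speed: head movement varies widely
        return 3.0
    if speed_kmh < 80.0:     # middle speed (the optional band noted above)
        return 2.0
    return 1.0               # high speed: head movement varies little
```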
Meanwhile, it is known that while entering and passing through a curved road, the driver 2 tends to look at the tangent point (the tangent point of the straight line passing through the eye point and the inside edge of the curve) (refer to M. F. Land and D. N. Lee, "Where we look when we steer", pp. 742-744, vol. 369, Nature, 30th June, 1994). When the car entering a curve is detected from an image from a mounted camera photographing the traveling direction, the inside edge of the curve is detected from that image, and as shown in
In the aforementioned embodiment, as shown in
In the aforementioned first embodiment, with respect to the control of the aspheric mirror 5, each time the variation of the eye point 11 exceeds a control determination value (D/2 in the first embodiment), the inclination of the aspheric mirror 5 is newly adjusted to correspond to the eye point at that time. The invention, however, is not limited to this embodiment. For example, it is thought that since the head movement becomes large during boarding or traveling at low speed, decreasing the above mentioned control determination value makes it possible to follow the eye point calculation value sensitively (shortening the moving average period), and that since the head movement becomes small during traveling at high speed, increasing the above mentioned control determination value makes it possible to follow the eye point calculation value insensitively (lengthening the moving average period). Note that though only periods for boarding, low speed, and high speed are described above, a period for middle speed may also be set.
In the display image position control, it is thought that, for example, the position at which the driver 2 should look is determined on the basis of the eye point calculation value for the initial condition (driving position) of the driver 2 upon boarding. This makes it possible to call for attention even when the driving pose collapses.
Though in the above mentioned first embodiment inattentive driving is determined and informed, the invention is not limited to this embodiment. In order to quickly find an object of sight in the forward line of sight during driving, the driver's eyes instinctively make fine movements. The driver 2 during driving concentrates so as to predict any risk in the forward line of sight and quickly find a risky object; the eyeballs therefore move quickly and frequently with short stop periods of about 0.1 second to 1 second, and fluctuate finely to the extent of about 0.1 angular degree (
In normal driving, since the driver 2 concentrates on driving, the distribution of the head movement and the eyeball movement in their mutual relationship ranges widely, as shown in
As shown in
Accordingly, the head movement and the eyeball movement make it possible to detect not only inattentive driving but also the discursive condition of the driver. Note that while the determination threshold for the discursive condition is set to nearly +/-1.0 deg/sec in the above description, it may be set wider (for example, +/-2.0 deg/sec) in view of individual differences.
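The discursive-condition check — the eyeball angular velocity distribution staying within roughly +/-1.0 deg/sec over the unit time — can be sketched as follows; the sampling and windowing are assumptions, and the limit may be widened as noted above.

```python
def discursive_condition(eye_angular_velocities, limit_deg_per_sec=1.0):
    """Determine the discursive condition: over the unit time, every sampled
    eyeball angular velocity stays within +/- limit_deg_per_sec (nominally
    1.0 deg/sec; may be widened, e.g. to 2.0, for individual differences).

    eye_angular_velocities: samples (deg/sec) collected over the unit time.
    """
    return all(abs(w) <= limit_deg_per_sec for w in eye_angular_velocities)
```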
Note that in the embodiment, when the driver is determined as not being in a condition fit for driving, the method of calling the driver to attention may be visual transmitting means (information presentation), aural transmitting means (warning), or somatosensory transmitting means (vibration of the steering wheel, seat, or seat belt).
The aforementioned embodiments merely disclose typical embodiments of the present invention, and the invention is not limited thereto. Namely, it is to be understood that various changes and modifications will be apparent to those skilled in the art. Therefore, unless such changes and modifications depart from the scope of the present invention hereinafter defined, they should be construed as being included therein.
Number | Date | Country | Kind |
---|---|---|---|
2012-152363 | Jul 2012 | JP | national |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2013/004158 | Jul 2013 | US |
Child | 14583972 | US |