This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2004-092922, filed Mar. 26, 2004, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to a vehicle running state judging apparatus which judges whether a driver has become drowsy and fallen asleep at the wheel.
2. Description of the Related Art
Recently, as road networks have developed and leisure time has increased, people have more opportunities to drive. A driver is expected to stay in good mental and physical condition while driving, but a driver sometimes drives in poor health. In such a case, continuous or prolonged driving may leave the driver fatigued and unable to concentrate, so that the driver falls asleep at the wheel.
To prevent this, a camera has been mounted on a vehicle to monitor the view in front of the vehicle, recognize a white road line so as to detect meander driving, detect that the driver has fallen asleep, and warn against driving while asleep (refer to Japanese Patent No. 3039327).
According to this system, the camera captures the running zone line on the road, and meander driving is judged from that image. There is another method of judging meander driving by recognizing a white line on both sides of the road. With this method, however, if there is no white line on either side of the road, or if snow covers the entire road surface, it is impossible to judge meander driving.
It is an object of the present invention to provide a vehicle running state judging apparatus which can judge meander driving even if there is no white line on a road.
According to an aspect of the present invention, there is provided a vehicle running state judging apparatus comprising:
a camera which captures an image of a road surface in a vehicle running direction; and
a meander state detection section configured to obtain the amount of meandering based on the road surface image captured by the camera,
wherein the meander state detection section comprises:
a first gray value detection section configured to detect an edge in the gray values of a first pixel group arrayed in the horizontal direction in the road surface image captured by the camera, and to detect the gray value at the edge detecting position;
a second gray value detection section configured to detect an edge in the gray values of a second pixel group arrayed in the horizontal direction in the road surface image, and to detect the gray value at the edge detecting position;
a calculation section configured to calculate, from the edge detecting position of the first pixel group detected by the first gray value detection section and the gray value at that position, an expected edge detecting position of the second pixel group and the gray value at that position; and
a judging section configured to judge meandering of a vehicle by comparing the edge detecting position of the second pixel group detected by the second gray value detection section and the gray value at that position with the edge detecting position of the second pixel group calculated by the calculation section and the gray value at that position.
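For illustration only, the following Python sketch wires the sections enumerated above together. Every identifier is hypothetical, and the detection and calculation sections are sketched in more detail in the embodiment described below.

```python
from typing import Optional, Tuple

Edge = Tuple[int, int]  # (edge detecting position, gray value at that position)

class VehicleRunningStateJudge:
    """Structural sketch of the sections enumerated above.

    detect_a / detect_b stand in for the first and second gray value
    detection sections, and project_a_to_b for the calculation
    section; all names are illustrative, not part of the specification.
    """

    def __init__(self, detect_a, detect_b, project_a_to_b):
        self.detect_a = detect_a                # first gray value detection section
        self.detect_b = detect_b                # second gray value detection section
        self.project_a_to_b = project_a_to_b    # calculation section
        self.prev_edge_a: Optional[Edge] = None

    def step(self, frame) -> Optional[int]:
        """One control cycle: return the lateral displacement between the
        projected and the detected edge, or None when no comparison is possible."""
        dx = None
        edge_b = self.detect_b(frame)
        if self.prev_edge_a is not None and edge_b is not None:
            x_pred, _ = self.project_a_to_b(self.prev_edge_a)
            x_det, _ = edge_b
            dx = x_det - x_pred                 # judging section: compare positions
        self.prev_edge_a = self.detect_a(frame) # kept for the next cycle
        return dx
```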
Additional objects and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention, and together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
Hereinafter an embodiment of the present invention will be explained with reference to the accompanying drawings.
The image processing unit 12 is connected to a control unit 13. The control unit 13 consists mainly of a microprocessor and includes a meander state judging means 14, a first gray value detection means 15, and a second gray value detection means 16.
The camera 11 is mounted in the upper central part of the vehicle cabin, as shown in the accompanying drawings.
Now, the operation of the embodiment of the present invention configured as above will be explained with reference to the flowchart in the accompanying drawings.
First, an image in front of the vehicle is taken by the camera 11. The road image captured by the camera 11 is input to the image processing unit 12 (step S1). In this embodiment, as shown in the accompanying drawings, the road has no white line on either side of the road surface 1 and has cracks 2 in the road surface.
Assuming the horizontal direction of the CCD to be the x-axis and the vertical direction to be the y-axis, the gray value is calculated along the three pixel groups y1, y2, y3 arrayed in the x-axis direction (the horizontal-direction pixel groups: group A), and likewise along the other horizontal-direction pixel groups z1, z2, z3, which lie at different positions in the y-axis direction (group B). The gray value is defined such that white is 0, black is 255, and intermediate shades take values in between. For example, the crack 2 is darker than the rest of the road surface, so the gray value of the crack 2 is larger than the gray values of the other parts.
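By way of a minimal sketch, and assuming an 8-bit grayscale frame held in a NumPy array, the gray values along one horizontal pixel group could be read as follows. The row indices are illustrative assumptions, not values from this description.

```python
import numpy as np

# Hypothetical row indices for the two sets of horizontal pixel groups.
GROUP_A_ROWS = (100, 110, 120)   # y1, y2, y3
GROUP_B_ROWS = (160, 170, 180)   # z1, z2, z3

def gray_profile(image: np.ndarray, row: int) -> np.ndarray:
    """Gray values along one horizontal pixel group.

    `image` is assumed to be an 8-bit grayscale frame in the common
    convention (black = 0, white = 255); the scale is flipped here to
    match the convention above (white = 0, black = 255), so dark
    features such as the crack 2 take large values.
    """
    return 255 - image[row, :].astype(np.int16)
```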
Then, the control unit 13 detects the edge of each of the pixel groups A and B, that is, the position along each group at which the gray value changes sharply.
The absolute gray value (gray value level) is calculated at the position where the edge occurs (the edge detecting position). The edge detecting position on each pixel group y1-y3 of the group A and the gray value at that position are detected by the first gray value detection means 15. The edge detecting position on each pixel group z1-z3 of the group B and the gray value at that position are detected by the second gray value detection means 16. The gray values at the edge detecting positions of the groups A and B are stored in the predetermined memory at every control cycle.
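A minimal sketch of such an edge detection on one pixel group follows, assuming the gray profile from the previous sketch; the noise threshold is a hypothetical tuning value, not taken from this description.

```python
import numpy as np
from typing import Optional, Tuple

def detect_edge(profile: np.ndarray,
                min_step: int = 30) -> Optional[Tuple[int, int]]:
    """Edge on one pixel group: the position where the gray value
    changes most sharply, and the gray value at that position.

    min_step is a hypothetical noise threshold used to reject
    groups on which no clear feature appears.
    """
    if profile.size < 2:
        return None
    diff = np.abs(np.diff(profile.astype(np.int16)))
    x = int(np.argmax(diff))
    if diff[x] < min_step:
        return None                  # no usable edge on this group
    return x, int(profile[x])        # edge detecting position, gray level
```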
The edge detecting position of the group A and the gray value at that position are stored in the predetermined memory for the next processing cycle (step S5).
Next, the expected edge detecting position on the group B is calculated by displacing the edge detecting position of the group A, detected in step S3 of the previous cycle, in accordance with the vehicle speed (a calculation means) (step S6).
Because the amount by which the edge detecting position is displaced is corrected according to the vehicle speed, the edge detecting position of the group A can be reliably projected onto the group B by this calculation even when the vehicle speed changes.
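Assuming a constant pixels-to-meters mapping between the two groups (a simplification of the camera's actual perspective geometry), the speed correction could be sketched as follows; all parameters are illustrative assumptions.

```python
def cycles_to_reach_group_b(speed_mps: float, cycle_s: float,
                            row_gap_px: int, px_per_m: float) -> int:
    """Number of control cycles a road feature detected on group A
    needs to travel down the image to group B at the current speed.

    Simplified model: image rows map to road distance through a
    constant px_per_m; a real system would use the camera's
    perspective geometry. All parameters are assumed values.
    """
    if speed_mps <= 0:
        return 0                              # vehicle stopped: no projection
    distance_m = row_gap_px / px_per_m        # road distance between the groups
    per_cycle_m = speed_mps * cycle_s         # distance covered each cycle
    return max(1, round(distance_m / per_cycle_m))
```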
Next, the edge detecting position obtained in step S6 by displacing the previous group A result according to the vehicle speed, together with the gray value at that position, is compared with the edge detecting position detected on the group B and the gray value at that position (a judging means) (step S7). If there is a displacement Δx between the two positions, the vehicle has deviated laterally by that amount; this displacement is taken as the meandering amount, and its average is calculated (step S8).
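A minimal sketch of this comparison follows; the gray value tolerance used to decide that the two edges belong to the same road feature is an assumed value.

```python
from typing import Optional, Tuple

Edge = Tuple[int, int]  # (edge detecting position, gray value)

def lateral_displacement(predicted: Optional[Edge],
                         detected: Optional[Edge],
                         gray_tol: int = 20) -> Optional[int]:
    """Compare the edge projected from group A (step S6) with the
    edge actually detected on group B (step S7).

    The gray values must match within gray_tol (a hypothetical
    tolerance) before the two edges are treated as the same road
    feature; the returned Δx is the vehicle's lateral deviation.
    """
    if predicted is None or detected is None:
        return None
    (x_pred, g_pred), (x_det, g_det) = predicted, detected
    if abs(g_pred - g_det) > gray_tol:
        return None              # different feature; skip this cycle
    return x_det - x_pred        # Δx
```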
A means may also be provided for judging whether the average meandering amount calculated in step S8 exceeds a reference level, meandering being judged to have occurred when this judging means answers YES.
The reference level can be obtained by averaging the meandering amount over a certain period after the start of driving. That is, the driver is assumed not to have fallen asleep until this period has elapsed, and the meandering amounts measured within it are averaged to form the reference. The reference level may also be determined experimentally.
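A sketch of this reference level scheme, with an assumed calibration window and margin, might look as follows.

```python
class MeanderLevelJudge:
    """Judge meandering against a reference level learned at the
    start of driving, as described above.

    calib_cycles (the initial "driver is awake" window) and margin
    are assumed tuning values, not taken from the specification.
    """

    def __init__(self, calib_cycles: int = 600, margin: float = 1.5):
        self.calib_cycles = calib_cycles
        self.margin = margin
        self.history: list = []
        self.reference = None

    def update(self, dx) -> bool:
        """Feed one meandering amount; True means 'warn the driver'."""
        if dx is None:
            return False
        self.history.append(abs(dx))
        if self.reference is None:
            if len(self.history) >= self.calib_cycles:
                # Average over the initial period becomes the reference level.
                self.reference = sum(self.history) / len(self.history)
            return False
        recent = self.history[-50:]              # short moving window
        return (sum(recent) / len(recent)) > self.reference * self.margin
```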
The control unit 13 is separate from the image processing unit 12 in this embodiment, but the two may be constructed as a single unit.
As explained above, meandering of a vehicle is judged by tracking the edge detecting position on each pixel group as the road surface image captured by the camera is displaced in the advancing direction. Meandering can thus be judged even if there is no white line on the road surface.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2004-092922 | Mar 2004 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
4471825 | Kuhn et al. | Sep 1984 | A |
4482136 | Wolf et al. | Nov 1984 | A |
4496938 | Seko et al. | Jan 1985 | A |
4868752 | Fujii et al. | Sep 1989 | A |
4954962 | Evans et al. | Sep 1990 | A |
5172317 | Asanuma et al. | Dec 1992 | A |
5648755 | Yagihashi | Jul 1997 | A |
5685925 | Riquier et al. | Nov 1997 | A |
5694116 | Kojima | Dec 1997 | A |
5745031 | Yamamoto | Apr 1998 | A |
5815070 | Yoshikawa | Sep 1998 | A |
5974792 | Isobe | Nov 1999 | A |
6023227 | Yanko et al. | Feb 2000 | A |
6184781 | Ramakesavan | Feb 2001 | B1 |
6218947 | Sutherland | Apr 2001 | B1 |
6285778 | Nakajima et al. | Sep 2001 | B1 |
6366207 | Murphy | Apr 2002 | B1 |
6523591 | Billieres et al. | Feb 2003 | B1 |
6831591 | Horibe | Dec 2004 | B2 |
6845172 | Furusho | Jan 2005 | B2 |
6879890 | Matsumoto et al. | Apr 2005 | B2 |
6925206 | Akutagawa | Aug 2005 | B2 |
6950027 | Banas | Sep 2005 | B2 |
6973380 | Tange et al. | Dec 2005 | B2 |
7006667 | Akutagawa | Feb 2006 | B2 |
7054723 | Seto et al. | May 2006 | B2 |
7084772 | Oyama | Aug 2006 | B2 |
7152000 | Ihara et al. | Dec 2006 | B2 |
7190274 | Ihara et al. | Mar 2007 | B2 |
7204130 | Koram et al. | Apr 2007 | B2 |
20020042676 | Furusho | Apr 2002 | A1 |
20020061123 | Akutagawa | May 2002 | A1 |
20040080449 | Horibe | Apr 2004 | A1 |
20050203706 | Ihara et al. | Sep 2005 | A1 |
Number | Date | Country |
---|---|---|
3613669 | Oct 1987 | DE |
20016384 | Feb 2001 | DE |
59410156 | Aug 2002 | DE |
73299 | Jan 1986 | EP |
73966 | Jan 1986 | EP |
354562 | Feb 1990 | EP |
354562 | Jun 1997 | EP |
1053111 | Jan 2002 | EP |
1802496 | Jul 2007 | EP |
06-274786 | Sep 1994 | JP |
2830475 | Sep 1998 | JP |
10334393 | Dec 1998 | JP |
11013038 | Jan 1999 | JP |
11230273 | Aug 1999 | JP |
3039327 | Mar 2000 | JP |
2001289029 | Oct 2001 | JP |
2002325347 | Nov 2002 | JP |
3635403 | Apr 2005 | JP |
2005146862 | Jun 2005 | JP |
2005258936 | Sep 2005 | JP |
490164 | May 2005 | KR |
WO9518433 | Jul 1995 | WO |
WO 9925572 | May 1999 | WO |
WO 2006042248 | Apr 2006 | WO |
Number | Date | Country
---|---|---
20050232464 A1 | Oct 2005 | US |