This application is based on and incorporates herein by reference Japanese Patent Application No. 2004-65922 filed on Mar. 9, 2004.
1. Field of the Invention
The present invention relates to a vehicle state sensing system and a vehicle state sensing method for sensing a moving state of a vehicle based on image data of an image, which is captured through an on-vehicle camera.
2. Description of Related Art
One previously proposed system recognizes a physical state (e.g., a curvature, a width) of a road lane, along which a vehicle is traveling, based on image data of an image, which is captured by, for example, an on-vehicle camera (see, for example, Japanese Unexamined Patent Publication No. H09-263200). When the vehicle approaches a curve of the road lane at an excessively high speed, the system determines a possibility of moving out of the road lane based on the recognized result.
In the above system, the surface of the road lane, i.e., the state of the surrounding environment around the traveling vehicle is sensed based on the image data of the image, which is captured by the on-vehicle camera. However, there has not been proposed a system, which determines a moving state of the vehicle (e.g., a momentum of translational movement of the vehicle) based on the image data of the image, which is captured by the on-vehicle camera.
The present invention is made in view of the above point. Thus, it is an objective of the present invention to provide a vehicle state sensing system and a vehicle state sensing method for sensing a moving state of a vehicle based on image data of an image, which is captured by an image recognizing means, such as an on-vehicle camera.
To achieve the objective of the present invention, there is provided a vehicle state sensing system that includes an image recognizing means, a vehicle speed sensing means and a vehicle state determining means. The image recognizing means is installed in a vehicle. The image recognizing means is for capturing an image of an outside visual scene that is outside of the vehicle and is for outputting image data of the captured image. The vehicle speed sensing means is for outputting a measurement signal, which corresponds to a moving speed of the vehicle. The vehicle state determining means is for determining a state of the vehicle based on the measurement signal of the vehicle speed sensing means and an optical flow of a predetermined point of the captured image, which is captured during traveling of the vehicle. The optical flow of the predetermined point is obtained based on the image data of the captured image.
To achieve the objective of the present invention, there is also provided a vehicle state sensing method. According to the method, a moving speed of a vehicle is obtained. Also, image data of an image of an outside visual scene is obtained. The image data of the image of the outside visual scene is outside of the vehicle and is captured by an image recognizing means during traveling of the vehicle. An optical flow of a predetermined point of the captured image is obtained based on the image data. A state of the vehicle is determined based on the moving speed of the vehicle and the optical flow of the predetermined point.
The invention, together with additional objectives, features and advantages thereof, will be best understood from the following description, the appended claims and the accompanying drawings in which:
The on-vehicle camera 1 captures an image of a front visual scene on a front side of a vehicle. The on-vehicle camera 1 continuously captures the image of the front visual scene during traveling of the vehicle and transmits image data of the captured image to the ECU 3.
The wheel speed sensor 2 outputs a measurement signal, which corresponds to a wheel speed of the vehicle. A vehicle speed (i.e., a moving speed of the vehicle) V is determined based on the measurement signal of the wheel speed sensor 2. A method for computing such a vehicle speed V is well known in the field of brake systems, so that details of the method for computing the vehicle speed V will not be described.
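Although the computation of the vehicle speed V is left undescribed above as being well known, a minimal sketch may clarify the relationship between the wheel speed signal and V. The tire radius, the pulse count per revolution, and the function name below are illustrative assumptions, not values from this application:

```python
import math

# Illustrative vehicle-specific constants (assumptions, not from the application)
TIRE_RADIUS_M = 0.30     # effective rolling radius of the tire [m]
PULSES_PER_REV = 48      # wheel speed sensor pulses per wheel revolution


def vehicle_speed(pulse_count: int, dt_s: float) -> float:
    """Return the vehicle speed V [m/s] from pulses counted over dt_s seconds."""
    revs_per_s = pulse_count / (PULSES_PER_REV * dt_s)
    omega = 2.0 * math.pi * revs_per_s   # wheel angular speed [rad/s]
    return omega * TIRE_RADIUS_M         # V = r * omega
```

In practice the signals of plural wheels would be filtered and averaged, as is done in brake control systems.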
The ECU 3 senses, i.e., determines the moving state (driving state) of the vehicle based on the image data of the on-vehicle camera 1 and the measurement signal of the wheel speed sensor 2. Specifically, the ECU 3 obtains a yaw rate γ and a slip angle β as parameters, which indicate the moving state of the vehicle. More specifically, the yaw rate γ indicates a vehicle rotational angular speed, i.e., a rotational angular speed of the vehicle about a vehicle center axis. The slip angle β indicates a transverse moving speed of the vehicle, which is under translational movement.
The yaw rate γ and the slip angle β are obtained based on the image data of the captured image, which is captured through the on-vehicle camera 1. A way of obtaining the yaw rate γ and the slip angle β will be described with reference to the accompanying drawings.
The optical flow indicates the vehicle transverse moving speed that is measured at the point of regard (a predetermined point) in the front visual scene, which is viewed through the on-vehicle camera 1. The optical flow corresponds to a vector that connects two points, i.e., the former point of regard and the latter point of regard, in the image coordinate system. Here, the latter point of regard is the point to which the former point of regard has moved upon elapse of a predetermined time period. The inventors of the present invention have found that the optical flow contains the physical quantities, which indicate the moving state of the vehicle. Based on this finding, the inventors of the present invention have proposed to obtain the yaw rate γ and the slip angle β based on the optical flow.
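The vector described above can be sketched as a simple displacement-over-time computation. The coordinate values and the function name below are illustrative assumptions; only the definition (displacement of the point of regard divided by the elapsed time) is taken from the text:

```python
def optical_flow(p_former, p_latter, dt):
    """Optical flow vector of the point of regard.

    p_former, p_latter: (x, y) image coordinates of the point of regard
    in two frames separated by dt seconds. The flow is the displacement
    divided by the elapsed time.
    """
    return ((p_latter[0] - p_former[0]) / dt,
            (p_latter[1] - p_former[1]) / dt)


# The transverse moving speed Vy_flow used later in Equation (1) is the
# lateral component of this vector.
flow = optical_flow((100.0, 50.0), (102.0, 53.0), 0.1)
vy_flow = flow[1]
```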
The physical significance of the optical flow will be described.
The optical flows, which are respectively obtained for the translational movement of the vehicle and for the rotational movement of the vehicle, are shown in the accompanying drawings.
Based on these results, it is clearly understood that the optical flow differs between the translational movement of the vehicle and the rotational movement of the vehicle. This means that the optical flow can indicate whether the subject movement of the vehicle is the translational movement or the rotational movement. Thus, the translational movement and the rotational movement of the vehicle can be sensed through the analysis of the optical flow of the image data of the captured image, which is captured through the on-vehicle camera 1.
With reference to the accompanying drawings, a point P of regard is set in the front visual scene at a distance d from the center of mass of the vehicle. During the traveling of the vehicle, the transverse component Vy_flow of the optical flow of the point P of regard satisfies the following Equation (1).
d·γ+Vβ=Vy_flow (1)
The above Equation (1) indicates the following matter. That is, the transverse component of the optical flow of the point P of regard is expressed by the sum of the transverse moving speed Vβ of the vehicle, which is caused by the side slip of the vehicle, and the transverse moving speed d·γ of the point P of regard, which is caused by the rotational movement of the vehicle about the center of mass of the vehicle. In other words, the transverse moving speed component of the optical flow indicates the relationship between the side slip of the vehicle and the rotational movement of the vehicle.
In a case of steady circular turn of the vehicle, the slip angle β and the yaw rate γ are defined in a manner described below.
In the above Equation (2), “A” denotes the stability factor, which is expressed by the following Equation (3) and is vehicle specific.
The turning radius ρ can be expressed by the following Equation (4) based on the vehicle speed V and the yaw rate γ.
ρ=V/γ (4)
Also, based on the Equations (2) and (4), the yaw rate γ can be expressed by the following Equation (5).
Similarly, the slip angle β at the time of the steady circular turn of the vehicle can be expressed by the following Equation (6).
In the above Equation (6), “K” is defined by the following Equation (7).
Thus, based on the Equations (5) and (6), the following Equation (8) is derived.
In the above Equation (8), α=(1−KV²)Lr/d. The transverse moving speed d·γ of the point of regard, which is caused by the rotational angular speed, is proportional to the transverse moving speed Vβ, which is caused by the side slip. That is, Equation (8) indicates that the transverse moving speed Vβ increases relative to the transverse moving speed d·γ at the predetermined gradient α.
Because of the above relationships, the transverse moving speed d·γ can be expressed by the following Equation (9).
Also, the transverse moving speed Vβ can be expressed by the following Equation (10).
Therefore, the yaw rate γ can be expressed by the following Equation (11).
Also, the slip angle β can be expressed by the following Equation (12).
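The bodies of Equations (8) to (12) are carried only in the drawings of the published application. The following is a reconstruction, not a quotation of the originals; it follows from Equation (1) and the gradient α=(1−KV²)Lr/d stated in the text, assuming the steady-turn forms γ=V/ρ and β=(1−KV²)Lr/ρ for Equations (5) and (6):

```latex
% Reconstruction (assumed forms), consistent with Equation (1) and with
% the gradient \alpha = (1 - K V^2) L_r / d given in the text.
V\beta = \alpha \, d\gamma                         \tag{8}
% Substituting (8) into Equation (1), d\gamma + V\beta = V_{y\_flow}:
d\gamma = \frac{V_{y\_flow}}{1+\alpha}             \tag{9}
V\beta  = \frac{\alpha \, V_{y\_flow}}{1+\alpha}   \tag{10}
% Dividing (9) by d and (10) by V:
\gamma  = \frac{V_{y\_flow}}{d\,(1+\alpha)}        \tag{11}
\beta   = \frac{\alpha \, V_{y\_flow}}{V\,(1+\alpha)} \tag{12}
```

As a consistency check, d·γ + V·β recovers Vy_flow, in agreement with Equation (1).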
In the above described manner, the yaw rate γ and the slip angle β can be obtained based on the optical flow.
Thus, in the present embodiment, when the ECU 3 receives the signal from the wheel speed sensor 2 and the image data from the on-vehicle camera 1, the ECU 3 determines the vehicle speed V and also obtains the optical flow. Then, the ECU 3 applies the vehicle speed and the optical flow as “V” and “Vy_flow” of the above Equations (11) and (12) to determine the yaw rate γ and the slip angle β.
The factors of the Equations (11), (12) other than "V" and "Vy_flow" should be determined based on the subject vehicle type and the subject tire type. Thus, the yaw rate γ and the slip angle β can be obtained based on "V" and "Vy_flow".
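The computation described above can be sketched as follows. The vehicle-specific constants (d, Lr, K) below are illustrative assumptions, not values from this application, and the equation forms follow the reconstruction of Equations (11) and (12) from Equation (1) and the gradient α:

```python
# Illustrative vehicle-specific constants (assumptions, not from the application)
D_M = 10.0     # distance d from the center of mass to the point of regard [m]
LR_M = 1.4     # distance Lr from the center of mass to the rear axle [m]
K = 0.0015     # vehicle/tire-specific constant of Equation (7) [s^2/m^2]


def yaw_rate_and_slip(v: float, vy_flow: float):
    """Return (gamma, beta) from the vehicle speed V [m/s] and the
    transverse optical flow Vy_flow, per Equations (11) and (12)."""
    alpha = (1.0 - K * v * v) * LR_M / D_M        # gradient of Equation (8)
    gamma = vy_flow / (D_M * (1.0 + alpha))       # Equation (11): yaw rate [rad/s]
    beta = alpha * vy_flow / (v * (1.0 + alpha))  # Equation (12): slip angle [rad]
    return gamma, beta
```

Note that α depends on V, so it is recomputed for every sample; d·γ + V·β reproduces Vy_flow, which matches Equation (1).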
Next, the vehicle state sensing system of the present embodiment will be described with reference to
First, when the image, which indicates the scene outside the vehicle, is captured through the on-vehicle camera 1, the image data of the captured image is transmitted to the ECU 3. When the wheel speed sensor 2 outputs the measurement signal, which indicates the rotation of the corresponding wheel, the measurement signal is transmitted from the wheel speed sensor 2 to the ECU 3.
In this way, the ECU 3 performs the vehicle state sensing process shown in the accompanying drawings. First, the optical flow of the point of regard is extracted from the image data of the captured image.
Next, at step 110, the vehicle speed at the time of extracting the optical flow is determined based on the measurement signal of the wheel speed sensor 2.
Then, at step 120, the yaw rate γ and the slip angle β are computed by using the extracted optical flow and the vehicle speed as "Vy_flow" and "V" in the above Equations (11) and (12). In this way, the yaw rate γ and the slip angle β are obtained. By repeating the above process, the yaw rate γ and the slip angle β can always be obtained during the traveling of the vehicle.
As described above, in the vehicle state sensing system of the present embodiment, the image of the front visual scene on the front side of the vehicle is captured by the on-vehicle camera 1. Then, the optical flow is extracted from the image data of the captured image to obtain the yaw rate γ and the slip angle β. In this way, the state of the vehicle, such as the translational movement of the vehicle or the rotational movement of the vehicle about the center of mass, can be sensed based on the image data of the image, which is captured through the on-vehicle camera 1.
In general, humans outperform computers in the capability of recognizing a specific subject, such as a white line, a vehicle, a pedestrian or an obstacle, in its background scene in the captured image, which is captured through the on-vehicle camera 1. However, computers outperform humans in the capability of exactly monitoring the movement of the vehicle relative to the surrounding environment. Thus, as described above, when the moving state of the vehicle is sensed based on the optical flow through use of the ECU 3, the moving state of the vehicle can be more precisely sensed. Based on the sensed result of the moving state of the vehicle, various vehicle driving control operations can be performed.
In the above embodiment, the measurement signal of the wheel speed sensor 2 is used to obtain the vehicle speed. However, the vehicle speed may alternatively be determined based on a measurement signal, which is outputted from a vehicle speed sensor. Furthermore, in the above embodiment, the image of the front visual scene on the front side of the vehicle is captured by the on-vehicle camera 1. However, the present invention is not limited to the image of the front visual scene on the front side of the vehicle and is equally applicable to an image of any other appropriate outside visual scene, which is taken in any other direction and includes the translational movement of the vehicle in the transverse direction and the rotational movement of the vehicle about the center of mass.
Additional advantages and modifications will readily occur to those skilled in the art. The invention in its broader terms is therefore not limited to the specific details, representative apparatus, and illustrative examples shown and described.
Number | Date | Country | Kind
---|---|---|---
2004-065922 | Mar 2004 | JP | national

Number | Name | Date | Kind
---|---|---|---
5323152 | Morita | Jun 1994 | A
5757949 | Kinoshita et al. | May 1998 | A
6535114 | Suzuki et al. | Mar 2003 | B1
6840343 | Mattson et al. | Jan 2005 | B2
20020001398 | Shimano et al. | Jan 2002 | A1
20030030546 | Tseng | Feb 2003 | A1

Number | Date | Country
---|---|---
1 033 085 | Sep 2000 | EP
9-134499 | May 1997 | JP
9-263200 | Oct 1997 | JP
WO 0139120 | May 2001 | WO

Number | Date | Country
---|---|---
20050201593 A1 | Sep 2005 | US