1. Field of the Invention
The present invention generally relates to a system and method for range and lateral position measurement of a preceding vehicle.
2. Description of Related Art
Radar and stereo camera systems for adaptive cruise control (ACC) have already been introduced into the market. More recently, radar has also been applied to pre-crash safety systems and collision avoidance. Typically, the range and lateral position of a preceding vehicle are measured using radar and/or stereo camera systems. Radar systems can provide very accurate range. However, millimeter-wave radar systems, such as 77 GHz systems, are typically quite expensive. Laser radar is lower in cost but has the drawback of requiring moving parts to scan the laser across the field of view. Furthermore, the laser is less effective in adverse weather conditions, which may affect the reflectivity of the preceding vehicle. For example, if mud covers the reflectors of the preceding vehicle, its reflectivity is reduced, which diminishes the effectiveness of the laser. Finally, radar is generally not well suited to identifying the object or providing an accurate lateral position.
Stereo cameras can determine both the range and identity of an object. However, such systems are typically difficult to manufacture because of the precise alignment required between the two cameras, and they require two image processors.
In view of the above, it can be seen that conventional ACC systems typically do not offer a favorable cost-performance ratio, even though they may meet the desired functional requirements. Further, it is apparent that there exists a need for an improved system and method for measuring the range and lateral position of a preceding vehicle.
In satisfying the above need, as well as overcoming the enumerated drawbacks and other limitations of the related art, the present invention provides a system for determining the range and lateral position of a vehicle. The primary components of the system include a single camera, at least one sonar sensor, and a processor. The camera is configured to view a first region of interest containing a preceding vehicle and to generate an electrical image of the region. The processor is in electrical communication with the camera to receive the electrical image. To analyze the electrical image, the processor identifies a series of windows within the image, each window corresponding to a fixed physical size at a different target range. For example, from the perspective of the camera, the vehicle will appear larger when it is close to the camera than when it is farther away. Accordingly, each window is sized in the image proportionally to how it would appear to the camera at its target range. The processor evaluates characteristics of the electrical image within each window to identify the vehicle. For example, the size of the vehicle is compared to the size of the window to create a size ratio. A score is determined indicating the likelihood that certain characteristics of the electrical image actually correspond to the vehicle and that the vehicle is at the target range for that window.
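The window/size-ratio idea described above can be sketched numerically. The following is a minimal illustration assuming a simple pinhole camera; the focal length in pixels and the use of a 4 m window width are illustrative assumptions for this example, not values taken from the specification.

```python
# Hypothetical sketch of the fixed-size range windows and the size ratio.
# Assumes a pinhole camera model; FOCAL_PX is an assumed value.

WINDOW_WIDTH_M = 4.0   # fixed physical window width (about 4 m, per the description)
FOCAL_PX = 800.0       # assumed camera focal length in pixels

def window_width_px(target_range_m, focal_px=FOCAL_PX):
    """Apparent width (pixels) of the fixed-size window at a given target range."""
    return focal_px * WINDOW_WIDTH_M / target_range_m

def size_ratio(vehicle_width_px, target_range_m):
    """Ratio of the detected vehicle width to the window width at this range.

    A ratio near 1.0 suggests the vehicle is near this window's target range;
    a ratio well below 1.0 suggests it is farther away than the target range.
    """
    return vehicle_width_px / window_width_px(target_range_m)
```

Note how the window shrinks in the image as the target range grows, mirroring how a closer vehicle appears larger to the camera.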
The sonar sensor is configured to view a second region of interest. The processor receives data indicating if the preceding vehicle is in the second field of view. If the processor determines that the preceding vehicle is within the second field of view, the processor will calculate the range of the preceding vehicle based on the data generated by the sonar sensor.
In another aspect of the present invention, the characteristics of the electrical image evaluated by the processor include the width and height of edge segments in the image, as well as the height, width, and location of objects constructed from multiple edge segments. The position of each window in the electrical image is calculated based on the azimuth and elevation angles of the camera.
In yet another aspect of the present invention, a method is provided for identifying the vehicle within the electrical image and using a sonar signal to determine the vehicle range. To simplify the image, an edge-enhancement algorithm is applied to it. Only characteristics of the electrical image within a particular window are evaluated. The edge-enhanced image is trinarized and segmented. The segments are evaluated, and objects are constructed from multiple segments. A score is determined for each object based on criteria such as the object width, object height position, object height, and segment width. Based on the width ratio, the range and lateral position of the object are estimated from the target range of the window. However, if the sonar signal indicates that the object is closer than estimated from the electrical image, the sonar signal is used to estimate the range of the object.
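The trinarize-and-segment step above can be illustrated with a short sketch. This is not the patented implementation; it assumes a one-dimensional edge response and a single symmetric threshold, both of which are simplifications introduced for this example.

```python
# Illustrative sketch (assumptions noted above) of trinarization and
# segmentation of an edge-enhanced signal.

def trinarize(edge_values, threshold):
    """Map each edge response to +1, -1, or 0 (positive edge, negative edge, no edge)."""
    out = []
    for v in edge_values:
        if v > threshold:
            out.append(1)
        elif v < -threshold:
            out.append(-1)
        else:
            out.append(0)
    return out

def segments(trinary):
    """Group consecutive identical non-zero values into (value, start, length) runs."""
    runs = []
    i = 0
    while i < len(trinary):
        if trinary[i] != 0:
            j = i
            while j < len(trinary) and trinary[j] == trinary[i]:
                j += 1
            runs.append((trinary[i], i, j - i))
            i = j
        else:
            i += 1
    return runs
```

Objects would then be built from such runs and scored, as the description states, on width, height, and position.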
Further objects, features and advantages of this invention will become readily apparent to persons skilled in the art after a review of the following description, with reference to the drawings and claims that are appended to and form a part of this specification.
Referring now to
The optical image received by the camera 12 is converted to an electrical image that is provided to the processor 14. To filter out unwanted distractions in the electronic image and aid in determining the range of the vehicle 18, the processor 14 calculates the position of multiple windows 20, 22, 24 within the first region of interest 16. The windows 20, 22, 24 are located at varying target ranges from the camera 12. Each window corresponds to a predetermined physical size (about 4×2 m as shown), which may correspond to the size of a typical vehicle. To provide increased resolution, the windows 20, 22, 24 are spaced closer together and the number of windows is increased. Although the system 10, as shown, is configured to track a vehicle 18 preceding the system 10, it is fully contemplated that the camera 12 could be directed to the side or rear of the system 10 to track a vehicle 18 approaching from other directions.
Now referring to
Now referring to
θ1 = atan(−r1/hc)  (1)
Where hc is the height of the camera 12 above the road surface, r1 is the horizontal range of the window 20 from the camera 12, and the angle is taken in the range [0, π].
Similarly, the upper edge of the first window is calculated based on Equation (2).
θ1h = atan(r1/(hw − hc))  (2)
Where hw is the height of the window, hc is the height of the camera 12 above the road surface, r1 is the range of the window 20 from the camera 12, and the angle is taken in the range [0, π]. The difference, Δθ1 = θ1 − θ1h, corresponds to the height of the window in the electronic image.
Now referring to
φ1 = atan(−width_w/(2*r1)) + (π/2)  (3)
Similarly, the left edge of the range window 20 is calculated according to Equation (4).
φ1h = atan(width_w/(2*r1)) + (π/2)  (4)
Where width_w is the width of the window 20 (so that width_w/2 is the distance from the center of the window 20 to each of its side edges), r1 is the horizontal range of the window 20 from the camera 12, and the angle is taken in the range [−π/2, π/2].
The window positions for the additional windows 22, 24 are calculated according to Equations (1)-(4), substituting their respective target ranges for r1.
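Equations (1) through (4) can be evaluated together for a given target range. The sketch below does so numerically; the camera height hc, window height hw, and window width width_w are illustrative default values, not values taken from the specification.

```python
import math

# Numerical sketch of Equations (1)-(4) for one range window.
# hc, hw, and width_w defaults are assumed example values.

def window_angles(r1, hc=1.2, hw=2.0, width_w=4.0):
    """Return (theta1, theta1h, phi1, phi1h) bounding a window at range r1 (meters)."""
    theta1 = math.atan(-r1 / hc) % math.pi            # Eq. (1): lower edge, taken in [0, pi]
    theta1h = math.atan(r1 / (hw - hc)) % math.pi     # Eq. (2): upper edge, taken in [0, pi]
    phi1 = math.atan(-width_w / (2 * r1)) + math.pi / 2    # Eq. (3): one side edge
    phi1h = math.atan(width_w / (2 * r1)) + math.pi / 2    # Eq. (4): other side edge
    return theta1, theta1h, phi1, phi1h
```

By symmetry of Equations (3) and (4), the two side-edge angles sum to π, placing them symmetrically about the camera's forward direction (π/2).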
Now referring to
Now referring to
Now referring to
Now referring to
Now referring to
Relating these segments back to the original image, segments 42 and 43 represent the lane markings on the road. Segment 44 represents the upper portion of the left side of the vehicle. Segment 46 represents the lower left side of the vehicle. Segment 48 represents the left tire of the vehicle. Segment 50 represents the upper right side of the vehicle. Segment 52 represents the lower right side of the vehicle, while segment 54 represents the right tire.
Still referring to
Now referring to
Now referring to
Referring to
In order to take into account situations as described above, the processor 14 receives data from the sonar sensor 15. The sonar sensor 15 has a shorter range than the camera 12, as illustrated by the second region of interest 17. The inherent characteristics of the sonar sensor 15, which are well known in the art, allow it to detect large objects, such as the larger vehicle 19, that would otherwise go undetected in the processed image captured by the camera 12. If the processor 14 receives data from the sonar sensor 15 indicating that a vehicle is present in the first region of interest but undetected by the camera 12, the processor 14 will use the data generated by the sonar sensor 15 to calculate the range of the larger vehicle 19.
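The fusion rule described here and in the summary (use the sonar range when sonar reports a nearer object, or when the camera misses the vehicle entirely) can be sketched as follows. The use of `None` as a "no detection" sentinel is an assumption of this example.

```python
# Hedged sketch of the camera/sonar range selection described above.
# None stands in for "no detection" (an assumption of this example).

def fused_range(camera_range, sonar_range):
    """Select a range estimate, favoring sonar when it reports a closer object."""
    if sonar_range is None:
        return camera_range            # no sonar return: trust the camera estimate
    if camera_range is None:
        return sonar_range             # camera missed the vehicle: use sonar
    return min(camera_range, sonar_range)  # sonar wins only when it is closer
```

This matches the stated behavior: the image-based estimate is the default, and sonar overrides it for close, camera-undetected, or nearer-than-estimated objects.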
As a person skilled in the art will readily appreciate, the above description is meant as an illustration of an implementation of the principles of this invention. This description is not intended to limit the scope or application of this invention, in that the invention is susceptible to modification, variation and change without departing from the spirit of this invention, as defined in the following claims.
Number | Date | Country
---|---|---
20070035385 A1 | Feb 2007 | US