1. Field of the Invention
The present invention generally relates to a system and method for range and lateral position measurement of a preceding vehicle.
2. Description of Related Art
Radar and stereo camera systems for adaptive cruise control (ACC) have already been introduced into the market. More recently, radar has been applied to pre-crash safety systems and collision avoidance. Typically, the range and lateral position of a preceding vehicle are measured using radar and/or stereo camera systems. Radar systems can provide very accurate range measurements. However, millimeter-wave radar systems, such as 77 GHz systems, are typically quite expensive. Laser radar is lower in cost but is not effective in adverse weather. Further, radar is generally not well suited to identifying the object or providing an accurate lateral position.
Stereo cameras can determine both the range and the identity of an object. However, these systems do not perform well in adverse weather, are typically difficult to manufacture due to the precise alignment required between the two cameras, and require two image processors.
In view of the above, it can be seen that conventional ACC systems typically do not offer a favorable cost-performance ratio even though they may meet the desired functional requirements. Accordingly, there exists a need for an improved system and method for measuring the range and lateral position of a preceding vehicle.
In satisfying the above need, as well as overcoming the enumerated drawbacks and other limitations of the related art, the present invention provides a system for determining the range and lateral position of a vehicle. The primary components of the system are a single camera and a processor. The camera is configured to view a region of interest containing a preceding vehicle and to generate an electrical image of the region. The processor is in electrical communication with the camera to receive the electrical image. To analyze the electrical image, the processor identifies a series of windows within the image, each window corresponding to a fixed physical size at a different target range. From the perspective of the camera, the vehicle appears larger when it is close to the camera than when it is farther away. Accordingly, each window is sized in the image in proportion to how an object of fixed physical size would appear to the camera at that window's target range. The processor evaluates characteristics of the electrical image within each window to identify the vehicle. For example, the size of the vehicle is compared to the size of the window to create a size ratio. A score is determined indicating the likelihood that certain characteristics of the electrical image actually correspond to the vehicle and that the vehicle is at the target range for that window.
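By way of illustration, the sketch below shows this inverse-range window sizing under a simple pinhole-camera assumption; the focal length and the physical window dimensions are hypothetical values chosen for the example, not taken from the specification.

```python
def window_size_px(target_range_m, focal_px=800.0,
                   window_width_m=2.0, window_height_m=1.5):
    """Image-plane size of a fixed physical window at a target range.

    Under a pinhole model, an object of fixed physical size projects in
    inverse proportion to its range, so a nearer window spans more
    pixels than a farther one (all parameter values are illustrative).
    """
    return (focal_px * window_width_m / target_range_m,
            focal_px * window_height_m / target_range_m)

for r in (10.0, 20.0, 40.0):  # hypothetical target ranges in meters
    w, h = window_size_px(r)
    print(f"range {r:4.0f} m -> window {w:5.1f} x {h:5.1f} px")
```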
In another aspect of the present invention, the characteristics of the electrical image evaluated by the processor include the width and height of edge segments in the image, as well as the height, width, and location of objects constructed from multiple edge segments. The position of each window in the electrical image is calculated based on the azimuth angle and the elevation angle of the camera.
In yet another aspect of the present invention, a method is provided for identifying the vehicle within the electrical image and determining its range. To simplify the image, an edge-enhancement algorithm is applied, and only characteristics of the electrical image within a particular window are evaluated. The edge-enhanced image is binarized and segmented, the segments are evaluated, and objects are constructed from multiple segments. A score is determined for each object based on criteria such as the object width, the object height, the vertical position of the object, and the segment width. Based on the score of an object, its range is estimated from the target range of the window.
Further objects, features and advantages of this invention will become readily apparent to persons skilled in the art after a review of the following description, with reference to the drawings and claims that are appended to and form a part of this specification.
A series of range windows 20, 22, 24 is defined within the electrical image, each corresponding to a fixed physical size at a different target range. The elevation angle of the lower edge of the first window 20 is calculated according to Equation (1).
Θ1 = atan(−hc/r1)   (1)

where hc is the height of the camera 12 above the road surface, r1 is the horizontal range of the window 20 from the camera 12, and the angle is expressed modulo 2π, in the range [0, 2π].

Similarly, the elevation angle of the upper edge of the first window is calculated according to Equation (2).

Θ1h = atan((hw−hc)/r1)   (2)

where hw is the height of the window, hc is the height of the camera 12 above the road surface, and r1 is the range of the window 20 from the camera 12; again the angle is expressed modulo 2π, in the range [0, 2π]. The difference, ΔΘ1 = Θ1 − Θ1h, corresponds to the height of the window in the electrical image.
The azimuth angle of the right edge of the range window 20 is calculated according to Equation (3).

Φ1 = atan(−width_w/r1) + (π/2)   (3)

Similarly, the azimuth angle of the left edge of the range window 20 is calculated according to Equation (4).

Φ1h = atan(width_w/r1) + (π/2)   (4)

where width_w is the distance from the center of the window 20 to its side edges, r1 is the horizontal range of the window 20 from the camera 12, and the angles are expressed modulo 2π, in the range [0, 2π].
The window positions for the additional windows 22, 24 are calculated according to Equations (1)-(4), substituting their respective target ranges for r1.
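For concreteness, a minimal sketch of Equations (1)-(4) follows; the camera height, window dimensions, and target ranges are illustrative assumptions, not values from the specification.

```python
import math

TWO_PI = 2.0 * math.pi

def window_angles(r1, hc=1.2, hw=1.5, width_w=1.0):
    """Bounding angles of a range window per Equations (1)-(4).

    r1      : horizontal range of the window from the camera (m)
    hc      : camera height above the road surface (m)
    hw      : window height (m)
    width_w : distance from the window center to its side edge (m)
    All angles are reduced modulo 2*pi, as in the specification.
    """
    theta1 = math.atan2(-hc, r1) % TWO_PI                      # (1) lower edge
    theta1h = math.atan2(hw - hc, r1) % TWO_PI                 # (2) upper edge
    phi1 = (math.atan2(-width_w, r1) + math.pi / 2) % TWO_PI   # (3) right edge
    phi1h = (math.atan2(width_w, r1) + math.pi / 2) % TWO_PI   # (4) left edge
    return theta1, theta1h, phi1, phi1h

# Substituting each window's target range for r1 gives the positions of
# the additional windows 22, 24.
for r in (10.0, 20.0, 40.0):  # hypothetical target ranges in meters
    print(r, tuple(round(a, 4) for a in window_angles(r)))
```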
To identify the vehicle within a window, an edge-enhancement algorithm is first applied to the electrical image, with only the characteristics of the image within the window being evaluated. The edge-enhanced image is then binarized, and the binarized image is segmented into edge segments.
Relating these segments back to the original image, Segment 42 represents the lane marking on the road. Segment 44 represents the upper portion of the left side of the vehicle. Segment 46 represents the lower left side of the vehicle. Segment 48 represents the left tire of the vehicle. Segment 50 represents the upper right side of the vehicle. Segment 52 represents the lower right side of the vehicle while segment 54 represents the right tire.
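The sketch below illustrates one way such an edge-enhancement, binarization, and segmentation stage could be realized, using a simple horizontal-gradient operator and SciPy's connected-component labeling as generic stand-ins; the specification does not name particular operators or threshold values.

```python
import numpy as np
from scipy import ndimage

def segment_window(roi, threshold=30):
    """Edge-enhance, binarize, and segment one range window.

    roi       : 2-D grayscale array covering the window
    threshold : binarization threshold (illustrative value)
    Returns (height, width, top, left) for each segment found.
    """
    # Edge enhancement: horizontal intensity gradient, which responds
    # strongly to vertical features such as vehicle sides and tires.
    edges = np.abs(np.diff(roi.astype(np.int32), axis=1))
    binary = edges > threshold                 # binarization
    labels, _ = ndimage.label(binary)          # segmentation
    return [(sl[0].stop - sl[0].start,         # segment height
             sl[1].stop - sl[1].start,         # segment width
             sl[0].start, sl[1].start)         # top-left corner
            for sl in ndimage.find_objects(labels)]

demo = np.zeros((8, 8), dtype=np.uint8)
demo[2:6, 3] = 200                             # a bright vertical stripe
print(segment_window(demo))                    # -> [(4, 2, 2, 2)]
```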
Objects are then constructed from multiple segments, and a score is determined for each object based on criteria such as the object width, the object height, the vertical position of the object, and the segment width. Based on its score, the object most likely to correspond to the preceding vehicle is identified, and the range of the vehicle is estimated from the target range of the window in which it was found.
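A hedged sketch of how such a score might be formed is given below; the specification names only the criteria, so the weights, the normalization, and the object fields are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    width: float    # object width in the image (px)
    height: float   # object height in the image (px)
    bottom: float   # vertical position of the object base (px)

def score(obj, win_width, win_height, win_bottom):
    """Score an object against its range window.

    Each term rewards agreement between the object and the fixed
    physical size the window represents at its target range; the
    weights and normalization are illustrative only.
    """
    size_ratio = obj.width / win_width       # near 1.0 at the true range
    height_ratio = obj.height / win_height
    position_error = abs(obj.bottom - win_bottom) / win_height
    return 0.5 * size_ratio + 0.3 * height_ratio - 0.2 * position_error

candidate = DetectedObject(width=80, height=60, bottom=118)
print(score(candidate, win_width=100, win_height=75, win_bottom=120))
```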
As a person skilled in the art will readily appreciate, the above description is meant as an illustration of an implementation of the principles of this invention. This description is not intended to limit the scope or application of this invention, in that the invention is susceptible to modification, variation and change without departing from the spirit of this invention, as defined in the following claims.