The present invention relates to an image acquisition system and a method for distance determination between an image acquisition system and a luminous object moving relative to the image acquisition system.
An image acquisition system may be used in motor vehicles in order to obtain images of the vehicle's surroundings and, in connection with a driver assistance system, to make it easier for the driver to manage the vehicle. An image acquisition system of this kind encompasses at least one image sensor and an optical system, associated with said image sensor, that images onto the image sensor an imaged field of the vehicle's surroundings.

One task of any such driver assistance system is precise measurement of distance, since it is only with a knowledge of accurate distance values that optically based lane and distance monitoring systems, having functions such as, for example, lane departure warning (LDW) and lane keeping support (LKS), can function with sufficient reliability.

Conventionally, image acquisition systems using two cameras that generate a stereo image pair of an object may be used, the contents of one image being slightly shifted with respect to the other image. This shift is referred to as a disparity. If the arrangement of the cameras is known, the distance of the object from the cameras can be deduced from the measured disparity. With an image acquisition system having only one image sensor, however, the disparity can no longer be readily determined, since no stereo image is generated.

The image sensors used with image acquisition systems of this kind must process a wide range of illumination intensities so that they can still supply usable output signals on the one hand in bright sunlight and on the other hand in dimly illuminated tunnels. Whereas with conventional image sensors the exposure sensitivity often follows a preset linear or logarithmic characteristic curve, image sensors have been described, for example, in German Patent Application No. DE 103 01 898 A1, in which this characteristic curve is individually adjustable in individual linear segments.
A characteristic curve of this kind correlates the absolute brightness of an object with the grayscale value in the image obtained of the object.
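By way of illustration only (the patent itself discloses no program code), a characteristic curve assembled from linear segments can be sketched as a function that interpolates grayscale values between adjustable brightness breakpoints. All breakpoint and grayscale values below are hypothetical placeholders, not values from the disclosure.

```python
# Sketch of a segment-wise linear characteristic curve: absolute object
# brightness is mapped to a grayscale value by linear interpolation
# between adjustable breakpoints (as in the adjustable curves of
# DE 103 01 898 A1; the specific numbers here are invented).

def make_segmented_curve(breakpoints, gray_values):
    """Return a brightness -> grayscale mapping that is linear in segments.

    breakpoints: ascending brightness values where the slope may change.
    gray_values: grayscale value assigned at each breakpoint.
    """
    def curve(brightness):
        # clamp below the first and above the last breakpoint
        if brightness <= breakpoints[0]:
            return gray_values[0]
        if brightness >= breakpoints[-1]:
            return gray_values[-1]
        # locate the enclosing segment and interpolate linearly within it
        for i in range(len(breakpoints) - 1):
            lo, hi = breakpoints[i], breakpoints[i + 1]
            if lo <= brightness <= hi:
                frac = (brightness - lo) / (hi - lo)
                return gray_values[i] + frac * (gray_values[i + 1] - gray_values[i])
    return curve

# Example: three segments covering a wide dynamic range, with a steep
# slope at low brightness and a flat slope at high brightness.
k1 = make_segmented_curve([0.0, 10.0, 1000.0, 100000.0],
                          [0.0, 64.0, 192.0, 255.0])
```

Compressing the bright end of the range in this way is what lets a single sensor deliver usable grayscale values both in sunlight and in a dark tunnel.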
German Patent No. DE 4332612 A1 describes an external viewing method for motor vehicles that is characterized by the following steps: acquiring an external view from the driver's own moving motor vehicle; sensing the motion of an individual point in two images as an optical flow, one of the two images being acquired at an earlier point in time and the other at a later point in time; and monitoring a correlation of the driver's own motor vehicle with respect to at least a preceding motor vehicle or an obstacle on the road, a hazard level being evaluated as a function of the magnitude and location of an optical-flow vector derived from a point on at least the preceding motor vehicle, a following motor vehicle, or the obstacle on the road.

Taking into account the fact that the optical flow becomes greater as the distance from the driver's own vehicle to the preceding motor vehicle or obstacle becomes smaller, or as the relative velocity becomes greater, this conventional method is designed so that the hazard can be evaluated on the basis of the magnitude of an optical flow derived from a point on a preceding motor vehicle or on an obstacle on the road. It is therefore not necessary to install a distance measuring instrument in order to measure the distance to a preceding motor vehicle.
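The relationship this conventional method relies on (a larger flow vector means a smaller distance or a larger relative velocity, hence a higher hazard) can be sketched roughly as follows. The function names and the threshold values are hypothetical, not taken from DE 4332612 A1.

```python
# Illustrative sketch: hazard evaluation from the magnitude of the
# optical-flow vector of one image point, without a distance sensor.
# Thresholds (in pixels per frame pair) are invented placeholders.

import math

def flow_magnitude(p_earlier, p_later):
    """Optical flow of one point between an earlier and a later image."""
    dx = p_later[0] - p_earlier[0]
    dy = p_later[1] - p_earlier[1]
    return math.hypot(dx, dy)

def hazard_level(flow_mag, warn=2.0, critical=8.0):
    """Larger flow -> closer object or higher closing speed -> higher hazard."""
    if flow_mag >= critical:
        return "critical"
    if flow_mag >= warn:
        return "warning"
    return "low"

# A point on a preceding vehicle moving 3 px right and 4 px down
# between the two acquisitions has a flow magnitude of 5 px.
level = hazard_level(flow_magnitude((100.0, 50.0), (103.0, 54.0)))
```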
An object of the present invention is to improve the performance of an onboard image acquisition system interacting with a driver assistance system.
This object may be achieved by an example image acquisition system having an image sensor whose characteristic curve is assembled from linear segments, the system determining the distance of luminous objects imaged by the image sensor.
An example method for determining the distance between an image acquisition system encompassing an image sensor and a luminous object moving relative to the image acquisition system, includes:
The present invention may make possible a determination of the distance of an object that has been imaged by an image acquisition system having only one image sensor using monoscopic technology. The example embodiment of the present invention exploits in this context the recognition that when a luminous object is imaged with an image sensor having a characteristic curve that is linear in segments, “smearing” or tailing occurs in the image of the object. The closer the luminous object is to the image sensor, the greater this smearing. Luminous objects of interest are, in particular, the lights of motor vehicles, such as the taillights of preceding motor vehicles or the headlights of motor vehicles approaching from the opposite direction. For night driving or when viewing conditions are poor because of weather, the aforesaid lights are often the only parts of other vehicles that are in any way detectable. With the present invention, it may be possible to determine the distance of these lights from the image sensor of the vehicle, and thus also to enable an assessment of risk. The closer a luminous object is located to the motor vehicle, the greater the potential risk of a collision.
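As a rough illustration of why the smearing grows as a light comes closer, a simple pinhole-camera approximation can be used. This model and its numbers are assumptions of the sketch, not a formula from the present disclosure: for a light at distance Z moving laterally with relative speed v, the image point moves at roughly f·v/Z, so during an exposure of length t the smear spans about f·v·t/Z.

```python
# Pinhole-camera approximation (an assumed model, for illustration only):
# smear length on the sensor is inversely proportional to object distance.

def smear_length(focal_mm, rel_speed_m_s, exposure_s, distance_m):
    """Approximate smear length, in the same units as focal_mm."""
    return focal_mm * rel_speed_m_s * exposure_s / distance_m

# The same taillight (lateral relative speed 10 m/s, 20 ms exposure,
# 6 mm focal length) seen at 5 m and at 50 m:
near = smear_length(6.0, 10.0, 0.02, 5.0)   # longer smear
far = smear_length(6.0, 10.0, 0.02, 50.0)   # ten times shorter
```

The tenfold distance reduces the smear tenfold, which is the monotonic relation the risk assessment exploits.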
The present invention is explained below with reference to example embodiments depicted in the figures.
An example embodiment of the present invention is described below. It proceeds from an image acquisition system, in particular a vehicle-based one, having only one image sensor, preferably a CMOS camera. The example embodiment further proceeds from the fact that, for optimum utilization of the performance of the CMOS camera used as an image sensor, a characteristic curve that is linear in segments is used. A characteristic curve K1 of this kind is depicted by way of example in
t_ta = Δt / (s_c − 1.0).
The value s_c = 1 means that no change in size is taking place. It is at precisely this point that tailing (smearing over time) and the camera control parameters come into play. The integration times of the pixels enter into the calculation of Δt; the tailing, i.e., the geometry of the smearing, enters into the change in relative size. The aforementioned smearing of a sensed luminous object represents, in principle, an image of the object's motion during the acquisition time. This motion is usually determined in image sequences by correlating corresponding image features, and is referred to as "optical flow." If the vehicle speed is known, the distance of an object sensed by the image acquisition system can be determined on the basis of this optical flow. Instead of determining the optical flow from an image sequence, here the optical flow can be measured directly. The control times and control thresholds of the image acquisition system can be used to infer the relative distance of the object, and also to perform a risk assessment: for example, a greater risk can be inferred for a closer object. The tailing or "smearing" in the context of sensing a luminous object with the image sensor of the image acquisition system is explained below with reference to
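The relation tta = Δt/(sc − 1.0) above can be illustrated with a short sketch. The function name chosen here is only a label for the quantity the text calls tta, and the example values are hypothetical.

```python
# Sketch of tta = Δt / (sc - 1.0): from the relative size change sc of a
# luminous object's image between two control times separated by Δt,
# estimate the remaining time until the object reaches the camera.

def time_to_adjacency(delta_t, scale_change):
    """delta_t: time between the two control times (s).
    scale_change: image size at the later time relative to the earlier one."""
    if scale_change <= 1.0:
        # sc = 1 means no change in size: no finite estimate is possible
        return float("inf")
    return delta_t / (scale_change - 1.0)

# A light whose image grows by 5% over 0.1 s yields roughly 2 s:
t = time_to_adjacency(0.1, 1.05)
```

Combined with a known vehicle speed, such a time estimate translates directly into a relative distance, which is what permits the risk assessment described above.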
Number | Date | Country | Kind |
---|---|---|---|
10 2006 027 121 | Jun 2006 | DE | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/EP2007/053441 | 4/10/2007 | WO | 00 | 4/17/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2007/144213 | 12/21/2007 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4257703 | Goodrich | Mar 1981 | A |
5521633 | Nakajima et al. | May 1996 | A |
6114951 | Kinoshita et al. | Sep 2000 | A |
7124027 | Ernst et al. | Oct 2006 | B1 |
20020039187 | Keranen | Apr 2002 | A1 |
20020047901 | Nobori et al. | Apr 2002 | A1 |
20020101360 | Schrage | Aug 2002 | A1 |
20020154032 | Hilliard et al. | Oct 2002 | A1 |
20040136568 | Milgram et al. | Jul 2004 | A1 |
20040207652 | Ratti et al. | Oct 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20060216674 | Baranov et al. | Sep 2006 | A1 |
20060228024 | Franz et al. | Oct 2006 | A1 |
20070031006 | Leleve et al. | Feb 2007 | A1 |
20070115357 | Stein et al. | May 2007 | A1 |
20070263096 | Bouzar | Nov 2007 | A1 |
20090295920 | Simon | Dec 2009 | A1 |
Number | Date | Country |
---|---|---|
4332612 | Apr 1994 | DE |
10301898 | Aug 2004 | DE |
Entry |
---|
International Search Report, PCT International Patent Application No. PCT/EP2007/054331, dated Feb. 26, 2008. |
Xu, S. B.: "Qualitative Depth from Monoscopic Cues," International Conference on Image Processing and Its Applications, Maastricht, Netherlands, 1992, IEE, London, UK, pp. 437-440, XP006500210. |
Wei-Ge Chen et al.: "Investigating a New Visual Cue for Image Motion Estimation: Motion-from-Smear," Proceedings of the International Conference on Image Processing (ICIP), Austin, Nov. 13-16, 1994, IEEE Comp. Soc. Press, Los Alamitos, US, vol. 3, Conf. 1, pp. 761-765, XP000522900. |
Rekleitis, I. M.: "Optical Flow Recognition from the Power Spectrum of a Single Blurred Image," Proceedings of the International Conference on Image Processing (ICIP), Lausanne, Sep. 16-19, 1996, IEEE, New York, US, vol. 1, pp. 791-794, XP010202513. |
Number | Date | Country
---|---|---
20090322872 A1 | Dec 2009 | US