This application claims priority on Japanese Patent Application 2005-289719 filed Oct. 3, 2005.
This invention relates to a forward direction monitoring device for monitoring the front of an automobile. In particular, this invention relates to a forward direction monitoring device for monitoring the forward direction by using detection results by a radar device and those by an image-taking device such as a camera.
As examples of a forward direction monitoring device for monitoring objects in front of an automobile, such as another automobile, various devices are known that have mounted thereto a radar part for detecting an object by transmitting electromagnetic waves into a specified forward area and receiving the reflected waves, and an image detecting part for detecting a target object of detection from an image of the forward area taken by a camera.
Japanese Patent 3,264,060 discloses a device adapted to apply the position coordinate of an object detected by a radar to an image taken by a camera and to carry out an image processing process only in the area corresponding to the position coordinate of the object detected by the radar to thereby detect the object.
Japanese Patent Publication Tokkai 2002-303671 discloses a device adapted to detect a white line on the road by carrying out a specified image processing process on an image taken by a camera, to identify from the position of this white line those radar detection points that correspond to delineators, to exclude these delineator detection points from the radar detection points, and to thereby detect an object from the remaining detection points.
Japanese Patent 3,619,628 discloses a device adapted to detect a front-going vehicle in one's own lane and another front-going vehicle in the adjoining lane by using a radar, to set a partial area in an image taken by a camera based on the detected front-going vehicle, to detect a white line on the road within this area and to thereby recognize the environmental condition in front. In other words, the area for detecting the front-going vehicle and the area for detecting a white line are distinguished on the basis of the front-going vehicle detected by the radar.
Japanese Patent Publication Tokkai 9-264954 discloses a device adapted to use a radar to detect a vehicle in front of the own vehicle, to monitor a specified area inclusive of the detected vehicle and also to detect a white line in a specified image area set by the detected vehicle. In other words, this device, like that of aforementioned Japanese Patent 3,619,628, is adapted to distinguish between the area for detecting the front-going vehicle and the area for detecting a white line on the basis of the front-going vehicle detected by the radar.
Each of these prior art devices is adapted to monitor a front-going vehicle by narrowing the whole image of a detection area to a partial image area on the basis of the detection results of a front-going vehicle obtained by the radar. All these devices are adapted to distinguish a front-going vehicle from a white line or a delineator on the basis of the detection result of an object in front of the own vehicle and to set a partial image area according to a front-going vehicle. If there is a road surface marking such as an arrow mark or a maximum speed display on the road surface immediately behind the front-going vehicle, however, there is a possibility of mistaking such a marking for a front-going vehicle.
It is therefore an object of this invention to provide a forward direction monitoring device capable of distinguishing a road surface marking near a front-going vehicle from the vehicle itself, and of monitoring the front-going vehicle by eliminating data caused by such road surface markings.
A forward direction monitoring device according to this invention may be characterized as comprising an image taking part for taking an image of the road condition in front of the vehicle (own vehicle) onto which it is mounted, a radar part for projecting detecting waves into a detection area in front of the own vehicle and receiving reflected waves of the detecting waves from objects in front to thereby detect relative positions of the objects with respect to the own vehicle, a mapping part for mapping the relative positions of the detected objects onto the image taken by the image taking part, an object identifying part for specifying image portions of the image with respect to the relative positions, identifying the kinds of these objects by analyzing the brightness distribution of the image portions and eliminating, from the relative position data of the objects obtained by the radar part, the relative position data of objects other than the desired objects, and a monitoring part for continuously obtaining in time sequence the relative position data remaining after the eliminating step and continuously monitoring the desired objects based on the obtained relative position data.
With the forward direction monitoring device thus structured, the radar part serves not only to detect objects in front of the own vehicle but also to calculate their relative positions with respect to the own vehicle, and the image taking part takes an image of the road condition in front of the vehicle. The mapping part maps the detected measurement points onto the taken image based on the detected relative distances. Throughout herein, anything that reflects waves, inclusive of vehicles and road surface markings, is broadly referred to as an “object”.
The object identifying part specifies portions of the image with respect to the relative positions and analyzes the brightness distribution of these image portions. In the above, brightness means the intensity of reflection from each object as obtained by the image taking part such as a camera. Objects of different kinds give accordingly different brightness distributions. The object identifying part makes use of this property to identify the kinds of the objects. For example, a front-going vehicle and a road surface marking are thereby distinguished. After the objects are identified, the object identifying part eliminates the relative position data of objects other than the desired objects. For example, the relative position data of road surface markings are eliminated and only the relative position data of a front-going vehicle are outputted. The monitoring part monitors the desired objects (by detecting and/or tracing them) based on the relative position data obtained as described above.
The mapping part may further be characterized as converting the image taken by the image taking part into a back-projection image and mapping the relative positions of the objects onto this back-projection image. On this back-projection image, which is a plan view of the road in front, the road length on the image and the relative distance obtained by the radar part become nearly equal. Thus, the mapping of the relative positions obtained by the radar onto objects on the image becomes more reliable.
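Since the patent does not give the projection math, the following is a minimal sketch of such a conversion, assuming OpenCV is available and using a hand-picked ground-plane homography; the four calibration point pairs and the metres-per-pixel scale are hypothetical values, not taken from the source.

```python
# Minimal sketch only: the patent does not specify the projection math.
# Assumes OpenCV; the ground-plane calibration point pairs and the
# metres-per-pixel scale below are hypothetical values.
import cv2
import numpy as np

def to_back_projection(frame):
    """Warp the forward camera image into a plan (bird's-eye) view."""
    h, w = frame.shape[:2]
    # Four points on the road plane in the camera image (hypothetical)...
    src = np.float32([[w * 0.40, h * 0.60], [w * 0.60, h * 0.60],
                      [w * 0.95, h * 0.95], [w * 0.05, h * 0.95]])
    # ...and where they should land on the plan view.
    dst = np.float32([[w * 0.30, 0], [w * 0.70, 0],
                      [w * 0.70, h], [w * 0.30, h]])
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, M, (w, h))

def map_measurement_point(rel_x_m, rel_y_m, m_per_px, origin_px):
    """Place a radar point (metres, vehicle frame) on the plan view, where
    image length and radar relative distance are nearly equal."""
    u = origin_px[0] + rel_x_m / m_per_px   # offset across the road width
    v = origin_px[1] - rel_y_m / m_per_px   # forward distance (up = farther)
    return int(round(u)), int(round(v))
```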
The object identifying part may further be characterized as setting each of the image portions as a rectangular area defining the vertical direction as the direction of motion of the own vehicle and the horizontal direction as the direction of the width of the road, generating a histogram by cumulatively adding the brightness of each point in the vertical direction corresponding to each point in the horizontal direction, detecting a width in the horizontal direction on the histogram where the cumulatively added brightness is in excess of a preliminarily determined reference value, and eliminating from consideration, if the detected width is greater than a preliminarily defined first threshold value and the average of the cumulatively added brightness corresponding to the detected width is greater than a preliminarily defined second threshold value, the object corresponding to that image portion. In summary, the object identifying part thus characterized creates a histogram by setting a rectangular area for each image portion as a method for identifying an object based on brightness distribution.
The forward direction monitoring device of this invention may further comprise a white line detector for detecting white lines extending parallel to the direction of the road, and the object identifying part may further have the functions of setting the width of the image portions in the horizontal direction and the first threshold value according to a change in the distance between the white lines detected by the white line detector. With such a structure, the brightness distribution inside the image portions can be maintained within a certain range of conditions even if the road width appearing on the image changes suddenly due to a change in the slope condition of the road.
With such a white line detector provided, the mapping part may further be provided with the functions of detecting a change in the slope condition of the road in front according to a change in the distance between the white lines detected by the white line detector, and of correcting the relative positions of the objects along the direction of the road according to the change in the slope condition and mapping them onto the back-projection image. This means that the mapping of the relative positions obtained by the radar device can be effected more reliably.
According to this invention, relative position data obtained by the radar can be distinguished by image processing based on a brightness distribution. In particular, a front-going vehicle moving in front of the own vehicle at approximately the same speed can be distinguished from a road surface marking, say, in the shape of an arrow, and the relative position data of only the road surface markings can be selectively eliminated. This was what prior art technologies could not accomplish. As a result, it is now possible to output the relative position data of only a desired object and to monitor only the desired object. In other words, the monitoring process becomes simplified and can be carried out at a higher speed and more reliably.
A forward direction monitoring device according to a first embodiment of this invention will be described first with reference to
As explained above, the radar 3 serves to transmit detection waves and receive reflected waves to generate detection data and transmits the generated data to the radar signal processor 4. Such detection data include the difference between the time of transmitting the detection waves and the time of receiving the reflected waves, as well as the directional angle. The radar signal processor 4 calculates the relative distance to the detection point of the object, based on these detection data, and generates the measurement point data set by this relative distance and the direction angle, as shown in
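As a minimal sketch of this computation, the following assumes the radar reports the transmit/receive time difference and a directional angle measured from straight ahead; the function names and the use of the electromagnetic propagation speed are illustrative assumptions, not language from the source.

```python
# Illustrative sketch: relative distance from the round-trip time
# difference, then polar -> Cartesian using the directional angle.
import math

C = 299_792_458.0  # propagation speed of the detection waves (m/s)

def measurement_point(time_diff_s, angle_rad):
    """Return the relative distance and the (x, y) measurement point."""
    distance = C * time_diff_s / 2.0     # one-way relative distance
    x = distance * math.sin(angle_rad)   # lateral (road-width) offset
    y = distance * math.cos(angle_rad)   # forward (road) direction
    return distance, (x, y)

# e.g. a reflection received 0.4 microseconds after transmission,
# 2 degrees to the right of straight ahead (about 60 m away):
d, (x, y) = measurement_point(0.4e-6, math.radians(2.0))
```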
Thus, the back-projection image based on the image taken by the camera 1 and the measurement point data based on the radar detection are inputted to the mapping part 22, and the mapping part 22 maps measurement points 211-216 and 221-223 onto the back-projection image as shown in
The object identifying part 23 sets rectangular areas 311-316 and 321-323 each having a specified area and with centers respectively at the measurement points 211-216 and 221-223 mapped on the back-projection image (Step S105), as shown in
The object identifying part 23 creates a brightness histogram for each of the rectangular areas 311-316 and 321-323 (Step S106). This may be done firstly by dividing each area into n columns and m rows respectively along the longer and shorter sides to set a two-dimensionally arranged dot pattern. Next, the brightness of each of the m dots in each column is calculated and cumulatively added along that column. A brightness histogram is obtained by associating the cumulatively added value with each of the columnar positions. The brightness values are then normalized by setting the highest of the brightness values in all of the columns of all of the rectangular areas 311-316 and 321-323 as 100% and the lowest brightness value as 0%. Histograms such as shown in
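A minimal sketch of this histogram step follows, assuming each rectangular area arrives as a two-dimensional array of grey-level brightness values (m rows along the short side, n columns along the long side); the array layout is an assumption.

```python
# Sketch of Step S106: column-wise cumulative brightness, normalized
# jointly over all rectangular areas (global maximum 100%, minimum 0%).
import numpy as np

def column_histograms(areas):
    """areas: list of 2-D arrays (rows = m dots, columns = n positions)."""
    raw = [a.sum(axis=0).astype(float) for a in areas]  # one value per column
    lo = min(h.min() for h in raw)
    hi = max(h.max() for h in raw)
    span = (hi - lo) or 1.0                             # guard flat input
    return [100.0 * (h - lo) / span for h in raw]
```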
By using all these histograms thus created, the object identifying part 23 calculates for each of the rectangular areas 311-316 and 321-323 the width W1 (referred to as the half-value width) in the row direction of each portion where the cumulatively added brightness value is 50% or higher (Step S107). Each half-value width W1 thus calculated is compared with a (first) threshold value TH1 which is preliminarily defined (Step S108). The cumulatively added brightness values of the portions for which W1 is found to be greater than TH1 are added, and the sum is divided by the number of columns within the detected width for each corresponding rectangular area to obtain an average brightness Bav (Step S109). If W1 is less than TH1 (NO in Step S108), it is judged that the object corresponding to that rectangular area is something other than a road surface marking 103 (Step S112).
After the average brightness Bav is calculated for each rectangular area, the object identifying part 23 compares it with another (second) preliminarily defined threshold value TH2 (Step S110). If Bav is larger than TH2 (YES in Step S110), it is judged that the object corresponding to this rectangular area is a road surface marking 103 (Step S111). If Bav is less than TH2 (NO in Step S110), it is judged that this object is other than a road surface marking (Step S112).
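The decision logic of Steps S107 through S112 can be sketched as follows; the numeric values of TH1 and TH2 are placeholders (the patent only says they are preliminarily defined), and measuring only the widest half-value portion is a simplification.

```python
# Sketch of Steps S107-S112 on one normalized histogram. TH1 (in columns)
# and TH2 (in %) are placeholder values, not from the source.
def classify_area(hist, th1=12, th2=70.0):
    """Return 'road marking' or 'other' for one rectangular area."""
    above = [v >= 50.0 for v in hist]      # half-value (50%) portions
    w1, run = 0, 0                         # widest contiguous run of columns
    for flag in above:
        run = run + 1 if flag else 0
        w1 = max(w1, run)
    if w1 <= th1:                          # narrow bright peak: reflector etc.
        return "other"
    vals = [v for v, f in zip(hist, above) if f]
    bav = sum(vals) / len(vals)            # average over the bright portions
    return "road marking" if bav > th2 else "other"
```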
The first threshold value TH1 is based on the observation that the reflector at the back of the front-going vehicle 102 and road surface markings 103 have a high light reflectivity and produce an image with high brightness. The second threshold value TH2 is based on the observation that the reflection from a front-going vehicle is not uniform and shows fluctuations in brightness, while the reflection from a road surface marking 103 has hardly any fluctuations in brightness.
The object identifying part 23 carries out this kind of identification process sequentially for all of the rectangular areas 311-316 and 321-323 and determines whether the object corresponding to each rectangular area is a road surface marking or other than a road surface marking.
Based on the inputted results of these identifications, the radar signal processor 4 arranges the measurement points 211-216 and 221-223 into groups 210 and 220 as shown in
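How the grouping and elimination might look in code is sketched below, assuming measurement points lying close together in the forward direction form one group; the 2 m gap and the one-label-per-group convention are assumptions.

```python
# Sketch of the grouping and elimination of marking groups.
def group_points(points, max_gap=2.0):
    """Greedily group (x, y) measurement points by forward distance y."""
    groups, current = [], []
    for p in sorted(points, key=lambda q: q[1]):
        if current and p[1] - current[-1][1] > max_gap:
            groups.append(current)
            current = []
        current.append(p)
    if current:
        groups.append(current)
    return groups

def keep_desired(groups, labels):
    """Drop every group identified as a road surface marking."""
    return [g for g, lab in zip(groups, labels) if lab != "road marking"]
```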
The outputted radar detection results are inputted to the monitoring processor 5 which serves to sequentially process the received radar detection results (from which data corresponding to road surface markings have been deleted) by a known method and to carry out monitoring processes such as the tracing of a front-going vehicle and the detection of relative speed.
Thus, since data on unwanted objects such as road surface markings that are different from a front-going vehicle are prevented from being inputted to the monitoring processor 5, the forward direction monitoring process such as the tracing of the front-going vehicle can be effected more quickly.
Next, another forward direction monitoring device according to a second embodiment of this invention is described with reference to
As shown in
The white line detector 24 is for detecting a white line from an image taken by the camera 1. Examples of the method for detecting a white line by this white line detector 24 include one described by Mineta, et al. in “Development of White Line Detecting Systems in Lane Keep Assist System” (Honda R&D Technical Review (2000), Vol. 12, No. 1, pages 101-108). The data on the position of a detected white line on the image is converted into back-projection position data and transmitted to the mapping part 22.
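The cited Mineta et al. method is not reproduced here; as a stand-in, the following sketches one common white-line detection approach using OpenCV edge detection and a probabilistic Hough transform, with every parameter value being an assumption.

```python
# Stand-in sketch (not the cited Mineta et al. method): detect near-vertical
# white line segments on a greyscale image.
import cv2
import numpy as np

def detect_white_lines(gray):
    """Return line segments likely to run along the road direction."""
    edges = cv2.Canny(gray, 80, 160)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=60, maxLineGap=10)
    keep = []
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            if abs(x2 - x1) < abs(y2 - y1):   # more vertical than horizontal
                keep.append((x1, y1, x2, y2))
    return keep
```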
The mapping part 22 corrects the relative distances of measurement points obtained from the radar signal processor 4 based on this result of white line detection with respect to the sloped road condition and maps them onto the back-projection image obtained from the image converting part 21. The object identifying part 23 calculates the road width L based on the result of white line detection and sets the width S (the length of the longer side) of the rectangular areas based on this road width L. The object identifying part 23 also sets a first threshold value TH1 based on the width S as explained above regarding the first embodiment. A brightness histogram is similarly calculated for each rectangular area and the object is identified by using this threshold value TH1.
Images taken by the camera 1 as shown in
The radar 3 transmits detection waves as explained above and generates detection data by receiving reflected waves. The generated detection data are provided to the radar signal processor 4. Such detection data include the difference between the time of transmitting the detection waves and the time of receiving the reflected waves, as well as the directional angle. The radar signal processor 4 calculates the relative distance to the detection point of the object, based on these detection data, and generates the measurement point data set by this relative distance and the direction angle, as shown in
The mapping part 22 maps measurement points 211-216 and 221-223 onto the back-projection image based on the relative distances of the measurement data and the directions (Step S208). If no correction for the sloped road condition were made on the relative distances, displacements would result as shown in
This is explained more in detail with reference to
D′=D(L′/L).
The road width L at the position of the own vehicle 101 may be taken as the road width at the closest measurable position or extracted from a navigation system (not shown).
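A worked example of the correction D′ = D(L′/L), taking D as the radar relative distance, L as the road width at the own vehicle and L′ as the road width appearing on the back-projection image at the measurement point; the numeric values are hypothetical.

```python
# Worked example with hypothetical numbers: radar distance D = 40 m, road
# width L' = 3.2 units on the image at the measurement point, L = 3.5 units
# at the own vehicle.
def corrected_distance(D, L_prime, L):
    """D' = D * (L'/L): scale the radar relative distance by the ratio of
    road widths so the mapped point lands on the object despite the slope."""
    return D * (L_prime / L)

print(corrected_distance(40.0, 3.2, 3.5))   # -> 36.57... (shortened distance)
```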
If such a correction process is carried out, measurement points 211c-216c and 221c-223c obtained by using corrected relative distances match the positions of the front-going vehicle 102 or the road surface marking 103 displayed on the back-projection image, as shown in
The object identifying part 23 sets rectangular areas 311-316 and 321-323 each having a specified area and with centers respectively at the measurement points 211c-216c and 221c-223c mapped on the back-projection image (Step S211), each rectangular area having longer sides in the direction of the width of the road and shorter sides in the direction of the road. The lengths of the longer and shorter sides of these rectangles are preliminarily determined according to the shape of the object to be eliminated (to be explained below), or specifically according to the shape of the road surface markings 103. The length of the longer side of the rectangular area (width S) may be set according to the road width L at the corrected positions of measurement points 211c-216c and 221c-223c (Step S209). In the case of the example of
The object identifying part 23 sets the first threshold value TH1 based on the road width L at the corrected positions of the measurement points 211c-216c and 221c-223c (Step S210), say, about equal to ½ of the width S of the rectangular areas.
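Steps S209 and S210 can be sketched in a few lines, assuming the road width L is available in back-projection image units from the white line detection; the variable names are ours, and taking S equal to L is a simplification since the exact relation is not spelled out here.

```python
# Sketch of Steps S209-S210: size the rectangular area and the first
# threshold from the road width L between the detected white lines.
def area_width_and_threshold(road_width_L):
    """Long side S of the rectangular area tracks the road width L (taken
    equal here for simplicity); TH1 is set to about half of S."""
    S = road_width_L
    TH1 = S / 2.0
    return S, TH1
```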
The object identifying part 23 creates a brightness histogram for each of the rectangular areas 311-316 and 321-323 (Step S212) as explained above regarding the first embodiment. The brightness values are then normalized by setting the highest of the brightness values in all of the columns of all of the rectangular areas 311-316 and 321-323 as 100% and the lowest brightness value as 0%.
By using all these histograms thus created, the object identifying part 23 calculates for each of the rectangular areas 311-316 and 321-323 the width W1 (referred to as the half-value width) in the row direction of each portion where the cumulatively added brightness value is 50% or higher (Step S213). Each half-value width W1 thus calculated is compared with the first threshold value TH1 set in Step S210 (Step S214). The cumulatively added brightness values of the portions for which W1 is found to be greater than TH1 are added, and the sum is divided by the number of columns within the detected width for each corresponding rectangular area to obtain an average brightness Bav. If W1 is less than TH1, it is judged that the object corresponding to that rectangular area is something other than a road surface marking 103 (Step S214 to Step S218).
After the average brightness Bav is calculated for each rectangular area, the object identifying part 23 compares it with another (second) preliminarily defined threshold value TH2 (Step S216). If Bav is larger than TH2 (YES in Step S216), it is judged that the object corresponding to this rectangular area is a road surface marking 103 (Step S217). If Bav is less than TH2 (NO in Step S216), it is judged that this object is other than a road surface marking (Step S218).
The object identifying part 23 carries out this kind of identification process sequentially for all of the rectangular areas 311-316 and 321-323 and determines whether the object corresponding to each rectangular area is a road surface marking or other than a road surface marking. After this identification process is completed for all of the rectangular areas 311-316 and 321-323 (YES in Step S219), the results of these identifications are outputted to the radar signal processor 4 (Step S220).
Based on the inputted results of these identifications, the radar signal processor 4 arranges the measurement points 211-216 and 221-223 into groups 210 and 220 as shown in
The outputted radar detection results are inputted to the monitoring processor 5 which serves to sequentially process the received radar detection results (from which data corresponding to road surface markings have been deleted) by a known method and to carry out monitoring processes such as the tracing of a front-going vehicle and the detection of relative speed.
Thus, even if the slope of the road changes between the own vehicle and the vehicle in front, the relative distances are corrected such that the measurement points detected by the radar and the positions of objects on the image match and errors in detection can be prevented.
Moreover, since the rectangular areas and the threshold value are set according to the road width, road surface markings can be detected reliably without being affected by the changes in the widths of road surface markings caused by changes in the sloped condition of the road, and errors can be even more reliably avoided.