1. Field of the Invention
The disclosures herein generally relate to an object recognition apparatus and the like, especially relating to an object recognition apparatus and an object recognition method and the like for detecting a distance to an object and a direction to the object.
2. Description of the Related Art
Conventionally, driving support technologies for recognizing an object such as a vehicle, a pedestrian, an installation object or the like in front of an own vehicle, and reporting a result of the recognition to a driver or controlling the own vehicle, have been known. Since travelling environments can change greatly depending on a location, a time period, a region, a climate condition or the like, object recognition that can cope with such changes in the travelling environment has been required.
However, due to restrictions on the dynamic range of imaging elements, such as CMOS sensors or CCD sensors, used in a camera, exposure control may not follow a rapid change in brightness, and an image processing apparatus may not recognize an object appearing in image data. For example, it may be difficult to recognize the object in the image data in the case where headlights from an oncoming vehicle suddenly enter a camera, or an exit of a tunnel appears within an imaging range.
Japanese Published Patent Application No. 2013-215804 discloses, for example, assisting recognition of an object in a travelling environment, in which it is difficult to recognize the object with a camera, by using a sensor other than the camera. Japanese Published Patent Application No. 2013-215804 discloses an obstacle recognition apparatus that recognizes an object by a stereo-camera and by a scanning laser radar, respectively, and determines whether the object exists by weighting results of recognition according to reliability for each of the ranging devices depending on a surrounding condition.
However, there is a problem that, in the case where the camera loses the object due to a rapid change in the travelling environment, the object cannot always be detected by the laser radar, since the horizontal resolution of the laser radar is generally lower than that of object detection by the stereo camera. The above problem will be explained with reference to drawings in the following:
In such a travelling environment, a laser radar resistant to backlight may be used for assisting in acquiring the preceding vehicle 1001. However, as shown in
It is a general object of at least one embodiment of the present invention to provide an object recognition apparatus and an object recognition method that substantially obviate one or more problems caused by the limitations and disadvantages of the related art.
In one embodiment, an object recognition apparatus includes a first object detection unit configured to detect an object based on an image of the object captured by an imaging unit, and output first distance information including a distance to the object and first direction information including a direction to the object; a second object detection unit configured to emit light, detect the object based on reflected light reflected at the object, and output second distance information including a distance to the object and second direction information including a direction to the object; an output unit configured to output at least one of the first distance information and the second distance information and at least one of the first direction information and the second direction information; a disappearance detection unit configured to detect disappearance when the first object detection unit becomes unable to detect the object; and an emission timing control unit configured to control the second object detection unit, in a case where the disappearance detection unit detects the disappearance, to emit light in an object direction based on the first direction information obtained before the disappearance detection unit detects the disappearance. Upon the disappearance detection unit detecting the disappearance, the output unit starts outputting the second distance information and the second direction information.
In another embodiment, an object recognition method includes detecting an object based on an image of the object captured by an imaging unit, and outputting first distance information including a distance to the object and first direction information including a direction to the object; emitting light, detecting the object based on reflected light reflected at the object, and outputting second distance information including a distance to the object and second direction information including a direction to the object; outputting at least one of the first distance information and the second distance information and at least one of the first direction information and the second direction information; detecting disappearance when the object becomes unable to be detected based on the image of the object; and emitting light, in a case where the disappearance is detected, in an object direction based on the first direction information obtained before the disappearance is detected. Upon the disappearance being detected, the second distance information and the second direction information start to be output.
According to the embodiment of the present invention, an object recognition apparatus and an object recognition method can be provided in which an object that is difficult for a camera to recognize in the case of a rapid change in the travelling environment can be detected by another detection unit.
Other objects and further features of embodiments will be apparent from the following detailed description when read in conjunction with the accompanying drawings, in which:
In the following, embodiments of the present invention will be described with reference to the accompanying drawings.
Accordingly, by pinpoint-emitting laser light in the direction in which the vehicle “A”, which the camera had recognized until just before, exists, even in the case where the horizontal resolution of the laser radar is low, it is possible to assist the camera in recognizing the vehicle “A” and to support driving.
[Example of Configuration]
The stereo camera ranging unit 11 captures a forward area including an object by using two or more cameras, and detects a distance for each pixel from parallax data. The process will be described later in detail. The stereo camera ranging unit 11 outputs object data (distance, position or the like) of the object recognized by analyzing an image to the ranging result selection unit 14 via the recognition disappearance detection unit 13. The position is an example of the direction information.
The recognition disappearance detection unit 13 detects that the object, which the stereo camera ranging unit 11 has recognized, becomes unrecognizable, i.e. the object disappears. At this time, an emission request for laser light and a direction to the disappeared object, which had been recognized before the disappearance, are sent to the laser radar ranging unit 12.
The laser radar ranging unit 12 emits laser light, and detects a distance to the object based on a time length from an emission of the laser light till a reception of the laser light reflected at the object. The process will be described later in detail. The laser radar ranging unit 12 outputs detected object data (distance, position or the like) of the object to the ranging result selection unit 14.
The ranging result selection unit 14 determines whether to output the object data from the stereo camera ranging unit 11 or the object data from the laser radar ranging unit 12 to the drive support unit 15, and outputs the selected data. The stereo camera ranging unit 11 can output two-dimensional object data with high resolution, since image sensors such as CMOS sensors or CCD sensors are used as light reception elements. By contrast, the laser radar ranging unit 12 outputs object data with low spatial resolution. Accordingly, the ranging result selection unit 14 normally outputs the object data from the stereo camera ranging unit 11 to the drive support unit 15 in preference to the object data from the laser radar ranging unit 12. Exceptionally, when the recognition disappearance detection unit 13 detects that the object disappears, the object data from the laser radar ranging unit 12 are output. Meanwhile, in the case where the recognition disappearance detection unit 13 cannot recognize the object, the ranging result selection unit 14 may output both the object data from the stereo camera ranging unit 11 and that from the laser radar ranging unit 12.
The drive support unit 15 performs various drive supports by using the object data from the stereo camera ranging unit 11 or from the laser radar ranging unit 12. The drive support varies depending on a vehicle. For example, in the case where a position of the object overlaps with an extended line of the vehicle width, warning, braking or the like is performed according to TTC (Time To Collision). Moreover, in the case where it is difficult to stop before a collision, the own vehicle is steered in the opposite direction so as to avoid the collision.
Moreover, the drive support unit 15 performs a vehicle distance control over the entire vehicle speed range, by which the own vehicle travels following a preceding vehicle with a vehicle distance depending on the vehicle speed. When the preceding vehicle stops, the own vehicle stops. When the preceding vehicle starts, the own vehicle starts. Moreover, in the case where the stereo camera ranging unit 11 performs white line recognition, the drive support unit 15 performs a lane keeping control for steering the own vehicle so as to travel in the center of the lane.
Moreover, in the case where there is an obstacle in the traveling direction when the own vehicle stops, sudden starting is inhibited. For example, in the case where there is an obstacle in the traveling direction, which is determined from an operation position of a shift lever, and the operation amount of an accelerator pedal is great, damage is reduced by restricting an engine output or by warning.
Meanwhile, in the case where there are plural objects in front of the own vehicle, object data of the plural objects from the stereo camera ranging unit 11 or from the laser radar ranging unit 12 may be input to the drive support unit 15. In this case, the drive support unit 15 performs drive support for the object having the highest probability of collision, for example, the object whose horizontal position is within a threshold range of the own vehicle and whose TTC is the least.
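The selection of the most collision-probable object among plural objects can be sketched as follows. This is an illustrative sketch only: the field names, the lane half-width, and the approximation of TTC as distance divided by closing speed are assumptions, not the actual implementation of the drive support unit 15.

```python
def select_target(objects, lane_half_width_m=1.5):
    """Pick, among objects whose lateral offset lies within the own vehicle's
    extended width, the one with the least TTC. Each object is a dict with
    'x' (lateral offset [m]), 'z' (distance [m]), 'vz' (closing speed [m/s]).
    TTC is approximated here as z / vz."""
    candidates = [
        (o["z"] / o["vz"], o)
        for o in objects
        if abs(o["x"]) <= lane_half_width_m and o["vz"] > 0
    ]
    return min(candidates, key=lambda c: c[0])[1] if candidates else None
```

Objects outside the threshold range of the own vehicle's width, or objects that are not closing, are excluded before the TTC comparison.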
[Configuration and Principle of Stereo Camera Ranging Unit]
The brightness image input unit 23 extracts a brightness image from the right image or the left image and outputs it. For example, a predetermined one of the right image and the left image, the image with better image quality of the two (for example, with better contrast or a greater number of edges), or an overlapped part of the right image and the left image is extracted.
The object recognition processing unit 24 recognizes the object from the brightness image. The object in the present embodiment is mainly a vehicle, a pedestrian, a bicycle or the like. However, the present invention is not limited to this. A stationary object, such as a traffic light, may be recognized. Recognition of the object is performed, for example, by matching the brightness image with a block image of a vehicle or the like for pattern recognition, which is prepared in advance. An image region of nearly the same velocity may be specified by an optical flow or the like in advance, so that the matching region is narrowed and an appropriate block image for pattern recognition is selected according to its aspect ratio. The time to recognize the object can thereby be shortened compared with matching the whole screen. Meanwhile, since the brightness image is output periodically, an object which is once recognized to be a vehicle can be tracked in the next image based on the recognition position in the present image.
The parallax image calculation unit 26 calculates, using the right image and the left image as shown in
When the base-line length is B, a distance to the object is Z and a focal length is f, a distance between images of the same object appearing on the left and right CMOS sensors 22R, 22L is B+d (See
The distance calculation unit 25 calculates the distance to a measuring object from the parallax image. When the respective distances are defined as shown in
Z : B = (Z + f) : (B + d), and therefore Z = Bf/d.
A position in a real space can be calculated from a position (pixel) in the image, in which the object is captured, and the focal length f. Moreover, since the distance Z is calculated, a direction can be obtained from the position.
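The relations above can be written out numerically as follows; a minimal sketch, assuming the base-line length B in metres and the focal length f and parallax d in pixels (the unit choices and the helper names are illustrative, not part of the embodiment):

```python
import math

def stereo_distance(baseline_m, focal_px, disparity_px):
    """Distance from parallax by similar triangles: Z = B * f / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

def horizontal_direction_rad(pixel_x, image_center_x, focal_px):
    """Direction of a captured pixel relative to the optical axis,
    from the pixel position and the focal length."""
    return math.atan2(pixel_x - image_center_x, focal_px)

# e.g. B = 0.2 m, f = 1000 px, d = 10 px gives Z = 20 m
```

As the text notes, once the distance Z is known, the direction follows from the position of the object in the image.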
Meanwhile, in the present embodiment, a stereo camera is exemplified as an imaging means for detecting distance information. However, the imaging means may be a camera such as a monocular TOF camera, in which a part of the pixels is used for detecting a distance, or a monocular camera which calculates a parallax by comparing plural frames on the basis of the camera's movement. Since a calibration process or the like is unnecessary for a monocular camera, cost can be reduced.
Moreover, although the stereo camera is installed in a vehicle so as to image a front area of the vehicle, the stereo camera may be installed so that not only the front area but also a rear area, a rear side area or the like can be imaged.
[Configuration and Principle of Laser Radar Ranging Unit]
An emission timing control unit 31 controls a timing at which the laser diode 32 emits laser light. Conventionally, the emission timing control unit 31 is not provided, or even if it is provided, laser light is merely emitted periodically. The emission timing control unit 31 according to the present embodiment causes the laser diode 32 to emit laser light periodically by an emission trigger signal. Furthermore, the emission timing can be controlled arbitrarily.
Meanwhile, the periodic laser light may always be emitted in a fixed direction. Alternatively, though periodic, the emission direction may vary according to the angle of a polygon mirror 33 at the time the laser diode 32 emits light.
Laser light emitted from the laser diode 32 is reflected at the polygon mirror 33 and emitted. Since the polygon mirror 33 rotates at a predetermined rotating speed, there are plural laser lights with various angles in the horizontal direction for respective surfaces of the polygon mirror 33.
The laser light reflected at the polygon mirror 33 afterwards reaches the object and is reflected. The reflected light is collected by a lens 36 and received by a 1D (one-dimensional) array type light reception element 35. The 1D array type light reception element 35 includes light reception elements which are arranged one-dimensionally in the horizontal direction. A direction to the object can be detected depending on which light reception element in the horizontal direction detects laser light greater than or equal to a threshold.
Meanwhile, in the case of detecting the direction to the object according to the timing at which the laser diode 32 emits laser light and the angle of the polygon mirror 33, the 1D array type light reception element 35 outputs a reception trigger signal indicating that laser light is received to a time measurement unit 34. The 1D array type light reception element 35 is preferably sensitive to a wavelength of the laser light.
The time measurement unit 34 measures a time length from a timing at which the emission trigger signal for laser light is input till a timing at which a reception trigger signal is input, and calculates the distance to the object based on the measured time length and the velocity of light.
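The distance computation of the time measurement unit 34 reduces to halving the round-trip travel time of light; a sketch under the assumption that the elapsed time between the emission trigger and the reception trigger is available, with a hypothetical helper for converting a clock count to seconds:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # velocity of light [m/s]

def tof_distance_m(elapsed_s):
    """One-way distance from the round-trip time: d = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * elapsed_s / 2.0

def clocks_to_seconds(n_clocks, clock_hz):
    """Convert a counted number of clocks to an elapsed time in seconds."""
    return n_clocks / clock_hz

# a 200 ns round trip corresponds to roughly 30 m
```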
In order to receive the reflected light by the 1D array type light reception element 35 as distinguished from light such as the headlights of other vehicles or light from traffic signals (i.e. so as to increase the intensity of the reflected light), the emission intensity of the laser diode 32 is preferably great. However, when the emission intensity is increased, the laser diode itself degrades, so a pulse interval of a predetermined length according to the emission intensity is required. As a result, there is a problem that the pulse interval increases and a higher resolution in the horizontal direction cannot be obtained.
For example, as shown in
Thus, the laser radar ranging unit 12 according to the present embodiment is provided with the emission timing control unit 31, which controls a timing of emission of laser light, thereby controlling an emission angle of the laser light. As a result, the emission direction within an angle of view in one dimensional direction (horizontal direction) can be controlled arbitrarily.
Meanwhile, laser light emitted from the laser diode 32 is not limited to visible light. The laser light may include visible light and light other than the visible light. That is, the laser diode 32 has only to emit an electromagnetic wave and to receive it.
Moreover, in the present embodiment, pulsed laser lights are emitted while changing the emission direction, and the 1D array type light reception element 35 detects a direction to the object from the emission angle. However, an electromagnetic wave may be emitted while its emission direction is changed without a mechanical movement, by using a phased array antenna, for example.
[About Function of Recognition Disappearance Detection Unit]
In order to detect disappearance of the object, the stereo camera ranging unit 11 outputs the brightness image to the recognition disappearance detection unit 13 in addition to the object data. Moreover, the recognition disappearance detection unit 13 can acquire a distance, a position, a relative velocity and an image position for each object, as shown in
As shown in
The recognition disappearance detection unit 13 determines for an object, for which “NULL” is registered in the object data, whether the object disappears due to whitening by referring to the preceding object data. That is, the image position 22 of the object 2 for t2 is read, and it is determined whether whitening occurs at the image position 22 in the brightness image for t3 according to the following procedure, for example:
(i) A variance value of brightness values for pixels in the rectangular region (x1,y1)-(x2,y2) is calculated. In the case where the variance value is less than or equal to a threshold, it is determined that whitening occurs. When whitening occurs, due to the limited dynamic range, brightness values of a part of the pixels are close to the maximum brightness value, and brightness values of another part of the pixels become smaller. Accordingly, the variance value becomes smaller than that before the whitening occurs. Meanwhile, the threshold is obtained experimentally by calculating variance values for images in which whitening occurs and for images in which whitening does not occur.
(ii) A histogram of brightness values for pixels in the rectangular region (x1,y1)-(x2,y2) is calculated. In the case where the ratio of sufficiently bright pixels (for example, greater than or equal to 220 on a scale of 0 to 255) and sufficiently dark pixels (for example, less than or equal to 30 on a scale of 0 to 255) to all the pixels is greater than or equal to a threshold, it is determined that whitening occurs. The idea is the same as in the above-described example (i): the property that brightness values of many pixels in the rectangular region (x1,y1)-(x2,y2) at the image position 22 become close to the maximum brightness value is used.
(iii) Brightness values of the pixels in the rectangular region (x1,y1)-(x2,y2) at the image position 22 are differentiated in at least one of the vertical direction and the horizontal direction. When whitening occurs, a difference between pixel values of adjacent pixels becomes smaller. Accordingly, in the case where the sum of the differential values is less than or equal to a threshold, it is determined that whitening occurs.
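The three whitening tests (i)-(iii) can be sketched as below. The concrete threshold values are placeholders to be tuned experimentally, as the text notes; they are not values from the embodiment.

```python
import numpy as np

def whitening_by_variance(region, var_threshold=400.0):
    """(i) A low variance of brightness values suggests whitening."""
    return float(np.var(region)) <= var_threshold

def whitening_by_histogram(region, bright=220, dark=30, ratio_threshold=0.8):
    """(ii) A high ratio of very bright or very dark pixels to all pixels
    suggests whitening."""
    extreme = np.count_nonzero((region >= bright) | (region <= dark))
    return extreme / region.size >= ratio_threshold

def whitening_by_gradient(region, grad_threshold=1000.0):
    """(iii) A small sum of horizontal and vertical differentials
    (adjacent-pixel differences) suggests whitening."""
    r = region.astype(np.int32)
    total = np.abs(np.diff(r, axis=1)).sum() + np.abs(np.diff(r, axis=0)).sum()
    return int(total) <= grad_threshold
```

In practice any one of the three tests, or a combination, could be applied to the rectangular region at the last known image position of the object.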
The recognition disappearance detection unit 13, upon detecting disappearance of the object, reports the direction to the object to the laser radar ranging unit 12. The direction to the object is obtained from the horizontal position and the distance for t2, where the front direction of the vehicle is zero, as follows:
Direction=arcsin(horizontal position/distance).
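The direction computation above can be written directly; a minimal sketch, in which the sign convention (positive to one side, zero straight ahead) is an assumption:

```python
import math

def direction_to_object_rad(horizontal_position_m, distance_m):
    """Direction = arcsin(horizontal position / distance),
    measured from the vehicle's front direction (zero straight ahead)."""
    return math.asin(horizontal_position_m / distance_m)
```

For example, an object 1 m to the side at a distance of 2 m lies at arcsin(0.5), i.e. 30 degrees from the front direction.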
Moreover, the recognition disappearance detection unit 13 sends to the ranging result selection unit 14 a selection request to cause it to select a detection result by the laser radar ranging unit 12. Thus, the ranging result selection unit 14 selects, from the object data by the laser radar ranging unit 12, a distance or a direction to the object which disappeared from the image, and outputs it to the drive support unit 15. Meanwhile, the recognition disappearance detection unit 13 continues to output object data generated by the stereo camera ranging unit 11 to the ranging result selection unit 14. However, the ranging result selection unit 14 outputs only the object data by the laser radar ranging unit 12. As described above, both the object data by the stereo camera ranging unit 11 and the object data by the laser radar ranging unit 12 may be input to the ranging result selection unit 14.
Meanwhile,
[Timing Control by Emission Timing Control Unit]
Since the laser radar ranging unit 12 emits laser light in the direction reported from the recognition disappearance detection unit 13, the object which disappears in a brightness image due to whitening can be detected by laser light. Since the emission direction of laser light of the laser diode 32 is fixed, the emission direction of laser light from the vehicle is determined according to an angle between a surface of the polygon mirror 33 and the emission direction of laser light of the laser diode. The emission timing control unit 31 detects, for example, a timing at which the angle between a surface of the polygon mirror 33 and the emission direction of laser light of the laser diode is 90 degrees, counts clocks starting at that timing, and calculates the emission angle of laser light from the vehicle on the basis of the number of clocks. A relation between the number of clocks and the emission angle of laser light from the vehicle is retained in advance. In the case where the calculated emission angle coincides with the direction reported from the recognition disappearance detection unit 13, an emission trigger signal is output to the laser diode 32. The emission timing control unit 31 repeats the above process each time the polygon mirror 33 changes a surface.
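The clock-counting computation of the emission timing control unit 31 can be sketched as follows. The 90-degree reference, the linear clock-to-angle mapping, and the doubling of the mirror's angular speed on reflection are illustrative assumptions; the embodiment itself retains the relation between clock count and emission angle in advance (e.g. as a table).

```python
def emission_angle_deg(n_clocks, clock_hz, mirror_rpm, reference_deg=90.0):
    """Beam angle after n_clocks, counted from the timing at which the
    mirror-surface angle is detected at the reference angle. A mirror
    rotating at constant speed turns the reflected beam at twice its own
    angular speed."""
    mirror_deg_per_s = mirror_rpm * 360.0 / 60.0
    return reference_deg + 2.0 * mirror_deg_per_s * (n_clocks / clock_hz)

def should_emit(calculated_deg, reported_deg, tolerance_deg=0.5):
    """Output the emission trigger when the calculated emission angle
    coincides, within a tolerance, with the reported direction."""
    return abs(calculated_deg - reported_deg) <= tolerance_deg
```

For example, with a 600 rpm mirror (3600 deg/s) and a 1 MHz clock, 1000 clocks correspond to 1 ms, during which the beam advances 7.2 degrees from the reference.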
Meanwhile, in the case where the direction to the object that disappears is reported, all the periodic emissions may be stopped, and laser light may be emitted only in the direction to the point of disappearance.
Moreover, in the case where the direction to the object that disappears is reported, when the laser radar ranging unit 12 continues the periodic emissions of laser light, as shown in
Meanwhile, in the case where laser light is emitted periodically with a time interval which is sufficiently wide so as not to degrade the laser diode 32, laser light may be emitted also at the second timing and the third timing. Accordingly, the resolution in the horizontal direction can be enhanced.
Moreover, laser light may be emitted at only one of the second timing and the third timing. For example, in the case where an angle between the direction to the point of disappearance and the emission direction at the third timing is less than or equal to a threshold (i.e. sufficiently close to the emission direction at the third timing), even if laser light is emitted in the emission direction at the second timing, degradation of the laser diode 32 can be prevented. Similarly, in the case where an angle between the direction to the point of disappearance and the emission direction at the second timing is less than or equal to the threshold (i.e. sufficiently close to the emission direction at the second timing), even if laser light is emitted in the direction at the third timing, degradation of the laser diode 32 can be prevented.
Incidentally, since the relative position between an object at the point of disappearance and the own vehicle can change, the object may become undetectable while laser light is emitted repeatedly in the direction to the point of disappearance. For example, the relative position can change when a preceding vehicle moves horizontally. In this case, the laser radar ranging unit 12 detects the horizontal movement of the preceding vehicle because a reflected wave of the laser light emitted to the point of disappearance is not detected. The emission timing control unit 31 then controls the emission trigger signal again so that laser light is emitted in directions before and after the direction to the point of disappearance in the horizontal direction, at an angle of +/- three degrees, for example (i.e. changing the angle leftward and rightward in the horizontal direction), as shown in
Meanwhile, in the case where the object cannot be detected even if laser light is emitted at an angle of +/- three degrees, by increasing the angle gradually (e.g. by one degree), the object at the point of disappearance can be detected more reliably. Meanwhile, in the present embodiment, the maximum emission width for changing the direction back and forth is set within a range which coincides with the periodic emission direction; a further wider range can be covered by the periodic emission.
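The gradual widening of the search around the point of disappearance can be sketched as a generator of candidate emission directions. The starting half-width of three degrees and the one-degree step follow the text; the maximum half-width is an illustrative parameter standing in for the boundary at which the periodic emission takes over.

```python
def sweep_directions(center_deg, start_half_width_deg=3.0,
                     step_deg=1.0, max_half_width_deg=10.0):
    """Yield emission directions leftward and rightward of the direction to
    the point of disappearance, widening gradually until the maximum
    emission width is reached."""
    half_width = start_half_width_deg
    while half_width <= max_half_width_deg:
        yield center_deg - half_width
        yield center_deg + half_width
        half_width += step_deg
```

The caller would stop iterating as soon as a reflected wave is detected in one of the yielded directions.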
[Switch from Laser Radar Ranging Unit to Stereo Camera Ranging Unit]
Next, the switch from detection according to laser light back to a recognition result based on a brightness image will be explained. When the own vehicle exits from the tunnel following the preceding vehicle and the stereo camera ranging unit 11 becomes able to recognize the preceding vehicle which had disappeared, the laser radar ranging unit 12 stops emission of laser light in the direction to the point of disappearance. The ranging result selection unit 14 switches to outputting object data by the stereo camera ranging unit 11.
This switching may be performed in reverse to the switching from the recognition by the stereo camera ranging unit 11 to the detection by the laser radar ranging unit 12. That is, the laser radar ranging unit 12 outputs object data of a detected object to the stereo camera ranging unit 11. The stereo camera ranging unit 11 performs object recognition at an image position corresponding to the direction and distance detected according to laser light. In the case where the object is recognized, it is determined that the object, which had disappeared, is recognized in a brightness image again.
The stereo camera ranging unit 11 requires the laser radar ranging unit 12 to stop emission of laser light in the direction to the point of disappearance. Thus, the laser radar ranging unit 12 returns to the operation of the periodic emission of laser light. Moreover, the stereo camera ranging unit 11 requires, via the recognition disappearance detection unit 13, the ranging result selection unit 14 to select object data from the stereo camera ranging unit 11.
[Operation Procedure]
The recognition disappearance detection unit 13 determines whether the object disappears by referring to object data (step S20).
In the case where the object disappears (step S20: YES), the recognition disappearance detection unit 13 requires the laser radar ranging unit 12 to emit laser light in the direction to the object (step S30). Moreover, the recognition disappearance detection unit 13 sends to the ranging result selection unit 14 a selection request to cause it to select a detection result by the laser radar ranging unit 12. Meanwhile, even after the recognition disappearance detection unit 13 sends the above request, the stereo camera ranging unit 11 repeats recognition of objects from a brightness image within a recognizable range.
Next, the stereo camera ranging unit 11 determines whether the object which had disappeared can be recognized (step S40).
In the case where the object which had disappeared can be recognized (step S40: YES), the stereo camera ranging unit 11 requires, via the recognition disappearance detection unit 13, the ranging result selection unit 14 to select object data by the stereo camera ranging unit 11 (step S50). Accordingly, the ranging result selection unit 14 restarts selection of object data by the stereo camera ranging unit 11.
Then, the stereo camera ranging unit 11 outputs to the laser radar ranging unit 12 a stop request to stop emitting laser light in the direction of the object (step S60).
Next, an operation procedure of the laser radar ranging unit 12 will be explained.
The laser radar ranging unit 12 emits laser light periodically and detects an object (step S110).
In the case of receiving a request for emission of laser light to the object which had disappeared (step S120: YES), the laser radar ranging unit 12 detects the object by emitting laser light to the point of disappearance in addition to the periodic emission directions (step S130).
Next, the laser radar ranging unit 12 determines whether stopping of emission of laser light is required (step S140).
In the case where the stopping of emission of laser light is required (step S140: YES), the laser radar ranging unit 12 stops the emission of laser light to the point of disappearance (step S150).
As described above, the object recognition apparatus 100 according to the present embodiment, by emitting laser light in the direction in which an object existed right before the object became unrecognizable by the stereo camera, can detect, even if the horizontal resolution of the laser radar is low, an object which disappeared from the stereo camera due to a rapid change in the travelling environment.
Further, the present invention is not limited to these embodiments, but various variations and modifications may be made without departing from the scope of the present invention.
For example, in the present embodiment, an exit of a tunnel is explained as an example, but the present invention is suitably applicable to a travelling environment in which brightness rapidly changes, such as a case where headlights from an oncoming vehicle suddenly enter a camera, or a case of being suddenly exposed to the afternoon sun.
The present application is based on and claims the benefit of priority of Japanese Priority Applications No. 2014-033078 filed on Feb. 24, 2014, and No. 2015-030514 filed on Feb. 19, 2015, the entire contents of which are hereby incorporated by reference.