This application is based on Japanese Patent Application No. 2015-246687 filed on Dec. 17, 2015, the contents of which are incorporated herein by reference.
The present disclosure relates to an object detection device and an object detection method for detecting an object present around the own vehicle.
According to a technique described in PTL 1, in order to increase accuracy in detection of an object present around the own vehicle, the object is detected individually by using a radar and a camera. An object present around the own vehicle is detected on the condition that a positional relationship between an object detected by the radar and an object detected by the camera satisfies a predetermined determination criterion and it is determined that the object detected by the radar and the object detected by the camera are the same object.
During the detection, when an object detection range of the radar does not agree with an object detection range of the camera, the object may be outside one of the detection ranges. In such a case, the reliability of the result of detection of the object decreases, and thus control such as collision avoidance with respect to the object needs to be limited.
However, depending on the type of the object, the object may approach the own vehicle even when accuracy in detection of the object decreases, and thus collision avoidance control may still be necessary.
A main object of the present disclosure is to provide an object detection device and an object detection method which are capable of appropriately performing collision avoidance control with respect to an object detected by using a plurality of detection sections having different detection ranges.
A first aspect of the present disclosure includes: a first target detection section for detecting, as a first target, an object included in a first detection range ahead of the own vehicle; a second target detection section for detecting, as a second target, an object included in the first detection range and in a second detection range laterally narrower than the first detection range; a selection section for selecting, as a target object to be subjected to collision avoidance control, the object on the condition that the first target and the second target are the same object, in a first state where a detection position of the object is inside the first detection range and inside the second detection range; a state determination section for determining whether a detection position of the target object has transitioned from the first state to a second state where the detection position of the object is outside the second detection range and inside the first detection range; a moving object determination section for determining whether the target object is a predetermined moving object; and a continuation determination section for continuing selection as the target object on the condition that the target object is the predetermined moving object, when the detection position of the target object has transitioned to the second state.
According to the present disclosure, when the second detection range of the second target detection section is narrower in the vehicle width direction than the first detection range of the first target detection section, the detection position of the target object may transition from the first state where the detection position of the target object is inside the first detection range and inside the second detection range to the second state where the detection position of the target object is outside the second detection range and inside the first detection range. When the detection position of the target object has transitioned to the second state, reliability of a result of detection of the target object decreases, and thus collision avoidance control with respect to the target object needs to be limited.
However, depending on the type of the target object, limiting collision avoidance control with respect to the target object may cause a problem. Thus, when the detection position of the target object has transitioned to the second state, the state of the object being selected as a target object is maintained on the condition that the target object is the predetermined moving object. In other words, when the detection position of the target object has transitioned to the second state, the selection of the object as the target object is not cancelled but is maintained on the condition that the target object is the predetermined moving object. Accordingly, even if the reliability of the detection position of the object decreases, collision avoidance control will be appropriately performed with respect to the predetermined moving object that has once been determined as a target object.
The object described above and other objects, features, and advantageous effects of the present disclosure are clarified by the detailed description below with reference to the accompanying drawings. In the accompanying drawings:
Some embodiments will be described hereinafter with reference to the drawings. The components identical with or similar to those in the embodiments described below are given the same reference signs for the sake of omitting unnecessary explanation.
An object detection device 10 according to the present embodiment is mounted in the own vehicle and detects an object present around the own vehicle.
Information on the object detected by the object detection device 10 is used for various types of vehicle control such as avoidance of a collision with the own vehicle. For example, the information is used for various types of vehicle collision avoidance control such as a pre-crash safety system (PCS).
In
The radar 21 transmits and receives directional electromagnetic waves such as millimeter waves or a laser. The radar 21 then outputs transmitted/received data to the object detection device 10 as a radar signal, the data including the distance to the object that has reflected the electromagnetic waves, the azimuth of the object, the speed of the object relative to the own vehicle, and the like.
For example, as shown in
The image sensor 22 includes an imaging element such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) and captures an image of an area around the own vehicle M at a predetermined angle of view. The image sensor 22 then outputs the captured image to the object detection device 10 as an image signal.
For example, as shown in
As shown in
The radar target detection section 11 generates a radar target (corresponding to a first target) by using position information of a target specified based on the radar signal outputted from the radar 21. The radar target detection section 11 then outputs the radar target to the object detection section 13.
The image target detection section 12 recognizes, as an image target (corresponding to a second target), an object detected by analyzing the captured image indicated by the image signal outputted from the image sensor 22. The image target detection section 12 includes an image processing section 12A for identifying a type of the object by image processing of a captured image of an area ahead of the own vehicle M. The image target detection section 12 recognizes the image target, for example, by a matching process using a target model registered in advance. The target model is prepared for each type of image target, so that the type of the image target is identified. Examples of the type of image target include a moving object such as a four-wheeled vehicle, a two-wheeled vehicle, or a pedestrian and a stationary object such as a guardrail. Note that the two-wheeled vehicle here includes a bicycle, a saddle riding type motorcycle, and the like. The image target detection section 12 then outputs, to the object detection section 13, information such as the extracted type of the image target, the distance between the own vehicle M and the image target, the azimuth of the image target with respect to the own vehicle M, the relative speed between the own vehicle M and the image target, and lateral width of the image target.
The object detection section 13 determines whether the radar target outputted from the radar target detection section 11 and the image target outputted from the image target detection section 12 have been generated from the same object. For example, the object detection section 13 sets a predetermined image search range in the captured image by using position information of the target identified as the radar target. The image search range is set as a range for which an error in a detection position of the image target is taken into account. When the image target is included in the image search range, the object detection section 13 determines that the radar target and the image target have been generated from the same object. When the object detection section 13 determines that the radar target and the image target have been generated from the same object, a selection section 13A of the object detection section 13 selects the object as a control target object (target object) to be subjected to collision avoidance control. The selection section 13A then transmits information on the target object to the driving assistance device 30.
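The same-object determination by the object detection section 13 can be sketched as follows. The rectangular search margins `dx_margin` and `dy_margin`, the `Target` type, and the function name are illustrative assumptions for this sketch and are not values or names from the disclosure, which only requires that the image target fall within a search range accounting for detection-position error.

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float  # longitudinal distance from the own vehicle [m]
    y: float  # lateral offset from the vehicle axis [m]

def is_same_object(radar_t: Target, image_t: Target,
                   dx_margin: float = 2.0, dy_margin: float = 1.0) -> bool:
    """Return True when the image target lies inside a search range
    set around the radar target position (margins are illustrative)."""
    return (abs(radar_t.x - image_t.x) <= dx_margin and
            abs(radar_t.y - image_t.y) <= dy_margin)
```

In practice the margins would grow with distance, since both the radar azimuth error and the image ranging error scale with range; fixed margins keep the sketch minimal.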
The driving assistance device 30 includes a vehicle control electronic control unit (ECU) 31 for performing collision avoidance control and vehicle-mounted devices 32 such as a loudspeaker outputting an alarm sound and a guide sound, a seat belt tightening device, a brake device, and a steering device. The vehicle control ECU 31 determines whether driving assistance is necessary with respect to the target object detected by the object detection section 13. When the vehicle control ECU 31 determines that driving assistance is necessary, the vehicle control ECU 31 activates each of the vehicle-mounted devices 32.
For example, the vehicle control ECU 31 calculates time to collision (TTC) between the target object and the own vehicle M. The time to collision (TTC) is an evaluation value indicating the remaining number of seconds before the own vehicle collides with the target object when the own vehicle continues to travel at the current speed. As the time to collision (TTC) is decreased, the own vehicle is more likely to collide with the target object, and as the time to collision (TTC) is increased, the own vehicle is less likely to collide with the target object. The time to collision (TTC) can be calculated, for example, by dividing a distance between the target object and the own vehicle M in a direction of travel by a relative speed to the target object. The relative speed to the target object is obtained by subtracting a vehicle speed of the own vehicle from a vehicle speed of a preceding vehicle. The time to collision (TTC) can also be calculated by taking into account relative acceleration.
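The TTC calculation described above can be sketched as below. The sign convention (closing speed positive) and the constant-acceleration refinement, which solves d = v·t + a·t²/2 for the smallest positive t, are illustrative assumptions; the disclosure only states that TTC is distance divided by relative speed and may account for relative acceleration.

```python
def time_to_collision(distance: float, rel_speed: float,
                      rel_accel: float = 0.0) -> float:
    """TTC [s]: distance [m] divided by closing speed [m/s].
    rel_speed is taken positive when the gap is closing. With a
    non-zero relative acceleration, solve d = v*t + a*t^2/2 for t."""
    if rel_accel == 0.0:
        if rel_speed <= 0.0:
            return float('inf')  # gap not closing: no collision predicted
        return distance / rel_speed
    # quadratic 0.5*a*t^2 + v*t - d = 0; take the smallest positive root
    disc = rel_speed ** 2 + 2.0 * rel_accel * distance
    if disc < 0.0:
        return float('inf')  # closing speed dies out before impact
    t = (-rel_speed + disc ** 0.5) / rel_accel
    return t if t > 0.0 else float('inf')
```

For example, a target 20 m ahead closing at 10 m/s gives a TTC of 2 s; if the gap is additionally closing faster by 2 m/s² the TTC shortens accordingly.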
When the time to collision (TTC) is equal to or less than an activation time of each of the vehicle-mounted devices 32, the vehicle control ECU 31 activates the corresponding one of the vehicle-mounted devices 32. The activation time of each of the vehicle-mounted devices 32 is set according to a type of the object.
When the detection range (first detection range) θ1 of the radar target does not agree with the detection range (second detection range) θ2 of the image target, a detection position of the object may transition from the state (corresponding to a first state) where the detection position of the object is included in both (the overlapping part) of the detection range θ1 of the radar target and the detection range θ2 of the image target to the state (corresponding to a second state) where the detection position of the object is included in only one of the detection ranges θ1 and θ2.
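The first-state/second-state classification above can be sketched geometrically as follows, modeling each detection range as a fan centred on the vehicle axis. The half-angle values and the function name are illustrative assumptions, not figures from the disclosure.

```python
import math

def detection_state(x: float, y: float,
                    theta1_deg: float = 45.0,
                    theta2_deg: float = 20.0) -> str:
    """Classify a detection position (x forward, y lateral, sensor at
    the origin) against two fan-shaped ranges on the vehicle axis.
    theta1/theta2 are illustrative half-angles of the radar and image
    detection ranges (theta2 < theta1, as in the disclosure)."""
    if x <= 0.0:
        return "outside"          # behind the sensor
    bearing = abs(math.degrees(math.atan2(y, x)))
    if bearing <= theta2_deg:
        return "first"            # inside both ranges (overlap)
    if bearing <= theta1_deg:
        return "second"           # inside radar range only
    return "outside"
```

A target drifting laterally at constant forward distance thus passes from "first" to "second" as its bearing exceeds θ2, which is exactly the transition the state determination section must detect.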
For example, in
More specifically, in
When the object B is detected at the position A5 in the region D2, the object is detected only by the radar 21, and thus the object is not selected as the target object by the selection section 13A of the object detection section 13.
However, depending on the type of the object B, even when accuracy in detection of the object decreases, the object may approach the own vehicle M, and thus collision avoidance control with respect to the object (moving object) may be necessary. That is, when the type of the object is a two-wheeled vehicle, the two-wheeled vehicle may cross the own vehicle lane at a high movement speed (lateral speed). Accordingly, when the object is a two-wheeled vehicle, it is preferred that the object B continues to be selected as the target object even in a situation where the object B is detected only by the radar 21.
Thus, the moving object determination section 14 determines whether the target object is a two-wheeled vehicle. The moving object determination section 14 can determine whether the target object is a two-wheeled vehicle based on a signal provided from the image processing section 12A of the image target detection section 12. If the type of the target object is a two-wheeled vehicle, a continuation determination section 13C of the object detection section 13 continues to select the two-wheeled vehicle as the target object, even when the target object is detected only by the radar 21.
During the determination, even when the type of the target object is a two-wheeled vehicle, if the two-wheeled vehicle is standing still (that is, if the movement speed is zero), the two-wheeled vehicle is less likely to approach the own vehicle M. Furthermore, even when the two-wheeled vehicle is moving at a certain speed, if the two-wheeled vehicle is moving away from the own vehicle M in a direction orthogonal to the direction of travel of the own vehicle M, the two-wheeled vehicle and the own vehicle M are less likely to collide with each other.
Thus, when the target object is a two-wheeled vehicle, the continuation determination section 13C of the object detection section 13 of the present embodiment continues to select the two-wheeled vehicle as the target object on the condition that the movement speed (lateral speed) of the two-wheeled vehicle in the direction orthogonal to the direction of travel of the own vehicle M is higher than a predetermined value and that the two-wheeled vehicle is moving closer to the own vehicle M. The movement speed of the two-wheeled vehicle is obtained based on a relative speed to the own vehicle M. A direction in which the two-wheeled vehicle is moving is obtained, for example, based on a vector component in a lateral direction obtained by resolving the movement speed of the two-wheeled vehicle into a vector component in a longitudinal direction and the vector component in the lateral direction.
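The lateral-speed and approach conditions just described can be sketched as below, resolving the relative velocity into longitudinal and lateral components as the disclosure describes. The sign convention (positive y to the left) and the threshold `v_th` are illustrative assumptions for this sketch.

```python
def continue_as_target(vx: float, vy: float, y: float,
                       v_th: float = 1.0) -> bool:
    """vx/vy: relative velocity of the two-wheeled vehicle resolved into
    longitudinal/lateral components [m/s]; y: its lateral offset from
    the own-vehicle path (positive left). Selection as the target object
    continues when the lateral speed exceeds v_th AND the lateral motion
    is directed toward the own-vehicle path (sign of vy opposite to the
    sign of y). v_th is an assumed predetermined value."""
    approaching = (y > 0 and vy < 0) or (y < 0 and vy > 0)
    return abs(vy) > v_th and approaching
```

A two-wheeled vehicle 3 m to the left moving 2 m/s toward the lane thus stays selected, while one moving away at the same speed does not, matching the collision-likelihood reasoning above.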
Furthermore, as more time elapses after the transition to the state where the object selected as the control target object is detected in the region D2, that is, the state where the object is detected only by the radar 21, the reliability of the result of detection of the object decreases. As the reliability decreases, unnecessary activation with respect to the object becomes more likely. Thus, the continuation determination section 13C of the object detection section 13 of the present embodiment continues to select the object as the target object on the condition that the time that has elapsed after the transition to the state where the target object is detected only by the radar 21 is short.
With reference to a flow chart shown
First, it is determined whether there is a radar target (S11). When a negative determination is made in S11, the process terminates. When it is determined in S11 that there is a radar target, it is determined whether there is an image target (S12). When there is an image target, it is determined whether the radar target detected in S11 and the image target detected in S12 are the same object (S13). When it is determined in S13 that the radar target and the image target are not the same object, the process terminates. When it is determined in S13 that the radar target and the image target are the same object, a control target flag is turned ON (S14). When the control target flag is ON, the object is selected as a target object for the driving assistance device 30, and it becomes possible to perform control such as collision avoidance with respect to the object. Specifically, information indicating that the control target flag is ON is outputted from the object detection device 10 to the driving assistance device 30. Then, based on the time to collision (TTC) at each time, the driving assistance device 30 performs collision avoidance control by at least one of an alarm, brake control, steering control, and the like.
When a negative determination is made in S12, it is determined whether the control target flag is ON (S15). Specifically, in S15, the state determination section 13B determines whether a detection position of the object has transitioned from the state where the detection position of the object is included in both of the detection range θ1 of the radar target and the detection range θ2 of the image target to the state where the detection position of the object is outside the detection range θ2 of the image target and inside the detection range θ1 of the radar target. When an affirmative determination is made in S15, it is determined whether a type of the object is a two-wheeled vehicle (S16). Determination in this process is made based on a type of the image target identified by the matching process using the target model. When it is determined in S16 that the type of the object is a two-wheeled vehicle (YES in S16), it is determined whether a lateral speed of the object is higher than a predetermined value Vth (S17).
When the lateral speed of the object is higher than the predetermined value Vth (YES in S17), it is determined whether the object is in an approaching state where the object is approaching the own vehicle M (S18). When an affirmative determination is made in S18, it is determined whether time that has elapsed after the transition to the state where only the radar target is detected for the target object is less than a predetermined value Tth (S19).
When an affirmative determination is made in S19, control proceeds to S14. In this case, the state where the control target flag is ON is continued, and thus driving assistance with respect to the object can be provided even in a situation where only the radar target is detected (situation where no image target is detected) for the target object. When a negative determination is made in S16 to S19, the control target flag is turned OFF (S20).
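One pass of the S11–S20 decision described above can be sketched as a single function. The return value `None` for "the process terminates" and the threshold defaults are illustrative assumptions; the flow chart itself does not state what happens to the flag on those paths.

```python
def update_control_target_flag(has_radar: bool, has_image: bool,
                               same_object: bool, flag_on: bool,
                               is_two_wheeler: bool, lateral_speed: float,
                               approaching: bool, elapsed: float,
                               v_th: float = 1.0, t_th: float = 1.0):
    """One pass of the S11-S20 flow chart. Returns the new control
    target flag, or None where the process simply terminates without
    changing it. v_th [m/s] and t_th [s] are assumed thresholds."""
    if not has_radar:                          # S11: no radar target
        return None
    if has_image:                              # S12: image target exists
        return True if same_object else None   # S13 -> S14, else terminate
    if not flag_on:                            # S15: flag was not ON
        return False
    if (is_two_wheeler and                     # S16
            lateral_speed > v_th and           # S17
            approaching and                    # S18
            elapsed < t_th):                   # S19
        return True                            # S14: flag kept ON
    return False                               # S20: flag turned OFF
```

The key behavior is the second half: once the image target is lost, the flag survives only while all four continuation conditions (S16–S19) hold.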
With reference to
(a) Case where Two-Wheeled Vehicle is Moving Closer to Own Vehicle M at Lateral Speed Higher than Predetermined Value Vth
In
(b) Case where Two-Wheeled Vehicle is Standing Still
In
The above configuration provides the following advantageous effects.
When the detection range θ2 of the image sensor 22 is narrower in the vehicle width direction than the detection range θ1 of the radar 21, the detection position of the target object may transition from the first state where the detection position of the target object is inside the detection range θ1 and inside the detection range θ2 to the second state where the detection position of the target object is outside the detection range θ2 and inside the detection range θ1. When the detection position of the target object has transitioned to the second state, the reliability of the result of detection of the target object decreases, and thus the collision avoidance control with respect to the target object needs to be limited.
However, when the type of the target object is a two-wheeled vehicle, limiting collision avoidance control with respect to the two-wheeled vehicle may cause a problem. Thus, when the detection position of the target object has transitioned to the second state, the object detection section 13 continues to select the two-wheeled vehicle as the target object on the condition that the target object is a two-wheeled vehicle. Accordingly, even if the reliability of the detection position of the object decreases, collision avoidance control with respect to the two-wheeled vehicle will be appropriately performed.
When the movement speed of the target object in the direction orthogonal to the direction of travel of the own vehicle M is higher than the predetermined value and the target object is moving closer to the own vehicle M, the selection as the target object is continued. Accordingly, even if the reliability of the detection position of the target object decreases, collision avoidance control will be appropriately performed with respect to the target object when the target object and the own vehicle M are highly likely to collide with each other.
When the target object is moving away from the own vehicle M in the lateral direction orthogonal to the direction of travel of the own vehicle M, the own vehicle M and the target object are less likely to collide with each other. In such a case, therefore, the selection as the target object is cancelled. Accordingly, unwanted activation of the vehicle-mounted devices can be avoided when the reliability of the detection position of the target object decreases, and when the own vehicle M and the target object are less likely to collide with each other.
The selection as the target object is continued on the condition that the time that has elapsed after the target object has transitioned to the second state is less than the predetermined value. In this case, unwanted activation of the vehicle-mounted devices with respect to the object can be avoided when the reliability of the result of detection of the object has remained low for a long time.
The aforementioned embodiment may be altered, for example, as below. In the following description, components similar to the aforementioned components are given the same reference signs as in the drawings, and detailed descriptions of such components are omitted. The aforementioned embodiment and the embodiments described below can be implemented in combination.
According to the aforementioned embodiment, when the control target flag is turned ON by the object detection device 10, the flag information is outputted from the object detection device 10 to the driving assistance device 30, and the driving assistance device 30 performs collision avoidance control. However, the configuration can be altered such that when the control target flag is turned ON, the object detection device 10 determines whether to perform collision avoidance control.
For example, the object detection device 10 performs a collision avoidance process shown in
In the flow chart shown in
In this regard, when the two-wheeled vehicle is traveling ahead of the own vehicle M with the wheels W arranged longitudinally with respect to the own vehicle M as shown in
As described above, in the case where the selection as the target object is continued on the condition that the two-wheeled vehicle serving as the target object is located ahead of the own vehicle M and that the direction of travel of the two-wheeled vehicle is oriented toward the own vehicle lane, the selection of the object as the control target object is continued, regardless of the movement speed of the two-wheeled vehicle, when the two-wheeled vehicle is highly likely to approach the own vehicle M. That is, even if the two-wheeled vehicle is standing still, the selection as the target object is continued when the two-wheeled vehicle is oriented perpendicular to the own vehicle M and is thus highly likely to approach the own vehicle M (that is, to cross the own vehicle lane).
In this regard, a pedestrian may also cross the own vehicle lane and approach the own vehicle M. Thus, also when the moving object determination section 14 determines that the type of the object is a pedestrian, the selection as the target object can be continued. When the target object is a pedestrian and a lateral speed of the pedestrian is low, a collision risk with the own vehicle M is low. Thus, it is preferred that the continuation determination section 13C of the object detection section 13 continues to select the pedestrian as the target object on the condition that the lateral speed of the pedestrian is higher than a predetermined value.
In the flow chart shown in
The aforementioned embodiment shows the example where an object is detected by using the radar 21 and the image sensor 22. Besides this example, the above process is applicable to the case where an object present around the own vehicle M is detected by using a plurality of detection sections having different detection ranges.
The aforementioned embodiment shows the example where the image processing section 12A is provided in the image target detection section 12. The configuration is not limited to this and the image processing section 12A can be provided also in the radar target detection section 11. That is, the image processing section 12A is only necessary to be provided in at least one of the radar target detection section 11 and the image target detection section 12.
The radar 21 of the present embodiment corresponds to a first target detection section. The image sensor 22 of the present embodiment corresponds to a second target detection section. In
The present disclosure is described with reference to the examples, but it will be understood that the present disclosure is not limited to those examples or configurations. The present disclosure encompasses various modified examples and variations within an equivalent scope. In addition, the category or scope of the idea of the present disclosure encompasses various combinations or forms, as well as other combinations or forms including only one element, more elements, or fewer elements thereof.
Number | Date | Country | Kind |
---|---|---|---|
2015-246687 | Dec 2015 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2016/085518 | 11/30/2016 | WO | 00 |