This application claims the benefit of Taiwan application Serial No. 106129051, filed Aug. 25, 2017, the disclosure of which is incorporated by reference herein in its entirety.
The disclosure relates in general to a detection assisting method and a detection assisting system, and more particularly to a vehicle detection assisting method and a vehicle detection assisting system.
Driving safety is an important issue in the vehicle industry. Various safety detection systems, such as Vision-Lidar Fusion, have been developed to improve driving safety. However, in the conventional technology, the whole front range is detected even when the lane is curved. A curved lane therefore often causes vehicle detection errors, so traffic safety cannot be guaranteed.
The disclosure is directed to a vehicle detection assisting method and a vehicle detection assisting system. A dynamic region of interest (dynamic ROI) is used to assist the vehicle detection, so that even if the lane is curved, whether a vehicle is driving on the lane or has left the lane can be correctly determined, improving driving safety.
According to one embodiment, a vehicle detection assisting method is provided. The vehicle detection assisting method includes the following steps: A scanning range of a lidar unit is obtained. A width of a lane is obtained. A trace of the lane is obtained. A dynamic region of interest (dynamic ROI) in the scanning range is created according to the width and the trace.
According to another embodiment, a vehicle detection assisting system is provided. The vehicle detection assisting system includes a lidar unit, a width analyzing unit, a trace analyzing unit and a processing unit. The lidar unit is used for emitting a plurality of scanning lines. The lidar unit has a scanning range. The width analyzing unit is used for obtaining a width of a lane. The trace analyzing unit is used for obtaining a trace of the lane. The processing unit is used for creating a dynamic region of interest (dynamic ROI) in the scanning range according to the width and the trace.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Please refer to
Please refer to
In the present embodiment, the width analyzing unit 120, the trace analyzing unit 130 and the processing unit 140 are used to create a dynamic region of interest (dynamic ROI) AR (shown in
The step S100 includes steps S110 to S140. In the step S110, the processing unit 140 obtains the scanning region R1 from the lidar unit 110. For example, the scanning region R1 may be a fan-shaped area having a particular scanning angle and a particular scanning radius.
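A fan-shaped scanning region of this kind can be described by an apex at the lidar unit, an angular span and a radius. The following Python sketch is illustrative only and is not taken from the disclosure; the class name ScanningRegion and the numeric values are assumptions. It shows one way to represent such a region and to test whether a reflection point lies inside it.

```python
import math

class ScanningRegion:
    """Fan-shaped scanning region of a lidar unit (illustrative only)."""

    def __init__(self, angle_min_deg, angle_max_deg, radius_m):
        self.angle_min = math.radians(angle_min_deg)
        self.angle_max = math.radians(angle_max_deg)
        self.radius = radius_m

    def contains(self, x, y):
        """Return True if point (x, y), in the lidar frame, lies in the fan."""
        r = math.hypot(x, y)
        theta = math.atan2(y, x)
        return r <= self.radius and self.angle_min <= theta <= self.angle_max

# Hypothetical example: a 120-degree fan with a 100 m radius centred on the x-axis.
region = ScanningRegion(-60.0, 60.0, 100.0)
print(region.contains(30.0, 10.0))   # True: inside the fan
print(region.contains(30.0, 90.0))   # False: outside the angular span
```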
In the step S120, the width analyzing unit 120 obtains a width WD of the lane L1. In one embodiment, the width analyzing unit 120 obtains the width WD according to an image IM captured by the image capturing unit 121. For example, the width analyzing unit 120 can first identify two marking lines in the image IM. Then, the width analyzing unit 120 calculates the width WD of the lane L1 according to the ratio of the width of the vehicle 800 to the distance between the two marking lines. Or, in another embodiment, the width analyzing unit 120 can obtain the width WD by reading map information at the location of the vehicle 800.
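One plausible reading of this ratio-based calculation is to use the known physical width of the vehicle 800 as a scale reference: the pixel distance between the two marking lines is converted to meters by comparing it with the pixel width the vehicle occupies in the same image row. The sketch below illustrates that assumed interpretation only; the function name and all pixel values are hypothetical.

```python
def estimate_lane_width(marking_px_distance, vehicle_px_width, vehicle_width_m):
    """Estimate the lane width WD in meters (illustrative interpretation).

    marking_px_distance : pixel distance between the two marking lines,
                          measured on one image row
    vehicle_px_width    : pixel width occupied by the host vehicle on the
                          same image row
    vehicle_width_m     : known physical width of the host vehicle in meters
    """
    meters_per_pixel = vehicle_width_m / vehicle_px_width
    return marking_px_distance * meters_per_pixel

# Hypothetical example: marking lines 700 px apart, vehicle 380 px and 1.9 m wide.
print(estimate_lane_width(700, 380, 1.9))  # about 3.5 m
```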
In the step S130, the trace analyzing unit 130 obtains a trace TR of the lane L1. In one embodiment, the trace analyzing unit 130 can obtain the trace TR according to a steering wheel information SW detected by the steering wheel monitoring unit 131 and a speed information SP of the vehicle 800. The vehicle 800 drives along the trace TR. As shown in
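The disclosure does not state how the steering wheel information SW and the speed information SP are combined, but a common way to predict a host vehicle's path from these two signals is a kinematic bicycle model: the steering angle and the wheelbase give a path curvature, and integrating the heading over time at the measured speed yields trace points. The sketch below is only an illustration of that general technique; the wheelbase, steering ratio, horizon and time step are assumptions.

```python
import math

def predict_trace(steering_wheel_deg, speed_mps,
                  wheelbase_m=2.7, steering_ratio=15.0,
                  horizon_s=3.0, dt=0.1):
    """Predict trace points (x, y) ahead of the vehicle from steering and speed.

    Kinematic bicycle model: the front-wheel angle is the steering wheel angle
    divided by the steering ratio, and the path curvature is
    tan(front_angle) / wheelbase.  All parameter values are illustrative.
    """
    front_angle = math.radians(steering_wheel_deg) / steering_ratio
    curvature = math.tan(front_angle) / wheelbase_m

    x, y, heading = 0.0, 0.0, 0.0
    trace = [(x, y)]
    for _ in range(int(horizon_s / dt)):
        x += speed_mps * dt * math.cos(heading)
        y += speed_mps * dt * math.sin(heading)
        heading += speed_mps * dt * curvature
        trace.append((x, y))
    return trace

# Hypothetical example: 90 degrees of steering wheel angle at 15 m/s.
points = predict_trace(90.0, 15.0)
print(points[-1])  # end of the predicted trace after 3 seconds
```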
The performing order of the steps S110, S120 and S130 is not limited to the order shown in
In the step S140, the processing unit 140 creates the dynamic ROI AR in the scanning region R1 according to the width WD and the trace TR. In one embodiment, the dynamic ROI AR is a range which extends along the trace TR with the width WD. The dynamic ROI AR is an overlapping range between the lane L1 and the scanning region R1.
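Under the description above, the dynamic ROI AR can be pictured as a corridor of width WD swept along the trace TR and clipped to the scanning region R1. The following sketch is one plausible construction of such a corridor, not the claimed implementation: each trace point (such as a point produced by the trace sketch above) is offset sideways by half the lane width, and offset points falling outside the scanning region are discarded. The callable in_scanning_region and the example geometry are hypothetical.

```python
import math

def build_dynamic_roi(trace, lane_width_m, in_scanning_region):
    """Return a closed polygon approximating the dynamic ROI (illustrative).

    trace              : list of (x, y) points of the predicted trace TR
    lane_width_m       : lane width WD
    in_scanning_region : callable (x, y) -> bool testing membership in the
                         lidar scanning region
    """
    half = lane_width_m / 2.0
    left, right = [], []
    for i in range(len(trace) - 1):
        (x0, y0), (x1, y1) = trace[i], trace[i + 1]
        heading = math.atan2(y1 - y0, x1 - x0)
        nx, ny = -math.sin(heading), math.cos(heading)   # left-pointing normal
        lp = (x0 + half * nx, y0 + half * ny)
        rp = (x0 - half * nx, y0 - half * ny)
        if in_scanning_region(*lp) and in_scanning_region(*rp):
            left.append(lp)
            right.append(rp)
    return left + right[::-1]   # left boundary forward, right boundary backward

# Hypothetical example: a straight trace, a 3.5 m lane, and a 100 m range limit.
trace = [(float(i), 0.0) for i in range(0, 50, 5)]
roi = build_dynamic_roi(trace, 3.5, lambda x, y: math.hypot(x, y) <= 100.0)
print(len(roi), "polygon vertices")
```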
As shown in
As shown in
As such, by using the dynamic ROI AR, the status of the vehicle 930 can be detected correctly even if the lane L1 is curved.
Moreover, the area of the dynamic ROI AR is smaller than the area of the scanning region R1, and detection in the region outside the dynamic ROI AR can be omitted. As such, the processing load can be reduced.
Please refer to
After the dynamic ROI AR is created in the step S140, the process proceeds to the step S200. The step S200 includes steps S210, S220. In the step S210, the distance calculating unit 210 calculates a distance amount DA of a plurality of reflection points of a plurality of scanning lines SL in the dynamic ROI AR. The distance amount DA can represent the status of an object located in the dynamic ROI AR. The lower the distance amount DA is, the larger the region in the dynamic ROI AR occupied by the object is.
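The exact formula for the distance amount DA is not given in the text available here, but the stated property (a lower DA means the object occupies more of the dynamic ROI AR) is consistent with summing, over the scanning lines whose reflection points fall inside the ROI, the distance from the lidar unit to each reflection point: a nearby vehicle shortens many scanning lines at once. The sketch below illustrates that assumed interpretation; point_in_roi and the example geometry are hypothetical.

```python
import math

def distance_amount(reflection_points, point_in_roi):
    """Sum of lidar-to-reflection-point distances inside the dynamic ROI.

    reflection_points : iterable of (x, y) reflection points in the lidar frame
    point_in_roi      : callable (x, y) -> bool testing membership in the ROI
    (Illustrative interpretation of the distance amount DA only.)
    """
    return sum(math.hypot(x, y)
               for (x, y) in reflection_points
               if point_in_roi(x, y))

# Hypothetical example: a straight corridor ROI, 3.5 m wide and 60 m long.
def in_roi(x, y):
    return 0.0 <= x <= 60.0 and abs(y) <= 1.75

empty_lane = [(60.0, y / 10.0) for y in range(-17, 18)]   # lines reach the far end
close_car  = [(12.0, y / 10.0) for y in range(-17, 18)]   # lines hit a nearby vehicle
print(distance_amount(empty_lane, in_roi) > distance_amount(close_car, in_roi))  # True
```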
In the step S220, the processing unit 140 analyzes the status of a vehicle located in front according to the distance amount DA. As shown in
As shown in
As shown in
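Because the figures referenced above are not reproduced here, the exact decision rule of the step S220 is not specified. One hedged reading is to compare DA against the value expected for an empty lane: a DA far below that value suggests a vehicle occupying the dynamic ROI AR, while a DA close to it suggests the lane ahead is clear. The following sketch only illustrates such a threshold rule; the threshold factor and the function name are assumptions.

```python
def analyze_front_status(distance_amount_da, empty_lane_da, occupancy_factor=0.7):
    """Classify the lane ahead from the distance amount DA (illustrative rule).

    empty_lane_da    : DA expected when no object is inside the dynamic ROI
    occupancy_factor : fraction of empty_lane_da below which a vehicle is
                       assumed to occupy the ROI (assumed value)
    """
    if distance_amount_da < occupancy_factor * empty_lane_da:
        return "vehicle in lane"
    return "lane clear"

print(analyze_front_status(420.0, 2100.0))   # 'vehicle in lane'
print(analyze_front_status(2050.0, 2100.0))  # 'lane clear'
```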
Please refer to
The step S300 includes steps S310 and S320. In step S310, the curve analyzing unit 310 obtains a reflection point curve RC of the scanning lines SL (shown in
In step S320, the processing unit 140 analyzes the status of the vehicle 970 according to the reflection point curve RC. As shown in
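The shape analysis of the reflection point curve RC is likewise not spelled out in the text available here. As one hedged illustration, the rear face of a vehicle typically appears in a lidar scan as a short run of reflection points; checking what fraction of those points lies inside the dynamic ROI AR gives a simple cue for whether the vehicle 970 is inside the lane or leaving it. The helper names, thresholds and example geometry below are assumptions, not the claimed method.

```python
def analyze_reflection_curve(points, point_in_roi, enter_ratio=0.5):
    """Classify a detected rear face from its reflection point curve (illustrative).

    points       : list of (x, y) reflection points forming the curve RC
    point_in_roi : callable (x, y) -> bool for the dynamic ROI
    enter_ratio  : fraction of points inside the ROI above which the vehicle
                   is treated as being in the lane (assumed value)
    """
    if not points:
        return "no object"
    inside = sum(1 for p in points if point_in_roi(*p))
    if inside / len(points) >= enter_ratio:
        return "vehicle entering or in lane"
    return "vehicle outside or leaving lane"

# Hypothetical example: a rear face straddling the right ROI boundary at y = 1.75 m.
def in_roi(x, y):
    return 0.0 <= x <= 60.0 and abs(y) <= 1.75

rear_face = [(20.0, 1.0 + 0.2 * i) for i in range(10)]   # y from 1.0 to 2.8 m
print(analyze_reflection_curve(rear_face, in_roi))        # 'vehicle outside or leaving lane'
```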
According to the embodiments described above, the dynamic ROI is used to assist the detection of the vehicle, such that even if the lane is curved, whether a vehicle is driving on the lane or has left the lane can be correctly determined, improving driving safety.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.
Foreign Application Priority Data
Number        Date        Country   Kind
106129051 A   Aug. 2017   TW        national
Publication
Number           Date        Country
20190064322 A1   Feb. 2019   US