The present application generally relates to obstacle detection for vehicles using planar sensor data.
In accordance with one or more embodiments, a computer-implemented method is provided for automatically detecting an obstacle from a moving vehicle using a planar sensor mounted on the vehicle. The method includes the steps of: (a) receiving data at a computer system from the sensor, said data comprising a series of points, defined in a planar coordinate system relative to the sensor, at which the sensor detected a return during a frame, and receiving data at the computer system indicating the vehicle's geo-referenced location and orientation; (b) identifying, by the computer system, an obstacle candidate profile from the series of points; (c) transforming, by the computer system, the obstacle candidate profile from a sensor-relative representation to a geo-referenced representation; (d) repeating steps (a) to (c) for a given number of frames; and (e) detecting, by the computer system, an obstacle when the obstacle candidate profiles in the geo-referenced representation for each of the given number of frames substantially coincide.
In accordance with one or more further embodiments, a computer system is provided, comprising at least one processor, memory associated with the at least one processor, and a program supported in the memory for automatically detecting an obstacle from a moving vehicle using a planar sensor mounted on the vehicle. The program has a plurality of instructions stored therein which, when executed by the at least one processor, cause the at least one processor to: (a) receive data from the sensor, said data comprising a series of points, defined in a planar coordinate system relative to the sensor, at which the sensor detected a return during a frame, and receive data indicating the vehicle's geo-referenced location and orientation; (b) identify an obstacle candidate profile from the series of points; (c) transform the obstacle candidate profile from a sensor-relative representation to a geo-referenced representation; (d) repeat (a) to (c) for a given number of frames; and (e) detect an obstacle when the obstacle candidate profiles in the geo-referenced representation for each of the given number of frames substantially coincide.
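Steps (a) through (e) can be sketched in code. The sketch below is illustrative only: the names (`Pose`, `to_geo`, `profiles_coincide`, `detect_obstacle`), the centroid-based coincidence test, and the 0.5 m tolerance are assumptions for the example, not taken from the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # geo-referenced easting (m)  -- assumed convention
    y: float        # geo-referenced northing (m)
    heading: float  # radians, counter-clockwise from the x-axis

def to_geo(points, pose):
    """Step (c): transform sensor-relative planar points (px, py) into
    geo-referenced coordinates using the vehicle's location and heading."""
    c, s = math.cos(pose.heading), math.sin(pose.heading)
    return [(pose.x + c * px - s * py, pose.y + s * px + c * py)
            for px, py in points]

def profiles_coincide(profiles, tol=0.5):
    """Step (e), simplified: the candidate is treated as stationary if its
    geo-referenced centroid stays within `tol` metres across frames."""
    cents = [(sum(x for x, _ in p) / len(p), sum(y for _, y in p) / len(p))
             for p in profiles if p]
    if len(cents) < len(profiles):
        return False
    return all(math.dist(cents[0], c) <= tol for c in cents[1:])

def detect_obstacle(frames, poses, n_frames=3):
    """Steps (a)-(d): per-frame candidate profiles are geo-referenced and
    tested for coincidence over the last `n_frames` frames."""
    geo_profiles = [to_geo(f, p) for f, p in zip(frames, poses)]
    recent = geo_profiles[-n_frames:]
    return len(recent) == n_frames and profiles_coincide(recent)
```

Because the test runs in geo-referenced coordinates, a stationary target yields overlapping profiles even as the vehicle moves and turns, while spurious returns drift between frames and are rejected.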
Various embodiments disclosed herein are directed to an obstacle detection system having various possible uses including, e.g., in off-road vehicles. The obstacle detection system uses sensors having a primarily planar field of view, such as LIDAR (Light Detection and Ranging, also rendered Laser Imaging Detection and Ranging) or RADAR (Radio Detection and Ranging), to detect stationary obstacles in the path of a vehicle, and suppresses false detection of obstacles caused by sensor returns from terrain that is otherwise safe to traverse. The obstacle detection system interoperates with other vehicle control modules in order to avoid collisions with obstacles by either steering around them or halting the vehicle before reaching them.
In the illustrative embodiments disclosed herein, the obstacle detection system is implemented in a driverless vehicle operating in off-road conditions that is designed to avoid collisions with stationary objects by analyzing data collected via an onboard LIDAR or RADAR. However, it should be understood that the system could have various other implementations including, e.g., in a manned vehicle with an automated collision warning or avoidance system. It could also be implemented in vehicles operating on paved or other surfaces where the terrain in the operational area is sufficiently uneven to present challenges in distinguishing sensor returns caused by ground features from those caused by legitimate obstacles. In addition, the system can use a variety of sensor types other than LIDAR or RADAR whose field of view lies primarily in a horizontal plane, with a relatively narrow vertical field of view, including, e.g., SONAR (Sound Navigation and Ranging).
As shown in
An obstacle detection system and process in accordance with one or more embodiments takes advantage of changes in the vehicle's pitch and roll over time in order to collect information about how the sensor data varies in the vertical direction. In this way, detected targets with a mostly vertical orientation will be classified as obstacles, whereas detections with a significantly lower angle of incidence relative to the sensor plane can be rejected as likely ground clutter.
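The idea can be illustrated with a simple vertical-plane model: as the vehicle pitches, each frame's return falls at a slightly different height, and fitting a line through the resulting (distance, height) samples estimates the target surface's inclination. Everything here is an assumption for the example: the sensor-height model, the least-squares fit, and the 45-degree decision threshold are not specified by the disclosure.

```python
import math

def target_elevation_points(ranges_and_pitches, sensor_height):
    """For each (range, pitch) sample, place the return in a vertical plane:
    horizontal distance x and height z above flat ground. Positive pitch is
    assumed to tilt the sensor beam downward."""
    pts = []
    for r, pitch in ranges_and_pitches:
        x = r * math.cos(pitch)
        z = sensor_height - r * math.sin(pitch)
        pts.append((x, z))
    return pts

def incidence_angle(points):
    """Least-squares slope of z versus x, returned as the fitted surface's
    angle above horizontal, in degrees."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    mz = sum(z for _, z in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxz = sum((x - mx) * (z - mz) for x, z in points)
    if sxx < 1e-9:  # no horizontal spread: the target is effectively vertical
        return 90.0
    return abs(math.degrees(math.atan2(sxz, sxx)))

def is_obstacle(ranges_and_pitches, sensor_height=1.0, min_angle_deg=45.0):
    """A steep fitted surface suggests a real obstacle; a shallow one
    suggests traversable ground clutter."""
    pts = target_elevation_points(ranges_and_pitches, sensor_height)
    return incidence_angle(pts) >= min_angle_deg
```

A vertical wall returns nearly the same horizontal distance at every pitch (steep fit), whereas flat ground returns slide outward and inward as the pitch changes (shallow fit), matching the classification described above.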
An obstacle detection process in accordance with one or more embodiments is run once per sensor update time interval, using the inputs specified below. (The sensor update time interval corresponds to one sweep of the sensor, or frame, and can be, e.g., 1/30th of a second.)
As output, the process produces a set of obstacle profiles that have been determined to represent likely obstacles to be avoided.
The sensitivity, accuracy, and run-time complexity of the process can be configured via the following parameters:
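The disclosure's actual parameter list is not reproduced in this excerpt; the configuration sketch below is a hypothetical set of tuning knobs of the kind such a process typically exposes, with every field name and default value an assumption for illustration.

```python
from dataclasses import dataclass

@dataclass
class DetectionParams:
    """Illustrative (hypothetical) tuning parameters trading sensitivity
    and accuracy against run-time complexity."""
    frames_required: int = 3         # frames over which profiles must coincide
    coincidence_tol_m: float = 0.5   # max geo-referenced drift between frames
    min_profile_points: int = 4      # returns needed to form a candidate profile
    max_range_m: float = 50.0        # ignore returns beyond this range
    cluster_gap_m: float = 1.0       # split profiles where adjacent returns
                                     # are farther apart than this
```

Raising `frames_required` or lowering `coincidence_tol_m` would make detection more conservative at the cost of latency; `max_range_m` and `min_profile_points` bound per-frame work.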
A representative obstacle detection process in accordance with one or more embodiments is as follows and as illustrated in the flow chart of
As an example, consider the obstacle detection process illustrated in
As an example of ground clutter rejection, consider the example in
The processes of the obstacle detection system described above may be implemented in software, hardware, firmware, or any combination thereof. The processes are preferably implemented in one or more computer programs executing on a programmable computer system including a processor, a storage medium readable by the processor (including, e.g., volatile and non-volatile memory and/or storage elements), and input and output devices. Each computer program can be a set of instructions (program code) in a code module resident in the random access memory of the computer system. Until required by the computer system, the set of instructions may be stored in another computer memory (e.g., in a hard disk drive, or in a removable memory such as an optical disk, external hard drive, memory card, or flash drive) or stored on another computer system and downloaded via the Internet or other network.
Having thus described several illustrative embodiments, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to form a part of this disclosure, and are intended to be within the spirit and scope of this disclosure. While some examples presented herein involve specific combinations of functions or structural elements, it should be understood that those functions and elements may be combined in other ways according to the present disclosure to accomplish the same or different objectives. In particular, acts, elements, and features discussed in connection with one embodiment are not intended to be excluded from similar or other roles in other embodiments.
Additionally, elements and components described herein may be further divided into additional components or joined together to form fewer components for performing the same functions. For example, the obstacle detection system may comprise one or more physical devices or machines, or virtual machines running on one or more physical machines. In addition, the obstacle detection system may comprise a cluster of computers or numerous distributed computers that are connected by the Internet or another network.
Accordingly, the foregoing description and attached drawings are by way of example only, and are not intended to be limiting.
U.S. Patent Documents

| Number | Name | Date | Kind |
|---|---|---|---|
| 6055042 | Sarangapani | Apr 2000 | A |
| 6130639 | Agnesina et al. | Oct 2000 | A |
| 6515614 | Sakai et al. | Feb 2003 | B2 |
| 8139109 | Schmiedel et al. | Mar 2012 | B2 |
| 8605998 | Samples et al. | Dec 2013 | B2 |
| 8781643 | Naka | Jul 2014 | B2 |
| 20020013675 | Knoll et al. | Jan 2002 | A1 |
| 20050165550 | Okada | Jul 2005 | A1 |
| 20080243334 | Bujak et al. | Oct 2008 | A1 |
| 20100020074 | Taborowski et al. | Jan 2010 | A1 |
| 20100026555 | Whittaker et al. | Feb 2010 | A1 |
| 20100030473 | Au et al. | Feb 2010 | A1 |
| 20100114416 | Au et al. | May 2010 | A1 |
| 20100305857 | Byrne et al. | Dec 2010 | A1 |
| 20110262009 | Duan et al. | Oct 2011 | A1 |
| 20120095619 | Pack et al. | Apr 2012 | A1 |
| 20130070095 | Yankun et al. | Mar 2013 | A1 |
Foreign Patent Documents

| Number | Date | Country |
|---|---|---|
| 2008-076390 | Apr 2008 | JP |
Other References

| Entry |
|---|
| Fulcher, Advances in Applied Artificial Intelligence, Mar. 31, 2006, Google eBook. |
| Lages et al., Laserscanner Innovations for Detection of Obstacles and Road, 2003, VDI-Buch. |
| Nalos, LiDAR and Coordinate Systems: Journey to the Center of the Earth, Mar. 30, 2011, http://blog.safe.com/2011/03/lidar-and-coordinate-systems-journey-to-the-center-of-the-earth/. |
| International Search Report and Written Opinion for PCT/US2014/038962, dated Nov. 11, 2014. |
Prior Publication Data

| Number | Date | Country |
|---|---|---|
| 20140350835 A1 | Nov 2014 | US |