The present invention is directed to various embodiments of a Road Departure Sensing System and an Intelligent Driving System which can collect a continuous sequence of images from an area ahead of a moving vehicle, measure the distance to potential obstacles, calculate the potential for collisions, and warn the operator in advance of the vehicle departing the drivable surface or communicate with an unmanned-vehicle navigation system.
In the theatres of war in Iraq and Afghanistan, extremists have used weapons of opportunity such as roadside bombs and improvised explosive devices (IEDs) to wage war. One response to mitigate the effects of roadside bombs and IEDs is to raise the hull of military vehicles to increase ground clearance. Raising a vehicle raises its center of gravity, which increases the likelihood of “tripping” rollovers. A tripping rollover can occur when the outside wheels of a vehicle strike a curb, enter a soft shoulder, or encounter a change in grade. The center of gravity moves beyond these outer wheels, the vehicle is said to “trip,” and a rollover commences. High center of gravity vehicles, such as the Mine Resistant Ambush Protected (MRAP) vehicle and the Joint Light Tactical Vehicle (JLTV), are prone to tripping in this manner. Tripping can also occur with mining or farm vehicles that operate in rural or “off-road” conditions on unpaved or soft-shoulder paths or trails. A “tripping” type of rollover typically occurs at a road's edge, where the road can be bordered by a ditch or berm. A soft dirt shoulder that may or may not include vegetation could define the edge of a dirt road or track. Abruptly encountering any of these drivable or non-drivable combinations can change the friction condition between the road and tire surface, causing the vehicle to trip. Avoiding road edges can help mitigate tripping and reduce the likelihood of encountering this type of rollover.
There are examples of optical cameras equipped with algorithms to find road edges on improved roads, such as highways, freeways, and secondary roads. These optical camera systems generally operate only in daylight conditions on a structured road. These systems do not operate well at night, on unstructured dirt roads, or tracks with dirt shoulders. For example, “Application Analysis of Near Infrared Illuminators Using Diode Laser Light Sources,” by Stout and Fohl, published in the Proceedings of the SPIE, Vol. 5403, which is incorporated herein by reference, teaches the use of an infrared illuminator and CCD or CMOS sensors to create images.
Light Detection and Ranging (LIDAR) systems utilizing a narrow laser beam can be used to map physical features with very high resolution and such systems have previously been utilized for experimental vehicle navigation systems. However, the cost of multiple 2-D or 3-D LIDAR sensors has generally limited their use to expensive or experimental systems. Existing systems are also not capable of operation in a harsh military environment.
Current unmanned ground vehicles (UGVs) rely on electro-optical (EO) cameras and/or LIDAR sensors for viewing the road ahead of the vehicle. Use of these sensors creates limitations on the operation of the vehicle. Common EO sensors are useful in daylight but are not optimal for nighttime or low light operations.
Embodiments of the present invention are directed toward a Road Departure Sensing System (RDSS) which collects a continuous sequence of images from the area ahead of a forward-moving vehicle, processes these images to establish drivable and non-drivable surfaces, and communicates a warning to the driver in advance of the vehicle departing the drivable surface. An embodiment of the system can extract information from images in day or night conditions to discern drivable and non-drivable surfaces and to provide warnings automatically on improved roads, for instance, highways, freeways, and secondary roads, or dirt roads with or without dirt shoulders. The RDSS can operate under changing lighting conditions in both day and night-time illumination. Providing ample warning to the vehicle's driver can help to mitigate road departure accidents, reducing both equipment cost and the risk of injury or death to the vehicle's occupants.
In one embodiment, a RDSS operates with the aid of self-contained infrared illumination. A charge coupled device (CCD) sensor collects reflected electromagnetic radiation from the visible as well as near infrared spectra, making the system operable in either day or night conditions. The RDSS includes image analysis algorithms that extract both edge and texture information from an image. Being equipped to analyze both edges and textures provides the system with the capability to operate on structured highways, freeways, secondary paved roads, and dirt roads with dirt or grass shoulders. The RDSS can act to warn the driver of a vehicle that the path the vehicle is moving on will result in an imminent departure from the road, based on an analysis of edges and surface textures of the road and surrounding area. In one embodiment the RDSS can issue a warning at least one to two seconds prior to road departure. The advance warning provides the driver with sufficient time to react and change course to avoid a vehicle-tripping incident.
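The description above does not prescribe particular edge or texture operators. As an illustrative sketch only, the following example (assuming OpenCV and NumPy, with hypothetical threshold values that are not taken from this disclosure) shows one way edge strength and local texture could be combined to label candidate drivable pixels:

```python
# Illustrative sketch only: the description does not name specific operators.
# Assumes an 8-bit grayscale frame; OpenCV/NumPy are used as stand-ins, and the
# thresholds below are hypothetical tuning values, not values from the source.
import cv2
import numpy as np

def edge_and_texture_maps(gray, win=15):
    """Return an edge-strength map and a local texture (variance) map."""
    # Edge cue: gradient magnitude highlights painted lines, curbs, and berms.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    edges = cv2.magnitude(gx, gy)

    # Texture cue: local variance separates smooth pavement from grass or gravel.
    f = gray.astype(np.float32)
    mean = cv2.boxFilter(f, -1, (win, win))
    mean_sq = cv2.boxFilter(f * f, -1, (win, win))
    texture = mean_sq - mean * mean   # variance within each window
    return edges, texture

def drivable_mask(gray, edge_thresh=120.0, texture_thresh=300.0):
    """Label pixels as candidate drivable surface where both cues are low."""
    edges, texture = edge_and_texture_maps(gray)
    return (edges < edge_thresh) & (texture < texture_thresh)
```

In practice, thresholds of this kind would be tuned through real-world operation to balance false alarm and true positive rates, as discussed below.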
In one embodiment the RDSS can be manually or automatically adjusted, through real-world operation, to minimize false alarm rates and maximize true positive rates. The system does not need to take control of the vehicle; it can issue an audible or other alert prompting the driver to change the current course of the vehicle to avoid a potentially catastrophic rollover. The system is compact and can support many different mechanical shapes and configurations. One embodiment utilizes a commercially available single board computer, in combination with a CCD camera and IR illumination, in a rugged, durable design built to operate in extreme ambient temperature environments.
In one embodiment, the RDSS includes a built in illuminator in the near infrared spectrum that allows for day or night operation. Without input from the driver or operator, high-resolution images are obtained to determine fine detail of the area ahead of a vehicle to allow discernment of road edges and textures that indicate a change between a drivable and a non-drivable surface. In addition to use for road departure warning, an embodiment of the system can be combined with an appropriate radar or navigation system to provide a driving sensor system for unmanned ground vehicles.
In one embodiment, an Intelligent Driving Sensor Suite (IDSS) in combination with an RDSS embodiment provides vehicles with a sensor suite for autonomous (unmanned) operation or for manned driver assistance that is low-cost, rugged, and reliable. The IDSS includes a near infrared illuminated IR imaging sensor, algorithms to optimize the image quality in real time, and a laser range finder or a microwave radar transceiver with algorithms for data analysis to determine object extent, range, and bearing data for objects on the drivable surface in the intended path ahead of a vehicle.
An IDSS processor configured with a data fusion algorithm continuously provides object extent, range to the object, and the object's bearing angle or heading relative to the vehicle to collision avoidance software, which uses the information to correct the path of the vehicle and avoid objects in its path.
One embodiment of the IDSS provides a sensor suite for autonomous driving capability that is much less expensive than the cost of experimental unmanned vehicle systems and can be integrated with low-cost, low-weight vehicles, such as cars, light trucks, tactical trucks, or MTVs, and is an easy upgrade to heavy platforms, such as farm equipment, mining vehicles, the Bradley family of vehicles, the ground combat vehicle, or a marine personnel carrier. One advantage of a near IR illuminated sensor in an IDSS is the ability to discern the boundary of the drivable surface from a non-drivable shoulder. This advantage derives from the fact that a passive IR sensor tuned to any wavelength will not distinguish between a road and its shoulder if both are constructed of the same material (same emissivity) and both are at the same temperature.
One embodiment of the present invention, combining the IDSS and the RDSS, can be integrated with existing passenger vehicles to provide warnings, alerts, or the application of the vehicle's brakes when a road-departure event is anticipated. The IDSS and RDSS can also be utilized in conjunction with existing passenger vehicle back-up warning systems to alert a driver if a vehicle is about to depart from a road surface or strike a curb while the vehicle is being driven in reverse.
The invention may be more completely understood in consideration of the following detailed description of various embodiments of the invention in connection with the accompanying drawings, in which:
a depicts a road scene image acquired by an exemplary RDSS.
b depicts the road scene of
c depicts the road scene of
a and 20b depict the IDSS cooperation of an optical RDSS with a radar sensor.
While the invention is amenable to various modifications and alternative forms, specifics thereof have been shown by way of example in the drawings and will be described in detail. It should be understood, however, that the intention is not to limit the invention to the particular embodiments described. On the contrary, the intention is to cover all modifications, equivalents, and alternatives.
Referring to
An exemplary IDSS combines an embodiment of an RDSS 50 with a ranging laser or radar to provide alerts, to a driver or autonomous vehicle navigation system, of potential obstacles or obstructions, and to provide sensor data to a vehicle navigation system in real time.
Referring to
A warning signal can be presented to the operator of the vehicle as an auditory or optical alert indicating that the operator should reduce speed and/or change the steering angle. The alert can be presented with varying degrees of severity. For example, a severe alert can be issued when the vehicle is traveling at a high rate of speed and the operator changes the steering angle such that a road departure is imminent. A less severe alert can be raised in a situation where the vehicle is approaching the boundary of a road or path while traveling at a moderate or low speed where there is a lesser risk of a vehicle rollover or tripping condition.
Disposed behind the optical assembly 104, CCD assembly 106 includes a CCD sensor coupled to a CCD controller board 154, a PCI to IEEE-1394 board, and camera controller board 125. The PCI to IEEE-1394 board can be configured to acquire images, or frames, from the CCD sensor and provide the digital image data to the camera controller board 125 over a PCI bus. Camera controller 125 is disposed adjacent to the CCD assembly 106. The camera controller board 125 includes an interface to the main circuit board 128 that includes a processor and a system power supply.
In one embodiment the laser-diode illumination assembly 108 comprises a laser diode 130 that can be any of a variety of commercially available laser diodes that emits infrared electromagnetic radiation having a wavelength of approximately 808 nm. Additional or alternative laser diodes of different wavelengths can also be employed with appropriate adjustment to the filters and detection sensor(s) to accommodate the alternative wavelength(s).
Referring to
In one aspect, the system includes a control circuit for activating the laser diode 130 that also checks the wheel speed of the vehicle, a signal extracted from a CAN bus of the vehicle, to ensure the vehicle is moving at a preset slow speed of approximately five miles per hour (mph) before initiating laser diode activation. The circuit checks the wheel speed and issues the trigger pulse to maintain the power to activate the laser diode. This circuit ensures that during maintenance or other idle time the laser will remain inactive, protecting any unsuspecting or unaware person.
At the USB interface connection a signal is received to turn on the NIRIS sensor. If health and status are good, the laser diode controller is turned on. When appropriate command word(s) are received via the USB interface to turn on the laser diode 130, MOS B is turned on and a trigger is issued to a timing device, such as the depicted NE555 timer. The timing device produces a defined (e.g., five-second) pulse once triggered. This pulse turns on MOS switch A. If a subsequent trigger is not received within five seconds, the output goes to zero volts, MOS switch A turns off, and the laser diode 130 is off.
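For illustration, a minimal software sketch of this interlock behavior is given below. It assumes hypothetical functions read_wheel_speed_mph() (standing in for the CAN bus wheel-speed signal noted above) and set_laser_enable(); the retriggerable five-second hold-off mirrors the timer behavior described, but this sketch is not the control circuit itself.

```python
# Minimal software sketch of the interlock described above: the laser stays on
# only while wheel-speed reports (assumed to come from the vehicle CAN bus)
# keep re-triggering a five-second hold-off, analogous to the NE555 behavior.
# read_wheel_speed_mph() and set_laser_enable() are hypothetical stand-ins.
import time

MIN_SPEED_MPH = 5.0      # preset speed required before illumination is allowed
HOLD_SECONDS = 5.0       # retriggerable pulse width of the timing device

def laser_interlock_loop(read_wheel_speed_mph, set_laser_enable, poll_s=0.1):
    last_trigger = None
    while True:
        speed = read_wheel_speed_mph()          # extracted from the CAN bus
        now = time.monotonic()
        if speed >= MIN_SPEED_MPH:
            last_trigger = now                  # re-trigger the hold-off timer
        # The laser is enabled only within HOLD_SECONDS of the last valid trigger;
        # when the vehicle stops (maintenance, idle), the output decays to off.
        enabled = last_trigger is not None and (now - last_trigger) < HOLD_SECONDS
        set_laser_enable(enabled)
        time.sleep(poll_s)
```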
Referring to
The rear mount 150 provides a housing for the cold plate 156, the attached cooler 158, and a heat sink 160 to transfer heat away from the cooler. The heat sink 160 can dissipate excess thermal energy to the atmosphere or be thermally coupled to a large assembly such as a housing 101 or a mounting assembly on a vehicle.
Referring to
The CCD assembly 106 is held in the optical assembly 104 by a rear optical assembly mount 150 and heat sink 160. Lenses 162 are mounted in the optical assembly along the central axis of the CCD sensor 152 and focus light onto the sensor. Additionally, the optical assembly 104 includes a band-pass filter 120 that can be configured to selectively transmit a narrow range of wavelengths while blocking all others. In one embodiment the band-pass filter 120 blocks wavelengths that are not approximately 808 nm from reaching the CCD sensor 152. Alternative band-pass filters can be utilized to block background light such as sunlight, fires, flares, etc.
A replaceable window 164 closes the optical tube 144 and protects the interior of the optical tube (lenses 162, filter 120, CCD sensor 152, and associated electronics) from the ambient environment while allowing the appropriate light or IR radiation to enter the assembly 104. The window 164 can be transparent or include a variety of filters to further optimize the performance of the CCD sensor 152.
Referring to
In one embodiment the CCD sensor 152 can be an interline transfer CCD with progressive scan, having a resolution of 640 horizontal pixels and 480 vertical pixels. In one embodiment a CCD sensor 152 can be a commercially available unit such as the KAI-0340 IMAGE SENSOR available from Kodak, Image Sensor Solutions, of Rochester, N.Y. Alternative image sensors can be substituted to provide alternate resolutions depending on cost, processor capability, and performance goals. Higher resolution sensors will require a corresponding increase in processor capability to meet the real-world performance needs of an RDSS system.
During operation, the SPTGM controller converts each pixel of stored charge into an eight-bit value. The DCAM MCM module contains an FPGA that acts to buffer the pixel data, which is transferred into random access memory (RAM) storage for retrieval by a processor. Each image can be stored in RAM for processing as four-byte words. Various software routines, such as those provided with the Intel® Integrated Performance Primitives (IPP) software library, can be utilized to configure the processor with routines for image manipulation. Software can utilize memory mapping techniques and call individual IPP routines to adjust pixel data to optimize the information content of each individual image. Images captured by the CCD sensor 152 can be sequentially optimized and evaluated as they are acquired and buffered from the CCD sensor 152 to the processor. Images that do not provide sufficient detail can be discarded.
This optimization can be achieved by using the entire dynamic range of the CCD sensor 152 regardless of illumination conditions or camera settings. A histogram of an image (i.e., the number of pixels with captured intensities at each level of gray between black (0) and white (255)) can be adjusted or stretched over the available range to optimize the information over the dynamic range of the lens and sensor assembly. Images that are not severely under- or over-exposed provide the best data for edge detection analysis. In varying conditions, the exposure time, i.e., the length of time the CCD captures photons to create an image, must be adjusted: increased to eliminate under-exposure or decreased to eliminate over-exposure. For example, as a vehicle moves along a path the lighting conditions can rapidly change. Fast processing of images ensures that approximately eight to ten images are properly exposed and captured for analysis every second. In one embodiment thirty images (frames) are captured and evaluated to achieve approximately ten properly exposed images for edge detection analysis. Any images that are not properly exposed can be discarded after exposure analysis.
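As a simple illustration of the histogram stretch described above, the following sketch (assuming an 8-bit image held as a NumPy array; the percentile clipping is an added assumption intended to keep a few saturated pixels from defeating the stretch) remaps pixel intensities across the full 0 to 255 range:

```python
# Sketch of the histogram stretch described above, assuming an 8-bit image
# held as a NumPy array; percentile clipping is an assumption added here so
# that a handful of saturated pixels does not defeat the stretch.
import numpy as np

def stretch_histogram(gray, low_pct=1.0, high_pct=99.0):
    """Remap pixel intensities so the image spans the full 0-255 range."""
    lo, hi = np.percentile(gray, (low_pct, high_pct))
    if hi <= lo:                       # flat (badly exposed) image; leave as-is
        return gray.copy()
    stretched = (gray.astype(np.float32) - lo) * (255.0 / (hi - lo))
    return np.clip(stretched, 0, 255).astype(np.uint8)
```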
The overall camera architecture comprises an exemplary DCAM MCM Frame Capture module 126 and its internal data path, which couples a 1394a electrical interface to a personal computer and provides a digital signal processor (DSP) hosting DCAM software and an FPGA that buffers the pixel data received from the SPTGM board 127 and provides commands to the SPTGM board 127.
A main circuit board 128 converts vehicle power, nominally twenty-eight volts, to regulated power required for thermoelectric coolers and the laser diode. It also hosts the control circuit for the laser diode 130 which provides for safe activation of the laser diode. The main circuit board 128 also includes a USB interface to couple system 100 to a commercial laptop computer that can include software to optimize images, extract important features from the images, collect wheel speed and steering angle of the vehicle from the vehicle's CAN bus (data bus), compare vehicle position and trajectory to the road ahead, predict vehicle path, and issue a warning signal if road departure is imminent.
Referring to
Housing 230, similar to the trapezoid configuration depicted in
In one embodiment, a computer processor board can be oriented such that the side with the heat generating processor chip set faces a center septum. The septum is an integral part of the housing and the primary thermal conduction path to remove heat. Zero degrees of rotation indicates the bottom board is flipped under the top board without rotation. Thermal analysis indicates that the 270-degree rotation is a preferred orientation to minimize hotspots and maximize thermal dispersion. This orientation permits a common interface PWB design to be used for both processors and eliminates interference of screws used in the interior of the processor boards. Thermal analysis of a prototype embodiment indicates that the temperature of the microprocessor in contact with a septum 231 at an ambient temperature of approximately 70° C. reaches a steady state temperature of approximately 89° C., which is generally within the operating temperature range of the microprocessor.
In one embodiment, a custom microprocessor board having one or more processors and integrated digital camera and human-machine interface connections can replace the microprocessor boards 229a/b. A custom board can be optimized to further manage and reduce the operating temperature of the apparatus 200.
Software embedded on the commercial computer board or a custom processor board can be used extensively to control the operation of the RDSS hardware. An important safety feature of RDSS is the control over the activation of the laser diode illumination. The processor(s) can be programmed to extract wheel-speed data from a vehicle's electronic systems and operate a laser diode illuminator only when the vehicle is in motion.
In order to prevent a moving vehicle from departing a surface, road, or path, information from the entire field of view in front of the vehicle is not required. Only the drivable surface, road or highway, ahead of the vehicle is important. An important truism for an RDSS application is that the road edges tend to meet at a vanishing point beyond the horizon. The important region of the image is the portion that is immediately ahead of the vehicle to the horizon, as depicted in
In the exemplary case of a CCD sensor with 480 vertical pixels, the top 96 rows of pixels can comprise a region farthest from the vehicle (and the NIRIS) labeled ROI A in
Referring to
Referring to
An initial test is performed to determine if the image was properly exposed. This test can include evaluating the entire image and discarding the image if the average histogram value for the image is less than seventy. If the image is under- or over-exposed, an appropriate correction is calculated, based on the average histogram value, and a subsequent image is acquired.
If the difference between the average pixel intensity of ROI A and ROI B is more than forty, the image was acquired during night-time (darkened) conditions, typically with a longer exposure. A comparison of the histogram values for ROI C1, C2, and C3 can be conducted for images taken at night to account for the use of vehicle headlights, street lamps, or other lighting variations that may impact the exposure in each ROI. A difference between the average pixel intensity of ROI A and ROI B of less than forty indicates that the image was acquired during daylight conditions, as depicted in
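A combined sketch of the region partitioning and the exposure and day/night tests is shown below. ROI A (the top 96 rows of a 480 by 640 frame) and the thresholds of seventy and forty are taken from the description; the extents assumed here for ROI B and the near-field thirds C1, C2, and C3 are illustrative assumptions only:

```python
# Sketch of the exposure and day/night tests described above, assuming an
# 8-bit 640x480 grayscale frame as a NumPy array. ROI A (top 96 rows) and the
# thresholds of 70 and 40 come from the description; the extents of ROI B and
# the near-field thirds C1-C3 are illustrative assumptions.
import numpy as np

def partition_rois(frame):
    """Split a 480x640 frame into ROI A, ROI B, and the near-field thirds."""
    roi_a = frame[0:96, :]              # farthest from the vehicle (near horizon)
    roi_b = frame[96:288, :]            # assumed mid-range band (illustrative)
    near = frame[288:480, :]            # assumed near-field band (illustrative)
    roi_c = (near[:, 0:213], near[:, 213:427], near[:, 427:640])
    return roi_a, roi_b, roi_c

def evaluate_exposure(frame):
    """Classify one frame: is the exposure usable, and was it taken at night?"""
    roi_a, roi_b, roi_c = partition_rois(frame)
    result = {"usable": frame.mean() >= 70}          # discard badly exposed frames
    diff = abs(float(roi_a.mean()) - float(roi_b.mean()))
    result["night"] = diff > 40                      # large A/B difference => night
    if result["night"]:
        # Compare the near-field thirds so headlights or street lamps that light
        # only part of the scene do not skew the exposure decision.
        result["near_field_means"] = [float(c.mean()) for c in roi_c]
    return result
```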
A processor can change the exposure time between the collection of individual images by issuing a command to the CCD controller to change the exposure time of the CCD sensor. The next image collected will then have the new exposure time. Experimental results show that these information-optimization adjustments take approximately 16 milliseconds using an INTEL® Core 2 Duo processor. At this rate, a first image can be acquired and evaluated before data for the next image arrives at the processor. This image pre-processing can ensure that an optimal image is captured in real time such that at least thirty frames per second (fps) are accurately acquired by the system.
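The command interface to the CCD controller is not specified here, so the following sketch uses a hypothetical set_exposure_time_us() call, and assumed step factors, merely to illustrate the direction of the correction applied between frames:

```python
# Illustrative closed-loop exposure adjustment between frames. The command
# interface to the CCD controller is not specified in the source, so
# set_exposure_time_us() is a hypothetical stand-in; the step factors and the
# upper threshold are assumptions used only to show the direction of correction.
def adjust_exposure(frame_mean, exposure_us, set_exposure_time_us,
                    low=70, high=180):
    """Lengthen exposure for dark frames, shorten it for washed-out frames."""
    if frame_mean < low:
        exposure_us = int(exposure_us * 1.5)    # under-exposed: lengthen exposure
    elif frame_mean > high:
        exposure_us = int(exposure_us * 0.67)   # over-exposed: shorten exposure
    set_exposure_time_us(exposure_us)           # takes effect on the next frame
    return exposure_us
```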
Referring to
The algorithm depicted in
A captured RDSS image has a fixed horizontal and vertical field of view that can be calculated in degrees offset from the RDSS apparatus or the center of the vehicle. Plane geometry provides the wheel position relative to the road edge, and trajectory information establishes an estimate for the future vehicle path based on the size and wheel base of the vehicle. If the vehicle's position and speed indicate that the vehicle is more than three seconds from a road departure event, then no warning is given. If a road departure event is calculated to be between two and three seconds from occurring, a preliminary warning can be presented to the driver as a cautionary series of low beeps. If the RDSS calculates that there are less than two seconds until a road departure event, the pitch and frequency of the preliminary warning beeps increase to alert the driver that immediate action is required to prevent the vehicle from departing the path of the road.
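A reduced sketch of this tiered warning logic is shown below. The three-second and two-second thresholds come from the description; reducing the plane-geometry computation to a pre-computed lateral distance and closing speed toward the road edge is a simplifying assumption made for illustration:

```python
# Sketch of the tiered warning logic described above. The three-second and
# two-second thresholds come from the text; collapsing the geometry to a
# hypothetical pre-computed lateral distance and closing speed is an assumption.
def departure_warning(lateral_distance_m, lateral_speed_mps):
    """Return a warning level from an estimated time to road departure."""
    if lateral_speed_mps <= 0:                 # moving parallel to or away from edge
        return "none"
    t = lateral_distance_m / lateral_speed_mps # seconds until the wheel crosses the edge
    if t > 3.0:
        return "none"                          # more than three seconds away: no warning
    if t > 2.0:
        return "caution"                       # cautionary series of low beeps
    return "urgent"                            # faster, higher-pitched beeps
```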
Referring to
A range measuring device is also included with the sensor suite and can be used to detect objects, whether they are obstacles or obstructions, in the path of the vehicle and to provide an instantaneous range from the vehicle to the object and a bearing angle. As the range detector scans the area in front of the vehicle, the RDSS can categorize objects at various distances and prioritize navigational warnings for those objects that are closest to the vehicle or most directly in the path ahead of the vehicle. Four different range categories A through D are depicted in
An object with finite extent is an obstacle, while one with infinite extent is an obstruction. An obstacle can be navigated around, while the trajectory of the vehicle must be re-planned to avoid an obstruction. The range measuring device may be an economical, addressable laser range finder or a commercial microwave (millimeter wavelength) radar device. Either of these components will provide the range and bearing information required, together with data from the IR image, to navigate the vehicle around an obstacle in its path or to re-plan around an obstruction.
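As a sketch of this distinction, the following example interprets "infinite extent" as an object spanning the full width of the drivable surface; that interpretation is an assumption made only for illustration:

```python
# Illustrative classification of a detected object by its extent across the
# drivable surface, following the obstacle/obstruction distinction above.
# Treating "infinite extent" as spanning the full drivable width is an assumption.
def classify_object(object_width_m, drivable_width_m):
    """An object the vehicle can steer around is an obstacle; otherwise it is
    an obstruction that forces the path to be re-planned."""
    if object_width_m >= drivable_width_m:
        return "obstruction"   # blocks the drivable surface: re-plan the trajectory
    return "obstacle"          # finite extent: navigate around it
```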
An Intelligent Driving Sensor Suite (IDSS) combines an RDSS and a Range Measuring Device (RMD), such as a laser range finder or a microwave radar sensor. The near-IR illuminated sensor is equipped with embedded computing capability and hosts algorithms for optimizing the information content of each image in the video stream in real time. The RDSS also hosts algorithms that determine, frame-by-frame from the video stream using texture-based techniques, the drivable surface ahead of the vehicle and the road boundary, e.g.
The RDSS then instructs the RMD to investigate the object at a specific bearing angle to the vehicle and report its range and bearing angle. The significance of the invention is that the system, through its combined algorithm and hardware, compensates for the relatively low resolution of the radar sensor, about three degrees, with the relatively high resolution afforded by the IR imaging sensor, about 0.04 degrees.
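A minimal sketch of this cueing step follows. The field of view is an assumption (chosen so that each pixel spans about 0.04 degrees, consistent with the resolution cited above), and query_radar_range() is a hypothetical stand-in for the RMD interface:

```python
# Sketch of the cueing step described above: the high-resolution IR image
# supplies a precise bearing, and the coarser radar beam (about three degrees)
# is interrogated at that bearing to obtain range. The field of view is assumed
# so that each pixel spans about 0.04 degrees; query_radar_range() is hypothetical.
HFOV_DEG = 25.6        # assumed horizontal field of view (0.04 deg per pixel)
IMAGE_WIDTH = 640      # pixels across one frame

def pixel_to_bearing(col):
    """Convert an image column to a bearing angle relative to boresight."""
    return (col - IMAGE_WIDTH / 2.0) * (HFOV_DEG / IMAGE_WIDTH)

def fuse_object(col, query_radar_range):
    """Cue the range measuring device at the bearing found in the IR image."""
    bearing_deg = pixel_to_bearing(col)         # fine bearing from the imager
    range_m = query_radar_range(bearing_deg)    # coarse-beam radar supplies range
    return {"bearing_deg": bearing_deg, "range_m": range_m}
```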
As shown in
The design of IDSS includes shielding and a durable housing sufficient for extreme environmental requirements, such as use with military vehicles, and thereby is rugged and reliable in harsh environments.
The embodiments above are intended to be illustrative and not limiting. Additional embodiments are encompassed within the scope of the claims. Although the present invention has been described with reference to particular embodiments, those skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention. For purposes of interpreting the claims for the present invention, it is expressly intended that the provisions of Section 112, sixth paragraph of 35 U.S.C. are not to be invoked unless the specific terms “means for” or “step for” are recited in a claim.