The subject invention relates to vehicle navigation and object detection and in particular to systems and methods for determining an object's location from a reflection of a structured light pattern from the object.
Driver-assisted vehicles can include a digital camera that captures a view of an area surrounding the vehicle in order to reveal blind spots and other hard-to-see areas. Such cameras work well in daylight but can be impaired at night. Accordingly, it is desirable to provide a system and method for augmenting the ability of the digital camera at night or during other difficult viewing conditions.
In one exemplary embodiment, a method for detecting a location of an object with respect to a vehicle is disclosed. The method includes transmitting, at the vehicle, a structured light pattern at a selected frequency into a volume that includes the object and receiving, at a detector of the vehicle, a reflection of the light pattern from the volume. A processor determines a deviation in the reflection of the structured light pattern from the object in the volume, and determines the location of the object in the volume from the deviation.
The structured light pattern can be a pattern of vertical stripes. The deviation can be determined by comparing reflection intensities at a location with an expected intensity at the location from a line model indicative of reflection of the structured light pattern from a planar horizontal surface. In various embodiments, the vehicle can be navigated based on the location of the object.
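By way of illustration, the comparison against the line model can be sketched as follows. This is a minimal example, assuming the reflected frame is available as a normalized grayscale array and the line model is a per-pixel expected-intensity map for a flat ground plane; the function and variable names are hypothetical.

    import numpy as np

    def find_deviations(frame, line_model, threshold=0.2):
        # Flag pixels where the reflected stripe intensity departs from the
        # intensity the line model predicts for a planar horizontal surface.
        residual = np.abs(frame - line_model)
        deviation_mask = residual > threshold
        # An object in the volume appears as a connected cluster of deviating
        # pixels along one or more stripes.
        rows, cols = np.nonzero(deviation_mask)
        return deviation_mask, np.column_stack((rows, cols))

    # Example: a synthetic frame over a flat-ground line model.
    line_model = np.zeros((480, 640))
    line_model[:, ::40] = 1.0             # vertical stripes every 40 pixels
    frame = line_model.copy()
    frame[100:200, 240:260] = 0.5         # an object perturbs the pattern here
    mask, coords = find_deviations(frame, line_model)
    print(coords.shape)                   # pixel coordinates of the deviation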
An image of the object can be captured and compared to the deviation in the reflection of the light pattern in order to train a neural network to associate the deviation in the reflection of the structured light pattern with the object. The location of an object can then be determined from a location of a deviation in a reflection of the light pattern and the association of the trained neural network. The structured light pattern can be produced, for example, by one of a diffractive lens combined with a one-dimensional microelectromechanical system (MEMS) scanner, refractive optics with a two-dimensional MEMS scanner, an array of light sources, a polygon scanner, and an optical phase array.
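By way of example, this training step can be sketched with a small convolutional network in PyTorch. The network, data shapes, and labels below are hypothetical stand-ins rather than the specific architecture of the disclosure; each training pair couples a deviation map extracted from the reflected pattern with an object location taken from the captured image.

    import torch
    import torch.nn as nn

    # Hypothetical model: maps a one-channel deviation map to a coarse
    # (x, y) object location in normalized image coordinates.
    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
        nn.MaxPool2d(4),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(32, 2),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    deviation_maps = torch.rand(8, 1, 120, 160)   # stand-in training batch
    true_locations = torch.rand(8, 2)             # stand-in labels from images

    for epoch in range(10):
        optimizer.zero_grad()
        loss = loss_fn(model(deviation_maps), true_locations)
        loss.backward()
        optimizer.step()

    # After training, a deviation map from a live frame yields a location.
    model.eval()
    with torch.no_grad():
        live_map = torch.rand(1, 1, 120, 160)     # stand-in for a real map
        est_x, est_y = model(live_map)[0].tolist()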
In another exemplary embodiment, a system for detecting a location of an object with respect to a vehicle is disclosed. The system includes an illuminator configured to project a structured light pattern into a volume at a selected frequency, a detector configured to detect a reflection of the light pattern from an object in the volume, and a processor. The processor is configured to: determine a deviation in the reflection of the light pattern due to the object; and determine the location of the object from the determined deviation.
The illuminator produces a pattern of vertical stripes at the selected frequency. The processor determines the deviation by comparing reflection intensities at a selected location with an expected intensity at the selected location from a line model indicative of reflection of the structured light pattern from a planar horizontal surface. The processor can then navigate the vehicle based on the detected location of the object.
In an embodiment, the processor illuminates the object with the pattern and compares the deviation in the reflection of the light pattern to an image of the object that causes the deviation in order to train a neural network to associate the deviation of the light pattern with the selected object. The processor can then determine a location of an object from the location of a deviation in the reflection of the light pattern and the association of the trained neural network.
The illuminator can include one of a diffractive lens combined with a one-dimensional microelectromechanical system (MEMS) scanner, refractive optics combined with a two-dimensional MEMS scanner, an array of light sources, a polygon scanner, and an optical phase array, in various embodiments. The detector can include a filter that passes light within the visible range and within a selected range about 850 nanometers.
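For illustration, the dual passband described here, the visible range plus a band about 850 nanometers, can be expressed as a simple predicate over wavelength; the 400-700 nm visible band edges and the 25 nm half-width are assumed values, not specifics of the disclosure.

    def filter_passes(wavelength_nm, ir_center=850.0, ir_halfwidth=25.0):
        # True if the filter transmits at this wavelength: the visible range
        # (assumed 400-700 nm) plus a narrow band about the IR illuminator
        # line; the 25 nm half-width is assumed for illustration.
        in_visible = 400.0 <= wavelength_nm <= 700.0
        in_ir_band = abs(wavelength_nm - ir_center) <= ir_halfwidth
        return in_visible or in_ir_band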
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes an illuminator configured to project a structured light pattern into a volume at a selected frequency, a detector configured to detect a reflection of the light pattern from an object in the volume, and a processor. The processor determines a deviation in the reflection of the light pattern due to the object and determines a location of the object from the determined deviation.
The illuminator produces a pattern of vertical stripes at the selected frequency. The processor determines the deviation by comparing reflection intensities at a selected location with an expected intensity at the selected location from a line model indicative of reflection of the structured light pattern from a planar horizontal surface.
The processor illuminates the object with the pattern and compares the deviation in the reflection of the light pattern to an image of the object that causes the deviation in order to train a neural network to associate the deviation of the light pattern with the selected object. The processor can then determine a location of an object from a location of a deviation in a reflection of the light pattern and the association of the trained neural network.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with an exemplary embodiment of the invention, a trajectory planning system 100 is associated with a vehicle 10.
In various embodiments, the vehicle 10 is an autonomous vehicle and the trajectory planning system 100 is incorporated into the autonomous vehicle 10 (hereinafter referred to as the autonomous vehicle 10). The autonomous vehicle 10 is, for example, a vehicle that is automatically controlled to carry passengers from one location to another. The autonomous vehicle 10 is depicted in the illustrated embodiment as a passenger car, but it should be appreciated that any other vehicle including motorcycles, trucks, sport utility vehicles (SUVs), recreational vehicles (RVs), marine vessels, aircraft, etc., can also be used. In an exemplary embodiment, the autonomous vehicle 10 is a so-called Level Four or Level Five automation system. A Level Four system indicates “high automation”, referring to the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene. A Level Five system indicates “full automation”, referring to the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver.
As shown, the autonomous vehicle 10 generally includes a propulsion system 20, a transmission system 22, a steering system 24, a brake system 26, a sensor system 28, an actuator system 30, at least one data storage device 32, at least one controller 34, and a communication system 36. The propulsion system 20 may, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 22 is configured to transmit power from the propulsion system 20 to the vehicle wheels 16-18 according to selectable speed ratios. According to various embodiments, the transmission system 22 may include a step-ratio automatic transmission, a continuously-variable transmission, or other appropriate transmission. The brake system 26 is configured to provide braking torque to the vehicle wheels 16-18. The brake system 26 may, in various embodiments, include friction brakes, brake by wire, a regenerative braking system such as an electric machine, and/or other appropriate braking systems. The steering system 24 influences a position of the vehicle wheels 16-18. While depicted as including a steering wheel for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 24 may not include a steering wheel.
The sensor system 28 includes one or more sensing devices 40a-40n that sense observable conditions of the exterior environment and/or the interior environment of the autonomous vehicle 10. The sensing devices 40a-40n can include, but are not limited to, radars, LIDARs, global positioning systems, optical cameras, digital cameras, thermal cameras, ultrasonic sensors, and/or other sensors. The actuator system 30 includes one or more actuator devices 42a-42n that control one or more vehicle features such as, but not limited to, the propulsion system 20, the transmission system 22, the steering system 24, and the brake system 26. In various embodiments, the vehicle features can further include interior and/or exterior vehicle features such as, but not limited to, doors, a trunk, and cabin features such as air, music, lighting, etc. (not numbered).
The data storage device 32 stores data for use in automatically controlling the autonomous vehicle 10. In various embodiments, the data storage device 32 stores defined maps of the navigable environment. In various embodiments, the defined maps may be predefined by, and obtained from, a remote system.
The controller 34 includes at least one processor 44 and a computer readable storage device or media 46. The processor 44 can be any custom made or commercially available processor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processor among several processors associated with the controller 34, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, any combination thereof, or generally any device for executing instructions. The computer readable storage device or media 46 may include volatile and nonvolatile storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), for example. KAM is a persistent or non-volatile memory that may be used to store various operating variables while the processor 44 is powered down. The computer-readable storage device or media 46 may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (electrically PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions, used by the controller 34 in controlling the autonomous vehicle 10.
The instructions may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. The instructions, when executed by the processor 44, receive and process signals from the sensor system 28, perform logic, calculations, methods, and/or algorithms for automatically controlling the components of the autonomous vehicle 10, and generate control signals to the actuator system 30 to automatically control the components of the autonomous vehicle 10 based on the logic, calculations, methods, and/or algorithms. Although only one controller 34 is shown, embodiments of the autonomous vehicle 10 can include any number of controllers 34 that communicate over any suitable communication medium or a combination of communication mediums and that cooperate to process the sensor signals, perform logic, calculations, methods, and/or algorithms, and generate control signals to automatically control features of the autonomous vehicle 10.
In various embodiments, one or more instructions of the controller 34 are embodied in the trajectory planning system 100 and, when executed by the processor 44, project a structured light pattern into a volume proximate the vehicle 10 and record a reflection of the structured light pattern from one or more objects in the volume in order to determine the presence and/or location of the object within the volume.
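By way of example, one detection cycle of such instructions might be organized as follows; the illuminator and detector interfaces are hypothetical stand-ins for the vehicle hardware, and the thresholded comparison mirrors the line-model sketch given earlier.

    import numpy as np

    def locate_object(illuminator, detector, line_model, threshold=0.2):
        # One detection cycle: project the stripes, capture the reflection,
        # and reduce any departure from the flat-ground line model to a
        # coarse object location in image coordinates.
        illuminator.project_pattern()        # emit the structured stripes
        frame = detector.capture_frame()     # reflected pattern in [0, 1]
        residual = np.abs(frame - line_model)
        rows, cols = np.nonzero(residual > threshold)
        if rows.size == 0:
            return None                      # nothing deviates: no object
        return rows.mean(), cols.mean()      # centroid of deviating pixels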
The communication system 36 is configured to wirelessly communicate information to and from other entities 48, such as but not limited to, other vehicles (“V2V” communication), infrastructure (“V2I” communication), remote systems, and/or personal devices.
In other embodiments, the vehicle 10 can be a non-autonomous vehicle or a driver-assisted vehicle. The vehicle may provide audio or visual signals to warn the driver of the presence of an object, allowing the driver to take a selected action. In various embodiments, the vehicle provides a visual signal to the driver that allows the driver to view an area surrounding the vehicle, in particular, an area behind the vehicle.
In various embodiments, the structured illuminator 204 employs a diffractive lens to form the vertical stripes 216. The diffractive lens can include a refractive element combined with a one-dimensional microelectromechanical system (MEMS) scanner, in an embodiment of the present invention. Alternatively, the diffractive lens may combine refractive optics with a two-dimensional MEMS scanner. In further alternative embodiments, the illuminator 204 can include an optical phase array, a vertical-cavity surface-emitting laser (VCSEL) imaged via refractive optics, a polygon scanner, etc.
The light 206 projected into the volume is reflected by an object 212 and is then received at the detector 208. In one embodiment, the detector 208 is a complementary metal-oxide semiconductor (CMOS) pixel array that is sensitive to light in the visible light spectrum (e.g., from about 400 nm to about 700 nm) as well as light in the infrared spectrum, e.g., at about 850 nm. A filter 210 is disposed over the detector 208. The filter 210 passes light within the visible spectrum as well as in the infrared region of electromagnetic radiation. In various embodiments, the filter 210 allows light at a frequency within a selected range about 850 nm. In one mode, the detector 208 can be used as a visible light imaging device when the structured illuminator 204 is not in use. For example, the detector 208 can capture an image from behind the vehicle 10 in order to provide the image to a driver of the vehicle 10 or to a processor that detects the object and/or navigates the vehicle 10. In another mode, the structured illuminator 204 can be activated to produce the structured pattern of light 206 in the infrared region (e.g., at about 850 nm) and the detector 208 can capture both the visual image and the reflection of the structured pattern of infrared light. The visual image captured by the detector 208 can be used with the reflection of the structured pattern of light to determine a location of the object. In alternative embodiments, only the light at 850 nm is used to detect and locate objects.
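Although the disclosure does not prescribe how the two modes are combined, one common way to isolate the structured infrared pattern from the ambient visible scene with a single detector is to difference frames captured with the illuminator on and off; the sketch below assumes both frames are available as arrays and is offered as an assumption, not the stated method.

    import numpy as np

    def extract_pattern(frame_ir_on, frame_ir_off):
        # Subtract a frame taken with the illuminator off from one taken
        # with it on: ambient visible content cancels and the reflected
        # ~850 nm stripes remain.
        diff = frame_ir_on.astype(np.float32) - frame_ir_off.astype(np.float32)
        return np.clip(diff, 0.0, None)      # negative residue is noise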
While the detector 208 and structured illuminator 204 are shown at a rear location of the vehicle 10 in order to assist the driver as the vehicle is backing up, the detector 208 and illuminator 204 can be placed anywhere on the vehicle for any suitable purpose.
In one embodiment, the processor determines the location of the deviations in the vertical stripes 216a-216i and tracks the changed direction of the reflected lines due to the presence of the object 610.
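By way of example, that tracking step can be sketched as a piecewise line fit over the pixels of a single stripe; the split row and all names below are hypothetical.

    import numpy as np

    def stripe_direction_change(rows, cols, split_row):
        # Fit col = a*row + b separately below and above split_row; a change
        # in the slope a marks where the reflected line bends over an object
        # instead of continuing flat along the ground plane.
        lower = rows < split_row
        a_ground, _ = np.polyfit(rows[lower], cols[lower], 1)
        a_object, _ = np.polyfit(rows[~lower], cols[~lower], 1)
        return a_object - a_ground

    # Example: a stripe that is straight on the ground, then shifts on an
    # object starting at row 60.
    rows = np.arange(100)
    cols = np.where(rows < 60, 320.0, 320.0 + 0.5 * (rows - 60))
    print(stripe_direction_change(rows, cols, split_row=60))   # about 0.5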
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.