LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses laser light to measure the distance to a target object by illuminating the target with laser light and measuring the reflected light with a sensor. LIDAR systems work on the same general principles as radar, but use laser light instead of radio frequency radiation. LIDAR systems generally use pulsed lasers to measure distances. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target. LIDAR systems have a variety of applications including cartography, surveying, and vehicular applications, where they serve as an information source that can provide useful data for augmented or autonomous driving systems.
A LIDAR system is disclosed which includes a laser projecting a first beam of laser light upon a spatial light modulator (SLM) configured to modulate the first beam to project a second beam having a pattern of points of relatively high intensity separated from one another by regions of relatively low intensity.
The LIDAR system includes a first camera having a first field of view to generate an image signal of the pattern of points of the second beam overlaid upon an image of one or more objects within the first field of view.
The LIDAR system further includes a controller including a picture creation module configured to generate the pattern of points and to communicate the pattern of points to the spatial light modulator, and including an image processing module in communication with the first camera to receive the image signal therefrom and to detect the position and distance of the one or more objects using the image signal.
A method 100 for operating a LIDAR system 20 is also provided. The method 100 includes 102 projecting a first beam of laser light upon a spatial light modulator (SLM) by a laser. The method 100 also includes 110 modulating the first beam by the spatial light modulator to project a second beam having the pattern of points of varying intensity.
The method 100 also includes 112 observing, by a first camera having a first field of view, an actual image including the pattern of the second beam overlaid upon an image of one or more objects. The method 100 also includes 114 generating an image signal by the first camera representing the actual image. The method 100 also includes 116 communicating the image signal from the first camera to the image processing module. The method 100 also includes 118 detecting a position and distance of the one or more objects by the image processing module by comparing the pattern of points with the actual image.
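By way of illustration only, the overall flow of steps 102 through 118 may be sketched as a minimal control loop. All function names and data structures below are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch of the method-100 control loop (steps 102-118).
# All identifiers are hypothetical; the disclosure does not specify an API.

def create_pattern(rows, cols, spacing):
    """Picture creation module: a grid of high-intensity points (step 104)."""
    return {(r * spacing, c * spacing) for r in range(rows) for c in range(cols)}

def modulate(pattern, width, height):
    """SLM stand-in: render the pattern into a 2-D intensity map (step 110)."""
    frame = [[0.0] * width for _ in range(height)]
    for (y, x) in pattern:
        if y < height and x < width:
            frame[y][x] = 1.0  # point of relatively high intensity
    return frame

def observe(frame):
    """First camera: here the 'actual image' is simply the projected frame (step 112)."""
    return [row[:] for row in frame]

def detect(pattern, image):
    """Image processing module: confirm each projected point appears (step 118)."""
    return [(y, x) for (y, x) in sorted(pattern) if image[y][x] > 0.5]

pattern = create_pattern(rows=2, cols=2, spacing=4)
image = observe(modulate(pattern, width=8, height=8))
detections = detect(pattern, image)
```

In a physical system the observed image would differ from the projected frame, and those differences carry the range information; the sketch only fixes the data flow between the modules.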
Further details, features and advantages of designs of the invention result from the following description of embodiment examples in reference to the associated drawings.
Recurring features are marked with identical reference numerals in the figures. A LIDAR system 20 is disclosed. As shown in the block diagram
As shown in
As shown in
According to an aspect of the disclosure, the image processing module 48 may use a location of one or more of the points 30 of relatively high intensity and amplitude thereof relative to adjacent regions within the image signal 38 in detecting the position and distance of the one or more objects. In other words, the non-illuminated adjacent regions within the image signal 38 may provide a reference to determine the relative intensity of the relatively high intensity illuminated points 30. The pattern of points 30 may be dynamic and change over time. For example, the points may move to scan over different regions of space.
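The relative, rather than absolute, intensity of a point can be estimated by comparing the candidate pixel to its non-illuminated neighborhood. A minimal sketch of one such comparison (pure Python; the function name and neighborhood shape are illustrative assumptions):

```python
def relative_amplitude(image, y, x, radius=1):
    """Amplitude of the point at (y, x) relative to the mean of its
    adjacent, non-illuminated neighbours (the reference regions)."""
    neighbours = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if (dy, dx) == (0, 0):
                continue  # skip the candidate point itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(image) and 0 <= nx < len(image[0]):
                neighbours.append(image[ny][nx])
    background = sum(neighbours) / len(neighbours)
    return image[y][x] - background

# A bright point (value 10) over a uniform dim background (value 1):
frame = [[1] * 5 for _ in range(5)]
frame[2][2] = 10
```

Here `relative_amplitude(frame, 2, 2)` reports the point's height above the local background, which is robust to ambient illumination that raises the whole image uniformly.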
According to an aspect of the disclosure, the picture creation module 46 generates the pattern of points 30 as one of a plurality of different patterns 30′, 30″, 30′″ to increase the spatial resolution of the LIDAR system 20. As shown in
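One common way to raise effective spatial resolution with a small set of patterns is to shift a base grid by sub-grid offsets, so that the union of the shifted grids samples the scene more densely than any single grid. A sketch under that assumption (the specific offsets are illustrative, not taken from the disclosure):

```python
def base_grid(size, spacing):
    """A square grid of points, one every `spacing` pixels."""
    return {(y, x) for y in range(0, size, spacing)
                   for x in range(0, size, spacing)}

def shifted(pattern, dy, dx, size):
    """The same pattern translated by (dy, dx), wrapping at the edges."""
    return {((y + dy) % size, (x + dx) % size) for (y, x) in pattern}

size, spacing = 8, 4
p0 = base_grid(size, spacing)            # e.g. pattern 30'
p1 = shifted(p0, 0, spacing // 2, size)  # e.g. pattern 30''
p2 = shifted(p0, spacing // 2, 0, size)  # e.g. pattern 30'''
covered = p0 | p1 | p2                   # denser combined sampling
```

Because the three patterns illuminate disjoint sets of locations, cycling through them triples the number of distinct scene points probed while keeping each individual frame sparse.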
As shown in
As shown in
The subject LIDAR system 20 requires only a limited number of patterns, and may be implemented where the spatial light modulator 26 is a liquid crystal device that is switchable between two or more different states. As shown in
The diffractive elements 58 may, for example, be mounted to a disk 60 which is rotated to place the selected one of the diffractive elements 58 in a position to intersect the first beam 24 and to generate the associated pattern of points 30 in the second beam 28.
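Selecting a diffractive element by rotating the disk amounts to mapping a pattern index to a rotation angle. A trivial sketch, assuming evenly spaced elements (a hypothetical geometry, not specified in the disclosure):

```python
def disk_angle(element_index, n_elements):
    """Angle in degrees to rotate the disk so that the diffractive
    element at `element_index` intersects the first beam, assuming
    `n_elements` elements evenly spaced around the disk."""
    if not 0 <= element_index < n_elements:
        raise ValueError("no such diffractive element")
    return element_index * 360.0 / n_elements
```

For a disk carrying four elements, `disk_angle(3, 4)` positions the fourth element in the beam path at 270 degrees.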
As illustrated in the flow charts of
The method 100 also includes 104 generating a pattern of points 30 by a picture creation module 46. Examples of such patterns of points 30 are illustrated in
The method 100 also includes 106 communicating the pattern of points 30 from the picture creation module 46 to the spatial light modulator 26. This communication may be, for example, digital or analog, and may be communicated electronically or optically.
The method 100 also includes 108 communicating the pattern of points 30 from the picture creation module 46 to an image processing module 48. This communication may be, for example, digital or analog, and may be communicated electronically or optically.
The method 100 also includes 110 modulating the first beam 24 by the spatial light modulator 26 to project a second beam 28 having the pattern of points 30 of varying intensity. In the example shown in
The method 100 also includes 112 observing, by a first camera 32 having a first field of view 34, an actual image including the pattern of points 30 of the second beam 28 overlaid upon an image of one or more objects.
The method 100 also includes 114 generating an image signal 38 by the first camera 32 representing the actual image.
The method 100 also includes 116 communicating the image signal 38 from the first camera 32 to the image processing module 48.
The method 100 also includes 118 detecting a position and distance of the one or more objects by the image processing module 48 by comparing the pattern of points 30 with the actual image. In other words, the image processing module 48 compares the pattern of points 30 that was communicated directly from the picture creation module 46 with the actual image from the first camera 32 in order to detect the position and distance of the one or more objects. This step 118 may include the substep of 118A using, by the image processing module 48, a location of one or more points of relatively high intensity and an amplitude thereof relative to adjacent regions within the actual image.
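Comparing the communicated pattern with the actual image can recover distance by triangulation: when the projector and the first camera are separated by a baseline, a projected point shifts laterally in the image in inverse proportion to the distance of the reflecting surface. A simplified pinhole-model sketch (all parameters and the specific geometry are hypothetical, not taken from the disclosure):

```python
def distance_from_disparity(projected_x, observed_x, focal_px, baseline_m):
    """Triangulate the distance to the surface that reflected one point:
    the disparity (in pixels) between where the point was projected and
    where the camera observed it, for a projector/camera pair separated
    by `baseline_m`, with a focal length of `focal_px` pixels."""
    disparity = projected_x - observed_x
    if disparity <= 0:
        raise ValueError("point not resolvable at this baseline")
    return focal_px * baseline_m / disparity

# A point projected at column 320 but observed at column 300, with a
# 600-pixel focal length and a 0.1 m projector-to-camera baseline:
d = distance_from_disparity(projected_x=320, observed_x=300,
                            focal_px=600, baseline_m=0.1)
```

Repeating this comparison for every point 30 in the pattern yields a sparse depth map, from which the position and distance of the one or more objects follow.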
As shown in the flow chart of
The method 100 may also include 122 communicating a corresponding image signal from the second camera 54 to the image processing module 48. The image processing module 48 may then use that corresponding image signal to improve distance estimation of objects located far away from the LIDAR system 20.
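The benefit of a second camera for far objects follows from the standard stereo error model: for a disparity error of about one pixel, depth uncertainty grows as the square of distance and shrinks with the baseline. A sketch of that relation (the numbers are illustrative only):

```python
def depth_resolution(z_m, focal_px, baseline_m, disparity_err_px=1.0):
    """Approximate depth error dz for a given disparity error:
    dz ~ z^2 * err / (f * B), so a larger baseline B improves
    distance estimation for objects far from the LIDAR system."""
    return z_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Depth uncertainty at 50 m for two camera baselines:
near_baseline = depth_resolution(z_m=50.0, focal_px=600, baseline_m=0.1)
wide_baseline = depth_resolution(z_m=50.0, focal_px=600, baseline_m=0.5)
```

With the wider camera-to-camera baseline, the same one-pixel disparity error corresponds to a several-times-smaller depth error at 50 m, which is why the corresponding image signal from the second camera 54 improves distance estimation for distant objects.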
As shown in the flow chart of
The method 100 may also include 126 projecting a different second beam 28 to corresponding ones of the fields of view 34, 56 by each of a plurality of spatial light modulators 26.
The method 100 may also include 128 illuminating a corresponding one of the spatial light modulators 26 with a corresponding first beam 24 by each of a plurality of lasers 22.
The system, methods and/or processes described above, and steps thereof, may be realized in hardware, software or any combination of hardware and software suitable for a particular application. The hardware may include a general purpose computer and/or dedicated computing device or specific computing device or particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable device, along with internal and/or external memory. The processes may also, or alternatively, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as a computer executable code capable of being executed on a machine readable medium.
The computer executable code may be created using a structured programming language such as C, an object-oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.
Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
Obviously, many modifications and variations of the present invention are possible in light of the above teachings and may be practiced otherwise than as specifically described while within the scope of the appended claims.