There is an ongoing demand for three-dimensional (3D) object tracking and object scanning for various applications, one of which is autonomous driving. The wavelengths of some types of signals, such as radar, are too long to provide the sub-millimeter resolution needed to detect smaller objects.
Light detection and ranging (LiDAR) systems use optical wavelengths that can provide finer resolution than other types of systems, thereby providing good range, accuracy, and resolution. In general, to determine the distances to objects, LiDAR systems illuminate a target area or scene with pulsed laser light and measure how long it takes for reflected pulses to be returned to a receiver.
One type of LiDAR system is referred to in the art as flash LiDAR. A flash LiDAR system operates similarly to a camera. In conventional flash LiDAR systems, a single, high-powered laser pulse illuminates a large field-of-view (FOV). An array of detectors (typically in close proximity to the laser) simultaneously detects light reflected by objects in the FOV. Typically, a lens focuses the reflected light onto the array of detectors. For each pulsed beam of light directed by the flash LiDAR system into the FOV, the detector array can receive reflected light corresponding to a frame of data. By using one or more frames of data, the ranges or distances of objects in the FOV can be obtained by determining the elapsed time between transmission of the pulsed beam of light by the laser and reception of the reflected light at the light detector array.
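The elapsed-time range computation described above can be illustrated with a minimal sketch (not part of the disclosure; the function name and sample timing are hypothetical):

```python
# Illustrative sketch: estimating a flash LiDAR range from the elapsed
# time between pulse emission and detection of the reflected pulse.
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(elapsed_s: float) -> float:
    """Return the one-way distance to a target given the round-trip time
    of a reflected pulse. The factor of 2 accounts for the pulse traveling
    to the target and back."""
    return C * elapsed_s / 2.0

# A pulse returning after roughly 667 nanoseconds corresponds to a target
# about 100 meters away.
print(round(range_from_round_trip(667e-9)))
```

The same computation applies per detector element, which is how a frame of detections becomes a frame of ranges.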
For some applications (e.g., autonomous driving), it may be challenging or impossible to design a flash LiDAR system that meets all of the cost, size, resolution, and power consumption requirements. Moreover, because of at least power limitations, the range of a conventional flash LiDAR system is generally limited to a couple hundred meters, which may be inadequate for some applications (e.g., autonomous driving).
This summary represents non-limiting embodiments of the disclosure.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, including: a long-range LiDAR subsystem characterized by a first range and a first azimuth angular coverage; and a short-range LiDAR subsystem characterized by a second range and a second azimuth angular coverage, wherein: the first range is greater than the second range, and the second azimuth angular coverage is greater than the first azimuth angular coverage.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the long-range LiDAR subsystem and the short-range LiDAR subsystem are configured to emit light simultaneously.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the first range is at least 800 meters.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the second range is less than or equal to 300 meters.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the second azimuth angular coverage is at least 120 degrees.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the first azimuth angular coverage is less than or equal to ten degrees.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the long-range LiDAR subsystem is further characterized by a first elevation angular coverage, and the short-range LiDAR subsystem is further characterized by a second elevation angular coverage, wherein the second elevation angular coverage is larger than the first elevation angular coverage.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the second elevation angular coverage is at least ten degrees.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the first elevation angular coverage is less than or equal to five degrees.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the long-range LiDAR subsystem includes a first illuminator array and a first detector array, and the short-range LiDAR subsystem includes a second illuminator array and a second detector array.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the first illuminator array and the second illuminator array are configured to emit light simultaneously.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein a field-of-view (FOV) of the first illuminator array partially overlaps a FOV of the second illuminator array.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, further including: at least one processor coupled to the first illuminator array, the second illuminator array, the first detector array, and the second detector array.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one processor is configured to: cause the first illuminator array and the second illuminator array to emit light simultaneously, obtain a first signal from the first detector array, obtain a second signal from the second detector array, and process the first signal and the second signal to estimate a position of at least one object in view of the hybrid LiDAR system.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the first illuminator array includes a first illuminator and a second illuminator, wherein the first illuminator is configured to generate a first pulse sequence, and the second illuminator is configured to generate a second pulse sequence, wherein the first pulse sequence and the second pulse sequence are different.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein at least one of the long-range LiDAR subsystem or the short-range LiDAR subsystem includes: an illuminator array including one or more illuminators; and a detector array including one or more detectors.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the one or more detectors include an avalanche photo-diode (APD), a single-photon avalanche diode (SPAD) detector, or a silicon photomultiplier (SiPM) detector.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the long-range LiDAR subsystem is configured to sense a first volume of space, and the short-range LiDAR subsystem is situated to sense a second volume of space.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the first volume of space and the second volume of space partially overlap.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the long-range LiDAR subsystem is further configured to create a first three-dimensional point cloud of the first volume of space, and the short-range LiDAR subsystem is further configured to create a second three-dimensional point cloud of the second volume of space.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, further including: at least one processor configured to fuse the first three-dimensional point cloud and the second three-dimensional point cloud.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, further including: at least one processor configured to apply optimal transport theory to fuse the first three-dimensional point cloud and the second three-dimensional point cloud.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the long-range LiDAR subsystem or the short-range LiDAR subsystem includes at least one processor configured to fuse the first three-dimensional point cloud and the second three-dimensional point cloud.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the long-range LiDAR subsystem or the short-range LiDAR subsystem includes at least one processor configured to apply optimal transport theory to fuse the first three-dimensional point cloud and the second three-dimensional point cloud.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the first volume of space and the second volume of space are non-intersecting.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein: the long-range LiDAR subsystem is further configured to create a first three-dimensional point cloud of the first volume of space, and the short-range LiDAR subsystem is further configured to create a second three-dimensional point cloud of the second volume of space.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein at least one of the long-range LiDAR subsystem or the short-range LiDAR subsystem includes: a plurality of N illuminators, each of the plurality of N illuminators configured to illuminate a respective one of a plurality of N illuminator fields-of-view (FOVs); a detector including at least one focusing component and at least one detector array, wherein the detector is configured to observe a detector FOV that overlaps at least a first illuminator FOV of the plurality of N illuminator FOVs; and at least one processor configured to: cause a first illuminator of the plurality of N illuminators to emit an optical pulse to illuminate the first illuminator FOV, obtain a signal representing at least one reflected optical pulse detected by the detector, and determine a position of at least one target using the signal.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the detector FOV is a first detector FOV, and wherein the detector is further configured to observe a second detector FOV that overlaps at least a second illuminator FOV of the plurality of N illuminator FOVs.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the detector FOV overlaps a second illuminator FOV of the plurality of N illuminator FOVs.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one detector array includes a plurality of detector arrays, and wherein a particular focusing component of the at least one focusing component is configured to focus reflected signals on the plurality of detector arrays.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the particular focusing component includes a lens and/or a mirror.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the plurality of N illuminators includes a respective laser.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one focusing component includes a plurality of focusing components, and the at least one detector array includes a plurality of detector arrays.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the plurality of focusing components includes N focusing components and the plurality of detector arrays includes N detector arrays.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the plurality of N illuminators is associated with a respective one of the N focusing components and a respective one of the N detector arrays.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the N detector arrays includes at least 200 optical detectors.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the at least 200 optical detectors includes an avalanche photodiode (APD), a single-photon avalanche diode (SPAD), or a silicon photomultiplier (SiPM).
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one detector array includes a plurality of avalanche photodiodes, single-photon avalanche diode (SPAD) detectors, or silicon photomultiplier (SiPM) detectors.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the plurality of N illuminators includes a respective laser.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one focusing component includes a lens.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one detector array includes a plurality of detector arrays, and wherein the lens is shared by the plurality of detector arrays.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the plurality of detector arrays includes at least 200 optical detectors.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one focusing component includes a mirror.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein each of the plurality of N illuminator FOVs is 1 degree or less in an azimuth direction and 1 degree or less in an elevation direction.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the plurality of N illuminators includes at least 40 illuminators.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the at least one detector array includes at least 200 optical detectors.
In some aspects, the techniques described herein relate to a hybrid LiDAR system, wherein the detector FOV is a first detector FOV and the optical pulse is a first optical pulse, and wherein the detector is further configured to observe a second detector FOV that overlaps a second illuminator FOV of the plurality of N illuminator FOVs, and wherein the at least one processor is further configured to cause a second illuminator of the plurality of N illuminators to emit a second optical pulse to illuminate the second illuminator FOV.
In some aspects, the techniques described herein relate to a vehicle including a hybrid LiDAR system, the hybrid LiDAR system including: a long-range LiDAR subsystem characterized by a first range and a first azimuth angular coverage; and a short-range LiDAR subsystem characterized by a second range and a second azimuth angular coverage, wherein: the first range is greater than the second range, and the second azimuth angular coverage is greater than the first azimuth angular coverage.
In some aspects, the techniques described herein relate to a vehicle, wherein: the long-range LiDAR subsystem includes a first portion situated to sense a first volume of space in front of the vehicle and a second portion situated to sense a second volume of space behind the vehicle, and the short-range LiDAR subsystem is situated to sense a third volume of space in front of the vehicle.
In some aspects, the techniques described herein relate to a vehicle, wherein: the first volume of space and the third volume of space partially overlap.
In some aspects, the techniques described herein relate to a vehicle, wherein: the first volume of space and the third volume of space are non-intersecting.
In some aspects, the techniques described herein relate to a vehicle, wherein the long-range LiDAR subsystem is further characterized by a first elevation angular coverage, and the short-range LiDAR subsystem is further characterized by a second elevation angular coverage, wherein the second elevation angular coverage is larger than the first elevation angular coverage.
In some aspects, the techniques described herein relate to a vehicle, wherein the first elevation angular coverage is less than or equal to five degrees, and the second elevation angular coverage is at least ten degrees.
In some aspects, the techniques described herein relate to a vehicle, wherein the first range is at least 800 meters, and the second range is less than or equal to 300 meters.
In some aspects, the techniques described herein relate to a vehicle, wherein the first azimuth angular coverage is less than or equal to ten degrees, and the second azimuth angular coverage is at least 120 degrees.
Objects, features, and advantages of the disclosure will be readily apparent from the following description of certain embodiments taken in conjunction with the accompanying drawings in which:
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized in other embodiments without specific recitation. Moreover, the description of an element in the context of one drawing is applicable to other drawings illustrating that element.
Conventional flash LiDAR systems suffer from a number of drawbacks, including a need for high power and expensive components. Because the FOV 22 is large, there is a trade-off between the power emitted by the illuminator 20 and the distance at which objects can be detected. For example, in order to illuminate an entire scene of interest and allow the detector array 35 to detect reflections off of objects at a reasonable distance from the flash LiDAR system 10, the illuminator 20 generally must emit high-power pulses so that enough energy reflected off of a target 15 reaches the detector array 35. Among other issues, these high-power pulses might not meet eye-safety standards. Furthermore, in order to provide high resolution, which is imperative for certain applications (e.g., autonomous driving), the quality of the lens 33 that focuses reflected pulses onto the detector array 35 must be high, which increases the cost of the lens 33. Additionally, the detector array 35 typically contains tens of thousands or, not uncommonly, hundreds of thousands of individual optical detectors, each for detecting a different, small portion of the scene, in order to unambiguously detect the angles of reflected pulses.
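The trade-off between the illuminator's FOV and its reach can be made concrete with a back-of-envelope sketch (all numbers hypothetical, not from the disclosure): for a fixed emitted power, the irradiance on a target falls with the solid angle the beam spreads into and with the square of the distance.

```python
import math

# Illustrative sketch: irradiance on a target for a beam of given power,
# angular extent, and range. Narrowing the beam raises the irradiance in
# proportion to the reduction in solid angle.

def solid_angle_sr(az_deg: float, el_deg: float) -> float:
    """Approximate solid angle of a rectangular beam (small-angle approximation)."""
    return math.radians(az_deg) * math.radians(el_deg)

def irradiance_w_per_m2(power_w: float, az_deg: float, el_deg: float, range_m: float) -> float:
    """Power per unit area delivered to a target at the given range."""
    intensity_w_per_sr = power_w / solid_angle_sr(az_deg, el_deg)  # W/sr
    return intensity_w_per_sr / range_m**2

# A wide 30 x 30 degree flash beam versus a narrow 1 x 1 degree beam,
# both emitting 10 W, target at 200 m:
wide = irradiance_w_per_m2(10.0, 30.0, 30.0, 200.0)
narrow = irradiance_w_per_m2(10.0, 1.0, 1.0, 200.0)
print(narrow / wide)  # the narrow beam delivers ~900x the irradiance
```

This is why a conventional wide-FOV flash illuminator must emit high-power pulses to reach the same target a much weaker narrow-FOV illuminator can reach.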
Disclosed herein are hybrid LiDAR systems that include a long-range LiDAR subsystem for detecting targets at longer distances and a short-range LiDAR subsystem for detecting targets at closer ranges. In some example embodiments, distances of up to 300 meters can be measured with 15 cm accuracy by the short-range LiDAR subsystem. The hybrid LiDAR system combines the advantages of the short-range LiDAR subsystem, which can generate dense uniform point clouds of objects in near and short range, with the advantages of the long-range LiDAR subsystem, which can identify long-range point targets with high range resolution and high angular resolution.
Each of the long-range LiDAR subsystem and the short-range LiDAR subsystem can include a respective array of illuminators and a respective array of detectors, as described further below. Both the long-range LiDAR subsystem and the short-range LiDAR subsystem can be implemented with reasonable complexity and with eye-safe power levels.
The hybrid LiDAR system 200 example illustrated in
The hybrid LiDAR system 200 may optionally also include one or more analog-to-digital converters (ADCs) 115 situated in the data path between the array of optical components 110 and the at least one processor 150. If present, the one or more ADCs 115 convert analog signals provided by detectors 130 in the detector array 140A and/or the detector array 140B to digital format for processing by the at least one processor 150. The analog signal provided by each of the detector array 140A and the detector array 140B may be a superposition of reflected optical signals, which the at least one processor 150 may then process to determine (estimate) the positions of targets corresponding to (causing) the reflected optical signals.
It is to be understood that in addition to or instead of the ADC(s) 115 illustrated in
It will be appreciated that there are myriad suitable hardware implementations of the hybrid LiDAR system 200 illustrated in
As illustrated in
As shown in
The elevation FOV angle 127 of an illuminator 120 may be the same as or different from the azimuth FOV angle 126 of that illuminator 120. As will be understood by those having ordinary skill in the art, the beams emitted by illuminators 120 can have any suitable shape in three dimensions. For example, the emitted beams may be generally conical (where a cone is an object made up of a collection of (infinitely many) rays). The cross section of the cone can be any arbitrary shape, e.g., circular, ellipsoidal, square, rectangular, etc. In some embodiments, the cross sections of the emitted beams are circular or square.
The volume of space illuminated by an illuminator 120 having boresight angles 124, 125 and FOV angles 126, 127 is referred to herein as the illuminator FOV 122. Objects that are within the illuminator FOV 122 of a particular illuminator 120 are illuminated by optical signals transmitted by that illuminator 120. The illuminator FOV 122 of an illuminator 120 is dependent on and determined by the position of the illuminator 120, and the boresight angles 124, 125 and FOV angles 126, 127 of the illuminator 120. The range of the illuminator 120 is dependent on its optical power and its vertical and horizontal FOV angles (e.g., intensity in watts per steradian). As explained further below, the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B use illuminators 120 having different characteristics (e.g., fields of view).
In some embodiments, the detector 130 comprises a single-photon avalanche diode (SPAD) detector (e.g., a solid-state detector that can detect individual photons), a silicon photomultiplier (SiPM) detector (e.g., a solid-state single-photon-sensitive device based on single-photon avalanche diodes implemented on a common silicon substrate), or another suitable detector.
Like the illuminator 120, the detector 130 may include a lens to focus the received signal, as discussed further below. In addition, or alternatively, like the illuminator 120, the detector 130 may include one or more mirrors to direct the received light in a selected direction.
The detector 130 is shown having a cuboid shape, which is merely symbolic. Each detector 130 has a position in three-dimensional space, which, as explained previously, can be characterized by Cartesian coordinates (x, y, z) on x-, y-, and z-axes, as shown in
As illustrated in
As shown in
The volume of space sensed by a detector 130 having boresight angles 134, 135 and FOV angles 136, 137 is referred to herein as a detector FOV 132. Optical signals reflected by objects within a particular detector 130's detector FOV 132 can be detected by that detector 130. The detector FOV 132 of a detector 130 is dependent on and determined by the position of the detector 130 within the hybrid LiDAR system 200 (e.g., it may be different for detector(s) 130 within the long-range LiDAR subsystem 100A as compared to detector(s) 130 within short-range LiDAR subsystem 100B), and the boresight angles 134, 135 and FOV angles 136, 137 of the detector 130. In some embodiments, the azimuth boresight angle 124, the azimuth FOV angle 126, the azimuth boresight angle 134, and the azimuth FOV angle 136 of a particular detector 130 are selected so that the detector FOV 132 largely coincides with the illuminator FOV 122 of a respective illuminator 120. The range of the detector 130 is dependent on the sensitivity of the detector 130 and irradiance on target.
The long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B can be similar or identical in some respects. This section describes a LiDAR subsystem 100, which, unless otherwise indicated, can be the long-range LiDAR subsystem 100A and/or short-range LiDAR subsystem 100B.
As shown in
The detector array 140 shown in
The array of optical components 110 may be in the same physical housing (or enclosure) as the at least one processor 150, or it may be physically separate. Although the description herein refers to a single array of optical components 110, it is to be understood that the illuminators 120 and the detector(s) 130 can be situated within the LiDAR subsystem 100 in any suitable physical arrangement (e.g., in multiple sub-arrays, etc.).
The LiDAR subsystem 100 may optionally also include one or more analog-to-digital converters (ADCs) 115 disposed between the array of optical components 110 and the at least one processor 150. If present, the one or more ADCs 115 convert analog signals provided by detectors 130 in the array of optical components 110 to digital format for processing by the at least one processor 150. The analog signal provided by each of the detectors 130 may be a superposition of reflected optical signals (e.g., reflected pulses 61) detected by that detector 130, which the at least one processor 150 may then process to determine the positions of targets 15 corresponding to (causing) the reflected optical signals.
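The processing step described above can be sketched as follows (a minimal illustration, not the disclosed implementation; the sample rate, threshold, and peak-picking method are assumptions): the digitized detector output is a superposition of reflected pulses, and locating local peaks above a noise threshold yields one time-of-flight, and hence one range, per target.

```python
# Illustrative sketch: recovering target ranges from one detector's
# digitized trace, which may superpose several reflected pulses.
C = 299_792_458.0       # speed of light in m/s
SAMPLE_RATE_HZ = 1e9    # assumed 1 GS/s ADC

def ranges_from_samples(samples, threshold):
    """Return estimated target ranges (m) from one detector's ADC trace
    by finding local peaks above a noise threshold."""
    ranges = []
    for i in range(1, len(samples) - 1):
        is_peak = (samples[i] > threshold
                   and samples[i] >= samples[i - 1]
                   and samples[i] > samples[i + 1])
        if is_peak:
            tof_s = i / SAMPLE_RATE_HZ      # round-trip time
            ranges.append(C * tof_s / 2.0)  # one-way distance
    return ranges

# Two reflected pulses at sample indices 400 and 900 (~60 m and ~135 m):
trace = [0.0] * 1200
trace[400], trace[900] = 1.0, 0.6
print([round(r) for r in ranges_from_samples(trace, threshold=0.3)])
```

A practical system would use matched filtering and sub-sample interpolation rather than simple thresholding, but the principle of one peak per target is the same.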
As explained above, it is to be understood that in addition to or instead of the ADC(s) 115 illustrated in
Although
As described further below, in some embodiments, each illuminator 120 of a LiDAR subsystem 100 (e.g., the long-range LiDAR subsystem 100A or short-range LiDAR subsystem 100B) is associated with a respective detector array 140 that can be significantly smaller (e.g., have fewer optical detectors 130) than the massive detector array that is typically required in a conventional flash LiDAR system. In these embodiments, the number of detector arrays 140 is equal to the number of illuminators 120.
In other embodiments, a plurality of illuminators 120 with non-overlapping illuminator FOVs 122 can be fired (caused to emit signals) simultaneously. The corresponding detectors 130 assigned to each illuminator 120, whether portions of a single detector 130 or a respective plurality of detectors 130, will correspondingly have non-overlapping detector FOVs 132. Therefore, each portion of the detector array 140 is unambiguously associated with a respective one of the plurality of illuminators 120. This allows the LiDAR subsystem 100 to unambiguously detect the time-of-flight and angular position of a target even when illuminators 120 are fired simultaneously. The ability to fire a plurality of illuminators 120 (e.g., lasers) simultaneously allows one to scan the scenery in a more rapid fashion and yields a higher frame-per-second rate for the output of the LiDAR subsystem 100.
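The unambiguous association described above can be sketched in a few lines (the mapping and names are hypothetical, not from the disclosure): because each portion of the detector array observes the FOV of exactly one simultaneously fired illuminator, every detection can be attributed to a single emitted pulse.

```python
# Illustrative sketch: attributing detections to simultaneously fired
# illuminators via a static portion-to-illuminator mapping.

# Assumed static mapping: detector-array portion index -> illuminator id.
PORTION_TO_ILLUMINATOR = {0: "illum_A", 1: "illum_B", 2: "illum_C"}

def attribute_detections(detections):
    """Group (portion_index, time_of_flight_s) detections by the
    illuminator whose FOV that portion of the detector array observes."""
    by_illuminator = {}
    for portion, tof_s in detections:
        by_illuminator.setdefault(PORTION_TO_ILLUMINATOR[portion], []).append(tof_s)
    return by_illuminator

# Three returns arriving while illum_A and illum_C fired simultaneously:
hits = [(0, 4.0e-7), (2, 9.0e-7), (0, 6.5e-7)]
grouped = attribute_detections(hits)
```

Since no two simultaneously fired illuminators share a detector portion, no time-of-flight measurement is ever ambiguous between pulses.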
In some embodiments, a single detector array 140 is used to detect reflections of optical signals emitted by all of the illuminators 120 in the LiDAR subsystem 100.
The illuminators 120 in a LiDAR subsystem 100 may be identical to each other, or they may differ in one or more characteristics. For example, different illuminators 120 have different positions in the LiDAR subsystem 100 and therefore in space (i.e., they have different (x, y, z) coordinates). The boresight angles 124, 125 and FOV angles 126, 127 of different illuminators 120 may also be the same or different. For example, subsets of illuminators 120 may have configurations whereby they illuminate primarily targets within a certain range of the LiDAR subsystem 100 and are used in connection with detectors 130 that are configured primarily to detect targets within that same range. Similarly, the power of optical signals emitted by different illuminators 120 can be the same or different. For example, illuminators 120 intended to illuminate targets at very large distances from the long-range LiDAR subsystem 100A may use more power than illuminators 120 intended to illuminate targets at somewhat closer distances from the long-range LiDAR subsystem 100A and/or more power than illuminators 120 used in short-range LiDAR subsystem 100B.
The boresight angles 124, 125 and the FOV angles 126, 127 of the illuminators 120 can be selected so that the beams emitted by different illuminators 120 overlap, thereby resulting in different illuminators 120 illuminating overlapping portions of a scene. Unlike conventional LiDAR systems, embodiments of the hybrid LiDAR system 200 disclosed herein are able to resolve the three-dimensional positions of multiple targets within these overlapping regions of space. Moreover, they do not require any moving parts.
In some embodiments, multiple illuminators 120 emit optical signals simultaneously. If the illuminator FOVs 122 of the illuminators 120 that emit optical signals simultaneously are non-overlapping, there is no ambiguity in the times-of-flight of optical signals emitted by the illuminators 120, reflected by the target(s) 15, and detected by the detectors 130. The ability to fire (cause optical signals to be emitted by) multiple illuminators 120 at the same time can allow the LiDAR subsystem 100 to scan the scenery faster and thus increase the number of frames per second (FPS) that the LiDAR subsystem 100 generates.
The detectors 130 of the LiDAR subsystem 100 may be identical to each other, or they may differ in one or more characteristics. For example, different detectors 130 have different positions in the LiDAR subsystem 100 and therefore in space (i.e., they have different (x, y, z) coordinates). The boresight angles 134, 135 and FOV angles 136, 137 of different detectors 130 may also be the same or different. For example, subsets of detectors 130 may have configurations whereby they observe targets within a certain range of the LiDAR subsystem 100 and are used in connection with illuminators 120 that are configured primarily to illuminate targets within that same range.
The LiDAR subsystem 100 example of
In the example of
An example illustrates potential benefits of the disclosed LiDAR subsystem 100, such as the example embodiment shown in
The disclosed LiDAR subsystems 100 offer several advantages relative to conventional LiDAR systems (e.g., flash LiDAR systems). For example, because the illuminator FOVs 122 are narrow, pulses emitted by the illuminators 120 travel further without being dispersed as they would be if the illuminator FOVs 122 were wider. Thus, for a given power level, pulses originating from the illuminators 120 (emitted pulses 60) can reach and be reflected by objects (targets) at distances that are considerably larger than the maximum detectable-object distance of a conventional flash LiDAR system. Likewise, because the illuminator FOVs 122 are narrow, the reflected pulses 61 caused by emitted optical signals from individual illuminators 120 can reach and be detected by detectors 130 using a much smaller number of optical detectors 142, each of which “looks at” only a narrow detector FOV 132. The narrow detector FOV 132 of each detector 130 substantially coincides with the illuminator FOV 122 of the respective illuminator 120 (e.g., by collocating each illuminator 120 and its respective detector 130 and choosing suitable azimuth boresight angle 124, elevation boresight angle 125, azimuth FOV angle 126, elevation FOV angle 127, azimuth boresight angle 134, elevation boresight angle 135, azimuth FOV angle 136, and elevation FOV angle 137).
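The reduction in detector count can be illustrated with back-of-envelope arithmetic (all numbers hypothetical, chosen only to show the scaling):

```python
# Illustrative sketch: detector counts for covering a 40 x 5 degree scene
# at 0.1 degree angular resolution.

scene_az_deg, scene_el_deg = 40.0, 5.0
resolution_deg = 0.1

# A conventional flash LiDAR needs one detector per resolution cell of
# the whole scene:
flash_detectors = round((scene_az_deg / resolution_deg) * (scene_el_deg / resolution_deg))

# With 40 illuminators of 1 x 5 degrees each, every illuminator needs a
# detector array covering only its own narrow FOV:
n_illuminators = 40
per_array = round((1.0 / resolution_deg) * (scene_el_deg / resolution_deg))

print(flash_detectors)                  # one large array of 20000 detectors
print(n_illuminators, "x", per_array)   # versus 40 small arrays of 500 detectors
```

The total detector count is the same, but each small array only needs optics and readout sized for its narrow FOV, which relaxes the lens-quality requirement noted earlier.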
Additionally, a benefit of having multiple spatially-separated illuminators 120 is that the LiDAR subsystem 100 can reach (detect objects at) longer distances without violating eye safety restrictions. For example, if the beams of two illuminators 120 overlap at a particular point in the field (scene), a person situated at that location will see two separated beams from the illuminators 120, which will form two different spots on the person's retina. Laser eye-safety guidelines (e.g., ANSI Z136.1-2014 or similar) may treat this configuration as an extended source and may be less restrictive than if all the incident power at the person's eye were coming from a single illuminator 120.
Furthermore, the power levels of individual illuminators 120 can be dynamically adjusted to, for example, maintain the quality of reflected pulses 61 (and thereby avoid detector 130 saturation), and to meet eye safety standards while not affecting the overall long-range FOV of the LiDAR subsystem 100.
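One way to realize such dynamic adjustment is a simple per-illuminator feedback loop. The sketch below is a hypothetical illustration only; the gain, target amplitude, and eye-safe cap are assumed values, not part of the disclosure:

```python
def next_power(current_w: float, peak_return: float,
               target_return: float, eye_safe_max_w: float,
               gain: float = 0.5) -> float:
    """Nudge an illuminator's power toward the level that keeps the
    detector's peak return near a target amplitude, never exceeding
    the eye-safe cap."""
    if peak_return <= 0.0:
        return eye_safe_max_w          # no echo: probe at maximum power
    # Proportional step toward the power that would yield target_return.
    desired = current_w * (target_return / peak_return)
    adjusted = current_w + gain * (desired - current_w)
    return max(0.0, min(adjusted, eye_safe_max_w))

# A saturating return (peak 2.0 vs. target 1.0) drives power down;
p = next_power(1.0, 2.0, 1.0, 1.5)   # 0.75 W
# a weak return drives power up, clamped at the eye-safe cap.
q = next_power(1.0, 0.1, 1.0, 1.5)   # 1.5 W (clamped)
```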
The LiDAR subsystem 100 shown in
As illustrated in the example of
As explained above, a benefit of having multiple spatially-separated illuminators 120 is that the LiDAR subsystem 100 (whether the long-range LiDAR subsystem 100A or short-range LiDAR subsystem 100B) can reach longer distances without violating eye safety restrictions. For example, referring to
In some embodiments, the LiDAR subsystem 100 comprises multiple spatially-separated illuminators 120 that illuminate overlapping illuminator FOVs 122. As an example,
It is also to be appreciated that although the drawings herein show lenses 133 as the focusing components, the detectors 130 can include additional and/or alternative focusing components (e.g., mirrors, etc.), as explained above.
The long-range LiDAR subsystem 100A provides high target resolution over much larger distances than conventional LiDAR systems, and over larger distances than the short-range LiDAR subsystem 100B, which is described further below. The long-range LiDAR subsystem 100A includes a plurality of illuminators 120 (e.g., lasers) and a plurality of optical detectors 130 (e.g., photodetectors, such as avalanche photodiodes (APDs)). The individual illuminators 120 and detectors 130 can be, for example, as described above in the discussions of
Rather than using a single, high-powered laser to illuminate the entire scene, the long-range LiDAR subsystem 100A uses an array of illuminators 120, each of which has an illuminator FOV 122 that is much narrower than that of the single laser used in conventional flash LiDAR systems. Together, the array of illuminators 120 can simultaneously illuminate the entire scene at distances that are considerably further away from the system than the maximum distance at which a conventional flash LiDAR system can detect objects. Furthermore, the long-range LiDAR subsystem 100A provides high resolution at distances much larger than those feasible for conventional flash LiDAR systems. Because the illuminator FOV 122 of each illuminator 120 is narrow, the power of each illuminator 120 can be lower than in a conventional LiDAR system, yet illuminate objects at larger distances from the long-range LiDAR subsystem 100A without violating eye-safety standards.
In some embodiments, the azimuth FOV angle 126 of the illuminator(s) 120 of the long-range LiDAR subsystem 100A is 1 degree or less. It is to be appreciated that, in general, there is no requirement for the azimuth FOV angle 126 to be any particular value.
In some embodiments, the elevation FOV angle 127 of the illuminator(s) 120 of the long-range LiDAR subsystem 100A is 1 degree or less. It is to be appreciated that, in general, there is no requirement for the elevation FOV angle 127 to be any particular value.
The short-range LiDAR subsystem 100B provides high accuracy over shorter distances than covered by the long-range LiDAR subsystem 100A. For example, distances of up to 300 meters can be measured with 15 cm accuracy by the short-range LiDAR subsystem 100B.
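The quoted figures can be related to pulse timing through the round-trip time-of-flight relation d = c·t/2. As an illustrative back-of-the-envelope check, 15 cm range accuracy corresponds to roughly 1 ns of round-trip timing resolution:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    """Target distance from round-trip time of flight: d = c * t / 2."""
    return C * round_trip_s / 2.0

def timing_for_accuracy(range_accuracy_m: float) -> float:
    """Round-trip timing resolution needed for a given range accuracy."""
    return 2.0 * range_accuracy_m / C

# A 2-microsecond round trip corresponds to a target ~300 m away ...
d = range_from_tof(2e-6)              # ~299.79 m
# ... and 15 cm range accuracy requires ~1.0 ns timing resolution.
dt = timing_for_accuracy(0.15)        # ~1.0e-9 s
```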
In some embodiments, the short-range LiDAR subsystem 100B includes a plurality of illuminators 120 (e.g., lasers) and a plurality of optical detectors 130 (e.g., photodetectors, such as avalanche photodiodes (APDs)). The individual illuminators 120 and detectors 130 can be, for example, as described above in the discussions of
In some embodiments, rather than using a single, high-powered laser to illuminate the entire scene, the short-range LiDAR subsystem 100B uses an array of illuminators 120, each of which has an illuminator FOV 122 that is much narrower than that of the single laser used in conventional flash LiDAR systems. Together, the array of illuminators 120 can simultaneously illuminate the entire scene at distances that are considerably further away from the system than the maximum distance at which a conventional flash LiDAR system can detect objects. Alternatively, the array of illuminators 120 can provide the same range as a conventional flash LiDAR system but by emitting less power. Because the illuminator FOV 122 of each illuminator 120 is narrow, the power of each illuminator 120 can be lower than in a conventional LiDAR system, yet illuminate objects at larger distances from short-range LiDAR subsystem 100B without violating eye-safety standards.
A conventional flash LiDAR could alternatively be used as a short-range LiDAR subsystem 100B.
A primary difference between the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B is that the short-range LiDAR subsystem 100B has a wider FOV. For example, referring to
For example, in some embodiments, the short-range LiDAR subsystem 100B has a large azimuth angular coverage (e.g., the azimuth FOV angle 126 can be 180° or 360°), some elevation angular coverage (e.g., the elevation FOV angle 127 can be 10° to 30°), and a range coverage up to some maximum range rshort (e.g., 150 m or more), where the azimuth angular coverage and elevation angular coverage are larger than those of the long-range LiDAR subsystem 100A, and the range is less than the maximum range of the long-range LiDAR subsystem 100A.
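A coverage volume of this kind can be expressed as a simple membership test. The following sketch uses assumed example angles and range (not values mandated by the disclosure) and checks whether a point lies inside an azimuth span, an elevation span, and a maximum range:

```python
import math

def in_coverage(x: float, y: float, z: float,
                az_fov_deg: float, el_fov_deg: float,
                r_max_m: float) -> bool:
    """True if point (x, y, z) lies inside a coverage volume whose
    boresight is the +x axis: x forward, y left, z up."""
    r = math.sqrt(x * x + y * y + z * z)
    az = math.degrees(math.atan2(y, x))
    el = math.degrees(math.atan2(z, math.hypot(x, y)))
    return (r <= r_max_m
            and abs(az) <= az_fov_deg / 2.0
            and abs(el) <= el_fov_deg / 2.0)

# Example short-range coverage: 180-degree azimuth span, 20-degree
# elevation span, 150 m maximum range.
assert in_coverage(100.0, 20.0, 5.0, 180.0, 20.0, 150.0)      # inside
assert not in_coverage(200.0, 0.0, 0.0, 180.0, 20.0, 150.0)   # too far
```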
As explained above, the hybrid LiDAR system 200 includes a long-range LiDAR subsystem 100A for detecting targets at longer ranges and a short-range LiDAR subsystem 100B for detecting targets at closer ranges. In some example embodiments, distances of up to 300 meters can be measured with 15 cm accuracy by the short-range LiDAR subsystem 100B. The hybrid LiDAR system 200 combines the advantages of the short-range LiDAR subsystem 100B, which can generate dense uniform point clouds of objects at near and short range, with the advantages of the long-range LiDAR subsystem 100A, which can identify long-range point targets with high range resolution and high angular resolution.
The long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B can operate simultaneously. In some embodiments, the illuminators 120 of the long-range LiDAR subsystem 100A emit light using pulse sequences that are different from the pulse sequences emitted by the illuminators 120 of the short-range LiDAR subsystem 100B.
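Such distinct pulse sequences allow a receiver to attribute an echo to the subsystem that emitted it, for example by matched filtering. The sketch below is a simplified, hypothetical illustration using made-up binary codes and a sliding-correlation classifier:

```python
def correlation_peak(rx, code):
    """Maximum sliding dot product of a binary code against a
    received sample stream (a crude matched filter)."""
    m = len(code)
    return max(sum(rx[i + j] * code[j] for j in range(m))
               for i in range(len(rx) - m + 1))

# Hypothetical codes for the two subsystems.
LONG_CODE = [1, 0, 1, 1, 0, 1]
SHORT_CODE = [1, 1, 0, 0, 1, 1]

def classify(rx):
    """Attribute an echo to the subsystem whose code correlates best."""
    peaks = {"long": correlation_peak(rx, LONG_CODE),
             "short": correlation_peak(rx, SHORT_CODE)}
    return max(peaks, key=peaks.get)

# A noise-free echo of the long-range code is attributed correctly.
rx = [0, 0] + LONG_CODE + [0, 0]
assert classify(rx) == "long"
```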
The long-range LiDAR subsystem 100A has an azimuth angular coverage that is focused on a particular area of interest. For autonomous driving applications, the long-range LiDAR subsystem 100A is typically focused on the front and/or back of the vehicle, though other areas of focus (e.g., on the sides) are also possible.
The azimuth angular coverage of the long-range LiDAR subsystem 100A (e.g., 20° to 30°) can be much smaller than that of the short-range LiDAR subsystem 100B. The elevation angular coverage could be, for example, only a few degrees because the long-range LiDAR subsystem 100A is focused on distances far away.
The long-range LiDAR subsystem 100A has a range, rlong (shown as d2), that is much longer than the range rshort (shown as d1), e.g., typically somewhere between 400 m and 800 m, though it could be longer or shorter.
In areas (e.g., at distances, ranges of distances, or volumes of space) where the coverage of the short-range LiDAR subsystem 100B does not overlap the coverage of the long-range LiDAR subsystem 100A, each subsystem can create its own three-dimensional (3D) point cloud of the scenery. As will be appreciated by those having ordinary skill in the art, a point cloud is a collection of points that represents a 3D shape or object/target, from which range, angle, and velocity information can be determined, and that can be processed by a perception engine (e.g., the at least one processor 150). The point cloud from the long-range LiDAR subsystem 100A maps part of the scene, and the point cloud from the short-range LiDAR subsystem 100B maps a non-intersecting part of the scene.
In areas (e.g., at distances, ranges of distances, or volumes of space) where the coverage 215 of the short-range LiDAR subsystem 100B and the coverage 225 of the long-range LiDAR subsystem 100A overlap (e.g., areas within the azimuth and elevation fields of view of the long-range LiDAR subsystem 100A that are within the range rshort), the short-range LiDAR subsystem 100B and the long-range LiDAR subsystem 100A can cooperate (e.g., directly or via a processor (e.g., the at least one processor 150) or other subsystem of the hybrid LiDAR system 200 that is coupled to short-range LiDAR subsystem 100B and long-range LiDAR subsystem 100A) and fuse their 3D point clouds to yield one or more composite 3D point clouds for the overlap area. In other words, the point cloud from the long-range LiDAR subsystem 100A can be combined with the point cloud from short-range LiDAR subsystem 100B to improve accuracy of target detection within the region that both the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B can observe.
Various methods can be used to fuse the 3D point clouds obtained from short-range LiDAR subsystem 100B and long-range LiDAR subsystem 100A in the common overlap area. These include Bayesian methods, SNR-based selection methods, and others. One way to fuse the 3D point clouds obtained from the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B is using optimal transport theory. As will be appreciated, it is useful to identify a measure of “distance” between pairs of probability distributions, and optimal transport theory can be used to construct a notion of distance between probability distributions. Stated another way, optimal transport theory provides a framework that explicitly accounts for geometric relationships by modeling a signal as mass that incurs a cost to move around its support.
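For one-dimensional empirical distributions with the same number of samples, the optimal-transport (Wasserstein-1) distance reduces to the mean absolute difference between sorted samples. The toy sketch below illustrates the "cost of moving mass" notion; it illustrates the concept of the distance, not the fusion method itself, and the sample ranges are hypothetical:

```python
def wasserstein_1d(a, b):
    """Wasserstein-1 distance between two equal-size 1-D empirical
    distributions: average cost of moving each sorted sample of `a`
    to its counterpart in `b`."""
    assert len(a) == len(b)
    return sum(abs(x - y) for x, y in zip(sorted(a), sorted(b))) / len(a)

# Ranges (m) reported for the same target by the two subsystems:
long_range_hits = [101.0, 102.0, 103.0]
short_range_hits = [101.5, 102.5, 103.5]

# Each sample moves 0.5 m, so the transport distance is 0.5.
d = wasserstein_1d(long_range_hits, short_range_hits)   # 0.5
```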
As an example, a 3D point cloud from the long-range LiDAR subsystem 100A can be fused with a 3D point cloud of an overlapping region from short-range LiDAR subsystem 100B by, for example, pointwise multiplication of the individual point clouds from the different bands to obtain a fused point cloud. As more observations are gathered over time, the fused point cloud evolves and becomes more accurate. As a result, ghost targets can be eliminated (e.g., by eliminating candidate positions that are below a threshold probability), and the true positions of targets can be determined.
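The pointwise-multiplication fusion described above can be sketched over a shared occupancy grid. The cell keys, probabilities, and threshold below are hypothetical values chosen for illustration:

```python
def fuse(cloud_a, cloud_b, threshold=0.25):
    """Fuse two occupancy-probability grids over their overlap region by
    pointwise multiplication, discarding cells whose fused probability
    falls below a threshold (e.g., ghost targets seen by only one
    subsystem)."""
    fused = {}
    for cell in cloud_a.keys() & cloud_b.keys():
        p = cloud_a[cell] * cloud_b[cell]
        if p >= threshold:
            fused[cell] = p
    return fused

# Hypothetical per-cell detection probabilities in the overlap region.
long_cloud = {"(10,2)": 0.9, "(14,5)": 0.8}   # (14,5) is a ghost
short_cloud = {"(10,2)": 0.9, "(14,5)": 0.1}  # not confirmed up close
fused = fuse(long_cloud, short_cloud)
assert "(10,2)" in fused       # true target survives (0.81)
assert "(14,5)" not in fused   # ghost eliminated (0.08 < threshold)
```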
Accordingly, in some embodiments, a hybrid LiDAR system 200 comprises a long-range LiDAR subsystem 100A characterized by a first range (e.g., 800 meters or more) and a first azimuth angular coverage (e.g., less than or equal to 10 degrees) and a short-range LiDAR subsystem 100B characterized by a second range (e.g., less than or equal to 300 meters) and a second azimuth angular coverage (e.g., at least 120 degrees), where the first range is greater than the second range, and the second azimuth angular coverage is greater than the first azimuth angular coverage. As explained above, in some embodiments, the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B are configured to emit light simultaneously. The long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B may use distinct pulse sequences to allow a determination of which LiDAR subsystem 100 emitted the pulse sequence that resulted in a particular reflected signal.
In some embodiments, the long-range LiDAR subsystem 100A is further characterized by a first elevation angular coverage, and the short-range LiDAR subsystem 100B is further characterized by a second elevation angular coverage that is larger than the first elevation angular coverage. For example, the first elevation angular coverage may be less than or equal to 5 degrees, and the second elevation angular coverage may be at least 10 degrees.
In some embodiments, the long-range LiDAR subsystem 100A comprises an illuminator array 112A and a detector array 140A, and the short-range LiDAR subsystem 100B comprises an illuminator array 112B and a detector array 140B. The illuminator array 112A and the illuminator array 112B may be configured to emit light simultaneously (e.g., as directed by at least one processor 150). In some embodiments, a FOV of the long-range LiDAR subsystem 100A partially overlaps a FOV of the short-range LiDAR subsystem 100B. For example, any illuminator FOV 122 resulting from the azimuth boresight angle 124, elevation boresight angle 125, azimuth FOV angle 126, and/or elevation FOV angle 127 of the long-range LiDAR subsystem 100A can partially overlap an illuminator FOV 122 resulting from the azimuth boresight angle 124, elevation boresight angle 125, azimuth FOV angle 126, and/or elevation FOV angle 127 of the short-range LiDAR subsystem 100B. Alternatively, or in addition, any detector FOV 132 resulting from the azimuth boresight angle 134, elevation boresight angle 135, azimuth FOV angle 136, and/or elevation FOV angle 137 of the long-range LiDAR subsystem 100A can partially overlap a detector FOV 132 resulting from the azimuth boresight angle 134, elevation boresight angle 135, azimuth FOV angle 136, and/or elevation FOV angle 137 of the short-range LiDAR subsystem 100B.
In some embodiments, the hybrid LiDAR system 200 also includes at least one processor 150 coupled to the illuminator array 112A, the illuminator array 112B, the detector array 140A, and the detector array 140B. In some such embodiments, the at least one processor 150 is configured to cause the illuminator array 112A and the illuminator array 112B to emit light simultaneously, to obtain a first signal from the detector array 140A, to obtain a second signal from the detector array 140B, and to process the first signal and the second signal to estimate a position of at least one object (e.g., at least one target 15) in view of the hybrid LiDAR system 200.
In some embodiments, the hybrid LiDAR system 200 comprises an illuminator 120A and an illuminator 120B. In some embodiments, the illuminator 120A is configured to generate a first pulse sequence, and the illuminator 120B is configured to generate a second pulse sequence that is different from the first pulse sequence.
In some embodiments, the long-range LiDAR subsystem 100A and/or the short-range LiDAR subsystem 100B includes an illuminator array 112 comprising one or more illuminators 120 (e.g., lasers), and a detector array 140 comprising one or more detectors 130 (e.g., an avalanche photodiode (APD), a single-photon avalanche diode (SPAD) detector, or a silicon photomultiplier (SiPM) detector).
In some embodiments, the long-range LiDAR subsystem 100A is configured to sense a first volume of space, and the short-range LiDAR subsystem 100B is configured to sense a second volume of space. The volumes of space sensed by the long-range LiDAR subsystem 100A and the short-range LiDAR subsystem 100B may partially overlap (e.g., as shown in
In some embodiments, the long-range LiDAR subsystem 100A is configured to create a first three-dimensional (3D) point cloud representing the first volume of space, and the short-range LiDAR subsystem 100B is configured to create a second 3D point cloud representing the second volume of space. In some embodiments, the hybrid LiDAR system 200 includes at least one processor 150 configured to fuse the first and second 3D point clouds (e.g., to eliminate ghost targets and improve accuracy of target detection). In some embodiments, the long-range LiDAR subsystem 100A and/or the short-range LiDAR subsystem 100B includes at least one processor 150 configured to fuse the first and second 3D point clouds. The fusing process may take advantage of optimal transport theory.
As explained above, a vehicle can include a hybrid LiDAR system 200 as described herein. In some embodiments, the long-range LiDAR subsystem 100A comprises a first portion situated to sense a first volume of space in front of the vehicle and a second portion situated to sense a second volume of space behind the vehicle, and the short-range LiDAR subsystem 100B is situated to sense a third volume of space in front of the vehicle. In some embodiments, the first volume of space and the third volume of space partially overlap. In some embodiments, the first volume of space and the third volume of space are non-intersecting (non-overlapping). In some embodiments, the long-range LiDAR subsystem 100A is further characterized by a first elevation angular coverage (e.g., less than or equal to 5 degrees), and short-range LiDAR subsystem 100B is further characterized by a second elevation angular coverage that is larger than the first elevation angular coverage (e.g., at least 10 degrees). In some embodiments, the long-range LiDAR subsystem 100A is able to detect targets at a range of at least 800 meters, and the short-range LiDAR subsystem 100B is able to detect targets at a range up to about 300 meters. In some embodiments, the long-range LiDAR subsystem 100A has an azimuth angular coverage that is less than or equal to about 10 degrees, and the short-range LiDAR subsystem 100B has an azimuth angular coverage that is at least 120 degrees.
In the foregoing description and in the accompanying drawings, specific terminology has been set forth to provide a thorough understanding of the disclosed embodiments. In some instances, the terminology or drawings may imply specific details that are not required to practice the invention.
To avoid obscuring the present disclosure unnecessarily, well-known components are shown in block diagram form and/or are not discussed in detail or, in some cases, at all.
Unless otherwise specifically defined herein, all terms are to be given their broadest possible interpretation, including meanings implied from the specification and drawings and meanings understood by those skilled in the art and/or as defined in dictionaries, treatises, etc. As set forth explicitly herein, some terms may not comport with their ordinary or customary meanings.
As used in the specification and the appended claims, the singular forms “a,” “an” and “the” do not exclude plural referents unless otherwise specified. The word “or” is to be interpreted as inclusive unless otherwise specified. Thus, the phrase “A or B” is to be interpreted as meaning all of the following: “both A and B,” “A but not B,” and “B but not A.” Any use of “and/or” herein does not mean that the word “or” alone connotes exclusivity.
As used in the specification and the appended claims, phrases of the form “at least one of A, B, and C,” “at least one of A, B, or C,” “one or more of A, B, or C,” and “one or more of A, B, and C” are interchangeable, and each encompasses all of the following meanings: “A only,” “B only,” “C only,” “A and B but not C,” “A and C but not B,” “B and C but not A,” and “all of A, B, and C.”
To the extent that the terms “include(s),” “having,” “has,” “with,” and variants thereof are used in the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising,” i.e., meaning “including but not limited to.”
The terms “exemplary” and “embodiment” are used to express examples, not preferences or requirements.
The term “coupled” is used herein to express a direct connection/attachment as well as a connection/attachment through one or more intervening elements or structures.
The terms “over,” “under,” “between,” and “on” are used herein to refer to a relative position of one feature with respect to other features. For example, one feature disposed “over” or “under” another feature may be directly in contact with the other feature or may have intervening material. Moreover, one feature disposed “between” two features may be directly in contact with the two features or may have one or more intervening features or materials. In contrast, a first feature “on” a second feature is in contact with that second feature.
The term “substantially” is used to describe a structure, configuration, dimension, etc. that is largely or nearly as stated, but, due to manufacturing tolerances and the like, may in practice result in a situation in which the structure, configuration, dimension, etc. is not always or necessarily precisely as stated. For example, describing two lengths as “substantially equal” means that the two lengths are the same for all practical purposes, but they may not (and need not) be precisely equal at sufficiently small scales. As another example, a structure that is “substantially vertical” would be considered to be vertical for all practical purposes, even if it is not precisely at 90 degrees relative to horizontal.
The drawings are not necessarily to scale, and the dimensions, shapes, and sizes of the features may differ substantially from how they are depicted in the drawings.
Although specific embodiments have been disclosed, it will be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the disclosure. For example, features or aspects of any of the embodiments may be applied, at least where practicable, in combination with any other of the embodiments or in place of counterpart features or aspects thereof. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Furthermore, certain values are provided herein as examples (e.g., of ranges, angular coverage, accuracy, etc.), but these values are not intended to limit the scope of the disclosure. It will be appreciated by those having ordinary skill in the art that other values may be possible/achievable today, and that certain values may improve (e.g., accuracy, range, angular coverage, etc.) as the technology used to implement hybrid LiDAR systems 200 improves.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/045849 | 10/6/2022 | WO |

Number | Date | Country
---|---|---
63253043 | Oct 2021 | US