Lidar has been widely used in autonomous driving vehicles and Advanced Driver Assistance Systems (ADASs). A lidar system with a wide field of view (FOV), high angular resolution (<0.1 degrees), a fast scanning rate (>30 fps), and a large detection range is needed for automotive object detection in three-dimensional space. For ADAS, a long-range lidar system detects returning photons from objects located several hundreds of meters away with a limited FOV. In contrast, short-range lidar systems detect objects approaching from the sides of the vehicle with a large FOV, e.g., 90 degrees in the horizontal direction. Such an application-specific 3D sensing volume defines the lidar optical architecture, which is predominantly governed by the radiometry of the lidar system. For example, in a long-range lidar system, obtaining enough returning photons reflected from objects plays a critical role because the number of returning photons scales as 1/R²,
where R is the distance of the object from the detector. To recover the falling-off signal from a distant object, a large receiver aperture is needed. Therefore, trade-offs arise in the selection of scanning modalities along with the selection of sensors, such as a single detector for a point-and-shoot lidar system, or a 1- or 2-dimensional array sensor for a flash scanning lidar system.
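For illustration only, the following minimal sketch (in Python) shows how the relative return signal scales with range and receiver aperture under the 1/R² dependence noted above; the aperture diameters, ranges, and reflectivity value are hypothetical placeholders rather than parameters of any embodiment described herein.

```python
import math

def relative_return_signal(range_m, aperture_diameter_m, reflectivity=0.1):
    """Relative returning-photon signal from a diffusely reflecting target.

    Assumes a fixed transmitted pulse energy, so the collected signal scales
    with the receiver aperture area and falls off as 1/R^2; the returned
    value is a relative figure of merit, not an absolute photon count.
    """
    aperture_area = math.pi * (aperture_diameter_m / 2.0) ** 2
    return reflectivity * aperture_area / range_m ** 2

# Doubling the range quarters the signal; doubling the aperture diameter
# recovers a factor of four (hypothetical 25 mm and 50 mm apertures).
print(relative_return_signal(100.0, 0.025) / relative_return_signal(50.0, 0.025))   # 0.25
print(relative_return_signal(100.0, 0.050) / relative_return_signal(100.0, 0.025))  # 4.0
```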
Recently, the solid-state silicon photomultiplier (SiPM), also referred to as a Multi-Pixel Photon Counter (MPPC), has been introduced for infrared lidar applications. An MPPC pixel with a microcell size of 100 μm consists of a sub-array of Avalanche Photo Diodes (APDs) operating in Geiger mode, each with a quenching resistor. For lidar applications, the 2-dimensional sensor array of the MPPC provides a photon detection efficiency of about 10% at 905 nm, which is suitable for lidar sensors. However, the pixel count of the 2D MPPC sensor array is rather modest at this point, around 1-2 k pixels. With a pixel pitch of 25 μm, the device size is on the order of 1-2 mm on a side, which is relatively small compared to a conventional complementary metal-oxide semiconductor (CMOS) image sensor. If a 2D MPPC array image sensor is used for a flash lidar system with 0.1 degree resolution, the total FOV is limited to 1-2 degrees with a 1-2 k pixel count.
To overcome such a limited FOV while accommodating a large aperture, mechanical scanning modalities such as scanning mirrors and Risley prisms have been employed. However, these scanning mechanisms themselves limit the scan speed. Micro Electro Mechanical System (MEMS) resonant mirrors can support a large scanning angle and a high scanning speed at the expense of a limited aperture size, which is typically on the order of millimeters. The challenge is to simultaneously satisfy the requirements of a large scanning angle, a large beam area, and a fast scan rate in a reliable manner.
Another class of MEMS devices, the Texas Instruments Digital Micromirror Device (DMD), is uniquely positioned because of its large aperture area (>100 mm²) and fast frame rate (>40 kHz). Over the past several decades, DMDs have been used as display panels for projection displays. The DMD employs an array of electrostatically actuated mirrors to spatially modulate light. Each micromirror element tilts about its rotation axis by ±12° between the on-state and off-state. The on-state micromirrors redirect the light into the pupil of a projection lens, while the off-state micromirrors redirect the light outside of the pupil. In this way, illuminating light is spatially modulated in a pixelated manner to form an image at the projection screen. The frame rate of a DMD exceeds several tens of kHz, enabling pulse width modulation of light for a full-color RGB display by synchronizing the pattern to a sequential LED RGB illumination source.
Recently, as described in U.S. Pat. No. 11,635,614 and B. Smith, B. Hellman, A. Gin, A. Espinoza, and Y. Takashima, "Single chip lidar with discrete beam steering by digital micromirror device," Opt. Express 25(13), 14732 (2017), a new illumination scheme employing nanosecond laser pulses with a DMD was proposed to steer the beam into one out of multiple directions with high efficiency. In the beam steering process, the DMD is first triggered to actuate all the micromirrors from the off-state to the on-state. While the mirrors are in motion, a nanosecond pulse illuminates the micromirrors. Due to the three orders of magnitude difference in time scale between the transitional period of the DMD from the off- to the on-state (several μs) and the ns pulse, the dynamic movement of the micromirrors between the off-state and on-state is effectively "frozen" so that the transitional states of the micromirrors satisfy the blazed grating condition. In this way, high-efficiency beam and image steering is achieved. The ns pulse illumination has a natural affinity to Time of Flight (ToF) lidar, though the number of scanning points is still limited even with multiple laser sources. The limited number of scanning points was addressed by combining two kinds of MEMS devices, a MEMS resonant mirror and a DMD, in E. Kang, H. Choi, B. Hellman, J. Rodriguez, B. Smith, X. Deng, P. Liu, T. Lee, E. Evans, Y. Hong, J. Guan, C. Luo, and Y. Takashima, "All-MEMS Lidar Using Hybrid Optical Architecture with Digital Micromirror Devices and a 2D-MEMS Mirror," Micromachines 13(9), 1444 (2022). Fine steering within a narrow FOV (5 degrees) is performed by the MEMS resonant mirror, while a DMD steers the FOV of the MEMS resonant mirror over the full FOV. For the receiver, a single APD detector is used, and the receiver DMD covers the 35-degree FOV.
The lidar system described in E. Kang, H. Choi, B. Hellman, J. Rodriguez, B. Smith, X. Deng, P. Liu, T. Lee, E. Evans, Y. Hong, J. Guan, C. Luo, and Y. Takashima, "All-MEMS Lidar Using Hybrid Optical Architecture with Digital Micromirror Devices and a 2D-MEMS Mirror," Micromachines 13(9), 1444 (2022) employs a MEMS resonant mirror and a DMD to form a lidar image by point-by-point steering and detection using a single APD. By employing an SiPM array as a detector, with DMD-based beam steering for the transmitter and FOV steering for the receiver, a lidar image is formed in a solid-state manner using components having a high Technology Readiness Level (TRL), while eliminating the MEMS resonant mirror from the system. The aforementioned MEMS-resonant-mirror-and-DMD lidar architecture is leveraged for use in a lidar system by employing a 2-dimensional SiPM in lieu of a single APD detector for the receiver. Transmitter optics, a pulsed laser and a DMD are used to provide flood illumination over several degrees, which is matched to the FOV of the 2-D SiPM ToF detector array. The illumination areas are sequentially illuminated by the transmitter DMD, and the receiver FOV is synchronously scanned. In this manner, the relatively narrow FOV of the SiPM array is overcome by steering the FOV with the receiver DMD. In the transmitter, the illumination angle is matched to that of the detector, which is several degrees. Therefore, compared to conventional flood illumination that illuminates the entire FOV at once, the scanning lidar shown in this reference reduces the illuminated FOV from tens of degrees to several degrees. Consequently, it increases the power density of the transmitted laser pulses and increases the number of returning photons. With this increased number of returning photons, the maximum detection range is increased.
In one aspect, the subject matter described herein provides a wide field-of-view (FOV) MEMS-based all-solid-state lidar system that employs high technology readiness level (TRL) components. By use of a highly efficient and repeatable lidar image steering technique, the system FOV can be well controlled and pointed to a specific target location.
In another aspect, the lidar system described herein employs an image steering method that achieves a time-multiplexed field of view (FOV) expansion by employing a Digital Micromirror Device (DMD) as a programmable blazed grating and a 2-dimensional lidar sensor array as a detector. In one illustrative embodiment that employs a 905 nm nanosecond pulsed laser, the lidar system demonstrates a sevenfold improvement in the FOV without sacrificing the angular resolution of the lidar images. A lidar image crosstalk test and an advanced image stitching process reveal that this illustrative embodiment of the lidar system is capable of horizontally expanding the lidar detection area to a 44° full field of view in real-time.
In one particular embodiment, a LIDAR system is provided for detecting one or more objects in an extended region. The lidar system includes a laser source configured to generate laser light pulses, a first DMD, a second DMD and a two-dimensional (2D) sensor array. The first DMD is configured to receive the laser light pulses and diffractively steer the laser light pulses to sequentially illuminate different sub-regions within the extended region. The second DMD is configured to receive reflected light pulses from the different sub-regions in a sequential manner as each of the different sub-regions is illuminated by the laser light pulses. The 2D sensor array is configured to receive the reflected light pulses from the second DMD and to form an image of the different sub-regions as the reflected light pulses from each of the different sub-regions are sequentially received from the second DMD.
In a conventional flash lidar system, the transmitter (Tx) illuminates the entire FOV, which matches the FOV of the receiver (Rx) optics. As the FOV increases, the power density of the transmitted laser beam decreases. Also, as the FOV increases, the aperture of the receiver optics tends to decrease; consequently, the returning signal is further limited by the aperture of the optics. As the receiver FOV increases, the instantaneous FOV (iFOV) also increases, which leads to a decrease in resolution. Due to such trade-offs, flash lidar systems are mainly employed to detect nearby objects. The lidar optical system depicted in
The flood illumination provided to the various sub-FOVs by the transmitter DMD 110 and the sequential scanning of the sub-FOVs by the receiver DMD 120, synchronously with the transmitter DMD 110, employ a discrete DMD beam steering process that has been described in U.S. Pat. No. 11,635,614, which is hereby incorporated by reference in its entirety. This DMD steering process will be described by reference to
In some embodiments, the DMD mirrors move continuously between the ON and OFF states with a typical transition time on the order of a few microseconds. A transitional state of the DMD is utilized by projecting a short pulsed laser whose pulse duration is much shorter than the transition time of the mirrors. With the short pulsed laser, the micromirror movement can be "frozen" at a plurality of angles between the stationary ON and OFF states. Thus, it is feasible to form a programmable blazed diffraction grating to discretely steer a collimated beam (e.g., a laser beam). It is to be appreciated that in some cases more than one pulse of light may be incident on a DMD during a single transition between the ON and OFF states, the pulses occurring at different times from one another. The multiple pulses have the effect of freezing the mirrors at multiple blaze angles at the different times during a single transition.
As shown in
As shown in
The diffraction orders generated by projecting a beam onto an array of micromirrors in a manner as described above are defined by the following diffraction grating equation (1):
p sin θm=2mλ (1)
where θm is the angle from the zeroth order to the mth order shown in
As shown in
It will be appreciated that, although the illustrated embodiment has a diamond configuration, any suitable orientation may be used. Additionally, mirrors of any suitable shape may be used (e.g., square or rectangular). It will be appreciated that other mirror array shapes and orientations are governed by an equation similar to equation (1), but modified to account for the configuration of the mirror array.
The light source illuminating the DMD (e.g., light source 130 in
It will be appreciated that a plane wave of short duration (tmax≤T), when projected onto DMD 210, is diffracted into one of the specific diffraction orders with relatively high diffraction efficiency, since the short pulse duration causes the DMD mirrors to appear frozen in a particular state that is equivalent to a blazed grating in which the slope of the mirrors is set to the blaze angle. Typically, all mirrors in the array onto which the light is projected are controlled to be actuated to the same degree (i.e., they have the same blaze angle); however, in some embodiments, only a subset of the mirrors (e.g., at least 90%, at least 80%, or at least 50%) is actuated to the same degree, due to the sequential reset of DMD mirror regions.
Additional details concerning the DMD steering process may be found in U.S. Pat. No. 11,635,614.
In summary, when the mirrors of the DMD transition between the on- and off-states, the tilt angle of the mirrors changes from −12 to +12 degrees. The transitional period is typically several microseconds, which is referred to as the crossover time. Thus, if a nanosecond (ns) laser pulse illuminates the DMD mirror array during the dynamic tilt motion, the movement of the mirrors is effectively "frozen" due to the three orders of magnitude difference in time scale between the crossover time (μs) and the pulse length (ns). Thus, the illumination pulse is timed to be received by the DMD while the micromirrors are in transition so that the DMD satisfies the blazed grating condition. The number of diffraction orders supported by this beam steering method depends on the pixel period, the wavelength, the angle of incidence of the laser with respect to the surface normal of the DMD device, and the maximum tilt angle of the DMD micromirrors. For example, at a wavelength of 532 nm, with a pixel period of 5.4 μm, normal incidence of the laser, and a maximum tilt angle of ±12 degrees, the number of diffraction orders is about 10, and thus the number of sub-FOVs is 10. For a longer wavelength of 905 nm with the same DMD structure, the number of diffraction orders is decreased to 7.
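As a rough numerical check of the preceding example, the following short Python sketch counts the diffraction orders reachable by the "frozen" blazed grating. It assumes, for illustration only, normal incidence, an effective grating period equal to the diagonal pitch (√2 × pixel period) of the diamond-configured mirror array, and an addressable range of ± twice the mechanical tilt angle for the reflected beam; the exact relation for a given DMD geometry is governed by equation (1). Under these assumptions the count comes out to 11 orders at 532 nm and 7 orders at 905 nm, broadly consistent with the approximately 10 and 7 orders noted above.

```python
import math

def addressable_orders(wavelength_m, pixel_period_m, tilt_deg=12.0):
    """Count diffraction orders reachable by the transiently blazed DMD.

    Illustrative assumptions: normal incidence, effective grating period
    equal to the diagonal pitch of the diamond-configured array
    (sqrt(2) * pixel period), and addressable orders limited to +/- twice
    the mechanical tilt angle of the micromirrors.
    """
    effective_period = math.sqrt(2.0) * pixel_period_m
    max_angle_rad = math.radians(2.0 * tilt_deg)
    m_max = int(effective_period * math.sin(max_angle_rad) / wavelength_m)
    return 2 * m_max + 1  # orders -m_max through +m_max, including the zeroth

print(addressable_orders(532e-9, 5.4e-6))  # 11 orders under these assumptions
print(addressable_orders(905e-9, 5.4e-6))  # 7 orders under these assumptions
```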
Referring again to
The detector 140 in
In operation, light from the laser source is first directed to the Tx-DMD while the mirrors of the Tx-DMD are in transition between their on and off states and angled at a blaze angle that directs the light to a first selected diffraction order, which corresponds to one of the sub-FOVs. In the example shown in
It should be noted that, in some cases, the sequence in which the illumination is steered to the different sub-FOVs by the Tx-DMD, then directed to the Rx-DMD, and finally imaged by the 2D sensor array of the detector may follow any desired order. For instance, in
In the scanning sequence depicted by the arrows in
In the workflow depicted in
Thus, the MPPC module 240 provides a master clock signal to the pulsed laser 230 and the Tx- and Rx-DMDs 210 and 220 while adjusting the delay time between the pulsed laser 230 and the Tx- and Rx-DMDs 210 and 220. The MPPC module 240 outputs a master trigger signal at, for example, 10 kHz with a 100 ns pulse width. The MPPC module 240 starts capturing a lidar image and a trigger is sent to the pulsed laser 230 to trigger a short pulse, e.g., an 8 ns pulse. In order to select particular diffraction orders for setting the Tx- and Rx-DMD blaze angles, the Arduino Due 250 delays the trigger to the pulsed laser 230 with respect to the trigger to the DMDs. However, the DMDs have an additional, global time delay. After the DMDs are triggered, their mirrors start transitioning between the on and off states. Between the arrival of the trigger and the start of the mirror transitions, a global delay exists (about 5 μs for the particular DMDs chosen in this example). Due to the global delay of the DMDs, it is not possible to adopt a trigger sequence in which the pulsed laser is triggered before the DMDs. Rather, the trigger sequence may be rearranged as shown in the timing diagram of
Once the signal from the MPPC module 240 triggers the Arduino Due 250 via the function generator 260, the function generator 260 triggers the DMDs at the next cycle. For example, if the laser pulse is at the nth cycle, the trigger from the function generator is for the (n+1)th cycle. The pulse from the function generator 260 triggers the Arduino Due 250 to generate two reset signals within a single cycle of the ToF measurement. The first pulse from the Arduino Due 250 changes the state of all the mirrors of the DMDs, and the second pulse is used to reset the DMDs. In each cycle, the DMDs will be reset to their original positions and wait for the next trigger from the MPPC module 240.
The technique in which the prior trigger from the MPPC module 240 is used to control the synchronization of the laser pulses and the DMD mirror transitions accommodates the global delay of the DMDs. To select diffraction orders, further fine tuning of the timing is required. The Arduino Due 250 can be programmed with no-operation (NOP) instructions to finely control the timing of the micromirror transitions. One NOP in the Arduino Due 250 corresponds to one clock cycle, 12.5 ns (1/80 MHz), which is sufficiently short compared to the time window (~50 ns) needed to maintain high diffraction efficiency.
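By way of illustration, the following Python sketch converts a desired fine timing offset into a NOP count at the 12.5 ns clock period and reports the residual quantization error relative to the approximately 50 ns efficiency window discussed above; the particular offset value used is hypothetical.

```python
NOP_NS = 12.5                 # one Arduino Due clock cycle (1/80 MHz), per the timing scheme above
EFFICIENCY_WINDOW_NS = 50.0   # approximate window over which high diffraction efficiency is maintained

def nops_for_delay(target_ns):
    """Number of NOP instructions that best approximates the requested fine delay."""
    return round(target_ns / NOP_NS)

def residual_error_ns(target_ns):
    """Timing error remaining after quantizing the delay to whole NOP cycles."""
    return abs(nops_for_delay(target_ns) * NOP_NS - target_ns)

# Hypothetical fine timing offset used to align the laser pulse with one
# particular blaze angle during the mirror transition.
target_delay_ns = 730.0
print(nops_for_delay(target_delay_ns))                                  # 58 NOPs
print(residual_error_ns(target_delay_ns))                               # 5.0 ns quantization error
print(residual_error_ns(target_delay_ns) < EFFICIENCY_WINDOW_NS / 2.0)  # True: well within the window
```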
It should be noted that the particular workflow illustrated in
As shown in
The angular resolution test was performed using two targets placed at 100 cm, with the smallest spacing between them that the lidar system could still resolve taken as the resolution. Table 1 shows the angular resolution of the DMD-MPPC lidar system. The smallest angular resolution that the system achieves is 0.22 degrees, which occurs at the −1st and +1st diffraction orders. This corresponds to twice the iFOV of the system, which is 0.11 degrees.
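For reference, a small Python sketch (using only values stated above) converts the measured angular resolution into the corresponding target spacing at the 100 cm test distance.

```python
import math

def min_resolvable_spacing_cm(angular_resolution_deg, range_cm):
    """Physical target spacing corresponding to a given angular resolution at a given range."""
    return math.tan(math.radians(angular_resolution_deg)) * range_cm

print(2 * 0.11)                                          # 0.22 deg: resolution equals twice the 0.11 deg iFOV
print(round(min_resolvable_spacing_cm(0.22, 100.0), 2))  # ~0.38 cm spacing at the 100 cm test distance
```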
In the Tx-DMD, the diffractive beam steering process suffers from energy spill-over to adjacent diffraction orders due to the fill factor, which is about 90%. In a realistic lidar imaging scenario, objects are placed across the different sub-FOVs. With the energy spill-over, the Tx-DMD illuminates not only the object of interest, but also objects that reside in the adjacent sub-FOVs, even though the spilled-over power is substantially reduced. The same situation occurs with the Rx-DMD. The object of interest returns a signal with high efficiency if the object resides in the sub-FOV that the Rx-DMD is observing. If the area of illumination of the Tx-DMD spans beyond the sub-FOV of interest, the Rx-DMD also receives signal from the adjacent sub-FOVs.
To test the detectable range of the MPPC lidar system, the maximum distance test was performed with a desk lamp as a target. A blue background in the color lidar images indicates that there is nothing but the one target in the test field. When the target was located 10 meters away from the system, it was still well captured by the MPPC with the correct depth information. Due to space limitations, however, the maximum range in this test is 20 meters. In fact, the MPPC is capable of detecting the target located up to a distance of meters with use of a lower threshold voltage and a longer focal length camera lens.
To acquire a wide single frame of the MPPC lidar image with correct depth information, an image stitching process may be used in conjunction with the diffractive image steering method described herein.
Ultimately, a lidar system for ADAS applications needs a pixel count of 1M or higher with a high frame rate of 30 fps. With a 32×32-pixel MPPC and 7 sub-FOVs, one embodiment of the solid-state lidar system described herein supports 32×32×7≈7 k pixels and operates with a frame rate of 120 fps with a 1 k fps DMD. In alternative embodiments, a detector with a higher pixel count and/or faster DMDs may be employed to fill the gap so as to meet ADAS requirements. For instance, a recently available 0.1M-pixel CMOS ToF sensor may be employed. Since the DMDs are not detector-specific, with a moderate sub-FOV multiplexing factor of 10, a 1M-pixel lidar system is feasible. Detectors with smaller pixel counts may require additional FOV multiplexing and an even higher frame rate. Currently, DMDs operate at frame rates from 100 fps up to approximately 42 k fps. By employing an additional but slower steering mechanism, such as MEMS Phase Light Modulators, for example, a solid-state implementation of the lidar system may still be achieved.
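The pixel-count and frame-rate arithmetic in the preceding paragraph can be sketched as follows (in Python); the 316 × 316 geometry assumed for the 0.1M-pixel sensor is a hypothetical illustration, and the frame-rate function gives only an ideal upper bound before trigger and readout overhead (the embodiment above operates at 120 fps with a 1 k fps DMD).

```python
def effective_pixels(sensor_rows, sensor_cols, num_sub_fovs):
    """Total lidar pixels obtained by time-multiplexing the sensor FOV over the sub-FOVs."""
    return sensor_rows * sensor_cols * num_sub_fovs

def max_frame_rate_fps(dmd_pattern_rate_fps, num_sub_fovs):
    """Ideal upper bound on the lidar frame rate: one DMD steering event per sub-FOV per frame."""
    return dmd_pattern_rate_fps / num_sub_fovs

print(effective_pixels(32, 32, 7))     # 7168 pixels (~7 k) for the 32 x 32 MPPC with 7 sub-FOVs
print(max_frame_rate_fps(1000, 7))     # ~143 fps upper bound with a 1 k fps DMD, before overhead
print(effective_pixels(316, 316, 10))  # ~1M pixels assuming a 316 x 316 (~0.1M-pixel) sensor and 10 sub-FOVs
```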
In summary, a lidar system providing diffractive expansion of the field of view of a time-of-flight (ToF) lidar receiver has been experimentally demonstrated and verified. The limited FOV of the 2-dimensional array of a Multi Pixel Photon Counter is enhanced by a factor of seven without sacrificing the resolution of the ToF lidar image. The diffractive FOV expansion is enabled by nanosecond pulsed illumination of the DMDs while the DMDs' micromirrors are in motion. The use of nanosecond laser illumination turns the DMD into an FOV expander.
The maximum distance testing and the lidar image crosstalk testing of all the diffraction orders have been used to verify the time-multiplexing technique described herein, which involves configuring the DMDs as blazed diffraction gratings. With the further use of image stitching, the FOV of the lidar image provided by the MPPC module can theoretically be expanded to a 35° full FOV in real-time without any mechanical moving elements, which opens a pathway to advanced lidar applications.
While various embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, embodiments may be practiced otherwise than as specifically described and claimed. Embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
The above-described embodiments of the described subject matter can be implemented in any of numerous ways. For example, some embodiments may be implemented using hardware, software or a combination thereof. When any aspect of an embodiment is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single device or computer or distributed among multiple devices/computers.
This application claims the benefit of U.S. Provisional Application Ser. No. 63/399,594, filed Aug. 19, 2022, the contents of which are incorporated herein by reference.