The present disclosure is related to light detection and ranging (LIDAR) systems, and more particularly to systems and methods of providing a variety of LIDAR scan patterns.
Frequency-Modulated Continuous-Wave (FMCW) LIDAR systems use tunable lasers for frequency-chirped illumination of targets, and coherent receivers for detection of backscattered or reflected light from the targets that is combined with a local copy of the transmitted signal. Mixing the local copy with the return signal, delayed by the round-trip time to the target and back, generates a beat frequency at the receiver that is proportional to the distance to each target in the field of view of the system.
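The proportionality between beat frequency and range can be sketched numerically. In this illustrative Python sketch, the chirp slope and target range are assumed values, not parameters from the present disclosure:

```python
# Sketch of the FMCW beat-frequency relationship: the beat frequency
# equals the chirp slope multiplied by the round-trip delay, and is
# therefore proportional to target range. All values are illustrative.

C = 299_792_458.0  # speed of light, m/s

def beat_frequency(range_m: float, chirp_slope_hz_per_s: float) -> float:
    """Beat frequency for a static target at range_m meters."""
    round_trip_delay = 2.0 * range_m / C          # seconds
    return chirp_slope_hz_per_s * round_trip_delay  # Hz

# Example: a 1 GHz sweep over 10 microseconds gives a slope of 1e14 Hz/s.
slope = 1e9 / 10e-6
print(beat_frequency(150.0, slope))  # ~1.0e8 Hz for a 150 m target
```

Doubling the range doubles the round-trip delay and hence the beat frequency, which is the linear relationship exploited by the receiver.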
A multi-wavelength FMCW lidar system provides simultaneous measurement of velocity and range by making use of nondegenerate laser sources, i.e., laser sources with different wavelengths. The light from the lasers is combined, split into individual beams, and then amplified before being delivered to the scene. The light scattered from the scene is then mixed with the local oscillator (LO) light on the photodiodes, where the coherent signal is generated. The signal-to-noise ratio (SNR) in such a system depends on transmit (Tx) power, target distance, overall light collection efficiency, and the lag angle induced by a fast scanner. As the target distance increases, the SNR of such a system drops (the number of collected photons falls due to the decrease in solid angle in the radiometric equation), resulting in reduced probability of detection and reduced lidar performance.
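The solid-angle falloff noted above can be illustrated with a simple geometric sketch. The aperture radius and ranges below are illustrative assumptions, and the Lambertian-scatter model is a simplification of the full radiometric equation:

```python
# Sketch of the range dependence of photon collection: the solid angle
# subtended by a fixed receive aperture scales as 1/R^2, so the number
# of collected photons (and hence SNR, in the shot-noise-limited case)
# falls off quadratically with distance. Values are illustrative.
import math

def collection_fraction(aperture_radius_m: float, range_m: float) -> float:
    """Fraction of a hemispherically scattered return captured by the aperture."""
    aperture_area = math.pi * aperture_radius_m ** 2
    hemisphere_area = 2.0 * math.pi * range_m ** 2
    return aperture_area / hemisphere_area

near = collection_fraction(0.025, 50.0)
far = collection_fraction(0.025, 200.0)
print(far / near)  # 0.0625: 4x the range collects 16x fewer photons
```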
For a more complete understanding of the various examples, reference is now made to the following detailed description taken in connection with the accompanying drawings in which like identifiers correspond to like elements.
The present disclosure describes various examples of LIDAR systems and methods for detecting distance and relative speed of objects. Embodiments of the present techniques improve SNR by enabling the use of amplitude modulation of the laser light upstream from the optical amplifiers. For example, the amplitude modulation can provide SNR enhancement up to 3 dB on a single wavelength receiver channel in a dual-wavelength nondegenerate FMCW system. In some embodiments, the LIDAR system alternates between single wavelength and dual wavelength operation every frame, such that dual wavelength operation in one frame enables simultaneous range and velocity detection of objects, and single wavelength operation in the next frame enables detection of the objects with higher SNR and probability of detection. Furthermore, the Doppler information from the previous frame can be used as a priori information to compensate for the lack of velocity detection in the next frame. In some embodiments, a single frame can be partitioned into sub-groups that use a different type of laser amplitude modulation. This could allow one portion of the field of view (FOV) to have a signal from a single wavelength laser with higher amplitude while other portions of the field of view have a signal from multiple wavelength lasers.
In the following description, reference may be made herein to quantitative measures, values, relationships or the like. Unless otherwise stated, any one or more if not all of these may be absolute or approximate to account for acceptable variations that may occur, such as those due to engineering tolerances or the like.
Free space optics 115 may include one or more optical waveguides to carry optical signals, and route and manipulate optical signals to appropriate input/output ports of the active optical circuit. The free space optics 115 may also include one or more optical components such as taps, wavelength division multiplexers (WDM), splitters/combiners, polarization beam splitters (PBS), collimators, couplers or the like. In some examples, the free space optics 115 may include components to transform the polarization state and direct received polarized light to optical detectors using a PBS, for example. The free space optics 115 may further include a diffractive element to deflect optical beams having different frequencies at different angles along an axis (e.g., a fast-axis).
In some examples, the LIDAR system 100 includes an optical scanner 102 that includes one or more scanning mirrors that are rotatable along an axis (e.g., a slow-axis) that is orthogonal or substantially orthogonal to the fast-axis of the diffractive element to steer optical signals to scan an environment according to a scan pattern. For instance, the scanning mirrors may be rotatable by one or more galvanometers. Objects in the target environment may scatter an incident light into a return optical beam or a target return signal. The optical scanner 102 also collects the return optical beam or the target return signal, which may be returned to the passive optical circuit component of the optical circuits 101. For example, the return optical beam may be directed to an optical detector by a polarization beam splitter. In addition to the mirrors and galvanometers, the optical scanner 102 may include components such as a quarter-wave plate, lens, anti-reflective coated window or the like.
To control and support the optical circuits 101 and optical scanner 102, the LIDAR system 100 includes LIDAR control systems 110. The LIDAR control systems 110 may include a processing device for the LIDAR system 100. In some examples, the processing device may be one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computer (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
In some examples, the LIDAR control systems 110 may include a signal processing unit 112 such as a DSP. The LIDAR control systems 110 are configured to output digital control signals to control optical drivers 103. In some examples, the digital control signals may be converted to analog signals through signal conversion unit 106. For example, the signal conversion unit 106 may include a digital-to-analog converter. The optical drivers 103 may then provide drive signals to active optical components of optical circuits 101 to drive optical sources such as lasers and amplifiers. In some examples, several optical drivers 103 and signal conversion units 106 may be provided to drive multiple optical sources.
The LIDAR control systems 110 are also configured to output digital control signals for the optical scanner 102. A motion control system 105 may control the galvanometers of the optical scanner 102 based on control signals received from the LIDAR control systems 110. For example, a digital-to-analog converter may convert coordinate routing information from the LIDAR control systems 110 to signals interpretable by the galvanometers in the optical scanner 102. In some examples, a motion control system 105 may also return information to the LIDAR control systems 110 about the position or operation of components of the optical scanner 102. For example, an analog-to-digital converter may in turn convert information about the galvanometers' position to a signal interpretable by the LIDAR control systems 110.
The LIDAR control systems 110 are further configured to analyze incoming digital signals. In this regard, the LIDAR system 100 includes optical receivers 104 to measure one or more beams received by optical circuits 101. For example, a reference beam receiver may measure the amplitude of a reference beam from the active optical component, and an analog-to-digital converter converts signals from the reference receiver to signals interpretable by the LIDAR control systems 110. Target receivers measure the optical signal that carries information about the range and velocity of a target in the form of a beat frequency, modulated optical signal. The reflected beam may be mixed with a second signal from a local oscillator. The optical receivers 104 may include a high-speed analog-to-digital converter to convert signals from the target receiver to signals interpretable by the LIDAR control systems 110. In some examples, the signals from the optical receivers 104 may be subject to signal conditioning by signal conditioning unit 107 prior to receipt by the LIDAR control systems 110. For example, the signals from the optical receivers 104 may be provided to an operational amplifier for amplification of the received signals and the amplified signals may be provided to the LIDAR control systems 110.
In some applications, the LIDAR system 100 may additionally include one or more imaging devices 108 configured to capture images of the environment, a global positioning system 109 configured to provide a geographic location of the system, or other sensor inputs. The LIDAR system 100 may also include an image processing system 114. The image processing system 114 can be configured to receive the images and geographic location, and send the images and location or information related thereto to the LIDAR control systems 110 or other systems connected to the LIDAR system 100.
In operation according to some examples, the LIDAR system 100 is configured to use nondegenerate optical sources to simultaneously measure range and velocity across two dimensions. This capability allows for real-time, long range measurements of range, velocity, azimuth, and elevation of the surrounding environment.
In some examples, the scanning process begins with the optical drivers 103 and LIDAR control systems 110. The LIDAR control systems 110 instruct the optical drivers 103 to independently modulate one or more optical beams, and these modulated signals propagate through the passive optical circuit to the collimator. The collimator directs the light at the optical scanning system that scans the environment over a preprogrammed pattern defined by the motion control system 105. The optical circuits 101 may also include a polarization wave plate (PWP) to transform the polarization of the light as it leaves the optical circuits 101. In some examples, the polarization wave plate may be a quarter-wave plate or a half-wave plate. A portion of the polarized light may also be reflected back to the optical circuits 101. For example, lensing or collimating systems used in LIDAR system 100 may have natural reflective properties or a reflective coating to reflect a portion of the light back to the optical circuits 101.
Optical signals reflected from the environment pass through the optical circuits 101 to the receivers. Because the polarization of the light has been transformed, it may be reflected by a polarization beam splitter along with the portion of polarized light that was reflected back to the optical circuits 101. Accordingly, rather than returning to the same fiber or waveguide as an optical source, the reflected light is reflected to separate optical receivers. These signals interfere with one another and generate a combined signal. Each beam signal that returns from the target produces a time-shifted waveform. The temporal phase difference between the two waveforms generates a beat frequency measured on the optical receivers (photodetectors). The combined signal can then be reflected to the optical receivers 104.
The analog signals from the optical receivers 104 are converted to digital signals using ADCs. The digital signals are then sent to the LIDAR control systems 110. A signal processing unit 112 may then receive the digital signals and interpret them. In some embodiments, the signal processing unit 112 also receives position data from the motion control system 105 and galvanometers (not shown) as well as image data from the image processing system 114. The signal processing unit 112 can then generate a 3D point cloud with information about range and velocity of points in the environment as the optical scanner 102 scans additional points. The signal processing unit 112 can also overlay the 3D point cloud data with the image data to determine velocity and distance of objects in the surrounding area. The system also processes the satellite-based navigation location data to provide a precise global location.
In embodiments, the time delay Δt is not measured directly, but is inferred based on the frequency differences between the transmitted scanning waveforms and the return signals. When the return signals 204 and 206 are optically mixed with the corresponding scanning signals, a signal referred to as a “beat frequency” is generated, which is caused by the combination of two waveforms of similar but slightly different frequencies. The beat frequency indicates the frequency difference between the transmitted scanning waveform and the return signal, which is linearly related to the time delay Δt by the slope of the triangular waveform.
If the return signal has been reflected from an object in motion, the frequency of the return signal will also be affected by the Doppler effect, which is shown in
Thus, the beat frequencies Δfup and Δfdn can be used to differentiate between frequency shifts caused by the range and frequency shifts caused by motion of the measured object. Specifically, ΔfDoppler is the difference between Δfup and Δfdn, and ΔfRange is the average of Δfup and Δfdn.
The range to the target and velocity of the target can be computed using the following formulas:
In the above formulas, λc=c/fc and fc is the center frequency of the scanning signal. By solving this system of equations for ΔfRange and ΔfDoppler and substituting those values into equations (3) and (4) respectively, one sees that the range and velocity, shown in equations (5) and (6), are proportional to the average and difference, respectively, of the up-sweep and down-sweep beat frequencies.
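Since equations (1) through (6) appear as figures in the original, the computation can only be sketched here using the standard triangular-FMCW relationships. The sign conventions, the half-difference form of the Doppler term, and the example chirp parameters below are assumptions, not reproductions of the original equations:

```python
# Sketch of recovering range and velocity from up-sweep and down-sweep
# beat frequencies in a triangular FMCW chirp. The forms below follow
# common FMCW practice and are assumed, not taken from the figures.

C = 299_792_458.0  # speed of light, m/s

def range_and_velocity(df_up: float, df_dn: float,
                       chirp_slope: float, f_center: float):
    """Return (range_m, velocity_m_s) from the two beat frequencies."""
    df_range = (df_up + df_dn) / 2.0     # average -> range term
    df_doppler = (df_dn - df_up) / 2.0   # half-difference -> Doppler term
    range_m = C * df_range / (2.0 * chirp_slope)
    wavelength_c = C / f_center          # lambda_c = c / f_c
    velocity = wavelength_c * df_doppler / 2.0
    return range_m, velocity

# Static target: equal up and down beats, so the Doppler term vanishes.
r, v = range_and_velocity(1.0e8, 1.0e8, chirp_slope=1e14, f_center=193e12)
print(r, v)  # ~150 m, 0.0 m/s
```

A moving target splits the two beat frequencies symmetrically about the range term, which is how the average and the difference separate range from velocity.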
By employing a counter-chirp mechanism shown in
Another example modulation scheme according to the present disclosure is to have a counter chirp scheme but with a different center frequency as shown in
The beat frequencies can be generated, for example, as an analog signal in optical receivers 104 of system 100. The beat frequency can then be digitized by an analog-to-digital converter (ADC), for example, in a signal conditioning unit such as signal conditioning unit 107 in LIDAR system 100. The digitized beat frequency signal can then be digitally processed, for example, in a signal processing unit, such as signal processing unit 112 in system 100.
In some scenarios, to ensure that the beat frequencies accurately represent the range and velocity of the object, beat frequencies can be measured at the same moment in time, as shown in
In this embodiment, each optical source 304a and 304b outputs an optical beam which is frequency modulated to generate the up-chirp or the down-chirp as described above. The center frequency of each optical beam may be the same as described in relation
The output of the optical sources 304a and 304b is provided to an output tap 306, which is configured to split a small portion of the optical beam energy to the reference block 308. The reference block 308 is configured to control the frequency modulation of the optical sources 304a and 304b by ensuring that the frequency modulation is linear or close to linear as shown in
The remainder of each optical beam passes through a respective variable optical attenuator (VOA) 310a, 310b. Each VOA is a tunable component that enables the LIDAR control system 110 to control the amplitude of the corresponding optical beam. For example, the VOAs 310a and 310b may be liquid crystal attenuators. The VOAs 310a and 310b may be capable of reducing the amplitude of their respective optical beams to zero or near zero. Additionally, the VOAs 310a and 310b may have a continuous range of states or may be adjustable in a set of discrete states. For example, a variable optical attenuator may operate like a switch that can be set to on (light passing through) or off (light blocked).
Any light passing through the VOAs 310a and 310b is input to a multi-mode interferometer (MMI) 312 that is configured to combine the two optical beams (A and B) into a combined optical beam (A+B). The output of the MMI 312 is input to the semiconductor optical amplifier (SOA) 314. The input to the SOA 314 will be a combination of the optical beams from optical source A 304a and optical source B 304b less whatever portion of each signal has been attenuated by the respective VOAs 310a and 310b. This combined optical beam is amplified by the SOA 314 and output to the free space optics 320. The optical beam output by the SOA 314 may be referred to herein as an output optical beam.
Upon exiting the photonics chip 302, the output optical beams enter the free space optics 320, which is responsible for routing the amplified light to the scene and receiving the scattered light back to generate the coherent lidar signal. The optical beam may pass through various components such as polarization beam splitters (PBS), lenses to collimate the optical beam, components to transform the polarization of the light, and others. Additionally, a local oscillator (not shown) can be generated inside the free space optics 320 and combined with the return signal reflected from the target by coating one of the surfaces with a partial reflective coating.
Additionally, the transmitted optical beams may be transmitted towards the target by an optical scanner that generates the two-dimensional field of view (FOV). When the optical beam hits the target, a portion of the beam is returned back to the LIDAR system 300 as a return signal. The return signal is received by the free space optics 320 where the return signal is combined with the local oscillator signal and redirected to the photodiodes 316. The combined signal can then be used to measure distance, velocity, or other factors about the environment at the target point as described above. Several measurements may be performed simultaneously or near simultaneously to generate a point cloud representing the state of the environment at a given moment. Each instance of the point cloud may be referred to as a frame. The LIDAR system may be capable of generating several frames of data per second.
The quality of the measurements will depend in part on the signal to noise ratio (SNR) of the measured signals, which may be affected by a number of factors. In accordance with embodiments, an amplitude modulation technique is used to improve the SNR by increasing the amplitude of at least one of the optical beams at least part of the time.
The SOA 314 may be configured to operate in the saturation mode well beyond the linear regime such that the gain is fixed. In this mode, the total optical power output by the SOA 314 stays fixed. Accordingly, any reduction in the amplitude of one optical beam will result in an increased amplification of the other optical beam. For example, if the optical beams have an equal amplitude and one of the optical beams is turned off, the amplitude of the other optical beam will approximately double (3 dB increase) compared to when both optical beams are on. Accordingly, both of the optical beams output by the SOA 314 may be amplitude modulated by controlling the amplitude of one of the optical beams input to the SOA 314. Note that the amplitude modulation of the optical beams is in addition to the frequency modulation (i.e., chirps) generated by the optical sources.
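The constant-total-power behavior of the saturated SOA can be illustrated with a simple redistribution model. This is an idealized sketch, not a physical SOA simulation; the power levels are illustrative:

```python
# Sketch of the saturated-SOA behavior described above: the total
# output power is fixed, so reducing one input beam redistributes
# amplification to the other beam. Idealized model; values illustrative.
import math

def soa_outputs(in_a: float, in_b: float, total_out: float = 2.0):
    """Split a fixed total output power in proportion to the input powers."""
    total_in = in_a + in_b
    if total_in == 0.0:
        return (0.0, 0.0)
    return (total_out * in_a / total_in,
            total_out * in_b / total_in)

# Both beams on at equal power: each receives half of the total budget.
a_on, b_on = soa_outputs(1.0, 1.0)   # -> (1.0, 1.0)
# Beam B off: beam A takes the whole budget, i.e. approximately double.
a_only, _ = soa_outputs(1.0, 0.0)    # -> (2.0, 0.0)
print(10 * math.log10(a_only / a_on))  # ~3.01 dB increase
```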
The amplitude modulation enables one of the optical beams to have a higher amplitude for some of the measurements. For example, in some embodiments, the amplitude modulation may be configured to cause one optical beam to have a higher amplitude for alternating frames. Increasing the amplitude for some frames means that the SNR will be higher for those frames, resulting in more accurate range calculations. However, the SNR will decrease for the optical beam with the reduced amplitude, meaning that the velocity information may be less accurate or not available. In some embodiments, the velocity information from a previous frame may be used in a current frame to make up for the lack of velocity information in the current frame.
There may be many situations in which it is beneficial for the LIDAR system to change the amplitude modulation scheme depending on operating conditions. Embodiments of the LIDAR systems described herein can change the amplitude modulation scheme during operation to adapt to changing conditions, such as changes in the vehicle operation (speed, direction, etc.) and changes in the environment. For example, the amplitude modulation may be turned on if the SNR of the return optical beam falls below a specified threshold and turned off if the SNR increases above a specified threshold.
Embodiments of the present techniques include various methods of controlling the amplitude modulation. For example, the amplitude of each optical beam input to the SOA 314 may be controlled by controlling the output of each respective optical source 304a, 304b or by controlling the attenuation applied by each respective VOA 310a, 310b. In some embodiments in which the amplitude modulation is implemented through the optical sources 304a, 304b, the system may not include the VOAs 310a or 310b. Additionally, in some embodiments in which the amplitude modulation is implemented through one of the VOAs 310a or 310b, the system may include a single VOA, either VOA 310a or VOA 310b.
It will be appreciated that a variety of alterations may be made to the depicted system without deviating from the scope of the claims. For example, the embodiment shown in
In
The values N and M may be any power levels suitable for LIDAR systems. In some embodiments, N and M may be equal or approximately equal. In other words, during Frame 1, the optical beam amplitude of laser B may be reduced to zero or near zero. For example, the optical source 304b may be turned off or the VOA 310b may be made fully or nearly fully opaque. In such embodiments, the amplitude of the optical beam of laser A will double or nearly double (3 dB increase). Other embodiments are also possible. For example, the optical beam amplitude of laser B may be reduced to 50 percent, in which case the optical beam amplitude of laser A will increase to approximately 150 percent.
During frame 0, the LIDAR system can be configured to measure the range and velocity of the target using the first beat frequency corresponding with Laser A and the second beat frequency corresponding with Laser B. During frame 1, the LIDAR system can be configured to measure the range of the target from the first beat frequency but not the velocity of the target. The velocity information from frame 0 may be copied to frame 1 so that, for each point, the velocity of the target determined for frame 0 is coupled to the range of the target determined for frame 1.
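The alternating-frame scheme, with Doppler information carried forward from the most recent dual wavelength frame, can be sketched as follows. The frame representation and field names here are hypothetical:

```python
# Sketch of the alternating-frame scheme: dual wavelength frames
# measure both range and velocity; single wavelength frames measure
# range only (at higher SNR) and reuse the previous frame's velocity
# as a priori information. Data layout is hypothetical.

def assemble_frames(measurements):
    """measurements: list of dicts with 'range' and optional 'velocity'."""
    frames = []
    last_velocity = None
    for m in measurements:
        velocity = m.get("velocity")
        if velocity is None:
            velocity = last_velocity  # carry Doppler from previous frame
        else:
            last_velocity = velocity
        frames.append({"range": m["range"], "velocity": velocity})
    return frames

frames = assemble_frames([
    {"range": 42.0, "velocity": 3.0},  # frame 0: dual wavelength
    {"range": 41.5},                   # frame 1: single wavelength
])
print(frames[1])  # {'range': 41.5, 'velocity': 3.0}
```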
Although in this example the amplitude of each optical beam is modified for alternating frames, other amplitude variation schemes may be implemented to achieve a different amplitude modulation frequency or duty cycle. For example, the amplitude of each optical beam can be modified after every second or third frame.
In some examples, the amplitude modulation may be activated based on measured data. For example, the amplitude modulation may be activated in response to a decrease in the measured SNR of the return optical beams, a change in the measured velocity of objects detected in the field of view, etc.
It will be appreciated that a variety of alterations may be made to the depicted system without deviating from the scope of the claims. For example, the combined optical beam may be split into any suitable number of output optical beams depending on the design considerations of a particular implementation.
As in
However, rather than combining the optical beam A with optical beam B and then splitting them into four separate combined optical beams as shown in
Each of the four A optical beams is input to one of four separate VOAs 610, which are configured to amplitude modulate each A optical beam separately. The four A optical beams and the four B optical beams are then combined by a set of four multiplexers 612 to generate four combined optical beams. Each of the four combined optical beams is amplified by a respective one of four SOAs 616 and output to the free space optics 504. The free space optics may guide each optical beam separately to increase the image resolution.
With the configuration shown in
The amplitude modulation scheme may be changed during operation of the LIDAR system to adapt to changing conditions, such as changes in the vehicle operation (speed, direction, etc.) and changes in the environment. For example, each transmitted optical beam may be directed by the scanning optics to a different area within the field of view of the LIDAR system. Accordingly, each of the return optical beams may experience a different SNR depending on the objects within the respective area. In some embodiments, each transmitted optical beam may be controlled separately to turn on amplitude modulation if the SNR for the respective return optical beam falls below a specified threshold and turned off if the SNR for the respective return optical beam increases above a specified threshold.
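The per-channel threshold behavior described above can be sketched as a simple controller. Using separate low and high thresholds (a hysteresis band) prevents rapid toggling near a single threshold; the threshold values and class structure below are illustrative assumptions:

```python
# Sketch of per-channel amplitude-modulation control: modulation turns
# on when a return beam's SNR falls below a low threshold and off when
# it rises above a high threshold. Threshold values are illustrative.

class ModulationController:
    def __init__(self, low_db: float = 6.0, high_db: float = 10.0):
        self.low_db = low_db     # turn modulation on below this SNR
        self.high_db = high_db   # turn modulation off above this SNR
        self.enabled = False

    def update(self, snr_db: float) -> bool:
        if snr_db < self.low_db:
            self.enabled = True
        elif snr_db > self.high_db:
            self.enabled = False
        return self.enabled

ctrl = ModulationController()
print(ctrl.update(12.0))  # False: SNR healthy, no modulation needed
print(ctrl.update(5.0))   # True: SNR dropped, modulation turned on
print(ctrl.update(8.0))   # True: inside hysteresis band, stays on
print(ctrl.update(11.0))  # False: SNR recovered, modulation turned off
```

One such controller could be maintained per transmitted beam, since each beam may observe a different SNR in its portion of the field of view.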
It will be appreciated that a variety of alterations may be made to the depicted system without deviating from the scope of the claims. For example, the optical beams may be split and combined in a variety of ways to achieve separate amplitude modulation schemes for each of the combined optical beams depending on the design considerations of a particular implementation.
At block 702, a first optical beam and a second optical beam are emitted. The optical beams may be emitted by an optical source (e.g. optical laser). Each optical beam may be frequency modulated, such that one optical beam forms an up chirp while the other optical beam simultaneously forms a down chirp. The optical beams may have different center frequencies, or the optical beams may have the same center frequency but different polarization. The first optical beam and a second optical beam are also combined to form a combined optical beam, according to one of the embodiments described above.
At block 704, the first optical beam and the second optical beam are amplified using an optical amplifier to generate a first output optical beam and a second output optical beam. The optical beams may be amplified after being combined into a combined optical beam so that both optical beams are amplified by the same optical amplifier.
At block 706, the first output optical beam and the second output optical beam are emitted towards a target and light returned from the target is collected in a first return beam and a second return beam. The optical beams may be emitted into the field of view of the LIDAR system by any suitable arrangement of optical elements included, for example, in the free space optics 320 shown in
At block 708, a first beat frequency is generated from the first return beam and a second beat frequency is generated from the second return beam. The beat frequencies are generated by combining the return beams with a local oscillator and can be detected by a photodiode such as one of the photodiodes 316 shown in
At block 710, a range and velocity of the target is determined from the first beat frequency and the second beat frequency. The range and velocity may be computed by a processor such as the signal processing unit 112 (
At block 712, the amplitude of the first optical beam input to the optical amplifier is controlled to amplitude modulate the first output optical beam and the second output optical beam. As described above, controlling the amplitude modulation of one of the optical beams serves to amplitude modulate both optical beams. The amplitude modulation may include reducing the amplitude of the first optical beam to a lower level (75, 50, or 25 percent, for example) or turning off the first optical beam. The power reduction in the first output optical beam causes a corresponding increase in the second output optical beam. For example, if the first optical beam is turned off, the power in the second output optical beam will approximately double.
Controlling the amplitude of the first optical beam may be performed by controlling the optical source that generates the optical beam or by controlling an optical attenuator situated in the optical path between the optical source and the optical amplifier. In some embodiments, the optical attenuator is one of the VOAs 310a, 310b, or 610 described in relation to
The amplitude modulation may cause the output optical beams to have varying amplitudes for different image frames. For frames in which the first optical beam is reduced or turned off, the signal processor may compute a range but not a velocity. For such frames, the velocity of a previous frame may be substituted. For example, in a first frame, the range and velocity of the target may be computed from the first beat frequency and the second beat frequency, while in a second frame the range of the target is computed from the second beat frequency and the velocity of the target determined from the first frame is coupled to the range of the target determined for the second frame to form a complete range and velocity pair. It should be noted that the first frame and second frame may be adjacent frames but are not necessarily adjacent frames.
It will be appreciated that embodiments of the method 700 may include additional blocks not shown in
The preceding description sets forth numerous specific details such as examples of specific systems, components, methods, and so forth, to provide a thorough understanding of several examples in the present disclosure. It will be apparent to one skilled in the art, however, that at least some examples of the present disclosure may be practiced without these specific details. In other instances, well-known components or methods are not described in detail or are presented in simple block diagram form in order to avoid unnecessarily obscuring the present disclosure. Thus, the specific details set forth are merely exemplary. Particular examples may vary from these exemplary details and still be contemplated to be within the scope of the present disclosure.
Any reference throughout this specification to “one example” or “an example” means that a particular feature, structure, or characteristic described in connection with the examples is included in at least one example. Therefore, the appearances of the phrase “in one example” or “in an example” in various places throughout this specification are not necessarily all referring to the same example.
Although the operations of the methods herein are shown and described in a particular order, the order of the operations of each method may be altered so that certain operations may be performed in an inverse order or so that certain operations may be performed, at least in part, concurrently with other operations. Instructions or sub-operations of distinct operations may be performed in an intermittent or alternating manner.
The above description of illustrated implementations of the present disclosure, including what is described in the Abstract, is not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed. While specific implementations of, and examples for, the present disclosure are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the present disclosure, as those skilled in the relevant art will recognize. The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form. Furthermore, the terms “first,” “second,” “third,” “fourth,” etc. as used herein are meant as labels to distinguish among different elements and may not necessarily have an ordinal meaning according to their numerical designation.