Ultrashort pulses in LiDAR systems

Information

  • Patent Grant
  • Patent Number
    12,313,788
  • Date Filed
    Tuesday, October 8, 2019
  • Date Issued
    Tuesday, May 27, 2025
  • Inventors
  • Original Assignees
    • SEYOND, INC. (Sunnyvale, CA, US)
  • Examiners
    • Alsomiri; Isam A
    • Naser; Sanjida
  • Agents
    • MASCHOFF BRENNAN
    • Huang; Liang
Abstract
LiDAR systems and methods discussed herein use ultrashort light pulses. Use of ultrashort light pulses can result in reduced power consumption compared to longer, conventional light pulses.
Description
FIELD

This disclosure relates generally to laser scanning and, more particularly, to using ultrashort light pulses in laser scanning systems.


BACKGROUND

Light detection and ranging (LiDAR) systems use light pulses to create an image or point cloud of the external environment. Some typical LiDAR systems include a light source, a pulse steering system, and a light detector. The light source generates light pulses that are directed by the pulse steering system in particular directions when being transmitted from the LiDAR system. When a transmitted light pulse is scattered by an object, some of the scattered light is returned to the LiDAR system as a returned pulse. The light detector detects the returned pulse. Using the time it takes for the returned pulse to be detected after the light pulse is transmitted, together with the speed of light, the LiDAR system can determine the distance to the object along the path of the transmitted light pulse. The pulse steering system can direct light pulses along different paths to allow the LiDAR system to scan the surrounding environment and produce an image or point cloud. LiDAR systems can also use techniques other than time-of-flight and scanning to measure the surrounding environment.


SUMMARY

LiDAR systems and methods discussed herein use ultrashort light pulses. Use of ultrashort light pulses can result in reduced power consumption compared to longer, conventional light pulses.


In one embodiment, a LiDAR system is provided that can include a light source operative to output a plurality of light pulses; a controller operative to control pulse duration of each of the light pulses and to control a time delay between successively output light pulses, wherein the pulse duration of each of the light pulses is characterized as an ultrashort pulse; a steering system operative to control a scanning direction of each ultrashort light pulse to a particular location within a LiDAR FOV; and a detection system operative to monitor for returned ultrashort pulses.


In another embodiment, a method for using a LiDAR system is provided for transmitting a plurality of ultrashort light pulses to a steering system that redirects the light pulses to a LiDAR field of view, wherein a time delay between successively transmitted ultrashort light pulses is varied; receiving, via a detection system, light energy signals comprising noise and returned ultrashort pulses; determining a time interval between successively received light energy signals; and correlating the determined time interval with the time delay to distinguish between noise and returned ultrashort pulses.


In yet another embodiment, a method for using a LiDAR system is provided for transmitting a plurality of ultrashort light pulses to a steering system that redirects the light pulses to a LiDAR field of view; receiving, via a detection system, light energy signals comprising noise and returned ultrashort pulses, wherein each received light energy signal produces an analog intensity level; converting the analog intensity level to a digital intensity level; and comparing the digital intensity level to a threshold to distinguish between noise and returned ultrashort pulses.


In yet another embodiment, a method for using a LiDAR system is provided for transmitting a plurality of ultrashort light pulses to a steering system that redirects the light pulses to a LiDAR field of view, wherein each transmitted ultrashort light pulse corresponds to a reference clock; receiving, via a detection system, light energy signals, wherein each received light energy signal produces a current level; converting the current level to a square wave signal; and comparing the square wave signal to the reference clock pulse to obtain a time delay between transmission of an ultrashort light pulse and a return of that particular ultrashort light pulse.





BRIEF DESCRIPTION OF THE DRAWINGS

The present application can be best understood by reference to the description below taken in conjunction with the accompanying drawing figures, in which like parts may be referred to by like numerals.



FIGS. 1-3 illustrate an exemplary LiDAR system using pulse signals to measure distances to points in the outside environment.



FIG. 4 depicts a logical block diagram of the exemplary LiDAR system.



FIG. 5 depicts a light source of the exemplary LiDAR system.



FIG. 6 depicts a light detector of the exemplary LiDAR system.



FIG. 7 shows illustrative laser pulses according to an embodiment.



FIG. 8 shows an illustrative LiDAR system according to an embodiment.



FIG. 9 shows an illustrative return beam detection system according to an embodiment.



FIG. 10 shows another illustrative return beam detection system according to an embodiment.



FIG. 11 shows an illustrative sequence of ultrashort pulses according to an embodiment.



FIG. 12 shows a correlated return pulse signal surrounded by noise signals according to an embodiment.



FIG. 13 shows an illustrative process according to an embodiment.



FIG. 14 shows an illustrative process according to an embodiment.



FIG. 15 shows yet another illustrative return beam detection system according to an embodiment.



FIG. 16 shows an illustrative timing diagram according to an embodiment.



FIG. 17 shows an illustrative process according to an embodiment.





DETAILED DESCRIPTION

In the following description of examples, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples.


Some light detection and ranging (LiDAR) systems use a single light source to produce pulses of a single wavelength that scan the surrounding environment. The pulses are scanned using steering systems that direct the pulses in one or two dimensions to cover an area of the surrounding environment (the scan area). When these systems use mechanical means to direct the pulses, the system complexity increases because more moving parts are required. Additionally, only a single pulse can be emitted at any one time because two or more identical pulses would introduce ambiguity in returned pulses. In some embodiments of the present technology, these disadvantages and/or others are overcome.


For example, some embodiments of the present technology use two light sources that produce pulses of different wavelengths. These light sources provide the pulses to a pulse steering system at different angles so that the scan area for each light source is different. This allows for tuning the light source to appropriate powers and the possibility of having overlapping scan areas that cover scans of different distances. Longer ranges can be scanned with pulses having higher power and/or a slower repetition rate. Shorter ranges can be scanned with pulses having lower power and/or a higher repetition rate to increase point density.


As another example, some embodiments of the present technology use pulse steering systems with one or more dispersion elements (e.g., gratings, optical combs, prisms, etc.) to direct pulses based on the wavelength of the pulse. A dispersion element can make fine adjustments to a pulse's optical path, which may be difficult or impossible with mechanical systems. Additionally, using one or more dispersion elements allows the pulse steering system to use fewer mechanical components to achieve the desired scanning capabilities. This results in a simpler, more efficient (e.g., lower power) design that is potentially more reliable (due to fewer moving components).


Some LiDAR systems use the time-of-flight of light signals (e.g., light pulses) to determine the distance to objects in the path of the light. For example, with respect to FIG. 1, an exemplary LiDAR system 100 includes a laser light source (e.g., a fiber laser), a steering system (e.g., a system of one or more moving mirrors), and a light detector (e.g., a photon detector with one or more optics). LiDAR system 100 transmits light pulse 102 along path 104 as determined by the steering system of LiDAR system 100. In the depicted example, light pulse 102, which is generated by the laser light source, is a short pulse of laser light. Further, the signal steering system of the LiDAR system 100 is a pulse signal steering system. However, it should be appreciated that LiDAR systems can operate by generating, transmitting, and detecting light signals that are not pulsed and/or derive ranges to objects in the surrounding environment using techniques other than time-of-flight. For example, some LiDAR systems use frequency modulated continuous waves (i.e., “FMCW”). It should be further appreciated that any of the techniques described herein with respect to time-of-flight based systems that use pulses also may be applicable to LiDAR systems that do not use one or both of these techniques.


Referring back to FIG. 1 (a time-of-flight LiDAR system that uses light pulses), when light pulse 102 reaches object 106, light pulse 102 scatters and returned light pulse 108 will be reflected back to system 100 along path 110. The time from when transmitted light pulse 102 leaves LiDAR system 100 to when returned light pulse 108 arrives back at LiDAR system 100 can be measured (e.g., by a processor or other electronics within the LiDAR system). This time-of-flight, combined with the knowledge of the speed of light, can be used to determine the range/distance from LiDAR system 100 to the point on object 106 where light pulse 102 scattered.
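
For illustration only (this sketch is not part of the patent's disclosure), the range calculation described above reduces to half the measured round-trip time multiplied by the speed of light. The function name and example values in the following Python sketch are hypothetical:

```python
# Illustrative time-of-flight range calculation (hypothetical names/values).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance in meters to the scattering point, given the time between
    transmitting a pulse and detecting its return."""
    # The pulse travels to the object and back, so divide by two.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a return detected 2 microseconds after transmission corresponds
# to an object roughly 300 m away.
print(range_from_time_of_flight(2e-6))  # ~299.79
```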


By directing many light pulses, as depicted in FIG. 2, LiDAR system 100 scans the external environment (e.g., by directing light pulses 102, 202, 206, 210 along paths 104, 204, 208, 212, respectively). As depicted in FIG. 3, LiDAR system 100 receives returned light pulses 108, 302, 306 (which correspond to transmitted light pulses 102, 202, 210, respectively) back after objects 106 and 214 scatter the transmitted light pulses and reflect pulses back along paths 110, 304, 308, respectively. Based on the direction of the transmitted light pulses (as determined by LiDAR system 100) as well as the calculated range from LiDAR system 100 to the points on objects that scatter the light pulses (e.g., the points on objects 106 and 214), the surroundings within the detection range (e.g., the field of view between path 104 and 212, inclusively) can be precisely plotted (e.g., a point cloud or image can be created).


If a corresponding light pulse is not received for a particular transmitted light pulse, then it can be determined that there are no objects within a certain range of LiDAR system 100 (e.g., the max scanning distance of LiDAR system 100). For example, in FIG. 2, light pulse 206 will not have a corresponding returned light pulse (as depicted in FIG. 3) because it did not produce a scattering event along its transmission path 208 within the predetermined detection range. LiDAR system 100 (or an external system in communication with LiDAR system 100) can interpret this as no object being along path 208 within the detection range of LiDAR system 100.


In FIG. 2, transmitted light pulses 102, 202, 206, 210 can be transmitted in any order, serially, in parallel, or based on other timings with respect to each other. Additionally, while FIG. 2 depicts a 1-dimensional array of transmitted light pulses, LiDAR system 100 optionally also directs similar arrays of transmitted light pulses along other planes so that a 2-dimensional array of light pulses is transmitted. This 2-dimensional array can be transmitted point-by-point, line-by-line, all at once, or in some other manner. The point cloud or image from a 1-dimensional array (e.g., a single horizontal line) will produce 2-dimensional information (e.g., (1) the horizontal transmission direction and (2) the range to objects). The point cloud or image from a 2-dimensional array will have 3-dimensional information (e.g., (1) the horizontal transmission direction, (2) the vertical transmission direction, and (3) the range to objects).


The density of points in a point cloud or image from LiDAR system 100 is equal to the number of pulses divided by the field of view. Given that the field of view is fixed, to increase the density of points generated by one set of transmission-receiving optics, the LiDAR system needs to fire pulses more frequently; in other words, a light source with a higher repetition rate is needed. However, by sending pulses more frequently, the farthest distance that the LiDAR system can detect may be more limited. For example, if a returned signal from a far object is received after the system transmits the next pulse, the return signals may be detected in a different order than the order in which the corresponding signals were transmitted and get mixed up if the system cannot correctly correlate the returned signals with the transmitted signals. To illustrate, consider an exemplary LiDAR system that can transmit laser pulses with a repetition rate between 500 kHz and 1 MHz. Based on the time it takes for a pulse to return to the LiDAR system, and to avoid mix-up of returned pulses from consecutive pulses in a conventional LiDAR design, the farthest distance the LiDAR system can detect may be 300 meters and 150 meters for 500 kHz and 1 MHz, respectively. The density of points of a LiDAR system with a 500 kHz repetition rate is half of that with 1 MHz. Thus, this example demonstrates that, if the system cannot correctly correlate returned signals that arrive out of order, increasing the repetition rate from 500 kHz to 1 MHz (and thus improving the density of points of the system) would significantly reduce the detection range of the system.
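
The repetition-rate trade-off in this example can be checked directly: in a conventional design with only one pulse in flight at a time, the farthest unambiguous range is bounded by how far light can travel out and back within one pulse period. The following Python sketch is illustrative only and is not part of the patent's disclosure:

```python
# Illustrative check of the 300 m / 150 m figures quoted above.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def max_unambiguous_range_m(repetition_rate_hz: float) -> float:
    """Farthest distance for which a return can arrive before the next pulse
    is transmitted, assuming only one pulse is in flight at a time."""
    pulse_period_s = 1.0 / repetition_rate_hz
    return SPEED_OF_LIGHT_M_PER_S * pulse_period_s / 2.0

print(max_unambiguous_range_m(500e3))  # ~300 m at 500 kHz
print(max_unambiguous_range_m(1e6))    # ~150 m at 1 MHz
```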



FIG. 4 depicts a logical block diagram of LiDAR system 100, which includes light source 402, signal steering system 404, pulse detector 406, and controller 408. These components are coupled together using communications paths 410, 412, 414, 416, and 418. These communications paths represent communication (bidirectional or unidirectional) among the various LiDAR system components but need not be physical components themselves. While the communications paths can be implemented by one or more electrical wires, busses, or optical fibers, the communication paths can also be wireless channels or open-air optical paths so that no physical communication medium is present. For example, in one exemplary LiDAR system, communication path 410 is one or more optical fibers, communication path 412 represents an optical path, and communication paths 414, 416, 418, and 420 are all one or more electrical wires that carry electrical signals. The communications paths can also include more than one of the above types of communication mediums (e.g., they can include an optical fiber and an optical path or one or more optical fibers and one or more electrical wires).


LIDAR system 100 can also include other components not depicted in FIG. 4, such as power buses, power supplies, LED indicators, switches, etc. Additionally, other connections among components may be present, such as a direct connection between light source 402 and light detector 406 so that light detector 406 can accurately measure the time from when light source 402 transmits a light pulse until light detector 406 detects a returned light pulse.



FIG. 5 depicts a logical block diagram of one example of light source 402 that is based on a laser fiber, although any number of light sources with varying architecture could be used as part of the LiDAR system. Light source 402 uses seed 502 to generate initial light pulses of one or more wavelengths (e.g., 1550 nm), which are provided to wavelength-division multiplexor (WDM) 504 via fiber 503. Pump 506 also provides laser power (of a different wavelength, such as 980 nm) to WDM 504 via fiber 505. The output of WDM 504 is provided to pre-amplifiers 508 (which includes one or more amplifiers) which provides its output to combiner 510 via fiber 509. Combiner 510 also takes laser power from pump 512 via fiber 511 and provides pulses via fiber 513 to booster amplifier 514, which produces output light pulses on fiber 410. The outputted light pulses are then fed to steering system 404. In some variations, light source 402 can produce pulses of different amplitudes based on the fiber gain profile of the fiber used in the source. Communication path 416 couples light source 402 to controller 408 (FIG. 4) so that components of light source 402 can be controlled by or otherwise communicate with controller 408. Alternatively, light source 402 may include its own controller. Instead of controller 408 communicating directly with components of light source 402, a dedicated light source controller communicates with controller 408 and controls and/or communicates with the components of light source 402. Light source 402 also includes other components not shown, such as one or more power connectors, power supplies, and/or power lines.
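
For readers who find a compact summary helpful, the signal chain of FIG. 5 can be listed in order as follows. This is only an illustrative restatement of the paragraph above; the Python data structure and field names are hypothetical and not part of the patent:

```python
# Illustrative, in-code summary of the FIG. 5 fiber-laser chain (hypothetical model).
from dataclasses import dataclass
from typing import List

@dataclass
class Stage:
    name: str
    numeral: str   # reference numeral used in FIG. 5
    note: str

FIBER_LASER_CHAIN: List[Stage] = [
    Stage("seed", "502", "generates initial light pulses, e.g., 1550 nm"),
    Stage("WDM", "504", "combines seed pulses with 980 nm pump 506"),
    Stage("pre-amplifiers", "508", "one or more amplifier stages"),
    Stage("combiner", "510", "adds laser power from pump 512"),
    Stage("booster amplifier", "514", "produces output light pulses on fiber 410"),
]

for stage in FIBER_LASER_CHAIN:
    print(f"{stage.numeral}: {stage.name} - {stage.note}")
```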


Some other light sources include one or more laser diodes, short-cavity fiber lasers, solid-state lasers, and/or tunable external cavity diode lasers, configured to generate one or more light signals at various wavelengths. In some examples, light sources use amplifiers (e.g., pre-amps or booster amps) that include a doped optical fiber amplifier, a solid-state bulk amplifier, and/or a semiconductor optical amplifier, configured to receive and amplify light signals.


Returning to FIG. 4, signal steering system 404 includes any number of components for steering light signals generated by light source 402. In some examples, signal steering system 404 may include one or more optical redirection elements (e.g., mirrors or lenses) that steer light pulses (e.g., by rotating, vibrating, or directing) along a transmit path to scan the external environment. For example, these optical redirection elements may include MEMS mirrors, rotating polyhedron mirrors, or stationary mirrors to steer the transmitted pulse signals to different directions. Signal steering system 404 optionally also includes other optical components, such as dispersion optics (e.g., diffuser lenses, prisms, or gratings) to further expand the coverage of the transmitted signal in order to increase the LiDAR system 100's transmission area (i.e., field of view). In some examples, signal steering system 404 does not contain any active optical components (e.g., it does not contain any amplifiers). In some other examples, one or more of the components from light source 402, such as a booster amplifier, may be included in signal steering system 404. In some instances, signal steering system 404 can be considered a LiDAR head or LiDAR scanner.


Some implementations of signal steering systems include one or more optical redirection elements (e.g., mirrors or lenses) that steer returned light signals (e.g., by rotating, vibrating, or directing) along a receive path to direct the returned light signals to the light detector. The optical redirection elements that direct light signals along the transmit and receive paths may be the same components (e.g., shared), separate components (e.g., dedicated), and/or a combination of shared and separate components. This means that in some cases the transmit and receive paths are different although they may partially overlap (or in some cases, substantially overlap).



FIG. 6 depicts a logical block diagram of one possible arrangement of components in light detector 406 of LiDAR system 100 (FIG. 4). Light detector 406 includes optics 604 (e.g., a system of one or more optical lenses) and detector 602 (e.g., a charge coupled device (CCD), a photodiode, an avalanche photodiode, a photomultiplier vacuum tube, an image sensor, etc.) that is connected to controller 408 (FIG. 4) via communication path 418. The optics 604 may include one or more photo lenses to receive, focus, and direct the returned signals. Light detector 406 can include filters to selectively pass light of certain wavelengths. Light detector 406 can also include a timing circuit that measures the time from when a pulse is transmitted to when a corresponding returned pulse is detected. This data can then be transmitted to controller 408 (FIG. 4) or to other devices via communication line 418. Light detector 406 can also receive information about when light source 402 transmitted a light pulse via communication line 418 or other communications lines that are not shown (e.g., an optical fiber from light source 402 that samples transmitted light pulses). Alternatively, light detector 406 can provide signals via communication line 418 that indicate when returned light pulses are detected. Other pulse data, such as power, pulse shape, and/or wavelength, can also be communicated.


Returning to FIG. 4, controller 408 contains components for the control of LiDAR system 100 and communication with external devices that use the system. For example, controller 408 optionally includes one or more processors, memories, communication interfaces, sensors, storage devices, clocks, ASICs, FPGAs, and/or other devices that control light source 402, signal steering system 404, and/or light detector 406. In some examples, controller 408 controls the power, rate, timing, and/or other properties of light signals generated by light source 402; controls the speed, transmit direction, and/or other parameters of light steering system 404; and/or controls the sensitivity and/or other parameters of light detector 406.


Controller 408 optionally is also configured to process data received from these components. In some examples, controller 408 determines the time it takes from transmitting a light pulse until a corresponding returned light pulse is received; determines when a returned light pulse is not received for a transmitted light pulse; determines the transmitted direction (e.g., horizontal and/or vertical information) for a transmitted/returned light pulse; determines the estimated range in a particular direction; and/or determines any other type of data relevant to LiDAR system 100.



FIG. 7 shows illustrative laser pulses according to an embodiment. In particular, FIG. 7 shows conventional pulse 705 and ultrashort pulses 710. Conventional pulse 705 can represent a pulse having a relatively long pulse duration (e.g., on the order of nanoseconds) as compared to the ultrashort pulses 710, which have relatively short durations (e.g., on the order of picoseconds). Ultrashort pulses 710 may be several orders of magnitude shorter than conventional pulses 705. For example, ultrashort pulses 710 can be in the picosecond range, for example, ranging from 10 picoseconds to 900 picoseconds, or more particularly ranging between 100 and 200 picoseconds. As such, several ultrashort pulses 710 can be emitted during the span of one conventional pulse 705. Each ultrashort pulse 710 may have a higher peak power than conventional pulse 705, but the average power consumed by the ultrashort pulses 710 emitted over the pulse duration of conventional pulse 705 is much less than the average power of conventional pulse 705.
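
To make the peak-versus-average-power relationship concrete, average power scales with peak power, pulse duration, and repetition rate under a rectangular-pulse approximation. The Python sketch below is illustrative only; the peak power and repetition rate values are assumptions, not figures from the patent:

```python
# Illustrative average-power comparison (assumed values, rectangular pulses).
def average_power_w(peak_power_w: float, pulse_duration_s: float,
                    repetition_rate_hz: float) -> float:
    """Average optical power = peak power x duty cycle, where the duty cycle
    is pulse duration times repetition rate."""
    return peak_power_w * pulse_duration_s * repetition_rate_hz

# At the same peak power and repetition rate, a 200 ps pulse consumes
# one-twentieth the average power of a 4 ns pulse.
print(average_power_w(100.0, 200e-12, 1e6))  # 0.02 W
print(average_power_w(100.0, 4e-9, 1e6))     # 0.4 W
```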



FIG. 8 shows an illustrative LiDAR system 800 according to an embodiment. LiDAR system 800 can include ultrashort pulse controller 810, laser system 820, beam steering system 830, and return beam detection system 840. Ultrashort pulse controller 810 may control the pulse duration and power intensity of each laser pulse emitted by laser system 820, and may also control the time intervals between successively emitted laser pulses. Laser system 820 may be similar to light source 402, beam steering system 830 may be similar to signal steering system 404, and return beam detection system 840 may be similar to light detector 406. Because ultrashort pulses are much shorter than conventional pulses, special consideration may be taken into account by return beam detection system 840 to quickly and accurately process return signals. Several return beam detection system embodiments are discussed below.


An advantage of using ultrashort pulses is that they reduce average power consumption by the laser. In addition, the ultrashort pulses may enable the size of beam steering system 830 to shrink. For example, in one embodiment, a 200 picosecond ultrashort pulse may enable the receiving aperture of beam steering system 830 to be reduced to 200 mm2 and power consumption to be reduced to less than 1 W, whereas a 4 ns pulse may require the receiving aperture of beam steering system 830 to be approximately 600 mm2 and power consumption to be more than 10 W. Yet another advantage of using ultrashort pulses is that various electronics, such as light detecting electronics, can be shrunk compared to the size of such electronics needed for conventional light pulses.



FIG. 9 shows illustrative return beam detection system 900 according to an embodiment. System 900 can include receiving lenses 910, fast avalanche photo diode (APD) 920, analog-to-digital converter (ADC) 930, and signal detection circuitry 940. The fast APD 920 monitors for return pulses. Detected return pulses are sent to ADC 930 for conversion from an analog signal to a digital signal. That digital signal can be compared to a threshold by signal detection circuitry 940 to determine whether a return pulse has been detected.
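
A minimal Python sketch of the threshold comparison performed by signal detection circuitry 940 might look like the following; the sample values and threshold are hypothetical, and in practice the comparison could equally be implemented in hardware:

```python
# Illustrative threshold test on digitized APD samples (hypothetical values).
from typing import List

def detect_return_pulses(adc_samples: List[int], threshold: int) -> List[int]:
    """Return indices of digitized samples whose intensity exceeds the
    threshold, i.e., candidate return-pulse detections."""
    return [i for i, level in enumerate(adc_samples) if level > threshold]

# Example: mostly noise-level samples with one strong return at index 4.
samples = [3, 2, 4, 3, 57, 4, 2]
print(detect_return_pulses(samples, threshold=20))  # [4]
```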



FIG. 10 shows illustrative return beam detection system 1000 according to an embodiment. System 1000 can include receiving lenses 1010, single photon avalanche diode (SPAD) detector 1020, time-to-digital converter (TDC) 1030, and signal detection circuitry 1040. SPAD detector 1020 is very sensitive and generates an output in response to return signals and noise (e.g., ambient sunlight, etc.). TDC 1030 determines the time period between successive outputs generated by SPAD detector 1020. The determined time periods are provided to signal detection circuitry 1040. Signal detection circuitry 1040 can evaluate the determined time periods to assess whether a return pulse has been detected. Using TDC 1030 (as opposed to an ADC) eliminates the intensity measurement of the return pulses and instead uses a time-based measurement of the return pulses. To combat the noise issue, the ultrashort pulse controller (e.g., controller 810) may instruct the laser source to emit successive ultrashort pulses with different periods. For example, FIG. 11 shows an illustrative sequence of ultrashort pulses 1101-1104 that are successively emitted at different periods or time delays T1-T3, where T1≠T2≠T3. By varying the periods, signal detection circuitry 1040 can more readily discern whether SPAD signals belong to a real return signal or noise. This can be done by correlating the determined time periods with the varying periods. FIG. 12 shows a correlated return pulse signal 1201 surrounded by noise signals 1202. In addition, by varying the periods of successive laser pulses, signal detection circuitry 1040 can detect return pulses within 2-3 ms, as opposed to techniques that do not dither the successive laser pulses, which can take 100 ms or more to detect return pulses.
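
One way to picture the correlation step: the controller knows the sequence of intentionally varied transmit delays (T1, T2, T3, . . . ), and the TDC reports the intervals between successive detector outputs; intervals that track the transmitted delay pattern (a roughly constant time of flight cancels out of the difference) are treated as returns, and the rest as noise. The simplified Python sketch below assumes already-aligned sequences and a hypothetical tolerance; it is not the patent's implementation:

```python
# Simplified correlation of measured intervals with dithered transmit delays.
from typing import List

def correlate_intervals(measured_intervals_s: List[float],
                        transmitted_delays_s: List[float],
                        tolerance_s: float = 1e-9) -> List[bool]:
    """True means the measured interval matches the known varied transmit delay
    (consistent with two successive real returns); False means at least one of
    the two detector outputs is likely noise."""
    return [abs(measured - expected) <= tolerance_s
            for measured, expected in zip(measured_intervals_s, transmitted_delays_s)]

# Example with dithered delays T1 != T2 != T3 (illustrative values):
transmitted = [1.0e-6, 1.3e-6, 0.8e-6]
measured = [1.0e-6, 0.45e-6, 0.8e-6]  # middle interval disrupted by a noise count
print(correlate_intervals(measured, transmitted))  # [True, False, True]
```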


System 1000 may not be able to determine intensity of the return light pulse because it is set up to detect time delays between returned pulses. If desired, system 1000 may be used in combination with a camera that can be used to obtain intensity information related to the light pulse.


In some embodiments, the SPAD detector can be based on InGaAs, which can detect light in the 900-1700 nm wavelength range. In another embodiment, the SPAD detector can be based on a silicon photomultiplier (SiPM), which can detect light in the 400-1050 nm wavelength range.



FIG. 13 shows illustrative process 1300 according to an embodiment. Starting at step 1310, a plurality of ultrashort light pulses can be transmitted to a steering system that redirects the light pulses to a LiDAR field of view, wherein a time delay between successively transmitted ultrashort light pulses is varied. At step 1320, light energy signals are received via a detection system. The light energy signals can include at least one of noise and returned ultrashort pulses. At step 1330, a time interval between successively received light energy signals is determined. At step 1340, the determined time interval can be correlated with the time delay to distinguish between noise and returned ultrashort pulses.


It should be understood that the steps shown in FIG. 13 are merely illustrative and that additional steps may be added, that some steps may be omitted, and that some steps may be rearranged.



FIG. 14 shows illustrative process 1400 according to an embodiment. Starting at step 1410, a plurality of ultrashort light pulses can be transmitted to a steering system that redirects the light pulses to a LiDAR field of view. At step 1420, light energy signals are received via a detection system. The light energy signals can include noise and returned ultrashort pulses, wherein each received light energy signal produces an analog intensity level. At step 1430, the analog intensity level is converted to a digital intensity level. At step 1440, the digital intensity level is compared to a threshold to distinguish between noise and returned ultrashort pulses.


It should be understood that the steps shown in FIG. 14 are merely illustrative and that additional steps may be added, that some steps may be omitted, and that some steps may be rearranged.



FIG. 15 shows illustrative return beam detection system 1500 according to an embodiment. System 1500 can include receiving lenses 1510, avalanche photo diode (APD) detector 1520, limiting transimpedance amplifier (LTA) 1530, time-to-digital converter (TDC) 1540, laser reference clock 1550, and signal detection circuitry 1560. APD 1520 monitors for return pulses. APD 1520 produces a current signal based on the detected return pulse. The current signal may have a period approximately equal to that of the ultrashort pulse. LTA 1530 converts the current signal obtained by APD 1520 into a square wave signal. The square wave signal has a period that is approximately the same as the period of the light pulse received by receiving lens 1510. The square wave signal is provided to TDC 1540. TDC 1540 also receives laser reference clock 1550. Laser reference clock 1550 corresponds to transmission of each light pulse by the laser system (not shown). TDC 1540 is able to determine the time delay between the reference clock and the rising edge of the square wave for any given light pulse. Each light pulse event includes the originating light pulse emitted by the laser source and a return pulse: each originating light pulse corresponds to laser reference clock 1550, and each return pulse is represented by the square wave generated by LTA 1530. The time delay between the reference clock and the square wave is used to calculate the distance of the object responsible for returning the return pulse.



FIG. 16 shows an illustrative timing diagram according to an embodiment. As shown, laser source 1605 transmits a laser pulse that corresponds to a REF CLK pulse transmitted by laser reference clock 1650. A return pulse, reflected by an object in the LiDAR system's field of view, is detected by the APD, which produces illustrative current signal 1610, which is sent to the LTA. The LTA converts current signal 1610 into square wave signal 1620. Square wave signal 1620 and REF CLK are compared by the TDC to determine the time delay (ta) between square wave signal 1620 and REF CLK. The time delay can be used to determine the distance to the object responsible for returning the return pulse.
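
As an illustrative sketch (not part of the patent's disclosure), the time delay measured by the TDC converts to distance in the same way as any time-of-flight measurement; the timestamp values in the Python snippet below are hypothetical:

```python
# Illustrative conversion of the TDC time delay to a distance (assumed values).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_tdc(ref_clk_edge_s: float, rising_edge_s: float) -> float:
    """Distance to the object, from the delay between the reference-clock edge
    (pulse transmission) and the rising edge of the LTA square wave (return)."""
    time_delay_s = rising_edge_s - ref_clk_edge_s
    return SPEED_OF_LIGHT_M_PER_S * time_delay_s / 2.0

# Example: a rising edge 1.2 microseconds after the reference-clock edge
# places the object roughly 180 m away.
print(distance_from_tdc(0.0, 1.2e-6))  # ~179.9
```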



FIG. 17 shows an illustrative process 1700 according to an embodiment. Process 1700 may be implemented in system 1500. Starting at step 1710, a plurality of ultrashort light pulses can be transmitted to a steering system that redirects the light pulses to a LiDAR field of view, wherein each transmitted ultrashort light pulse corresponds to a reference clock. At step 1720, light energy signals are received via a detection system, wherein each received light energy signal produces a current level. For example, APD 1520 may produce a current level based on the returned ultrashort pulse. At step 1730, the current level is converted to a square wave signal. For example, LTA 1530 may generate the square wave signal. At step 1740, the square wave signal is compared to the reference clock pulse to obtain a time delay between transmission of an ultrashort light pulse and a return of that particular ultrashort light pulse. For example, TDC 1540 may determine the time delay.


It should be understood that the steps shown in FIG. 17 are merely illustrative and that additional steps may be added, that some steps may be omitted, and that some steps may be rearranged.


Various exemplary embodiments are described herein. Reference is made to these examples in a non-limiting sense. They are provided to illustrate more broadly applicable aspects of the disclosed technology. Various changes may be made and equivalents may be substituted without departing from the true spirit and scope of the various embodiments. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s) or step(s) to the objective(s), spirit or scope of the various embodiments. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the various embodiments.

Claims
  • 1. A light detection and ranging (LiDAR) system comprising: a light source configured to output a plurality of light pulses; a controller configured to: control pulse duration of each of the light pulses, control a plurality of time delays including variable time delays such that at least two of the plurality of time delays are different among successive light pulses, and select, in a pseudo-random manner, the plurality of time delays between the successive light pulses, wherein the pulse duration of each of the light pulses is substantially the same and is characterized as an ultrashort light pulse, and wherein at least one ultrashort light pulse of the light pulses is emitted during a span of one non-ultrashort light pulse having a pulse duration on the order of nanosecond; a steering system configured to control a scanning direction of each ultrashort light pulse to a particular location within a LiDAR field-of-view (FOV); and a detection system configured to monitor for returned ultrashort pulses and determine one or more time durations between successive outputs of a detector of the detection system, wherein the controller is further configured to correlate the determined one or more time durations with the variable time delays to distinguish between noise and the returned ultrashort pulses.
  • 2. The LiDAR system of claim 1, wherein the detection system comprises: at least one receiving lens; the detector comprising a single photon avalanche diode (SPAD) detector; and a time-to-digital converter (TDC).
  • 3. The LiDAR system of claim 2, wherein the SPAD detector is configured to generate one or more output signals in response to light energy received through the at least one receiving lens, and wherein the TDC is configured to determine one or more time durations between successive outputs of the SPAD detector.
  • 4. The LiDAR system of claim 1, wherein the pulse duration of an ultrashort light pulse is on the order of picoseconds.
  • 5. The LiDAR system of claim 1, wherein the pulse duration of an ultrashort light pulse ranges between 10 picoseconds and 900 picoseconds.
  • 6. The LiDAR system of claim 1, wherein the pulse duration of an ultrashort light pulse ranges between 100 picoseconds and 200 picoseconds.
  • 7. The LiDAR system of claim 1, wherein the detection system comprises: at least one receiving lens; the detector comprising an avalanche photo diode (APD) detector; and an analog-to-digital converter (ADC).
  • 8. The LiDAR system of claim 7, wherein the APD detector is configured to generate an output signal proportional to light energy received through the at least one receiving lens, and wherein the ADC is configured to convert the output signal to a digital signal.
  • 9. The LiDAR system of claim 8, wherein the controller is configured to compare the digital signal to a threshold to determine whether a returned ultrashort pulse has been received.
  • 10. The LiDAR system of claim 1, wherein the detection system comprises: at least one receiving lens; the detector comprising an avalanche photo diode (APD) detector, wherein the APD detector produces a current signal in response to a returned ultrashort pulse; a limiting transimpedance amplifier (LTA) configured to generate a square wave signal based on the current signal; and a time-to-digital converter (TDC).
  • 11. The LiDAR system of claim 10, wherein at least one of the plurality of time delays between the successive light pulses is provided with a reference clock to the TDC, and wherein the TDC determines a time delay between the reference clock and the square wave signal that is used to determine a distance to an object that returned the returned ultrashort pulse.
  • 12. A method for using a light detection and ranging (LiDAR) system, comprising: transmitting a plurality of ultrashort light pulses to a steering system that redirects the ultrashort light pulses to a field of view of the LiDAR system, wherein pulse duration of each of the ultrashort light pulses is substantially the same and is controlled by a controller of the LIDAR system, wherein a plurality of time delays between successive ultrashort light pulses are variable time delays controlled by the controller of the LiDAR system such that at least two of the time delays are different among the successive ultrashort light pulses, wherein the plurality of time delays between the successive ultrashort light pulses are selected by the controller of the LiDAR system in a pseudo-random manner, and wherein at least one ultrashort light pulse of the plurality of ultrashort light pulses is emitted during a span of one non-ultrashort light pulse having a pulse duration on the order of nanosecond; receiving, via a detection system, light energy signals comprising at least one of noise and returned ultrashort pulses; determining one or more time intervals between successively received light energy signals; and correlating the determined one or more time intervals with the plurality of variable time delays to distinguish between noise and returned ultrashort pulses.
  • 13. The method of claim 12, wherein the ultrashort light pulses range between 10 picoseconds and 900 picoseconds.
  • 14. The method of claim 12, wherein the ultrashort light pulses range between 100 picoseconds and 200 picoseconds.
  • 15. A method for using a light detection and ranging (LiDAR) system, comprising: transmitting a plurality of ultrashort light pulses to a steering system that redirects the ultrashort light pulses to a field of view of the LiDAR system, wherein pulse duration of each of the ultrashort light pulses is substantially the same and is controlled by a controller of the LIDAR system, wherein a plurality of time delays between successive ultrashort light pulses are variable time delays controlled by the controller of the LiDAR system such that at least two of the time delays are different among the successive ultrashort light pulses, wherein the plurality of time delays between the successive ultrashort light pulses are selected by the controller of the LiDAR system in a pseudo-random manner, and wherein at least one ultrashort light pulse of the plurality of ultrashort light pulses is emitted during a span of one non-ultrashort light pulse having a pulse duration on the order of nanosecond; receiving, via a detection system, light energy signals comprising noise and returned ultrashort pulses, wherein each of the received light energy signal produces an analog intensity level; converting the analog intensity level to a digital intensity level; and distinguishing between noise and returned ultrashort pulses based on the digital intensity level and the variable time delays.
  • 16. The method of claim 15, wherein the ultrashort light pulses range between 10 picoseconds and 900 picoseconds.
  • 17. The method of claim 15, wherein the ultrashort light pulses range between 100 picoseconds and 200 picoseconds.
  • 18. A method for using a light detection and ranging (LiDAR) system, comprising: transmitting a plurality of ultrashort light pulses to a steering system that redirects the light pulses to a field of view of the LiDAR system, wherein pulse duration of each of the ultrashort light pulses is substantially the same and is controlled by a controller of the LIDAR system, wherein each of the transmitted ultrashort light pulse corresponds to a reference clock, wherein a plurality of time delays between successive ultrashort light pulses are variable time delays controlled by a controller of the LiDAR system such that at least two of the time delays are different among the successive ultrashort light pulses, wherein the plurality of time delays between the successive ultrashort light pulses are selected by the controller of the LiDAR system in a pseudo-random manner, and wherein at least one ultrashort light pulse of the plurality of ultrashort light pulses is emitted during a span of one non-ultrashort light pulse having a pulse duration on the order of nanosecond; receiving, via a detection system, light energy signals, and wherein each of the received light energy signals produces a current level; converting the current level to a square wave signal; and obtaining, based on the square wave signal and the variable time delays, a time delay between transmission of an ultrashort light pulse and a corresponding return light pulse.
  • 19. The method of claim 18, wherein the ultrashort light pulses range between 10 picoseconds and 900 picoseconds.
  • 20. The method of claim 18, wherein the ultrashort light pulses range between 100 picoseconds and 200 picoseconds.
  • 21. The LiDAR system of claim 1, wherein the controller is further configured to control the pulse duration of each of the light pulses to be ultrashort such that a size of a receiving aperture of the steering system is reduced compared to if the light pulses are non-ultrashort pulse.
  • 22. The LiDAR system of claim 1, wherein the controller is further configured to control the pulse duration of each of the light pulses to be ultrashort such that average power consumption by the light source is reduced compared to if the light pulses are non-ultrashort pulse.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 62/743,367, filed Oct. 9, 2018, the disclosure of which is incorporated herein in its entirety.

US Referenced Citations (239)
Number Name Date Kind
3897150 Bridges et al. Jul 1975 A
4464048 Farlow Aug 1984 A
4923263 Johnson May 1990 A
5006721 Cameron et al. Apr 1991 A
5157451 Taboada Oct 1992 A
5319434 Croteau et al. Jun 1994 A
5369661 Yamaguchi et al. Nov 1994 A
5442358 Keeler et al. Aug 1995 A
5546188 Wangler et al. Aug 1996 A
5579153 Laming et al. Nov 1996 A
5657077 Deangelis et al. Aug 1997 A
5793491 Wangler et al. Aug 1998 A
5838239 Stern et al. Nov 1998 A
5864391 Hosokawa et al. Jan 1999 A
5926259 Bamberger et al. Jul 1999 A
5936756 Nakajima Aug 1999 A
6163378 Khoury Dec 2000 A
6317202 Hosokawa et al. Nov 2001 B1
6594000 Green et al. Jul 2003 B2
6650404 Crawford Nov 2003 B1
6950733 Stopczynski Sep 2005 B2
7128267 Reichenbach et al. Oct 2006 B2
7202941 Munro Apr 2007 B2
7345271 Boehlau et al. Mar 2008 B2
7382442 Adachi et al. Jun 2008 B2
7440084 Kane Oct 2008 B2
7440175 Di et al. Oct 2008 B2
7489865 Varshneya et al. Feb 2009 B2
7502395 Cheng et al. Mar 2009 B2
7508496 Mettenleiter et al. Mar 2009 B2
7576837 Liu et al. Aug 2009 B2
7583364 Mayor et al. Sep 2009 B1
7830527 Chen et al. Nov 2010 B2
7835068 Brooks et al. Nov 2010 B1
7847235 Krupkin et al. Dec 2010 B2
7880865 Tanaka et al. Feb 2011 B2
7936448 Albuquerque et al. May 2011 B2
7969558 Hall Jun 2011 B2
7982861 Abshire et al. Jul 2011 B2
8072582 Meneely Dec 2011 B2
8471895 Banks Jun 2013 B2
8736818 Weimer et al. May 2014 B2
8749764 Hsu Jun 2014 B2
8812149 Doak Aug 2014 B2
8994928 Shiraishi Mar 2015 B2
9041762 Bai et al. May 2015 B2
9048616 Robinson Jun 2015 B1
9065243 Asobe et al. Jun 2015 B2
9086273 Gruver et al. Jul 2015 B1
9121703 Droz et al. Sep 2015 B1
9194701 Bosch Nov 2015 B2
9255790 Zhu Feb 2016 B2
9300321 Zalik et al. Mar 2016 B2
9304316 Weiss et al. Apr 2016 B2
9316724 Gehring et al. Apr 2016 B2
9354485 Fermann et al. May 2016 B2
9510505 Halloran et al. Dec 2016 B2
9575184 Gilliland et al. Feb 2017 B2
9605998 Nozawa Mar 2017 B2
9621876 Federspiel Apr 2017 B2
9638799 Goodwin et al. May 2017 B2
9696426 Zuk Jul 2017 B2
9702966 Batcheller et al. Jul 2017 B2
9804264 Villeneuve et al. Oct 2017 B2
9810786 Welford et al. Nov 2017 B1
9812838 Villeneuve et al. Nov 2017 B2
9823353 Eichenholz et al. Nov 2017 B2
9857468 Eichenholz et al. Jan 2018 B1
9869754 Campbell et al. Jan 2018 B1
9880263 Droz et al. Jan 2018 B2
9880278 Uffelen et al. Jan 2018 B2
9885778 Dussan Feb 2018 B2
9897689 Dussan Feb 2018 B2
9915726 Bailey et al. Mar 2018 B2
9927915 Frame et al. Mar 2018 B2
9958545 Eichenholz et al. May 2018 B2
10007001 LaChapelle et al. Jun 2018 B1
10012732 Eichenholz et al. Jul 2018 B2
10042159 Dussan et al. Aug 2018 B2
10061019 Campbell et al. Aug 2018 B1
10073166 Dussan Sep 2018 B2
10078133 Dussan Sep 2018 B2
10094925 LaChapelle Oct 2018 B1
10157630 Vaughn et al. Dec 2018 B2
10185027 O'Keeffe Jan 2019 B2
10191155 Curatu Jan 2019 B2
10215847 Scheim et al. Feb 2019 B2
10267898 Campbell et al. Apr 2019 B2
10295656 Li et al. May 2019 B1
10310058 Campbell et al. Jun 2019 B1
10324170 Enberg, Jr. et al. Jun 2019 B1
10324185 McWhirter et al. Jun 2019 B2
10393877 Hall et al. Aug 2019 B2
10422865 Irish et al. Sep 2019 B2
10429495 Wang et al. Oct 2019 B1
10444356 Wu et al. Oct 2019 B2
10451716 Hughes et al. Oct 2019 B2
10466342 Zhu et al. Nov 2019 B1
10502831 Eichenholz Dec 2019 B2
10509112 Pan Dec 2019 B1
10520602 Villeneuve et al. Dec 2019 B2
10557923 Watnik et al. Feb 2020 B2
10571567 Campbell et al. Feb 2020 B2
10578720 Hughes et al. Mar 2020 B2
10591600 Villeneuve et al. Mar 2020 B2
10627491 Hall et al. Apr 2020 B2
10641872 Dussan et al. May 2020 B2
10663564 LaChapelle May 2020 B2
10663585 McWhirter May 2020 B2
10663596 Dussan et al. May 2020 B2
10684360 Campbell Jun 2020 B2
10732281 LaChapelle Aug 2020 B2
10908262 Dussan Feb 2021 B2
10908265 Dussan Feb 2021 B2
10908268 Zhou et al. Feb 2021 B2
10969475 Li et al. Apr 2021 B2
10983218 Hall et al. Apr 2021 B2
11002835 Pan et al. May 2021 B2
11009605 Li et al. May 2021 B2
11016192 Pacala et al. May 2021 B2
11022689 Villeneuve et al. Jun 2021 B2
11035935 Hinderling Jun 2021 B2
11194048 Burbank et al. Dec 2021 B1
20020136251 Green et al. Sep 2002 A1
20040135992 Munro Jul 2004 A1
20050033497 Stopczynski Feb 2005 A1
20050190424 Reichenbach et al. Sep 2005 A1
20050195383 Breed et al. Sep 2005 A1
20060071846 Yanagisawa et al. Apr 2006 A1
20060132752 Kane Jun 2006 A1
20070091948 Di et al. Apr 2007 A1
20070216995 Bollond et al. Sep 2007 A1
20080174762 Liu et al. Jul 2008 A1
20080193135 Du et al. Aug 2008 A1
20090010644 Varshneya et al. Jan 2009 A1
20090051926 Chen Feb 2009 A1
20090059201 Willner et al. Mar 2009 A1
20090067453 Mizuuchi et al. Mar 2009 A1
20090147239 Zhu Jun 2009 A1
20090262760 Krupkin et al. Oct 2009 A1
20090316134 Michael et al. Dec 2009 A1
20100006760 Lee et al. Jan 2010 A1
20100020306 Hall Jan 2010 A1
20100020377 Borchers et al. Jan 2010 A1
20100027602 Abshire et al. Feb 2010 A1
20100045965 Meneely Feb 2010 A1
20100053715 O'Neill et al. Mar 2010 A1
20100128109 Banks May 2010 A1
20100271614 Albuquerque et al. Oct 2010 A1
20110181864 Schmitt et al. Jul 2011 A1
20120038903 Weimer et al. Feb 2012 A1
20120124113 Zalik et al. May 2012 A1
20120221142 Doak Aug 2012 A1
20130107016 Federspeil May 2013 A1
20130116971 Retkowski et al. May 2013 A1
20130241761 Cooper et al. Sep 2013 A1
20130293867 Hsu et al. Nov 2013 A1
20130293946 Fermann et al. Nov 2013 A1
20130329279 Nati et al. Dec 2013 A1
20130342822 Shiraishi Dec 2013 A1
20140078514 Zhu Mar 2014 A1
20140104594 Gammenthaler Apr 2014 A1
20140347650 Bosch Nov 2014 A1
20140350836 Stettner et al. Nov 2014 A1
20150078123 Batcheller et al. Mar 2015 A1
20150084805 Dawber Mar 2015 A1
20150109603 Kim et al. Apr 2015 A1
20150116692 Zuk et al. Apr 2015 A1
20150139259 Robinson May 2015 A1
20150158489 Oh et al. Jun 2015 A1
20150338270 Williams et al. Nov 2015 A1
20150355327 Goodwin et al. Dec 2015 A1
20160003946 Gilliland et al. Jan 2016 A1
20160006914 Neumann Jan 2016 A1
20160047896 Dussan Feb 2016 A1
20160047900 Dussan Feb 2016 A1
20160061655 Nozawa Mar 2016 A1
20160061935 Mccloskey et al. Mar 2016 A1
20160100521 Halloran et al. Apr 2016 A1
20160117048 Frame et al. Apr 2016 A1
20160172819 Ogaki Jun 2016 A1
20160178736 Chung Jun 2016 A1
20160226210 Zayhowski et al. Aug 2016 A1
20160245902 Natnik Aug 2016 A1
20160291134 Droz et al. Oct 2016 A1
20160313445 Bailey et al. Oct 2016 A1
20160327646 Scheim et al. Nov 2016 A1
20170003116 Yee et al. Jan 2017 A1
20170153319 Villeneuve et al. Jun 2017 A1
20170219695 Hall Aug 2017 A1
20170242104 Dussan Aug 2017 A1
20170299721 Eichenholz et al. Oct 2017 A1
20170307738 Schwarz et al. Oct 2017 A1
20170328991 Yates Nov 2017 A1
20170365105 Rao et al. Dec 2017 A1
20180040171 Kundu et al. Feb 2018 A1
20180050704 Tascione et al. Feb 2018 A1
20180069367 Villeneuve et al. Mar 2018 A1
20180152691 Pacala et al. May 2018 A1
20180158471 Vaughn et al. Jun 2018 A1
20180164439 Droz et al. Jun 2018 A1
20180156896 O'Keeffe Jul 2018 A1
20180188355 Bao et al. Jul 2018 A1
20180188357 Li et al. Jul 2018 A1
20180188358 Li et al. Jul 2018 A1
20180188371 Bao et al. Jul 2018 A1
20180210084 Zwölfer et al. Jul 2018 A1
20180275274 Bao et al. Sep 2018 A1
20180284241 Campbell et al. Oct 2018 A1
20180284242 Campbell Oct 2018 A1
20180284286 Eichenholz et al. Oct 2018 A1
20180329060 Pacala et al. Nov 2018 A1
20180359460 Pacala et al. Dec 2018 A1
20190025428 Li et al. Jan 2019 A1
20190107607 Danziger Apr 2019 A1
20190107623 Campbell et al. Apr 2019 A1
20190120942 Zhang et al. Apr 2019 A1
20190120962 Gimpel et al. Apr 2019 A1
20190154804 Eichenholz May 2019 A1
20190154807 Steinkogler et al. May 2019 A1
20190212416 Li et al. Jul 2019 A1
20190250254 Campbell et al. Aug 2019 A1
20190257924 Li et al. Aug 2019 A1
20190265334 Zhang et al. Aug 2019 A1
20190265336 Zhang et al. Aug 2019 A1
20190265337 Zhang et al. Aug 2019 A1
20190265339 Zhang et al. Aug 2019 A1
20190277952 Beuschel et al. Sep 2019 A1
20190310368 LaChapelle Oct 2019 A1
20190369215 Wang et al. Dec 2019 A1
20190369258 Hall et al. Dec 2019 A1
20190383915 Li et al. Dec 2019 A1
20200142070 Hall et al. May 2020 A1
20200256964 Campbell et al. Aug 2020 A1
20200284906 Eichenholz et al. Sep 2020 A1
20200319310 Hall et al. Oct 2020 A1
20200400798 Rezk et al. Dec 2020 A1
20210088630 Zhang Mar 2021 A9
20220390572 Russell Dec 2022 A1
Foreign Referenced Citations (75)
Number Date Country
1677050 Oct 2005 CN
204758260 Nov 2015 CN
204885804 Dec 2015 CN
108132472 Jun 2018 CN
207457508 Jun 2018 CN
207557465 Jun 2018 CN
208314210 Jan 2019 CN
208421228 Jan 2019 CN
208705506 Apr 2019 CN
10659747 May 2019 CN
209280923 Aug 2019 CN
108445468 Nov 2019 CN
110031823 Mar 2020 CN
108089201 Apr 2020 CN
109116331 Apr 2020 CN
109917408 Apr 2020 CN
109116366 May 2020 CN
109116367 May 2020 CN
110031822 May 2020 CN
211655309 Oct 2020 CN
109188397 Nov 2020 CN
109814086 Nov 2020 CN
109917348 Nov 2020 CN
110492856 Nov 2020 CN
110736975 Nov 2020 CN
109725320 Dec 2020 CN
110780284 Dec 2020 CN
110780283 Jan 2021 CN
110784220 Feb 2021 CN
212623082 Feb 2021 CN
110492349 Mar 2021 CN
109950784 May 2021 CN
213182011 May 2021 CN
213750313 Jul 2021 CN
214151038 Sep 2021 CN
109814082 Oct 2021 CN
113491043 Oct 2021 CN
214795200 Nov 2021 CN
214795206 Nov 2021 CN
214895784 Nov 2021 CN
214895810 Nov 2021 CN
215641806 Jan 2022 CN
112639527 Feb 2022 CN
215932142 Mar 2022 CN
112578396 Apr 2022 CN
0 757 257 Feb 1997 EP
1 237 305 Sep 2002 EP
1 923 721 May 2008 EP
2 157 445 Feb 2010 EP
2 395 368 Dec 2011 EP
2 889 642 Jul 2015 EP
1 427 164 Mar 1976 GB
2000411 Jan 1979 GB
2007144667 Jun 2007 JP
2010035385 Feb 2010 JP
2017-003347 Jan 2017 JP
2017-138301 Aug 2017 JP
10-2010-0096931 Sep 2010 KR
10-2012-0013515 Feb 2012 KR
10-2013-0068224 Jun 2013 KR
10-2018-0107673 Oct 2018 KR
02101408 Dec 2002 WO
2017110417 Jun 2017 WO
2018125725 Jul 2018 WO
2018126248 Jul 2018 WO
2018129408 Jul 2018 WO
2018129409 Jul 2018 WO
2018129410 Jul 2018 WO
2018175990 Sep 2018 WO
2018182812 Oct 2018 WO
2019079642 Apr 2019 WO
2019165095 Aug 2019 WO
2019165289 Aug 2019 WO
2019165294 Aug 2019 WO
2020013890 Jan 2020 WO
Non-Patent Literature Citations (26)
Entry
Chen, X, et al. (Feb. 2010). “Polarization Coupling of Light and Optoelectronics Devices Based on Periodically Poled Lithium Niobate,” Shanghai Jiao Tong University, China, Frontiers in Guided Wave Optics and Optoelectronics, 24 pages.
Goldstein, R. (Apr. 1986) “Electro-Optic Devices in Review, The Linear Electro-Optic (Pockels) Effect Forms the Basis for a Family of Active Devices,” Laser & Applications, FastPulse Technology, Inc., 6 pages.
International Preliminary Report on Patentability, dated Jul. 9, 2019, for International Application No. PCT/US2018/012703, 10 pages.
International Preliminary Report on Patentability, dated Jul. 9, 2019, for International Application No. PCT/US2018/012704, 7 pages.
International Preliminary Report on Patentability, dated Jul. 9, 2019, for International Application No. PCT/US2018/012705, 7 pages.
International Search Report and Written Opinion, dated Jan. 17, 2020, for International Application No. PCT/US2019/019276, 14 pages.
International Search Report and Written Opinion, dated Jul. 9, 2019, for International Application No. PCT/US2019/018987, 17 pages.
International Search Report and Written Opinion, dated Sep. 18, 2018, for International Application No. PCT/US2018/012116, 12 pages.
International Search Report and Written Opinion, dated May 3, 2019, for International Application No. PCT/US2019/019272, 16 pages.
International Search Report and Written Opinion, dated May 6, 2019, for International Application No. PCT/US2019/019264, 15 pages.
International Search Report and Written Opinion, dated Jan. 3, 2019, for International Application No. PCT/US2018/056577, 15 pages.
International Search Report and Written Opinion, dated Mar. 23, 2018, for International Application No. PCT/US2018/012704, 12 pages.
International Search Report and Written Opinion, dated Jun. 7, 2018, for International Application No. PCT/US2018/024185, 9 pages.
International Preliminary Report on Patentability, dated Apr. 30, 2020, for International Application No. PCT/US2018/056577, 8 pages.
European Search Report, dated Jul. 17, 2020, for EP Application No. 18776977.3, 12 pages.
Extended European Search Report, dated Jul. 10, 2020, for EP Application No. 18736738.8, 9 pages.
Gunzung, Kim, et al. (Mar. 2, 2016). “A hybrid 3D LIDAR imager based on pixel-by-pixel scanning and DS-OCDMA,” Proceedings of SPIE [Proceedings of SPIE ISSN 0277-786X vol. 10524], SPIE, US, vol. 9751, pp. 975119-975119-8.
Extended European Search Report, dated Jul. 22, 2020, for EP Application No. 18736685.1, 10 pages.
Gluckman, J. (May 13, 2016). “Design of the processing chain for a high-altitude, airborne, single-photon lidar mapping instrument,” Proceedings of SPIE [Proceedings of SPIE ISSN 0277-786X vol. 10524], SPIE, US, vol. 9832, 9 pages.
Office Action Issued in Japanese Patent Application No. 2019-536019 dated Nov. 30, 2021, 6 pages.
European Search Report, dated Jun. 17, 2021, for EP Application No. 18868896.4, 7 pages.
“Fiber laser,” Wikipedia, https://en.wikipedia.org/wiki/Fiber_laser, 6 pages.
International Search Report and Written Opinion, dated Mar. 19, 2018, for International Application No. PCT/US2018/012705, 12 pages.
International Search Report and Written Opinion, dated Mar. 20, 2018, for International Application No. PCT/US2018/012703, 13 pages.
“Mirrors”, Physics LibreTexts, https://phys.libretexts.org/Bookshelves/Optics/Supplemental_Modules_(Components)/Mirrors, (2021), 2 pages.
“Why Wavelengths Matter in Fiber Optics”, FirstLight, https://www.firstlight.net/why-wavelengths-matter-in-fiber-optics/, (2021), 5 pages.
Provisional Applications (1)
Number Date Country
62743367 Oct 2018 US