Interference mitigation for light detection and ranging

Information

  • Patent Grant
  • Patent Number
    11,906,670
  • Date Filed
    Monday, April 6, 2020
  • Date Issued
    Tuesday, February 20, 2024
Abstract
Methods, apparatus, and systems related to light detection and ranging (LIDAR) are described. In one example aspect, a LIDAR apparatus includes a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal that is directed toward an object in an external environment. The first electrical pulse signal comprises a first set of non-uniformly spaced pulses. The apparatus includes a receiver configured to convert returned light signals from the object into electrical signals and a filtering subsystem in communication with the receiver, configured to receive the electrical signals from the receiver and remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.
Description
BACKGROUND

Light Detection and Ranging (LIDAR) is a remote sensing method that uses light in the form of a pulsed laser to measure variable distances to the environment. LIDAR systems have a wide range of applications including autonomous driving and aerial mapping of a surface. These applications may place a high priority on the security, accuracy and reliability of the operation. The disclosed embodiments provide methods and systems to reduce or minimize signal interference when multiple LIDAR sensors are present, thereby generating reliable data for determining surface features of the environment. The additional features and benefits of the disclosed technology, as well as their further applications and implementations, are described in the sections that follow.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example schematic diagram of a LIDAR sensor in accordance with the present technology.



FIG. 2 illustrates an example schematic diagram of a LIDAR system in accordance with the present technology.



FIG. 3 illustrates another example schematic diagram of a LIDAR system in accordance with the present technology.



FIG. 4 depicts an example pulse signal generated by a controller.



FIG. 5A illustrates an example interference pattern that appears in a point cloud.



FIG. 5B illustrates an example interference pattern that appears in a point cloud.



FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology.



FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology.



FIG. 8 illustrates an example of a filtering process in accordance with the present technology.



FIG. 9 is a flowchart representation of a method for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology.



FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device that can be utilized to implement various portions of the presently disclosed technology.





DETAILED DESCRIPTION

LIDAR systems operate by transmitting a series of light pulses that reflect off objects. The reflected signal, or return signal, is received by the light detection and ranging system. Based on the detected time-of-flight (TOF), the system determines the range/distance between the system and the object.
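
As a simple illustration of the time-of-flight relationship described above, the following minimal Python sketch converts a measured round-trip TOF into a range; the function name and constant are illustrative and not taken from this patent document.

```python
# Minimal sketch: range from round-trip time of flight (illustrative only).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_tof(tof_seconds):
    """Return the distance to the target given the round-trip time of flight."""
    # The pulse travels to the object and back, so halve the total path length.
    return SPEED_OF_LIGHT_M_PER_S * tof_seconds / 2.0

# Example: a 1 microsecond round trip corresponds to roughly 150 m.
print(range_from_tof(1e-6))  # ~149.9 m
```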



FIG. 1 illustrates an example schematic diagram of a LIDAR sensor 100 in accordance with the present technology. The LIDAR sensor 100 includes a light emitter 101 that emits a light signal to an object 111 in an external environment and a receiver 103 that receives the return signal reflected by the object 111. In some embodiments, the LIDAR sensor 100 adopts a beam steering mechanism to steer the light signal to the object 111. In some embodiments, the LIDAR sensor 100 can be configured to rotate around an axis to achieve a larger field of view (e.g., a 360-degree horizontal field of view). The receiver 103 may include an analog-to-digital (A/D) converter (not shown) that converts the return signal to a corresponding electrical signal. In some embodiments, the LIDAR sensor 100 may optionally include a controller 105. In some embodiments, the controller is a part of the LIDAR system. The controller 105 is in communication with both the light emitter 101 and the receiver 103 so that the controller 105 can control light emissions by the light emitter 101 and process the electrical signals from the receiver 103. In some embodiments, the controller 105 uses a set of points, such as a point cloud, based on the captured distances to represent at least a partial surface of the object 111 in the environment. The points in the point cloud can represent or indicate surface features of the object 111.


In some embodiments, the light emitter 101 can emit multiple light beams. FIG. 2 illustrates an example schematic diagram of a LIDAR system 200 in accordance with the present technology. The LIDAR system 200 includes a LIDAR sensor 201 that emits a plurality of beams over an angular range. In this particular example, the light emitter can emit 32 beams of light (also referred to as 32 channels). In some embodiments, the LIDAR system includes multiple LIDAR sensors to obtain a dense set of data for an accurate representation of the object. FIG. 3 illustrates an example schematic diagram of a LIDAR system 300 in accordance with the present technology. The LIDAR system 300 depicted in this example includes a first LIDAR sensor 301 and a second LIDAR sensor 303. Both of the sensors are capable of emitting multiple light beams. Because the sensors are positioned in proximity to each other within a single LIDAR system, crosstalk interference among the sensors can occur. Furthermore, with an increasing number of autonomous vehicles equipped with LIDAR systems to detect and avoid obstacles in the environment, direct interference among multiple LIDAR sensors may occur. The reception of foreign light signals can lead to problems such as ghost targets or a reduced signal-to-noise ratio. Disclosed herein are techniques that can be implemented in various embodiments to reduce or minimize the impact of interference among multiple LIDAR sensors, thereby providing a more reliable model to represent the environment for safe maneuvering.


In some LIDAR systems, the light emitter generates pulsed light emissions in response to an electrical pulse signal provided by the controller. FIG. 4 depicts an example electrical pulse signal 400 generated by a controller. The pulse signal 400 includes multiple uniformly or non-uniformly spaced pulses. For example, the pulses can be uniformly distributed in a period of time Tp. That is, adjacent pulses are separated by the same interval t in the time domain. As discussed above, when there are multiple light beams generated according to the same pulse signal, direct or crosstalk interference among the signals is likely to occur. For example, for multi-sensor LIDAR systems, inter-sensor interference can occur, which includes direct interference when sensors are pointing at each other and indirect interference when signals from multiple sensors bounce off the object(s). For single-sensor multi-beam LIDAR systems, intra-sensor interference can occur. Signals from multiple beams can bounce off the object(s) and form an interference pattern. When signals at the same time-domain location interfere, the resulting interference may be transformed into ghost targets that are located at approximately the same distance from the LIDAR sensor. FIGS. 5A-B illustrate example interference patterns that can appear in a point cloud. In these examples, the interference can appear as a curved surface in the point cloud data set. The curved surface can be misinterpreted as a real obstacle in the environment, thereby severely impacting computations of the controller and subsequent navigation of the vehicle that carries the LIDAR system.


One way to minimize the impact of such interference is to distribute the interference signals across the field of view of the LIDAR sensor. If the interference shows up as small ghost points located at various distances from the LIDAR sensor, there is a lower probability that the controller will interpret these signals as a solid obstacle. Instead of generating pulses that are uniformly spaced in the time domain, the controller can control the light emitter to “dither” the pulse signal so that the firing time of the pulses is randomized within the period Tp. FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology. A randomized offset Δtrandom can be added to or subtracted from t for each firing of the pulse. Thus, adjacent pulses are separated by different intervals (t ± Δtrandom) in the time domain. For example, Δtrandom can be generated by a pseudo-random generation algorithm. In some embodiments, Δtrandom is up to around 1 μs. In the specific example shown in FIG. 6, t1 = t ± Δtrandom1, t2 = t ± Δtrandom2, t3 = t ± Δtrandom3, and t1 ≠ t2 ≠ t3. Due to the nature of pseudo-random generation, occasionally two adjacent pulses may have the same spacing as two pulses in another sequence. For example, t4 in FIG. 6 can be equal to t in FIG. 4. However, the time-domain spacings of the pulses are sufficiently different due to the randomization so that the interference, when it does occur, is distributed in distance to form a more uniform noise pattern.
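
The dithering of firing times can be summarized with a short sketch. The following Python snippet is a minimal, hedged illustration assuming a uniform pseudo-random offset bounded by roughly 1 μs, as in the example above; the function name, the seed parameter, and the choice of random-number generator are assumptions rather than details from this patent document.

```python
import random

def dithered_firing_times(num_pulses, nominal_spacing_s, max_dither_s=1e-6, seed=None):
    """Return firing times where each nominal spacing t is perturbed by a
    randomized offset in [-max_dither_s, +max_dither_s] (i.e., t +/- dt_random)."""
    rng = random.Random(seed)  # pseudo-random generation, as described above
    times = [0.0]
    for _ in range(num_pulses - 1):
        delta = rng.uniform(-max_dither_s, max_dither_s)
        times.append(times[-1] + nominal_spacing_s + delta)
    return times

# Example: eight pulses nominally 5 microseconds apart, dithered by up to +/-1 microsecond.
print(dithered_firing_times(8, 5e-6, seed=42))
```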


In some embodiments, the firing sequence can be dynamically re-ordered to allow a larger range of Δtrandom. In some embodiments, the dithering techniques described herein can be combined with a pulse signal signature, described in U.S. Pub. No. 2019/0178991 (the entire content of which is incorporated by reference as part of the disclosure of this patent document), to further reduce or minimize the impact of interference. For example, each pulse signal can be given a pulse signal signature. In some embodiments, each pulse signal has a unique signature to ensure the security and reliability of the generated data. In some embodiments, the number of unique signatures available to the LIDAR sensor (also referred to as the code base) may be limited by various factors. Given a small code base, two LIDAR sensors may have the same signature at the same time, leading to an interference pattern such as the ones shown in FIGS. 5A-B. Combining the dithering techniques with a code base of unique pulse signatures can greatly reduce the likelihood of two LIDAR sensors sharing the same pulse signal and interfering with each other.



FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology. As shown in FIG. 7, the interference is more uniformly distributed and resembles a white noise pattern. The controller can now apply a filtering process to remove the points that are deemed interference.



FIG. 8 illustrates an example of a filtering process in accordance with the present technology. In FIG. 8, a set of points has been collected to represent an object or at least a partial surface of the object in the environment. The collected points also include a significant amount of interference appearing as noise. In order to determine whether a particular point is interference/noise or actual data representing the object, the controller can check information carried in the neighboring points to determine whether there is a coherence among the points.


In some embodiments, the points in the point cloud dataset are represented using a Cartesian coordinate system. Information carried by the neighboring points in the x, y, and/or z directions, such as the location or the color, can be used to determine whether the point and its neighbors represent the same surface. For example, as shown in FIG. 8, point 801 can be checked against its neighboring points in the x, y, and/or z directions. The neighboring points along the x direction of point 801 do not provide sufficient information because they also appear to be noise. However, the neighboring points along the y direction of point 801 indicate that there is no coherence between point 801 and some of the neighbors; they appear to represent two separate surfaces. Point 801 can thus be filtered out as a noise point. Similarly, point 802 can be checked against its neighboring points. The neighboring points along both the x and y directions indicate that there exists a coherence between point 802 and its neighbors. Thus, point 802 remains in the model.
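
A compact way to express this two-direction check is sketched below. This is only one reasonable reading of the FIG. 8 example, assuming a simple distance test for coherence; the function names, the distance threshold, and the use of NumPy are assumptions rather than details from the patent document.

```python
import numpy as np

def coherent_along(target_xyz, neighbor_xyzs, dist_threshold=0.3):
    """Treat the target as coherent along one direction if at least one
    neighbor along that direction lies within the distance threshold."""
    if len(neighbor_xyzs) == 0:
        return False  # neighbors unusable (e.g., they appear to be noise too)
    dists = np.linalg.norm(np.asarray(neighbor_xyzs) - np.asarray(target_xyz), axis=1)
    return bool(np.any(dists <= dist_threshold))

def keep_point(target_xyz, neighbors_x, neighbors_y):
    """Keep the point only if it is coherent with neighbors along both the
    first (x) and second (y) directions; otherwise filter it out as noise."""
    return coherent_along(target_xyz, neighbors_x) and coherent_along(target_xyz, neighbors_y)
```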


In some embodiments, the points in the point cloud carry a normal vector N indicating an estimated surface normal of the object. Based on the normal vector N, two substantially orthogonal directions x′ and y′ can be determined. Information carried by the neighboring points in x′ and y′ can be used to determine whether the point and its neighbors represent the same surface. For example, in some embodiments, each point carries color information. If the point has the same color as its neighboring points (or if the color difference is equal to or less than a predefined threshold), a point coherence exists. As another example, if a distance between the point and its neighboring point exceeds a predefined threshold, the controller can determine that there is no coherence between the point and its neighbors. When there is no coherence between a point and its neighbors, the point is removed or filtered from the model as noise.
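
The two in-surface directions x′ and y′ can be derived from the estimated normal N by standard vector algebra, for example as in the sketch below. The construction (cross products with an arbitrary helper axis), the threshold values, and the point representation are illustrative assumptions, not prescriptions from the patent document.

```python
import numpy as np

def surface_directions(normal):
    """Derive two substantially orthogonal directions x' and y' lying in the
    plane perpendicular to the estimated surface normal N."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    # Choose a helper axis that is not (nearly) parallel to the normal.
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x_prime = np.cross(n, helper)
    x_prime /= np.linalg.norm(x_prime)
    y_prime = np.cross(n, x_prime)  # unit length, since n and x' are orthonormal
    return x_prime, y_prime

def same_surface(point, neighbor, max_dist=0.3, max_color_diff=20):
    """Coherence tests mentioned above: no coherence if the points are too far
    apart or their colors differ by more than a predefined threshold."""
    close = np.linalg.norm(np.asarray(point["xyz"]) - np.asarray(neighbor["xyz"])) <= max_dist
    similar_color = abs(point["color"] - neighbor["color"]) <= max_color_diff
    return close and similar_color
```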


Examining neighboring points in two or more directions can provide a more reliable determination of the point coherence when the object has a sharp surface feature or a small dimension along one direction. For example, if the object is a thin, long pole, examining points along the x direction in the Cartesian coordinate system may not provide a sufficient amount of information because there are only a few surface points available. Examining points along the y direction, however, allows the system to use more surface points of the pole to determine whether a coherence exists between the target point (potentially a noise point) and the object.


In some embodiments, a bounding box is placed around the target point. The spatial neighbors of the target point generated by prior and/or next signals from the same channel, or by concurrent/prior/next signals from a vertically adjacent channel, are examined. If the distance between a neighboring point and the target point is equal to or less than a threshold, the target point can be considered a coherent point with respect to the neighboring point. Other factors, such as the color of the object, the reflectivity of the object, the strength of the signal, and the signal-to-noise ratio (SNR) of the signal, can also be considered to determine the coherence. In some implementations, when the number of coherent points in the bounding box exceeds a predetermined threshold (e.g., nine neighboring returns), the target point is considered a valid data point instead of noise/interference.
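
A minimal sketch of this bounding-box count is given below; the box size, the coherence distance, and the nine-return threshold are placeholders chosen only to mirror the example above, and the function and parameter names are not from the patent document.

```python
import numpy as np

def is_valid_point(target_xyz, neighbor_xyzs,
                   box_half_size=0.5, coherence_dist=0.2, min_coherent=9):
    """Count neighbors that fall inside a bounding box around the target and
    lie within a coherence distance; keep the target only if the count
    exceeds the predetermined threshold (e.g., nine neighboring returns)."""
    target = np.asarray(target_xyz, dtype=float)
    coherent = 0
    for p in np.asarray(neighbor_xyzs, dtype=float):
        in_box = np.all(np.abs(p - target) <= box_half_size)
        if in_box and np.linalg.norm(p - target) <= coherence_dist:
            coherent += 1
    return coherent > min_coherent  # True: valid data point, False: noise/interference
```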


In some embodiments, to obtain optimal real-time performance, the LIDAR sensor may track a small set of points in the point cloud dataset instead of the entire dataset for the object(s) in the environment. Thus, instead of filtering the entire dataset at once, the LIDAR sensor or a filtering subsystem of the LIDAR sensor performs real-time filtering. The real-time filtering can be performed on one object within a few selected frames, or on multiple objects within one frame. The controller or the filtering subsystem can also make certain assumptions about the object (e.g., smoothness, color, location, size) to facilitate real-time filtering of the noise/interference points.


In some implementations, a confidence of validity of the filtering process can be derived based on the number of coherent points in the bounding box. As discussed above, the LIDAR sensors can emit multiple sets of signals concurrently. A combination of the dithering pattern (e.g., the burstiness of multiple channels) and a low confidence of validity in the filtering process can be used to indicate whether the target point is a result of direct or crosstalk interference.
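
One simple way to turn the coherent-neighbor count into such a confidence score is sketched below; normalizing by an expected neighbor count is an assumption made for illustration and is not specified in the patent document.

```python
def filter_confidence(num_coherent_neighbors, expected_neighbors):
    """Derive a rough confidence of validity from the number of coherent
    points found in the bounding box, normalized by how many neighbors a
    real surface would be expected to produce."""
    if expected_neighbors <= 0:
        return 0.0
    return min(1.0, num_coherent_neighbors / expected_neighbors)

# Example: 3 coherent neighbors where ~12 are expected -> low confidence (0.25).
print(filter_confidence(3, 12))
```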



FIG. 9 is a flowchart representation of a method 900 for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology. The method 900 includes, at operation 902, emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal. The electrical pulse signal comprises a first set of non-uniformly spaced pulses. The method 900 includes, at operation 904, receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object. The method 900 includes, at operation 906, converting, by the receiver, the one or more returned light signals into electrical signals. The method 900 includes, at operation 908, generating, based on the electrical signals, a model that comprises a plurality of points representing a surface of the object. The method 900 includes, at operation 910, filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
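
For orientation, the sequence of operations 902-910 can be expressed as a short pipeline sketch. Every helper passed in below is hypothetical glue code introduced for illustration; none of these names or interfaces come from the patent document.

```python
def sense_object(emitter, receiver, build_point_cloud, is_coherent):
    """Hedged end-to-end sketch mirroring operations 902-910 of method 900."""
    emitter.fire_dithered_pulses()               # 902: non-uniformly spaced pulses
    returns = receiver.collect_returns()         # 904: returned light signals
    electrical = receiver.convert(returns)       # 906: light -> electrical signals
    points = build_point_cloud(electrical)       # 908: model of the object surface
    # 910: keep only points coherent with neighbors along two directions.
    return [p for p in points if is_coherent(p, points)]
```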



FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device 1000 that can be utilized to implement various portions of the presently disclosed technology, such as the controller 105 shown in FIG. 1. In FIG. 10, the computer system 1000 includes one or more processors 1005 and memory 1010 connected via an interconnect 1025. The interconnect 1025 may represent any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 1025, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, sometimes referred to as “Firewire.”


The processor(s) 1005 may include central processing units (CPUs) to control the overall operation of, for example, the host computer. In certain embodiments, the processor(s) 1005 accomplish this by executing software or firmware stored in memory 1010. The processor(s) 1005 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), or the like, or a combination of such devices.


The memory 1010 can be, or include, the main memory of the computer system. The memory 1010 represents any suitable form of random-access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 1010 may contain, among other things, a set of machine instructions which, when executed by processor 1005, causes processor 1005 to perform operations to implement embodiments of the presently disclosed technology.


Also connected to the processor(s) 1005 through the interconnect 1025 is an (optional) network adapter 1015. The network adapter 1015 provides the computer system 1000 with the ability to communicate with remote devices, such as storage clients and/or other storage servers, and may be, for example, an Ethernet adapter or Fibre Channel adapter.


It is thus evident that the disclosed techniques can be implemented in various embodiments to effectively reduce the impact of signal interference in LIDAR sensors and systems. In one example aspect, a light detection and ranging apparatus includes a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal that is directed toward an object in an external environment. The first electrical pulse signal comprises a first set of non-uniformly spaced pulses. The apparatus includes a receiver configured to receive one or more returned light signals reflected by the object and convert the returned light signals into electrical signals, and a filtering subsystem in communication with the receiver. The filtering subsystem is configured to receive the electrical signals from the receiver and remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.


In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain. In some embodiments, the light emitter is configured to generate a second pulse light signal according to a second electrical pulse signal, wherein the second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and wherein at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses. In some embodiments, the light emitter is configured to generate a third pulse light signal according to a third electrical pulse signal, wherein the third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and wherein no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses. In some embodiments, each of the first, second, and third electrical pulse signals comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signals.
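
A possible way to generate an additional dithered sequence whose pulses avoid the time-domain locations already used by an earlier sequence is sketched below. The rejection-sampling loop and the minimum separation are illustrative assumptions introduced only to make the idea concrete; they are not specified in the patent document.

```python
import random

def dithered_sequence_avoiding(existing_times, num_pulses, nominal_spacing_s,
                               max_dither_s=1e-6, min_separation_s=5e-9, seed=None):
    """Generate non-uniformly spaced firing times while rejecting candidates
    that land at (approximately) the same time-domain location as any pulse
    in a previously generated sequence."""
    rng = random.Random(seed)
    times, last = [], 0.0
    for _ in range(num_pulses):
        candidate = last + nominal_spacing_s
        for _attempt in range(100):  # retry until the candidate is clear
            candidate = last + nominal_spacing_s + rng.uniform(-max_dither_s, max_dither_s)
            if all(abs(candidate - e) > min_separation_s for e in existing_times):
                break
        times.append(candidate)
        last = candidate
    return times
```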


In some embodiments, each point carries information about a vector that indicates an estimated surface normal of the object, and wherein the first direction and the second direction are determined based on the vector. In some embodiments, the first direction and the second direction are two directions in a Cartesian coordinate system.


In some embodiments, determining the coherence between the point and the corresponding neighboring points comprises examining information carried in the neighboring points of the point and determining whether the point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.


In another example aspect, a method for sensing an object in an external environment using a light detection and ranging device includes emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses. The method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more returned light signals into electrical signals. The method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the set of points.


In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain. In some embodiments, the method includes emitting a second pulse light signal according to a second electrical pulse signal. The second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses. In some embodiments, the method includes emitting a third pulse light signal according to a third electrical pulse signal. The third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses. In some embodiments, each of the first, second, and third electrical pulse signals comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signals.


In some embodiments, each point carries information about a vector that indicates an estimated surface normal of the object. The method includes determining two orthogonal directions as the first direction and the second direction based on the vector. In some embodiments, the method includes selecting two orthogonal directions in a Cartesian coordinate system as the first direction and the second direction.


In some embodiments, determining the coherence between the target point and the corresponding neighboring points includes examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes a location of the neighboring point. In some embodiments, the method can include filtering out the target point upon determining that a distance between the neighboring point and the target point exceeds a predefined threshold. In some embodiments, the information carried in a neighboring point includes a color of the neighboring point. The method can include filtering out the target point upon determining that a color difference between the neighboring point and the target point exceeds a predefined threshold.


In some embodiments, filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.


In another example aspect, a non-transitory computer readable medium having processor code stored thereon includes program code for performing a method that comprises emitting, by a light emitter of a light detection and ranging device, a pulse light signal towards an object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses. The method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more returned light signals into electrical signals. The method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the set of points.


In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in time domain. In some embodiments, determining the coherence between the target point and the corresponding neighboring points comprises examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point. In some embodiments, filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.


Implementations of the subject matter and the functional operations described in this patent document can be implemented in various systems, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing unit” or “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including, by way of example, a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.


A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.


The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).


Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random-access memory, or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including, by way of example, semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.


It is intended that the specification, together with the drawings, be considered exemplary only, where exemplary means an example. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Additionally, the use of “or” is intended to include “and/or,” unless the context clearly indicates otherwise.


While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document, in the context of separate embodiments, can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.


Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.


Only a few implementations and examples are described, and other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.

Claims
  • 1. A light detection and ranging apparatus, comprising: a light emitter configured to generate, according to a first electrical pulse signal, a first light signal comprising one or more pulses; a receiver configured to receive a return light signal, wherein a controller in communication with the receiver is configured to generate one or more points in a point cloud based on the return light signal, wherein the point cloud includes a set of points representing a partial surface of an object; and a subsystem configured to identify as noise a target point from the set of points representing the partial surface of the object by determining whether there is a coherence between the target point and a first neighboring point of the target point along a first direction and a coherence between the target point and a second neighboring point of the target point along a second direction, wherein the first neighboring point and the second neighboring point are from the set of points representing the partial surface of the object in the point cloud.
  • 2. The apparatus of claim 1, wherein a dithering technique randomizes one or more firing times of the respective one or more pulses of the first light signal.
  • 3. The apparatus of claim 2, wherein the dithering technique is combined with a pulse signature of the first light signal to reduce a likelihood of signal interference.
  • 4. The apparatus of claim 1, wherein the controller is further configured to facilitate the generation of the first light signal by the light emitter.
  • 5. The apparatus of claim 4, wherein the controller has one or more processors and memory.
  • 6. The apparatus of claim 1, further comprising a network adapter configured to facilitate communication over a network.
  • 7. The apparatus of claim 6, wherein the network adapter is further configured to communicate with at least one remote device to process and store data remotely.
  • 8. The apparatus of claim 1, wherein the subsystem is further configured to estimate a confidence of validity of a determination that the target point is noise.
  • 9. The apparatus of claim 1, wherein the subsystem is further configured to remove the target point identified as noise from the set of points.
  • 10. The apparatus of claim 1, wherein the first electrical pulse signal has a first set of non-uniformly spaced pulses.
  • 11. The apparatus of claim 10, wherein the one or more pulses of the first light signal have a pulse signature including the first set of non-uniformly spaced pulses.
  • 12. A method for light detection and ranging, comprising: emitting, by a light emitter, a first light signal according to an electrical pulse signal, the first light signal comprising one or more pulses; receiving, by a receiver, one or more return light signals reflected by an object; converting, by the receiver, the one or more return light signals into electrical signals, wherein said electrical signals are used to generate one or more points in a point cloud, wherein the point cloud includes a set of points representing a partial surface of the object; and identifying as noise a target point from the set of points representing the partial surface of the object by determining whether there is a coherence between the target point and a first neighboring point of the target point along a first direction and a coherence between the target point and a second neighboring point of the target point along a second direction, wherein the first neighboring point and the second neighboring point are from the set of points representing the partial surface of the object in the point cloud.
  • 13. The method of claim 12, further comprising performing a dithering technique to randomize one or more firing times of the respective one or more pulses of the first light signal.
  • 14. The method of claim 13, further comprising combining the dithering technique with a pulse signature of the first light signal to reduce a likelihood of signal interference with other light signals.
  • 15. The method of claim 12, further comprising facilitating, by a network adapter, communication over a network.
  • 16. The method of claim 15, further comprising communicating, by the network adapter, with at least one remote device to process and store data remotely.
  • 17. The method of claim 12, further comprising estimating a confidence of validity of a determination that the target point is noise.
  • 18. The method of claim 12, further comprising removing the target point from the set of points.
  • 19. The method of claim 12, wherein the electrical pulse signal has a first set of non-uniformly spaced pulses.
  • 20. The method of claim 19, wherein the one or more pulses of the first light signal have a pulse signature including the first set of non-uniformly spaced pulses.
  • 21. A non-transitory computer readable medium having processor code stored thereon including program code for performing operations comprising: emitting, by a light emitter, a first light signal according to an electrical pulse signal, the first light signal comprising one or more pulses; receiving, by a receiver, one or more return light signals reflected by an object; converting, by the receiver, the one or more return light signals into electrical signals, wherein said electrical signals are used to generate one or more points in a point cloud, wherein the point cloud includes a set of points representing a partial surface of the object; and identifying as noise a target point from the set of points representing the partial surface of the object by determining whether there is a coherence between the target point and a first neighboring point of the target point along a first direction and a coherence between the target point and a second neighboring point of the target point along a second direction, wherein the first neighboring point and the second neighboring point are from the set of points included in the point cloud.
  • 22. The non-transitory computer readable medium of claim 21, wherein the operations further comprise performing a dithering technique to randomize one or more firing times of the respective one or more pulses of the first light signal.
  • 23. The non-transitory computer readable medium of claim 22, wherein the operations further comprise combining the dithering technique with a pulse signature of the first light signal to reduce a likelihood of signal interference with other light signals.
  • 24. The non-transitory computer readable medium of claim 21, wherein the operations further comprise estimating a confidence of validity of a determination that the target point is noise.
  • 25. The non-transitory computer readable medium of claim 21, wherein the operations further comprise removing the target point from the set of points.
  • 26. The non-transitory computer readable medium of claim 21, wherein the electrical pulse signal has a first set of non-uniformly spaced pulses.
  • 27. The non-transitory computer readable medium of claim 26, wherein the one or more pulses of the first light signal have a pulse signature including the first set of non-uniformly spaced pulses.
  • 28. The non-transitory computer readable medium of claim 26, wherein two adjacent pulses of the first set of non-uniformly spaced pulses are separated by a randomized distance in time domain.
CROSS REFERENCE TO RELATED APPLICATION

The present application is a continuation of and claims priority to U.S. patent application Ser. No. 16/459,557, entitled “Interference Mitigation For Light Detection And Ranging,” filed Jul. 1, 2019.

US Referenced Citations (386)
Number Name Date Kind
3064252 Varela Nov 1962 A
3373441 Zadig Mar 1968 A
3551845 Zelina Dec 1970 A
3636250 Haeff Jan 1972 A
3686514 Dube et al. Aug 1972 A
3781111 Fletcher et al. Dec 1973 A
3862415 Harnden, Jr. et al. Jan 1975 A
3897150 Bridges et al. Jul 1975 A
4179216 Theurer et al. Dec 1979 A
4199697 Edwards Apr 1980 A
4201442 McMahon et al. May 1980 A
4212534 Bodlaj Jul 1980 A
4220103 Kasahara et al. Sep 1980 A
4477184 Endo Oct 1984 A
4516837 Soref et al. May 1985 A
4634272 Endo Jan 1987 A
4656462 Araki et al. Apr 1987 A
4681433 Aeschlimann Jul 1987 A
4700301 Dyke Oct 1987 A
4742337 Haag May 1988 A
4834531 Ward May 1989 A
4862257 Ulich Aug 1989 A
4895440 Cain et al. Jan 1990 A
4896343 Saunders Jan 1990 A
4902126 Koechner Feb 1990 A
4944036 Hyatt Jul 1990 A
4952911 D'Ambrosia et al. Aug 1990 A
4967183 D'Ambrosia et al. Oct 1990 A
5004916 Collins, Jr. Apr 1991 A
5006721 Cameron et al. Apr 1991 A
5023888 Bayston Jun 1991 A
5026156 Bayston et al. Jun 1991 A
5059008 Flood et al. Oct 1991 A
5175694 Amato Dec 1992 A
5177768 Crespo et al. Jan 1993 A
5210586 Grage et al. May 1993 A
5212533 Shibuya et al. May 1993 A
5241481 Olsen Aug 1993 A
5249157 Taylor Sep 1993 A
5291261 Dahl et al. Mar 1994 A
5309212 Clark May 1994 A
5314037 Shaw et al. May 1994 A
5319201 Lee Jun 1994 A
5365218 Otto Nov 1994 A
5463384 Juds Oct 1995 A
5465142 Krumes et al. Nov 1995 A
5515156 Yoshida et al. May 1996 A
5546188 Wangler et al. Aug 1996 A
5563706 Shibuya et al. Oct 1996 A
5572219 Silverstein et al. Nov 1996 A
5691687 Kumagai et al. Nov 1997 A
5710417 Joseph et al. Jan 1998 A
5757472 Wangler et al. May 1998 A
5757501 Hipp May 1998 A
5757677 Lennen May 1998 A
5789739 Schwarz Aug 1998 A
5793163 Okuda Aug 1998 A
5793491 Wangler et al. Aug 1998 A
5805468 Blohbaum Sep 1998 A
5847817 Zediker et al. Dec 1998 A
5877688 Morinaka et al. Mar 1999 A
5889479 Tabel Mar 1999 A
5895984 Renz Apr 1999 A
5903355 Schwarz May 1999 A
5903386 Mantravadi et al. May 1999 A
5923910 Nakahara et al. Jul 1999 A
5942688 Kimura et al. Aug 1999 A
5949530 Wetteborn Sep 1999 A
5953110 Burns Sep 1999 A
5991011 Damm Nov 1999 A
6034803 Sullivan et al. Mar 2000 A
6043868 Dunne Mar 2000 A
6069565 Stern et al. May 2000 A
6088085 Wetteborn Jul 2000 A
6100539 Blumcke et al. Aug 2000 A
6137566 Leonard et al. Oct 2000 A
6153878 Jakob et al. Nov 2000 A
6157294 Urai et al. Dec 2000 A
6201236 Juds Mar 2001 B1
6259714 Kinbara Jul 2001 B1
6297844 Schatz et al. Oct 2001 B1
6321172 Jakob et al. Nov 2001 B1
6327806 Paige Dec 2001 B1
6329800 May Dec 2001 B1
6335789 Kikuchi Jan 2002 B1
6365429 Kneissl et al. Apr 2002 B1
6396577 Ramstack May 2002 B1
6420698 Dimsdale Jul 2002 B1
6441363 Cook, Jr. et al. Aug 2002 B1
6441889 Patterson Aug 2002 B1
6442476 Poropat Aug 2002 B1
6473079 Kacyra et al. Oct 2002 B1
6504712 Hashimoto et al. Jan 2003 B2
6509958 Pierenkemper Jan 2003 B2
6593582 Lee et al. Jul 2003 B2
6621764 Smith Sep 2003 B1
6636300 Doemens et al. Oct 2003 B2
6646725 Eichinger et al. Nov 2003 B1
6650402 Sullivan et al. Nov 2003 B2
6664529 Pack et al. Dec 2003 B2
6665063 Jamieson et al. Dec 2003 B2
6670905 Orr Dec 2003 B1
6682478 Nakamura Jan 2004 B2
6710324 Hipp Mar 2004 B2
6742707 Tsikos et al. Jun 2004 B1
6747747 Hipp Jun 2004 B2
6759649 Hipp Jul 2004 B2
6789527 Sauler et al. Sep 2004 B2
6798527 Fukumoto et al. Sep 2004 B2
6812450 Hipp Nov 2004 B2
6876790 Lee Apr 2005 B2
6879419 Richman et al. Apr 2005 B2
6969558 Walston et al. Nov 2005 B2
7030968 D'Aligny et al. Apr 2006 B2
7041962 Dollmann et al. May 2006 B2
7089114 Huang Aug 2006 B1
7106424 Meneely et al. Sep 2006 B2
7129971 McCutchen Oct 2006 B2
7130672 Pewzner et al. Oct 2006 B2
7131586 Tsikos et al. Nov 2006 B2
7190465 Froehlich et al. Mar 2007 B2
7240314 Leung Jul 2007 B1
7248342 Degnan Jul 2007 B1
7281891 Smith et al. Oct 2007 B2
7295298 Willhoeft et al. Nov 2007 B2
7313424 Mayevsky et al. Dec 2007 B2
7315377 Holland et al. Jan 2008 B2
7319777 Morcom Jan 2008 B2
7345271 Boehlau et al. Mar 2008 B2
7358819 Rollins Apr 2008 B2
7373473 Bukowski et al. May 2008 B2
7408462 Pirkl et al. Aug 2008 B2
7480031 Mack Jan 2009 B2
7544945 Tan et al. Jun 2009 B2
7570793 Lages et al. Aug 2009 B2
7583364 Mayor et al. Sep 2009 B1
7589826 Mack et al. Sep 2009 B2
7619477 Segarra Nov 2009 B2
7623222 Benz et al. Nov 2009 B2
7640068 Johnson et al. Dec 2009 B2
7642946 Wong et al. Jan 2010 B2
7684590 Kampchen et al. Mar 2010 B2
7697581 Walsh et al. Apr 2010 B2
7741618 Lee et al. Jun 2010 B2
7746271 Furstenberg Jun 2010 B2
7868665 Tumer et al. Jan 2011 B2
7969558 Hall Jun 2011 B2
8042056 Wheeler et al. Oct 2011 B2
8077047 Humble et al. Dec 2011 B2
8139685 Simic et al. Mar 2012 B2
8203702 Kane et al. Jun 2012 B1
8274037 Ritter et al. Sep 2012 B2
8310653 Ogawa et al. Nov 2012 B2
8605262 Campbell et al. Dec 2013 B2
8675181 Hall Mar 2014 B2
8736818 Weimer et al. May 2014 B2
8767190 Hall Jul 2014 B2
8976340 Gilliland et al. Mar 2015 B2
8995478 Kobtsev et al. Mar 2015 B1
9063549 Pennecot et al. Jun 2015 B1
9069061 Harwit Jun 2015 B1
9069080 Stettner et al. Jun 2015 B2
9086273 Gruver et al. Jul 2015 B1
9093969 Gebeyehu et al. Jul 2015 B2
9110154 Bates et al. Aug 2015 B1
9151940 Chuang et al. Oct 2015 B2
9194701 Bosch Nov 2015 B2
RE45854 Gittinger et al. Jan 2016 E
9239959 Evans et al. Jan 2016 B1
9246041 Clausen et al. Jan 2016 B1
9285477 Smith et al. Mar 2016 B1
9286538 Chen et al. Mar 2016 B1
9310197 Gogolla et al. Apr 2016 B2
9383753 Templeton et al. Jul 2016 B1
9453914 Stettner et al. Sep 2016 B2
9529079 Droz et al. Dec 2016 B1
9772607 Decoux et al. Sep 2017 B2
RE46672 Hall Jan 2018 E
9964632 Droz et al. May 2018 B1
9983297 Hall et al. May 2018 B2
9989629 LaChapelle Jun 2018 B1
10003168 Villeneuve Jun 2018 B1
10018726 Hall et al. Jul 2018 B2
10048374 Hall et al. Aug 2018 B2
10094925 LaChapelle Oct 2018 B1
10109183 Franz et al. Oct 2018 B1
10120079 Pennecot et al. Nov 2018 B2
10126412 Eldada et al. Nov 2018 B2
10132928 Eldada et al. Nov 2018 B2
10309213 Barfoot et al. Jun 2019 B2
10330780 Hall et al. Jun 2019 B2
10386465 Hall et al. Aug 2019 B2
10393877 Hall et al. Aug 2019 B2
10436904 Moss et al. Oct 2019 B2
10545222 Hall et al. Jan 2020 B2
RE47942 Hall Apr 2020 E
10613203 Rekow et al. Apr 2020 B1
10627490 Hall et al. Apr 2020 B2
10627491 Hall et al. Apr 2020 B2
20010011289 Davis et al. Aug 2001 A1
20010017718 Ikeda et al. Aug 2001 A1
20020003617 Doemens et al. Jan 2002 A1
20020060784 Pack et al. May 2002 A1
20020117545 Tsikos et al. Aug 2002 A1
20030041079 Bellemore et al. Feb 2003 A1
20030043363 Jamieson et al. Mar 2003 A1
20030043364 Jamieson et al. Mar 2003 A1
20030057533 Lemmi et al. Mar 2003 A1
20030066977 Hipp et al. Apr 2003 A1
20030076485 Ruff et al. Apr 2003 A1
20030090646 Riegl et al. May 2003 A1
20030144011 Richards Jul 2003 A1
20030163030 Arriaga Aug 2003 A1
20040021852 DeFlumere Feb 2004 A1
20040066500 Gokturk et al. Apr 2004 A1
20040134879 Kochergin et al. Jul 2004 A1
20040150810 Muenter et al. Aug 2004 A1
20040213463 Morrison Oct 2004 A1
20040240706 Wallace et al. Dec 2004 A1
20040240710 Lages et al. Dec 2004 A1
20040247157 Lages et al. Dec 2004 A1
20050023353 Tsikos et al. Feb 2005 A1
20050168720 Yamashita et al. Aug 2005 A1
20050211893 Paschalidis Sep 2005 A1
20050232466 Kampchen et al. Oct 2005 A1
20050246065 Ricard Nov 2005 A1
20050248749 Kiehn et al. Nov 2005 A1
20050279914 Dimsdale et al. Dec 2005 A1
20060007350 Gao et al. Jan 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100783 Haberer et al. May 2006 A1
20060115113 Lages et al. Jun 2006 A1
20060132635 Land Jun 2006 A1
20060176697 Arruda Aug 2006 A1
20060186326 Ito Aug 2006 A1
20060197867 Johnson et al. Sep 2006 A1
20060231771 Lee et al. Oct 2006 A1
20060290920 Kampchen et al. Dec 2006 A1
20070035624 Lubard et al. Feb 2007 A1
20070071056 Chen Mar 2007 A1
20070121095 Lewis May 2007 A1
20070181810 Tan et al. Aug 2007 A1
20070201027 Doushkina et al. Aug 2007 A1
20070219720 Trepagnier et al. Sep 2007 A1
20070241955 Brosche Oct 2007 A1
20070272841 Wiklof Nov 2007 A1
20080002176 Krasutsky Jan 2008 A1
20080013896 Salzberg et al. Jan 2008 A1
20080074640 Walsh et al. Mar 2008 A1
20080079371 Kang et al. Apr 2008 A1
20080154495 Breed Jun 2008 A1
20080170826 Schaafsma Jul 2008 A1
20080186501 Xie Aug 2008 A1
20090010644 Varshneya et al. Jan 2009 A1
20090026503 Tsuda Jan 2009 A1
20090085901 Antony Apr 2009 A1
20090122295 Eaton May 2009 A1
20090142053 Varshneya et al. Jun 2009 A1
20090168045 Lin et al. Jul 2009 A1
20090218475 Kawakami et al. Sep 2009 A1
20090245788 Varshneya et al. Oct 2009 A1
20090323737 Ensher et al. Dec 2009 A1
20100006760 Lee et al. Jan 2010 A1
20100020306 Hall Jan 2010 A1
20100045965 Meneely Feb 2010 A1
20100046953 Shaw et al. Feb 2010 A1
20100134596 Becker Jun 2010 A1
20100188722 Yamada et al. Jul 2010 A1
20100198487 Vollmer et al. Aug 2010 A1
20100204964 Pack et al. Aug 2010 A1
20100239139 Hunt et al. Sep 2010 A1
20100265077 Humble et al. Oct 2010 A1
20100271615 Sebastian et al. Oct 2010 A1
20100302247 Perez Dec 2010 A1
20100302528 Hall Dec 2010 A1
20110028859 Chian Feb 2011 A1
20110040482 Brimble et al. Feb 2011 A1
20110211188 Juenemann et al. Sep 2011 A1
20110216304 Hall Sep 2011 A1
20110305250 Chann et al. Dec 2011 A1
20120038903 Weimer et al. Feb 2012 A1
20120195597 Malaney Aug 2012 A1
20120287417 Mimeault Nov 2012 A1
20130024176 Woodford Jan 2013 A2
20130050144 Reynolds Feb 2013 A1
20130050486 Omer et al. Feb 2013 A1
20130070239 Crawford et al. Mar 2013 A1
20130094960 Bowyer et al. Apr 2013 A1
20130151198 Brown Jun 2013 A1
20130168673 Yu et al. Jul 2013 A1
20130206967 Shpunt et al. Aug 2013 A1
20130241761 Cooper et al. Sep 2013 A1
20130242283 Bailey et al. Sep 2013 A1
20130258312 Lewis Oct 2013 A1
20130286404 Cenko et al. Oct 2013 A1
20130300479 Thibault Nov 2013 A1
20130314711 Cantin et al. Nov 2013 A1
20130336375 Ranki et al. Dec 2013 A1
20130342366 Kiefer et al. Dec 2013 A1
20140063483 Li Mar 2014 A1
20140071234 Millett Mar 2014 A1
20140078519 Steffey et al. Mar 2014 A1
20140104592 Tien et al. Apr 2014 A1
20140139639 Wagner May 2014 A1
20140220700 Alexander Aug 2014 A1
20140240317 Go et al. Aug 2014 A1
20140240721 Herschbach Aug 2014 A1
20140253369 Kelley et al. Sep 2014 A1
20140267848 Wu Sep 2014 A1
20140274093 Abdelmonem Sep 2014 A1
20140347650 Bosch Nov 2014 A1
20150015895 Bridges et al. Jan 2015 A1
20150035437 Panopoulos et al. Feb 2015 A1
20150055117 Pennecot et al. Feb 2015 A1
20150101234 Priest et al. Apr 2015 A1
20150109290 Chang et al. Apr 2015 A1
20150116695 Bartolome et al. Apr 2015 A1
20150131080 Retterath et al. May 2015 A1
20150144806 Jin et al. May 2015 A1
20150185325 Park et al. Jul 2015 A1
20150202939 Stettner et al. Jul 2015 A1
20150219765 Mead et al. Aug 2015 A1
20150226853 Seo et al. Aug 2015 A1
20150293224 Eldada et al. Oct 2015 A1
20150293228 Retterath et al. Oct 2015 A1
20150303216 Tamaru Oct 2015 A1
20160003946 Gilliland et al. Jan 2016 A1
20160009410 Derenick et al. Jan 2016 A1
20160014309 Ellison et al. Jan 2016 A1
20160021713 Reed Jan 2016 A1
20160049058 Allen et al. Feb 2016 A1
20160117431 Kim et al. Apr 2016 A1
20160161600 Eldada et al. Jun 2016 A1
20160191173 Malaney Jun 2016 A1
20160252617 Tehrani Niknejad et al. Sep 2016 A1
20160259038 Retterath et al. Sep 2016 A1
20160300484 Torbett Oct 2016 A1
20160306032 Schwarz et al. Oct 2016 A1
20160306043 Moss Oct 2016 A1
20160313445 Bailey et al. Oct 2016 A1
20160327646 Scheim et al. Nov 2016 A1
20160363659 Mindell et al. Dec 2016 A1
20160365846 Wyland Dec 2016 A1
20170146639 Carothers May 2017 A1
20170146640 Hall et al. May 2017 A1
20170153319 Villeneuve et al. Jun 2017 A1
20170214861 Rachlin et al. Jul 2017 A1
20170219695 Hall et al. Aug 2017 A1
20170220876 Gao et al. Aug 2017 A1
20170220887 Fathi Aug 2017 A1
20170242102 Dussan et al. Aug 2017 A1
20170269198 Hall et al. Sep 2017 A1
20170269209 Hall et al. Sep 2017 A1
20170269215 Hall et al. Sep 2017 A1
20170299721 Eichenholz et al. Oct 2017 A1
20170301104 Qian Oct 2017 A1
20170350983 Hall et al. Dec 2017 A1
20180011196 Pennecot Jan 2018 A1
20180019155 Tsang et al. Jan 2018 A1
20180058197 Barfoot et al. Mar 2018 A1
20180059219 Irish et al. Mar 2018 A1
20180074382 Lee et al. Mar 2018 A1
20180100924 Brinkmeyer Apr 2018 A1
20180106902 Mase et al. Apr 2018 A1
20180168539 Singh et al. Jun 2018 A1
20180267151 Hall et al. Sep 2018 A1
20180275249 Campbell et al. Sep 2018 A1
20180276793 Kwon Sep 2018 A1
20180284227 Hall et al. Oct 2018 A1
20180284274 LaChapelle Oct 2018 A1
20180321360 Hall et al. Nov 2018 A1
20180364098 McDaniel et al. Dec 2018 A1
20190001442 Unrath et al. Jan 2019 A1
20190011563 Hall et al. Jan 2019 A1
20190033459 Tisdale Jan 2019 A1
20190056497 Pacala et al. Feb 2019 A1
20190178991 Hall et al. Jun 2019 A1
20190339365 Hall et al. Nov 2019 A1
20190361092 Hall et al. Nov 2019 A1
20190369257 Hall et al. Dec 2019 A1
20190369258 Hall et al. Dec 2019 A1
20200064452 Avlas et al. Feb 2020 A1
20200142070 Hall et al. May 2020 A1
20200144971 Pinto et al. May 2020 A1
20200166613 Hall et al. May 2020 A1
20200191915 Hall et al. Jun 2020 A1
Foreign Referenced Citations (1)
Number Date Country
2018129408 Jul 2018 WO
Non-Patent Literature Citations (73)
Entry
U.S. Appl. No. 15/941,302, filed Mar. 30, 2018.
U.S. Appl. No. 16/510,680, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 16/510,710, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 16/510,749, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 15/420,384, filed Jan. 31, 2017, Hall et al.
U.S. Appl. No. 16/030,780, filed Jul. 9, 2018, Hall et al.
U.S. Appl. No. 11/777,802, filed Jul. 13, 2007, Hall.
U.S. Appl. No. 13/109,901, filed May 17, 2011, Hall et al.
U.S. Appl. No. 15/180,580, filed Jun. 13, 2016, Hall et al.
U.S. Appl. No. 15/700,543, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,558, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,571, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,836, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,844, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,959, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,965, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 16/912,648, filed Jun. 25, 2020, Hall et al.
U.S. Appl. No. 15/926,095, filed Mar. 30, 2018, Hall et al.
U.S. Appl. No. 15/464,227, filed Mar. 30, 2017, Hall et al.
U.S. Appl. No. 15/464,221, filed Mar. 30, 2017, Hall et al.
U.S. Appl. No. 15/974,527, filed May 8, 2018, Hall et al.
U.S. Appl. No. 16/748,498, filed Jan. 21, 2020, Hall et al.
U.S. Appl. No. 15/610,975, filed Jun. 1, 2017, Hall et al.
U.S. Appl. No. 16/546,131, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/842,491, filed Apr. 7, 2020, Hall et al.
U.S. Appl. No. 16/546,184, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/546,206, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/909,306, filed Jun. 23, 2020, Hall et al.
U.S. Appl. No. 15/339,790, filed Oct. 31, 2016, Hall et al.
U.S. Appl. No. 16/854,755, filed Apr. 21, 2020, Hall et al.
U.S. Appl. No. 16/905,843, filed Jun. 18, 2020, Hall et al.
U.S. Appl. No. 16/905,849, filed Jun. 18, 2020, Hall et al.
U.S. Appl. No. 16/909,846, filed Jun. 23, 2020, Hall et al.
U.S. Appl. No. 15/835,983, filed Dec. 8, 2017, Hall et al.
U.S. Appl. No. 16/459,557, filed Jul. 1, 2019, Rekow et al.
U.S. Appl. No. 16/112,273, filed Aug. 24, 2018, Avlas et al.
U.S. Appl. No. 16/181,523, filed Nov. 6, 2018, Pinto et al.
U.S. Appl. No. 16/241,849, filed Jan. 7, 2019, Hall et al.
U.S. Appl. No. 16/241,963, filed Jan. 7, 2019, Hall et al.
Quanergy Systems, Inc. v. Velodyne Lidar, Inc. (N.D. Cal.), Docket No. 5:16-cv-05251, filed Sep. 13, 2016, U.S. Pat. No. 7,969,558.
Velodyne Lidar, Inc. v. Hesai Photonics Technology Co., Ltd. (N.D. Cal.), Docket No. 5:16-cv-04742, filed Aug. 13, 2019, U.S. Pat. No. 7,969,558.
Velodyne Lidar, Inc. v. Suteng Innovation Technology Co., Ltd. (N.D. Cal.), Docket No. 5:16-cv-04746, filed Aug. 13, 2019, U.S. Pat. No. 7,969,558.
In re Certain Rotating 3-D Lidar Devices, Components Thereof, and Sensing Systems Containing the Same (ITC), Investigation No. ITC-337-TA-1173, filed Aug. 15, 2019, U.S. Pat. No. 7,969,558.
Petition for Inter Partes Review (USPTO Patent Trial and Appeal Board), Case No. IPR2018-00255, filed Nov. 29, 2017, U.S. Pat. No. 7,969,558.
Petition for Inter Partes Review (USPTO Patent Trial and Appeal Board), Case No. IPR2018-000255, filed Nov. 29, 2017, U.S. Pat. No. 7,969,558.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/059452, dated Jan. 16, 2019, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/025395, dated Jun. 25, 2018, 14 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015874, dated May 23, 2017, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015877, dated Apr. 13, 2017, 13 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2007/073490, (dated 2008), 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023261, dated May 26, 2017, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/051497, dated Nov. 28, 2018, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046422, dated Dec. 3, 2019, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/023283, dated Jun. 1, 2018, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023259, dated May 31, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015869, dated Apr. 10, 2017, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/050934, dated Nov. 20, 2018, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2010/037129, dated Jul. 27, 2010, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/036865, dated Sep. 26, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023262, dated Jun. 5, 2017, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/047543, dated Nov. 27, 2017, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/059062, dated Jan. 16, 2019, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046573, dated Nov. 15, 2019, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/016259, dated Apr. 26, 2019, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/051729, dated Nov. 20, 2019, 7 pages.
PCT Search Report and Written Opinion (Corrected), App. No. PCT/US2020/026925, dated May 12, 2020, 5 pages.
PCT Search Report and Written Opinion, App. No. PCT/US2020/012633, dated Jun. 2, 2020, 13 pages.
PCT Search Report and Written Opinion, App. No. PCT/US2020/012635, dated Jun. 4, 2020, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/031682, dated Sep. 17, 2018, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/035427, dated Aug. 29, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046412, dated Jun. 24, 2020, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046419, dated Oct. 29, 2019, 14 pages.
EP20834220.4, “Extended European Search Report”, dated Sep. 22, 2023, 9 pages.
Related Publications (1)
Number Date Country
20210003681 A1 Jan 2021 US
Continuations (1)
Number Date Country
Parent 16459557 Jul 2019 US
Child 16841506 US