Systems and methods for a configurable sensor system

Information

  • Patent Grant
  • Patent Number
    12,061,263
  • Date Filed
    Monday, January 7, 2019
  • Date Issued
    Tuesday, August 13, 2024
Abstract
The present disclosure relates generally to systems and methods for generating, processing and correlating data from multiple sensors in an autonomous navigation system, and more particularly to the utilization of configurable and dynamic sensor modules within light detection and ranging systems that enable an improved correlation between sensor data as well as configurability and responsiveness of the system to its surrounding environment.
Description
BACKGROUND
A. Technical Field

The present disclosure relates generally to systems and methods for generating, processing and correlating data from multiple sensors in an autonomous navigation system, and more particularly to the utilization of configurable and dynamic sensor modules within light detection and ranging (hereinafter, “LIDAR”) systems that enable an improved correlation between sensor data as well as configurability and responsiveness of the system to its surrounding environment.


B. Background

One skilled in the art will understand the importance of accurate and timely analysis of sensor data within autonomous navigation systems. Autonomous navigation requires that a computerized system receive data from sensors, form a sufficiently accurate representation of its environment, and make decisions based on that data in real time. Any error in the interpretation of the sensor data, or any delay in initiating a responsive action to it, can have undesired consequences. Modern autonomous systems must process data from multiple discrete sensor systems and interpret their combined outputs. The quantity of data from these various sensors can be immense, and simply processing and correlating the outputs from different sensors may require a significant amount of processing power and time.


Implementations of a LiDAR or mixed-sensor system may have different operational requirements such as field-of-view, range, response rate, etc. Adapting these systems to different types of autonomous navigation systems and vehicles may be challenging, as the environment and intended use of the system can present varying performance requirements. Simply positioning and integrating sensors within different types of vehicles may present problems for many prior-art, sensor-based navigation systems. For example, sensors cannot be easily collocated, because data taken from different vantage points is subject to parallax error. In addition, these sensors may also have different rates of data collection. Improper correlation of this sensor data may result in motion errors or conflicts across the sensor data. Accordingly, the autonomous system must first process and interpret data from the relevant sensors, and then correlate them with one another, before any autonomous navigation decision can be made. Any unnecessary delay within the decision-making process may result in a failure of the autonomous driving system.


Accordingly, what is needed are systems and methods that provide configurable, accurate, timely and efficient solutions for the reception and processing of sensor data across a plurality of sensors installed within an autonomous navigation system.





BRIEF DESCRIPTION OF THE DRAWINGS

References will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments. Items in the figures are not to scale.



FIG. 1 depicts the operation of a LiDAR system according to embodiments of the present document.



FIG. 2A illustrates the operation of a LiDAR system and multiple return light signals according to embodiments of the present document.



FIG. 2B depicts a LIDAR system with an oscillating mirror according to embodiments of the present document.



FIG. 3A depicts a distributed sensor system installed in an automobile utilizing a suite of sensors coupled to a microcontroller (hereinafter, “MCU”) according to embodiments of the present document.



FIG. 3B depicts the framework for a sensor system according to embodiments of the current disclosure.



FIG. 3C depicts the operation of an MCU in an autonomous driving system utilizing sensor modules and a sensor bus according to embodiments of the current disclosure.



FIGS. 3D and 3E illustrate methods for dynamically configuring different sensors and sensor types within an autonomous navigation system according to embodiments of the current disclosure.



FIG. 3F illustrates a method for updating calibration parameters in a calibration engine according to embodiments of the current disclosure.



FIG. 4A and FIG. 4B depict configurable sensor architectures according to embodiments of the current disclosure.



FIG. 4C illustrates a Lissajous scan pattern and resolution according to embodiments of the present disclosure. FIGS. 4D, 4E, and 4F illustrate scan resolutions for a field of view (FOV) according to embodiments of the present disclosure.



FIG. 4G illustrates a specific scanning pattern for a sensor module comprising eight sensors according to embodiments of the present disclosure.



FIG. 4H and FIG. 4I illustrate exemplary sensor square and pie wedge configurations according to embodiments of the present disclosure.



FIG. 4J illustrates a sensor system that supports detection of objects with various sensor types, including LIDAR, infrared radiation (IR), and ambient light modalities, to detect range and reflectivity, temperature, and color, respectively, according to embodiments of the present disclosure.



FIG. 5 depicts a simplified block diagram of a computing device/information handling system for an automotive application, in accordance with embodiments of the present document.





DETAILED DESCRIPTION OF EMBODIMENTS

In the following description, for purposes of explanation, specific details are set forth in order to provide an understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these details. Furthermore, one skilled in the art will recognize that embodiments of the present invention, described below, may be implemented in a variety of ways, such as a process, an apparatus, a system, a device, or a method on a tangible computer-readable medium.


Components, or modules, shown in diagrams are illustrative of exemplary embodiments of the invention and are meant to avoid obscuring the invention. It shall also be understood that throughout this discussion components may be described as separate functional units, which may comprise sub-units, but those skilled in the art will recognize that various components, or portions thereof, may be divided into separate components or may be integrated together, including integrated within a single system or component. It should be noted that functions or operations discussed herein may be implemented as components. Components may be implemented in software, hardware, or a combination thereof.


Furthermore, connections between components or systems within the figures are not intended to be limited to direct connections. Rather, data between these components may be modified, re-formatted, or otherwise changed by intermediary components. Also, additional or fewer connections may be used. It shall also be noted that the terms “coupled,” “connected,” or “communicatively coupled” shall be understood to include direct connections, indirect connections through one or more intermediary devices, and wireless connections.


Reference in the specification to “one embodiment,” “preferred embodiment,” “an embodiment,” or “embodiments” means that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention and may be in more than one embodiment. Also, the appearances of the above-noted phrases in various places in the specification are not necessarily all referring to the same embodiment or embodiments.


The use of certain terms in various places in the specification is for illustration and should not be construed as limiting. A service, function, or resource is not limited to a single service, function, or resource; usage of these terms may refer to a grouping of related services, functions, or resources, which may be distributed or aggregated.


The terms “include,” “including,” “comprise,” and “comprising” shall be understood to be open terms and any lists that follow are examples and not meant to be limited to the listed items. Any headings used herein are for organizational purposes only and shall not be used to limit the scope of the description or the claims. Each reference mentioned in this patent document is incorporated by reference herein in its entirety.


Furthermore, one skilled in the art shall recognize that: (1) certain steps may optionally be performed; (2) steps may not be limited to the specific order set forth herein; (3) certain steps may be performed in different orders; and (4) certain steps may be done concurrently.


A light detection and ranging system, such as a LIDAR system, may be a tool to measure the shape and contour of the environment surrounding the system. LIDAR systems may be applied to numerous applications including both autonomous navigation and aerial mapping of a surface. LIDAR systems emit a light pulse that is subsequently reflected off an object within the environment in which a system operates. The time each pulse travels from being emitted to being received may be measured (i.e., time-of-flight, “TOF”) to determine the distance between the object and the LIDAR system. The distance calculation rests on the physics of light propagation and optics.
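By way of illustration, the TOF-to-range conversion may be sketched as follows (a minimal Python sketch; the constant and function names are illustrative and not part of this disclosure):

    # Minimal sketch: convert a measured round-trip time-of-flight to range.
    # Assumes propagation at the speed of light in vacuum.
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def range_from_tof(tof_seconds: float) -> float:
        # The pulse travels to the object and back, so halve the round trip.
        return SPEED_OF_LIGHT_M_S * tof_seconds / 2.0

    # Example: a 1 microsecond round trip corresponds to roughly 149.9 m.
    print(range_from_tof(1e-6))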


In a LIDAR system, light may be emitted from a rapidly firing laser. Laser light travels through a medium and reflects off points of surfaces in the environment such as buildings, tree branches and vehicles. The reflected light energy returns to a LIDAR transceiver (detector) where it is recorded and used to map the environment.



FIG. 1 depicts the operation of LiDAR components 102 and data analysis & interpretation 109 according to embodiments of the present disclosure. LiDAR components 102 may comprise a transmitter 104 that transmits emitted light signal 110, receiver 106 comprising a detector, and system control and data acquisition 108. LiDAR components 102 may be referred to as a LIDAR transceiver. Emitted light signal 110 propagates through a medium and reflects off object 112. Return light signal 114 propagates through the medium and is received by receiver 106. System control and data acquisition 108 may control the light emission by transmitter 104, and the data acquisition may record the return light signal 114 detected by receiver 106. Data analysis & interpretation 109 may receive an output via connection 116 from system control and data acquisition 108 and perform data analysis functions. Connection 116 may be implemented with a wireless or non-contact communication method. Transmitter 104 and receiver 106 may include optical lenses and mirrors (not shown). Transmitter 104 may emit a laser beam having a plurality of pulses in a particular sequence. In some embodiments, light detection and ranging components 102 and data analysis & interpretation 109 comprise a LIDAR system. A design element of receiver 106 is a horizontal field of view (hereinafter, “FOV”) and a vertical FOV. One skilled in the art will recognize that the FOV effectively defines the visibility area of the specific LiDAR system. The horizontal and vertical FOVs may be defined by a single LiDAR sensor or may relate to a plurality of configurable sensors (which may be exclusively LiDAR sensors or may comprise different types of sensors). The FOV may be considered a scanning area for a LIDAR system. A scanning mirror may be utilized to obtain a scanned FOV.



FIG. 2A illustrates the operation 200 of LiDAR system 202 including multiple return light signals: (1) return signal 203 and (2) return signal 205 according to embodiments of the present document. Due to the laser's beam divergence, a single laser firing often hits multiple objects producing multiple returns. The light detection and ranging system 202 may analyze multiple returns and may report either the strongest return, the last return, or both returns. Per FIG. 2A, light detection and ranging system 202 emits a laser in the direction of near wall 204 and far wall 208. As illustrated, the majority of the beam hits the near wall 204 at area 206 resulting in return signal 203, and another portion of the beam hits the far wall 208 at area 210 resulting in return signal 205. Return signal 203 may have a shorter TOF and a stronger received signal strength compared with return signal 205. In both single and multiple return LIDAR systems, it is important that the return signal is accurately associated with the transmitted light signal so that an accurate TOF is calculated.
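As an illustrative sketch of this selection logic (the data structure below is an assumption for exposition, not a structure defined by this disclosure):

    # Hedged sketch of multi-return selection in Python.
    from dataclasses import dataclass

    @dataclass
    class LidarReturn:
        tof_s: float      # time of flight in seconds
        strength: float   # received signal strength

    def strongest_return(returns: list) -> LidarReturn:
        return max(returns, key=lambda r: r.strength)

    def last_return(returns: list) -> LidarReturn:
        # The last return has the longest TOF (the farthest surface hit).
        return max(returns, key=lambda r: r.tof_s)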


Some embodiments of a LIDAR system may capture distance data in a 2-D (i.e., single plane) point cloud manner. These LIDAR systems are often used in industrial applications and may be repurposed for surveying, mapping, autonomous navigation, and other uses. Some embodiments of these devices rely on the use of a single laser emitter/detector pair combined with some type of moving mirror to effect scanning across at least one plane. This mirror not only reflects the emitted light from the diode but may also reflect the return light to the detector. Use of an oscillating mirror in this application may be a means of achieving 90-180-360 degrees of azimuth (horizontal) view while simplifying both the system design and manufacturability. Many applications require more data than just a single 2-D plane. The 2-D point cloud may be expanded to form a 3-D point cloud, in which multiple 2-D clouds are used, each pointing at a different elevation (vertical) angle. Design elements of the receiver of light detection and ranging system 202 include the horizontal FOV and the vertical FOV.



FIG. 2B depicts a LIDAR system 250 with an oscillating mirror according to embodiments of the present document. LIDAR system 250 employs a single laser emitter/detector combined with an oscillating mirror to effectively scan across a plane. Distance measurements performed by such a system are effectively two-dimensional (i.e., planar), and the captured distance points are rendered as a 2-D (i.e., single plane) point cloud. In some embodiments, but without limitations, oscillating mirrors are oscillated at very fast speeds (e.g., thousands of cycles per minute).


LIDAR system 250 comprises laser electronics 252, which comprises a single light emitter and light detector. The emitted laser signal 251 may be directed to a fixed mirror 254, which reflects the emitted laser signal 251 to oscillating mirror 256. As oscillating mirror 256 “oscillates”, the emitted laser signal 251 may reflect off object 258 in its propagation path. The reflected signal 253 may be coupled to the detector in laser electronics 252 via the oscillating mirror 256 and fixed mirror 254. Design elements of the receiver of LIDAR system 250 include the horizontal FOV and the vertical FOV, which defines a scanning area.



FIG. 3A depicts a distributed sensor system 300 installed in an automobile utilizing a suite of sensors coupled to an MCU 302 according to embodiments of the present disclosure. The suite of sensors includes sensor module 304, sensor module 306, sensor module 308, sensor module 310 and sensor module 312. The term “sensor module” is intended to be broadly defined and includes implementations of single sensor modules and multi-sensor modules. In addition, the types of sensor(s) within a sensor module may vary depending on the configuration of the system. In certain instances, a sensor module may comprise a single sensor (hereinafter, “single sensor module”), such as a LiDAR sensor, or multiple sensors (hereinafter, “multi-sensor module”). A multi-sensor module may comprise a plurality of integrated sensors, a plurality of discrete sensors or a combination thereof. The multi-sensor module may also comprise a plurality of LiDAR sensors or a plurality of different types of sensors that are correlated within the module. As shown in FIG. 3A, the suite of sensor modules may be distributed in a variety of locations on the vehicle. Correlated sensor data from the various sensor modules are provided to the MCU 302 for analysis and decision processing. The connectivity between the sensor modules and the MCU 302 is provided by a sensor bus that may transmit the different sensor data in a serial manner (there may be other embodiments in which sensor data is transmitted on a parallel bus).


As previously described, a sensor module may comprise a single sensor or multiple sensors and may support various types of sensors, such as a LIDAR transceiver, thermal/far IR sensor, visible/near IR sensor or other types of sensor known to one of skill in the art. The sensor structure may have various shapes, including modular designs that are rectangular or wedge-shaped, that may be tiled together and/or stacked, and that may allow for a design that wraps around corners. These different sensor shapes allow configurability of the sensor module, including configurability of FOV, sensor range, etc. Based on the particular configuration of the sensor module and corresponding FOV, different scan patterns and resolutions may be implemented.


MCU 302 may be coupled to an Autonomous Driving System Control Unit (hereinafter, “ADSCU”) 301. In certain embodiments, the ADSCU 301 may provide sensor instructions and information to MCU 302.



FIG. 3B depicts the framework for a sensor system 320 according to embodiments of the current disclosure. Sensor system 322 may be supported by MCU 324 and its associated software. Sensor system 322 may include scan mirror 326, ASICs 328, firmware 330 and sensors 332. In some embodiments, scan mirror 326 may be a dual axis resonant scanning mirror. In some embodiments, sensors 332 may support a combination of sensor modules as described above and may include various sensor types including LIDAR, color (RGB), thermal (far-IR) or other sensor types known to one of skill in the art. The sensor system 320 is able to receive data signals from a combination of sensor modules, correlate the sensor data and process the correlated sensor data in a timely manner in order to make decisions based thereon.


In order for autonomous vehicles to perceive their surrounding environment and react accordingly, a plurality of techniques may be applied to the sensor system to collate data from the multiple sensor modules. In particular, it may be necessary to collate the data from the sensor modules for dynamic and spatial analysis/inference, which means their differences are decoupled, and digital information can be transmitted, stored and computed in a way that the vehicle and its operating system can efficiently process and act on the different sensor data. In this regard, data from the distributed sensors can be multiplexed to provide a unified data packet and coupled via a sensor bus to a microcontroller.



FIG. 3C depicts the operation of an MCU 348 in an autonomous driving system 340 utilizing sensor module 352 and bus 358 according to embodiments of the disclosure. As illustrated, an object 341 within the autonomous navigation environment is detected by one or more sensor modules 352. As previously described, the structure and type of sensor(s) within the sensor module 352 may vary based on design and/or preference.


The autonomous driving system 340 may support multiple configurations and redundancies based on the number, types and locations of sensor modules 352 installed around the vehicle. Sensor modules 352 may be activated based on the application and external conditions. For example, when an automobile is being driven on an open highway, fewer sensors and/or sensor modules may be activated relative to when an automobile is being driven within heavy traffic. Additionally, sensors and/or sensor modules may be activated based on a particular mode in which an automobile is operating. For example, particular sensors may be activated if a vehicle is operating in a pilot mode as compared to an autonomous mode. This dynamic activation of sensors is another aspect of the configurability of the sensor network, which allows the system to be dynamically adapted to its environment both at installation as well as during operation.


Sensor module(s) 352 may detect an object 341 across a plurality of sensors and separately couple their detected data signals (shown as data streams 1 thru n) 354 to multiplexer 356. Multiplexer 356 combines the channels of different sensed data and generates a unified data packet correlating the data from each of the sensors. In some embodiments, the unified data packet comprises range and reflectivity data from a LIDAR transceiver, color/RGB data from a camera, and temperature data from a far-infrared detector. In other embodiments, sensors from other regions of the electromagnetic spectrum, such as radar, or acoustic sensors, such as sonar, may be included. One skilled in the art will recognize that the sensor module 352 may include various combinations of sensor module(s), sensor types and sensor configurations. The unified data packet is coupled to a bus 358, which is typically serial but may also be parallel in nature.
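A minimal sketch of such a unified data packet follows; the field names and per-channel dictionaries are illustrative assumptions rather than definitions from this disclosure:

    # Hedged sketch: bundle correlated per-channel samples into one packet.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class UnifiedDataPacket:
        timestamp_s: float                    # shared acquisition time
        range_m: float                        # LIDAR range
        reflectivity: float                   # LIDAR reflectivity
        rgb: Optional[Tuple[int, int, int]]   # camera color, if present
        temperature_c: Optional[float]        # far-IR temperature, if present

    def multiplex(lidar: dict, camera: dict, far_ir: dict) -> UnifiedDataPacket:
        # The inputs are assumed to be spatially and temporally correlated.
        return UnifiedDataPacket(
            timestamp_s=lidar["t"],
            range_m=lidar["range"],
            reflectivity=lidar["reflectivity"],
            rgb=camera.get("rgb"),
            temperature_c=far_ir.get("temperature"),
        )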


The data from the multiple sensors and/or sensor modules may be multiplexed and coupled via bus 358 to a microcontroller MCU 348. MCU 348 interacts with the ADSCU 342 to receive the configuration and parameters for data acquisition from sensors.


In certain embodiments, the MCU 348 may receive external conditions and information about the motion of the car. MCU 348 comprises data processing element 349, demultiplexer 350, calibration engine 351 and driver 353. In certain embodiments where the bus is serial, the de-multiplexer 350 receives the data serially from multiple sensor modules and uses the calibration parameters from the calibration engine to transform the data on a per-sensor-channel basis, as if it were coming from a single sensor. Calibration engine 351 provides the transforms between different sensors and/or sensor modules. In certain examples, these transforms are initialized to factory settings and constantly updated over time. The data processing element 349 comprises single or multiple embedded algorithms for computing information such as object detection, velocity estimation, and localization to roads and external maps. Driver 353 is responsible for activating the sensors and/or sensor modules of interest, and also for providing the clock triggers.


The demultiplexer 350 de-multiplexes the unified serial data packet of sensor data and associates the data with a corresponding sensor and/or sensor module. Thereafter, this data is provided to the calibration engine 351, which generates transform information based on calibration parameters received from ADSCU 342. The demultiplexer 350 also receives the spatial transform information and integrates it with the de-multiplexed unified serial data packet of sensor data into a particular format such as a point cloud format.
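One way to picture this integration step is the following sketch, which assumes 4x4 homogeneous matrices as the calibration engine's transform information (an assumption for illustration only):

    # Hedged sketch: apply per-channel transforms and build a point cloud.
    import numpy as np

    def integrate_to_point_cloud(samples: list, transforms: dict) -> np.ndarray:
        points = []
        for s in samples:
            # Per-channel calibration transform from the calibration engine.
            T = transforms[s["channel"]]
            local = np.array([*s["xyz"], 1.0])   # homogeneous coordinates
            points.append((T @ local)[:3])       # into the common vehicle frame
        return np.stack(points)                  # point cloud of shape (n, 3)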


As previously noted in FIG. 3A, the ADSCU 342 may provide sensor instructions to MCU 302. In certain embodiments, ADSCU 342 is the computer in the automobile and is an element manufactured into the vehicle. As shown in FIG. 3C, ADSCU 342 receives an input in the form of a point cloud from data processing 349, a component of MCU 348. In certain embodiments, the ADSCU 342 may generate calibration parameters including maps 343, odometer 344, and lighting conditions 345. Other embodiments may have other calibration parameters and utilize a different mix of calibration parameters. In yet other embodiments, the odometer, lighting conditions and external map may be provided to the MCU 348 from another device within the vehicle. ADSCU 342 may also generate sensor configurations 346 including sensor type configurations, field of view, frame rate and region of interest. The region of interest may be, for example, a pedestrian crosswalk or a driving lane. Via a region of interest identification method, the autonomous driving system 340 can filter out large amounts of unwanted raw data before the actual tracking. Effectively, MCU 348 homogenizes and decouples the different types of sensor data. With dynamic feedback from the ADSCU 342 in the form of calibration parameters and sensor configuration, MCU 348 can dynamically configure sensors and/or sensor modules across different configurations and space in an autonomous automobile environment.



FIGS. 3D and 3E illustrate methods 360 and 361 for dynamically configuring multi-sensor modules across different types of sensors and space according to embodiments of the current disclosure. The methods comprise the following steps (an illustrative sketch of the full cycle follows the list):

    • Receive at MCU (Driver) sensor configuration parameters and receive at MCU (calibration engine) calibration parameters from ADSCU (step 362)
    • Send from MCU (Driver) configuration parameters to activate selective single sensor module or multi-sensor modules (step 364)
    • Acquire data from an object within the environment by the selected sensor module(s) (step 366)
    • Multiplex the sensor data to generate a unified data packet (step 368)
    • Send the unified data packet via a bus to MCU (step 370)
    • In the MCU, de-multiplex unified data packet into homogeneous sensor data (step 372)
    • In the MCU, send the homogeneous sensor data to a calibration engine (step 374)
    • In the MCU, generate transform information in the calibration engine and send transform information to de-multiplexer (step 376)
    • In the MCU, integrate, by the de-multiplexer, the homogeneous sensor data and the transform data, and send to data processing (step 378)
    • Generate by data processing of the MCU, a point cloud comprising the homogeneous sensor data and the transform data and send to ADSCU (step 380)
    • In the ADSCU, determine/adjust control of the vehicle based on the point cloud and generate updated sensor configurations and calibration parameters (step 382)
    • Repeat step 362
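The following pseudocode sketches one pass through this cycle; every object and method name is an illustrative assumption, not an interface defined by this disclosure:

    def acquisition_cycle(adscu, mcu, sensor_modules):
        # Steps 362-364: receive configuration and calibration parameters
        # from the ADSCU and activate the selected sensor modules.
        config = adscu.sensor_configuration()
        calib = adscu.calibration_parameters()
        active = mcu.driver.activate(sensor_modules, config)

        # Steps 366-370: acquire data, multiplex it into a unified data
        # packet and send it over the bus to the MCU.
        packet = mcu.bus.receive(module.acquire() for module in active)

        # Steps 372-378: demultiplex into homogeneous sensor data, update
        # the calibration engine and integrate the transform information.
        homogeneous = mcu.demultiplexer.demux(packet)
        transforms = mcu.calibration_engine.update(homogeneous, calib)
        integrated = mcu.demultiplexer.integrate(homogeneous, transforms)

        # Steps 380-382: generate the point cloud and let the ADSCU adjust
        # control and produce updated parameters; the cycle then repeats.
        adscu.update_control(mcu.data_processing.point_cloud(integrated))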



FIG. 3F illustrates a method 390 for updating calibration parameters in a calibration engine according to embodiments of the current disclosure. The method comprises the following steps (an illustrative sketch follows the list):

    • Receive the homogeneous sensor data from de-multiplexer (step 392)
    • Receive calibration parameter data from ADSCU (step 394)
    • Update calibration parameters in calibration engine and generate transform information (step 396)
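A compact sketch of this update path, with all names assumed for illustration:

    def update_calibration(engine, homogeneous_data, adscu_parameters):
        # Steps 392-396: merge the newly received calibration parameters and
        # regenerate the transform information for the demultiplexer.
        engine.parameters.update(adscu_parameters)
        return engine.generate_transforms(homogeneous_data)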


The above description illustrates the configurability of autonomous navigation at a system level including the activation of certain sensors and/or sensor modules as well as the correlation of data across these sensors and sensor modules. In another aspect of the invention, each sensor module may be configured to operate in accordance with a preferred set of parameters.



FIG. 4A depicts sensor module 400 and FIG. 4B depicts sensor module 402, from which configurable operational parameters may be defined. This configurability not only allows for FOV definition but also sensor type configuration within a sensor module. Additionally, this configurability may be implemented at installation or in real-time during operation of the system. According to various embodiments, the sensor modules may be configured by defining directionality of one or more sensors within the sensor module using the physical structure of the sensor or by the inclusion of directionality elements (e.g., wedges) that define a direction of a corresponding sensor. As shown in FIG. 4B, sensor module 402 may comprise a plurality of sensors 406-413 that are coupled together in a particular architecture such that a combination of individual sensor FOVs is stitched together to create a broader FOV of the sensor module. This configurability of sensor modules allows a user to effectively build unique sensor modules by combining the different sensors into diverse architectures. The configurability is further enhanced by the ability to include different sensor types within the sensor module to enhance performance relative to environmental characteristics in which the module operates.


A sensor module 402 has a horizontal FOV and vertical FOV that corresponds to the combination of sensors 406-413. The operational characteristics of each sensor 406-413 within the module 402 are combined to provide an enhanced modular FOV. These operational characteristics include the directionality of a sensor, the range of a sensor, the FOV of a sensor, the type of a sensor and other characteristics known to one of skill in the art. In certain embodiments, particular sensors within a module may be activated or deactivated depending on the environment in which the system is operating. In addition, particular sensors may function as redundant elements in case one or more of the sensors fails or becomes temporarily inoperable. The FOV of the sensor module not only depends on the specific operational characteristics of each sensor but also on the manner in which data from these sensors is correlated and combined.



FIG. 4C illustrates a specific example of a Lissajous scan pattern and resolution 430 based on different vertical FOVs of a sensor according to embodiments of the present disclosure. Scan 432 illustrates a vertical scan and a horizontal scan resulting from different vertical FOV configurations of a sensor.


The diagrams on the right side of FIG. 4C illustrate the scan resolutions for different FOVs. In FIG. 4D, vFOV 434 illustrates the scan resolution with a 2.5 degree FOV. In FIG. 4E, vFOV 436 illustrates the scan resolution with a 5 degree FOV. In FIG. 4F, vFOV 438 illustrates the scan resolution with a 10 degree FOV. The resolution achieved with a 2.5 degree FOV is twice as dense as the resolution achieved with a 5 degree FOV. Similarly, the resolution achieved with a 5 degree FOV is twice as dense as the resolution achieved with a 10 degree FOV. This example illustrates the configurability of a sensor and its resultant effect on scan pattern and resolution. One skilled in the art will recognize that numerous patterns and resolutions may be achieved by configuring a sensor in accordance with aspects of the present disclosure.
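The underlying relationship is linear: for a fixed number of scan lines, the line spacing scales with the vertical FOV, so halving the FOV doubles the density. A numeric sketch (the line count of 100 is an assumed value for illustration):

    def vertical_line_spacing_deg(v_fov_deg: float, n_lines: int = 100) -> float:
        # Spreading a fixed number of scan lines over a wider vertical FOV
        # spaces them farther apart, i.e. lowers the resolution density.
        return v_fov_deg / n_lines

    print(vertical_line_spacing_deg(2.5))   # 0.025 deg between lines
    print(vertical_line_spacing_deg(5.0))   # 0.05 deg, half as dense
    print(vertical_line_spacing_deg(10.0))  # 0.1 deg, half as dense again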


The configurability of a sensor module is further enhanced not only by the specific operational parameters of one or more sensors therein, but by the manner in which the one or more sensors are combined within the module. FIG. 4G illustrates an exemplary scanning pattern 440 for a sensor system comprising eight sensors within a sensor module according to embodiments of the present disclosure. Scanning pattern 440 may be obtained using sensor module architecture 402 in which data sensed across the eight sensors is combined to provide enhanced resolution and field of view. Scanning pattern 440 comprises scan 446, scan 447, scan 448, scan 449, scan 450, scan 451, scan 452, and scan 453, which are correlated and processed to generate the pattern. In this example, the total field of view for sensor module architecture 402 is approximately 40 degrees by 120 degrees. One skilled in the art will recognize that a diverse set of modular FOVs and other module performance characteristics may be achieved by modifying the way in which sensors are coupled together, the specific parameters of the sensors and the methods in which the sensor data is correlated and analyzed.
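To make the stitching arithmetic concrete: if, purely as an assumption for illustration, each of the eight sensors contributed a 40 degree by 15 degree sub-FOV tiled horizontally (the disclosure states only the approximate 40 by 120 degree total), the module FOV follows directly:

    def stitched_fov_deg(v_per_sensor, h_per_sensor, n_sensors, h_overlap=0.0):
        # Horizontal tiling: sub-FOVs abut, minus any overlap between
        # neighbors; vertical coverage is shared across the row of sensors.
        h_total = n_sensors * h_per_sensor - (n_sensors - 1) * h_overlap
        return v_per_sensor, h_total

    print(stitched_fov_deg(40, 15, 8))  # (40, 120), matching architecture 402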



FIG. 4H and FIG. 4I illustrate sensor module configurations 461 and 462, respectively, according to various embodiments of the invention. These configurations are intended to be exemplary and not limiting to the scope of the invention. In one embodiment, a sensor module configuration may be a square or rectangular shape, as illustrated in configuration 461, in which individual sensor shapes are configured to provide particular operational characteristics within the module. Configuration 461 comprises two stacked sets of sensors in which physical structures define a FOV for each sensor. For example, the physical size and directionality of a sensor may provide different angular and spatial scanning characteristics that are used within the sensor module. As a result, sensor shape and the relative locations of the sensors provide a particular scan resolution and FOV. In another embodiment, a sensor module configuration may be a wedge shape, as illustrated in configuration 462, in which physical wedge elements define the directionality of sensors within the module. These two examples illustrate to one of skill in the art the vast number of configurable combinations of sensors within a sensor module. In one example, the sensors are LiDAR sensors with corresponding operational characteristics that allow an MCU to build an enhanced scan pattern with a preferred resolution. The performance of the sensor system may be further enhanced in some embodiments by the inclusion of different sensor types within a sensor module.


LIDAR sensors provide unique capabilities for autonomous driving based primarily on the rate and accuracy at which these sensors operate. These LiDAR sensors create an accurate map that can be quickly and unambiguously processed to make rapid navigation decisions with minimal error. However, certain embodiments of the present invention support non-LiDAR sensors that may be included within a sensor module to supplement the LiDAR sensor data. A multi-sensor module employing different types of sensors presents unique challenges in the correlation of sensed data across these sensors. Different types of sensors may have different rates of data collection, resulting in a more difficult correlation across time. Additionally, different sensors that are closely collocated within the module may be subject to parallax error because data are taken from different vantage points. Accordingly, the use of different types of sensors within a single sensor module further complicates the correlation problem previously described and introduces additional complexities within the data analysis and response processing of the system.


Various embodiments of the invention provide a more efficient manner of sensor data correlation across diverse types of sensors by physically combining the different sensors within a single module package. This multi-sensor module employing different sensors ensures that there is a 1:1 correspondence between data points from the various sensors. The sensor data stream can be presented to the autonomous systems with the various sensor-type data already combined into a correlated data packet. The autonomous system bandwidth can then be focused on the task of navigation rather than preprocessing and correlation of the mixed data sets.


In one embodiment, consider a LIDAR system that returns a single point from the environment. This single data point already comprises both a distance measurement (range) and an object reflectivity measurement with active illumination. As a further enhancement, the LIDAR detector can also passively measure ambient light from the scene to effectively render a passive grayscale value associated with each LIDAR channel. In a real-world navigation scenario, the color of an object carries important information about its relevance. For example, stop signs and stoplights are red, yellow means caution, green may mean “information” or safe to go, and so forth. Providing a unified data packet in which a data point has distance, reflectivity and color provides the autonomous system additional immediate information on the relevance of an object in the field of view.


Another key aspect of the real world is that it is full of living creatures. There are generalized algorithms that attempt to classify detected objects based on size, shape and velocity. However, faults in such algorithms have been demonstrated and may result in errors within the sensor system. One key feature of most living creatures that an autonomous system may encounter is that they are warm-blooded and generally have a different temperature than their surrounding environment. This characteristic makes it possible to monitor the temperature of objects with various thermal detection technologies. With a thermal sensor incorporated into the LIDAR sensor, yet another data type can be incorporated into the single data packet for each data point reported by the sensor, namely the temperature of the associated object. The ability to instantly classify an object as a living creature has obvious benefits for rapid autonomous system decision making. The fact that the data are naturally correlated to a real physical object greatly improves both reaction time and certainty of object identification. In certain embodiments, correlation of diverse sensor data may be used to derive a confidence factor for an identified object so that a processed response may take into account the likelihood of an object being one type of object versus another.
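A hedged sketch of how such a confidence factor might be derived from one correlated packet; the thresholds and weights below are invented for illustration and are not taken from this disclosure:

    def living_creature_confidence(packet) -> float:
        # Fuse correlated cues from a single unified data packet; because
        # the cues describe the same physical point, no cross-sensor
        # matching step is needed before they are combined.
        confidence = 0.0
        if packet.temperature_c is not None and 25.0 <= packet.temperature_c <= 45.0:
            confidence += 0.6   # assumed warm-blooded temperature band
        if packet.range_m < 30.0:
            confidence += 0.2   # assumed: nearby objects weighted higher
        if packet.reflectivity < 0.3:
            confidence += 0.2   # assumed: fur/clothing reflects weakly
        return min(confidence, 1.0)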


Thermal sensors provide real-time 3D thermo-spatial information, allowing for more intelligent machine vision. For example, but without limitation, an array of photodetectors sensitive to long IR electromagnetic radiation serving alongside a scanning LIDAR system can simultaneously localize objects in a 3D environment and discriminate warm objects (such as living beings) from other objects in a conventional automotive environment. An active-tracking system can deliver real-time digital information (as opposed to a passive tracking system that delivers a trigger signal) regarding the location and temperature of warm objects to a vehicle control system. A single detector can provide data over a large area by implementing a fast scanning mechanism. A large and dense array of channels can provide thermo-spatial data in all directions and with high resolution. Furthermore, detectors can be arranged so that the data is both temporally and spatially correlated with the LiDAR channels.


One skilled in the art will recognize that numerous combinations of sensor types may be included within a sensor module and used to improve the performance of the sensor system. In certain examples, these different sensor types may be used to enhance the performance of a LiDAR system and provide greater accuracy based on certain correlated aspects of sensed data relative to LiDAR data.



FIG. 4J illustrates a sensor system 480 that supports detection of an object 482 using different types of sensors within a sensor module 484 according to various embodiments of the invention. In this example, a sensor module 484 may comprise various combinations of a LiDAR sensor, thermal/far infrared radiation (IR) sensor, visible/near IR sensor, as well as other sensor types known to one of skill in the art. The sensor module 484 receives signals from different sensor types relative to a sensed object 482. The sensor data from each different type of sensor is captured and provided to a multiplexer 488 along corresponding channels 490-494. This data may subsequently be represented as a single cloud point for further processing.


In a specific example, sensor 484a (Thermal/FarIR Channel) may comprise an array of photodetectors sensitive to long IR electromagnetic radiation. Sensor 484a can simultaneously localize objects in a 3D environment and discriminate warm objects (such as living beings) from other objects in a conventional automotive environment. Sensor 484b (Visible/NearIR Channel) detects RGB color characteristics of ambient light and may also include sensors to detect other light sources such as near infrared light. Sensor 484d may also include a sensor for another region of the electromagnetic spectrum, such as radar, or an acoustic sensor, such as sonar. These sensors 484a, 484b and 484d are used to supplement the LiDAR sensor 484c to provide enhanced sensor system performance.


Data multiplexer 488 generates a unified data packet 495 representing the correlated data from the different sensors 484a-d. The data is correlated in that it is acquired from the same point in space (or nearly the same point) and unified in that it is bundled into a single data packet.


In embodiments, aspects of the present patent document may be directed to or implemented on information handling systems/computing systems. For purposes of this disclosure, a computing system may include any instrumentality or aggregate of instrumentalities operable to compute, calculate, determine, classify, process, transmit, receive, retrieve, originate, route, switch, store, display, communicate, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, a computing system may be a personal computer (e.g., laptop), tablet computer, phablet, personal digital assistant (PDA), smart phone, smart watch, smart package, server (e.g., blade server or rack server), a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price. The computing system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of memory. Additional components of the computing system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, touchscreen and/or a video display. The computing system may also include one or more buses operable to transmit communications between the various hardware components.



FIG. 5 depicts a simplified block diagram of a computing device/information handling system (or computing system) according to embodiments of the present disclosure. It will be understood that the functionalities shown for system 500 may operate to support various embodiments of an information handling system—although it shall be understood that an information handling system may be differently configured and include different components.


As illustrated in FIG. 5, system 500 includes one or more central processing units (CPU) 501 that provides computing resources and controls the computer. CPU 501 may be implemented with a microprocessor or the like and may also include one or more graphics processing units (GPU) 517 and/or a floating point coprocessor for mathematical computations. System 500 may also include a system memory 502, which may be in the form of random-access memory (RAM), read-only memory (ROM), or both.


A number of controllers and peripheral devices may also be provided, as shown in FIG. 5. An input controller 503 represents an interface to various input device(s) 504, such as a keyboard, mouse, or stylus. There may also be a wireless controller 505, which communicates with a wireless device 506. System 500 may also include a storage controller 507 for interfacing with one or more storage devices 508, each of which includes a storage medium such as magnetic tape or disk, or an optical medium that might be used to record programs of instructions for operating systems, utilities, and applications, which may include embodiments of programs that implement various aspects of the present invention. Storage device(s) 508 may also be used to store processed data or data to be processed in accordance with the invention. System 500 may also include a display controller 509 for providing an interface to a display device 511, which may be a cathode ray tube (CRT), a thin film transistor (TFT) display, or another type of display. The computing system 500 may also include an automotive signal controller 512 for communicating with an automotive system 513. A communications controller 514 may interface with one or more communication devices 515, which enables system 500 to connect to remote devices through any of a variety of networks including the Internet, a cloud resource (e.g., an Ethernet cloud, a Fibre Channel over Ethernet (FCoE)/Data Center Bridging (DCB) cloud, etc.), a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or through any suitable electromagnetic carrier signals including infrared signals.


In the illustrated system, all major system components may connect to a bus 516, which may represent more than one physical bus. However, various system components may or may not be in physical proximity to one another. For example, input data and/or output data may be remotely transmitted from one physical location to another. In addition, programs that implement various aspects of this invention may be accessed from a remote location (e.g., a server) over a network. Such data and/or programs may be conveyed through any of a variety of machine-readable media including, but not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices.


Embodiments of the present invention may be encoded upon one or more non-transitory computer-readable media with instructions for one or more processors or processing units to cause steps to be performed. It shall be noted that the one or more non-transitory computer-readable media shall include volatile and non-volatile memory. It shall be noted that alternative implementations are possible, including a hardware implementation or a software/hardware implementation. Hardware-implemented functions may be realized using ASIC(s), programmable arrays, digital signal processing circuitry, or the like. Accordingly, the “means” terms in any claims are intended to cover both software and hardware implementations. Similarly, the term “computer-readable medium or media” as used herein includes software and/or hardware having a program of instructions embodied thereon, or a combination thereof. With these implementation alternatives in mind, it is to be understood that the figures and accompanying description provide the functional information one skilled in the art would require to write program code (i.e., software) and/or to fabricate circuits (i.e., hardware) to perform the processing required.


It shall be noted that embodiments of the present invention may further relate to computer products with a non-transitory, tangible computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind known or available to those having skill in the relevant arts. Examples of tangible computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter. Embodiments of the present invention may be implemented in whole or in part as machine-executable instructions that may be in program modules that are executed by a processing device. Examples of program modules include libraries, programs, routines, objects, components, and data structures. In distributed computing environments, program modules may be physically located in settings that are local, remote, or both.


One skilled in the art will recognize that no computing system or programming language is critical to the practice of the present invention. One skilled in the art will also recognize that a number of the elements described above may be physically and/or functionally separated into sub-modules or combined together.


It will be appreciated to those skilled in the art that the preceding examples and embodiments are exemplary and not limiting to the scope of the present disclosure. It is intended that all permutations, enhancements, equivalents, combinations, and improvements thereto that are apparent to those skilled in the art upon a reading of the specification and a study of the drawings are included within the true spirit and scope of the present disclosure. It shall also be noted that elements of any claims may be arranged differently including having multiple dependencies, configurations, and combinations.

Claims
  • 1. A sensor module comprising: a LiDAR transceiver configured to transmit a plurality of optical signals, detect a corresponding plurality of return optical signals from a plurality of first locations within an environment of the sensor module, process the return optical signals to generate LiDAR data related to one or more first attributes of the plurality of first locations within the environment during a first time period, and transmit the LiDAR data on a first channel; a thermal sensor configured to generate thermal data related to one or more second attributes of a plurality of second locations within the environment during the first time period and transmit the thermal data on a second channel; a multiplexer coupled to the first and second channels, the multiplexer configured to receive the LiDAR data from the first channel associated with the LiDAR transceiver and the thermal data from the second channel associated with the thermal sensor, identify first and second portions of the received LiDAR data that are spatially and temporally correlated with respective first and second portions of the thermal data, generate a first unified data packet comprising the first portion of the received LiDAR data and the spatially and temporally correlated first portion of the received thermal data, generate a second unified data packet comprising the second portion of the received LiDAR data and the spatially and temporally correlated second portion of the received thermal data, and output and transmit the first and second unified data packets on a bus; a demultiplexer coupled to the bus and configured to de-multiplex the first and second unified data packets into homogeneous LiDAR data and thermal data; and a calibration engine coupled to the demultiplexer, wherein the calibration engine generates transform information based on calibration parameters; wherein the demultiplexer receives the transform information from the calibration engine and integrates the transform information with the de-multiplexed homogeneous LiDAR data and thermal data into a point cloud format.
  • 2. The sensor module of claim 1, further comprising a color sensor.
  • 3. The sensor module of claim 1 wherein the LiDAR transceiver and the thermal sensor detect data along a shared axis.
  • 4. The sensor module of claim 1 wherein the LiDAR data and thermal data are correlated using a 1:1 correspondence between measurement data points from the LiDAR transceiver and the thermal sensor.
  • 5. The sensor module of claim 1 further comprising a non-LiDAR sensor configured to generate non-LiDAR data related to one or more third attributes of a plurality of third locations within the environment during the first time period and transmit the non-LiDAR data on a third channel.
  • 6. The sensor module of claim 5 wherein the multiplexer is coupled to the third channel, the multiplexer configured to generate the first and second unified data packets from the LiDAR data, the thermal data, and the non-LiDAR data.
  • 7. The sensor module of claim 6 wherein the non-LiDAR sensor is a color sensor.
  • 8. The sensor module of claim 1 wherein the first and second unified data packets are correlated relative to a single cloud point.
  • 9. The sensor module of claim 1, wherein the multiplexer is further configured to identify first and second portions of the LiDAR data that are spatially and temporally correlated with respective first and second portions of the thermal data by: identifying, from the LiDAR data related to a particular attribute of the one or more first attributes of the plurality of first locations within the environment during the first time period, the first portion of the LiDAR data; identifying, from the thermal data related to the particular attribute of the one or more second attributes of the plurality of second locations within the environment during the first time period, the first portion of the thermal data; identifying, from the LiDAR data related to a different attribute of the one or more first attributes of the plurality of first locations within the environment during the first time period, the second portion of the LiDAR data; and identifying, from the thermal data related to the different attribute of the one or more second attributes of the plurality of second locations within the environment during the first time period, the second portion of the thermal data.
  • 10. A sensor system comprising: a bus; a sensor module coupled to the bus, the sensor module comprising a first sensor and a second sensor, wherein the first sensor is a lidar sensor and the second sensor is a non-lidar sensor; a multiplexer coupled between the sensor module and the bus, wherein the first sensor and the second sensor are separately coupled to the multiplexer over different channels, and wherein the multiplexer is configured to correlate lidar data from the first sensor with data from the second sensor and combine the correlated data into a unified data packet that includes both lidar data and data from the second sensor that is spatially and temporally correlated with the lidar data; a control unit comprising a data processing element, a demultiplexer and a calibration engine, wherein: the calibration engine is coupled to the demultiplexer and generates transform information based on calibration parameters; the demultiplexer is coupled to: (i) receive the unified data packets from the bus, (ii) demultiplex the unified data packets into homogeneous sensor data including lidar data from the first sensor and sensor data from the second sensor, (iii) receive the transform information from the calibration engine, and (iv) integrate the homogeneous sensor data with the transform information; and the data processing element is configured to generate a point cloud comprising the homogeneous sensor data with the transform information.
  • 11. The sensor system set forth in claim 10 wherein: the control unit further comprises a calibration engine; the demultiplexer associates the homogeneous sensor data with a corresponding sensor and provides the homogeneous data to the calibration engine; and the calibration engine generates transforms based on calibration parameters.
  • 12. The sensor system set forth in claim 11 wherein the control unit further comprises a driver operatively coupled to activate and provide clock triggers to the first and second sensors.
  • 13. The sensor system set forth in claim 10 wherein the data processing element comprises one or more algorithms for computing object detection, velocity estimation and localization to roads and external maps.
  • 14. The sensor system set forth in claim 10 wherein the demultiplexer receives spatial transform information and integrates it with the demultiplexed unified data packet into a point cloud format.
  • 15. The sensor system set forth in claim 10 wherein the second sensor is a thermal sensor.
7319777 Morcom Jan 2008 B2
7345271 Boehlau et al. Mar 2008 B2
7358819 Rollins Apr 2008 B2
7373473 Bukowski et al. May 2008 B2
7408462 Pirkl et al. Aug 2008 B2
7480031 Mack Jan 2009 B2
7544945 Tan et al. Jun 2009 B2
7570793 Lages et al. Aug 2009 B2
7583364 Mayor et al. Sep 2009 B1
7589826 Mack et al. Sep 2009 B2
7619477 Segarra Nov 2009 B2
7623222 Benz et al. Nov 2009 B2
7640068 Johnson et al. Dec 2009 B2
7642946 Wong et al. Jan 2010 B2
7684590 Kampchen et al. Mar 2010 B2
7697581 Walsh et al. Apr 2010 B2
7741618 Lee et al. Jun 2010 B2
7746271 Furstenberg Jun 2010 B2
7868665 Tumer et al. Jan 2011 B2
7965314 Miller Jun 2011 B1
7969558 Hall Jun 2011 B2
8042056 Wheeler et al. Oct 2011 B2
8077047 Humble et al. Dec 2011 B2
8098171 Szczerba et al. Jan 2012 B1
8139685 Simic et al. Mar 2012 B2
8203702 Kane et al. Jun 2012 B1
8274037 Ritter et al. Sep 2012 B2
8310653 Ogawa et al. Nov 2012 B2
8514099 Seder et al. Aug 2013 B2
8605262 Campbell et al. Dec 2013 B2
8675181 Hall Mar 2014 B2
8736818 Weimer et al. May 2014 B2
8767190 Hall Jul 2014 B2
8976340 Gilliland et al. Mar 2015 B2
8995478 Kobtsev et al. Mar 2015 B1
9063549 Pennecot et al. Jun 2015 B1
9069061 Harwit Jun 2015 B1
9069080 Stettner et al. Jun 2015 B2
9086273 Gruver et al. Jul 2015 B1
9093969 Gebeyehu et al. Jul 2015 B2
9110154 Bates et al. Aug 2015 B1
9151940 Chuang et al. Oct 2015 B2
9194701 Bosch Nov 2015 B2
RE45854 Gittinger et al. Jan 2016 E
9239959 Evans et al. Jan 2016 B1
9246041 Clausen et al. Jan 2016 B1
9285477 Smith et al. Mar 2016 B1
9286538 Chen et al. Mar 2016 B1
9310197 Gogolla et al. Apr 2016 B2
9383753 Templeton et al. Jul 2016 B1
9453914 Stettner et al. Sep 2016 B2
9529079 Droz et al. Dec 2016 B1
9772607 Decoux et al. Sep 2017 B2
RE46672 Hall Jan 2018 E
9904859 Grauer Feb 2018 B2
9964632 Droz et al. May 2018 B1
9983297 Hall et al. May 2018 B2
9989629 LaChapelle Jun 2018 B1
10003168 Villeneuve Jun 2018 B1
10018726 Hall et al. Jul 2018 B2
10048374 Hall et al. Aug 2018 B2
10094925 LaChapelle Oct 2018 B1
10109183 Franz et al. Oct 2018 B1
10120079 Pennecot et al. Nov 2018 B2
10126412 Eldada et al. Nov 2018 B2
10132928 Eldada et al. Nov 2018 B2
10210401 Allen et al. Feb 2019 B2
10309213 Barfoot et al. Jun 2019 B2
10330780 Hall et al. Jun 2019 B2
10386465 Hall et al. Aug 2019 B2
10393877 Hall et al. Aug 2019 B2
10436904 Moss et al. Oct 2019 B2
10545222 Hall et al. Jan 2020 B2
RE47942 Hall Apr 2020 E
10613203 Rekow et al. Apr 2020 B1
10627490 Hall et al. Apr 2020 B2
10627491 Hall et al. Apr 2020 B2
20010011289 Davis et al. Aug 2001 A1
20010017718 Ikeda et al. Aug 2001 A1
20020003617 Doemens et al. Jan 2002 A1
20020060784 Pack et al. May 2002 A1
20020117545 Tsikos et al. Aug 2002 A1
20030041079 Bellemore et al. Feb 2003 A1
20030043363 Jamieson et al. Mar 2003 A1
20030043364 Jamieson et al. Mar 2003 A1
20030057533 Lemmi et al. Mar 2003 A1
20030066977 Hipp et al. Apr 2003 A1
20030076485 Ruff et al. Apr 2003 A1
20030090646 Riegl et al. May 2003 A1
20030154017 Ellis Aug 2003 A1
20030163030 Arriaga Aug 2003 A1
20040021852 DeFlumere Feb 2004 A1
20040066500 Gokturk et al. Apr 2004 A1
20040134879 Kochergin et al. Jul 2004 A1
20040150810 Muenter et al. Aug 2004 A1
20040213463 Morrison Oct 2004 A1
20040240706 Wallace et al. Dec 2004 A1
20040240710 Lages et al. Dec 2004 A1
20040247157 Lages et al. Dec 2004 A1
20050023353 Tsikos et al. Feb 2005 A1
20050168720 Yamashita et al. Aug 2005 A1
20050211893 Paschalidis Sep 2005 A1
20050232466 Kampchen et al. Oct 2005 A1
20050246065 Ricard Nov 2005 A1
20050248749 Kiehn et al. Nov 2005 A1
20050279914 Dimsdale et al. Dec 2005 A1
20060007350 Gao et al. Jan 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060100783 Haberer et al. May 2006 A1
20060115113 Lages et al. Jun 2006 A1
20060132635 Land Jun 2006 A1
20060176697 Arruda Aug 2006 A1
20060186326 Ito Aug 2006 A1
20060197867 Johnson et al. Sep 2006 A1
20060231771 Lee et al. Oct 2006 A1
20060290920 Kampchen et al. Dec 2006 A1
20070035624 Lubard et al. Feb 2007 A1
20070071056 Chen Mar 2007 A1
20070121095 Lewis May 2007 A1
20070181810 Tan et al. Aug 2007 A1
20070201027 Doushkina et al. Aug 2007 A1
20070219720 Trepagnier et al. Sep 2007 A1
20070241955 Brosche Oct 2007 A1
20070272841 Wiklof Nov 2007 A1
20080002176 Krasutsky Jan 2008 A1
20080013896 Salzberg et al. Jan 2008 A1
20080074640 Walsh et al. Mar 2008 A1
20080079371 Kang et al. Apr 2008 A1
20080154495 Breed Jun 2008 A1
20080170826 Schaafsma Jul 2008 A1
20080186501 Xie Aug 2008 A1
20090010644 Varshneya et al. Jan 2009 A1
20090026503 Tsuda Jan 2009 A1
20090085901 Antony Apr 2009 A1
20090122295 Eaton May 2009 A1
20090142053 Varshneya et al. Jun 2009 A1
20090168045 Lin et al. Jul 2009 A1
20090218475 Kawakami et al. Sep 2009 A1
20090245788 Varshneya et al. Oct 2009 A1
20090323737 Ensher et al. Dec 2009 A1
20100006760 Lee et al. Jan 2010 A1
20100020306 Hall Jan 2010 A1
20100045965 Meneely Feb 2010 A1
20100046953 Shaw et al. Feb 2010 A1
20100134596 Becker Jun 2010 A1
20100188722 Yamada et al. Jul 2010 A1
20100198487 Vollmer et al. Aug 2010 A1
20100204964 Pack et al. Aug 2010 A1
20100239139 Hunt et al. Sep 2010 A1
20100265077 Humble et al. Oct 2010 A1
20100271615 Sebastian et al. Oct 2010 A1
20100302528 Hall Dec 2010 A1
20110028859 Chian Feb 2011 A1
20110040482 Brimble et al. Feb 2011 A1
20110211188 Juenemann et al. Sep 2011 A1
20110216304 Hall Sep 2011 A1
20110305250 Chann et al. Dec 2011 A1
20120038903 Weimer et al. Feb 2012 A1
20120195597 Malaney Aug 2012 A1
20120287417 Mimeault Nov 2012 A1
20130024176 Woodford Jan 2013 A2
20130050144 Reynolds Feb 2013 A1
20130050486 Omer et al. Feb 2013 A1
20130070239 Crawford et al. Mar 2013 A1
20130094960 Bowyer et al. Apr 2013 A1
20130151198 Brown Jun 2013 A1
20130168673 Yu et al. Jul 2013 A1
20130206967 Shpunt et al. Aug 2013 A1
20130241761 Cooper et al. Sep 2013 A1
20130242283 Bailey et al. Sep 2013 A1
20130258312 Lewis Oct 2013 A1
20130286404 Cenko et al. Oct 2013 A1
20130300479 Thibault Nov 2013 A1
20130314711 Cantin et al. Nov 2013 A1
20130336375 Ranki et al. Dec 2013 A1
20130342366 Kiefer et al. Dec 2013 A1
20140063483 Li Mar 2014 A1
20140071234 Millett Mar 2014 A1
20140078519 Steffey et al. Mar 2014 A1
20140104592 Tien et al. Apr 2014 A1
20140240317 Go et al. Aug 2014 A1
20140240721 Herschbach Aug 2014 A1
20140253369 Kelley et al. Sep 2014 A1
20140267848 Wu Sep 2014 A1
20140274093 Abdelmonem Sep 2014 A1
20140347650 Bosch Nov 2014 A1
20150015895 Bridges et al. Jan 2015 A1
20150035437 Panopoulos et al. Feb 2015 A1
20150055117 Pennecot et al. Feb 2015 A1
20150101234 Priest et al. Apr 2015 A1
20150116695 Bartolome et al. Apr 2015 A1
20150131080 Retterath et al. May 2015 A1
20150144806 Jin et al. May 2015 A1
20150185325 Park et al. Jul 2015 A1
20150202939 Stettner et al. Jul 2015 A1
20150219765 Mead et al. Aug 2015 A1
20150226853 Seo et al. Aug 2015 A1
20150293224 Eldada et al. Oct 2015 A1
20150293228 Retterath et al. Oct 2015 A1
20150303216 Tamaru Oct 2015 A1
20160003946 Gilliland et al. Jan 2016 A1
20160009410 Derenick et al. Jan 2016 A1
20160014309 Ellison et al. Jan 2016 A1
20160021713 Reed Jan 2016 A1
20160041266 Smits Feb 2016 A1
20160049058 Allen et al. Feb 2016 A1
20160117431 Kim et al. Apr 2016 A1
20160161600 Eldada et al. Jun 2016 A1
20160191173 Malaney Jun 2016 A1
20160249039 Tran Aug 2016 A1
20160259038 Retterath et al. Sep 2016 A1
20160300484 Torbett Oct 2016 A1
20160306032 Schwarz et al. Oct 2016 A1
20160313445 Bailey et al. Oct 2016 A1
20160363659 Mindell et al. Dec 2016 A1
20160365846 Wyland Dec 2016 A1
20170146639 Carothers May 2017 A1
20170146640 Hall et al. May 2017 A1
20170153319 Villeneuve et al. Jun 2017 A1
20170214861 Rachlin et al. Jul 2017 A1
20170219695 Hall et al. Aug 2017 A1
20170220876 Gao et al. Aug 2017 A1
20170242102 Dussan et al. Aug 2017 A1
20170269198 Hall et al. Sep 2017 A1
20170269209 Hall et al. Sep 2017 A1
20170269215 Hall et al. Sep 2017 A1
20170293810 Allen et al. Oct 2017 A1
20170299721 Eichenholz et al. Oct 2017 A1
20170350983 Hall et al. Dec 2017 A1
20180019155 Tsang et al. Jan 2018 A1
20180058197 Barfoot et al. Mar 2018 A1
20180059219 Irish et al. Mar 2018 A1
20180074382 Lee et al. Mar 2018 A1
20180100924 Brinkmeyer Apr 2018 A1
20180106902 Mase et al. Apr 2018 A1
20180168539 Singh et al. Jun 2018 A1
20180267151 Hall et al. Sep 2018 A1
20180275249 Campbell et al. Sep 2018 A1
20180284227 Hall et al. Oct 2018 A1
20180284274 LaChapelle Oct 2018 A1
20180321360 Hall et al. Nov 2018 A1
20180364098 McDaniel et al. Dec 2018 A1
20190001442 Unrath et al. Jan 2019 A1
20190004533 Huang Jan 2019 A1
20190011563 Hall et al. Jan 2019 A1
20190178991 Hall et al. Jun 2019 A1
20190265714 Ball Aug 2019 A1
20190339365 Hall et al. Nov 2019 A1
20190361092 Hall et al. Nov 2019 A1
20190369257 Hall et al. Dec 2019 A1
20190369258 Hall et al. Dec 2019 A1
20200064452 Avlas et al. Feb 2020 A1
20200142070 Hall et al. May 2020 A1
20200144971 Pinto et al. May 2020 A1
20200166613 Hall et al. May 2020 A1
20200191915 Hall et al. Jun 2020 A1
Foreign Referenced Citations (165)
Number Date Country
2089105 Aug 1994 CA
2754278 Apr 2012 CA
641583 Feb 1984 CH
1106534 Aug 1995 CN
1576123 Feb 2005 CN
2681085 Feb 2005 CN
2773714 Apr 2006 CN
102938142 Feb 2013 CN
930909 Jul 1955 DE
3134815 Mar 1983 DE
3216312 Nov 1983 DE
3216313 Nov 1983 DE
3701340 Jul 1988 DE
3741259 Jun 1989 DE
3808972 Oct 1989 DE
3821892 Feb 1990 DE
4040894 Apr 1992 DE
4115747 Nov 1992 DE
4124192 Jan 1993 DE
4127168 Feb 1993 DE
4137550 Mar 1993 DE
4215272 Nov 1993 DE
4243631 Jun 1994 DE
4340756 Jun 1994 DE
4411448 Oct 1995 DE
4412044 Oct 1995 DE
19512644 Oct 1996 DE
19512681 Oct 1996 DE
4345446 Jul 1998 DE
4345448 Jul 1998 DE
19727792 Feb 1999 DE
19741730 Apr 1999 DE
19741731 Apr 1999 DE
19752145 May 1999 DE
19717399 Jun 1999 DE
19757847 Jul 1999 DE
19757848 Jul 1999 DE
19757849 Jul 1999 DE
19757840 Sep 1999 DE
19815149 Oct 1999 DE
19828000 Jan 2000 DE
19902903 May 2000 DE
19911375 Sep 2000 DE
19919925 Nov 2000 DE
19927501 Nov 2000 DE
19936440 Mar 2001 DE
19953006 May 2001 DE
19953007 May 2001 DE
19953009 May 2001 DE
19953010 May 2001 DE
10025511 Dec 2001 DE
10110420 Sep 2002 DE
10114362 Oct 2002 DE
10127417 Dec 2002 DE
10128954 Dec 2002 DE
10141055 Mar 2003 DE
10143060 Mar 2003 DE
10146692 Apr 2003 DE
10148070 Apr 2003 DE
10151983 Apr 2003 DE
10162668 Jul 2003 DE
10217295 Nov 2003 DE
10222797 Dec 2003 DE
10229408 Jan 2004 DE
10244638 Apr 2004 DE
10244640 Apr 2004 DE
10244643 Apr 2004 DE
10258794 Jun 2004 DE
10303015 Aug 2004 DE
10331529 Jan 2005 DE
10341548 Mar 2005 DE
102004010197 Sep 2005 DE
102004014041 Oct 2005 DE
102005050824 May 2006 DE
102005003827 Jul 2006 DE
102005019233 Nov 2006 DE
102007013023 Sep 2008 DE
202015009250 Jan 2017 DE
0185816 Jul 1986 EP
0361188 Apr 1990 EP
0396865 Nov 1990 EP
0412395 Feb 1991 EP
0412398 Feb 1991 EP
0412399 Feb 1991 EP
0412400 Feb 1991 EP
0468175 Jan 1992 EP
0486430 May 1992 EP
0653720 May 1995 EP
0656868 Jun 1995 EP
0897120 Feb 1999 EP
0913707 May 1999 EP
0937996 Aug 1999 EP
0967492 Dec 1999 EP
1046938 Oct 2000 EP
1055937 Nov 2000 EP
1148345 Oct 2001 EP
1160718 Dec 2001 EP
1174733 Jan 2002 EP
1267177 Dec 2002 EP
1267178 Dec 2002 EP
1286178 Feb 2003 EP
1286181 Feb 2003 EP
1288677 Mar 2003 EP
1291673 Mar 2003 EP
1291674 Mar 2003 EP
1298012 Apr 2003 EP
1298453 Apr 2003 EP
1298454 Apr 2003 EP
1300715 Apr 2003 EP
1302784 Apr 2003 EP
1304583 Apr 2003 EP
1306690 May 2003 EP
1308747 May 2003 EP
1355128 Oct 2003 EP
1403657 Mar 2004 EP
1408318 Apr 2004 EP
1418444 May 2004 EP
1460454 Sep 2004 EP
1475764 Nov 2004 EP
1515157 Mar 2005 EP
1531342 May 2005 EP
1531343 May 2005 EP
1548351 Jun 2005 EP
1557691 Jul 2005 EP
1557692 Jul 2005 EP
1557693 Jul 2005 EP
1557694 Jul 2005 EP
1700763 Sep 2006 EP
1914564 Apr 2008 EP
1927867 Jun 2008 EP
1939652 Jul 2008 EP
1947377 Jul 2008 EP
1983354 Oct 2008 EP
2003471 Dec 2008 EP
2177931 Apr 2010 EP
2041687 Sep 1980 GB
H05240940 Sep 1993 JP
H03-006407 Feb 1994 JP
H6-288725 Oct 1994 JP
11264871 Sep 1999 JP
2001-256576 Sep 2001 JP
2002-031528 Jan 2002 JP
2003-336447 Nov 2003 JP
2004-348575 Dec 2004 JP
2005-070840 Mar 2005 JP
2005-297863 Oct 2005 JP
2006-177843 Jul 2006 JP
6883948 Jun 2021 JP
WO-1999003080 Jan 1999 WO
WO-2000025089 May 2000 WO
WO-0131608 May 2001 WO
WO-03019234 Mar 2003 WO
WO-03040755 May 2003 WO
WO-2004019293 Mar 2004 WO
WO-2004036245 Apr 2004 WO
WO-2008008970 Jan 2008 WO
WO-2009120706 Oct 2009 WO
WO-2015104572 Jul 2015 WO
WO-2016162568 Oct 2016 WO
WO-2017089063 Jun 2017 WO
WO-2017132703 Aug 2017 WO
WO-2017164989 Sep 2017 WO
WO-2017165316 Sep 2017 WO
WO-2018125823 Jul 2018 WO
WO-2018196001 Nov 2018 WO
Non-Patent Literature Citations (461)
Entry
U.S. Appl. No. 15/941,302, filed Mar. 30, 2018.
U.S. Appl. No. 16/510,680, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 16/510,710, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 16/510,749, filed Jul. 12, 2019, Hall et al.
U.S. Appl. No. 15/420,384, filed Jan. 31, 2017, Hall et al.
U.S. Appl. No. 16/030,780, filed Jul. 9, 2018, Hall et al.
U.S. Appl. No. 11/777,802, filed Jul. 13, 2007, Hall.
U.S. Appl. No. 13/109,901, filed May 17, 2011, Hall et al.
U.S. Appl. No. 15/180,580, filed Jun. 13, 2016, Hall et al.
U.S. Appl. No. 15/700,543, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,558, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,571, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,836, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,844, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,959, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 15/700,965, filed Sep. 11, 2017, Hall et al.
U.S. Appl. No. 16/912,648, filed Jun. 25, 2020, Hall et al.
U.S. Appl. No. 15/926,095, filed Mar. 30, 2018, Hall et al.
U.S. Appl. No. 15/464,227, filed Mar. 30, 2017, Hall et al.
U.S. Appl. No. 15/464,221, filed Mar. 30, 2017, Hall et al.
U.S. Appl. No. 15/974,527, filed May 8, 2018, Hall et al.
U.S. Appl. No. 16/748,498, filed Jan. 21, 2020, Hall et al.
U.S. Appl. No. 15/610,975, filed Jun. 1, 2017, Hall et al.
U.S. Appl. No. 16/546,131, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/842,491, filed Apr. 7, 2020, Hall et al.
U.S. Appl. No. 16/546,184, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/546,206, filed Aug. 20, 2019, Hall et al.
U.S. Appl. No. 16/909,306, filed Jun. 23, 2020, Hall et al.
U.S. Appl. No. 15/339,790, filed Oct. 31, 2016, Hall et al.
U.S. Appl. No. 16/854,755, filed Apr. 21, 2020, Hall et al.
U.S. Appl. No. 16/905,843, filed Jun. 18, 2020, Hall et al.
U.S. Appl. No. 16/905,849, filed Jun. 18, 2020, Hall et al.
U.S. Appl. No. 16/909,846, filed Jun. 23, 2020, Hall et al.
U.S. Appl. No. 15/835,983, filed Dec. 8, 2017, Hall et al.
U.S. Appl. No. 16/459,557, filed Jul. 1, 2019, Rekow et al.
U.S. Appl. No. 16/841,506, filed Apr. 6, 2020, Rekow et al.
U.S. Appl. No. 16/112,273, filed Aug. 24, 2018, Avlas et al.
U.S. Appl. No. 16/181,523, filed Nov. 6, 2018, Pinto et al.
U.S. Appl. No. 16/241,849, filed Jan. 7, 2019, Hall et al.
Quanergy Systems, Inc. v. Velodyne Lidar, Inc. (N.D. Cal.), filed Sep. 13, 2016, U.S. Pat. No. 7,969,558.
Velodyne Lidar, Inc. v. Hesai Photonics Technology Co., Ltd. (N.D. Cal.), filed Aug. 13, 2019, U.S. Pat. No. 7,969,558.
Velodyne Lidar, Inc. v. Suteng Innovation Technology Co., Ltd. (N.D. Cal.), filed Aug. 13, 2019, U.S. Pat. No. 7,969,558.
In re Certain Rotating 3-D Lidar Devices, Components Thereof, and Sensing Systems Containing the Same (ITC), Investigation No. ITC-337-TA-1173, filed Aug. 15, 2019, U.S. Pat. No. 7,969,558.
Petition for Inter Partes Review (USPTO Patent Trial and Appeal Board), Case No. IPR2018-00255, filed Nov. 29, 2017, U.S. Pat. No. 7,969,558.
Petition for Inter Partes Review (USPTO Patent Trial and Appeal Board), Case No. IPR2018-00256, filed Nov. 29, 2017, U.S. Pat. No. 7,969,558.
Accetta et al., Active Electro-Optical Systems, The Infrared and Electro-Optical Systems Handbook (1993, ed. by Clifton Fox), pp. 3-76. (IPR Nos. '255 and '256 Exhibit 2158).
Acuity Laser, Principles of Measurement Used by Laser Sensors, https://www.acuitylaser.com/measurement-principles (2018), 4 pages. (IPR Nos. '255 and '256 Exhibit 1075).
Acuity, Acuity Aluminum Billet Scalping Production Information webpage (Brennan Deposition Exhibit 14) (last visited Dec. 28, 2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2184).
Acuity, Acuity AR700 Laser Displacement Sensor Product Information webpage (Brennan Deposition Exhibit 13) (last visited Dec. 28, 2018), 9 pages. (IPR Nos. '255 and '256 Exhibit 2183).
Acuity, Acuity Drill Pipe Runout Product Information webpage (Brennan Deposition Exhibit 12) (last visited Dec. 28, 2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2182).
Acuity, Acuity Short Range Sensors Product Information webpage (Brennan Deposition Exhibit 11) (last visited Dec. 30, 2018), 3 pages. (IPR Nos. '255 and '256 Exhibit 2181).
Aiestaran et al. “A Fluorescent Linear Optical Fiber Position Sensor” Elsevier B.V. May 21, 2008 (4 pages).
Albota, “Three-dimensional imaging laser Radar with a photon-counting avalanche photodiode array and microchip laser,” Applied optics, vol. 41, No. 36 (Dec. 20, 2002), 8 pages.
Alhashimi, et al, Statistical Modeling and Calibration of Triangulation Lidars, Scitepress—Science and Technology Publications (2016), pp. 308-317. (IPR Nos. '255 and '256 Exhibit 1069).
Amann, Laser ranging: a critical review of usual techniques for distance measurement, 40(1) Society of Photo-Optical Instrumentation Engineers (Jan. 2001), pp. 10-19. (IPR Nos. '255 and '256 Exhibit 2148).
American National Standard for Safe Use of Lasers, ANSI Z136.1-2014, Laser Institute of America (Dec. 10, 2013), pp. 27-34 and 216-219. (IPR Nos. '255 and '256 Exhibit 1142).
American National Standard for Safe Use of Lasers, Laser Institute of America (Jun. 28, 2000), 184 pages. (IPR Nos. '255 and '256 Exhibit 2005).
American National Standards Institute, “Procedures for the Development and Coordination of American National Standards” (Mar. 22, 1995), 50 pages. (IPR Nos. '255 and '256 Exhibit 1040).
American Petroleum Institute, “Specification for Line Pipe,” API Specification 5L, 43rd Ed. (2004), 166 pages. (IPR Nos. '255 and '256 Exhibit 1139).
Aood Technology Limited, “Electrical Slip Rings vs. Rotating Electrical Connectors” (2013), 3 pages. (IPR Nos. '255 and '256 Exhibit 1032).
Aufrere, et al., Perception for collision avoidance and autonomous driving, The Robotics Institute, Carnegie Mellon University (2003), 14 pages (IPR Nos. '255 and '256 Exhibit 2140).
Aull, et al., “Geiger-Mode Avalanche Photodiodes for Three Dimensional Imaging,” Lincoln Laboratory Journal (2002), 16 pages. (IPR Nos. '255 and '256 Exhibit 1021), Lincoln Laboratory Journal, vol. 13, No. 2, 2002, pp. 335-350.
Automotive Lidar, Market Presentation titled “Robotic Cars LiDAR Market in Million Dollars” (Apr. 2018), 86 pages. (IPR Nos. '255 and '256 Exhibit 2113).
Avalanche Photodiode: A User Guide (2011), 8 pages. (IPR Nos. '255 and '256 Exhibit 1019).
Beer, et al, Mechanics of Materials, McGraw Hill Companies, 4th Ed. (2006), pp. 750 and 752. (IPR Nos. '255 and '256 Exhibit 1140).
Berkovic et al., Optical Methods for Distance and Displacement Measurements, Advances in Optics and Photonics (Sep. 11, 2012), pp. 441-471. (IPR Nos. '255 and '256 Exhibit 2007).
Besl, Active, Optical Range Imaging Sensors, Machine Vision and Applications (1988), Springer-Verlag New York Inc., pp. 1:127-152 (IPR Nos. '255 and '256 Exhibit 1015).
Blais, NRC-CNRC, Review of 20 Years of Range Sensor Development, National Research Council Canada (Jan. 2004), pp. 231-243 (IPR Nos. '255 and '256 Exhibit 2141).
Bordone, et al., “Development of a high-resolution laser radar for 3D imaging in artwork cataloging,” Proceedings of SPIE, vol. 5131 (2003), 6 pages. (IPR Nos. '255 and '256 Exhibit 1016).
Borenstein, "Where am I? Sensors and Methods for Mobile Robot Positioning" (1996), pp. 95-112.
Brennan, Drawing of I-beam by Dr. Brennan (Brennan Deposition Exhibit 16), (Jan. 4, 2019), 1 page. (IPR Nos. '255 and '256 Exhibit 2186).
Brustein et al., How a Billion-Dollar Autonomous Vehicle Startup Lost Its Way, Bloomberg https://www.bloomberg.com/news/features/2018-08-13/how-a-billiondollar-autonomous-vehicle-startup-lost-its-way (Aug. 13, 2018), 7 pages. (IPR Nos. '255 and '256 Exhibit 2098).
Business Wire, Press Release Distribution webpage, https://services.businesswire.com/press-release-distribution (Dec. 21, 2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 1143).
Businesswire, Velodyne Displays Solid State, Highest Performing LiDAR for ADAS, Businesswire https://www.businesswire.com/news/home/20180107005088/en/Velodyne-Displays-Solid-State-Highest-Performing-LiDAR (Jan. 7, 2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2097).
Businesswire, Velodyne LiDar Awarded “Industry Choice Company of the Year” at TU-Automotive Detroit Conference, Businesswire, https://www.businesswire.com/news/home/20180608005700/en/Velodyne-LiDAR-Awarded-%E2%80%9CIndustry-Choice-Company-Year%E2%80%9D (Jun. 8, 2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2096).
Cameron, An Introduction to LIDAR: The Key Self-Driving Car Sensor, Voyage https://news.voyage.auto/an-introduction-to-lidar-the-key-self-drivingcar-sensor-a7e405590cff (May 9, 2017), 14 pages. (IPR Nos. '255 and '256 Exhibit 2074).
Canadian Patent Office, Office Action, App. No. CA 3,012,003 (Aug. 28, 2019), 3 pages.
Canadian Patent Office, Office Action, App. No. CA 3,017,735 (Aug. 28, 2019), 3 pages.
Canadian Patent Office, Office Action, App. No. CA 3,017,811 (Aug. 28, 2019), 3 pages.
Canbus, https://web.archive.org/web/20040520021138/http:/canbus.US:80/ (May 20, 2004), 3 pages. (IPR Nos. '255 and '256 Exhibit 1088).
Carson, N. "Defending GPS against the Spoofing Threat using Network Based Detection and Successive Interference Cancellation". Auburn University. Nov. 2015, 35 pages.
Chapman, “Introduction to Laser Safety” (Sep. 10, 2007), 19 pages.
Chellapilla, Lidar: The Smartest Sensor on a Self Driving Car, LinkedIn.com https://www.linkedin.com/pulse/lidar-smartest-sensor-self-driving-carkumar-chellapill (Jul. 31, 2017), 8 pages. (IPR Nos. '255 and '256 Exhibit 2075).
Cheung, Spinning laser maker is the real winner of the Urban Challenge, Tech Guru Daily, available at http://www.tgdaily.com/trendwatch-features/34750-spinning-laser-maker-is-the-real-winner (Nov. 7, 2007), 7 pages. (IPR Nos. '255 and '256 Exhibit 2091).
Code of Federal Regulations, Food and Drugs Rule—Performance Standards for Light-Emitting Products, 21 C.F.R. § 1040.10 (2005).
Copper Development Association Inc., Copper Tube Handbook—Industry Standard Guide for the Design and Installation of Copper Piping Systems, CDA Publication A4015-14.17: Copper Tube Handbook (2016), 96 pages. (IPR Nos. '255 and '256 Exhibit 2139).
Cravotta, “Operating alone,” EDN (Dec. 5, 2005), 6 pages.
D'Allegro, Meet the Inventor Trying to Bring LiDAR to the Masses, The Drive http://www.thedrive.com/sheetmetal/15567/meet-the-inventor-trying-to-bring-lidar-to-the-masses (Oct. 28, 2017), 5 pages. (IPR Nos. '255 and '256 Exhibit 2072).
Daido, Daido Special Steel Co. home page, https://web.archive.org/web/20051227070229/http:/daido.co.jp/ (Dec. 27, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 1087).
Daido, Daido steel drilling equipment page, https://web.archive.org/web/20050406120958/http:/www.daido.co.jp:80/english/products/applipro/energy/dri.html (Apr. 6, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 1083).
Daido, Daido steel petroleum components, https://web.archive.org/web/20050406121643/http:/www.daido.co.jp:80/english/products/applipro/energy/petro.htm (Apr. 6, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 1084).
Daido, Daido steel rebar page, https://web.archive.org/web/20051201010951/http:/www.daido.co.jp:80/products/stainless/ik_shokai.html (Dec. 1, 2005), 2 pages. (IPR Nos. '255 and '256 Exhibit 1086).
DARPA, 2005 DARPA Challenge Info page https://web.archive.org/web/20051214033009/http:/www.darpa.mil:80/grandchallenge/ (Nov. 17, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 1092).
DARPA, 2005 DARPA Team Papers https://web.archive.org/web/20051213010211/http:/www.darpa.mil:80/grandchallenge/techpapers.html (Dec. 13, 2005), 2 pages. (IPR Nos. '255 and '256 Exhibit 1093).
DARPA, Grand Challenge '05—Frequently Asked Questions, DARPA.com, http://archive.darpa.mil/grandchallenge05/qa.html (2005), 3 pages. (IPR Nos. '255 and '256 Exhibit 2143).
DARPA, Grand Challenge Media—Frequently Asked Questions (Media), DARPA.com, http://archive.darpa.mil/grandchallenge04/media_faq.htm (2004), 3 pages. (IPR Nos. '255 and '256 Exhibit 2142).
DARPA, PDF found on Team DAD paper URL, https://web.archive.org/web/20051213015642/http:/www.darpa.mil:80/grandchallenge/TechPapers/TeamDAD.pdf (Aug. 6, 2005), pp. 1-12. (IPR Nos. '255 and '256 Exhibit 1094).
DARPA, Urban Challenge, DARPA.com, http://archive.darpa.mil/grandchallenge/ (“DARPA Archive”) (2007), 4 pages. (IPR Nos. '255 and '256 Exhibit 2144).
Dehong, et al., Design and Implementation of LiDAR Navigation System Based on Triangulation Measurement, 29th Chinese Control and Decision Conference (CCDC) (May 2017), 59 pages. (IPR Nos. '255 and '256 Exhibit 1136).
Doyle, Velodyne HDL-64E Laser Rangefinder (LIDAR) Pseudo-Disassembled, Hizook (Jan. 4, 2009), 7 pages. (IPR Nos. '255 and '256 Exhibit 2046).
Engineering Toolbox, The Engineering Toolbox Copper Tubes—ASTM B88 Datasheet (last accessed Jul. 10, 2018), 4 pages. (IPR Nos. '255 and '256 Exhibit 2137).
English, et al., The Complementary Nature of triangulation and ladar technologies, 5791 Proceedings of SPIE (May 19, 2005), pp. 29-41. (IPR Nos. '255 and '256 Exhibit 2162).
Esacademy, Betting on CAN, https://web.archive.org/web/20040609170940/http:/www.esacademy.com:80/faq/docs/bettingcan/traditional.htm (Jun. 9, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1089).
European Patent Office, Office Action, App. No. EP 07840406.8 (Mar. 15, 2011) 7 pages.
European Patent Office, Office Action, App. No. EP 11166432.2 (Jan. 29, 2019), 3 pages.
European Patent Office, Office Action, App. No. EP 11166432.2 (Oct. 14, 2016), 4 pages.
European Patent Office, Office Action, App. No. EP 11166432.2 (Oct. 5, 2015), 4 pages.
European Patent Office, Office Action, App. No. EP 11166432.2 (Oct. 7, 2019), 6 pages.
Ewald et al., Object Detection with Laser Scanners for Automotive Applications, IFAC Control in Transportation Systems (2000), pp. 369-372. (IPR Nos. '255 and '256 Exhibit 2191).
Excelitas Technologies, “Avalanche Photodiode. A User Guide”, 2011 Excelitas Technologies Corp., pp. 1-8.
Fast Company, The World's 50 Most Innovative Companies 2017, https://www.fastcompany.com/most-innovative-companies/2017 (last visited Feb. 26, 2018), 5 pages. (IPR Nos. '255 and '256 Exhibit 2077).
Fischer, “Rapid Measurement and Mapping of Tracer Gas Concentrations in a Large Indoor Space” (May 2000), 27 pages.
Ford Media Center, Ford Tripling Autonomous Vehicle Development Fleet, Accelerating on-road Testing of Sensors and Software (Jan. 5, 2016), 4 pages. (IPR Nos. '255 and '256 Exhibit 2066).
Fox, “Active electro-optical systems,” The infrared and electro-optical systems handbook, vol. 6 (1993), pp. 1-80.
Frost et al., Driving the Future of Autonomous Navigation—Whitepaper for Analysis of LIDAR technology for advanced safety, https://velodynelidar.com/docs/papers/FROST-ON-LiDAR.pdf (2016), 30 pages. (IPR Nos. '255 and '256 Exhibit 1130).
Fuerstenberg, et al., Multilayer Laserscanner for Robust Object Tracking and Classification in Urban Traffic Scenes, 9th World Congress on Intelligent Transport Systems (2002), 14 pages. (IPR Nos. '255 and '256 Exhibit 1079), pp. 1-10.
Fuerstenberg, et al., Pedestrian Recognition and Tracking of Vehicles using a vehicle based Multilayer Laserscanner, IEEE (2002), 12 pages. (IPR Nos. '255 and '256 Exhibit 2192).
Fuerstenberg, Pedestrian detection and classification by laserscanners, (2003), 8 pages.
Furstenberg, et al., New Sensor for 360 Vehicle Surveillance—Innovative Approach to Stop & Go, Lane Assistance and Pedestrian Recognition (May 2001), 5 pages. (IPR Nos. '255 and '256 Exhibit 2190).
Gargiulo, Velodyne Lidar Tops Winning Urban Challenge Vehicles, Business Wire (Nov. 6, 2007), 2 pages. (IPR Nos. '255 and '256 Exhibit 2082).
Garmin, How the LIDAR-Lite v3/v3HP works with reflective surfaces, GARMIN.com, https://support.garmin.com/en-US/?faq=IVeHYIKwChAY0qCVhQiJ67 (last visited Aug. 24, 2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2145).
Glennie et al., Temporal Stability of the Velodyne HDL-64E S2 Scanner for High Accuracy Scanning Applications, MDPI Remote Sensing (Mar. 14, 2011), 15 pages. (IPR Nos. '255 and '256 Exhibit 2057).
Glennie, Performance analysis of a kinematic terrestrial LiDAR scanning system, MAPPS/ASPRS 2006 fall conference (Nov. 6-10, 2006), 9 pages.
Glennie, Reign of Point Clouds: A Kinematic Terrestrial LiDAR Scanning System (2007), pp. 22-31.
Gustavson, “Diode-laser radar for low-cost weapon guidance,” SPIE vol. 1633, Laser radar VII (1992), pp. 1-12.
Hall, et al., Team DAD Technical Paper, DARPA Grand Challenge 2005, XP-002543336, Aug. 26, 2005, pp. 1-12. (IPR Nos. '255 and '256 Exhibit 1081).
Hamamatsu, CCD area image sensor S7030/S7031 Series Back-thinned FFT-CCD Datasheet (2006), 8 pages. (IPR Nos. '255 and '256 Exhibit 2123).
Hamamatsu, CCD Image Sensors Webpage (“CCD Image Sensors”) (Feb. 2, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2124).
Hamamatsu, Image Sensor Selection guide (Dec. 2003), 20 pages. (IPR Nos. '255 and '256 Exhibit 2128).
Hamamatsu, Image Sensors Webpage (Mar. 17, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2160).
Hamamatsu, One-dimensional PSD Plastic package, 1-D PSD with plastic package Datasheet (“1-D PSD Datasheet”) (2004), 5 pages. (IPR Nos. '255 and '256 Exhibit 2118).
Hamamatsu, One-Dimensional PSD Webpage, One-dimensional (Mar. 17, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2119).
Hamamatsu, Photodiode Technical Information, 18 pages. (IPR Nos. '255 and '256 Exhibit 2129).
Hamamatsu, Position Sensitive Detectors (“PSDs”) Webpage, One-dimensional and Two-dimensional (Mar. 17, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2117).
Hamamatsu, S4111-46Q Si Photodiode Array Webpage (Oct. 22, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 2135).
Hamamatsu, Si photodiode array—S4111/S4114 series 16, 35, 46 element Si photodiode array for UV to NIR Datasheet (Jul. 2004), 4 pages. (IPR Nos. '255 and '256 Exhibit 2134).
Hamamatsu, Silicon Photodiode Array Webpage (Feb. 2, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2130).
Hamamatsu, Technical Information, SD-25—Characteristics and use of FFT-CCD area image sensor (Aug. 2003), 27 pages. (IPR Nos. '255 and '256 Exhibit 2126).
Hamamatsu, Technical Information, SD-28—Characteristics and use of Si APD (Avalanche Photodiode) (Aug. 2001), 12 pages. (IPR Nos. '255 and '256 Exhibit 2127).
Hamamatsu, Two-dimensional PSD S1300 Datasheet (Dec. 19, 2005), 1 page. (IPR Nos. '255 and '256 Exhibit 2121).
Hamamatsu, Two-dimensional PSDs S1200, S1300, S1880, S1881, S2044—Non-discrete position sensor utilizing photodiode surface resistance Datasheet (2003), 6 pages. (IPR Nos. '255 and '256 Exhibit 2120).
Hamamatsu, Two-dimensional PSDs Webpage (Mar. 17, 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 2122).
Hamamatsu, Opto-Semiconductor Handbook, Si APD, MPPC (Chapter 3), ("APD Handbook"), available at https://www.hamamatsu.com/us/en/hamamatsu/overview/bsd/solid_state_division/related_documents.html (2014), 25 pages. (IPR Nos. '255 and '256 Exhibit 2006).
Hancock, “Laser Intensity Based Obstacle Detecting and Tracking” (Jan. 1999), pp. 45-65.
Haran et al., Infrared Reflectivity of Pedestrian Mannequin for Autonomous Emergency Braking Testing, IEEE 19th International Conference on Intelligent Transportation Systems (ITSC) (2016), 6 pages. (IPR Nos. '255 and '256 Exhibit 2168).
Heenan, et al., Feature-Level Map Building and Object Recognition for Intersection Safety Applications, in Advanced Microsystems for Automotive Applications (Jurgen Valldorf and Wolfgang Gessner eds.) (2005), pp. 505-519. (IPR Nos. '255 and '256 Exhibit 2199).
Hergert et al., The WITS$ guide to selecting a photodetector, Hamamatsu.com, https://hub.hamamatsu.com/us/en/technical-note/WITS-guide-detectorselection/index.html (Jul. 2015), 16 pages. (IPR Nos. '255 and '256 Exhibit 2133).
IBEO, “IBEO about,” https://web.archive.org/web/20040606111631/http:/www.ibeoas.de:80/html/about/about (2004).
IBEO, “IBEO data and prices,” https://web.archive.org/web/20041209025137/http://www.ibeoas.de:80/html/prod/prod_dataprices.html (2004), 2 pages.
IBEO, "IBEO history," https://web.archive.org/web/20040807161657/http:/www.ibeoas.de:80/html/about/ab_history.html (2004), 1 page.
IBEO, “IBEO LD Multilayer data sheet,” https://web.archive.org/web/20031003201743/http://www.ibeoas.de:80/html/prod/prod_Id_multi.html (2003), 1 page.
IBEO, "IBEO Motiv sensor," https://web.archive.org/web/20040113062910/http://www.ibeoas.de:80/html/rd/rd_rs_motiv.htm (1997-2000), 1 page.
IBEO, “IBEO multilayer tech” (2004), 1 page.
IBEO, "IBEO multitarget capability," https://web.archive.org/web/20040323030746/http://www.ibeoas.de:80/html/knho/knho-senstech-mlc.html (2004), 1 page.
IBEO, “IBEO products,” https://web.archive.org/web/20040606115118/http/:www.ibeoas.de:80/html/prod/prod.html (2004), 1 page.
IBEO, “IBEO products,” https://web.archive.org/web/20041011011528/http://www.ibeoas.de:80/html/prod/prod.html (2004), 1 page.
IBEO, “IBEO publications,” https://web.archive.org/web/20031208175052/http://www.ibeoas.de:80/html/public/public.html (2003), 2 pages.
IBEO, “IBEO roadmap,” https://web.archive.org/web/20041209032449/http:/www.ibeoas.de:80/html/prod/prod_roadmap.html (2004), 1 page.
IBEO, “IBEO Time of Flight” (2004), 1 page.
IBEO, “IBEO,” https://web.archive.org/web/20040202131331/http:/www.ibeo-as.de:8 (2004), 1 page.
IBEO, IBEO about page, https://web.archive.org/web/20040606111631/http:/www.ibeoas.de:80/html/about/about (Jun. 6, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1103).
IBEO, IBEO Alasca, https://web.archive.org/web/20031001091407/http:/www.ibeoas.de:80/html/prod/prod_alasca.html (Oct. 1, 2003), 1 page. (IPR Nos. '255 and '256 Exhibit 1099).
IBEO, IBEO Automobile Sensor GmbH—Scanner Technology webpage (Brennan Deposition Exhibit 1) (Mar. 23, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 2171).
IBEO, IBEO Automobile Sensor GmbH—The Alasca project webpage (Brennan Deposition Exhibit 2) (Oct. 6, 2003), 1 page. (IPR Nos. '255 and '256 Exhibit 2172).
IBEO, IBEO Available products, https://web.archive.org/web/20041011011528/http://www.ibeoas.de:80/html/prod/prod.html (Oct. 11, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1108).
IBEO, IBEO data sheet re available products, https://web.archive.org/web/20041209025137/http://www.ibeoas.de:80/html/prod/prod_dataprices.html (Dec. 9, 2004), 2 pages. (IPR Nos. '255 and '256 Exhibit 1107).
IBEO, IBEO history, https://web.archive.org/web/20040807161657/http:/www.ibeoas.de:80/html/about/ab_history.html (Aug. 7, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1104).
IBEO, IBEO home page, https://web.archive.org/web/20040202131331/http:/www.ibeo-as.de:8 (Feb. 2, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1102).
IBEO, IBEO LD Multilayer data sheet, https://web.archive.org/web/20031003201743/http://www.ibeoas.de:80/html/prod/prod_Id_multi.html (Oct. 3, 2003), 1 page. (IPR Nos. '255 and '256 Exhibit 1111).
IBEO, IBEO Motiv sensor, https://web.archive.org/web/20040113062910/http://www.ibeoas.de:80/html/rd/rd_rs_motiv.htm (Jan. 13, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1110).
IBEO, IBEO multilayer tech, (Jan. 8, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1097).
IBEO, IBEO multilayer technology page with moving graphic, Archive.org (Jan. 8, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1096).
IBEO, IBEO multitarget capability, https://web.archive.org/web/20040323030746/http:/www.ibeoas.de:80/html/knho/knho_senstech_mlc.html (Mar. 23, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1101).
IBEO, IBEO products page, https://web.archive.org/web/20040606115118/http:/www.ibeoas.de:80/html/prod/prod.html (Jun. 6, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1100).
IBEO, IBEO publications page, https://web.archive.org/web/20031208175052/http://www.ibeoas.de:80/html/public/public.html (Dec. 8, 2003), 2 pages. (IPR Nos. '255 and '256 Exhibit 1109).
IBEO, IBEO Roadmap, https://web.archive.org/web/20041209032449/http:/www.ibeoas.de:80/html/prod/prod_roadmap.html (Dec. 9, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1105).
IBEO, IBEO time of flight with moving graphic, (Jan. 8, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1095).
IBEO, IBEO Time of Flight, (Jan. 8, 2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1098).
Informed Infrastructure, Velodyne LiDAR Division Announces Agreement with Caterpillar for Laser Imaging Technology, Informed Infrastructure http://informedinfrastructure.com/25630/velodynes-lidar-divisionannounces-agreement-with-caterpillar-for-laser-imaging-technology-2/ (Aug. 8, 2012), 3 pages. (IPR Nos. '255 and '256 Exhibit 2079).
Inter Partes Review Decision Denying Petitioner's Request for Rehearing (May 21, 2020), 26 pages. (IPR No. 2018-00255).
Inter Partes Review Decision: Institution of Inter Partes Review (May 25, 2018), 11 pages. (IPR No. 2018-00255).
Inter Partes Review Decision: Petitioner's Motion to Submit Supplemental Information Pursuant to 37 C.F.R. § 42.123(b) (Aug. 8, 2018), 4 pages. (IPR No. 2018-00255).
Inter Partes Review Declaration of Dr. James F. Brennan III (Nov. 29, 2017), 172 pages. (IPR Nos. '255 and '256 Exhibit 1002).
Inter Partes Review Final Written Decision (May 23, 2019), 40 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Contingent Motion to Amend (Public Version—Redacted) (Sep. 28, 2018), 56 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Preliminary Response (Public Version—Redacted) (Mar. 7, 2018), 72 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Reply in Support of Its Contingent Motion to Amend (Jan. 16, 2019), 33 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Response (Public Version—Redacted) (Sep. 28, 2018), 92 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Surreply (Jan. 16, 2019), 50 pages. (IPR No. 2018-00255).
Inter Partes Review Patent Owner's Updated Exhibit List (Feb. 11, 2019), 21 pages. (IPR No. 2018-00255).
Inter Partes Review Petition for Inter Partes Review of U.S. Pat. No. 7,969,558 (Claims 1-4, 8, and 9) (IPR No. 2018-00255, Quanergy Systems, Inc. v. Velodyne Lidar, Inc.) (Nov. 29, 2017), 67 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner Quanergy's Opposition to Patent Owner's Contingent Motion to Amend (Dec. 21, 2018), 35 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner Quanergy's Sur-Surreply (Jan. 30, 2019), 9 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner Quanergy's Surreply to Patent Owner's Contingent Motion to Amend (Jan. 30, 2019), 17 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner's Motion to Submit Supplemental Information Pursuant to 37 C.F.R. § 42.123(b) (Aug. 6, 2018), 16 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner's Reply to Patent Owner's Response (Dec. 21, 2018), 38 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner's Request for Rehearing (Jun. 24, 2019), 20 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner's Unopposed Motion to Submit Replacement Petition and Supplemental Declaration (Nov. 5, 2018), 9 pages. (IPR No. 2018-00255).
Inter Partes Review Petitioner's Updated Exhibit List (Jan. 30, 2019), 13 pages. (IPR No. 2018-00255).
Inter Partes Review Record of Oral Hearing (Feb. 27, 2019), 126 pages. (IPR Nos. 2018-00255 and 2018-00256).
Inter Partes Review Replacement Petition for Inter Partes Review of U.S. Pat. No. 7,969,558 (Claims 1-4, 8, and 9), 71 pages. (IPR No. 2018-00255).
Inter Partes Review, Chris Butler Affidavit and Exhibit (Dec. 18, 2018), 33 pages. (IPR Nos. '255 and '256 Exhibit 1066).
Inter Partes Review, Chris Butler Affidavit and Exhibit (Dec. 20, 2018), 52 pages. (IPR Nos. '255 and '256 Exhibit 1067).
Inter Partes Review, Decision Denying Petitioner's Request for Rehearing (May 21, 2020), 26 pages. (IPR No. 2018-00256).
Inter Partes Review, Decision: Institution of Inter Partes Review (May 25, 2018), 12 pages. (IPR No. 2018-00256).
Inter Partes Review, Decision: Petitioner's Motion to Submit Supplemental Information Pursuant to 37 C.F.R. § 42.123(b) (Aug. 8, 2018), 4 pages. (IPR No. 2018-00256).
Inter Partes Review, Declaration of Dr. Sylvia Hall-Ellis (Nov. 29, 2017), 93 pages. (IPR Nos. '255 and '256 Exhibit 1041).
Inter Partes Review, Declaration of J. Gary Eden, Ph.D. in Support of Patent Owner's Preliminary Responses (Public Version—Redacted) (Mar. 7, 2018), 120 pages. (IPR Nos. '255 and '256 Exhibit 2003).
Inter Partes Review, Declaration of J. Gary Eden, Ph.D. in Support of Patent Owner's Reply in Support of Its Motion to Amend (Jan. 16, 2019), 71 pages. (IPR Nos. '255 and '256 Exhibit 2202).
Inter Partes Review, Declaration of J. Gary Eden, Ph.D. in Support of Patent Owner's Responses and Motions to Amend (Public Version—Redacted) (Sep. 27, 2018), 202 pages. (IPR Nos. '255 and '256 Exhibit 2115).
Inter Partes Review, Declaration of James F. Brennan, III in Support of Petitioner's Replies and Oppositions to Motions to Amend (Dec. 21, 2018), 93 pages. (IPR Nos. '255 and '256 Exhibit 1063).
Inter Partes Review, Declaration of Sylvia Hall-Ellis (Dec. 21, 2018), 146 pages. (IPR Nos. '255 and '256 Exhibit 1065).
Inter Partes Review, Defendant Velodyne's Answer and Counterclaim, Quanergy Systems, Inc., v. Velodyne Lidar, Inc., No. 5:16-cv-05251-EJD (N.D. Cal.) ECF No. 36 (Dec. 5, 2016), 56 pages. (IPR Nos. '255 and '256 Exhibit 2080).
Inter Partes Review, Deposition of James F. Brennan, III, Quanergy Systems, Inc. v. Velodyne Lidar, Inc., Nos. IPR2018-00255 and IPR2018-00256 (Aug. 23, 2018), 241 pages. (IPR Nos. '255 and '256 Exhibit 2156).
Inter Partes Review, Deposition of James F. Brennan, III, Quanergy Systems, Inc. v. Velodyne Lidar, Inc., Nos. IPR2018-00255 and IPR2018-00256 (Jan. 4, 2019), 267 pages. (IPR Nos. '255 and '256 Exhibit 2194).
Inter Partes Review, Deposition Transcript of J. Gary Eden, Ph.D. (taken Nov. 27, 2018), 285 pages. (IPR Nos. '255 and '256 Exhibit 1064).
Inter Partes Review, Deposition Transcript of J. Gary Eden, Ph.D. (taken on Jan. 22, 2019), 368 pages. (IPR Nos. '255 and '256 Exhibit 1150).
Inter Partes Review, Eden Deposition Exhibit 1—Unmanned Vehicles Come of Age: The DARPA Grand Challenge (2006), pp. 26-29. (IPR Nos. '255 and '256 Exhibit 1151).
Inter Partes Review, Eden Deposition Exhibit 10—Are processor algorithms key to safe self-driving cars?—EDN Asia (https://www.ednasia.com/news/article/areprocessor-algorithms-key-to-safe-self-driving-cars) (Jul. 7, 2016), 7 pages. (IPR Nos. '255 and '256 Exhibit 1160).
Inter Partes Review, Eden Deposition Exhibit 11—Steve Taranovich's profile (https://www.edn.com/user/steve.taranovich) (Jan. 22, 2019), 4 pages. (IPR Nos. '255 and '256 Exhibit 1161).
Inter Partes Review, Eden Deposition Exhibit 12—Instrumentation and Control (http://www.Instrumentation.co.za/article.aspx?pklarticleid=1664) (Feb. 2002), 4 pages. (IPR Nos. '255 and '256 Exhibit 1162).
Inter Partes Review, Eden Deposition Exhibit 13—IBEO on board: ibeo Lux 4L / ibeo Lux 8L / ibeo Lux HD Data Sheet (Jul. 2017), 2 pages. (IPR Nos. '255 and '256 Exhibit 1163).
Inter Partes Review, Eden Deposition Exhibit 2—Driver Reaction Time in Crash Avoidance Research: validation of a Driving Simulator Study on a Test Track; Article in Human Factors and Ergonomics Society Annual Meeting Proceedings, Jul. 2000, 5 pages. (IPR Nos. '255 and '256 Exhibit 1152).
Inter Partes Review, Eden Deposition Exhibit 3—Axis of Rotation diagram (Jan. 22, 2019), 1 page. (IPR Nos. '255 and '256 Exhibit 1153).
Inter Partes Review, Eden Deposition Exhibit 4—Parallel Line and Plane—from Wolfram MathWorld (http://mathworld.wolfram.com/ParallelLineandPlane.html) (Jan. 22, 2019), 1 page. (IPR Nos. '255 and '256 Exhibit 1154).
Inter Partes Review, Eden Deposition Exhibit 5—Quasi-3D Scanning with Laserscanners: Introduction from 2D to 3D (2001), 7 pages. (IPR Nos. '255 and '256 Exhibit 1155).
Inter Partes Review, Eden Deposition Exhibit 6—L-Gage LT3 Long-Range Time-of-Flight Laser Distance-Gauging Sensors (2002), 12 pages. (IPR Nos. '255 and '256 Exhibit 1156).
Inter Partes Review, Eden Deposition Exhibit 7—About Ibeo: Our Mission (https://www.ibeoas.com/aboutibeo) (Jan. 21, 2019), 10 pages. (IPR Nos. '255 and '256 Exhibit 1157).
Inter Partes Review, Eden Deposition Exhibit 8—Automotive Industry; Explore Our Key Industries (https://velodynelidar.com/industry.html) (2019), 6 pages. (IPR Nos. '255 and '256 Exhibit 1158).
Inter Partes Review, Eden Deposition Exhibit 9—Leddar Tech, Solid-State LiDARs: Enabling the Automotive Industry Towards Autonomous Driving (2018), 6 pages. (IPR Nos. '255 and '256 Exhibit 1159).
Inter Partes Review, Excerpt from Beautiful Data, Edited by Toby Segaran and Jeff Hammerbacher (Jul. 2009), pp. 150-153. (IPR Nos. '255 and '256 Exhibit 2014).
Inter Partes Review, Excerpt from James T. Luxon and David E. Parker, Industrial Lasers and Their Applications, Prentice-Hall (1985), pp. 56, 68-70, 124-125, 145, 150-151, and 154-159. (IPR Nos. '255 and '256 Exhibit 2009).
Inter Partes Review, Excerpt from Peter W. Milonni and Joseph Eberly, Lasers (1988), pp. 585-589. (IPR Nos. '255 and '256 Exhibit 2011).
Inter Partes Review, Excerpt from Raymond T. Measures, Laser Remote Sensing, Fundamentals and Applications (1992), pp. 205 and 213-214. (IPR Nos. '255 and '256 Exhibit 2010).
Inter Partes Review, Excerpt from Stephan Lugomer, Laser Technology, Laser Driven Processes, Prentice-Hall (1990), pp. 302-311. (IPR Nos. '255 and '256 Exhibit 2008).
Inter Partes Review, Excerpt from William V. Smith, Laser Applications (1970), pp. 23-27. (IPR Nos. '255 and '256 Exhibit 2012).
Inter Partes Review, Excerpts of Deposition of Craig L. Glennie, Ph.D., Quanergy Systems, Inc., v. Velodyne Lidar, Inc., No. 5:16-cv-05251-EJD (N.D. Cal.) (Jun. 27, 2017), 6 pages. (IPR Nos. '255 and '256 Exhibit 2016).
Inter Partes Review, Final Written Decision (May 23, 2019), 41 pages. (IPR No. 2018-00256).
Inter Partes Review, Images of Generator Rotors (Brennan Deposition Exhibit 8) (2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2178).
Inter Partes Review, Listing of Labelled Substitute Claims (2018), 17 pages. (IPR Nos. '255 and '256 Exhibit 1076).
Inter Partes Review, Patent Owner's Contingent Motion to Amend (Public Version—Redacted) (Sep. 28, 2018), 57 pages. (IPR No. 2018-00256).
Inter Partes Review, Patent Owner's Preliminary Response (Public Version—Redacted) (Mar. 7, 2018), 73 pages. (IPR No. 2018-00256).
Inter Partes Review, Patent Owner's Reply in Support of Its Contingent Motion to Amend (Jan. 16, 2019), 33 pages. (IPR No. 2018-00256).
Inter Partes Review, Patent Owner's Response (Public Version—Redacted) (Sep. 28, 2018), 92 pages. (IPR No. 2018-00256).
Inter Partes Review, Patent Owner's Surreply (Jan. 16, 2019), 50 pages. (IPR No. 2018-00256).
Inter Partes Review, Patent Owner's Updated Exhibit List (Feb. 11, 2019), 20 pages. (IPR No. 2018-00256).
Inter Partes Review, Petition for Inter Partes Review of U.S. Pat. No. 7,969,558 (Claims 16-19 and 23-25) (IPR No. 2018-00256, Quanergy Systems, Inc. v. Velodyne Lidar, Inc.) (Nov. 29, 2017), 73 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner Quanergy's Opposition to Patent Owner's Contingent Motion to Amend (Dec. 21, 2018), 35 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner Quanergy's Sur-Surreply (Jan. 30, 2019), 9 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner Quanergy's Surreply to Patent Owner's Contingent Motion to Amend (Jan. 30, 2019), 17 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner's Motion to Submit Supplemental Information Pursuant to 37 C.F.R. § 42.123(b) (Aug. 6, 2018), 16 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner's Reply to Patent Owner's Response (Dec. 21, 2018), 37 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner's Request for Rehearing (Jun. 24, 2019), 20 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner's Unopposed Motion to Submit Replacement Petition and Supplemental Declaration (Nov. 5, 2018), 9 pages. (IPR No. 2018-00256).
Inter Partes Review, Petitioner's Updated Exhibit List (Jan. 30, 2019), 15 pages. (IPR No. 2018-00256).
Inter Partes Review, PTAB Conference Call, Quanergy Systems, Inc. v. Velodyne Lidar, Inc., Nos. IPR2018-00255 and 2018-00256 (Jan. 11, 2019), 27 pages. (IPR Nos. '255 and '256 Exhibit 2204).
Inter Partes Review, Quanergy Invalidity Contentions Claim Chart, U.S. Pat. No. 7,969,558 (Mizuno), Quanergy Systems, Inc. v. Velodyne LiDAR, Inc., Case No. 5:16-cv-5251-EJD (Mar. 27, 2017), 17 pages. (IPR Nos. '255 and '256 Exhibit 1127).
Inter Partes Review, Quanergy Invalidity Contentions Claim Chart, U.S. Pat. No. 7,969,558 (PILAR), Quanergy Systems, Inc. v. Velodyne LiDAR, Inc., Case No. 5:16-cv-5251-EJD (Mar. 27, 2017), 13 pages. (IPR Nos. '255 and '256 Exhibit 1128).
Inter Partes Review, Quanergy M8 Lidar Sensor Datasheet, 2 pages. (IPR Nos. '255 and '256 Exhibit 2071).
Inter Partes Review, Quanergy Systems Inc.'s Invalidity Contentions and Production of Documents Pursuant to Patent Local Rules 3-3 and 3-4, Quanergy Systems, Inc. v. Velodyne LiDAR, Inc., Case No. 5:16-cv-5251-EJD (Mar. 27, 2017), 24 pages. (IPR Nos. '255 and '256 Exhibit 1126).
Inter Partes Review, Quanergy's Objected-to Demonstrative Slides of Patent Owner (2019), 16 pages. (IPR Nos. '255 and '256 Exhibit 1164).
Inter Partes Review, Redlined Supplemental Declaration of Dr. James F. Brennan III (2018), 171 pages. (IPR Nos. '255 and '256 Exhibit 1062).
Inter Partes Review, Replacement Petition for Inter Partes Review of U.S. Pat. No. 7,969,558 (Claims 16-19 and 23-25) (2018) 76 pages. (IPR No. 2018-00256).
Inter Partes Review, Transcript of Sep. 13, 2018 Conference Call, Quanergy Systems, Inc. v. Velodyne Lidar, Inc., Nos. IPR2018-00255 and IPR2018-00256 (Sep. 13, 2018), 21 pages. (IPR Nos. '255 and '256 Exhibit 2116).
International Electrotechnical Commission, “Safety of laser products—part 1: equipment classification and requirements,” International Standard IEC 60825-1, edition 1.2 (Aug. 2001), 122 pages.
International Electrotechnical Commission, “Safety of laser products—part 1: equipment classification and requirements,” International Standard IEC 60825-1, edition 2.0 (2007), 104 pages.
Internet Archive Web Page: Laser Components (2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1023).
Internet Archive Web Page: Laser Components: High Powered Pulsed Laser Diodes 905D3J08-Series (2004), 6 pages. (IPR Nos. '255 and '256 Exhibit 1024).
Internet Archive Webpage: Mercotac 3-Conductor Rotary Electrical Connectors (Mar. 2006), 1 page. (IPR Nos. '255 and '256 Exhibit 1031).
IPO Education Foundation, Inventor of the Year Award, https://www.ipoef.org/inventor-of-the-year/ (2018), 5 pages. (IPR Nos. '255 and '256 Exhibit 2207).
irdajp.org, IrDA Infrared Data Association, http://www.irdajp.org/irdajp.info (2018), 3 pages. (IPR Nos. '255 and '256 Exhibit 1134).
Janocha, Actuators: Basics and Applications, Springer (2004), pp. 85-153. (IPR Nos. '255 and '256 Exhibit 1080).
Japanese Patent Office, Petitioner's Translation of Mizuno Japanese Patent Publication No. H3-6407 (1991), 15 pages. (IPR Nos. '255 and '256 Exhibit 1058).
Jelalian, “Laser Radar Systems” (1992), 1 page.
Juberts, et al., “Status report on next generation LADAR for driving unmanned ground vehicles” Mobile Robots XVII, edited by Douglas W. Gage, Proceedings of SPIE, vol. 5609, 2004, pp. 1-12.
Kaempchen, Feature-Level Fusion of Laser Scanner and Video Data for Advanced Drive Assistance Systems (Ph.D. Dissertation, Ulm University) (2007), 248 pages. (IPR Nos. '255 and '256 Exhibit 2198).
Kaufmann, Choosing Your Detector, OE Magazine (Mar. 2005), 3 pages. (IPR Nos. '255 and '256 Exhibit 2150).
Kaufmann, Light Levels and Noise—Guide Detector Choices, Photonics Spectra 149 (Jul. 2000), 4 pages. (IPR Nos. '255 and '256 Exhibit 2151).
Kawata, “Development of ultra-small lightweight optical range sensor system”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, Aug. 2-6, 2005, pp. 58-63 (IPR Nos. '255 and '256 Exhibit 1033).
Kilpela, Excerpt of Pulsed Time-of-Flight Laser Range Finder Techniques for Fast, High Precision Measurement Applications, at Fig. 24 (Academic dissertation, University of Oulu (Brennan Deposition Exhibit 15) (2004), 1 page. (IPR Nos. '255 and '256 Exhibit 2185).
Kilpela, Pulsed Time-of-Flight Laser Range Finder Techniques for Fast, High Precision Measurement Applications (Academic dissertation, University of Oulu) (2004), 98 pages. (IPR Nos. '255 and '256 Exhibit 2152).
Kilpelä, “Precise pulsed time-of-flight laser range finder for industrial distance measurements,” Review of Scientific Instruments (Apr. 2001), 13 pages. (IPR Nos. '255 and '256 Exhibit 1005).
Kluge, Laserscanner for Automotive Applications (May 2001), 5 pages. (IPR Nos. '255 and '256 Exhibit 2196).
Kohanbash, “LIDAR fundamentals—robots for roboticists” (May 5, 2014), 6 pages.
Lages, Laserscanner for Obstacle Detection in Advanced Microsystems for Automotive Applications Yearbook (S. Kruger et al. eds.) (2002), pp. 136-140. (IPR Nos. '255 and '256 Exhibit 2200).
Lamon, "The SmartTer for ELROB 2006—a vehicle for fully autonomous navigation and mapping in outdoor environments" (2005), 14 pages.
Langheim, et al., Sensing of Car Environment at Low Speed Driving, Carsense (2002), 14 pages. (IPR Nos. '255 and '256 Exhibit 2193).
Laser Components Produkte [Products], Laser Components IG, Inc., 2004, 1 page.
Laser Components, “High Power Pulsed Laser Diodes 905D3J08-Series”, Laser Components IG, Inc., 2004, 6 pages.
Laser Components, https://web.archive.org/web/20041205172904/http://www.lasercomponents.com (2004), 1 page. (IPR Nos. '255 and '256 Exhibit 1023).
Liu, et al., “Coupling Study of a Rotary Capacitive Power Transfer System” Industrial Technology, 2009. ICIT 2009. IEEE International Conference, IEEE, Piscataway, NJ, USA, Feb. 10, 2009, pp. 1-6.
Maatta et al., A High-Precision Time-to-Digital Converter for Pulsed Time-of-Flight Laser Radar Applications, 47 IEEE No. 2, 521 (Apr. 1998), pp. 521-536. (IPR Nos. '255 and '256 Exhibit 2161).
Macadam, Understanding and Modeling the Human Driver, 40 Vehicle System Dynamics, Nos. 1-3 (2003), pp. 101-134. (IPR Nos. '255 and '256 Exhibit 2205).
Makynen, Position-Sensitive Devices and Sensor System for Optical Tracking and Displacement Sensing Applications (Academic Dissertation, University of Oulu) (2000), 121 pages. (IPR Nos. '255 and '256 Exhibit 2153).
Manandhar, “Auto-Extraction of Urban Features from Vehicle-Borne Laser Data”, Centre for Spatial Information Science, The University of Tokyo, Japan; Symposium on Geospatial Theory, Processing Applications, Ottawa (2002) 6 pages. (IPR Nos. '255 and '256 Exhibit 1017).
Marino, “Jigsaw: A Foliage-Penetrating 3D Imaging Laser Radar System” (2005), pp. 23-36.
Marino, “A compact 3D imaging laser Radar system using Geiger-mode APD arrays: system and measurements,” Proceedings of SPIE—The international society for optical engineering (Aug. 2003), 16 pages.
McManamon, “Optical Phased Array Technology,” Proceedings of the IEEE, vol. 84, No. 2 (Feb. 1996), pp. 268-298.
Melle, et al., “How to select avalanche photodiodes,” Laser Focus World (Oct. 1, 1995), 9 pages. (IPR Nos. '255 and '256 Exhibit 1020).
Mercotac Model 305, Electrical Slip Rings, https://web.archive.org/web/200602100652519/www.mercotac.com/html/305.htm (Feb. 2006), 3 pages.
Mercotac, 3-Conductor Rotary Electrical Connectors https://web.archive.org/web/20060317120209/http://www.mercotac.com:80/html/threeconductor.html (Mar. 2006), 1 page.
Merriam, How to Use Lidar with the Raspberry Pi, Hackaday, https://hackaday.com/2016/01/22/how-to-use-lidar-with-the-raspberry-pi/ (Jan. 22, 2016), 13 pages. (IPR Nos. '255 and '256 Exhibit 1072).
Morsy et al., “Multispectral LiDAR Data for Land Cover Classification of Urban Areas,” Sensors 17(5), 958 (2017), 21 pages.
MTI Instruments Inc., An Introduction to Laser Triangulation Sensors, https://www.azosensors.com/article.aspx?ArticleID=523 (Aug. 28, 2014), 9 pages. (IPR Nos. '255 and '256 Exhibit 2154).
Nagappan, “Adaptive Cruise Control: Laser Diodes as an Alternative to Millimeter Wave Radars” (Sep. 2005), pp. 1-5.
National Highway Traffic Safety Administration (NHTSA), DOT, Final Rule Federal Motor Vehicle Safety Standards; Tire Pressure Monitoring Systems Controls and Displays (2005), 222 pages. (IPR Nos. '255 and '256 Exhibit 1141).
Office of the Federal Register National Archives and Records Administration, “Code of Federal Regulations, 21, Parts 800 to 1299, Revised as of Apr. 1, 2005, Food and Drugs”, Apr. 1, 2005, pp. 1-23.
Ogurtsov, et al., “High Accuracy ranging with Yb3+-doped fiber-ring frequency-shifted feedback laser with phase-modulated seed,” Optics Communications (2006), pp. 266-273. (IPR Nos. '255 and '256 Exhibit 1042).
Ohnsman, How A 34-Year-Old Audio Equipment Company is Leading the Self-Driving Car Revolution, Forbes (Aug. 8, 2017), 7 pages. (IPR Nos. '255 and '256 Exhibit 2040).
Ohr, “War raises stakes of next DARPA bot race,” EDN (Aug. 15, 2005), 3 pages.
Omron, Technical Explanation for Displacement Sensors and Measurement Sensors, CSM_Displacemente_LineWidth_TG_E_2_1 (2018), 8 pages. (IPR Nos. '255 and '256 Exhibit 2149).
Oshkosh, “Team Terramax: DARPA Grand Challenge 2005” (Oct. 2005), pp. 1-14.
Ou-Yang, et al., “High-dynamic-range laser range finders based on a novel multimodulated frequency method,” Optical Engineering (Dec. 2006), 6 pages. (IPR Nos. '255 and '256 Exhibit 1043).
Overton, First Sensor expands supply agreement for APDs used in Velodyne lidar systems, Laser Focus World (Feb. 15, 2017), 2 pages. (IPR Nos. '255 and '256 Exhibit 2039).
Ozguner, “Team TerraMax and the DARPA Grand Challenge: a General Overview,” IEEE Intelligent Vehicles Symposium (2004), 6 pages.
Panasonic, Measurement Sensors: Specular vs Diffuse, Panasonic Blog, https://na.industrial.panasonic.com/blog/measurement-sensors-specular-vs-diffuse (Dec. 7, 2011), 2 pages. (IPR Nos. '255 and '256 Exhibit 2155).
PCT International Search Report and Written Opinion, App. No. PCT/US2007/073490 (2008), 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2010/037129, Jul. 27, 2010, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015869, Apr. 10, 2017, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015874, May 23, 2017, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/015877, Apr. 13, 2017, 13 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023259, May 31, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023261, May 26, 2017, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/023262, Jun. 5, 2017, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/036865, Sep. 26, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/047543, Nov. 27, 2017, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/059452, Jan. 16, 2019, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/023283, Jun. 1, 2018, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/025395, Jun. 25, 2018, 14 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/050934, Nov. 20, 2018, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/051497, Nov. 28, 2018, 11 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2018/059062, Jan. 16, 2019, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/016259, Apr. 26, 2019, 6 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046422, Dec. 3, 2019, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046573, Nov. 15, 2019, 9 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/051729, Nov. 20, 2019, 7 pages.
PCT Search Report and Written Opinion (Corrected), App. No. PCT/US2020/026925, May 12, 2020, 5 pages.
PCT Search Report and Written Opinion, App. No. PCT/US2020/012633, Jun. 2, 2020, 13 pages.
PCT Search Report and Written Opinion, App. No. PCT/US2020/012635, Jun. 4, 2020, 10 pages.
Search Report and Opinion, EP App. No. 07840406.8, Sep. 8, 2009, 6 pages.
Search Report and Opinion, EP App. No. 11166432.2, Jul. 28, 2011, 7 pages.
Search Report and Opinion, EP App. No. 17745112.7, Aug. 27, 2019, 8 pages.
Search Report and Opinion, EP App. No. 17770748.6, Oct. 22, 2019, 10 pages.
Search Report and Opinion, EP App. No. 17770926.8, Oct. 29, 2019, 11 pages.
Search Report and Opinion, EP App. No. 17770928.4, Oct. 29, 2019, 10 pages.
Search Report and Opinion, EP App. No. 17807474.6, Dec. 9, 2019, 9 pages.
Piatek et al., LiDAR: A photonics guide to autonomous vehicle market, Hamamatsu.com, https://hub.hamamatsu.com/us/en/application-note/LiDAR-competing-technologies-automotive/index.html (Nov. 18, 2017), 6 pages. (IPR Nos. '255 and '256 Exhibit 2136).
Piatek, Measuring distance with light, Hamamatsu.com, https://hub.hamamatsu.com/us/en/application-note/measuring-distance-with-light/index.html (Apr. 2, 2015), 18 pages. (IPR Nos. '255 and '256 Exhibit 2132).
Piatek, Presentation entitled ‘LiDAR and Other Techniques—Measuring Distance with Light for Automotive Industry’, authored by Slawomir Piatek, Technical Consultant, Hamamatsu Corp. (Dec. 6, 2017), 66 pages. (IPR Nos. '255 and '256 Exhibit 2131).
Popper, Guiding Light, The Billion-Dollar Widget Steering the Driverless Car Industry, The Verge (Oct. 18, 2017), 17 pages. (IPR Nos. '255 and '256 Exhibit 2076).
Qing, “Method of 3D visualization using laser radar on board of mobile robot,” Journal of Jilin University (Information Science Ed.), vol. 22 (Jul. 2004), 4 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Case No. 5:16-cv-05251, “Plaintiff Quanergy Systems, Inc.'s Amended Invalidity Contentions Pursuant to Patent Local Rule 3-3,” May 23, 2017, 238 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Case No. 5:16-cv-05251, “Plaintiff Quanergy Systems, Inc.'s Invalidity Contentions and Production of Documents Pursuant to Patent Local Rules 3-3 and 3-4,” Mar. 27, 2017, 24 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Case No. 5:16-cv-05251, Amended Complaint, Nov. 18, 2016, 6 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Case No. 5:16-cv-05251, Answer to Counterclaim, (Jan. 16, 2017) 9 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Case No. 5:16-cv-05251, Defendant Velodyne's Answer and Counterclaim, Dec. 5, 2016, 20 pages.
Quanergy Systems, Inc. v. Velodyne LiDAR, Inc. (N.D. Cal.), Complaint, Case No. 5:16-cv-05251 (Sep. 13, 2016), 21 pages.
Ramsey et al., Use Scenarios to Plan for Autonomous Vehicle Adoption, Gartner (Jun. 26, 2017), 17 pages. (IPR Nos. '255 and '256 Exhibit 2064).
Reutebuch, “LiDAR: an Emerging Tool for Multiple Resource Inventory,” Journal of Forestry (Sep. 2005) 7 pages.
Reymann et al., Improving LiDAR Point Cloud Classification using Intensities and Multiple Echoes, IEEE/RSJ International Conference on Intelligent Robots and Systems (Sep. 2015), 8 pages. (IPR Nos. '255 and '256 Exhibit 2167).
Richmond et al., Polarimetric Imaging Laser Radar (PILAR) Program. In Advanced Sensory Payloads for UAV, Meeting Proceedings RTO-MP-SET-092, Paper 19. Neuilly-sur-seine, France: RTO (May 1, 2005), 35 pages. (IPR Nos. '255 and '256 Exhibit 1129).
Riegl LMS-Q120, http://web.archive.org/web/20050113054822/http://www.riegl.com/industrial_scanners/lms_q120_/q120_all_.htm (2005), 4 pages.
Riegl, “Riegl LMS-Z210” (2003), 8 pages.
Robots for Roboticists, LIDAR Fundamentals, http://robotsforroboticists.com/lidar-fundamentals/ (May 5, 2014), 6 pages. (IPR Nos. '255 and '256 Exhibit 1068).
Ros-Drivers—Error in packet rate for the VLP-32C #142, GitHub Forum (Jan. 29, 2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2041).
Saleh, “Fundamentals of Photonics” vol. 2, Wiley-Interscience Publication, 1991, pp. 342-383, 494-541, and 592-695. (IPR Nos. '255 and '256 Exhibit 1008).
Sensick, “DME 2000 / DME 3000: Precise non-contact distance determination,” Sensick Catalogue (2006), pp. 450-457. (IPR Nos. '255 and '256 Exhibit 1073).
Sick DME 2000 Operating Instructions (Excerpt) (Brennan Deposition Exhibit 9) (May 2002), 42 pages. (IPR Nos. '255 and '256 Exhibit 2179).
Sick Laser Triangulation Sensors Product Information (Brennan Deposition Exhibit 6) (Jun. 25, 2018), 76 pages. (IPR Nos. '255 and '256 Exhibit 2176).
Sick LMS 200/ LMS 211/ LMS 220 / LMS 221/ LMS 291 Laser Measurement Systems—Technical Description (Brennan Deposition Exhibit 4) (Jun. 2003), 40 pages. (IPR Nos. '255 and '256 Exhibit 2174).
Sick LMS200/211/221/291 Laser Measurement Systems—Technical Description (Brennan Deposition Exhibit 3) (2006), 48 pages. (IPR Nos. '255 and '256 Exhibit 2173).
Sick Sensick Measuring Distance with Light—Distance Sensors Product Overview (Brennan Deposition Exhibit 10) (2004), 12 pages. (IPR Nos. '255 and '256 Exhibit 2180).
Sick, Sick ToF sensors at close range, https://web.archive.org/web/20040607070720/http://www.sick.de:80/de/products/categories/industrial/distancesensors/dme2000/en.html (Jun. 7, 2004), 2 pages. (IPR Nos. '255 and '256 Exhibit 1082).
Singh, “Cyclone: A Laser Scanner for Mobile Robot Navigation” (Sep. 1991), pp. 1-18.
Skolnik, “Introduction to radar systems,” Second edition, McGraw-Hill book company (1980), pp. 1-3.
Skolnik, “Radar Handbook” Second Edition, McGraw-Hill Publishing Company, 1990, pp. 1-1191.
Song et al., Assessing the Possibility of Land-Cover Classification Using LiDAR Intensity Data, Commission III, PCV02 (2002), 4 pages. (IPR Nos. '255 and '256 Exhibit 2169).
Spies, “Extended Eyes—Sense and Avoid,” Presented at the 2006 International Aerospace Exhibition, Berlin (May 2006), 22 pages.
Stone, “Performance analysis of next-generation LADAR for manufacturing, construction, and mobility” (May 2004), 198 pages.
Strang, Drawing of cross-section of I-beam by Jonathan Strang (Brennan Deposition Exhibit 5) (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2175).
strata-gee.com, Velodyne President Calls Strata-gee to Set the Record Straight, https://www.strata-gee.com/velodyne-president-calls-strata-gee-setrecord-straight/ (Jun. 26, 2014), 6 pages. (IPR Nos. '255 and '256 Exhibit 1137).
Strawa et al., The Measurement of Aerosol Optical Properties Using Continuous Wave Cavity Ring-Down Techniques, 20 Journal of Atmospheric and Oceanic Technology 454 (Apr. 2003), pp. 454-465. (IPR Nos. '255 and '256 Exhibit 2090).
Tarakanov, et al., "Picosecond pulse generation by internal gain switching in laser diodes," Journal of Applied Physics 95:2223 (Mar. 2004), pp. 2223-2229. (IPR Nos. '255 and '256 Exhibit 1044).
Taranovich, Are processor algorithms key to safe self-driving cars? EDN Asia, https://www.ednasia.com/news/article/are-processor-algorithms-key-to-safe-self-driving-cars (Jul. 7, 2016), 11 pages. (IPR Nos. '255 and '256 Exhibit 2206).
Taylor, An Introduction to Error Analysis—The Study of Uncertainties in Physical Measurements, Oxford University Press (1982), pp. 81-137. (IPR Nos. '255 and '256 Exhibit 1138).
The American Heritage Dictionary of the English Language, Houghton Mifflin Company, 3d ed. (1996), pp. 1497, 1570, 1697, 1762, and 1804. (IPR Nos. '255 and '256 Exhibit 1018).
The American Society of Mechanical Engineers, Welded and Seamless Wrought Steel Pipe, ASME B36.10M-2004 (Oct. 25, 2004), 26 pages. (IPR Nos. '255 and '256 Exhibit 2138).
The Laser Institute of America, “American National Standard of Safe Use of Lasers” ANSI Z136.1-2000, Revision of ANSI Z136.1-1993, Second Printing 2003, 32 pages.
Thin Lens Equation, http://hyperphysics.phy-astr.gsu.edu/hbase/geoopt/lenseq.html (last visited Dec. 30, 2018) (Brennan Deposition Exhibit 7), 4 pages. (IPR Nos. '255 and '256 Exhibit 2177).
Thomas, “A procedure for multiple-pulse maximum permissible exposure determination under the Z136.1-2000 American national standard for safe use of lasers,” Journal of Laser Applications, Aug. 2001, vol. 13, No. 4, pp. 134-140.
Thrun, “Probabilistic Terrain Analysis for High-Speed Desert Driving” (Oct. 2005), 7 pages.
Trepagnier, “Team gray technical paper,” DARPA grand challenge 2005 (Aug. 28, 2005), 14 pages.
Turk, et al., VITS—A Vision System for Autonomous Land Vehicle Navigation, 10 IEEE No. 3 (May 1988), pp. 342-361. (IPR Nos. '255 and '256 Exhibit 2147).
U.S. District Court, Claim Construction Order, Quanergy Systems, Inc. v. Velodyne LiDAR, Inc., Case No. 5:16-cv-5251-EJD (Oct. 4, 2017), 33 pages. (IPR Nos. '255 and '256 Exhibit 1027).
U.S. International Trade Commission, Investigation No. 337-TA-1173, “Respondent Hesai Photonics Technology Co., Ltd.'s Notice of Prior Art,” Nov. 13, 2019, 35 pages.
U.S. International Trade Commission, Investigation No. 337-TA-1173, “Respondent Robosense's Notice of Prior Art,” Nov. 13, 2019, 34 pages.
U.S. Patent Office, Information Disclosure Statement, U.S. Appl. No. 10/391,383 (U.S. Pat. No. 7,130,672, Pewzner) (Aug. 3, 2005), 8 pages.
U.S. Patent Office, Information Disclosure Statement, U.S. Appl. No. 10/508,232 (U.S. Pat. No. 7,313,424, Mayevsky) (Apr. 21, 2006), 17 pages.
Ullrich, et al., “High-performance 3D-imaging laser sensor,” Proceedings of SPIE vol. 3707 (Jun. 1999), pp. 658-664. (IPR Nos. '255 and '256 Exhibit 1014).
Urmson, “High speed navigation of unrehearsed terrain: red team technology for grand challenge 2004” (Jun. 1, 2004), 47 pages.
USGS, EROS CalVal Center of Excellence (ECCOE), https://calval.cr.usgs.gov/wordpress/wp-content/uploads/JACIE_files/JACIE06/Files/312Habib.pdf (Dec. 21, 2018), 3 pages. (IPR Nos. '255 and '256 Exhibit 1071).
Uwinnipeg, Centripetal Acceleration, Uwinnipeg.ca, http://theory.uwinnipeg.ca/physics/circ/node6.html (1997), 2 pages. (IPR Nos. '255 and '256 Exhibit 2157).
Velodyne Acoustics, Inc., Motor Specification, Merlin Project, Rev. E1 Initial Engineering Release (Apr. 29, 2009), 1 page. (IPR Nos. '255 and '256 Exhibit 2020).
Velodyne Acoustics, Inc., Motor Winding Specs., P2.0, E2 Changed Material (Mar. 10, 2010), 1 page. (IPR Nos. '255 and '256 Exhibit 2022).
Velodyne Acoustics, Inc., Outline Drawing HDL-64E S3 Envelope Drawing, Rev. A (Apr. 21, 2015), 1 page. (IPR Nos. '255 and '256 Exhibit 2094).
Velodyne LIDAR Products, PowerPoint (Jan. 18, 2017), 9 pages. (IPR Nos. '255 and '256 Exhibit 2031).
Velodyne LIDAR, CAD Drawing of MotorStat-38in, HDL-64E (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2063).
Velodyne LIDAR, CAD Drawing of MotorStat3in, HDL-64E (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2021).
Velodyne LIDAR, CAD Drawing of Rotor, HDL-64E (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2026).
Velodyne LIDAR, CAD Drawing of RotorAI, HDL-64E (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2027).
Velodyne LIDAR, Envelope Hi Res VLP-16 Drawings, Rev. A (Jun. 30, 2016), 4 pages. (IPR Nos. '255 and '256 Exhibit 2061).
Velodyne LIDAR, Excerpts of VLP-32C User Manual, 63-9325 Rev. B (2018), 26 pages. (IPR Nos. '255 and '256 Exhibit 2034).
Velodyne LIDAR, First Sensor Annual Report (2016), pp. 1-143. (IPR Nos. '255 and '256 Exhibit 2038).
Velodyne LIDAR, HDL-32E Data Sheet (2017), 2 pages. (IPR Nos. '255 and '256 Exhibit 2042).
Velodyne LIDAR, HDL-32E Envelope Drawing (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2043).
Velodyne LIDAR, HDL-32E Supported Sensors, PolySync Docs 2.3.2, http://docs.polysync.io/sensors/velodyne-hdl-32e/ (2018), 7 pages. (IPR Nos. '255 and '256 Exhibit 2055).
Velodyne LIDAR, HDL-32E User's Manual and Programming Guide (Aug. 2016), 29 pages. (IPR Nos. '255 and '256 Exhibit 2044).
Velodyne LIDAR, HDL-64E Data Sheet (2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2069).
Velodyne LIDAR, HDL-64E S2 and S2.1 User's Manual and Programming Guide (Nov. 2012), 43 pages. (IPR Nos. '255 and '256 Exhibit 2050).
Velodyne LIDAR, HDL-64E S2 Datasheet (Mar. 2010), 2 pages. (IPR Nos. '255 and '256 Exhibit 2047).
Velodyne LIDAR, HDL-64E S3 Data Sheet (2016), 2 pages. (IPR Nos. '255 and '256 Exhibit 2048).
Velodyne LIDAR, HDL-64E S3 User's Manual and Programming Guide (May 2013), 54 pages. (IPR Nos. '255 and '256 Exhibit 2051).
Velodyne LIDAR, HDL-64E User's Manual (Mar. 2008), 21 pages. (IPR Nos. '255 and '256 Exhibit 2052).
Velodyne Lidar, Inc. v. Hesai Photonics Technology Co., Ltd. (N.D. Cal.), Complaint, Case No. 5:19-cv-04742 (Aug. 13, 2019), 13 pages.
Velodyne Lidar, Inc. v. Sunteng Innovation Technology Co., Ltd. (“Robosense”) (N.D. Cal.), Complaint, Case No. 5:19-cv-04746 (Aug. 13, 2019), 13 pages.
Velodyne LIDAR, Inc., Production Worksheet Detector, Item #24-AD5009 in Production, AD500-9 NIR Photodiode (Jan. 18, 2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2024).
Velodyne LIDAR, Inc., Production Worksheet, Item #30-AD230CER2 in Production, APD, 230UM, Ceramic Submount (Jan. 17, 2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2023).
Velodyne LIDAR, It Began With a Race . . . 16 Years of Velodyne LiDAR, Velodyne LiDAR Blog, available at http://velodynelidar.com/blog/it-began-with-a-race/ (2018), 8 pages. (IPR Nos. '255 and '256 Exhibit 2070).
Velodyne LIDAR, Product Guide (2018), 1 page. (IPR Nos. '255 and '256 Exhibit 2058).
Velodyne LIDAR, Puck, Real-time 3D LiDAR Sensor, VLP-16 Data Sheet (2017), 2 pages. (IPR Nos. '255 and '256 Exhibit 2060).
Velodyne LIDAR, Ultra Puck™ VLP-32 Data Sheet (2014), 2 pages. (IPR Nos. '255 and '256 Exhibit 2032).
Velodyne LIDAR, Velodyne Donates LiDAR and Robotic Artifacts to Smithsonian, Point of Engineering, Point of Beginning (May 23, 2011), 2 pages. (IPR Nos. '255 and '256 Exhibit 2078).
Velodyne LIDAR, VLP-16 User's Manual and Programming Guide (Mar. 2016), 49 pages. (IPR Nos. '255 and '256 Exhibit 2062).
Velodyne LIDAR, VLP-32C User Manual, 63-9325 Rev. B. (Feb. 2, 2018), 136 pages. (IPR Nos. '255 and '256 Exhibit 2114).
Velodyne LIDAR, Webserver User Guide VLP-16 & HDL-32E (63-6266 Rev A) (Nov. 2015), 32 pages. (IPR Nos. '255 and '256 Exhibit 2013).
Velodyne LIDAR, White Paper, Velodyne's HDL-64E: A High Definition Lidar Sensor for 3-D Applications (Oct. 2007), 7 pages. (IPR Nos. '255 and '256 Exhibit 2059).
Velodyne, Velodyne—High Definition Lidar—Overview https://web.archive.org/web/20071107104255/http://www.velodyne.com:80/lidar/products/overview.aspx (Nov. 7, 2007), 1 page. (IPR Nos. '255 and '256 Exhibit 1091).
Velodyne, Velodyne HDL Applications, https://web.archive.org/web/20080716041931/http://www.velodyne.com:80/lidar/technology/applications.aspx (Jul. 16, 2008), 1 page. (IPR Nos. '255 and '256 Exhibit 1106).
Velodyne, Velodyne HDL-64E user manual, https://web.archive.org/web/20081117092628/http://www.velodyne.com/lidar/products/manual/HDL-64E%20Manual.pdf (Nov. 17, 2008), 23 pages. (IPR Nos. '255 and '256 Exhibit 1090).
Velodynelidar, Data to Improve the Cost, Convenience and Safety of Motor Vehicles, https://velodynelidar.com/industry.html (2018), 6 pages. (IPR Nos. '255 and '256 Exhibit 1125).
Weber, Where to? A History of Autonomous Vehicles, Computer History Museum, https://support.garmin.com/en-US/?faq=IVeHYIKwChAY0qCVhQiJ67 (May 8, 2014), 23 pages. (IPR Nos. '255 and '256 Exhibit 2146).
Westinghouse, “AN/TPS-43 E Tactical Radar System” (1999), pp. 1-14.
Widmann, “Development of Collision Avoidance Systems at Delphi Automotive Systems” (1998), pp. 353-358.
Wikipedia, “Laser” (Nov. 10, 2017), 25 pages. (IPR Nos. '255 and '256 Exhibit 1022).
Willhoeft et al., "Quasi-3D Scanning with Laserscanners," IBEO Automobile Sensor, 8th World Congress on Intelligent Transport Systems—Quasi-3D Scanning (2001), 12 pages. (IPR Nos. '255 and '256 Exhibit 1077).
Williams, Bias Voltage and Current Sense Circuits for Avalanche Photodiodes—Feeding and Reading the APD, Linear Technology AN92-1 (Nov. 2012), 32 pages. (IPR Nos. '255 and '256 Exhibit 2125).
Williams, Driverless cars yield to reality: It's a long road ahead, PC World (Jul. 8, 2013), 6 pages. (IPR Nos. '255 and '256 Exhibit 2073).
Wulf et al., "Fast 3D Scanning Methods for Laser Measurement Systems," CSCS-14, 14th Int'l Conference on Control Systems and Computer Science (Jul. 2003), pp. 312-317. (IPR Nos. '255 and '256 Exhibit 1078).
Wulf, “2D Mapping of Cluttered Indoor Environments by Means of 3D Perception,” Proceedings of the 2004 IEEE International Conference on Robotics & Automation (Apr. 2004), pp. 4204-4209.
Yang, et al., "Performance of a large-area avalanche photodiode at low temperature for scintillation detection," Nuclear Instruments and Methods in Physics Research (2003), pp. 388-393. (IPR Nos. '255 and '256 Exhibit 1034).
Yu et al., A New 3D Map Reconstruction Based Mobile Robot Navigation, IEEE (2006), 4 pages. (IPR Nos. '255 and '256 Exhibit 2189).
Zappa, et al., SPADA: Single-Photon Avalanche Diode Arrays, IEEE Photonics Technology Letters, vol. 17, No. 3 (Mar. 2005), 9 pages. (IPR Nos. '255 and '256 Exhibit 1135).
Zhao, “A vehicle-borne urban 3-D acquisition system using single-row laser range scanners,” IEEE transactions on systems, man, and cybernetics, vol. 33, No. 4 (Aug. 2003), pp. 658-666.
Zhao, “Reconstructing Textured CAD Model of Urban Environment Using Vehicle-Borne Laser Range Scanners and Line Cameras,” Lecture Notes in Computer Science, vol. 2095 (2001), pp. 284-297.
Zheng, “The Technique of Land 3D Laser Scanning and Imaging Surveying,” Railway Aerial Survey, vol. 2 (2003), 3 pages.
ULTRA Puck, VLP-32C Data Sheet (2018), 2 pages. (IPR Nos. '255 and '256 Exhibit 2093).
PCT International Search Report and Written Opinion, App. No. PCT/US2018/031682, Sep. 17, 2018, 12 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2017/035427, Aug. 29, 2017, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046412, Jun. 24, 2020, 10 pages.
PCT International Search Report and Written Opinion, App. No. PCT/US2019/046419, Oct. 29, 2019, 14 pages.
Extended European Search Report, EP App. No. 20739207.7, Jan. 10, 2023, 12 pages.
Related Publications (1)
Number: 20200217954 A1; Date: Jul. 2020; Country: US