LiDAR detection systems and methods

Information

  • Patent Grant
  • Patent Number
    11,675,050
  • Date Filed
    Tuesday, January 8, 2019
  • Date Issued
    Tuesday, June 13, 2023
  • Original Assignees
    • Innovusion, Inc. (Sunnyvale, CA, US)
  • Examiners
    • Ko; Tony
  • Agents
    • Mauriel Kapouytian Woods LLP
    • Ma; Wensheng
    • Huang; Liang
Abstract
Embodiments discussed herein refer to a relatively compact and energy-efficient LiDAR system that can be mounted to the interior cabin side of a vehicle's windshield. To accommodate the system's relatively compact size, multiple moveable components are used to ensure that a desired resolution is captured across the system's field of view.
Description
FIELD OF THE INVENTION

The present disclosure relates to light detection and ranging (LiDAR), and in particular to LiDAR systems and methods for use in a vehicle.


BACKGROUND

Systems exist that enable vehicles to be driven semi-autonomously or fully autonomously. Such systems may use one or more range finding, mapping, or object detection systems to provide sensory input to assist in semi-autonomous or fully autonomous vehicle control. Many of these systems are relatively large and bulky and thus require a relatively large amount of space on the vehicle. In addition, many of these systems are relatively power intensive.


It may be desirable to integrate a relatively compact and energy efficient LiDAR system within a vehicle.


BRIEF SUMMARY

Embodiments discussed herein refer to a relatively compact and energy-efficient LiDAR system that can be mounted to the interior cabin side of a vehicle's windshield. To accommodate the system's relatively compact size, multiple moveable components are used to ensure that a desired resolution is captured across the system's field of view.


In one embodiment, a light detection and ranging (LiDAR) system for use with a vehicle is provided. The system can include a housing configured to be mounted to a windshield of the vehicle. The housing can include a transceiver module operative to transmit and receive light energy, the transceiver module including at least one lens that defines a vertical angle of a field of view of the LiDAR system, and a polygon structure that defines a lateral angle of the field of view of the LiDAR system. The polygon structure is operative to redirect light energy transmitted from the transceiver module away from the housing, and to redirect light energy reflected from an object within the field of view of the LiDAR system back to the transceiver module. The system can include a moveable platform coupled to the transceiver module, the moveable platform operative to move the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.


In one embodiment, a light detection and ranging (LiDAR) system for use with a vehicle is provided. The system can include a housing configured to be mounted to a windshield of the vehicle. The housing can include a transceiver module operative to transmit and receive light energy, the transceiver module comprising at least one lens that defines a vertical angle of a field of view of the LiDAR system; a polygon structure that defines a lateral angle of the field of view of the LiDAR system; and a moveable mirror positioned to redirect light energy passing between the transceiver module and the polygon structure, the moveable mirror operative to adjust angles of light being emitted by the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.


In one embodiment, a method for using a LiDAR system comprising a transceiver module and a polygon structure is provided. The method can include emitting, from the transceiver module, light energy that occupies a plurality of non-overlapping angles such that the emitted light energy is transmitted directly to the polygon structure, wherein the plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system; rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system; and adjusting a position of the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.


In one embodiment, a method for using a LiDAR system comprising a transceiver module, a polygon structure, and a mirror is provided. The method can include emitting, from the transceiver module, light energy that occupies a plurality of non-overlapping angles such that the emitted light energy is transmitted directly to the mirror, which directs the light energy to the polygon structure, wherein the plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system; rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system; and adjusting a position of the mirror in a manner that results in an increase of resolution of a scene captured within the field of view.


In one embodiment, a method for using a LiDAR system that includes a transceiver module, a polygon structure, and a mirror is provided. The method can include emitting, from the transceiver module, light energy that occupies a plurality of non-gap angles such that the emitted light energy is transmitted directly to the mirror, which directs the light energy to the polygon structure, wherein the plurality of non-gap angles and number of detectors define a vertical resolution of the LiDAR system, and wherein a maximum rotating angle of the mirror defines the angle of a field of view of the LiDAR system, and rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system.


A further understanding of the nature and advantages of the embodiments discussed herein may be realized by reference to the remaining portions of the specification and the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A and 1B illustrate side and front views of a vehicle having a windshield mounted LiDAR system, according to an embodiment;



FIG. 2 shows an illustrative window mounted LiDAR system, according to an embodiment;



FIG. 3 shows another illustrative window mounted LiDAR system, according to an embodiment;



FIG. 4 shows an illustrative block diagram arrangement of a fiber optic emitter transceiver module, according to an embodiment;



FIG. 5 shows an illustrative block diagram arrangement of a semiconductor based emitter transceiver module, according to an embodiment;



FIG. 6 shows an illustrative block diagram arrangement of a transceiver module, according to an embodiment;



FIG. 7 shows an illustrative process, according to an embodiment;



FIG. 8 shows another illustrative process, according to an embodiment; and



FIG. 9 is a functional block diagram illustrating a vehicle system, according to an embodiment.





DETAILED DESCRIPTION

Illustrative embodiments are now described more fully hereinafter with reference to the accompanying drawings, in which representative examples are shown. Indeed, the disclosed LiDAR systems and methods may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like numbers refer to like elements throughout.


In the following detailed description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of the various embodiments. Those of ordinary skill in the art will realize that these various embodiments are illustrative only and are not intended to be limiting in any way. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure.


In addition, for clarity purposes, not all of the routine features of the embodiments described herein are shown or described. One of ordinary skill in the art would readily appreciate that in the development of any such actual embodiment, numerous embodiment-specific decisions may be required to achieve specific design objectives. These design objectives will vary from one embodiment to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming but would nevertheless be a routine engineering undertaking for those of ordinary skill in the art having the benefit of this disclosure.



FIGS. 1A and 1B illustrate side and front views of vehicle 100 having a windshield mounted LiDAR system (WMLS) 150, according to an embodiment. Vehicle 100 is a generic representation of any vehicle that can transport persons or cargo from one location to another. Vehicle 100 has windshield 110, which has an exterior surface 111 that is exposed to the elements and an interior surface 112 that interfaces with the interior cabin of the vehicle. WMLS 150 can be mounted to the interior surface 112 of windshield 110. As illustrated in FIGS. 1A and 1B, WMLS 150 is center mounted on windshield 110 along center axis 120 and near roof 115, such that it is positioned near the location of rear view mirror 117. It should be understood that the position of WMLS 150 is merely illustrative and that WMLS 150 can be positioned anywhere on windshield 110. If desired, more than one WMLS 150 can be mounted to windshield 110. In addition, one or more LiDAR systems according to embodiments discussed herein can be mounted anywhere on vehicle 100.


WMLS 150 can be a front facing, forward scanning system that captures lateral and vertical resolution of the 3D space existing in front of the vehicle. The lateral and vertical resolution can define the field of view of WMLS 150. In some embodiments, the lateral field of view is greater than the vertical field of view. For example, the lateral field of view can range from 100-180 degrees, or 110-170 degrees, or can be 120 degrees, whereas the vertical field of view can range from 20-50 degrees, 25-45 degrees, or 40 degrees. The ranging distance of the field of view can be set to any desired distance. For example, the ranging distance may be 50-300 meters, 100-200 meters, or 150 meters.
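
For a rough sense of the coverage these figures imply, the width of the scanned region at a given range follows from basic trigonometry. Below is a minimal sketch using the example values above (a 120 degree lateral field of view at a 150 meter range); the function name is illustrative, not from the patent.

```python
import math

def coverage_width(fov_deg: float, range_m: float) -> float:
    """Chord width of the region swept by a lateral FOV at a given range."""
    return 2.0 * range_m * math.sin(math.radians(fov_deg) / 2.0)

# Example values quoted above: 120 degree lateral FOV at a 150 m range.
print(coverage_width(120.0, 150.0))  # ~259.8 m wide swath
```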



FIG. 2 shows an illustrative WMLS 200 according to an embodiment. WMLS 200 can include housing 201, circuit board 210, transceiver module 220, and polygon structure 230. Housing 201 is constructed to house circuit board 210, transceiver module 220, and polygon structure 230 and can be mounted to a windshield or to other structures located on a vehicle. Circuit board 210 may include circuitry such as control electronics, power electronics, communications circuitry, power and data busses, and any other components. In some embodiments, circuit board 210 may be a metal based circuit board to assist in heat dissipation (e.g., when silicon based laser emitters are used). Transceiver module 220 may include LiDAR emitter(s) and detectors, as well as lenses required to control dispersal of light energy emitted by the emitter(s) and receipt of returning light being detected by the detectors. During operation, light energy is emitted by the emitter(s) towards polygon structure 230, which redirects the light energy out of housing 201 and through the windshield. The light energy being directed by polygon structure 230 is cast in accordance with the field of view parameters of WMLS 200. That is, if WMLS 200 has a field of view with a range of x, a lateral angle of y, and a vertical angle of z, the range x can be controlled by the power of the emitter(s), the vertical angle z can be controlled by lenses (not shown) of transceiver module 220, and the lateral angle y can be controlled by polygon structure 230. Light energy that is reflected back from objects in the field of view passes through the windshield, returns to polygon structure 230, and is redirected back to transceiver module 220, which detects the light energy with its detectors.
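
The range x rests on the time-of-flight arithmetic common to all pulsed ranging LiDAR (generic physics rather than anything specific to this patent): a pulse's round-trip time t maps to distance as d = c*t/2. A short sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_s: float) -> float:
    """Distance to a target given the measured round-trip time of a pulse."""
    return C * t_s / 2.0

print(range_from_round_trip(1e-6))  # a 1 microsecond round trip is ~149.9 m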


The number of facets of polygon structure 230 is chosen to accommodate the horizontal FOV. Each facet of the polygon can be parallel or non-parallel to its symmetric axis. Polygon structure 230 may be constructed from a metal such as aluminum, a plastic, or another material that can have a polished or mirrored surface. The polygon structure may be selectively masked to control the lateral dispersion of light energy being projected in accordance with the field of view of WMLS 200. Polygon structure 230 is operative to spin about axis 231 in a first direction at a substantially constant speed. Axis 231 can be coincident with the symmetric axis of structure 230, or it can be tilted at an angle with respect to that axis, which can effectively increase resolution in the vertical angle z. A motor such as a DC motor may control the spin of structure 230. The final shape of the polygon can be trimmed for better operating performance (e.g., sharp corners or tips can be chopped off to reduce overall weight, and sharp edges can be chamfered to reduce air resistance).
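
The facet count trades off against the per-facet sweep: a beam reflected from a rotating mirror turns at twice the mirror's angular rate, so each facet of an N-facet polygon sweeps roughly 2 x (360/N) optical degrees, and the usable lateral FOV must fit within that sweep. This is a standard polygon-scanner relation rather than a figure from the patent; a sketch:

```python
def facet_sweep_deg(num_facets: int) -> float:
    """Approximate optical sweep per facet of a rotating polygon mirror."""
    return 2.0 * 360.0 / num_facets

# e.g., a 5-facet polygon sweeps ~144 optical degrees per facet, which
# comfortably covers the ~120 degree lateral FOV quoted earlier.
print(facet_sweep_deg(5))
```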


Transceiver module 220 may be placed on a movable platform (not shown) that can change the position or pointing of transceiver module 220 within housing 201. The platform may move the entire module 220 in the directions of arrow 221. In this arrangement, module 220 may be raised and lowered. Alternatively, the platform may rotate module 220 (about a rotation axis) along the directions of arrow 222. Moving transceiver module 220 enables WMLS 200 to increase its resolution by capturing image data that exists in gaps caused by the lenses being used in transceiver module 220.



FIG. 3 shows an illustrative WMLS 300 according to an embodiment. WMLS 300 can include housing 301, circuit board 310, transceiver module 320, polygon structure 330, and mirror 340. Housing 301 is constructed to house circuit board 310, transceiver module 320, polygon structure 330, and mirror 340 and can be mounted to a windshield or to other structures located on a vehicle. Circuit board 310, transceiver module 320, and polygon structure 330 may be similar to circuit board 210, transceiver module 220, and polygon structure 230 of FIG. 2, and thus a corresponding description need not be repeated. WMLS 300 differs from WMLS 200 in the addition of mirror 340 and the repositioning of transceiver module 320. Transceiver module 320 may be placed in a permanently fixed position relative to circuit board 310 and operative to direct light energy towards mirror 340 and receive reflected light energy from mirror 340. Mirror 340 is operative to redirect light energy transmitted from transceiver module 320 to polygon structure 330. Mirror 340 is also operative to redirect light energy received from polygon structure 330 back to transceiver module 320 for detection by the detectors (not shown). Mirror 340 may be moved in directions shown by arrow 341 or 342 to increase the resolution of WMLS 300 by capturing image data that exists in gaps caused by the lenses being used in transceiver module 320.



FIG. 4 shows an illustrative block diagram arrangement of a fiber optic transceiver module 400 according to an embodiment. Module 400 may be used as one of transceiver modules 220 or 320 of FIGS. 2 and 3, respectively. Module 400 can include fiber optic light source 410, lens 420, lens group 430, receiver lens 440, and detector group 450. Light energy emanating from fiber optic light source 410 may be collimated by lens 420, which may be, for example, a cylindrical lens, before the light energy is dispersed by lens group 430. Lens group 430 can include any suitable number of lenses to control, for example, the vertical angle of the field of view. As shown, lens group 430 can include lenses 431-434. Each of lenses 431-434 may independently direct light energy according to non-overlapping angles shown as vectors A, B, C, and D. Each lens corresponds to a particular range of angles. That is, lens 431 corresponds to angles of vector A, lens 432 corresponds to angles of vector B, lens 433 corresponds to angles of vector C, and lens 434 corresponds to angles of vector D. Because the light energy emitted from light source 410 along each of vectors A-D does not overlap, the light energy received by detector group 450 also does not overlap. That is, light energy originating from lens 431 is received only by detector 451, light energy originating from lens 432 is received only by detector 452, light energy originating from lens 433 is received only by detector 453, and light energy originating from lens 434 is received only by detector 454.


Gaps may exist between the angles represented by vectors A-D. That is, gaps exist between vectors A and B (shown as A/B Gap), B and C (shown as B/C Gap), and C and D (shown as C/D Gap). In this case, the angle between A and D defines the vertical field of view. The LiDAR systems according to embodiments discussed herein take these gaps into account by moving transceiver module 220 (as shown in FIG. 2) or moving mirror 340 (as shown in FIG. 3). By moving either the transceiver module or mirror, the LiDAR system is able to fill the gaps with additional resolution in its task to create a 3D image of the space captured within the system's field of view.
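
A minimal sketch of this gap filling, with hypothetical numbers (four 2 degree emission bands separated by 2 degree gaps, dithered by one band width between frames):

```python
band_starts = [0.0, 4.0, 8.0, 12.0]  # lower edge of each band, degrees
band_width = 2.0
dither_offsets = [0.0, 2.0]          # platform/mirror position per frame

for offset in dither_offsets:
    covered = [(s + offset, s + offset + band_width) for s in band_starts]
    print(f"offset {offset:+.1f} deg covers {covered}")
# Across the two frames the union of bands spans 0-16 degrees with no gaps,
# doubling the effective vertical resolution of the captured scene.
```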



FIG. 5 shows an illustrative block diagram arrangement of a semiconductor based emitter transceiver module 500 according to an embodiment. Module 500 may be used as one of transceiver modules 220 or 320 of FIGS. 2 and 3, respectively. Module 500 can include semiconductor based emitter light source 510, lens 520, lens group 530, receiver lens 540, and detector group 550. Light source 510 can include several semiconductor based light emitters. For example, in one embodiment, multiple emitters can fire laser pulses sequentially and/or simultaneously.


Light energy emanating from light source 510 may be collimated by lens 520, which may be, for example, a cylindrical lens, before the light energy is dispersed by lens group 530. Lens group 530 can include any suitable number of lenses to control, for example, the vertical angle of the field of view. As shown, lens group 530 can include lenses 531-534. Each of lenses 531-534 may independently direct light energy according to non-overlapping angles shown as vectors A, B, C, and D. Each lens corresponds to a particular range of angles, as explained above in connection with FIG. 4. That is, lens 531 corresponds to angles of vector A, lens 532 corresponds to angles of vector B, lens 533 corresponds to angles of vector C, and lens 534 corresponds to angles of vector D. Because the light energy emitted from light source 510 along each of vectors A-D does not overlap, the light energy received by detector group 550 also does not overlap. Detector group 550 may include several detector chips for each one of lenses 531-534 of lens group 530. Detector group 550 can include detector subgroups 551-554, where subgroups 551-554 are designed to detect signals emanating from lenses 531-534, respectively. Each of subgroups 551-554 may include individual detectors, labeled w-z. Inclusion of multiple detector chips per subgroup reduces the complexity of receiver lens 540 and further enables each subgroup to cover the returning angles from each one of lenses 531-534. If gaps exist between the angles defined by vectors A-D (e.g., the A/B Gap, B/C Gap, and C/D Gap), the LiDAR system can be oscillated to account for those gaps by moving transceiver module 220 (as shown in FIG. 2) or moving mirror 340 (as shown in FIG. 3).
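
One way to picture the resulting channel layout (the naming is purely illustrative; the patent only labels lenses 531-534 with vectors A-D and the detectors within each subgroup w-z):

```python
lenses = ["A", "B", "C", "D"]  # one per lens 531-534
chips = ["w", "x", "y", "z"]   # detector chips within each subgroup 551-554

# Returns from a given lens land only on that lens's detector subgroup.
channel_map = {lens: [f"{lens}{chip}" for chip in chips] for lens in lenses}
print(channel_map["B"])  # ['Bw', 'Bx', 'By', 'Bz']
```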



FIG. 6 shows an illustrative block diagram arrangement of a transceiver module 600 according to an embodiment. Module 600 can include light source 610 (e.g., either a fiber optic light source or semiconductor based emitter light sources), lens 620, lens 630, receiving lens 640, and detector group 650. Detector group 650 can resemble detector group 450 if the light source is fiber optic based, or detector group 650 can resemble detector group 550 if the light source includes multiple emitters. Transceiver module 600 differs from transceiver modules 400 and 500 in that there is substantially no gap between the angles represented by vectors A-D. The angle between adjacent vectors, such as A and B or B and C, is very small and may be designed to represent the vertical angular resolution. Thus, module 600 would be used in a system such as WMLS 200 or 300 to produce the desired field of view. In this case, the rotating angle of the moveable component (the platform of WMLS 200 or the mirror of WMLS 300) defines the vertical field of view.
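
A back-of-envelope example of how this arrangement works out, with hypothetical numbers chosen to land on the 40 degree vertical field of view quoted earlier:

```python
num_channels = 4         # emission vectors A-D with no gaps between them
pitch_deg = 0.5          # angle between adjacent vectors: vertical resolution
mirror_sweep_deg = 38.0  # assumed maximum rotating angle of mirror/platform

static_fov = num_channels * pitch_deg       # covered with no movement
total_vertical_fov = static_fov + mirror_sweep_deg
print(static_fov, total_vertical_fov)       # 2.0 deg static, 40.0 deg swept
```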



FIG. 7 shows an illustrative process 700 according to an embodiment. Process 700 may be implemented in a system such as WMLS 200 of FIG. 2. Starting at step 710, light energy can be emitted from a transceiver module. The light energy can occupy a plurality of non-overlapping angles and is transmitted directly to a polygon structure. The plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system. At step 720, the polygon structure can be rotated in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system. At step 730, a position of the transceiver module can be adjusted in a manner that results in an increase of resolution of a scene captured within the field of view.
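
A schematic of the loop that process 700 describes (and, with the mirror standing in for the platform, process 800 of FIG. 8 below). Every name here is illustrative; the patent specifies the steps, not an API:

```python
def scan_frames(transceiver, polygon, platform, dither_offsets):
    """Yield raw returns for each dither position of the transceiver."""
    for offset in dither_offsets:            # step 730: adjust position
        platform.move_to(offset)             # fills the inter-band gaps
        while not polygon.sweep_complete():  # step 720: rotating polygon
            transceiver.emit_pulses()        # step 710: emit light energy
            yield transceiver.read_returns() # detected reflections
```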


It should be understood that the steps in FIG. 7 are merely illustrative and that additional steps may be added and the order of the steps may be rearranged.



FIG. 8 shows an illustrative process 800 according to an embodiment. Process 800 may be implemented in a system such as WMLS 300 of FIG. 3. Starting at step 810, light energy is emitted from a transceiver module. The light energy occupies a plurality of non-overlapping angles and is transmitted directly to a mirror, which directs the light energy to a polygon structure. The plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system. At step 820, the polygon structure is rotated in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system. At step 830, a position of the mirror is adjusted in a manner that results in an increase of resolution of a scene captured within the field of view.


It should be understood that the steps in FIG. 8 are merely illustrative and that additional steps may be added and the order of the steps may be rearranged.



FIG. 9 is a functional block diagram illustrating a vehicle system 900, according to an example embodiment. Vehicle 900 can be configured to operate fully or partially in an autonomous mode. For example, vehicle 900 can control itself while in the autonomous mode, and may be operable to determine a current state of the vehicle and its environment, determine a predicted behavior of at least one other vehicle in the environment, determine a confidence level that may correspond to a likelihood that the at least one other vehicle will perform the predicted behavior, and control vehicle 900 based on the determined information. While in autonomous mode, vehicle 900 may be configured to operate without human interaction.


In some embodiments, vehicle 900 can operate solely under the control of a human operator, but the various sensors and systems of the vehicle and the road conditions (e.g., the road and the path traveled, other vehicles, stop signs, traffic lights, and various events occurring outside of the vehicle) can be monitored and recorded.


Vehicle 900 can include various subsystems such as a propulsion system 902, a sensor system 904, a control system 906, one or more peripherals 908, as well as a power supply 910, a computer system 912, and a user interface 916. Vehicle 900 may include more or fewer subsystems and each subsystem can include multiple elements. Further, each of the subsystems and elements of vehicle 900 can be interconnected. Thus, one or more of the described functions of the vehicle 900 may be divided up into additional functional or physical components, or combined into fewer functional or physical components. In some further examples, additional functional and/or physical components may be added to the examples illustrated by FIG. 9.


Propulsion system 902 may include components operable to provide powered motion for the vehicle 900. Depending upon the embodiment, the propulsion system 902 can include an engine/motor 918, an energy source 919, a transmission 920, and wheels/tires 921. The engine/motor 918 can be any combination of an internal combustion engine, an electric motor, steam engine, Stirling engine, or other types of engines and/or motors. In some embodiments, the engine/motor 918 may be configured to convert energy source 919 into mechanical energy. In some embodiments, the propulsion system 902 can include multiple types of engines and/or motors. For instance, a gas-electric hybrid car can include a gasoline engine and an electric motor. Other examples are possible.


Energy source 919 can represent a source of energy that may, in full or in part, power the engine/motor 918. That is, the engine/motor 918 can be configured to convert the energy source 919 into mechanical energy. Examples of energy sources 919 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source(s) 919 can additionally or alternatively include any combination of fuel tanks, batteries, capacitors, and/or flywheels. The energy source 919 can also provide energy for other systems of the vehicle 900.


Transmission 920 can include elements that are operable to transmit mechanical power from the engine/motor 918 to the wheels/tires 921. To this end, the transmission 920 can include a gearbox, clutch, differential, and drive shafts. The transmission 920 can include other elements. The drive shafts can include one or more axles that can be coupled to the one or more wheels/tires 921.


Wheels/tires 921 of vehicle 900 can be configured in various formats, including a unicycle, bicycle/motorcycle, tricycle, or car/truck four-wheel format. Other wheel/tire geometries are possible, such as those including six or more wheels. Any combination of the wheels/tires 921 of vehicle 900 may be operable to rotate differentially with respect to other wheels/tires 921. The wheels/tires 921 can represent at least one wheel that is fixedly attached to the transmission 920 and at least one tire coupled to a rim of the wheel that can make contact with the driving surface. The wheels/tires 921 can include any combination of metal and rubber, or another combination of materials.


Sensor system 904 may include a number of sensors configured to sense information about an environment of the vehicle 900. For example, the sensor system 904 can include a Global Positioning System (GPS) 922, an inertial measurement unit (IMU) 924, a RADAR unit 926, a laser rangefinder/LIDAR unit 928, and a camera 930. The sensor system 904 can also include sensors configured to monitor internal systems of the vehicle 900 (e.g., O2 monitor, fuel gauge, engine oil temperature). Other sensors are possible as well.


One or more of the sensors included in sensor system 904 can be configured to be actuated separately and/or collectively in order to modify a position and/or an orientation of the one or more sensors.


GPS 922 may be any sensor configured to estimate a geographic location of the vehicle 900. To this end, GPS 922 can include a transceiver operable to provide information regarding the position of the vehicle 900 with respect to the Earth.


IMU 924 can include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the vehicle 900 based on inertial acceleration.


RADAR unit 926 may represent a system that utilizes radio signals to sense objects within the local environment of the vehicle 900. In some embodiments, in addition to sensing the objects, the RADAR unit 926 may additionally be configured to sense the speed and/or heading of the objects. Similarly, laser rangefinder or LIDAR unit 928 may be any sensor configured to sense objects in the environment in which the vehicle 900 is located using lasers. Depending upon the embodiment, the laser rangefinder/LIDAR unit 928 can include one or more laser sources, a laser scanner, and one or more detectors, among other system components. The laser rangefinder/LIDAR unit 928 can be configured to operate in a coherent (e.g., using heterodyne detection) or an incoherent detection mode.


Camera 930 can include one or more devices configured to capture a plurality of images of the environment of vehicle 900. Camera 930 can be a still camera or a video camera.


Control system 906 may be configured to control operation of vehicle 900 and its components. Accordingly, control system 906 can include various elements, including steering unit 932, throttle 934, brake unit 936, a sensor fusion algorithm 938, a computer vision system 940, a navigation/pathing system 942, and an obstacle avoidance system 944.


Steering unit 932 can represent any combination of mechanisms that may be operable to adjust the heading of vehicle 900. Throttle 934 can be configured to control, for instance, the operating speed of the engine/motor 918 and, in turn, control the speed of the vehicle 900. Brake unit 936 can include any combination of mechanisms configured to decelerate the vehicle 900. Brake unit 936 can use friction to slow wheels/tires 921. In other embodiments, the brake unit 936 can convert the kinetic energy of wheels/tires 921 to electric current. The brake unit 936 may take other forms as well. The brake unit 936 may control braking of the vehicle 900, for example, using a braking algorithm that takes into account input from one or more units of the sensor system 904.


Sensor fusion algorithm 938 may be an algorithm (or a computer program product storing an algorithm) configured to accept data from the sensor system 904 as an input. The data may include, for example, data representing information sensed at the sensors of the sensor system 904. The sensor fusion algorithm 938 can include, for instance, a Kalman filter, Bayesian network, or other algorithm. The sensor fusion algorithm 938 can further provide various assessments based on the data from sensor system 904. Depending upon the embodiment, the assessments can include evaluations of individual objects and/or features in the environment of vehicle 900, evaluation of a particular situation, and/or evaluation of possible impacts based on the particular situation. Other assessments are possible.
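
As a concrete illustration of the kind of filter named above, here is a minimal one-dimensional Kalman filter; the patent names the filter class but specifies no model, so the state, noise values, and measurements below are assumptions:

```python
def kalman_step(x, p, z, q=0.01, r=1.0):
    """One predict/update cycle: x is the state estimate, p its variance,
    z a new measurement, q process noise, r measurement noise."""
    p = p + q            # predict: uncertainty grows between measurements
    k = p / (p + r)      # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)  # update: blend the measurement into the estimate
    p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0
for z in [0.9, 1.1, 1.0]:  # e.g., fused range readings in meters
    x, p = kalman_step(x, p, z)
print(round(x, 3), round(p, 3))
```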


Computer vision system 940 may be any system operable to process and analyze images captured by camera 930 in order to identify objects and/or features in the environment of vehicle 900 that can include traffic signals, roadway boundaries, and obstacles. Computer vision system 940 can use an object recognition algorithm, a Structure From Motion (SFM) algorithm, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 940 can be additionally configured to map an environment, track objects, estimate the speed of objects, etc.


Navigation and pathing system 942 may be any system configured to determine a driving path for the vehicle 900, for example, by referencing navigation data such as geographical or map data. The navigation and pathing system 942 may additionally be configured to update the driving path dynamically while the vehicle 900 is in operation. In some embodiments, the navigation and pathing system 942 can be configured to incorporate data from the sensor fusion algorithm 938, the GPS 922, and one or more predetermined maps so as to determine the driving path for vehicle 900. Obstacle avoidance system 944 can represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 900. Control system 906 may additionally or alternatively include components other than those shown and described.


Peripherals 908 may be configured to allow interaction between the vehicle 900 and external sensors, other vehicles, other computer systems, and/or a user. For example, peripherals 908 can include a wireless communication system 946, a touchscreen 948, a microphone 950, and/or a speaker 952. In an example embodiment, peripherals 908 can provide, for instance, means for a user of the vehicle 900 to interact with the user interface 916. To this end, touchscreen 948 can provide information to a user of vehicle 900. User interface 916 can also be operable to accept input from the user via the touchscreen 948. The touchscreen 948 may be configured to sense at least one of a position and a movement of a user's finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. Touchscreen 948 may be capable of sensing finger movement in a direction parallel or planar to the touchscreen surface, in a direction normal to the touchscreen surface, or both, and may also be capable of sensing a level of pressure applied to the touchscreen surface. Touchscreen 948 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Touchscreen 948 may take other forms as well.


In other instances, peripherals 908 may provide means for the vehicle 900 to communicate with devices within its environment. Microphone 950 may be configured to receive audio (e.g., a voice command or other audio input) from a user of vehicle 900. Similarly, speakers 952 may be configured to output audio to the user of vehicle 900.


In one example, wireless communication system 946 can be configured to wirelessly communicate with one or more devices directly or via a communication network. For example, wireless communication system 946 can use 3G cellular communication, such as CDMA, EVDO, GSM/GPRS, or 4G cellular communication, such as WiMAX or LTE. Alternatively, wireless communication system 946 can communicate with a wireless local area network (WLAN), for example, using WiFi. In some embodiments, wireless communication system 946 can communicate directly with a device, for example, using an infrared link, Bluetooth, or ZigBee. Other wireless protocols, such as various vehicular communication systems, are possible within the context of the disclosure. For example, the wireless communication system 946 can include one or more dedicated short range communications (DSRC) devices that can include public and/or private data communications between vehicles and/or roadside stations.


Power supply 910 may provide power to various components of vehicle 900 and can represent, for example, a rechargeable lithium-ion or lead-acid battery. In some embodiments, one or more banks of such batteries can be configured to provide electrical power. Other power supply materials and configurations are possible. In some embodiments, the power supply 910 and energy source 919 can be implemented together, as in some all-electric cars.


Many or all of the functions of vehicle 900 can be controlled by computer system 912. Computer system 912 may include at least one processor 913 (which can include at least one microprocessor) that executes instructions 915 stored in a non-transitory computer readable medium, such as the data storage 914. Computer system 912 may also represent a plurality of computing devices that may serve to control individual components or subsystems of the vehicle 900 in a distributed fashion.


In some embodiments, data storage 914 may contain instructions 915 (e.g., program logic) executable by processor 913 to execute various functions of vehicle 900, including those described above in connection with FIG. 9. In some embodiments, processor 913 may be operative to run an artificial intelligence (AI) engine, for example, to control the various systems of the vehicle 900. Data storage 914 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, and/or control one or more of propulsion system 902, sensor system 904, control system 906, and peripherals 908. In addition to instructions 915, data storage 914 may store data such as roadway maps and path information, among other information. Such information may be used by vehicle 900 and computer system 912 during the operation of vehicle 900 in the autonomous, semi-autonomous, and/or manual modes.


Vehicle 900 may include a user interface 916 for providing information to or receiving input from a user of vehicle 900. User interface 916 can control or enable control of content and/or the layout of interactive images that can be displayed on the touchscreen 948. Further, user interface 916 can include one or more input/output devices within the set of peripherals 908, such as wireless communication system 946, touchscreen 948, microphone 950, and the speaker 952.


Port 960 may be a port through which vehicle 900 receives power to charge power supply 910 and through which it communicates data stored in data storage 914.


Computer system 912 may control the function of vehicle 900 based on inputs received from various subsystems (e.g., propulsion system 902, sensor system 904, and control system 906), as well as from user interface 916. For example, computer system 912 may utilize input from control system 906 in order to control steering unit 932 to avoid an obstacle detected by sensor system 904 and obstacle avoidance system 944. Depending upon the embodiment, computer system 912 can be operable to provide control over many aspects of vehicle 900 and its subsystems.


The components of vehicle 900 can be configured to work in an interconnected fashion with other components within or outside their respective systems. For instance, in an example embodiment, camera 930 can capture a plurality of images that can represent information about a state of an environment of vehicle 900 operating in an autonomous or manual mode. The environment can include every conceivable type of data that can be observed and collected by vehicle 900. For example, the environment can include the road and all aspects associated with the road such as temperature, composition of the road (e.g., concrete or asphalt), moisture level, lanes, curbs, turn lanes, cross walks, stop lights, stop signs, yield signs and other traffic signs, and barricades. The environment can include objects such as other vehicles, people, and random debris in or adjacent to the road.


Computer system 912 can monitor and log the environmental inputs in conjunction with operational states of the vehicle. The operational states can refer to operational and control parameters of the vehicle such as speed, trajectory, steering input, acceleration input, and brake input, and also can include results of driver input or AI driver input. This way, regardless of whether the vehicle is operating in autonomous mode or under human control, computer system 912 can simultaneously log the environmental inputs and the operational states to provide a comprehensive vehicle log.
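
One plausible shape for such a paired log record (all field names are assumptions; the patent does not define a schema):

```python
from dataclasses import dataclass, field
import time

@dataclass
class LogRecord:
    """One sample pairing environmental inputs with operational states."""
    timestamp: float = field(default_factory=time.time)
    speed_mps: float = 0.0
    steering_input: float = 0.0   # operational state: control parameters
    brake_input: float = 0.0
    driver: str = "human"         # "human" or "ai"
    environment: dict = field(default_factory=dict)  # lanes, signs, objects
```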


The vehicle log data acquired from the vehicle using embodiments discussed herein can be used in a number of different ways. For example, the vehicle log data, and the results from either the manual driving data or the autonomous driving data it contains, can be used to train vehicle AI offline based on actual recorded data, actual decisions made, and the results of those decisions. The vehicle log data from one vehicle may include data pertaining to hundreds, thousands, or hundreds of thousands of driving miles. Thus, the data acquired from just one vehicle is a relatively rich environment for training vehicle AI. The training data may be further enriched by aggregating vehicle log data from numerous vehicles and users, thus providing additional resources for training and improving vehicle AI. The aggregated vehicle log data can represent hundreds of thousands, millions, or an ever increasing number of trips, across various road conditions and driving situations, and the actions taken in response thereto, that can be used to train the AI.


In addition, the AI training can occur offline and not during real driving conditions. This way, the vehicle AI can run simulations based on the aggregated vehicle logs without having to actually drive the vehicle. In some embodiments, the vehicle AI may be fed road conditions and driving situations as inputs, and the results produced by the vehicle AI may be compared to the actual results stored in the log. The vehicle AI can be trained based on a comparison of the results.
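
A sketch of that replay-and-compare step, reusing the hypothetical LogRecord fields from the sketch above; the decision interface is invented for illustration:

```python
def disagreement_rate(ai, log_records):
    """Fraction of logged samples where the AI's simulated action differs
    from the action actually recorded; a candidate training signal."""
    mismatches = 0
    for rec in log_records:
        simulated = ai.decide(rec.environment)          # replayed inputs
        actual = (rec.steering_input, rec.brake_input)  # logged outcome
        if simulated != actual:
            mismatches += 1
    return mismatches / max(len(log_records), 1)
```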


The vehicle log data, which includes sensor specific data gathered during a trip as well as all of the decisions and outcomes of those decisions, can be part of the information that the vehicle AI uses to train. In some embodiments, the results of the AI training can include which sensors are needed in the vehicle (and where they are located) and which sensors are not. For example, AI training can be performed with log data having a sensor (e.g., a camera) in a first location on the vehicle and a second location on the vehicle. The results of AI driving performance based on both sensor locations can be compared, and decisions can be made as to which sensor configuration yields the better result. This sensor based training can be used to evaluate a virtually unlimited number of sensor configurations, and the vehicle AI can be tuned to work with one or more of those sensor configurations.


The aggregate vehicle log data may be used to provide additional information regarding the wear and tear on vehicles overall. For example, if the brakes are worn down to 30% of normal, the vehicle log data can reflect how the vehicle reacts when these brakes are applied. The vehicle AI can be trained to take wear and tear into account and can adjust vehicle operation to compensate for that wear and tear. For example, the vehicle AI may cause the brakes to be applied earlier if the brake wear is below a certain threshold.
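
A toy version of the brake-wear example, with an invented health metric and threshold:

```python
def braking_margin_m(base_margin_m: float, brake_health: float) -> float:
    """brake_health is 1.0 for new brakes; 0.3 means worn to 30% of normal.
    Below the threshold, begin braking proportionally earlier."""
    if brake_health < 0.5:  # the 'certain threshold' in the example above
        return base_margin_m / brake_health
    return base_margin_m

print(braking_margin_m(20.0, 0.3))  # ~66.7 m of margin instead of 20 m
```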


The vehicle log data, which may contain several gigabytes or terabytes of data, can be transferred to a remote server (not shown) for further analysis. For example, the log may be transferred from data storage 914 to data storage associated with the remote server.


The remote server may include an autonomous vehicle driving platform that can apply analytics (e.g., similar to some of the examples discussed above) to the log. The autonomous vehicle driving platform (AVDP) may include one or more algorithms capable of autonomously controlling operation of a vehicle. In one embodiment, the AVDP may assess the log to determine whether any updates or modifications are needed for the one or more algorithms to improve autonomous vehicle operation. In another embodiment, the AVDP may use the log to build one or more algorithms that can autonomously control operation of a vehicle. In yet another embodiment, the AVDP may run simulations using the environmental inputs received in the log and compare the simulation results to the actual monitored actions of the vehicle (which are also included in the log).


Although FIG. 9 shows various components of vehicle 900, i.e., wireless communication system 946, computer system 912, data storage 914, and user interface 916, as being integrated into vehicle 900, one or more of these components can be mounted or associated separately from the vehicle 900. For example, data storage 914 can, in part or in full, exist separate from vehicle 900. Thus, vehicle 900 can be provided in the form of device elements that may be located separately or together. The device elements that make up vehicle 900 can be communicatively coupled together in a wired and/or wireless fashion.


It is believed that the disclosure set forth herein encompasses multiple distinct inventions with independent utility. While each of these inventions has been disclosed in its preferred form, the specific embodiments thereof as disclosed and illustrated herein are not to be considered in a limiting sense as numerous variations are possible. Each example defines an embodiment disclosed in the foregoing disclosure, but any one example does not necessarily encompass all features or combinations that may be eventually claimed. Where the description recites “a” or “a first” element or the equivalent thereof, such description includes one or more such elements, neither requiring nor excluding two or more such elements. Further, ordinal indicators, such as first, second or third, for identified elements are used to distinguish between the elements, and do not indicate a required or limited number of such elements, and do not indicate a particular position or order of such elements unless otherwise specifically stated.


Moreover, any processes described with respect to FIGS. 1-9, as well as any other aspects of the invention, may each be implemented by software, but may also be implemented in hardware, firmware, or any combination of software, hardware, and firmware. They each may also be embodied as machine- or computer-readable code recorded on a machine- or computer-readable medium. The computer-readable medium may be any data storage device that can store data or instructions which can thereafter be read by a computer system. Examples of the computer-readable medium may include, but are not limited to, read-only memory, random-access memory, flash memory, CD-ROMs, DVDs, magnetic tape, and optical data storage devices. The computer-readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. For example, the computer-readable medium may be communicated from one electronic subsystem or device to another electronic subsystem or device using any suitable communications protocol. The computer-readable medium may embody computer-readable code, instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A modulated data signal may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.


It is to be understood that any or each module or state machine discussed herein may be provided as a software construct, firmware construct, one or more hardware components, or a combination thereof. For example, any one or more of the state machines or modules may be described in the general context of computer-executable instructions, such as program modules, that may be executed by one or more computers or other devices. Generally, a program module may include one or more routines, programs, objects, components, and/or data structures that may perform one or more particular tasks or that may implement one or more particular abstract data types. It is also to be understood that the number, configuration, functionality, and interconnection of the modules or state machines are merely illustrative, and that the number, configuration, functionality, and interconnection of existing modules may be modified or omitted, additional modules may be added, and the interconnection of certain modules may be altered.


Whereas many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description, it is to be understood that the particular embodiments shown and described by way of illustration are in no way intended to be considered limiting. Therefore, reference to the details of the preferred embodiments is not intended to limit their scope.

Claims
  • 1. A light detection and ranging (LiDAR) system for use with a vehicle, comprising: a housing configured to be mounted to a windshield of the vehicle, the housing comprising: a transceiver module configured to transmit and receive light energy, the transceiver module comprising at least one lens that defines a vertical angle of a field of view of the LiDAR system; a polygon structure that defines a lateral angle of the field of view of the LiDAR system, the polygon structure configured to: redirect light energy transmitted from the transceiver module away from the housing; and redirect light energy reflected from an object within the field of view of the LiDAR system to the transceiver module; a moveable platform coupled to the transceiver module, the moveable platform configured to move the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.
  • 2. The LiDAR system of claim 1, wherein the transceiver module comprises: a light source; wherein the at least one lens that defines the vertical angle of the field of view of the LiDAR system is included as part of a lens group that defines the vertical angle of the field of view; and a detector group.
  • 3. The LiDAR system of claim 2, wherein the light source is a fiber optic light source.
  • 4. The LiDAR system of claim 2, wherein the light source is a semiconductor based emitter light source, the LiDAR system further comprising a metal based printed circuit board (PCB) and driver circuitry mounted to the metal based PCB.
  • 5. The LiDAR system of claim 2, wherein the light source comprises a plurality of semiconductor based emitter light sources that fire light pulses in a sequence.
  • 6. The LiDAR system of claim 2, wherein the lens group is configured to direct light energy along a plurality of non-overlapping angles, wherein a combination of the plurality of non-overlapping angles forms the vertical angle of the field of view.
  • 7. The LiDAR system of claim 6, wherein a gap exists in between each of the plurality of non-overlapping angles, and wherein the moveable platform is configured to move the transceiver module in a manner that accounts for the gap that exists in between each of the plurality of non-overlapping angles.
  • 8. The LiDAR system of claim 6, wherein the detector group comprises at least one detection circuit for each of the plurality of non-overlapping angles.
  • 9. The LiDAR system of claim 1, wherein the polygon structure is configured to rotate about a rotation axis in a first direction at a substantially constant speed.
  • 10. The LiDAR system of claim 9, wherein the rotation axis is coincident to or different to a symmetric axis of the polygon structure.
  • 11. The LiDAR system of claim 9, wherein the polygon structure comprises a facet that is parallel to the rotation axis.
  • 12. The LiDAR system of claim 9, wherein the polygon structure is masked.
  • 13. The LiDAR system of claim 9, wherein the polygon structure is trimmed with respect to at least one corner or edge.
  • 14. The LiDAR system of claim 9, wherein the polygon structure comprises a facet that is non-parallel to the rotation axis.
  • 15. A light detection and ranging (LiDAR) system for use with a vehicle, comprising: a housing configured to be mounted to a windshield of the vehicle, the housing comprising: a transceiver module configured to transmit and receive light energy, the transceiver module comprising at least one lens that defines a vertical angle of a field of view of the LiDAR system; a polygon structure that defines a lateral angle of the field of view of the LiDAR system; and a moveable mirror positioned to receive the light energy from the transceiver module and redirect the received light energy to the polygon structure, the moveable mirror configured to adjust angles of light being emitted by the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.
  • 16. The LiDAR system of claim 15, wherein the transceiver module comprises: a light source; wherein the at least one lens that defines the vertical angle of the field of view of the LiDAR system is included as part of a lens group that defines the vertical angle of the field of view; and a detector group.
  • 17. The LiDAR system of claim 16, wherein the light source is a fiber optic light source.
  • 18. The LiDAR system of claim 16 wherein the light source is a semiconductor based emitter light source, the LiDAR system further comprising a metal based printed circuit board (PCB) and driver circuitry mounted to the metal based PCB.
  • 19. The LiDAR system of claim 16, wherein the light source comprises a plurality of semiconductor based emitter light sources that fire light pulses in a sequence.
  • 20. The LiDAR system of claim 16, wherein the lens group is configured to direct light energy along a plurality of non-overlapping angles, wherein a combination of the plurality of non-overlapping angles forms the vertical angle of the field of view.
  • 21. The LiDAR system of claim 20, wherein a gap exists in between each of the plurality of non-overlapping angles, and wherein the moveable mirror is positioned to redirect light energy passing between the transceiver module and the polygon structure in a manner that accounts for the gap that exists in between each of the plurality of non-overlapping angles.
  • 22. The LiDAR system of claim 20 wherein the detector group comprises at least one detection circuit for each of the plurality of non-overlapping angles.
  • 23. The LiDAR system of claim 15, wherein the polygon structure is configured to rotate about a rotation axis in a first direction at a substantially constant speed.
  • 24. The LiDAR system of claim 23, wherein the rotation axis is coincident to or different to a symmetric axis of the polygon structure.
  • 25. The LiDAR system of claim 23, wherein the polygon structure comprises a facet that is parallel to the rotation axis.
  • 26. The LiDAR system of claim 23 wherein the polygon structure is masked.
  • 27. The LiDAR system of claim 23, wherein the polygon structure is trimmed with respect to at least one corner or edge.
  • 28. The LiDAR system of claim 23, wherein the polygon structure comprises a facet that is non-parallel to the rotation axis.
  • 29. A method for using a LiDAR system comprising a transceiver module and a polygon structure, the method comprising: emitting, from the transceiver module, light energy that occupies a plurality of non-overlapping angles such that the emitted light energy is transmitted directly to the polygon structure, wherein the plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system; rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system; and adjusting a position of the transceiver module in a manner that results in an increase of resolution of a scene captured within the field of view.
  • 30. A method for using a LiDAR system comprising a transceiver module, a polygon structure, and a mirror, the method comprising: emitting, from the transceiver module, light energy that occupies a plurality of non-overlapping angles such that the emitted light energy is transmitted directly to the mirror, which directs the light energy to the polygon structure, wherein the plurality of non-overlapping angles define a vertical angle of a field of view of the LiDAR system; rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system; and adjusting a position of the mirror in a manner that results in an increase of resolution of a scene captured within the field of view.
  • 31. A method for using a LiDAR system comprising a transceiver module, a polygon structure, and a mirror, the method comprising: emitting, from the transceiver module, light energy that occupies a plurality of non-gap angles such that the emitted light energy is transmitted directly to the mirror, which directs the light energy to the polygon structure, wherein the plurality of non-gap angles and number of detectors define a vertical resolution of the LiDAR system, and wherein a maximum rotating angle of the mirror defines the angle of a field of view of the LiDAR system; and rotating the polygon structure in a first direction, wherein the rotating polygon structure defines a lateral angle of the field of view of the LiDAR system.
CROSS-REFERENCE TO A RELATED APPLICATION

This application claims the benefit of U.S. Provisional Application No. 62/615,280, filed Jan. 9, 2018, the disclosure of which is incorporated by reference herein in its entirety.

Related Publications (1): US 2019/0212416 A1, published Jul. 2019.
Provisional Applications (1): U.S. Provisional Application No. 62/615,280, filed Jan. 2018.