Distributed modular solid-state lidar system

Information

  • Patent Grant
  • 12153163
  • Patent Number
    12,153,163
  • Date Filed
    Friday, July 26, 2019
    5 years ago
  • Date Issued
    Tuesday, November 26, 2024
    2 months ago
  • Inventors
  • Original Assignees
    • Opsys Tech Ltd.
  • Examiners
    • Mikeska; Neil R
    Agents
    • Rauschenbach Patent Law Group, PLLC
    • Rauschenbach; Kurt
Abstract
A LIDAR system includes a first optical transmitter comprising a plurality of first emitters, where each of the plurality of first emitters is positioned to generate an optical beam with a FOV at a target range when energized. A second optical transmitter includes a plurality of second emitters, where each of the plurality of second emitters is positioned to generate an optical beam with a FOV at the target range when energized. The first and second optical transmitters are positioned relative to each other so the FOVs of at least some of the optical beams generated by the first and second optical transmitter when energized overlap at the target range. An optical receiver includes a plurality of optical detectors, where a respective one of the plurality of optical detectors is positioned to detect a respective optical beam generated by at least one of the first and second optical transmitter and reflected by a target in the FOV at the target range. A controller includes a first and second output being connected to respective control inputs of the first and second optical transmitters, and a third output being connected to a control input of the optical receiver. The controller generates control signals at the first and second outputs that control energizing select ones of the plurality of first and the plurality of second emitters that generate optical beams with the FOVs that overlap at the target range and generating a control signal at the third output that activates selected ones of the plurality of optical detectors to detect optical beams reflected from an object at the target range.
Description

The section headings used herein are for organizational purposes only and should not to be construed as limiting the subject matter described in the present application in any way.


INTRODUCTION

Autonomous, self-driving, and semi-autonomous automobiles use a combination of different sensors and technologies such as radar, image-recognition cameras, and sonar for detection and location of surrounding objects. These sensors enable a host of improvements in driver safety including collision warning, automatic-emergency braking, lane-departure warning, lane-keeping assistance, adaptive cruise control, and piloted driving. Among these sensor technologies, light detection and ranging (LIDAR) systems take a critical role, enabling real-time, high resolution 3D mapping of the surrounding environment.


The majority of commercially available LIDAR systems used for autonomous vehicles today utilize a small number of lasers, combined with some method of mechanically scanning the environment. For example, several manufacturers provide rotating LIDAR systems with transmit/receive optics located on a spinning motor in order provide a 360° horizontal field-of-view. A one-dimensional (1D) fixed array of lasers is used for the vertical direction, and the horizontal scan is accomplished through the rotation. Currently, the largest 1D array in the commercial market contains 128 lasers. Some vendors utilize a moving mirror to scan the FOV. For example, the mirror can be actuated using MEMS or galvanometers. In these systems, often only 1 or 2 lasers are used, in combination with two mirrors, one for the horizontal direction and one for the vertical direction.


In mechanically scanned LIDAR systems, the scanning method puts physical constraints on the receiver and transmitter optics. For example, in a rotating motor scanned LIDAR system, the receiver is typically placed together with the transmitter on the same motor in order allow the transmitter and receiver to “look at the same spot” during operation. Similarly, the receiver in the scanning mirror devices is often positioned to make use of the same moving mirror as the transmitter so that the light reflected from the target travels back through the shared mirror optics to reach the receiver. These physical constraints can put limits on the size of the LIDAR system(s) and the size and location of individual components. In addition, these physical constraints impact performance features like measurement range and SNR/cross-talk.





BRIEF DESCRIPTION OF THE DRAWINGS

The present teaching, in accordance with preferred and exemplary embodiments, together with further advantages thereof, is more particularly described in the following detailed description, taken in conjunction with the accompanying drawings. The skilled person in the art will understand that the drawings, described below, are for illustration purposes only. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating principles of the teaching. The drawings are not intended to limit the scope of the Applicant's teaching in any way.



FIG. 1 illustrates the operation of a LIDAR system implemented in a vehicle.



FIG. 2 illustrates a distributed sensor system for an autonomous vehicle that utilizes multiple types of known sensors.



FIG. 3 illustrates a schematic diagram representing a multi-sensor LIDAR system with multiple two-dimensional fields-of-view and ranges for an automobile.



FIG. 4 illustrates a schematic diagram of known rotating LIDAR system for a vehicle.



FIG. 5 illustrates an autonomous vehicle equipped with multiple known rotating LIDAR systems.



FIG. 6 illustrates a known LIDAR system with multiple moving parts.



FIG. 7A illustrates a schematic diagram of a solid-state LIDAR system.



FIG. 7B illustrates a two-dimensional projection of the system field-of-view (FOV) of the LIDAR system of FIG. 7A.



FIG. 8A illustrates a measurement point cloud for a LIDAR system according to the present teaching with a single 2D laser array source illuminating a field-of-view.



FIG. 8B illustrates a measurement point cloud for a LIDAR system according to the present teaching with two 2D laser array sources illuminating a field-of-view.



FIG. 9 illustrates a schematic diagram of a cross-section of a structure of a known bottom-emitting VCSEL laser that can be used in a LIDAR system of the present teaching.



FIG. 10 illustrates a schematic diagram of an embodiment of a 2D monolithic VCSEL array with 256 separate laser emitters according to the present teaching.



FIG. 11 illustrates a schematic diagram of an embodiment of a modular LIDAR system according to the present teaching that includes a separate transmit module and receive module.



FIG. 12 illustrates an embodiment of a composite FOV generated by the modular LIDAR system of FIG. 11.



FIG. 13 illustrates an autonomous car integrated with an embodiment of a modular LIDAR system of the present teaching.



FIG. 14 illustrates a schematic diagram of an embodiment of a modular LIDAR system according to the present teaching that includes a joint transmit/receive module.



FIG. 15 illustrates an embodiment of a composite field-of-view generated by the modular LIDAR system of FIG. 14.



FIG. 16 illustrates an embodiment of a modular LIDAR system according to the present teaching that is integrated into an autonomous vehicle.



FIG. 17 illustrates a schematic diagram of an embodiment of a modular LIDAR system of the present teaching with co-located controller module.



FIG. 18 illustrates an embodiment of a composite field-of-view generated by the modular LIDAR system of FIG. 17.



FIG. 19 illustrates an embodiment of an operating scenario comprising a modular LIDAR system according to the present teaching including two separate receivers and two separate transmitters.



FIG. 20 illustrates a block diagram of an embodiment of a controller of the present teaching.





DESCRIPTION OF VARIOUS EMBODIMENTS

The present teaching will now be described in more detail with reference to exemplary embodiments thereof as shown in the accompanying drawings. While the present teaching is described in conjunction with various embodiments and examples, it is not intended that the present teaching be limited to such embodiments. On the contrary, the present teaching encompasses various alternatives, modifications and equivalents, as will be appreciated by those of skill in the art. Those of ordinary skill in the art having access to the teaching herein will recognize additional implementations, modifications, and embodiments, as well as other fields of use, which are within the scope of the present disclosure as described herein.


Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the teaching. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


It should be understood that the individual steps of the methods of the present teaching can be performed in any order and/or simultaneously as long as the teaching remains operable. Furthermore, it should be understood that the apparatus and methods of the present teaching can include any number or all of the described embodiments as long as the teaching remains operable.


Known LIDAR systems with moving parts are problematic for many reasons including being prone to reliability issues. One aspect of the present teaching is the realization that a solid-state LIDAR system that requires no moving parts for scanning the field-of-view can be designed with fewer physical constraints, allowing for both performance improvement and increased flexibility in integration of the LIDAR system within an autonomous vehicle. In addition, a solid-state system with substantially more lasers makes possible operational modes that are not achievable with mechanical scanning, such as the ability to randomly scan the field-of-view without any mechanical constraint on the scanning pattern.


For many reasons, LIDAR systems generally require multiple separate units to support the measurement requirements. These reasons include, for example, size limitations, mounting restrictions, measurement requirements, integrations with other systems and many others. As such, LIDAR systems are needed that are modular and easily distributed.


The present teaching relates to solid-state LIDAR systems that contain no moving parts, with a plurality of lasers, where each laser can correspond to a single fixed projection angle. The use of solid-state lasers by itself does not mean that there are no moving parts, as MEMS devices are often referred to as solid-state. MEMS devices in LIDAR systems, however, typically incorporate physical motion, which can be a concern for reliability and device lifetime. One feature of the LIDAR system according to the present teaching is that the modular distributed system design is compatible with all solid-state systems, including those systems that use solid-state laser arrays and/or systems without MEMS devices. This includes systems with no physical motion.



FIG. 1 illustrates the operation of a LIDAR system implemented in a vehicle. The LIDAR system 100 includes a laser projector, also referred to as an illuminator, that projects light beams 102 generated by a light source toward a target scene and a receiver that receives the light 104 that reflects of an object, shown as a person 106, in that target scene. LIDAR systems typically also include a controller and/or system processor that computes the distance information about the object 106 from the reflected light, and an element that can scan or provide a particular pattern of the light that may be a static pattern across a desired range and field-of-view. The receiver, controller, and system processor are used to convert the received signal light into measurements that represent a pointwise 3D map of the surrounding environment that falls within the LIDAR system range and field-of-view. In various embodiments, the controller can be a simple electrical circuit or can be a more complicated processor, depending on the particular application and desired performance.


The laser source and optical beam projection system that form the illuminator and the receiver may be located on the front side of a vehicle 108. The laser source and optical beam projection system may also be referred to as a transmitter. A person 106, and/or another object, such as a car or light pole, will provide light reflected from the source back to the receiver. A controller or processor then determines the range, or distance, to the object. As is known in the art, a LIDAR receiver calculates range information based on time-of-flight measurements of light pulses emitted from the light source. In addition, known information about the optical beam profile that illuminates the scene in a target plane associated with a particular range and, based on the particular design of the source and projector system, is used to determine location information about the reflecting surface, thereby generating a complete X, Y, Z or three-dimensional picture of the scene. In other words, the pointwise three-dimensional (3D) map of the surrounding environment represents a collection of measurement data that indicates position information from all the surfaces that reflect the illumination from the source to the receiver within the field-of-view of the LIDAR system. In this way, a 3D representation of objects in the field-of-view of the LIDAR system is obtained. The pointwise 3D data map may also be referred to as a measurement point cloud.



FIG. 2 illustrates a distributed sensor system 200 for an autonomous vehicle 202 that utilizes multiple types of known sensors 204, 206, 208, 210, 212, 214, 216, 218. Each sensor technology has strengths and weaknesses. The combination of measurements output from the different sensors 204, 206, 208, 210, 212, 214, 216, 218 is used to produce a set of data from the scene. The data set from the distributed sensor system 200 that includes a variety of measurements from the scene is processed together, enabling error-free, safe, and reliable navigation under all conditions. The distributed sensor system 200 includes multiple cameras 204, 206, 208, multiple ultrasonic sensors 210, 212, multiple radars 214, 216, a rotating LIDAR system 218, and a controller 220. The autonomous car 202 uses cameras 204, 206, 208 that can distinguish colors. Cameras 204, 206, 208 may also be used to see traffic lights, turn signals, and other emitted light from the scene being measured. Cameras 204, 206, 208 can also be mounted with small physical separation, and their images combined to produce a stereo image. Ultrasonic sensors 210, 212 are used to detect proximity of objects within a few meters of the car. Radar 214, 216 is used to detect moving objects at long distance. In general, radar 214, 216 does not operate with the high resolution of the LIDAR system 218. The LIDAR system 218 is used to produce a high-resolution 3D point cloud of the environment in real-time. The controller 220 is used to generate and process the data set that is provided by the outputs of measurements from all the various sensors 204, 206, 208, 210, 212, 214, 216, 218. The controller may also be used to coordinate the operation of the various sensors 204, 206, 208, 210, 212, 214.



FIG. 3 illustrates a schematic diagram representing a multi-sensor LIDAR system 300 with multiple two-dimensional fields-of-view and ranges for an automobile 302. The system includes multiple LIDAR modules. The modules are not shown, only resulting fields-of-view and ranges are shown. For example, an adaptive cruise control function may require a field-of-view and range 304 with a narrow field-of-view, but with a longer-distance range requirement, compared to a system with a wider field-of-view and shorter range with a side-looking “surround view” field-of-view and range 306, 308. The composite field-of-view for the automobile is the combined field-of-view of all the LIDAR modules positioned around the vehicle, which can differ in performance characteristics of some or all of the modules. In the system 300 illustrated FIG. 3, five LIDAR modules (not shown) are used to create the composite field-of-view. The field-of-view and range 304 of the long-distance forward-looking LIDAR system has a field-of-view that overlaps with a second LIDAR module, which has a field-of-view and range 310 with a wider field-of-view, but a shorter range capability. There is also a rear-view module that has field-of-view and range 312 with similar dimensions of the field-of-view and range 310 of the second LIDAR system.


Although the present teaching describes LIDAR systems in the context of automotive vehicles, where LIDAR is widely used for autonomous, or self-driving, or driver-assisted vehicles, it should be understood that the embodiments may be applicable to any type of vehicle. Other types of vehicles might include robots, tractors, trucks, airplanes, drones, boats, ships, and others. The present teachings are also applicable to various stationary applications. For example, in high density, metropolitan areas, LIDAR can be employed to monitor traffic, both vehicular and pedestrian. It is anticipated that LIDAR systems will be deployed in many different applications, especially as the cost of the LIDAR systems becomes reduced in the near future. One skilled in the art will appreciate that the present teaching is not limited by the types of target objects described herein as being detected and ranged, but is more broadly applicable to any type of target.



FIG. 4 illustrates a schematic diagram of known rotating LIDAR system 400 for a vehicle. The LIDAR system 400 uses a rotating motor to provide a 360° horizontal scan of the environment. The LIDAR system 400 has a base 402. For example, the base 402 can be fixed to an automobile (not shown). The top portion 404 of the LIDAR system rotates relative to the base and contains the receiver optics and transmitter optics. The field-of-view of the transmitter and the field-of-view of the receiver overlap over the distances corresponding to the target image plane. The receiver and transmitter optics may be 1D arrays of lasers and detectors that are fixed in a fashion so as to cover the desired vertical field-of-view. Arrays of 16, 32, 64, and even 128 lasers are sometimes used with rotating LIDAR systems, such as LIDAR system 400 shown in FIG. 4. The vertical angular resolution is fixed, and typically equal to the ratio of the vertical field-of-view to the number of lasers in the fixed array. The horizontal angular resolution is a function of the rotation speed and the operating parameters of the system, and can be varied based on the application and desired performance.


In concept, a single rotating LIDAR system can provide a full 360° view, enabling the use of a single LIDAR system. However, in practice, multiple rotating LIDAR systems are used in order to split the field-of-view around the vehicle. FIG. 5 illustrates an autonomous vehicle equipped with multiple known rotating LIDAR systems 500. The system illustrated in FIG. 5 includes multiple separate rotating LIDAR systems 502 mounted on a roof rack, each configured to achieve a separate field-of-view that is less than 180°. Some systems include more than twelve separate rotating LIDAR systems 502. Note that, in general, a substantial portion of each rotating LIDAR unit has the field-of-view blocked in a direction to minimize optical cross-talk of the multiple rotating LIDAR units. When the rotating units are pointed in the direction where the field-of-view is blocked, no useful data is being generated by that particular LIDAR unit. In that regard, a significant penalty in cost and efficiency results from the use of rotating LIDAR versus a LIDAR system that operates continuously with a field-of-view of less than 180°.



FIG. 6 illustrates a known LIDAR system 600 with multiple moving parts. The system 600 uses a first 602 and second mirror 604 mounted on an X-Y galvanometer 606 that scans the environment in both a vertical and a horizontal direction. Light from a laser transmitter 608, which typically includes one or two lasers, reflects off the first mirror 602 onto the second mirror 604, and the combination of the two mirrors 602, 604 determines the projection angle. One or two lasers are typically used in these systems, instead of larger laser arrays like in rotating LIDAR systems, because of the physical difficulty of aligning multiple lasers/receivers onto the mirror system without substantial optical losses and complexity. The transmitted light reflects off a target at the image plane 610 and a receiver 612 is configured to have a field-of-view overlapping with the target at the image plane 610 so that the receiver 612 captures the reflected light. The size and physical orientation of the mirror place various mechanical constraints on the design. For example, the size of the mirrors constrains the amount of reflected light that can be gathered, acting as an effective maximum aperture. Mirror size also impacts the performance, with tradeoffs in power consumption, scanning speed, and system performance.



FIG. 7A illustrates a schematic diagram of a solid-state LIDAR system. The transmitter 702 includes a plurality of lasers that can be configured in an array. Each individual laser can be fired independently, with the optical beam emitted by each laser corresponding to a 3D projection angle subtending only a portion of the total system field-of-view. Thus, each laser emitter generates an optical beam when energized (fired) that illuminates a corresponding field-of-view at a target range. The transmitter field-of-view is a combination of the various energized emitter fields-of-view. While some embodiments of the present teaching use laser emitters, it should be understood that other emitters may be used, for example LEDs, or other devices that generate optical beams when energized. In some embodiments of the LIDAR system of present teaching, all of the light beams from the lasers in the laser array share the transmitter optics 704 to produce projected light (illumination) 706. The reflected light 708 from the target range 710 shares the receiver optics 712. In this example, an automobile 714 is shown in the target range 710. One example of a transmitter is described in detail in U.S. Patent Publication No. 2017/0307736 A1, which is assigned to the present assignee. The entire contents of U.S. Patent Publication No. 2017/0307736 A1 are incorporated herein by reference.


The LIDAR system 700 illustrated in FIG. 7A uses a detector array 716 that typically has a fewer number of individual detector elements than the transmitter array has individual lasers. These detectors can be individually controlled such that individual or groups of detectors are activated that detect light over a particular receiver field-of-view at the target range. The receiver field-of-view represents the composite of the activated detector fields-of-view. Activation may include, for example, a controllable shutter in front of the detectors that only passes light from a particular field-of-view at the target range, or biases applied to particular detectors or groups of detectors.



FIG. 7B illustrates a two-dimensional projection of the system field-of-view 750 of the LIDAR system 700 described in connection with FIG. 7A. An individual detector field-of-view of the detector array is shown as a small square 752. An illuminated measurement point associated with an individual laser that is energized in the transmitter is illustrated by a circle 754.


In FIG. 7B, the ratio of the number of detectors (i.e. number of small squares 752) to the number of lasers (i.e. number of circles 754) in each array is one to nine. In the particular example shown in FIG. 7B, the detector array is a 5×5 array. Compared to a prior art flash LIDAR system with equivalent angular resolution, the cost of the detector array can be significantly lower because the number of detectors required can be significantly lower. The measurement resolution of the system in FIG. 7A is not determined by the size of the detector elements, but instead is determined by the number of lasers in the transmitter and the collimation of the individual laser beams. These parameters translate in a known way to the size of the circle 754 at the target range, and the relative size of the circles 754 and the squares 752 that represent field-of-view of a detector element.



FIG. 7B illustrates a critical aspect of the LIDAR system of FIG. 7A. A single 3D measurement point in the overall field-of-view of the LIDAR system of FIG. 7A is highlighted by hash marks in a particular square 756, with that measurement point shown as particular dark circle 758, which corresponds to a specific individual laser in the laser array. It can be further seen in FIG. 7B that this measurement point falls within an individual detector where the field-of-view of that individual detector has been shown with a cross-hatch pattern for identification. This figure illustrates that the 3D resolution of the LIDAR system 700 is determined by the number of lasers, as each laser corresponds to a specific angular projection angle that gives rise to the size of the circles 754 at the target range, and the relative size of the circles 754 and the squares 752 that represent the field-of-view of a detector element.


One feature of the present teaching is that the LIDAR modules are solid-state with no moving parts. The LIDAR modules incorporate a plurality of lasers with each laser generating an optical beam with a fixed projection angle. The resulting composite field-of-view and the angular resolution of the LIDAR system, which includes one or more modules, are determined by the arrangement of optics and lasers and modules. The lasers in each module are grouped into 2D arrays, that when combined with corresponding optics, are positioned to both provide the desired module composite field-of-view, as well as the desired module angular resolution. Each individual laser generates an optical beam that corresponds to a single pointing angle. The angular resolution can be achieved by interleaving the field-of-view of two or more 2D laser arrays. When two fields-of-view generated by two separate 2D laser arrays are overlapped in space, a resulting composite field-of-view is generated with finer angular resolution.


The physical size of a LIDAR system is important for integration into the autonomous vehicle. Although current state-of-the-art systems are generally bulky, and often mounted on roof racks, the goal is to eventually integrate LIDAR systems into the chassis such that they can be visually seamless with the exterior, and cosmetically attractive. There are certain components, such as the headlights, where synergies can be gained by co-packaging with part or all of the LIDAR system. One aspect of the LIDAR system described herein is that it can be separated into small, modular components to support the desire to make LIDAR unobtrusive visually in the future.



FIG. 8A illustrates a measurement point cloud 800 for a single 2D laser array source illuminating a field-of-view. Each measurement point 802 represents a measurement signal based on light emitted by a particular laser element in the array. The distance 804 between the vertical spacing of the measurement points 802 determines the vertical angular resolution. The horizontal spacing 806 of the points 802 on the point cloud determines the horizontal angular resolution of the point cloud 800.



FIG. 8B illustrates a measurement point cloud 850 for an embodiment of the present teaching where the illumination of a second 2D laser array has been projected to substantially overlap with a first 2D laser array. A measurement point 852 corresponding to the first 2D VCSEL array is shown as a circle, a measurement point 854 from the second 2D VCSEL array is shown as a triangle. The composite point cloud 850 is formed by the interleaving of the field-of-view of the two separate 2D arrays in order to achieve a finer angular resolution in the horizontal, vertical, or a combination of horizontal and vertical directions. One example of such a transmitter is described U.S. Patent Application Ser. No. 62/538,149, entitled “VCSEL Array LIDAR With Small Angular Divergence”, which is assigned to the present assignee. The entire contents of U.S. Patent Application 62/538,149 are incorporated herein by reference.


A LIDAR system of the present teaching can utilize substantially more lasers than a prior art mechanically scanning LIDAR system since each laser corresponds to a specific projection angle. For example, if an autonomous vehicle requires the full 360° horizontal field-of-view, with a uniform vertical field-of-view of 20°, and the average angular resolution in both direction is 0.5°, then the number of lasers required equals 28,800. Not all types of lasers can be implemented in a practical manner for a LIDAR system that requires thousands of individual lasers. In practice, various embodiments of the LIDAR system of the present teaching that require large numbers of lasers will likely use 2D arrays of vertical cavity surface emitting lasers (VCSEL) due to their current size, cost, and performance capabilities. However, other kinds of transmitter arrays can also be used. One skilled in the will appreciate that numerous types of lasers can be used in the LIDAR systems according to the present teaching.


One feature of the present teaching is that it can use a variety of known VCSEL devices, including top-emitting VCSELs, bottom-emitting VCSELS, and various types of high-power VCSELs. FIG. 9 illustrates a schematic diagram of a cross-section of a structure of a known bottom-emitting VCSEL laser 900 that can be used in a LIDAR system of the present teaching. The area of the emission aperture 902 of the VCSEL laser 900 typically ranges from a few microns in diameter for mW power operation, up to 100 microns diameter or more for 100 mW and greater CW power operation. The VCSEL 900 is fabricated on a substrate 904 that can be, for example, GaAs, or numerous other semiconductor materials. An n-type distributed Bragg reflector (DBR) layer 906 is positioned on the substrate. An active region 908 is constructed on the n-type DBR layer 906, followed by an aperture that can be made formed in an oxide material. A p-type Distributed Bragg Grating DBR layer 912 is then grown on the active region. Typically, the p-type DBR is highly reflecting, and the n-type DBR is partially reflecting, resulting in light output 914 from the bottom, substrate-side of the layer structure. The active region 908, oxide aperture 910, and p-type DBR layer 912 are formed in a mesa structure. A top contact 916 and a bottom contact 918 are used to provide an electrical current to the active region to generate the output light. An oxide aperture 910 provides current confinement to the active region 908. The top contact 916 is p-type, and the bottom contact 918 is n-type. Emission apertures 902 are formed in the bottom contact 918 to allow the output light 914 to emerge from the bottom, substrate side of the bottom-emitting VCSEL. Note that only one emission aperture 902 is shown in FIG. 9 which illustrates one element of a multi-element array. This type of VCSEL 900 may be a single element, or multiple element VCSEL that can be fabricated as one- or two-dimensional arrays on the substrate 904.


In some embodiments, the VCSEL array is monolithic and the lasers all share a common substrate. A variety of common substrate types can be used. For example, the common substrate can be a semiconductor material. The common substrate can also include a ceramic material. In some embodiments, the 2D VCSEL array is assembled from a group of 1D laser bars.


In some embodiments, the VCSELs are top-emitting VCSELS. In other embodiments, the VCSELs are bottom-emitting VCSELS. The individual VCSELS can have either a single large emission aperture, or the individual VCSELS can be formed from two or more sub-apertures within a larger effective emission diameter. A group of sub-apertures forming a larger effective emission region is sometimes referred to as a cluster. The contacts on the VCSELS can connect a drive signal to one or more individual elements, or all the elements, or subsets of elements to provide various different illumination patterns when one or more electrical drive signals are supplied to the contacts.



FIG. 10 illustrates a schematic diagram of an embodiment of a 2D monolithic VCSEL array 1000 with 256 separate laser emitters 1002 according to the present teaching. Each laser emitter 1002 has an emission aperture of diameter “a” 1004. Emission from each single laser emitter 1002 substantially fills the full emission aperture. Each laser emitter 1002, therefore, generates a laser beam with initial diameter “a” which is equal to the diameter 1004 of the emission aperture. The laser emitters 1002 are spaced uniformly in the horizontal direction with a spacing dx 1006. The laser emitters are spaced uniformly in the vertical direction with a spacing dy 1008. The overall size of the array, measured from the centers of the outermost lasers is distance Dx 1010 in the horizontal direction and distance Dy 1012 in the vertical direction. The actual chip size will be slightly larger than in dimension than the distance Dx 1010 and the distance Dy 1012. In various embodiments, the emitters 1002 may produce beams with various shapes other than the circular emitter shape. For example, oval, square, rectangular and various odd shapes may be realized in various embodiments. In embodiments in which lasers are arranged as 2D arrays, the rows and columns of the lasers can be driven electrically in a matrix addressable fashion.


Some embodiments of the present teaching utilize bottom-emitting high-power arrays of VCSELs with a single large aperture per laser, such as the configuration shown in FIG. 9. Other embodiments of the present teaching utilize top-emitting or bottom-emitting high-power arrays of VCSELs with an overall emission area comprising sub-apertures. However, one skilled in the art will appreciate that the present teaching is not limited to any single configurations of top- and bottom-emitting VCSELs and associated emission apertures.


The use of 2D VCSEL arrays as a building block of the LIDAR systems described herein, establishes a platform that allows a small physical size for the transmitter. A typical 2D array with 256 high-power individual lasers, would be possible on with a monolithic chip of ˜4 mm×4 mm. The laser 2D array is then used together with optics that are chosen to keep the physical dimension as small as possible, for example through the use of micro-lens arrays, shared lenses of dimension <20 mm, or diffractive optics of maximum dimension of 20 mm as examples.



FIG. 11 illustrates a schematic diagram of an embodiment of a modular LIDAR system 1100 according to the present teaching that includes a single transmitter and single receiver per module. The transmitter, receiver, and controller are physically separate modules 1102, 1104, 1106, 1108. There are two transmitter modules 1102, 1104, a receiver module 1106 and a controller module 1108. Connections 1110 exist between the modules 1102, 1104, 1106, 1108 for provision of electrical power, and communication between the controller module 1108 and the other modules 1102, 1104, 1106. Communication between the modules 1102, 1104, 1106, 1108 could take place either optically or electrically, and they may use a variety of communication protocols, including Ethernet or other various other communication protocols. Power to the transmitter and receiver modules 1102, 1104, 1106, could be supplied through the controller module 1108 as shown, or can be supplied separately. The modular LIDAR system 1100 is a relatively simple configuration, including only two transmitter modules 1102, 1104 and a single receiver module 1106. In other embodiments, a single controller module 1108 is used control one or more transmitters and/or one or more receivers.



FIG. 12 illustrates an embodiment of an example composite field-of-view 1200 that is generated by the modular LIDAR system 1100 described in connection with FIG. 11. Referring to both FIGS. 11 and 12, there is a region representing field-of-view 1202 for one transmitter module 1102 and a region representing field-of-view 1204 for the other transmitter module 1104. As previously described, each transmitter module 1102, 1104 uses at least two 2D laser arrays that are substantially overlapped in the field-of-view to produce a finer angular measurement resolution. The field-of-view for individual laser elements from the first array is illustrated by circles 1206. The field-of-view for individual laser elements from the second array is illustrated by squares 1208. As shown in the figure, there are two 16×16 arrays of 256 lasers in each transmit module 1102, 1104. They produce the substantially overlapped fields-of-view shown by the close-packed arrays of corresponding circles 1206 and squares 1208. The dual-array configuration for both modules 1102, 1104 is the same in this embodiment. Also, the fields-of-view 1202, 1204 for the corresponding transmitters are configured to sit side-by-side at the target range.


The total composite field-of-view of the two transmitters is made up of 1,024 laser fields-of-view. In some embodiments, the emission wavelengths of laser arrays in the first transmit module 1102 and the second transmit module 1104 are not the same. Also, in some embodiments, the laser wavelengths within a single transmitter module 1102, 1104 is not the same. For example, each laser array may emit at a different wavelength. In this configuration, the laser fields-of-view represented by circles 1206 could be different in wavelength from the laser fields-of-view represented by squares 1208. In general, there is no limitation on the number and distribution of wavelengths within either arrays or modules. The number and positions of the different laser wavelengths is chosen based on a desired effect on the field-of-view at the measurement target range. For composite field-of-view 1200 a field-of-view of the single receiver module is shown by the dashed box 1210, which is slightly larger than the field-of-view of the combined transmitter fields-of-view 1202, 1204. In some embodiments, the receiver field-of-view 1210 comprises multiple detector fields-of-view from multiple detectors that are configured as an array. It is typically desired to have the receiver field-of-view 1210 slightly larger than the transmitter fields-of-view 1202, 1204 to simplify alignment of the receiver and transmitter fields-of-view.


The controller module 1108 manages the operation of the two transmitter modules 1102, 1104 to minimize optical cross-talk, as well as optical power levels for eye safe operation. The controller module 1108 also manages the receiver module 1106 which, in some embodiments uses at least one 2D array of detectors, and in other embodiments, uses a plurality of 2D arrays of detectors. The number and position of the various detector arrays are chosen to achieve a particular receive field-of-view at the target range. For example, if two wavelengths are used, then the receiver can have two separate 2D arrays of detector, one for each wavelength.


In general, the transmitter modules of the present teaching include a plurality of laser emitters. Each laser emitter is positioned to illuminate a particular emitter field-of-view at a target range. In various embodiments, the plurality of emitters is positioned in various patterns. For example, the plurality of emitters can be a one- and/or two-dimensional array with a particular spacing between emitters in the vertical and/or horizontal direction. Central to the invention is the fact that individual and/or groups of emitters are positioned to illuminate a particular emitter field-of-view at the target range. This information is typically known by the controller, and is used to produce illumination in particular fields-of-view at the target range by controlling the energizing of individual and/or groups of emitters. The field-of-view of the one or more emitter fields-of-view contained in a particular transmitter that are illuminated based on a signal from the controller at any particular time is referred to as the transmitter field-of-view.


In general, the receiver modules of the present teaching include a plurality of detector elements. Each detector element is positioned to detect light from a particular detector field-of-view at a target range. In various embodiments, the plurality of detectors is positioned in various patterns. For example, the plurality of detectors may be a one- and/or two-dimensional array with a particular spacing between detectors in the vertical and/or horizontal direction. One aspect of the present teaching is the fact that individual and/or groups of detectors can be positioned to detect light from a particular detection field-of-view at the target range. This information on field-of-view can be known by the controller, and is used to detect illumination from particular fields-of-view at the target range by controlling the detectors/receiver such that select detectors are activated to detect light reflected from the target range in the field-of-view of the activated detector or detectors. The field-of-view of the one or more detector fields-of-view contained in a particular receiver that are activated to detect based on a signal from the controller at any particular time is referred to as the receiver field-of-view.


The controller may control the activation of particular detectors by any of a number of known control mechanisms. For example, a controllable aperture may be used to control the activation of one or more detector fields-of-view in a receiver field-of-view. Individual detector biases can also be controlled to control the activation of individual detector fields-of-view within a receive field-of-view.


Overlap regions between different transmitters and/or different receivers are often designed into the system to provide continuous tracking of objects across particular composite fields-of-view. For example, a continuous composite field-of-view is provided by two transmitters when at least some of the fields-of-view of the emitters of each transmitter overlap to some degree. Furthermore, a continuous transmit receive composite field-of-view is provided when at least one detector field-of-view is overlapped with the overlapped emitter fields-of-view. Overlap of at least some individual detector fields-of-view from two receivers is also used to provide a continuous composite receive field-of-view across two different receivers.


In the overlap regions, a controller is used to achieve performance objectives from the system. For example, the controller can control energizing select emitters that generate optical beams with the fields-of-view that overlap at the target range so that only one emitter from each of two overlapped transmitters generate an optical beam that when reflected off the object at the target range is detected by one of the optical detectors during a particular light detection and ranging measurement. This eliminates or substantially reduces cross talk, especially of those transmitters operating at the same wavelength.


One feature of the present teaching is that the controlling of the emitters and detectors can be used to support performance of a modular LIDAR system. The controller has the information about various FOV positions as well as ability to control energizing of individual and groups of laser emitters and to control activation of individual detectors. The knowledge can extend beyond an individual transmitter and/or receiver to include all the connected modules. As such, the controller is able to provide a particular transmitter field-of-view and a particular receiver field-of-view at any particular time and/or for any desired measurement sequence at the target range from all connected transmit, receive, and/or transmit/receive modules. If different wavelengths are used within single transmitters or across multiple transmitters, the details of the transmission and detection for the multiple wavelengths is typically known and also actively managed by the controller.


In some embodiments of the LIDAR system of the present teaching, the receiver can detect different wavelengths separately. In these embodiments, the laser emitters can be simultaneously energized in an overlapping field-of-view, thereby allowing simultaneous measurements and improved angular resolution, as described in connection with FIG. 8B. One aspect of the present teaching is the realization that configuring the LIDAR system with a controller that has the ability to control the energizing of particular emitters and the activation of particular detectors provides the designer with control over a number of important performance parameters of the system. For example, the controller can be configured to control energizing of transmitters such that the illumination at a target range does not exceed an eye-safe Maximum Permissible Exposure (MPE) limit for optical power in a particular aperture size. Maximum Permissible Exposure can be exceeded, for example, if two or more emitter fields-of-view overlap such that the sum of the optical power within an aperture defined by eye safety standards is too high.


The controller can also be configured to control the transmitter emitters and the receiver detectors such that a one-to-one correspondence is maintained between an energized emitter and a detector configured to detect. The controller can control the transmitter emitters and the receiver detectors such that a one-to-one correspondence is maintained between an energized transmitter field-of-view from a group of emitters and a receiver field-of-view from a group of detectors configured to detect. The controller can also be configured to control the energizing of emitters and/or the detecting of detectors to improve various performance metrics, such as reducing measurement cross talk. For example, measurement cross talk can arise when a field-of-view of more than one emitter overlaps with a detector and/or receiver field-of-view. These overlap regions are often designed into the system to provide continuous tracking of objects across a composite field-of-view. That is, a continuous composite field-of-view is provided by two transmitters when at least some of the fields-of-view of the emitters of each transmitter overlap to some degree. Further, at least one detector field-of-view is overlapped with the emitter fields-of-view. In general, overlap of at least some individual detector fields-of-view from two receivers is also used to provide a continuous composite field-of-view across two different receivers.


In addition, the controller can be configured to control the energizing of emitters and/or the activation of detectors to produce a desired and/or varying angular resolution at the target range. The controller can also be configured to control the energizing of emitters and/or the activation of detectors to track an object across a composite field-of-view. The composite field-of-view can be provided by one or more transmitters, one or more receivers, and/or one or more modules that form the modular LIDAR system.


One feature of the modular approach to LIDAR system design according to the present teaching is the ability to easily integrate the various modules into an autonomous vehicle. FIG. 13 illustrates an autonomous car integrated with an embodiment of a modular LIDAR system 1300 according to the present teaching. Three receiver modules 1302, 1304, 1306 are depicted near the roof of the car 1306. Eight transmitter modules 1310, 1312, 1314, 1316, 1318, 1320, 1322, 1324 are shown, located in various areas, such as the side mirrors, the headlights, and the grill of the automobile 1308. The controller module 1326, is separately located in the trunk area of the automobile 1308. The ability to have multiple modules of relatively small size allows the location of the transmitters and receives in virtually any location of the automobile 1308, which allows for seamless integration into the vehicle. The controller module 1326 acts to coordinate the operation of the various receiver modules 1302, 1304, 1306 and transmitter modules 1310, 1312, 1314, 1316, 1318, 1320, 1322, 1324 to minimize optical cross-talk, and to control eye safety.


The use of a single controller module 1326 also allows for the smooth tracking of an object crossing in front of the vehicle, since with the solid-state system, the system can control the laser energizing pattern of elements in the laser arrays within various transmitter modules 1310, 1312, 1314, 1316, 1318, 1320, 1322, 1324 to continuously measure an object of interest. As the object crosses the boundary from one transmitter field-of-view to the next transmitter field-of-view, the controller module 1326 can coordinate the transmitter modules 1310, 1312, 1314, 1316, 1318, 1320, 1322, 1324 and the receiver modules 1302, 1304, 1306 to prevent any gap in the measurements. The modular design described herein is also easier to integrate into different types of vehicles as the physical size is relatively small. Known LIDAR systems with a larger single unit that takes up more space might have to be located external to the vehicle, such as shown in the system described in connection with FIG. 5. Clearly, the automobile described in connection with FIG. 5 is impacted aesthetically in an unattractive manner by the need to have a large roof rack with a mechanical rotating LIDAR and will have undesirable air flow dynamics effecting efficiency of the automobile.


One feature of the modular LIDAR systems of the present teaching is the ability to minimize electrical cross-talk between the transmitter and the receiver elements. The lasers used for automotive LIDAR typically are relatively high-power, operating at the limit of eye safety and often require large electrical driving pulse currents to generate optical pulses. The electrical signal from the transmitter, if allowed to electromagnetically couple to the receive circuit, produces excess noise that may provide a false return signal. This is a particularly difficult problem in the case of a LIDAR system that has capability to energize more than one laser pulse, using a different wavelength. In these system configurations, the energizing time of any particular transmit laser can be completely independent of receiver timing. The modular LIDAR systems of the present teaching can separate the transmit and the receive signals to substantially mitigate any electrical cross-talk concern. This separation also can be used improve EMI emissions.



FIG. 14 illustrates a schematic diagram of an embodiment of a modular LIDAR system 1400 according to the present teaching that includes multiple transmitters per module. In FIG. 14, one modular unit 1402, 1404 contains several transmitters 1406 and a single receiver 1408. The controller module 1410 is a separate module. Connections exist between the modules using cables 1412 for provision of electrical power and for communication. Communication between the modules 1402, 1404, 1410 could take place either optically or electrically, and in a variety of communication protocols including Ethernet or other communication protocols. Power to the transmit/receive modules could be supplied through the controller module 1410 as shown, or separately.



FIG. 15 illustrates an embodiment of a composite field-of-view 1500 generated by the modular LIDAR system 1400 of FIG. 14. Referring to both FIGS. 14 and 15, there is a field-of-view 1502 generated for the transmit/receive module 1402, and a field-of-view 1504 generated for the transmit/receive module 1404. Each of these fields-of-view 1502, 1504 include transmit fields-of-view 1506, 1508, 1510, 1512 for each the two transmitters in each module 1402, 1404 and receive fields-of-view 1514, 1516 for each receiver in each module 1402, 1404. The transmit/receive field-of-view 1502, 1504 for each module 1402, 1404 is nominally the same as that shown in FIG. 12, and includes fields-of-view for each of 1,024 lasers that are substantially overlapped. The composite field-of-view, therefore, has fields-of-view for each of 2,048 lasers. The field-of-view 1500 of the two modules 1402, 1404 has been arranged to provide exactly two times the total field-of-view in the horizontal direction as for one individual unit. In various embodiments, the wavelengths of the two transmit/receive modules can be the same, or can different. One skilled in the art will appreciate that the present teaching is not limited by the number of wavelengths employed by the system. The controller module 1410 operates both transmit-receive modules 1402, 1404 and acts to minimize optical cross-talk between modules and also to maintain eye safety by controlling the laser drive energizing patterns for the transmit arrays.


One feature of the modular LIDAR system of the present teaching is that the composite field-of-view is designed to allow continuous tracking of an object across a field-of-view. Referring to the composite field-of-view 1500 described in connection with FIG. 15, the advantage of continuous tracking is that there are no gaps between the transmit fields-of-view, 1506, 1508, 1510, 1512. However, there are overlaps in the receive fields-of-view 1514, 1516. This produces an overlap region 1518 where both the receive fields-of-view 1514, 1516 can detect light from both transmit fields-of-view 1508, 1510. The controller module 1410 is used to manage the energizing of particular lasers in the transmit arrays that produce light in the overlap region 1518 so that there is only minimal cross-talk. Consequently, the laser light produced in the transmit field-of-view 1510 is minimized in the receive field-of-view 1514. Also, the laser light from the transmit field-of-view 1508 is minimized in the receive field-of-view 1516. In addition, the controller module 1410 manages energizing particular lasers in the transmitter arrays such that a peak power within the composite field-of-view 1500 never exceeds an eye safe threshold power. In addition, the controller controls energizing particular lasers in the transmitter arrays that generate optical beams with the overlap region 1518 at the target range so that only one of the first transmitter in transmit/receive module 1402 and second transmitter in transmit/receive module 1404 generate an optical beam that when reflected off the object at the target range is detected by one of the plurality of optical detectors in only one of the first and second transmit/receive modules 1402, 1404 during a particular light detection and ranging measurement.



FIG. 16 illustrates an embodiment a modular LIDAR system 1600 according to the present teaching that is integrated into an autonomous vehicle 1602. FIG. 16 shows how the modular transmit/receive modules 1604, 1606, 1608 can be positioned on the automobile 1602. The two transmit/receive modules 1604, 1606 are placed close the roof 1610, while a single module 1606 is placed in the grill 1612 of the car 1602. The controller module 1614 is in the trunk of the car 1602. This arrangement allows easier integration into a variety of vehicles, as compared to known large mechanical rotating LIDAR systems.


One feature of the present teaching is that the transmit, receive, and/or transmit/receive modules do not necessarily need to be packaged separately from the controller module. As such, the transmit, receive, and/or transmit/receive modules and controller module do not have to be located in physically different locations. FIG. 17 illustrates a schematic diagram of an embodiment of a modular LIDAR system 1700 of the present teaching with co-located controller module 1702. A single enclosure 1704 is used to group together a set of transmit/receive modules 1706, 1708, 1710, 1712, 1714, 1716 with a single controller module 1702. For example, in the case of a large vehicle, such as a semi-trailer truck or a garbage truck, it might be desired to use a modular system with a single, common enclosure 1704. Such a common enclosure 1704 could provide additional robustness to environmental factors.



FIG. 18 illustrates an embodiment of a composite field-of-view 1800 generated by the modular LIDAR system 1700 of FIG. 17. Referring to both FIGS. 17 and 18, each transmit/receive module 1706, 1708, 1710, 1712, 1714, 1716 has a corresponding receive field-of-view 1802, 1804, 1806, 1808, 1810, 1812 and two corresponding transmit fields-of-view 1814, 1816, 1818, 1820, 1822, 1824, 1826, 1828, 1830, 1832, 1834, 1836. The transmit/receive modules 1706, 1708, 1710, 1712, 1714, 1716 use twenty-four 2D laser arrays with 256 lasers each. FIG. 18 illustrates that there are multiple overlaps in the receive fields-of-view 1802, 1804, 1806, 1808, 1810, 1812. The receive fields-of-view 1802, 1804, 1806, 1808, 1810, 1812 are shown with different line types to help illustrate the overlap. In this configuration, multiple edges or overlap regions between transmit fields-of-view need to be managed in order to avoid optical cross-talk and to ensure eye safety. The controller module 1702 operates all the transmit/receive modules 1706, 1708, 1710, 1712, 1714, 1716 to ensure that safety and performance specifications are satisfied, and also allows for optimization of functions such as tracking an object within the composite field-of-view 1800.
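With six modules, the controller must track several overlap regions at once. As a hedged sketch, the fragment below models receive fields-of-view as horizontal angular intervals and enumerates every pairwise overlap; the interval values and dictionary keys are hypothetical and are not taken from FIG. 18.

```python
# Illustrative sketch: enumerate the pairwise overlap regions between receiver
# fields-of-view so each can be managed for cross-talk and eye safety. FOVs
# are modeled as horizontal angular intervals (start_deg, end_deg).

from itertools import combinations

receiver_fovs = {
    "1802": (0.0, 30.0),   # assumed values for illustration
    "1804": (25.0, 55.0),
    "1806": (50.0, 80.0),
}

def overlap(a, b):
    """Return the overlapping angular interval of two FOVs, or None."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None

overlap_regions = {
    (ra, rb): ov
    for (ra, fa), (rb, fb) in combinations(receiver_fovs.items(), 2)
    if (ov := overlap(fa, fb)) is not None
}
print(overlap_regions)  # {('1802','1804'): (25.0, 30.0), ('1804','1806'): (50.0, 55.0)}
```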



FIG. 19 illustrates an embodiment of an operating scenario 1900 comprising a modular LIDAR system according to the present teaching with two separate receivers 1902, 1904 and two separate transmitters 1906, 1908. In this operating scenario 1900, the two receivers 1902, 1904 each have an overall field-of-view equal to 30° and are configured with some overlap in their fields-of-view in order to ensure that there are no gaps in coverage. This allows continuous tracking of, for example, pedestrians 1910, 1912, 1914. The two separate transmitters 1906, 1908 each contain a plurality of lasers. The two pairs of dashed lines 1916, 1918, 1920, 1922 from each transmitter show the projection angles of two lasers within each transmitter. These lasers are chosen to correspond to the pedestrians 1910, 1912, 1914. Pedestrian #1 1910 is only within the field-of-view of transmitter #1 1906 and receiver #1 1902. Pedestrian #3 1914 is only within the field-of-view of transmitter #2 1908 and receiver #2 1904. Pedestrian #2 1912, however, is within the fields-of-view of both transmitters 1906, 1908 and both receivers 1902, 1904.


Both receivers 1902, 1904 use an array of detectors. Each detector in the array has a field-of-view that is a subset of the overall receiver field-of-view. See, for example, FIG. 7B and the associated description. In the operating scenario 1900 shown, the field-of-view of each receiver is illustrated as split into three sections, each with a 10° field-of-view. For receiver #1 1902, these are regions 1A 1924, 1B 1926, and 1C 1928. For receiver #2 1904, these are regions 2A 1930, 2B 1932, and 2C 1934. A three-element detector array of uniform dimension in the horizontal plane would produce the fields-of-view 1924, 1926, 1928, 1930, 1932, 1934 for either receiver 1902, 1904. It should be understood that the operating scenario 1900 is drawn for simplicity and that the detector array size might be larger in actual practice in order to provide a smaller field-of-view per individual detector. The field-of-view of each detector is labeled uniquely (1A, 1B, 1C, 2A, 2B, 2C) for reference.
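A minimal sketch of this angle-to-detector bookkeeping follows, assuming the 30° receiver field-of-view is divided evenly into three 10° detector sub-fields as drawn; the angular origin and label scheme are illustrative assumptions.

```python
# Map an angle to the 10°-wide detector sub-field that contains it, under the
# assumed even three-way split of a 30° receiver field-of-view.

from typing import Optional

DETECTOR_LABELS = ["A", "B", "C"]

def detector_for_angle(receiver_start_deg: float, angle_deg: float) -> Optional[str]:
    """Return the detector label covering angle_deg, or None if out of view."""
    offset = angle_deg - receiver_start_deg
    if not 0.0 <= offset < 30.0:
        return None  # outside this receiver's overall field-of-view
    return DETECTOR_LABELS[int(offset // 10.0)]

# Receiver #1 spanning 0°-30°: an object at 17° falls in detector 1B.
print("1" + detector_for_angle(0.0, 17.0))  # -> "1B"
```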


A single controller 1938 can be used to control both receivers 1902, 1904 and both transmitters 1906, 1908. The operating scenario 1900 shown in FIG. 19 illustrates that pedestrian #1 is seen only by receiver #1, but by both detectors 1A and 1B. Pedestrian #2 can be seen by both receiver #1 and receiver #2, as the fields-of-view of detector 1C and detector 2A have substantial overlap. The overlap is indicated by the diagonally hatched region 1936. The controller 1938 has knowledge of the particular fields-of-view 1924, 1926, 1928, 1930, 1932, 1934 corresponding to each detector element for both receivers 1902, 1904 and thus has knowledge of this region of overlap. The controller 1938 also has knowledge of the projection angles 1916, 1918, 1920, 1922 of the lasers within transmitter #1 1906 and within transmitter #2 1908 and thus can compute which detectors at which range distance can be used to measure the reflected pulse from each laser within each transmitter 1906, 1908.
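One plausible form of that computation, sketched under assumed coordinates: project the laser's beam out to a candidate range, then find the bearing of the illuminated point as seen from the receiver, which selects the detector whose sub-field contains that bearing. The two-dimensional geometry below is a simplified illustration, not the patent's method.

```python
# Illustrative 2D geometry: where a laser's beam lands at a given range, and
# the bearing of that point from a receiver offset from the transmitter.

import math

def illuminated_point(laser_xy, angle_deg, range_m):
    """Point reached by a laser fired from laser_xy at angle_deg, range_m away."""
    x0, y0 = laser_xy
    a = math.radians(angle_deg)
    return (x0 + range_m * math.sin(a), y0 + range_m * math.cos(a))

def angle_at_receiver(receiver_xy, point_xy):
    """Bearing of an illuminated point as seen from a receiver."""
    dx = point_xy[0] - receiver_xy[0]
    dy = point_xy[1] - receiver_xy[1]
    return math.degrees(math.atan2(dx, dy))

# Transmitter #1 at the origin, receiver #1 offset 0.1 m to the side:
pt = illuminated_point((0.0, 0.0), angle_deg=5.0, range_m=20.0)
print(round(angle_at_receiver((0.1, 0.0), pt), 2))  # bearing used to pick a detector
```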


The controller 1938 sets the laser energizing pattern for lasers in the two transmitters 1906, 1908, and selects which detector or combination of detectors within all possible receivers 1902, 1904 measures the return signal pulse. Typical criteria for selecting the optimum detector to measure each laser and range include maximum received signal power or highest SNR. There is a possibility of optical cross-talk resulting in a false measurement if the controller 1938 does not control the energizing pattern in the overlap region 1936 between the detectors in the two receivers 1902, 1904. For example, pedestrian #2 1912 can be measured using either transmitter 1906, 1908. If the controller 1938 energized a pulse from transmitter #1 1906 aimed at pedestrian #2 1912 and, before that pulse was measured by a receiver 1902, 1904, energized a second pulse from transmitter #2 1908, then both pulses would be observed at a single detector during a single measurement. This would result in an error in distance and/or a false object being detected. The controller 1938 therefore controls the energizing pattern so as to maintain a strict one-to-one mapping between laser and detector during a single measurement, within the field-of-view of that chosen detector. This prevents optical cross-talk and measurement errors within the modular LIDAR system.
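The one-to-one rule lends itself to a simple scheduling check. The sketch below is an illustrative simplification (a real controller would also account for every detector whose field-of-view covers the target, per the overlap discussion above); the class and method names are hypothetical.

```python
# Illustrative scheduler: a pulse is only fired if its target detector is not
# already waiting on an in-flight pulse, so two pulses can never land on one
# detector within a single measurement.

class MeasurementScheduler:
    def __init__(self):
        self.busy_detectors = set()  # detectors with an in-flight pulse

    def try_fire(self, laser_id: str, detector_id: str) -> bool:
        """Fire laser_id only if detector_id is free; claim it if so."""
        if detector_id in self.busy_detectors:
            return False  # would create cross-talk at this detector
        self.busy_detectors.add(detector_id)
        return True

    def pulse_measured(self, detector_id: str):
        """Release the detector once its return pulse has been measured."""
        self.busy_detectors.discard(detector_id)

s = MeasurementScheduler()
assert s.try_fire("tx1_laser7", "det_1C")      # transmitter #1 at pedestrian #2
assert not s.try_fire("tx2_laser3", "det_1C")  # blocked until measurement completes
s.pulse_measured("det_1C")
assert s.try_fire("tx2_laser3", "det_1C")
```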



FIG. 20 illustrates a block diagram 2000 of an embodiment of a controller 2002 of the present teaching. For simplicity, the controller 2002 is shown communicating with two receivers 2004, 2006 and two transmitters 2008, 2010, whereas in actual use any number of receivers and transmitters could be present. The controller 2002 has various sets of information stored and available. This information can be determined during setup/calibration and/or updated during operation of the system. This information includes system constraints 2012. For example, system constraints can include the maximum duty cycle for a single laser, eye safety constraints on system operation, and other types of constraints. There is also laser data 2014 that includes the projection angle and origin point of each laser with respect to its corresponding transmitter. In addition, there is detector data 2016 that includes the field-of-view of each detector and the origin of each detector with respect to its corresponding receiver. Furthermore, the controller 2002 has system position information 2018, for example, information on the position of each transmitter and receiver module within the system relative to some global origin and/or axes.
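The stored information described above maps naturally onto a handful of records. The following dataclass sketch is only illustrative; every field name and unit is an assumption, not the patent's data format.

```python
# A minimal sketch of the calibration data the controller is described as
# holding: system constraints, per-laser geometry, per-detector geometry, and
# module positions relative to a global origin. All names are assumptions.

from dataclasses import dataclass, field

@dataclass
class SystemConstraints:
    max_laser_duty_cycle: float   # e.g. 0.01 for 1%
    eye_safety_peak_w: float      # peak power ceiling in any aperture

@dataclass
class LaserInfo:
    projection_angle_deg: float   # angle relative to its transmitter
    origin_xy_m: tuple            # origin point within the transmitter

@dataclass
class DetectorInfo:
    fov_deg: tuple                # (start, end) relative to its receiver
    origin_xy_m: tuple

@dataclass
class ModulePose:
    position_xy_m: tuple          # relative to a global origin
    heading_deg: float            # relative to global axes

@dataclass
class ControllerCalibration:
    constraints: SystemConstraints
    lasers: dict = field(default_factory=dict)     # laser_id -> LaserInfo
    detectors: dict = field(default_factory=dict)  # detector_id -> DetectorInfo
    modules: dict = field(default_factory=dict)    # module_id -> ModulePose
```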


The controller 2002 can utilize the TOF measurement information from the receivers as an additional input to the laser selection and energizing pattern algorithm. The TOF computation can be performed within the same physical unit as the optical receiver or can be performed elsewhere.
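For reference, the TOF computation itself reduces to halving the round-trip time and multiplying by the speed of light, as in this small worked example:

```python
# Time-of-flight relation: range is half the round-trip time times the speed
# of light. A worked example, independent of any particular hardware.

C_M_PER_S = 299_792_458.0

def tof_to_range_m(round_trip_s: float) -> float:
    """Convert a round-trip pulse time to a target range in meters."""
    return C_M_PER_S * round_trip_s / 2.0

print(tof_to_range_m(133.3e-9))  # a ~20 m target gives a ~133 ns round trip
```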


The controller 2002 executes an algorithm that controls the laser selection and energizing pattern. Since this is a solid-state system, many types of energizing and scanning patterns are possible within the field-of-view. In one embodiment of a method according to the present teaching, the controller 2002 executes a first step 2020 of selecting the lasers to be fired for each transmitter. In a second step 2022, the controller selects the corresponding detectors. In a third step 2024, the controller checks for overlap in the fields-of-view of those lasers and detectors. If there is no overlap that could result in errors and/or optical cross-talk, the controller proceeds to a fourth step 2026 in which the lasers are energized. The corresponding return pulses are received at receiver #1 2004 and/or receiver #2 2006. If there is overlap, the controller 2002 returns to the first step 2020, chooses a new set of lasers to energize, and adjusts the energizing pattern as required to accommodate the desired overlap conditions. After return pulses are received at receiver #1 2004 and/or receiver #2 2006, a TOF analysis 2028 is performed. The results of the analysis 2028 can be used in the first step 2020 to determine which energizing pattern to proceed with next.
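The four-step loop can be made concrete with a short sketch. The fragment below uses stand-in selection logic (random laser choice and a trivial laser-to-detector mapping) purely to show the control flow; none of it is the patent's algorithm.

```python
# Hedged sketch of the four-step loop: select lasers, select detectors, check
# overlap, fire, then feed TOF results into the next selection. Stub logic only.

import random

def has_conflicting_overlap(lasers, detectors):
    # Stand-in check: a conflict exists if two lasers map to the same detector.
    return len(set(detectors)) < len(detectors)

def run_measurement_cycles(n_cycles=3):
    for cycle in range(n_cycles):
        while True:
            lasers = random.sample(range(8), k=2)          # step 1 (2020)
            detectors = [laser // 3 for laser in lasers]   # step 2 (2022)
            if not has_conflicting_overlap(lasers, detectors):  # step 3 (2024)
                break  # no cross-talk risk; proceed to fire
        returns = [(laser, 133e-9) for laser in lasers]    # step 4 (2026): fire
        tof_ranges = [t * 3e8 / 2 for _, t in returns]     # TOF analysis (2028)
        print(cycle, lasers, tof_ranges)                   # informs next cycle

run_measurement_cycles()
```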


EQUIVALENTS

While the Applicant's teaching is described in conjunction with various embodiments, it is not intended that the Applicant's teaching be limited to such embodiments. On the contrary, the Applicant's teaching encompasses various alternatives, modifications, and equivalents, as will be appreciated by those of skill in the art, which may be made therein without departing from the spirit and scope of the teaching.

Claims
  • 1. A modular light detection and ranging (LIDAR) system comprising:
    a) a first optical transmitter comprising a plurality of first emitters configured to generate optical beams when energized that combine to have a first transmitter field-of-view (FOV) at a target range that is a combination of fields-of-view of each of the plurality of first emitters, the generated optical beam of each of the plurality of first emitters corresponding to a projection angle subtending only a portion of the first transmitter field-of-view;
    b) a second optical transmitter comprising a plurality of second emitters configured to generate optical beams when energized that combine to have a second transmitter field-of-view at the target range, the generated optical beam of each of the plurality of second emitters corresponding to a projection angle subtending only a portion of the second transmitter field-of-view, wherein the first and second optical transmitters are positioned relative to each other so fields-of-view of at least some of the optical beams generated by the first and second optical transmitter when energized overlap at the target range;
    c) an optical receiver comprising a plurality of optical detectors, a respective one of the plurality of optical detectors being positioned to detect a respective optical beam generated by at least one of the first and second optical transmitter and reflected by a target in at least one of the first and second transmitter field-of-view at the target range; and
    d) a controller comprising a first and second output being connected to respective control inputs of the first and second optical transmitters, and a third output being connected to a control input of the optical receiver, the controller generating control signals at the first and second outputs that control energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating a desired field-of-view at the target range that comprises a projection angle subtending only at least one of a portion, which is less than a full portion, of the first transmitter field-of-view associated with the select ones of the plurality of the first emitters and a portion, which is less than a full portion, of the second transmitter field-of-view associated with the select ones of the plurality of the second emitters, and generating a control signal at the third output that activates selected ones of the plurality of optical detectors to detect optical beams reflected from an object at the target range, thereby detecting light from the desired field-of-view at the target range.
  • 2. The modular light detection and ranging system of claim 1 wherein the controller controls energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating the desired field-of-view at the target range so that only one of the first and second transmitters generate an optical beam that when reflected from the object at the target range is detected by one of the plurality of optical detectors during a particular light detection and ranging measurement.
  • 3. The modular light detection and ranging system of claim 1 wherein the controller controls energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating the desired field-of-view at the target range so that cross talk resulting from optical beams with fields-of-view that overlap at the target range is reduced.
  • 4. The modular light detection and ranging system of claim 1 wherein the controller controls energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating the desired field-of-view at the target range so that cross talk resulting from optical beams with fields-of-view that overlap at the target range is minimized.
  • 5. The modular light detection and ranging system of claim 1 wherein the controller generates control signals at the first and second outputs that control energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating the desired field-of-view at the target range so that an optical power in an aperture at the target range is less than a predetermined amount.
  • 6. The modular light detection and ranging system of claim 5 wherein the predetermined amount is less than a maximum permissible exposure (MPE).
  • 7. The modular light detection and ranging system of claim 1 further comprising:
    a) a second optical receiver comprising a plurality of optical detectors, a respective one of the plurality of optical detectors in the second optical receiver being positioned to detect a respective optical beam generated by at least one of the first and second optical transmitter and reflected by the target in the field-of-view at the target range; and
    b) the controller further comprising a fourth output being connected to a control input of the second optical receiver, the controller generating a control signal at the fourth output that activates selected ones of the plurality of optical detectors in the second optical receiver to detect optical beams reflected from the object at the target range, wherein the controller generates control signals at the third and fourth outputs such that only selected ones of the plurality of optical detectors in either the first optical receiver or the plurality of optical detectors in the second optical receiver detect optical beams reflected from the object at the target range at one time.
  • 8. The modular light detection and ranging system of claim 1 wherein the first and second optical transmitters are housed in a single enclosure.
  • 9. The modular light detection and ranging system of claim 1 wherein the first and second optical transmitters are housed in physically separate enclosures.
  • 10. The modular light detection and ranging system of claim 1 wherein the optical receiver is housed in an enclosure that is physically separate from enclosures housing the first and second optical transmitters.
  • 11. The modular light detection and ranging system of claim 1 wherein at least one of the first and second optical transmitters and the optical receiver are housed in a same enclosure.
  • 12. The modular light detection and ranging system of claim 1 wherein at least one of the first and second optical transmitters, the optical receiver, and the controller are housed in a same enclosure.
  • 13. The modular light detection and ranging system of claim 1 wherein the plurality of optical detectors in the optical receiver comprises a two-dimensional array.
  • 14. The modular light detection and ranging system of claim 1 wherein the plurality of first emitters in the first optical transmitter comprises a two-dimensional array of emitters.
  • 15. The modular light detection and ranging system of claim 14 wherein the two-dimensional array of emitters comprises a matrix addressable array of emitters.
  • 16. The modular light detection and ranging system of claim 1 wherein the plurality of first emitters in the first optical transmitter comprises a first and second two-dimensional array of emitters.
  • 17. The modular light detection and ranging system of claim 16 wherein the first and second two-dimensional array of emitters generate optical beams with an interleaved field-of-view.
  • 18. The modular light detection and ranging system of claim 1 wherein the plurality of first emitters in the first optical transmitter comprises a VCSEL array.
  • 19. The modular light detection and ranging system of claim 1 wherein the plurality of first emitters generates optical beams at a first wavelength and the plurality of second emitters generates optical beams at a second wavelength.
  • 20. The modular light detection and ranging system of claim 1 wherein at least two of the plurality of first emitters generates optical beams at different wavelengths.
  • 21. The modular light detection and ranging system of claim 1 wherein the controller generates the control signals at the first and second outputs that control energizing selected ones of the plurality of first and the plurality of second emitters based on a relative position of the first and second optical transmitters.
  • 22. The modular light detection and ranging system of claim 1 wherein the controller generates control signals at the first, second, and third outputs that control energizing select ones of the plurality of first and the plurality of second emitters and controls the activation of selected ones of the plurality of optical detectors based on relative positions of the first optical transmitter, the second optical transmitter, and the optical receiver.
  • 23. A modular light detection and ranging (LIDAR) system comprising:
    a) a first optical transmitter comprising a plurality of first emitters configured to generate optical beams when energized that combine to have a first transmitter field-of-view (FOV) at a target range that is a combination of fields-of-view of each of the plurality of first emitters, the generated optical beam of each of the plurality of first emitters corresponding to a projection angle subtending only a portion of the first transmitter field-of-view;
    b) a second optical transmitter comprising a plurality of second emitters configured to generate optical beams when energized that combine to have a second transmitter field-of-view at the target range, the generated optical beam of each of the plurality of second emitters corresponding to a projection angle subtending only a portion of the second transmitter field-of-view, wherein the first and second optical transmitters are positioned relative to each other so fields-of-view of at least some of the optical beams generated by the first and second optical transmitter when energized overlap at the target range;
    c) an optical receiver comprising a plurality of optical detectors, a respective one of the plurality of optical detectors being positioned to detect a respective optical beam generated by at least one of the first and second optical transmitter and reflected by a target in at least one of the first and second transmitter field-of-view at the target range; and
    d) a controller comprising a first and second output being connected to respective control inputs of the first and second optical transmitters, and a third output being connected to a control input of the optical receiver, the controller generating control signals at the first and second outputs that control energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating a desired field-of-view at the target range that comprises a projection angle subtending only a portion, which is less than a full portion, of the first transmitter field-of-view associated with the select ones of the plurality of the first emitters or a portion, which is less than a full portion, of the second transmitter field-of-view associated with the select ones of the plurality of the second emitters, and generating a control signal at the third output that activates selected ones of the plurality of optical detectors to detect optical beams reflected from an object at the target range, thereby detecting light from the desired field-of-view at the target range.
  • 24. A modular light detection and ranging (LIDAR) system comprising:
    a) a first optical transmitter comprising a plurality of first emitters configured to generate optical beams when energized that combine to have a first transmitter field-of-view (FOV) at a target range that is a combination of fields-of-view of each of the plurality of first emitters, the generated optical beam of each of the plurality of first emitters corresponding to a projection angle subtending only a portion of the first transmitter field-of-view;
    b) a second optical transmitter comprising a plurality of second emitters configured to generate optical beams when energized that combine to have a second transmitter field-of-view at the target range, the generated optical beam of each of the plurality of second emitters corresponding to a projection angle subtending only a portion of the second transmitter field-of-view, wherein the first and second optical transmitters are positioned relative to each other so fields-of-view of at least some of the optical beams generated by the first and second optical transmitter when energized overlap at the target range;
    c) an optical receiver comprising a plurality of optical detectors, a respective one of the plurality of optical detectors being positioned to detect a respective optical beam generated by at least one of the first and second optical transmitter and reflected by a target in at least one of the first and second transmitter field-of-view at the target range; and
    d) a controller comprising a first and second output being connected to respective control inputs of the first and second optical transmitters, and a third output being connected to a control input of the optical receiver, the controller generating control signals at the first and second outputs that control energizing select ones of the plurality of first and the plurality of second emitters, thereby illuminating a desired field-of-view at the target range that comprises a projection angle subtending only a portion, which is less than a full portion, of the first transmitter field-of-view associated with the select ones of the plurality of the first emitters and a portion, which is less than a full portion, of the second transmitter field-of-view associated with the select ones of the plurality of the second emitters and generating a control signal at the third output that activates selected ones of the plurality of optical detectors to detect optical beams reflected from an object at the target range, thereby detecting light from the desired field-of-view at the target range.
RELATED APPLICATION SECTION

The present application is a non-provisional of copending U.S. Provisional Patent Application Ser. No. 62/714,463, filed Aug. 3, 2018, and entitled “Distributed Modular Solid-State LIDAR System”. The entire contents of U.S. Patent Application Ser. No. 62/714,463 are incorporated herein by reference.

US Referenced Citations (244)
Number Name Date Kind
5157257 Geiger Oct 1992 A
5552893 Akasu Sep 1996 A
5909296 Tsacoyeanes Jun 1999 A
6057909 Yahav et al. May 2000 A
6061001 Sugimoto May 2000 A
6246708 Thornton et al. Jun 2001 B1
6353502 Marchant et al. Mar 2002 B1
6680788 Roberson et al. Jan 2004 B1
6717972 Steinle et al. Apr 2004 B2
6775480 Goodwill Aug 2004 B1
6788715 Leeuwen et al. Sep 2004 B1
6829439 Sidorovich et al. Dec 2004 B1
6860350 Beuhler et al. Mar 2005 B2
6888871 Zhang et al. May 2005 B1
7065112 Ghosh et al. Jun 2006 B2
7110183 von Freyhold et al. Sep 2006 B2
7544945 Tan et al. Jun 2009 B2
7652752 Fetzer et al. Jan 2010 B2
7702191 Geron et al. Apr 2010 B1
7746450 Willner et al. Jun 2010 B2
7773204 Nelson Aug 2010 B1
7969558 Hall Jun 2011 B2
8072581 Breiholz Dec 2011 B1
8115909 Behringer et al. Feb 2012 B2
8247252 Gauggel et al. Aug 2012 B2
8301027 Shaw et al. Oct 2012 B2
8576885 Van Leeuwen et al. Nov 2013 B2
8675181 Hall Mar 2014 B2
8675706 Seurin et al. Mar 2014 B2
8783893 Seurin et al. Aug 2014 B1
8824519 Seurin et al. Sep 2014 B1
9038883 Wang et al. May 2015 B2
9048633 Gronenborn et al. Jun 2015 B2
9268012 Ghosh et al. Feb 2016 B2
9285477 Smith et al. Mar 2016 B1
9348018 Eisele et al. May 2016 B2
9360554 Retterath et al. Jun 2016 B2
9378640 Mimeault et al. Jun 2016 B2
9392259 Borowski Jul 2016 B2
9516244 Borowski Dec 2016 B2
9520696 Wang et al. Dec 2016 B2
9553423 Chen et al. Jan 2017 B2
9560339 Borowski Jan 2017 B2
9574541 Ghosh et al. Feb 2017 B2
9575184 Gilliland et al. Feb 2017 B2
9658322 Lewis May 2017 B2
9674415 Wan et al. Jun 2017 B2
9791557 Wyrwas et al. Oct 2017 B1
9841495 Campbell et al. Dec 2017 B2
9857468 Eichenholz et al. Jan 2018 B1
9933513 Dussan et al. Apr 2018 B2
9946089 Chen et al. Apr 2018 B2
9989406 Pacala et al. Jun 2018 B2
9989629 LaChapelle Jun 2018 B1
9992477 Pacala et al. Jun 2018 B2
10007001 LaChapelle et al. Jun 2018 B1
10063849 Pacala et al. Aug 2018 B2
10191156 Steinberg et al. Jan 2019 B2
10295660 McMichael et al. May 2019 B1
10488492 Hamel et al. Nov 2019 B2
10514444 Donovan Dec 2019 B2
10761195 Donovan Sep 2020 B2
10928486 Donovan Feb 2021 B2
11016178 Donovan May 2021 B2
11061234 Zhu et al. Jul 2021 B1
11320538 Donovan et al. May 2022 B2
20020117340 Stettner Aug 2002 A1
20020195496 Tsikos et al. Dec 2002 A1
20030043363 Jamieson et al. Mar 2003 A1
20030147652 Green et al. Aug 2003 A1
20040120717 Clark et al. Jun 2004 A1
20040228375 Ghosh et al. Nov 2004 A1
20050025211 Zhang et al. Feb 2005 A1
20050180473 Brosnan Aug 2005 A1
20050232628 von Freyhold et al. Oct 2005 A1
20060132752 Kane Jun 2006 A1
20060231771 Lee et al. Oct 2006 A1
20060244978 Yamada et al. Nov 2006 A1
20070024849 Carrig et al. Feb 2007 A1
20070071056 Chen Mar 2007 A1
20070091960 Gauggel et al. Apr 2007 A1
20070131842 Ernst Jun 2007 A1
20070177841 Dazinger Aug 2007 A1
20070181810 Tan et al. Aug 2007 A1
20070219720 Trepagnier et al. Sep 2007 A1
20080074640 Walsh et al. Mar 2008 A1
20080186470 Hipp Aug 2008 A1
20090027651 Pack et al. Jan 2009 A1
20090140047 Yu et al. Jun 2009 A1
20090161710 Hoashi et al. Jun 2009 A1
20090273770 Bauhahn et al. Nov 2009 A1
20090295986 Topliss et al. Dec 2009 A1
20100046953 Shaw et al. Feb 2010 A1
20100215066 Mordaunt et al. Aug 2010 A1
20100271614 Albuquerque et al. Oct 2010 A1
20100302528 Hall Dec 2010 A1
20110176567 Joseph Jul 2011 A1
20110216304 Hall Sep 2011 A1
20120038903 Weimer et al. Feb 2012 A1
20130163626 Seurin et al. Jun 2013 A1
20130163627 Seurin et al. Jun 2013 A1
20130206967 Shpunt et al. Aug 2013 A1
20130208256 Mamidipudi et al. Aug 2013 A1
20130208753 Van Leeuwen et al. Aug 2013 A1
20140043309 Go et al. Feb 2014 A1
20140049610 Hudman et al. Feb 2014 A1
20140071427 Last Mar 2014 A1
20140111812 Baeg et al. Apr 2014 A1
20140139467 Ghosh et al. May 2014 A1
20140160341 Tickoo et al. Jun 2014 A1
20140218898 Seurin et al. Aug 2014 A1
20140247841 Seurin et al. Sep 2014 A1
20140267701 Aviv et al. Sep 2014 A1
20140303829 Lombrozo et al. Oct 2014 A1
20140312233 Mark et al. Oct 2014 A1
20140333995 Seurin et al. Nov 2014 A1
20140350836 Stettner Nov 2014 A1
20140376092 Mor Dec 2014 A1
20150055117 Pennecot et al. Feb 2015 A1
20150069113 Wang et al. Mar 2015 A1
20150097947 Hudman et al. Apr 2015 A1
20150103358 Flascher Apr 2015 A1
20150109603 Kim et al. Apr 2015 A1
20150123995 Zavodny et al. May 2015 A1
20150131080 Retterath et al. May 2015 A1
20150160341 Akatsu et al. Jun 2015 A1
20150219764 Lipson Aug 2015 A1
20150255955 Wang et al. Sep 2015 A1
20150260830 Ghosh et al. Sep 2015 A1
20150260843 Lewis Sep 2015 A1
20150311673 Wang et al. Oct 2015 A1
20150316368 Moench et al. Nov 2015 A1
20150340841 Joseph Nov 2015 A1
20150362585 Ghosh et al. Dec 2015 A1
20150377696 Shpunt et al. Dec 2015 A1
20150378023 Royo Royo et al. Dec 2015 A1
20160003946 Gilliland Jan 2016 A1
20160006914 Neumann Jan 2016 A1
20160025842 Anderson et al. Jan 2016 A1
20160025993 Mor et al. Jan 2016 A1
20160033642 Fluckiger Feb 2016 A1
20160072258 Seurin et al. Mar 2016 A1
20160080077 Joseph et al. Mar 2016 A1
20160119611 Hall et al. Apr 2016 A1
20160161600 Eldada et al. Jun 2016 A1
20160254638 Chen et al. Sep 2016 A1
20160259038 Retterath Sep 2016 A1
20160266242 Gilliland et al. Sep 2016 A1
20160274223 Imai Sep 2016 A1
20160282468 Gruver et al. Sep 2016 A1
20160291156 Hjelmstad Oct 2016 A1
20160306358 Kang et al. Oct 2016 A1
20160335778 Smits Nov 2016 A1
20160348636 Ghosh et al. Dec 2016 A1
20170003392 Bartlett et al. Jan 2017 A1
20170026633 Riza Jan 2017 A1
20170059838 Tilleman Mar 2017 A1
20170115497 Chen et al. Apr 2017 A1
20170131387 Campbell et al. May 2017 A1
20170131388 Campbell et al. May 2017 A1
20170139041 Drader et al. May 2017 A1
20170153319 Villeneuve et al. Jun 2017 A1
20170168162 Jungwirth Jun 2017 A1
20170176579 Niclass et al. Jun 2017 A1
20170181810 Tennican Jun 2017 A1
20170219426 Pacala et al. Aug 2017 A1
20170256915 Ghosh et al. Sep 2017 A1
20170269209 Hall et al. Sep 2017 A1
20170285169 Holz Oct 2017 A1
20170289524 Pacala et al. Oct 2017 A1
20170299722 Ouyang et al. Oct 2017 A1
20170307736 Donovan Oct 2017 A1
20170307758 Pei et al. Oct 2017 A1
20170350982 Lipson Dec 2017 A1
20170353004 Chen et al. Dec 2017 A1
20170356740 Ansari et al. Dec 2017 A1
20180045816 Jarosinski et al. Feb 2018 A1
20180058923 Lipson et al. Mar 2018 A1
20180059222 Pacala et al. Mar 2018 A1
20180062345 Bills et al. Mar 2018 A1
20180068458 Wan et al. Mar 2018 A1
20180074198 Von Novak et al. Mar 2018 A1
20180107221 Droz et al. Apr 2018 A1
20180113200 Steinberg et al. Apr 2018 A1
20180113208 Bergeron et al. Apr 2018 A1
20180120441 Elooz et al. May 2018 A1
20180128920 Keilaf et al. May 2018 A1
20180136335 Kare et al. May 2018 A1
20180152691 Pacala May 2018 A1
20180167602 Pacala et al. Jun 2018 A1
20180180720 Pei et al. Jun 2018 A1
20180180721 Pei et al. Jun 2018 A1
20180180722 Pei et al. Jun 2018 A1
20180203247 Chen et al. Jul 2018 A1
20180209841 Pacala et al. Jul 2018 A1
20180217236 Pacala et al. Aug 2018 A1
20180259623 Donovan Sep 2018 A1
20180259624 Kiehn et al. Sep 2018 A1
20180259645 Shu et al. Sep 2018 A1
20180269646 Welford et al. Sep 2018 A1
20180275248 Bailey Sep 2018 A1
20180299552 Shu et al. Oct 2018 A1
20180301872 Burroughs et al. Oct 2018 A1
20180301874 Burroughs et al. Oct 2018 A1
20180301875 Burroughs et al. Oct 2018 A1
20180364334 Xiang et al. Dec 2018 A1
20180364356 Eichenholz et al. Dec 2018 A1
20190003429 Miyashita Jan 2019 A1
20190004156 Niclass et al. Jan 2019 A1
20190011561 Pacala et al. Jan 2019 A1
20190011567 Pacala et al. Jan 2019 A1
20190018115 Schmitt et al. Jan 2019 A1
20190036308 Carson et al. Jan 2019 A1
20190049662 Thomsen et al. Feb 2019 A1
20190056497 Pacala et al. Feb 2019 A1
20190094346 Dumoulin et al. Mar 2019 A1
20190098233 Gassend et al. Mar 2019 A1
20190137607 Kostamovaara May 2019 A1
20190146071 Donovan May 2019 A1
20190170855 Keller et al. Jun 2019 A1
20190178974 Droz Jun 2019 A1
20190179018 Gunnam et al. Jun 2019 A1
20190293954 Lin et al. Sep 2019 A1
20190302246 Donovan et al. Oct 2019 A1
20200018835 Pei et al. Jan 2020 A1
20200081101 Donovan Mar 2020 A1
20200124732 Sutherland et al. Apr 2020 A1
20200200874 Donovan Jun 2020 A1
20200209355 Pacala et al. Jul 2020 A1
20200278426 Dummer et al. Sep 2020 A1
20200326425 Donovan et al. Oct 2020 A1
20200379088 Donovan et al. Dec 2020 A1
20200386868 Donovan et al. Dec 2020 A1
20200408908 Donovan Dec 2020 A1
20210033708 Fabiny Feb 2021 A1
20210041567 Milgrome et al. Feb 2021 A1
20210157000 Imaki May 2021 A1
20210181311 Donovan Jun 2021 A1
20210231779 Donovan Jul 2021 A1
20210231806 Donovan et al. Jul 2021 A1
20210234342 Donovan Jul 2021 A1
20210278540 Maayan et al. Sep 2021 A1
20210321080 Jeong et al. Oct 2021 A1
20220146680 Donovan et al. May 2022 A1
Foreign Referenced Citations (178)
Number Date Country
1512946 Jul 2004 CN
101013030 Aug 2007 CN
101080733 Nov 2007 CN
101545582 Sep 2009 CN
101692126 Apr 2010 CN
103633557 Mar 2014 CN
104898125 Sep 2015 CN
105705964 Jun 2016 CN
106464366 Feb 2017 CN
109073757 Dec 2018 CN
107728156 Nov 2019 CN
110402398 Nov 2019 CN
110914702 Mar 2020 CN
111356934 Jun 2020 CN
111919137 Nov 2020 CN
112543875 Mar 2021 CN
113692540 Nov 2021 CN
113906316 Jan 2022 CN
113924506 Jan 2022 CN
114096882 Feb 2022 CN
114174869 Mar 2022 CN
197 17 399 Jun 1999 DE
10103861 Aug 2001 DE
102007004609 Aug 2007 DE
102014216390 Feb 2016 DE
102019005059 Feb 2020 DE
1160540 Dec 2001 EP
1444696 Mar 2005 EP
1569007 Aug 2005 EP
2656099 Dec 2011 EP
2656106 Dec 2011 EP
2775316 Sep 2014 EP
3168641 Apr 2016 EP
3497477 Aug 2016 EP
2656100 Oct 2016 EP
3526625 Nov 2016 EP
3 159 711 Apr 2017 EP
3446153 Feb 2019 EP
3596492 Jan 2020 EP
3658949 Jun 2020 EP
3710855 Sep 2020 EP
3775979 Feb 2021 EP
3830602 Jun 2021 EP
3953727 Feb 2022 EP
3977159 Apr 2022 EP
3980808 Apr 2022 EP
3990943 May 2022 EP
4004587 Jun 2022 EP
2816264 May 2002 FR
5-243552 Sep 1993 JP
7-253460 Oct 1995 JP
8-280173 Oct 1996 JP
10-126007 May 1998 JP
2000-147604 May 2000 JP
2002-214361 Jul 2002 JP
2003258359 Sep 2003 JP
2003-536061 Dec 2003 JP
2004-078255 Mar 2004 JP
2004-94115 Mar 2004 JP
2004-361315 Dec 2004 JP
2005-331273 Dec 2005 JP
2006-162386 Jun 2006 JP
2007-214564 Aug 2007 JP
2008-015434 Jan 2008 JP
4108478 Jun 2008 JP
2008-180719 Aug 2008 JP
2009-103529 May 2009 JP
2009-170870 Jul 2009 JP
2009-204691 Sep 2009 JP
2010-91855 Apr 2010 JP
2010-256291 Nov 2010 JP
2011-003748 Jan 2011 JP
2012-504771 Feb 2012 JP
5096008 Dec 2012 JP
2013-050310 Mar 2013 JP
2013-113669 Jun 2013 JP
2014-059302 Apr 2014 JP
2014-077658 May 2014 JP
2016-14665 Jan 2016 JP
2016-146417 Aug 2016 JP
2016-176721 Oct 2016 JP
2016-188808 Nov 2016 JP
2016-540189 Dec 2016 JP
2017-053833 Mar 2017 JP
2017-134814 Aug 2017 JP
2018-025632 Feb 2018 JP
2019-060652 Apr 2019 JP
2019-68528 Apr 2019 JP
2019-509474 Apr 2019 JP
2019-516101 Jun 2019 JP
2020-510208 Apr 2020 JP
2021-503085 Feb 2021 JP
2021-507260 Feb 2021 JP
6839861 Mar 2021 JP
6865492 Apr 2021 JP
2021-073462 May 2021 JP
2021-73473 May 2021 JP
2021-105613 Jul 2021 JP
2021-519926 Aug 2021 JP
2021-139918 Sep 2021 JP
2021-532368 Nov 2021 JP
2022-1885 Jan 2022 JP
6995413 Jan 2022 JP
2022-22361 Feb 2022 JP
2022-36224 Mar 2022 JP
7037830 Mar 2022 JP
2022-526998 May 2022 JP
2022-534500 Aug 2022 JP
10-2000-0053620 Aug 2000 KR
10-2009-0016499 Feb 2009 KR
10-2012-0053045 May 2012 KR
10-2012-0061033 Jun 2012 KR
10-2013-0140554 Dec 2013 KR
10-2014-0138724 Dec 2014 KR
10-2015-0045735 Apr 2015 KR
10-2016-0101140 Aug 2016 KR
10-2018-0049937 May 2018 KR
10-2018-0064969 Jun 2018 KR
10-2018-0128447 Dec 2018 KR
10-2019-0076725 Jul 2019 KR
10-2019-0117418 Oct 2019 KR
10-2019-0120403 Oct 2019 KR
10-2020-0011351 Feb 2020 KR
10-2020-0075014 Jun 2020 KR
10-2020-0096632 Aug 2020 KR
10-2020-0128435 Nov 2020 KR
10-2021-0021409 Feb 2021 KR
10-2218679 Feb 2021 KR
10-2021-0029831 Mar 2021 KR
10-2021-0065207 Jun 2021 KR
10-2021-0137584 Nov 2021 KR
10-2021-0137586 Nov 2021 KR
10-2326493 Nov 2021 KR
10-2326508 Nov 2021 KR
10-2022-0003600 Jan 2022 KR
10-2022-0017412 Feb 2022 KR
10-2364531 Feb 2022 KR
10-2022-0024177 Mar 2022 KR
10-2022-0025924 Mar 2022 KR
10-2022-0038691 Mar 2022 KR
10-2398080 May 2022 KR
99-42856 Aug 1999 WO
2002065153 Aug 2002 WO
2006044758 Apr 2006 WO
2006083349 Aug 2006 WO
2013107709 Jul 2013 WO
2014014838 Jan 2014 WO
2015040671 Mar 2015 WO
2015040671 Mar 2015 WO
2015059705 Apr 2015 WO
2017112416 Jun 2017 WO
2017132704 Aug 2017 WO
2017184336 Oct 2017 WO
2018028795 Feb 2018 WO
2018082762 May 2018 WO
2018169758 Sep 2018 WO
2018166609 Sep 2018 WO
2018166610 Sep 2018 WO
2018166611 Sep 2018 WO
2018169758 Sep 2018 WO
2018180391 Oct 2018 WO
2018181250 Oct 2018 WO
2018191495 Oct 2018 WO
2019010320 Jan 2019 WO
2019022941 Jan 2019 WO
2019-064062 Apr 2019 WO
2019115148 Jun 2019 WO
2019195054 Oct 2019 WO
2019221776 Nov 2019 WO
2020028173 Feb 2020 WO
2020210176 Oct 2020 WO
2020242834 Dec 2020 WO
2020251891 Dec 2020 WO
2020263735 Dec 2020 WO
2021021872 Feb 2021 WO
2021150860 Jul 2021 WO
2021236201 Nov 2021 WO
2022103778 May 2022 WO
Non-Patent Literature Citations (157)
Entry
“Notification of Transmittal of The International Search Report and The Written Opinion of the International Searching Authority, or The Declaration” for International Patent Application No. PCT/US2018/057026, Dec. 16, 2019, 11 pages, International Searching Authority, Korean Intellectual Property Office, Daejeon, Republic of Korea.
“Written Opinion of the International Searching Authority” for International Patent Application No. PCT/EP2016/077499, Feb. 14, 2017, 7 pages, The International Searching Authority.
“Search Report” for International Patent Application No. PCT/EP2016/077499, 2 pages, International Searching Authority/ EPO, Rijswijk, the Netherlands.
“Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or The Declaration” for International Patent Application No. PCT/US2018/041021, Nov. 5, 2018, 15 Pages, Korean Intellectual Property Office, Daejeon, Republic of Korea.
“Office Action” South Korean Patent Application No. 10-2021-7006391, May 14, 2021, 8 pages, Korean Intellectual Property Office, South Korea.
“Supplementary European Search Report” for European Patent Application No. EP17786325, Mar. 11, 2020, 22 pages, European Patent Office, Munich, Germany.
“Office Action” for U.S. Appl. No. 15/456,789, issued Sep. 25, 2019, 58 pages, The USPTO.
“Notice of Allowance” for U.S. Appl. No. 16/028,774, issued Aug. 21, 2019, 56, pages, The USPTO.
“Notification Concerning Transmittal of International Preliminary Report on Patentability (Chapter 1 of The Patent Cooperation Treaty)” for International Patent Application No. PCT/US2017/026109, Nov. 1, 2018, 13 Pages, The International Bureau of WIPO, Geneva, Switzerland.
“Notification Concerning Transmittal of International Preliminary Report on Patentability (Chapter I of the Patent Cooperation Treaty)” for International Patent Application No. PCT/US2018/021553, Sep. 26, 2019, 9 pages, The International Bureau of WIPO, Geneva, Switzerland.
“Notification Concerning Transmittal of Copy of International Preliminary Report on Patentability (Chapter I of the Patent Cooperation Treaty)” for International Patent Application No. PCT/US2019/043674, Feb. 18, 2021, 10 pages, The International Bureau of WIPO, Geneva, Switzerland.
“Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration” for International Patent Application No. PCT/US2018/021553, Jun. 20, 2018, 13 pages, International Searching Authority, Korean Intellectual Property Office, Daejeon, Republic of Korea.
“European Search Report” For European Patent Application No. 17786325.5, Nov. 9, 2019, 18 pages, European Patent Office, Munich, Germany.
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for International Application No. PCT/US2017/026109, Jun. 19, 2017, 17 pages, International Search Authority/Korean Intellectual Property Office, Daejeon, Republic of Korea.
U.S. Appl. No. 15/456,789, filed Mar. 13, 2017 in the USPTO.
U.S. Appl. No. 16/028,774, filed Jul. 6, 2018 in the USPTO.
Notification Concerning Transmittal of International Preliminary Report on Patentability (Chapter I of the Patent Cooperation Treaty) for International Application No. PCT/US18/041021, Feb. 6, 2020, 10 pages, The International Bureau of WIPO, Geneva, Switzerland.
Notice of Final Rejection received for Korean Patent Application Serial No. 10-2021-7006391 dated Oct. 22, 2021, pages (Including English Translation).
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for International Patent Application No. PCT/US2019/024343, Jul. 12, 2019, 17 Pages, ISA/KR, Korean Intellectual Property Office, Daejeon, Republic of Korea.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2018/057026, mailed on Dec. 16, 2019, 9 pages.
International Preliminary Report on Patentability received for PCT Application Serial No. PCT/US2020/036634, mailed on Dec. 23, 2021, 6 pages.
International Search Report and Written Opinion received for PCT Application Serial No. PCT/US2021/058687, mailed on Mar. 3, 2022 , 11 pages.
Decision to Grant a Patent received for Japanese Patent Application Serial No. 2021-014376, mailed on Mar. 22, 2022, 05 pages (2 pages of English Translation and 3 pages of Official Copy).
Non-Final Office Action received for U.S. Appl. No. 17/164,773, mailed on Apr. 21, 2022, 8 pages.
Notice of Allowance received for Chinese Patent Application Serial No. 201880047615.6, mailed on Mar. 23, 2022, 7 pages (2 pages of English Translation and 5 pages of Official Copy).
Decision to Grant a Patent received for Korean Patent Application Serial No. 10-2018-7030512, mailed on Mar. 18, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Decision to Grant a Patent received for Korean Patent Application Serial No. 10-2021-7036648, mailed on May 19, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Decision to Grant a Patent received for Japanese Patent Application Serial No. 2019-549550, mailed on Feb. 25, 2022, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Office Action for Japanese Patent Application No. 2021-020502, Apr. 13, 2022, 10 pages (7 pages of English Translation and 3 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 17/227,300, mailed on Feb. 8, 2022, 11 pages.
International Search Report and Written Opinion received for PCT Application Serial No. PCT/US2021/020749, mailed on Jan. 3, 2022, 11 pages.
International Preliminary Report on Patentability received for PCT Application Serial No. PCT/US2020/038927, mailed on Jan. 6, 2022, 9 pages.
Office Action received for Korean Application Serial No. 10-2020-7029872, mailed on Jan. 19, 2022, 32 pages (18 pages of English Translation and 14 pages of Official Copy).
Extended European Search Report received for European Patent Application Serial No. 19843301.3, mailed on Feb. 18, 2022, 10 pages.
International Preliminary Report on Patentability received for PCT Application Application No. PCT/US2020/043979, mailed on Feb. 10, 2022, 06 pages.
Office Action received for Korean Application Serial No. 10-2020-7029872, mailed on May 24, 2022, 05 pages (2 pages of English Translation and 3 pages of Official Copy).
Notice of Allowance received for Korean Patent Application Serial No. 10-2021-7006391, mailed on Feb. 9, 2022, 03 pages (1 page of English Translation and 2 pages of Official Copy).
Restriction Requirement received for U.S. Appl. No. 16/366,729, mailed on Jun. 3, 2022, 06 pages.
Office Action received for Japanese Patent Application Serial No. 2021-100687, mailed on Jul. 1, 2022, 09 pages. (6 pages of English Translation and 3 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 16/841,930, mailed on Jun. 29, 2022, 10 pages.
Non-Final Office Action received for U.S. Patent Application Serial No. 16/878, 140, mailed on Jun. 22, 2022, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 16/907,732, mailed on Jul. 13, 2022, 20 pages.
International Preliminary Report on Patentability received for PCT Application Serial No. PCT/US2021/014564, mailed on Aug. 4, 2022, 06 pages.
Notice of Allowance received for U.S. Appl. No. 16/895,588, mailed on Aug. 3, 2022, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 16/366,729, mailed on Aug. 26, 2022, 09 pages.
Notice of Allowance received for U.S. Appl. No. 16/805,733, mailed on Aug. 22, 2022, 13 pages.
Office Action received for Japanese Patent Application Serial No. 2021-168642, mailed on Aug. 25, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy).
Office Action received for Japanese Patent Application Serial No. 2020-526502, mailed on Aug. 24, 2022, 10 pages (5 pages of English Translation and 5 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2018-7030512, mailed on Dec. 23, 2021, 7 pages. (3 pages of English Translation and 4 pages of Official Copy).
Office Action received for Japanese Patent Application Serial No. 2022-108166, mailed on Jul. 19, 2023, 2 pages of Official Copy only.
Non-Final Office Action received for U.S. Appl. No. 17/227,295, mailed on Mar. 9, 2023, 10 pages.
International Preliminary Report on Patentability received for PCT Application Serial No. PCT/US2021/020749, mailed on Sep. 15, 2022, 8 pages.
Notice of Allowance received for U.S. Appl. No. 16/805,733, mailed on Nov. 10, 2022, 5 pages.
Notice of Allowance received for U.S. Appl. No. 16/841,930, mailed on Oct. 3, 2022, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 16/168,054, mailed on Oct. 20, 2022, 16 pages.
Notice of Allowance received for U.S. Appl. No. 17/164,773, mailed on Nov. 2, 2022, 8 pages.
Notice of Allowance received for U.S. Appl. No. 16/805,733, mailed on Jan. 25, 2023, 5 pages.
Office Action received for Chinese Patent Application Serial No. 201780024892.0, mailed on Sep. 2, 2022, 28 pages (11 pages of English Translation and 17 pages of Official Copy).
Extended European Search Report received in European Application No. 20787345.6, mailed on Dec. 5, 2022, 8 pages.
Final Office Action received for U.S. Appl. No. 16/878,140, mailed on Feb. 1, 2023, 26 pages.
Notice of Allowance received for U.S. Appl. No. 17/164,773, mailed on Feb. 1, 2023, 8 pages.
Notice of Allowance received for U.S. Appl. No. 16/841,930, mailed on Jan. 30, 2023, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2022/028297, mailed on Mar. 13, 2023, 11 pages.
Restriction Requirement received for U.S. Appl. No. 16/941,896, mailed on Jan. 24, 2023, 06 pages.
Partial European Search Report received for European Patent Application No. 22178999.3, mailed on Oct. 10, 2022, 22 pages.
Decision to Grant received for Korean Patent Application Serial No. 10-2022-7021139, mailed on Dec. 14, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-020502, mailed on Jan. 23, 2023, 6 pages (4 pages of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2021-7016081, mailed on Oct. 25, 2022, 4 pages (2 pages of English Translation and 2 pages of Official Copy).
Office Action received for Japanese Patent Application Serial No. 2021-199077, mailed on Dec. 23, 2022, 9 pages (6 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2022-7028820, mailed on Dec. 15, 2022, 12 pages (6 pages of English Translation and 6 pages of Official Copy).
Extended European Search Report received for European Patent Application No. 20815113.4, mailed on Jan. 31, 2023, 14 pages.
Partial European Search Report received for European Patent Application No. 20822328.9, mailed on Feb. 6, 2023, 20 pages.
Office Action received for Korean Patent Application No. 10-2022-7004969, mailed on Jan. 9, 2023, 11 pages (6 pages of English Translation and 5 pages of Official Copy).
Office Action received for Japanese Patent Application Serial No. 2020-552870, mailed on Nov. 29, 2022, 11 pages (7 pages of English Translation and 4 pages of Official Copy).
Office Action received for Japanese Patent Application Serial No. 2022-002790, mailed on Dec. 26, 2022, 10 pages (7 pages of English Translation and 3 pages of Official Copy).
Decision to Grant received for Korean Patent Application Serial No. 10-2020-7029872, mailed on Nov. 28, 2022, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2022-7015754, mailed on Dec. 12, 2022, 21 pages (11 pages of English Translation and 10 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 16/366,729, mailed on Mar. 8, 2023, 7 pages.
Extended European Search Report received for European Patent Application No. 22178999.3, mailed on Mar. 6, 2023, 25 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2022/019054, mailed on Feb. 20, 2023, 13 pages.
Office Action received for Korean Application Serial No. 10-2021-7036300, mailed on Feb. 9, 2023, 14 pages (7 pages of English Translation and 7 pages of Official Copy).
Decision to Grant received for Korean Patent Application Serial No. 10-2021-7040665, mailed on Feb. 23, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action received for Chinese Patent Application Serial No. 201880017776.0, mailed on Feb. 16, 2023, 22 pages (10 pages of English Translation and 12 pages of Official Copy).
Office Action received for Chinese Patent Application Serial No. 201880074279.4, mailed on Mar. 1, 2023, 23 pages (9 pages of English Translation and 14 pages of Official Copy).
Notice of Allowance received for U.S. Appl. No. 17/164,773, mailed on Apr. 5, 2023, 8 pages.
Office Action received for Japanese Patent Application Serial No. 2021-100687, mailed on Mar. 14, 2023, 05 pages. (3 pages of English Translation and 2 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2020-526502, mailed on Mar. 14, 2023, 8 pages (5 pages of English Translation and 3 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-168642, mailed on Mar. 15, 2023, 5 pages (3 pages of English Translation and 2 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2022-80688, mailed on Mar. 17, 2023, 11 pages (7 pages of English Translation and 4 pages of Official Copy).
Non-Final Office Action received for U.S. Appl. No. 17/155,626, mailed on Apr. 12, 2023, 24 pages.
Notice of Allowance received for U.S. Appl. No. 16/841,930, mailed on Apr. 17, 2023, 9 pages.
Office Action received for Korean Patent Application No. 10-2022-7036873, mailed on Mar. 29, 2023, 22 pages (12 pages of English Translation and 10 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2023-7007292, mailed on Apr. 17, 2023, 19 pages (10 pages of English Translation and 9 pages of Official Copy).
Decision to Grant a Patent received for Korean Patent Application Serial No. 10-2023-7009114, mailed on May 16, 2023, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action for Japanese Patent Application No. 2021-572877, May 12, 2023, 12 pages (8 pages of English Translation and 4 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-559434, mailed on May 26, 2023, 17 pages (11 pages of English Translation and 6 pages of Official Copy).
Extended European Search Report received for European Patent Application No. 20831915.2, Jun. 2, 2023, 9 pages.
International Preliminary Report on Patentability received for PCT Application Serial No. PCT/US2021/058687, mailed on May 25, 2023 , 7 pages.
Notice of Allowance received for Chinese Patent Application Serial No. 201780024892.0, mailed on May 30, 2023, 2 pages (Official Copy Only).
Non-Final Office Action received for U.S. Appl. No. 16/168,054, mailed on Jun. 1, 2021, 15 pages.
Notice of Allowance received for U.S. Appl. No. 15/456,789, mailed on Apr. 29, 2020, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 15/915,840, mailed on May 7, 2020, 11 pages.
Notice of Allowance received for U.S. Appl. No. 15/915,840, mailed on Jan. 19, 2021, 6 pages.
Extended European Search Report received for European Patent Application No. 18767885.9, Nov. 18, 2020, 10 pages.
Office Action received for Japanese Patent Application No. 2019-549550, mailed on Mar. 22, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2019-7029980, mailed on Mar. 26, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Plant, et al., “256-Channel Bidirectional Optical Interconnect Using VCSELs and Photodiodes on CMOS”, Journal of Lightwave Technology, vol. 19, No. 8, Aug. 2001, pp. 1093-1103.
Knodl, et al., “Bipolar Cascade VCSEL with 130% Differential Quantum Efficiency”, Annual Report 2000, Optoelectronics Department, pp. 11-14.
Morgan, et al., “Two-Dimensional Matrix Addressed Vertical Cavity Top-Surface Emitting Laser Array Display”, IEEE Photonics Technology Letters, vol. 6, No. 8, Aug. 1994, pp. 913-917.
Orenstein, et al., “Matrix Addressable Vertical Cavity Surface Emitting Laser Array”, Electronics Letters, vol. 27, No. 5, Feb. 28, 1991, pp. 437-438,.
Geib, et al., “Fabrication and Performance of Two-Dimensional Matrix Addressable Arrays of Integrated Vertical-Cavity Lasers and Resonant Cavity Photodetectors”, IEEE Journal of Selected Topics In Quantum Electronics, vol. 8, No. 4, Jul./Aug. 2002, pp. 943-947.
Moench et al., “VCSEL Based Sensors for Distance and Velocity”, Vertical Cavity Surface-Emitting Lasers XX, Proc. of SPIE, vol. 9766, 2016, pp. 97660A-1-97660A-11.
Notice of Allowance received for Korean Patent Application No. 10-2019-7029980, mailed on Aug. 6, 2021, 3 pages (1 page of English Translation and 2 pages of Official Copy). English Translation).
Office Action received for Japanese Patent Application No. 2019-549550, mailed on Aug. 27, 2021, 7 pages (5 pages of English Translation and 2 pages of Official Copy).
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2018/057026, mailed on May 28, 2020, 7 pages.
Non-Final Rejection received for U.S. Appl. No. 16/686,163, mailed on Apr. 16, 2020, 9 pages.
Notice of Allowance received for U.S. Appl. No. 16/686,163, mailed on Oct. 16, 2020, 9 pages.
Office Action received for Korean Patent Application No. 10-2020-7005082, mailed on May 8, 2020, 19 pages (11 pages of English Translation and 8 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2020-504014, mailed on Sep. 2, 2020, 7 pages (3 pages of English Translation and 4 pages of Official Copy).
Notice of Grant received for Korean Patent Application No. 10-2020-7005082, Nov. 24, 2020, 5 pages (3 pages of English Translation and 2 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201880047615.6, mailed on Jan. 18, 2021, 16 pages (8 pages of English Translation and 8 pages of Official Copy).
Notice of Allowance received for Japanese Patent Application No. 2020-504014, mailed on Feb. 15, 2021, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Extended European Search Report received for European Patent Application No. 18839499.3, mailed on Mar. 4, 2021, 10 pages.
Office Action received for Korean Patent Application No. 10-2021-7004589, mailed on Mar. 10, 2021, 9 pages (5 pages of English Translation and 4 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-056628, mailed on Jun. 14, 2021, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
Notice of Allowance received for Korean Patent Application No. 10-2021-7004589, mailed on Aug. 6, 2021, 2 pages (1 page of English Translation and 1 page of Official Copy).
Office Action received for Chinese Patent Application Serial No. 201880047615.6, mailed on Aug. 25, 2021, 12 pages (6 pages of English Translation and 6 pages of Official Copy).
Notice of Allowance received for Japanese Patent Application Serial No. 2021-056628, mailed on Nov. 2, 2021, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Extended European Search Report received for European Patent Application No. 18918938.4, mailed on Jul. 6, 2021, 9 pages.
Office Action received for Korean Patent Application No. 10-2020-7016928, mailed on Jul. 16, 2021, 13 pages (7 pages of English Translation and 6 pages of Official Copy).
Office Action received for European Patent Application No. 17786325.5, mailed on Dec. 17, 2021, 5 pages.
Notice of Allowance received for Japanese Patent Application No. 2018-555665, mailed on Dec. 2, 2020, 05 pages (2 pages of English Translation and 3 pages of Official Copy).
International Preliminary Report on Patentability for International Patent Application No. PCT/US2019/024343, mailed on Oct. 15, 2020, 9 pages.
Office Action received for Korean Patent Application No. 10-2020-7029872, mailed on Jul. 19, 2021, 23 pages (13 pages of English Translation and 10 pages of Official Copy).
Extended European Search Report received for European Patent Application No. 19781037.7, mailed on Oct. 25, 2021, 9 pages.
Decision to Grant a Patent received for Korean Patent Application No. 10-2020-7016928, mailed on Nov. 16, 2021, 3 pages (1 page of English Translation and 2 pages of Official Copy).
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2019/043674, mailed on Feb. 18, 2021, 10 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2020/026964, mailed on Jul. 28, 2020, 8 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2020/026964, mailed on Oct. 21, 2021, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 17/227,300, mailed on Jun. 30, 2021, 8 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2020/033630, mailed on Sep. 9, 2020, 9 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2020/036634, mailed on Sep. 21, 2020, 7 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2020/038927, mailed on Oct. 7, 2020, 12 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2020/043979, mailed on Nov. 10, 2020, 7 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2021/014564, mailed on May 17, 2021, 8 pages.
Office Action received for Korean Patent Application No. 10-2021-7036648, mailed on Dec. 17, 2021, 8 pages (4 pages of English Translation and 4 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-014376, mailed on Sep. 27, 2021, 18 pages (12 pages of English Translation and 6 pages of Official Copy).
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2020/033630, mailed on Dec. 9, 2021, 8 pages.
Final Office Action received for U.S. Appl. No. 16/168,054, mailed on Jan. 26, 2022, 16 pages.
Extended European Search Report received for European Patent Application No. 20822328.9, mailed on May 4, 2023, 34 pages.
International Search Report and the Written Opinion received for PCT Patent Application No. PCT/US2019/043674, mailed on Nov. 15, 2019, 16 pages, International Searching Authority/KR, Daejeon, Republic of Korea.
Decision to Grant received for Korean Patent Application No. 10-2022-7015754, mailed on Aug. 28, 2023, 4 pages (1 page of English Translation and 3 pages of Official Copy).
Office Action received for Japanese Patent Application No. 2021-505773, mailed on Aug. 28, 2023, 10 pages (7 pages of English Translation and 3 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201980051834.6, mailed on Dec. 7, 2023, 23 pages (8 pages of English Translation and 15 pages of Official Copy).
Notice of Final Rejection received for Korean Patent Application No. 10-2021-7006391, mailed on Oct. 22, 2021, 5 pages (including English Translation).
Notice of Allowance received for U.S. Appl. No. 16/805,733, mailed on May 8, 2023, 5 pages.
Related Publications (1)

  Number           Date       Country
  20200041614 A1   Feb. 2020  US

Provisional Applications (1)

  Number       Date       Country
  62/714,463   Aug. 2018  US