Optical detection of range using lasers, often referenced by the mnemonic LIDAR (for “light detection and ranging”) and also sometimes referred to as “laser RADAR,” is used for a variety of applications, including imaging and collision avoidance. LIDAR provides finer-scale range resolution with smaller beam sizes than conventional microwave ranging systems, such as radio-wave detection and ranging (RADAR).
At least one aspect relates to a light detection and ranging (LIDAR) system. The LIDAR system includes a laser source configured to generate a beam and a polygon scanner. The polygon scanner includes a frame and a plurality of mirrors coupled to the frame, each mirror including a glass material.
At least one aspect relates to an autonomous vehicle control system. The autonomous vehicle control system includes a laser source, a polygon scanner, and one or more processors. The laser source is configured to generate a first beam. The polygon scanner includes a frame and a plurality of mirrors coupled to the frame, each mirror comprising a glass material, the polygon scanner configured to reflect the first beam as a second beam. The one or more processors are configured to determine at least one of a range to an object or a velocity of the object using a third beam received from at least one of reflection or scattering of the second beam by the object, and to control operation of an autonomous vehicle responsive to the at least one of the range or the velocity.
At least one aspect relates to an autonomous vehicle. The autonomous vehicle includes a LIDAR system including a laser source configured to generate a first beam and a polygon scanner that includes a frame and a plurality of mirrors coupled to the frame, each mirror comprising a glass material, the polygon scanner configured to reflect the first beam as a second beam. The autonomous vehicle includes a steering system, a braking system, and a vehicle controller including one or more processors configured to determine at least one of a range to an object or a velocity of the object using a third beam received from at least one of reflection or scattering of the second beam by the object, and to control operation of at least one of the steering system or the braking system responsive to the at least one of the range or the velocity.
Those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting. Any of the features described herein may be used with any other features, and any subset of such features can be used in combination according to various embodiments. Other aspects, inventive features, and advantages of the devices and/or processes described herein, as defined solely by the claims, will become apparent in the detailed description set forth herein and taken in conjunction with the accompanying drawings.
Implementations are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements and in which:
A LIDAR system can generate and transmit a light beam that an object can reflect or otherwise scatter as a return beam corresponding to the transmitted beam. The LIDAR system can receive the return beam, and process the return beam or characteristics thereof to determine parameters regarding the object such as range and velocity. The LIDAR system can apply various frequency or phase modulations to the transmitted beam, which can facilitate relating the return beam to the transmitted beam in order to determine the parameters regarding the object.
The LIDAR system can include a laser source and a polygon scanner. The laser source is configured to generate a first beam. The polygon scanner includes a frame and a plurality of mirrors coupled to the frame, each mirror comprising a glass material. The mirrors can reflect the first beam to output a second beam, which can be scanned over a field of view to be reflected or otherwise scattered by an object as a third beam, which can be used to determine range, velocity, and Doppler information regarding the object, such as for controlling operation of an autonomous vehicle.
Systems and methods in accordance with the present disclosure can implement LIDAR systems in which a polygon scanner is assembled from multiple facets of polished glass mirrors attached to a frame, as compared to polygon scanners in which the scanner is formed by machining (e.g., computer numerical control (CNC) processes), such as by being made from diamond turned aluminum. By using polished glass mirrors for the facets, the surfaces of the facets can be made flatter and less rough, which can enable optical improvements such as higher reflectivity, lower scattering, and/or more particular beam shapes that are desirable for autonomous vehicles (e.g., a beam shape having a lesser degree of variation from an ideal Gaussian beam). For example, making the facets flatter and/or less rough can reduce the likelihood of reflections or scattering occurring within the surfaces of the facets themselves (such reflections or scattering can have Doppler shifts or otherwise contribute noise to the signal processing). In addition, the assembled polygon scanner can have reduced weight and/or inertia relative to polygon scanners made from solid metal blocks, which can improve reliability of the motor that rotates the polygon scanner and allow for greater flexibility in the form factor of the facets (e.g., to allow for larger facets or facets of various shapes, such as concave or convex facets). The assembled polygon scanner can be manufactured with a less complex, more scalable process. However, the advantages of the assembled polygon scanner described above are not limited to autonomous vehicles; they can be advantageous for any type of vehicle equipped with LIDAR sensors.
1. System Environments for Autonomous Vehicles
The direction control 112 may include one or more actuators and/or sensors for controlling and receiving feedback from the direction or steering components to enable the vehicle 100 to follow a desired trajectory. The powertrain control 114 may be configured to control the output of the powertrain 102, e.g., to control the output power of the prime mover 104, to control a gear of a transmission in the drivetrain 108, etc., thereby controlling a speed and/or direction of the vehicle 100. The brake control 116 may be configured to control one or more brakes that slow or stop vehicle 100, e.g., disk or drum brakes coupled to the wheels of the vehicle.
Other vehicle types, including but not limited to off-road vehicles, all-terrain or tracked vehicles, construction equipment, may utilize different powertrains, drivetrains, energy sources, direction controls, powertrain controls and brake controls. Moreover, in some implementations, some of the components can be combined, e.g., where directional control of a vehicle is primarily handled by varying an output of one or more prime movers.
Various levels of autonomous control over the vehicle 100 can be implemented in a vehicle control system 120, which may include one or more processors 122 and one or more memories 124, with each processor 122 configured to execute program code instructions 126 stored in a memory 124. The processor(s) can include, for example, graphics processing unit(s) (“GPU(s)”) and/or central processing unit(s) (“CPU(s)”).
Sensors 130 may include various sensors suitable for collecting information from a vehicle's surrounding environment for use in controlling the operation of the vehicle. For example, sensors 130 can include a radar sensor 134, a LIDAR (Light Detection and Ranging) sensor 136, and 3D positioning sensors 138, e.g., any of an accelerometer, a gyroscope, a magnetometer, or a satellite navigation system such as GPS (Global Positioning System), GLONASS (Globalnaya Navigazionnaya Sputnikovaya Sistema, or Global Navigation Satellite System), BeiDou Navigation Satellite System (BDS), Galileo, Compass, etc. The 3D positioning sensors 138 can be used to determine the location of the vehicle on the Earth using satellite signals. The sensors 130 can include a camera 140 and/or an IMU (inertial measurement unit) 142. The camera 140 can be a monographic or stereographic camera and can record still and/or video images. The IMU 142 can include multiple gyroscopes and accelerometers capable of detecting linear and rotational motion of the vehicle in three directions. One or more encoders (not illustrated), such as wheel encoders, may be used to monitor the rotation of one or more wheels of vehicle 100. Each sensor 130 can output sensor data at various data rates, which may be different than the data rates of other sensors 130.
The outputs of sensors 130 may be provided to a set of control subsystems 150, including a localization subsystem 152, a planning subsystem 156, a perception subsystem 154, and a control subsystem 158. The localization subsystem 152 can perform functions such as precisely determining the location and orientation (also sometimes referred to as “pose”) of the vehicle 100 within its surrounding environment, and generally within some frame of reference. The location of an autonomous vehicle can be compared with the location of an additional vehicle in the same environment as part of generating labeled autonomous vehicle data. The perception subsystem 154 can perform functions such as detecting, tracking, determining, and/or identifying objects within the environment surrounding vehicle 100. A machine learning model in accordance with some implementations can be utilized in tracking objects. The planning subsystem 156 can perform functions such as planning a trajectory for vehicle 100 over some timeframe given a desired destination as well as the static and moving objects within the environment. A machine learning model in accordance with some implementations can be utilized in planning a vehicle trajectory. The control subsystem 158 can perform functions such as generating suitable control signals for controlling the various controls in the vehicle control system 120 in order to implement the planned trajectory of the vehicle 100. A machine learning model can be utilized to generate one or more signals to control an autonomous vehicle to implement the planned trajectory.
Multiple sensors of types illustrated in
In some implementations, the vehicle 100 may also include a secondary vehicle control system (not illustrated), which may be used as a redundant or backup control system for the vehicle 100. In some implementations, the secondary vehicle control system may be capable of fully operating the autonomous vehicle 100 in the event of an adverse event in the vehicle control system 120, while in other implementations, the secondary vehicle control system may only have limited functionality, e.g., to perform a controlled stop of the vehicle 100 in response to an adverse event detected in the primary vehicle control system 120. In still other implementations, the secondary vehicle control system may be omitted.
Various architectures, including various combinations of software, hardware, circuit logic, sensors, and networks, may be used to implement the various components illustrated in
In addition, for additional storage, the vehicle 100 may include one or more mass storage devices, e.g., a removable disk drive, a hard disk drive, a direct access storage device (“DASD”), an optical drive (e.g., a CD drive, a DVD drive, etc.), a solid state storage drive (“SSD”), network attached storage, a storage area network, and/or a tape drive, among others.
Furthermore, the vehicle 100 may include a user interface 164 to enable vehicle 100 to receive a number of inputs from and generate outputs for a user or operator, e.g., one or more displays, touchscreens, voice and/or gesture interfaces, buttons and other tactile controls, etc. Otherwise, user input may be received via another computer or electronic device, e.g., via an app on a mobile device or via a web interface.
Moreover, the vehicle 100 may include one or more network interfaces, e.g., network interface 162, suitable for communicating with one or more networks 170 (e.g., a Local Area Network (“LAN”), a wide area network (“WAN”), a wireless network, and/or the Internet, among others) to permit the communication of information with other computers and electronic devices, including, for example, a central service, such as a cloud service, from which the vehicle 100 receives environmental and other data for use in autonomous control thereof. Data collected by the one or more sensors 130 can be uploaded to a computing system 172 via the network 170 for additional processing. In some implementations, a time stamp can be added to each instance of vehicle data prior to uploading.
Each processor illustrated in
In general, the routines executed to implement the various implementations described herein, whether implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions, or even a subset thereof, will be referred to herein as “program code”. Program code can include one or more instructions that are resident at various times in various memory and storage devices, and that, when read and executed by one or more processors, perform the steps necessary to execute steps or elements embodying the various aspects of the present disclosure. Moreover, while implementations have and hereinafter will be described in the context of fully functioning computers and systems, it will be appreciated that the various implementations described herein are capable of being distributed as a program product in a variety of forms, and that implementations can be implemented regardless of the particular type of computer readable media used to actually carry out the distribution.
Examples of computer readable media include tangible, non-transitory media such as volatile and non-volatile memory devices, floppy and other removable disks, solid state drives, hard disk drives, magnetic tape, and optical disks (e.g., CD-ROMs, DVDs, etc.) among others.
In addition, various program code described hereinafter may be identified based upon the application within which it is implemented in a specific implementation. Any particular program nomenclature that follows is used merely for convenience, and thus the present disclosure should not be limited to use solely in any specific application identified and/or implied by such nomenclature. Furthermore, given the typically endless number of manners in which computer programs may be organized into routines, procedures, methods, modules, objects, and the like, as well as the various manners in which program functionality may be allocated among various software layers that are resident within a typical computer (e.g., operating systems, libraries, API's, applications, applets, etc.), the present disclosure is not limited to the specific organization and allocation of program functionality described herein.
2. LIDAR for Automotive Applications
A truck can include a LIDAR system (e.g., vehicle control system 120 in
In some instances, an object (e.g., a pedestrian wearing dark clothing) may have a low reflectivity, in that it only reflects back to the sensors (e.g., sensors 130 in
Regardless of the object's reflectivity, an FM LIDAR system may be able to detect (e.g., classify, recognize, discover, etc.) the object at greater distances (e.g., 2×) than a conventional LIDAR system. For example, an FM LIDAR system may detect a low reflectivity object beyond 300 meters, and a high reflectivity object beyond 400 meters.
To achieve such improvements in detection capability, the FM LIDAR system may use sensors (e.g., sensors 130 in
Thus, by detecting an object at greater distances, an FM LIDAR system may have more time to react to unexpected obstacles. Indeed, even a few milliseconds of extra time could improve response time and comfort, especially with heavy vehicles (e.g., commercial trucking vehicles) that are driving at highway speeds.
The FM LIDAR system can provide accurate velocity for each data point instantaneously. In some implementations, a velocity measurement is accomplished using the Doppler effect, which shifts the frequency of the light received from the object based on at least one of the velocity in the radial direction (e.g., the direction vector between the detected object and the sensor) or the frequency of the laser signal. For example, for velocities encountered in on-road situations, where the velocity is less than 100 meters per second (m/s), this shift at a wavelength of 1550 nanometers (nm) amounts to a frequency shift of less than 130 megahertz (MHz). This frequency shift is small enough that it is difficult to detect directly in the optical domain. However, by using coherent detection in FMCW, PMCW, or FMQW LIDAR systems, the signal can be converted to the RF domain such that the frequency shift can be calculated using various signal processing techniques. This enables the autonomous vehicle control system to process incoming data faster.
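The bound cited above follows directly from the round-trip Doppler relation Δf = 2v/λ; a short sketch using the values stated in the text:

```python
# Round-trip Doppler shift for a coherent LIDAR: delta_f = 2 * v / wavelength.
wavelength = 1550e-9   # operating wavelength (m), per the text
v_radial = 100.0       # upper bound on on-road radial velocity (m/s), per the text

doppler_shift = 2 * v_radial / wavelength  # Hz
print(f"{doppler_shift / 1e6:.1f} MHz")    # ~129.0 MHz, below the ~130 MHz figure
```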
Instantaneous velocity calculation also makes it easier for the FM LIDAR system to determine distant or sparse data points as objects and/or track how those objects are moving over time. For example, an FM LIDAR sensor (e.g., sensors 130 in
Faster identification and/or tracking by the FM LIDAR system gives an autonomous vehicle control system more time to maneuver a vehicle. A better understanding of how fast objects are moving also allows the autonomous vehicle control system to plan a better reaction.
The FM LIDAR system can have less static compared to conventional LIDAR systems. That is, the conventional LIDAR systems that are designed to be more light-sensitive typically perform poorly in bright sunlight. These systems also tend to suffer from crosstalk (e.g., when sensors get confused by each other's light pulses or light beams) and from self-interference (e.g., when a sensor gets confused by its own previous light pulse or light beam). To overcome these disadvantages, vehicles using the conventional LIDAR systems often need extra hardware, complex software, and/or more computational power to manage this “noise.”
In contrast, FM LIDAR systems do not suffer from these types of issues because each sensor is specially designed to respond only to its own light characteristics (e.g., light beams, light waves, light pulses). If the returning light does not match the timing, frequency, and/or wavelength of what was originally transmitted, then the FM sensor can filter out (e.g., remove, ignore, etc.) that data point. As such, FM LIDAR systems produce (e.g., generate, derive, etc.) more accurate data with fewer hardware and software requirements, enabling smoother driving.
The FM LIDAR system can be easier to scale than conventional LIDAR systems. As more self-driving vehicles (e.g., cars, commercial trucks, etc.) show up on the road, those powered by an FM LIDAR system likely will not have to contend with interference issues from sensor crosstalk. Furthermore, an FM LIDAR system uses less optical peak power than conventional LIDAR sensors. As such, some or all of the optical components for an FM LIDAR can be produced on a single chip, which produces its own benefits, as discussed herein.
2.1 Commercial Trucking
The environment 100B includes an object 110B (shown in
The commercial truck 102B may include a LIDAR system 104B (e.g., an FM LIDAR system, vehicle control system 120 in
As shown, the LIDAR system 104B in environment 100B may be configured to detect an object (e.g., another vehicle, a bicycle, a tree, street signs, potholes, etc.) at short distances (e.g., 30 meters or less) from the commercial truck 102B.
The environment 100C includes an object 110C (shown in
The environment 100D includes an object 110D (shown in
In commercial trucking applications, it is important to effectively detect objects at all ranges due to the increased weight and, accordingly, longer stopping distance required for such vehicles. FM LIDAR systems (e.g., FMCW and/or FMQW systems) or PM LIDAR systems are well-suited for commercial trucking applications due to the advantages described above. As a result, commercial trucks equipped with such systems may have an enhanced ability to move both people and goods across short or long distances. In various implementations, such FM or PM LIDAR systems can be used in semi-autonomous applications, in which the commercial truck has a driver and some functions of the commercial truck are autonomously operated using the FM or PM LIDAR system, or fully autonomous applications, in which the commercial truck is operated entirely by the FM or PM LIDAR system, alone or in combination with other vehicle systems.
3. LIDAR Systems
The LIDAR system 200 can include a laser source 204 that generates and emits a beam 206, such as a carrier wave light beam. A splitter 208 can split the beam 206 into a beam 210 and a reference beam 212 (e.g., reference signal). In some implementations, any suitable optical, electronic, or opto-electronic elements can be used to provide the beam 210 and the reference beam 212 from the laser 204 to other elements.
A modulator 214 can modulate one or more properties of the input beam 210 to generate a beam 216 (e.g., target beam). In some implementations, the modulator 214 can modulate a frequency of the input beam 210 (e.g., optical frequency corresponding to optical wavelength, where c=λν, where c is the speed of light, λ is the wavelength, and ν is the frequency). For example, the modulator 214 can modulate a frequency of the input beam 210 linearly such that a frequency of the beam 216 increases or decreases linearly over time. As another example, the modulator 214 can modulate a frequency of the input beam 210 non-linearly (e.g., exponentially). In some implementations, the modulator 214 can modulate a phase of the input beam 210 to generate the beam 216. However, the modulation techniques are not limited to frequency modulation and phase modulation. Any suitable modulation techniques can be used to modulate one or more properties of a beam. Returning to
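A linear frequency chirp of the kind described can be sketched numerically. The bandwidth and duration below are assumed for illustration and are not taken from this disclosure; the key point is that the instantaneous frequency ramps linearly while the optical phase is its time integral:

```python
import numpy as np

# Hypothetical chirp parameters (not specified in the source).
B = 1e9        # chirp bandwidth (Hz), assumed
T = 10e-6      # chirp duration (s), assumed

t = np.linspace(0, T, 1000, endpoint=False)
f_inst = (B / T) * t                       # instantaneous frequency of a linear up-chirp
# The modulated field's phase is the time integral of the instantaneous frequency:
phase = 2 * np.pi * 0.5 * (B / T) * t**2
```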
The beam 216, which is used for outputting a transmitted signal, can have most of the energy of the beam 206 outputted by the laser source 204, while the reference beam 212 can have significantly less energy, yet sufficient energy to enable mixing with a return beam 248 (e.g., returned light) scattered from an object. The reference beam 212 can be used as a local oscillator (LO) signal. The reference beam 212 passes through a reference path and can be provided to a mixer 260. An amplifier 220 can amplify the beam 216 to output a beam 222, which a collimator 224 can collimate to output a beam 226.
As depicted in
The optics 232 can define a field of view 244 that corresponds to angles scanned (e.g., swept) by the beam 242 (e.g., a transmitted beam). For example, the beam 242 can be scanned in the particular plane, such as an azimuth plane or elevation plane (e.g., relative to an object to which the LIDAR system 200 is coupled, such as an autonomous vehicle). The optics 232 can be oriented so that the field of view 244 sweeps an azimuthal plane relative to the optics 232.
At least one motor 240 can be coupled with the optics 232 to control at least one of a position or an orientation of the optics 232 relative to the beam 230. For example, where the optics 232 include a reflector or deflector, the motor 240 can rotate the optics 232 so that surfaces of the optics 232 at which the beam 230 is received vary in angle or orientation relative to the beam 230, causing the beam 242 to be varied in angle or direction as the beam 242 is outputted from the optics 232.
The beam 242 can be outputted from the optics 232 and reflected or otherwise scattered by an object (not shown) as a return beam 248 (e.g., return signal). The return beam 248 can be received on a reception path, which can include the circulator 228, and provided to the mixer 260.
The mixer 260 can be an optical hybrid, such as a 90 degree optical hybrid. The mixer 260 can receive the reference beam 212 and the return beam 248, and mix the reference beam 212 and the return beam 248 to output a signal 264 responsive to the reference beam 212 and the return beam 248. The signal 264 can include an in-phase (I) component 268 and a quadrature (Q) component 272.
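One benefit of producing both I and Q components can be illustrated with a short sketch: treating the beat signal as complex-valued (I + jQ) preserves the sign of a frequency shift, which a single real channel would lose. The sample rate and beat frequency below are assumed for illustration:

```python
import numpy as np

fs = 100e6                    # sample rate (Hz), assumed
n = 1024                      # number of samples
t = np.arange(n) / fs
f_beat = -5e6                 # a negative shift (e.g., receding object), assumed

s = np.exp(2j * np.pi * f_beat * t)   # complex beat signal: I + jQ

# The FFT of a complex signal has distinct positive- and negative-frequency bins,
# so the signed beat frequency can be read off the spectral peak.
freqs = np.fft.fftfreq(n, 1 / fs)
peak = freqs[np.argmax(np.abs(np.fft.fft(s)))]
# peak recovers f_beat, including its sign, to within one FFT bin
```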
The LIDAR system 200 can include a receiver 276 that receives the signal 264 from the mixer 260. The receiver 276 can generate a signal 280 responsive to the signal 264, which can be an electronic (e.g., radio frequency) signal. The receiver 276 can include one or more photodetectors that output the signal 280 responsive to the signal 264.
The LIDAR system 200 can include a processing system 290, which can be implemented using features of the vehicle control system 120 described with reference to
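For a linearly chirped transmit beam, range can be recovered from the beat frequency between the reference beam and the return beam via f_beat = 2·B·R/(c·T). A minimal sketch, with the chirp bandwidth, chirp duration, and measured beat frequency all assumed for illustration (none are specified in this disclosure):

```python
# Range from beat frequency for a linear chirp: R = f_beat * c * T / (2 * B).
c = 299_792_458.0   # speed of light (m/s)
B = 1e9             # chirp bandwidth (Hz), assumed
T = 10e-6           # chirp duration (s), assumed
f_beat = 1.0e6      # measured beat frequency (Hz), assumed

R = f_beat * c * T / (2 * B)   # range to the object (m)
```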
The processing system 290 can include or be communicatively coupled with a vehicle controller 298 to control operation of a vehicle for which the LIDAR system 200 is installed (e.g., to provide complete or semi-autonomous control of the vehicle). For example, the vehicle controller 298 can be implemented by at least one of the LIDAR system 200 or control circuitry of the vehicle. The vehicle controller 298 can control operation of the vehicle responsive to at least one of a range to the object or a velocity of the object determined by the processing system 290. For example, the vehicle controller 298 can transmit a control signal to at least one of a steering system or a braking system of the vehicle to control at least one of speed or direction of the vehicle.
In some implementations, the optics 300 can be an assembled polygon that includes a plurality of mirrors 304 coupled with a frame 308. By assembling the optics 300 from separate components, rather than forming the scanner by machining a metal block, the optics 300 can be made to have improved optical and mechanical performance, including mirror form factor flexibility, low weight/inertia for a given mirror size, optical surface quality (e.g., lack of roughness), lower cost at volume, and robustness with respect to stresses such as thermal, shock, and vibration stresses. For example, by forming the optics 300 as an assembled device, the scanner can have about half the mass and inertia about axis 402 relative to a solid metal scanner having a similar or equal mirror size (e.g., a mass of 0.09 kg and an inertia about axis 402 of 5.2 e-5 kg m2, as compared to a solid metal scanner having a mass of 0.2 kg and an inertia of 1.05 e-4 kg m2).
The mirrors 304 can be facets, and can have outward-facing surfaces 312 through which incoming beams are received and then reflected by the mirrors 304 to be outputted from the surfaces 312. The mirrors 304 can be reflective to light used for LIDAR applications (e.g., light received from the laser 204 via one or more components as depicted in
The optics 300 can include various numbers of mirrors 304. For example, the optics 300 can include greater than or equal to three and less than or equal to twelve mirrors 304. The mirrors 304 can be arranged around a perimeter 306 of the frame 308, such as to define a polygonal shape. Each mirror 304 can have a same shape as at least one other mirror 304, such as by having a rectangular shape with identical length and width, a circular or elliptical shape with identical perimeter, a convex or concave polygonal shape with identical numbers and lengths of sides, and various other such similar or identical shapes.
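As a rough geometric sketch (assuming an ideal flat-facet polygon), each of N facets subtends 360/N degrees of scanner rotation, and because reflection doubles the angular deflection of the beam, each facet can sweep the beam through up to about twice that angle; the usable field of view in practice may be smaller due to beam footprint and facet edges:

```python
# Upper bound on the optical scan angle per facet of a rotating polygon scanner.
def max_scan_angle_deg(num_facets: int) -> float:
    # Reflection doubles the mechanical rotation angle subtended by a facet.
    return 2 * (360.0 / num_facets)

for n in (3, 5, 12):
    print(n, max_scan_angle_deg(n))   # fewer facets -> wider sweep per facet
```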
The mirrors 304 can be sized to extend outward from the frame 308; for example, a plane in which a surface 414 of the frame 308 lies can intersect at least one mirror 304 inward from an outer edge 310 of the at least one mirror 304. For example, the mirrors 304 can extend further than an extent of the frame 308 defined by the surface 414. The mirrors 304 can extend further above and further below the frame 308 in a frame of reference in which at least one of the axis 402 is parallel with gravity or the surface 414 is parallel with ground. The mirrors 304 can extend further than the surface 414 in a direction along the axis 402 (e.g., a projection of the mirrors 304 onto the axis 402 or a plane in which the axis 402 lies can be outward from the frame 308). This can allow the overall optical surface area of the mirrors 304 that can be used for reflecting incoming beams to be increased without increasing the size or weight of the frame 308, due to the assembled configuration and bonding of the mirrors 304 to the frame 308. As such, greater flexibility can be achieved for arranging various components of the LIDAR system 200 with respect to each other and with respect to the optics 300, which can enable the overall form factor to be decreased in size.
The mirrors 304 can include a glass material. For example, the mirrors 304 can include optical glass such as crown glass or flint glass. For example, the mirrors 304 can include K9 glass or BK7 glass, which can have improved thermal performance. As another example, the mirrors 304 can include glass of fused silica, which can operate effectively under conditions of UV and near infrared (NIR) light, with low coefficient of thermal expansion. The mirrors 304 can be formed by being cut from a larger glass panel, which can allow for more scalable production of the mirrors 304.
In some implementations, the mirrors 304 (e.g., surfaces 312) can be polished. Due to the use of glass for the mirrors 304 (e.g., rather than metal materials such as CNC machined and diamond turned aluminum), the mirrors 304 can be polished with greater flatness and lesser roughness, and as a result have improved optical properties, such as by reducing scattering of incoming light by the surfaces 312 (which can then be reflected off a backing of the mirrors 304 and then outputted from the surfaces 312, again with reduced scattering). For example, in an example test of scattering by glass mirrors 304 as compared with diamond turned aluminum (each coated with unprotected gold), the polished glass of the mirrors 304 was found to have relative scattering of 0.80 dB, while the metal (diamond turned aluminum) was found to have relative scattering of 6.14 dB. As such, the glass mirrors 304 can have reduced likelihood of scattering of light beams within the structures defining the roughness of the surfaces 312, which can address issues such as Doppler components being contributed to the beam's signal by the scattering. In turn, signal processing computational demands can be reduced, as signal processing needed to remove the Doppler components can be reduced or eliminated. In some implementations, the mirrors 304 can be coated with a coating. For example, gold (e.g., unprotected gold) can be used as a coating material. However, the coating material is not limited to gold. Instead, any suitable reflective material can be used as a coating material.
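The cited relative scattering figures can be compared on a linear power scale; converting the dB values from the example test above shows the diamond turned aluminum scattering several times more power than the polished glass:

```python
# Convert the relative scattering figures (dB) to a linear power ratio.
glass_db = 0.80   # polished glass, per the example test above
metal_db = 6.14   # diamond turned aluminum, per the example test above

ratio = 10 ** ((metal_db - glass_db) / 10)
print(f"metal scatters ~{ratio:.1f}x more power than glass")
```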
As depicted in
In some implementations, each mirror 304 can extend from a first edge 316 to a second edge 320, and can be arranged so that there is a gap 324 between respective edges 316, 320 of adjacent mirrors 304. The gaps 324 can allow for expansion or other movement or change in shape of the mirrors 304, such as due to thermal or vibration effects. The edges 316, 320 can be angled, such that the gaps 324 decrease in size in a direction away from the axis 402 (while some gap 324 is still retained where edges 316, 320 meet surfaces 312). In some other implementations, the mirrors 304 can be arranged without any gap between respective edges 320 of adjacent mirrors 304.
The frame 308 can be made from a metal material, such as to be formed as a metal block. For example, the frame 308 can be made from aluminum. Using aluminum for the frame 308 can enable the frame 308 to be relatively lightweight and easy to manufacture. The frame 308 or portions thereof can be made from various materials, such as plastic or composite materials, that have sufficient rigidity or other material or structural properties across temperatures of operation of LIDAR system to allow for efficient force transfer from the frame 308 to the mirrors 304.
Each mirror 304 can be bonded at a respective bond surface 404 of the frame 308. The bond surfaces 404 can be positioned on or define the perimeter 306 of the frame 308. For example, the frame 308 can include a wall 408 (e.g., perimeter wall) that is oriented transverse to an axis 402 of the frame 308. The bond surfaces 404 can be defined on the wall 408. As depicted in
An adhesive (e.g., bonding material) can be provided on the bond surfaces 404 (e.g., placed on a central portion of the inner surfaces 416 and/or bond surfaces 404) to attach the mirrors 304 to the bond surfaces 404, which can enable symmetric thermal expansion (e.g., with relatively low thermally developed expansion stresses). For example, an epoxy, such as a dispensed epoxy, can be used to attach the mirrors 304 to the bond surfaces 404. At least one of the material properties of the adhesive or the surface area of the bond surfaces 404 can be selected so that the attachment force between the bond surfaces 404 and the mirrors 304 is greater than the apparent (e.g., centrifugal) force that results from rotation of the optics 300 (e.g., rotation of the scanner of the optics 300) about the axis 402 and that would otherwise drive the mirrors 304 away from the bond surfaces 404 during operation. For example, the attachment force can exceed the centrifugal force at a maximum expected rotation rate of the scanner by at least a threshold. The adhesive can be selected to have a coefficient of thermal expansion similar or about equal to that of the mirrors 304, which can improve the performance of the optics 300 with respect to thermal expansion or contraction.
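The force balance described above can be sketched numerically using F = m·ω²·r for the apparent centrifugal force; a minimal sketch, in which the mirror mass, radius, rotation rate, safety factor, and bond area are illustrative assumptions rather than values from this disclosure:

```python
import math

def centrifugal_force(mass_kg: float, rpm: float, radius_m: float) -> float:
    """Apparent outward force on a mirror rotating about the scanner axis."""
    omega = 2.0 * math.pi * rpm / 60.0  # angular rate in rad/s
    return mass_kg * omega ** 2 * radius_m

# Illustrative values only (not taken from this disclosure):
mirror_mass = 0.02   # 20 g glass mirror
max_rpm = 6000.0     # maximum expected scanner rotation rate
bond_radius = 0.04   # distance from the axis to the mirror center of mass, m

f_centrifugal = centrifugal_force(mirror_mass, max_rpm, bond_radius)

# The adhesive attachment force should exceed the centrifugal force by at
# least a threshold, modeled here as a safety factor on bond strength x area.
safety_factor = 3.0
bond_area = 4.0e-4   # e.g., a 20 mm x 20 mm bond patch, m^2
required_strength_pa = safety_factor * f_centrifugal / bond_area
```

Under these assumed values the required bond strength is on the order of a few megapascals, comfortably below the tensile strength of typical structural epoxies, which is consistent with the selection criterion described above.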
The frame 308 can include a shaft receiver 420 inward from the wall 408. The shaft receiver 420 can be a channel or other opening to allow a shaft (e.g., shaft or axle coupled with the motor 240 described with reference to
Having now described some illustrative implementations, it is apparent that the foregoing is illustrative and not limiting, having been presented by way of example. In particular, although many of the examples presented herein involve specific combinations of method acts or system elements, those acts and those elements can be combined in other ways to accomplish the same objectives. Acts, elements, and features discussed in connection with one implementation are not intended to be excluded from a similar role in other implementations.
The phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” “characterized by,” “characterized in that,” and variations thereof herein is meant to encompass the items listed thereafter, equivalents thereof, and additional items, as well as alternate implementations consisting of the items listed thereafter exclusively. In one implementation, the systems and methods described herein consist of one, each combination of more than one, or all of the described elements, acts, or components.
Any references to implementations or elements or acts of the systems and methods herein referred to in the singular can also embrace implementations including a plurality of these elements, and any references in plural to any implementation or element or act herein can also embrace implementations including only a single element. References in the singular or plural form are not intended to limit the presently disclosed systems or methods, their components, acts, or elements to single or plural configurations. References to any act or element being based on any information, act or element can include implementations where the act or element is based at least in part on any information, act, or element.
Any implementation disclosed herein can be combined with any other implementation or embodiment, and references to “an implementation,” “some implementations,” “one implementation” or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described in connection with the implementation can be included in at least one implementation or embodiment. Such terms as used herein are not necessarily all referring to the same implementation. Any implementation can be combined with any other implementation, inclusively or exclusively, in any manner consistent with the aspects and implementations disclosed herein.
Where technical features in the drawings, detailed description or any claim are followed by reference signs, the reference signs have been included to increase the intelligibility of the drawings, detailed description, and claims. Accordingly, neither the reference signs nor their absence have any limiting effect on the scope of any claim elements.
Systems and methods described herein may be embodied in other specific forms without departing from the characteristics thereof. Further, relative parallel, perpendicular, vertical, or other positioning or orientation descriptions include variations within +/−10% or +/−10 degrees of pure vertical, parallel, or perpendicular positioning. References to “approximately,” “about,” “substantially,” or other terms of degree include variations of +/−10% from the given measurement, unit, or range unless explicitly indicated otherwise. Coupled elements can be electrically, mechanically, or physically coupled with one another directly or with intervening elements. The scope of the systems and methods described herein is thus indicated by the appended claims, rather than the foregoing description, and changes that come within the meaning and range of equivalency of the claims are embraced therein.
The term “coupled” and variations thereof includes the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly with or to each other, with the two members coupled with each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled with each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
References to “or” can be construed as inclusive so that any terms described using “or” can indicate any of a single, more than one, and all of the described terms. A reference to “at least one of ‘A’ and ‘B’” can include only ‘A’, only ‘B’, as well as both ‘A’ and ‘B’. Such references used in conjunction with “comprising” or other open terminology can include additional items.
Modifications of described elements and acts such as variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations can occur without materially departing from the teachings and advantages of the subject matter disclosed herein. For example, elements shown as integrally formed can be constructed of multiple parts or elements, the position of elements can be reversed or otherwise varied, and the nature or number of discrete elements or positions can be altered or varied. Other substitutions, modifications, changes and omissions can also be made in the design, operating conditions and arrangement of the disclosed elements and operations without departing from the scope of the present disclosure.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
Number | Name | Date | Kind |
---|---|---|---|
4099249 | Casasent | Jul 1978 | A |
4620192 | Collins | Oct 1986 | A |
4648276 | Klepper et al. | Mar 1987 | A |
4804893 | Melocik | Feb 1989 | A |
5075864 | Sakai | Dec 1991 | A |
5216534 | Boardman et al. | Jun 1993 | A |
5223986 | Mayerjak et al. | Jun 1993 | A |
5227910 | Khattak | Jul 1993 | A |
5231401 | Kaman et al. | Jul 1993 | A |
5461505 | Nishikawa et al. | Oct 1995 | A |
5687017 | Katoh | Nov 1997 | A |
5781156 | Krasner | Jul 1998 | A |
5828585 | Welk et al. | Oct 1998 | A |
5947903 | Ohtsuki et al. | Sep 1999 | A |
5999302 | Sweeney et al. | Dec 1999 | A |
6029496 | Kreft | Feb 2000 | A |
6211888 | Ohtsuki et al. | Apr 2001 | B1 |
6671595 | Lu et al. | Dec 2003 | B2 |
6753950 | Morcom | Jun 2004 | B2 |
6871148 | Morgen et al. | Mar 2005 | B2 |
6931055 | Underbrink et al. | Aug 2005 | B1 |
7122691 | Oshima et al. | Oct 2006 | B2 |
7152490 | Freund et al. | Dec 2006 | B1 |
7486802 | Hougen | Feb 2009 | B2 |
7511824 | Sebastian et al. | Mar 2009 | B2 |
7639347 | Eaton | Dec 2009 | B2 |
7742152 | Hui et al. | Jun 2010 | B2 |
7917039 | Delfyett | Mar 2011 | B1 |
8135513 | Bauer et al. | Mar 2012 | B2 |
8531650 | Feldkhun et al. | Sep 2013 | B2 |
8751155 | Lee | Jun 2014 | B2 |
8805197 | Delfyett | Aug 2014 | B2 |
8818609 | Boyko et al. | Aug 2014 | B1 |
8831780 | Zelivinski et al. | Sep 2014 | B2 |
8954252 | Urmson et al. | Feb 2015 | B1 |
9041915 | Earhart et al. | May 2015 | B2 |
9046909 | Leibowitz et al. | Jun 2015 | B2 |
9086273 | Gruver et al. | Jul 2015 | B1 |
9097800 | Zhu | Aug 2015 | B1 |
9348137 | Plotkin et al. | May 2016 | B2 |
9383753 | Templeton et al. | Jul 2016 | B1 |
9607220 | Smith et al. | Mar 2017 | B1 |
9618742 | Droz | Apr 2017 | B1 |
9753462 | Gilliland et al. | Sep 2017 | B2 |
10036812 | Crouch et al. | Jul 2018 | B2 |
10231705 | Lee | Mar 2019 | B2 |
10345434 | Hinderling et al. | Jul 2019 | B2 |
10422649 | Engelman et al. | Sep 2019 | B2 |
10485508 | Miyaji et al. | Nov 2019 | B2 |
10520602 | Villeneuve et al. | Dec 2019 | B2 |
10534084 | Crouch | Jan 2020 | B2 |
10568258 | Wahlgren | Feb 2020 | B2 |
10571567 | Campbell | Feb 2020 | B2 |
11002856 | Heidrich et al. | May 2021 | B2 |
11041954 | Crouch et al. | Jun 2021 | B2 |
11249192 | Crouch et al. | Feb 2022 | B2 |
11402506 | Ohtomo et al. | Aug 2022 | B2 |
11441899 | Pivac et al. | Sep 2022 | B2 |
20020071109 | Allen et al. | Jun 2002 | A1 |
20020140924 | Wangler et al. | Oct 2002 | A1 |
20020180868 | Lippert et al. | Dec 2002 | A1 |
20030117312 | Nakanishi et al. | Jun 2003 | A1 |
20040034304 | Sumi | Feb 2004 | A1 |
20040109155 | Deines | Jun 2004 | A1 |
20040158155 | Njemanze | Aug 2004 | A1 |
20040222366 | Frick | Nov 2004 | A1 |
20050149240 | Tseng et al. | Jul 2005 | A1 |
20060132752 | Kane | Jun 2006 | A1 |
20060239312 | Kewitsch et al. | Oct 2006 | A1 |
20070005212 | Xu et al. | Jan 2007 | A1 |
20070181810 | Tan et al. | Aug 2007 | A1 |
20080018881 | Hui et al. | Jan 2008 | A1 |
20080024756 | Rogers | Jan 2008 | A1 |
20080040029 | Breed | Feb 2008 | A1 |
20080100822 | Munro | May 2008 | A1 |
20090002679 | Ruff et al. | Jan 2009 | A1 |
20090009842 | Destain et al. | Jan 2009 | A1 |
20090030605 | Breed | Jan 2009 | A1 |
20090059201 | Willner et al. | Mar 2009 | A1 |
20100094499 | Anderson | Apr 2010 | A1 |
20100183309 | Etemad et al. | Jul 2010 | A1 |
20100188504 | Dimsdale et al. | Jul 2010 | A1 |
20100312432 | Hamada et al. | Dec 2010 | A1 |
20110013245 | Tanaka et al. | Jan 2011 | A1 |
20110015526 | Tamura | Jan 2011 | A1 |
20110026007 | Gammenthaler | Feb 2011 | A1 |
20110026008 | Gammenthaler | Feb 2011 | A1 |
20110205523 | Rezk et al. | Aug 2011 | A1 |
20110292371 | Chang | Dec 2011 | A1 |
20120038902 | Dotson | Feb 2012 | A1 |
20120127252 | Lim | May 2012 | A1 |
20120229627 | Wang | Sep 2012 | A1 |
20120274922 | Hodge | Nov 2012 | A1 |
20120281907 | Samples et al. | Nov 2012 | A1 |
20120306383 | Munro | Dec 2012 | A1 |
20130104661 | Klotz et al. | May 2013 | A1 |
20130120989 | Sun et al. | May 2013 | A1 |
20130268163 | Comfort et al. | Oct 2013 | A1 |
20130325244 | Wang et al. | Dec 2013 | A1 |
20140036252 | Amzajerdian et al. | Feb 2014 | A1 |
20140064607 | Grossmann et al. | Mar 2014 | A1 |
20150005993 | Breuing | Jan 2015 | A1 |
20150046119 | Sandhawalia et al. | Feb 2015 | A1 |
20150130607 | MacArthur | May 2015 | A1 |
20150160332 | Sebastian et al. | Jun 2015 | A1 |
20150177379 | Smith et al. | Jun 2015 | A1 |
20150185244 | Inoue et al. | Jul 2015 | A1 |
20150260836 | Hayakawa | Sep 2015 | A1 |
20150267433 | Leonessa et al. | Sep 2015 | A1 |
20150269438 | Samarasekera et al. | Sep 2015 | A1 |
20150270838 | Chan et al. | Sep 2015 | A1 |
20150282707 | Tanabe et al. | Oct 2015 | A1 |
20150323660 | Hampikian | Nov 2015 | A1 |
20150331103 | Jensen | Nov 2015 | A1 |
20150331111 | Newman et al. | Nov 2015 | A1 |
20160078303 | Samarasekera et al. | Mar 2016 | A1 |
20160084946 | Turbide | Mar 2016 | A1 |
20160091599 | Jenkins | Mar 2016 | A1 |
20160123720 | Thorpe et al. | May 2016 | A1 |
20160125739 | Stewart et al. | May 2016 | A1 |
20160216366 | Phillips et al. | Jul 2016 | A1 |
20160245903 | Kalscheur et al. | Aug 2016 | A1 |
20160260324 | Tummala et al. | Sep 2016 | A1 |
20160266243 | Marron | Sep 2016 | A1 |
20160274589 | Templeton et al. | Sep 2016 | A1 |
20160302010 | Sebastian et al. | Oct 2016 | A1 |
20160350926 | Flint et al. | Dec 2016 | A1 |
20160377721 | Lardin et al. | Dec 2016 | A1 |
20160377724 | Crouch et al. | Dec 2016 | A1 |
20170160541 | Carothers et al. | Jun 2017 | A1 |
20170248691 | McPhee et al. | Aug 2017 | A1 |
20170299697 | Swanson | Oct 2017 | A1 |
20170329014 | Moon et al. | Nov 2017 | A1 |
20170329332 | Pilarski et al. | Nov 2017 | A1 |
20170343652 | De Mersseman et al. | Nov 2017 | A1 |
20170350964 | Kaneda | Dec 2017 | A1 |
20170350979 | Uyeno et al. | Dec 2017 | A1 |
20170356983 | Jeong et al. | Dec 2017 | A1 |
20180003805 | Popovich et al. | Jan 2018 | A1 |
20180136000 | Rasmusson et al. | May 2018 | A1 |
20180188355 | Bao et al. | Jul 2018 | A1 |
20180224547 | Crouch et al. | Aug 2018 | A1 |
20180267556 | Templeton et al. | Sep 2018 | A1 |
20180276986 | Delp | Sep 2018 | A1 |
20180284286 | Eichenholz et al. | Oct 2018 | A1 |
20180299534 | Lachapelle et al. | Oct 2018 | A1 |
20180307913 | Finn et al. | Oct 2018 | A1 |
20190064831 | Gali et al. | Feb 2019 | A1 |
20190086514 | Dussan et al. | Mar 2019 | A1 |
20190107606 | Russell et al. | Apr 2019 | A1 |
20190154439 | Binder | May 2019 | A1 |
20190154807 | Steinkogler et al. | May 2019 | A1 |
20190154816 | Hughes | May 2019 | A1 |
20190154832 | Maleki et al. | May 2019 | A1 |
20190154835 | Maleki et al. | May 2019 | A1 |
20190258251 | Ditty et al. | Aug 2019 | A1 |
20190310351 | Hughes et al. | Oct 2019 | A1 |
20190310469 | Sapir | Oct 2019 | A1 |
20190317219 | Smith et al. | Oct 2019 | A1 |
20190318206 | Smith et al. | Oct 2019 | A1 |
20190346856 | Berkemeier et al. | Nov 2019 | A1 |
20190361119 | Kim et al. | Nov 2019 | A1 |
20200025879 | Pacala et al. | Jan 2020 | A1 |
20200049819 | Cho et al. | Feb 2020 | A1 |
20200049879 | Sato | Feb 2020 | A1 |
20200182978 | Maleki et al. | Jun 2020 | A1 |
20200192082 | Zhou et al. | Jun 2020 | A1 |
20200249349 | Steinberg | Aug 2020 | A1 |
20210089047 | Smith et al. | Mar 2021 | A1 |
20210165102 | Crouch et al. | Jun 2021 | A1 |
20210325664 | Adams | Oct 2021 | A1 |
20220260686 | Wang | Aug 2022 | A1 |
20220413260 | Gassend | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
101346773 | Jan 2009 | CN |
102150007 | Aug 2011 | CN |
103227559 | Jul 2013 | CN |
103608696 | Feb 2014 | CN |
104793619 | Jul 2015 | CN |
104914445 | Sep 2015 | CN |
104956400 | Sep 2015 | CN |
105116922 | Dec 2015 | CN |
105425245 | Mar 2016 | CN |
105629258 | Jun 2016 | CN |
105652282 | Jun 2016 | CN |
107015238 | Aug 2017 | CN |
107024686 | Aug 2017 | CN |
107193011 | Sep 2017 | CN |
207318710 | May 2018 | CN |
10 2007 001 103 | Jul 2008 | DE |
10 2017 200 692 | Aug 2018 | DE |
1 298 453 | Apr 2003 | EP |
3330766 | Jun 2018 | EP |
2568688 | Feb 1986 | FR |
2 349 231 | Oct 2000 | GB |
S63-071674 | Apr 1988 | JP |
H06-148556 | May 1994 | JP |
H09-257415 | Oct 1997 | JP |
H09-325290 | Dec 1997 | JP |
2765767 | Jun 1998 | JP |
H11-153664 | Jun 1999 | JP |
2000-338244 | Dec 2000 | JP |
2002-249058 | Sep 2002 | JP |
2003-185738 | Jul 2003 | JP |
2006-148556 | Jun 2006 | JP |
2006-226931 | Aug 2006 | JP |
2007-155467 | Jun 2007 | JP |
2007-214564 | Aug 2007 | JP |
2007-214694 | Aug 2007 | JP |
2009-257415 | Nov 2009 | JP |
2009-288255 | Dec 2009 | JP |
2009-291294 | Dec 2009 | JP |
2011-044750 | Mar 2011 | JP |
2011-107165 | Jun 2011 | JP |
2011-203122 | Oct 2011 | JP |
2012-502301 | Jan 2012 | JP |
2012-103118 | May 2012 | JP |
2012-154863 | Aug 2012 | JP |
2015-125062 | Jul 2015 | JP |
2015-172510 | Oct 2015 | JP |
2015-212942 | Nov 2015 | JP |
2015-535925 | Dec 2015 | JP |
2017-138219 | Aug 2017 | JP |
2017-524918 | Aug 2017 | JP |
2018-173346 | Nov 2018 | JP |
2018-204970 | Dec 2018 | JP |
2018-0058068 | May 2018 | KR |
2018-0126927 | Nov 2018 | KR |
201516612 | May 2015 | TW |
201818183 | May 2018 | TW |
201832039 | Sep 2018 | TW |
201833706 | Sep 2018 | TW |
202008702 | Feb 2020 | TW |
WO-2007124063 | Nov 2007 | WO |
WO-2010127151 | Nov 2010 | WO |
WO-2011102130 | Aug 2011 | WO |
WO-2014011241 | Jan 2014 | WO |
WO-2014132020 | Sep 2014 | WO |
WO-2015037173 | Mar 2015 | WO |
WO-2016134321 | Aug 2016 | WO |
WO-2016164435 | Oct 2016 | WO |
WO-2017018065 | Feb 2017 | WO |
WO-2018066069 | Apr 2018 | WO |
WO-2018067158 | Apr 2018 | WO |
WO-2018102188 | Jun 2018 | WO |
WO-2018102190 | Jun 2018 | WO |
WO-2018107237 | Jun 2018 | WO |
WO-2018125438 | Jul 2018 | WO |
WO-2018144853 | Aug 2018 | WO |
WO-2018160240 | Sep 2018 | WO |
WO-2019014177 | Jan 2019 | WO |
WO-2020062301 | Apr 2020 | WO |
Entry |
---|
Principles of Communication—Modulation (Year: 2022). |
JP 3422720 B2 (Year: 2003). |
JP 3422720 B2 (translated) (Year: 2003). |
Adany, P. et al., “Chirped Lidar Using Simplified Homodyne Detection,” Journal of Lightwave Technology, vol. 27, No. 16, Aug. 15, 2009, pp. 3351-3357. |
Anonymous, “Fundamentals of Direct Digital Synthesis,” Analog Devices, MT-085 Tutorial Rev. D, copyright 2009, pp. 1-9. |
Anonymous, “Occlusion—Shadows and Occlusion—Peachpit”, Jul. 3, 2006 (Jul. 3, 2006), P055697780,Retrieved from the Internet:URL:https://www.peachpit.com/articles/article.aspx?p=486505&seqNum=7[retrieved on May 25, 2020] 2 pages. |
Aull, B. et al., “Geiger-Mode Avalanche Photodiodes for Three-Dimensional Imaging,” Lincoln Laboratory Journal, vol. 13, No. 2, 2002, pp. 335-350. |
Bashkansky, M. et al., “RF phase-coded random-modulation LIDAR,” Optics Communications, vol. 231, 2004, pp. 93-98. |
Beck, S. et al., “Synthetic-aperture imaging laser radar: laboratory demonstration and signal processing,” Applied Optics, vol. 44, No. 35, Dec. 10, 2005, pp. 7621-7629. |
Berkovic, G. and Shafir, E., “Optical methods for distance and displacement measurements”, Advances in Optics and Photonics, vol. 4, Issue 4, Dec. 2012, pp. 441-471. |
Besl, P. and McKay, N., “A Method for Registration of 3-D shapes”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 2, Feb. 1992, pp. 239-256. |
Campbell, J. et al., “Super-resolution technique for CW lidar using Fourier transform reordering and Richardson-Lucy deconvolution.” Optics Letters, vol. 39, No. 24, Dec. 15, 2014, pp. 6981-6984. |
Cao, X. et al., “Lidar Signal Depolarization by Solid Targets and its Application to Terrain Mapping and 3D Imagery,” Defence R&D, Contract Report DRDC Valcartier CR 2011-236, Mar. 2011, retrieved at URL:http://publications.gc.ca/collections/collection_2016/rddc-drdc/D68-3-236-2011-eng.pdf, pp. 1-74. |
Cheng, H., “Autonomous Intelligent Vehicles: Theory, Algorithms, and Implementation”, copyright 2011, Springer, retrieved from http://ebookcentral.proquest.com, created from epo-ebooks on Jun. 1, 2020, 24 pages. |
Chinese Office Action on CN Appl. Ser. No. 201880009947.5 dated Oct. 11, 2021 (5 pages). |
Contu, F., “The Do's and Don'ts of High Speed Serial Design in FPGA's”. Xilinix All Programmable, Copyright 2013, High Speed Digital Design & Validation Seminars 2013, pp. 1-61. |
Corrected Notice of Allowance on U.S. Appl. No. 16/725,419 dated May 28, 2020 (2 pages). |
Crouch, S. and Barber, Z., “Laboratory demonstrations of interferometric and spotlight synthetic aperture ladar techniques,” Optics Express, vol. 20, No. 22, Oct. 22, 2012, pp. 24237-24246. |
Crouch, S. et al., “Three dimensional digital holographic aperture synthesis”, Optics Express, vol. 23, No. 18, Sep. 7, 2015, pp. 23811-23816. |
Dapore, B. et al., “Phase noise analysis of two wavelength coherent imaging system”, Optics Express, vol. 21, No. 25, Dec. 16, 2013, pp. 30642-30652. |
Decision of Rejection on JP Appl. Ser. No. 2019-527155 dated Jun. 8, 2021 (8 pages). |
Decision of Rejection on JP Appl. Ser. No. 2020-559530 dated Aug. 31, 2021 (13 pages). |
Duncan, B. and Dierking, M., “Holographic aperture ladar: erratum”, Applied Optics, vol. 52, No. 4, Feb. 1, 2013, pp. 706-708. |
Duncan, B. et al., “Holographic aperture ladar”, Applied Optics, vol. 48, Issue 6, Feb. 20, 2009, pp. 1168-1177. |
El Gayar, N. (Ed.) et al., “Multiple Classifier Systems”, 9th International Workshop, International Workshop on Multiple Classifier Systems, MCS 2010, Cairo, Egypt, Apr. 7-9, 2010, specifically Farhad Samadzadegan et al., “A Multiple Classifier System for Classification of LIDAR Remote Sensing Data Using Multi-class SVM”, pp. 254-263 (337 total pages). |
Extended European Search Report on EP Appl. Ser. No. 17876081.5 dated Jun. 3, 2020 (9 pages). |
Extended European Search Report on EP Appl. Ser. No. 17876731.5 dated Jun. 17, 2020 (14 pages). |
Extended European Search Report on EP Appl. Ser. No. 17888807.9 dated Jun. 3, 2020 (9 pages). |
Extended European Search Report on EP Appl. Ser. No. 17898933.1 dated May 12, 2020 (7 pages). |
Fehr, D. et al., “Compact Covariance descriptors in 3D point clouds for object recognition,” presented at the Robotics and Automation (ICRA), May 14, 2012, IEEE International Conference, pp. 1793-1798. |
Final US Office Action on U.S. Appl. No. 17/331,362 dated Nov. 29, 2021 (17 pages). |
First Office Action on CN Appl. Ser. No. 201780081215.2 dated Mar. 3, 2021 (14 pages). |
First Office Action on CN Appl. Ser. No. 201980033898.3 dated Apr. 20, 2021 (14 pages). |
Foucras, M. et al., “Detailed Analysis of the Impact of the Code Doppler on the Acquisition Performance of New GNSS Signals,” ION ITM, International Technical Meeting of The Institute of Navigation, San Diego, California, Jan. 27, 2014, pp. 1-13. |
Griggs, R.(Ed.), “Implementation Agreement for Integrated Dual Polarization Micro-Intradyne Coherent Receivers”, OIF (Optical Internetworking Forum), IA# OIF-DPC-MRX-01.0, Mar. 31, 2015, pp. 1-32. |
Haralick, R. et al., “Image Analysis Using Mathematical Morphology,” IEEE Transactions In Pattern Analysis and Machine Intelligence, Jul. 1987, vol. PAMI-9, No. 4, pp. 532-550. |
International Preliminary Report and Written Opinion on Patentability on Appl. Ser. No. PCT/US2018/041388 dated Jan. 23, 2020 (12 pages). |
International Preliminary Report and Written Opinion on Patentability on Appl. Ser. No. PCT/US2019/028532 dated Oct. 27, 2020 (11 pages). |
International Preliminary Report and Written Opinion on Patentability on Appl. Ser. No. PCT/US2019/068351 dated Jul. 15, 2021 (8 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2017/062703 dated Aug. 27, 2018 (13 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2017/062708 dated Mar. 16, 2018 (14 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2017/062714 dated Aug. 23, 2018 (13 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2017/062721 dated Feb. 6, 2018 (12 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2018/016632 dated Apr. 24, 2018 (6 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2018/041388 dated Sep. 20, 2018 (13 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2018/044007 dated Oct. 25, 2018 (17 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2019/028532 dated Aug. 16, 2019 (16 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2019/068351 dated Apr. 9, 2020 (14 pages). |
International Search Report and Written Opinion on Appl. Ser. No. PCT/US2021/032515 dated Aug. 3, 2021 (18 pages). |
Johnson, A. et al., “Using spin images for efficient object recognition in cluttered 3D scenes”, IEEE Trans. Pattern Anal. Mach. Intell., vol. 21, No. 5, May 1999, pp. 433-448. |
Johnson, A., “Spin-Images: A Representation for 3-D Surface Matching,” doctoral dissertation, tech. report CMU-RI-TR-97-47, Robotics Institute, Carnegie Mellon University, Aug. 1997, 308 pages. |
Kachelmyer, A., “Range-Doppler Imaging with a Laser Radar”, The Lincoln Laboratory Journal, vol. 3, No. 1, 1990, pp. 87-118. |
Klasing, K. et al., “Comparison of Surface Normal Estimation Methods for Range Sensing Applications,” 2009 IEEE International Conference on Robotics and Automation, May 12, 2009, pp. 3206-3211. |
Krause, B. et al., “Motion compensated frequency modulated continuous wave 3D coherent imaging ladar with scannerless architecture”, Applied Optics, vol. 51, No. 36, Dec. 20, 2012, pp. 8745-8761. |
Le, T., “Arbitrary Power Splitting Couplers Based on 3×3 Multimode Interference Structures for All-Optical Computing”, LACSIT International Journal of Engineering and Technology, vol. 3, No. 5, Oct. 2011, pp. 565-569. |
Lin, C. et al.; “Eigen-feature analysis of weighted covariance matrices for LiDAR point cloud classification”, ISPRS Journal of Photogrammetry and Remote Sensing, vol. 94, Aug. 1, 2014 (30 pages). |
Lu, M. et al., “Recognizing Objects in 3D Point Clouds with Multi-Scale Local Features,” Sensors 2014, Dec. 15, 2014, retrieved at URL:www.mdpi.com/1424-8220/14/12/24156/pdf, pp. 24156-24173. |
MacKinnon, D. et al., “Adaptive laser range scanning”, American Control Conference, Piscataway, NJ, 2008, pp. 3857-3862. |
Marron, J. et al., “Three-dimensional Lensless Imaging Using Laser Frequency Diversity”, Applied Optics, vol. 31, No. 2, Jan. 10, 1992, pp. 255-262. |
Miyasaka, T. et al., “Moving Object Tracking and Identification in Traveling Environment Using High Resolution Laser Radar”, Graphic Information Industrial, vol. 43, No. 2, pp. 61-69, Feb. 1, 2011. |
Monreal, J. et al., “Detection of Three Dimensional Objects Based on Phase Encoded Range Images”, Sixth International Conference on Correlation Optics, vol. 5477, Jun. 4, 2004, pp. 269-280. |
Munkres, J., “Algorithms for the Assignment and Transportation Problems”, Journal of the Society for Industrial and Applied Mathematics, vol. 5, No. 1, Mar. 1957, pp. 32-38. |
Non-Final Office Action on U.S. Appl. No. 16/464,648 dated Jun. 1, 2021 (6 pages). |
Non-Final Office Action on U.S. Appl. No. 16/464,657 dated Dec. 22, 2021 (17 pages). |
Non-Final Office Action on U.S. Appl. No. 16/725,419 dated Feb. 24, 2020 (4 pages). |
Notice of Allowance for U.S. Appl. No. 16/725,399 dated Dec. 3, 2020 (11 pages). |
Notice of Allowance on KR Appl. Ser. No. 10-2019-7019062 dated Feb. 10, 2021 (4 Pages). |
Notice of Allowance on KR Appl. Ser. No. 10-2019-7019076 dated Feb. 15, 2021 (4 pages). |
Notice of Allowance on KR Appl. Ser. No. 10-2019-7019078 dated Feb. 15, 2021 (4 pages). |
Notice of Allowance on U.S. Appl. No. 16/464,648 dated Oct. 13, 2021 (7 pages). |
Notice of Allowance on U.S. Appl. No. 15/423,978 dated Jul. 15, 2019 (8 pages). |
Notice of Allowance on U.S. Appl. No. 15/645,311 dated Apr. 18, 2019 (13 pages). |
Notice of Allowance on U.S. Appl. No. 16/515,538 dated Feb. 23, 2021 (16 pages). |
Notice of Allowance on U.S. Appl. No. 16/725,419 dated Apr. 15, 2020 (9 pages). |
Notice of Preliminary Rejection on KR Appl. Ser. No. 10-2021-7014545 dated Aug. 19, 2021 (17 pages). |
Notice of Preliminary Rejection on KR Appl. Ser. No. 10-2021-7014560 dated Aug. 19, 2021 (5 pages). |
Notice of Preliminary Rejection on KR Appl. Ser. No. 10-2021-7019744 dated Aug. 19, 2021 (15 pages). |
Notice of Reasons for Refusal on JP Appl. Ser. No. 2019-527156 dated Dec. 1, 2020 (12 pages). |
Notice of Reasons for Refusal on JP Appl. Ser. No. 2020-559530 dated Apr. 20, 2021 (11 pages). |
Notice of Reasons for Refusal on JP Appl. Ser. No. 2021-165072 dated Nov. 30, 2021 (9 pages). |
Notice of Reasons for Refusal on JP Appl. Ser. No. 2021-538998 dated Nov. 30, 2021 (20 pages). |
O'Donnell, R., “Radar Systems Engineering Lecture 11 Waveforms and Pulse Compression,” IEEE New Hampshire Section, Jan. 1, 2010, pp. 1-58. |
Office Action on EP Appl. Ser. No. 19791789.1 dated Dec. 21, 2021 (12 pages). |
Office Action on JP App. Ser. No. 2019-527155 dated Dec. 1, 2020 (10 pages). |
Office Action on JP Appl. Ser. No. 2019-527155 dated Dec. 1, 2020 (8 pages). |
Office Action on JP Appl. Ser. No. 2019-527224 dated Dec. 1, 2020 (6 pages). |
Office Action on JP Appl. Ser. No. 2019-538482 dated Feb. 2, 2021 (6 pages). |
Office Action on KR Appl. Ser. No. 10-2019-7018575 dated Jun. 23, 2020 (4 pages). |
Office Action on KR Appl. Ser. No. 10-2019-7019062 dated Oct. 5, 2020 (6 pages). |
Office Action on KR Appl. Ser. No. 10-2019-7019076 dated Jun. 9, 2020 (18 pages). |
Office Action on KR Appl. Ser. No. 10-2019-7019078 dated Jun. 9, 2020 (14 pages). |
Office Action on KR Appl. Ser. No. 10-2019-7022921 dated Aug. 26, 2020 (6 pages). |
Office Action on U.S. Appl. No. 15/423,978 dated Mar. 22, 2019 (6 pages).
Office Action on U.S. Appl. No. 17/331,362 dated Aug. 19, 2021 (17 pages).
Optoplex Corporation, "90 degree Optical Hybrid", Nov. 9, 2016, 2 pages.
Rabb, D. et al., "Multi-transmitter Aperture Synthesis", Optics Express, vol. 18, No. 24, Nov. 22, 2010, pp. 24937-24945.
Roos, P. et al., "Ultrabroadband optical chirp linearization for precision metrology applications", Optics Letters, vol. 34, No. 23, Dec. 1, 2009, pp. 3692-3694.
Salehian, H. et al., "Recursive Estimation of the Stein Center of SPD Matrices and Its Applications", 2013 IEEE International Conference on Computer Vision (ICCV), Dec. 1, 2013, pp. 1793-1800.
Samadzadegan, F. et al., "A Multiple Classifier System for Classification of LIDAR Remote Sensing Data Using Multi-class SVM", Multiple Classifier Systems, 9th International Workshop, MCS 2010, Cairo, Egypt, Apr. 7-9, 2010, pp. 254-263.
Satyan, N. et al., "Precise control of broadband frequency chirps using optoelectronic feedback", Optics Express, vol. 17, No. 18, Aug. 31, 2009, pp. 15991-15999.
Second Office Action on KR Appl. Ser. No. 10-2021-7020076 dated Jun. 30, 2021 (5 pages).
Second Office Action on CN Appl. Ser. No. 201780081968.3 dated May 12, 2021 (7 pages).
Stafford, J. et al., "Holographic aperture ladar with range compression", Journal of the Optical Society of America, vol. 34, No. 5, May 2017, pp. A1-A9.
Supplementary European Search Report on EP Appl. Ser. No. 18748729.3 dated Nov. 20, 2020 (8 pages).
Supplementary European Search Report on EP Appl. Ser. No. 18831205.2 dated Feb. 12, 2021 (7 pages).
Supplementary European Search Report on EP Appl. Ser. No. 19791789.1 dated Dec. 9, 2021 (4 pages).
Third Party Submission on U.S. Appl. No. 16/725,375, filed Jun. 25, 2020 (73 pages).
Tippie, A. et al., "High-resolution synthetic-aperture digital holography with digital phase and pupil correction", Optics Express, vol. 19, No. 13, Jun. 20, 2011, pp. 12027-12038.
Weinmann, M. et al., "Semantic point cloud interpretation based on optimal neighborhoods, relevant features and efficient classifiers", ISPRS Journal of Photogrammetry and Remote Sensing, vol. 105, Feb. 27, 2015, pp. 286-304.
Wikipedia, "Digital-to-analog converter", retrieved from https://en.wikipedia.org/wiki/Digital-to-analog_converter, on Apr. 15, 2017, 7 pages.
Wikipedia, "Field-programmable gate array", retrieved from https://en.wikipedia.org/wiki/Field-programmable_gate_array, on Apr. 15, 2017, 13 pages.
Wikipedia, "In-phase and quadrature components", retrieved from https://en.wikipedia.org/wiki/In-phase_and_quadrature_components, on Jan. 26, 2018, 3 pages.
Wikipedia, "Phase-shift keying", retrieved from https://en.wikipedia.org/wiki/Phase-shift_keying#Binary_phase-shift_keying.28BPSK.29, on Oct. 23, 2016, 9 pages.
Ye, J., "Least Squares Linear Discriminant Analysis", 24th International Conference on Machine Learning, pp. 1087-1093 (as of Nov. 27, 2016).
Lu et al., "Recognizing objects in 3D point clouds with multi-scale features", Sensors 2014, 14, pp. 24156-24173, doi: 10.3390/s141224156 (Year: 2014).
Notice of Reasons for Refusal on JP Appl. Ser. No. 2021-538998 dated Apr. 26, 2022 (11 pages).
Examination Report on EP Appl. Ser. No. 17898933.1 dated May 25, 2022 (5 pages).
Notice of Reasons for Refusal on JP Appl. Ser. No. 2021-118743 dated Jun. 7, 2022 (9 pages).
Samadzadegan, F. et al., "A Multiple Classifier System for Classification of LIDAR Remote Sensing Data Using Multi-class SVM", International Workshop on Multiple Classifier Systems, MCS 2010, Lecture Notes in Computer Science, 2010, vol. 5997, pp. 254-263.
Notice of Reasons for Rejection on JP Appl. Ser. No. 2021-126516 dated Jun. 21, 2022 (16 pages).
Chester, David B., "A Parameterized Simulation of Doppler Lidar", All Graduate Theses and Dissertations, Dec. 2017, Issue 6794, retrieved from https://digitalcommons.usu.edu/etd/6794, pp. 13-14, 27-28, 45.
Korean Office Action issued in connection with KR Appl. Ser. No. 10-2021-7023519 dated Feb. 13, 2023.
Notice of Reasons for Rejection issued in connection with JP Appl. Ser. No. 2022-000212 dated Feb. 7, 2023.
Chinese Office Action issued in related CN Appl. Ser. No. 201780081804.0 dated Dec. 1, 2022 (20 pages).
Office Action issued in connection with JP Appl. No. 2022-569030 dated Aug. 22, 2023.
Stamatis et al., "3D automatic target recognition for future LIDAR missiles", IEEE Transactions on Aerospace and Electronic Systems, vol. 52, No. 6, Dec. 2016, pp. 2662-2675, doi: 10.1109/TAES.2016.150300.
International Search Report and Written Opinion issued in connection with PCT/US2023/011341 dated Feb. 16, 2024.
Office Action issued in connection with JP Appl. No. 2023-095571 dated Feb. 20, 2024.
Office Action issued in connection with CN Appl. No. 202180034633.2 dated Mar. 9, 2024.
Office Action issued in connection with CN Appl. No. 202310820368.1 dated Feb. 28, 2024.
Number | Date | Country
---|---|---
20230243977 A1 | Aug 2023 | US