Planar-beam, light detection and ranging system

Information

  • Patent Grant
  • Patent Number
    11,604,475
  • Date Filed
    Tuesday, February 9, 2021
  • Date Issued
    Tuesday, March 14, 2023
Abstract
A planar-beam, light detection and ranging (PLADAR) system can include a laser to output a laser beam and a collimator configured to collimate the laser beam axially to emit a planar beam from the laser. The PLADAR system can further include a detector to detect reflected light based on the planar beam being reflected from external surfaces of target objects.
Description
BACKGROUND

Light detection and ranging (LIDAR or LADAR) systems utilize a number of laser beams to detect reflectance or backscatter from the laser beams to map surface features or for remote sensing. For typical LIDAR systems, each beam is precisely configured with a dedicated photodetector that detects the reflectance and/or backscatter from that particular beam. As the beam count increases, so do the cost and space requirements for the individual lasers and photodetectors.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:



FIG. 1 is a diagram illustrating an example planar-beam, light detection and ranging (PLADAR) system, as described herein;



FIG. 2 is a block diagram illustrating an example autonomous vehicle (AV) including a PLADAR system, as described herein;



FIG. 3 is a flow chart describing an example method of processing PLADAR data, according to one or more examples described herein;



FIGS. 4A and 4B are flow charts describing example methods of configuring a PLADAR system, as described herein; and



FIG. 5 is a block diagram illustrating an example computing system upon which examples described herein may be implemented.





DETAILED DESCRIPTION

Current LIDAR technology involves fixed-beam LIDAR systems that include laser sources, scanners, optical systems (e.g., beam splitters and/or collimators), and photodetectors. For example, cutting-edge LIDAR systems can operate at pulse rates on the order of one million pulses per second, producing a detailed point cloud map of an autonomous vehicle's surroundings at ranges upwards of one hundred meters. These LIDAR systems require precision pulse sequencing for the laser beams for multiple reasons, such as power constraints, sampling and/or processing constraints, and the like. When using typical LIDAR systems for autonomous vehicles traveling on public roads, operational speed may be limited by the nature of the beam pattern produced by the LIDAR system. For example, in order to ensure safety for an autonomous vehicle traveling at low speeds over public roads, a LIDAR system may require several separate beams to readily detect potential hazards with sufficient granularity to decelerate, maneuver, and/or stop the autonomous vehicle accordingly. When the autonomous vehicle travels at high speeds (e.g., 60 mph, 75 mph, etc.), a fixed-beam LIDAR system may require well over seventy separate beams to achieve the same granularity for potential hazards and to safely react, decelerate, and/or stop the autonomous vehicle.


Increasing the number of fixed beams places additional requirements on a LIDAR system. For example, the LIDAR system will require more power, greater processing capability, larger or more sensitive photodetector and receiving equipment, constrained optics, and generally greater weight and more space. Furthermore, cost and waste quickly become an issue when increasing the number of fixed beams, since the beam pattern for the fixed-beam LIDAR system must be tuned for a maximum operational speed of the autonomous vehicle. If autonomous vehicles are to operate safely with LIDAR technology on public highways at high speed, then alternative arrangements may be necessary to avoid spiraling costs, wasted power, additional equipment, and increased processing requirements.


To address many of the shortcomings of fixed-beam LIDAR systems, a planar-beam, light detection and ranging (PLADAR) system is provided. The PLADAR system can include a laser scanner that emits a planar beam, and a detector array to detect reflected light (e.g., backscatter) from the planar beam. In some aspects, the laser scanner can include a collimation component that collimates a laser beam generated by the laser scanner into the planar beam. The laser scanner can utilize a single laser collimated on an axis (e.g., a vertical axis) to generate the planar beam, which can extend from the laser scanner approximately triangularly as opposed to linearly. According to certain implementations, the laser scanner can include a fiber laser that generates the laser beam for axial collimation. Fiber lasers can offer vibrational stability, ideal optical quality, and compact size, in addition to other desirable qualities. However, virtually any type of laser with appropriate emission characteristics may be used, such as certain types of gas lasers, excimer lasers, dye lasers, other forms of solid state lasers, semiconductor-based lasers, metal vapor lasers, etc., utilizing continuous-wave or pulsed emissions. For autonomous vehicle applications, wavelengths on the order of 1000 nanometers (nm) (e.g., 1200-1550 nm), corresponding to the near-infrared spectral range, may be optimal for health and safety reasons.
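By way of illustration only, the emitter options enumerated above can be summarized in a small configuration record. The following is a minimal Python sketch; the field names and defaults (a pulsed fiber laser at 1550 nm, collimated on the vertical axis) are assumptions drawn from the examples above, not requirements of the disclosure.

    from dataclasses import dataclass

    @dataclass
    class EmitterConfig:
        """Hypothetical parameter record for a planar-beam emitter."""
        laser_type: str = "fiber"           # per the stability and size notes above
        wavelength_nm: float = 1550.0       # within the 1200-1550 nm range cited above
        pulsed: bool = True                 # continuous-wave emission is the alternative
        collimation_axis: str = "vertical"  # single-axis collimation produces the beam plane

    print(EmitterConfig())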


In many examples, the detector array of the PLADAR system can include at least one set of photodetectors, such as one or more linear rows of photodetectors, which can be included on a circuit board of the PLADAR system. The circuit board can include a number of adjustment components for the photodetector array(s) to calibrate the photodetectors in concert. In some aspects, the PLADAR system can further include an adjustment controller to dynamically adjust the adjustment components to optimally configure the row(s) of photodetectors in response to a command signal. The command signal may be generated by a calibration system pre-implementation, or dynamically when the PLADAR system is in use. Additionally or alternatively, the adjustment components can be tuned manually by a user or technician when calibrating the photodetector array to the planar beam.


According to examples described herein, the PLADAR system can be implemented on an autonomous vehicle to provide sensor data to an on-board data processing system of the autonomous vehicle. The PLADAR system can include an analog-to-digital converter (ADC) chain coupled to the photodetector array. In certain implementations, the ADC chain can generate output from all of the photodetectors simultaneously, and the outputted data can be processed (e.g., by the on-board data processing system of the autonomous vehicle) accordingly. In such implementations, the pulse rate of the planar beam can be significantly reduced compared to fixed-beam LIDAR systems. For example, when fine granularity is desired, instead of transmitting one hundred or so beams (e.g., with ~0.15° beam spacing), examples described herein can transmit a single beam plane (or multiple beam planes) with the same or similar data quality at ~1/100th the pulse rate.
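To make the pulse-rate comparison concrete, the back-of-the-envelope sketch below (in Python) contrasts a hypothetical 100-beam fixed-beam system, which must pulse every beam to reach a target point throughput, with a single planar beam whose one pulse returns a sample from every detector in the row at once. The beam count and throughput are the illustrative figures quoted above.

    # Rough pulse-budget comparison (illustrative numbers only).
    FIXED_BEAM_COUNT = 100          # e.g., ~0.15 degree beam spacing, as noted above
    POINTS_PER_SECOND = 1_000_000   # point throughput on the order of the cited systems

    # Fixed-beam system: one pulse yields one range sample.
    fixed_beam_pulse_rate = POINTS_PER_SECOND

    # Planar-beam system: one pulse illuminates the whole plane, and every
    # detector in the row returns a sample simultaneously via the ADC chain.
    pladar_pulse_rate = POINTS_PER_SECOND / FIXED_BEAM_COUNT

    print(f"fixed-beam:  {fixed_beam_pulse_rate:,.0f} pulses/s")
    print(f"planar-beam: {pladar_pulse_rate:,.0f} pulses/s (~1/{FIXED_BEAM_COUNT}th the rate)")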


Among other benefits, the examples described herein achieve a technical effect of providing an alternative to increasingly expensive, complex, and tediously calibrated LIDAR systems. A PLADAR system can maintain or increase data quality while reducing cost and complexity, which are increasing concerns in autonomous vehicle technology and currently function as hindrances in the rollout of autonomous vehicles for common use.


As used herein, a PLADAR system implements remote sensing using planar beams as opposed to linear beams. “PLADAR” is used herein to represent any light detection and ranging system that uses two-dimensional beam planes for remote sensing.


As used herein, a computing device refers to devices corresponding to desktop computers, cellular devices or smartphones, personal digital assistants (PDAs), field programmable gate arrays (FPGAs), laptop computers, tablet devices, televisions (IP television), etc., that can provide network connectivity and processing resources for communicating with the system over a network. A computing device can also correspond to custom hardware, in-vehicle devices, or on-board computers, etc. The computing device can also operate a designated application configured to communicate with the network service.


One or more examples described herein provide that methods, techniques, and actions performed by a computing device are performed programmatically, or as a computer-implemented method. Programmatically, as used herein, means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources of the computing device. A programmatically performed step may or may not be automatic.


One or more examples described herein can be implemented using programmatic modules, engines, or components. A programmatic module, engine, or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs or machines.


Some examples described herein can generally require the use of computing devices, including processing and memory resources. For example, one or more examples described herein may be implemented, in whole or in part, on computing devices such as servers, desktop computers, cellular or smartphones, personal digital assistants (e.g., PDAs), laptop computers, printers, digital picture frames, network equipment (e.g., routers) and tablet devices. Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the performance of any method or with the implementation of any system).


Furthermore, one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium. Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples disclosed herein can be carried and/or executed. In particular, the numerous machines shown with examples of the invention include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smartphones, multifunctional devices or tablets), and magnetic memory. Computers, terminals, network enabled devices (e.g., mobile devices, such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, examples may be implemented in the form of computer-programs, or a computer usable carrier medium capable of carrying such a program.


System Description



FIG. 1 is a block diagram illustrating an example planar-beam, light detection and ranging (PLADAR) system, as described herein. The PLADAR system 100 can include a PLADAR scanner and optics 114 that generate a two-dimensional beam plane 118. The scanner/optics 114 can include a single laser source (or multiple laser sources) that generates a laser beam. In certain aspects, the laser source of the beam plane 118 can be a fiber laser emitting in the near- to mid-infrared spectral range. The optics of the scanner/optics 114 can include a number of mirrors and/or a collimation component that collimates the laser beam axially to form the beam plane 118. The collimation component can include a number of lenses, spatial filters, mirrors, fiber optics, and/or gratings, which can filter, amplify, and/or narrow the resultant planar beam 118. Accordingly, the collimation component of the scanner/optics 114 collimates the laser beam on a single axis to generate the beam plane 118.


In many aspects, the PLADAR system 100 can include a photodetector array 119 including a number of individual photodetectors. According to examples described herein, the photodetector array 119 can comprise a linear arrangement (e.g., one or more linear rows of photodetectors) to correlate with the beam plane 118 emitted by the PLADAR system 100. For example, the individual photodetectors can be included and calibrated on a circuit board to be aligned with the beam plane 118. Furthermore, the photodetector array 119 can include a number of adjustable components 123 (e.g., calibration screws) that can allow for straightforward calibration of the photodetector array 119 with the beam plane 118. In some aspects, the photodetector array 119 can include a sufficient number of individual photodetectors (e.g., tens to hundreds) for generating sensor data with sufficient granularity to detect any possible road hazards (e.g., objects with size on the order of feet or inches) for operating an autonomous vehicle on public roads. In such aspects, the photodetector array 119 can include as many photodetectors as current or future state-of-the-art fixed-beam LIDAR systems, or more.


In various implementations, the photodetector array 119 can include multiple linear arrangements of photodetectors, and/or interleaved detectors across the multiple linear arrangements. For example, the photodetector array 119 can include two or more lines of photodetectors to take advantage of the beam spread of the beam plane 118. In variations, the nature of the beam plane 118 can allow for any suitable arrangement for the photodetector array 119, such as separate staggered photodetector lines of varying lengths, a wider central arrangement, a narrower central arrangement, and the like. Thus, in addition to a single linear row of photodetectors, the photodetector array can include a plurality of photodetector rows aligned in a manner corresponding to the beam plane 118.


The photodetector array 119 detects reflection and/or backscatter 126 from the beam plane 118, and a timing component 135 is utilized to perform ranging operations for the PLADAR system 100. Accordingly, the PLADAR system 100 actively transmits the beam plane 118, light from the beam plane 118 is reflected off objects and surfaces, and this reflection/backscatter 126 is detected by the individual receivers of the photodetector array 119. Data from the detected light is precisely timed to perform the ranging operations (e.g., dynamic calculations of distance to each surface) and generate a dynamic point cloud map of the situational environment of the PLADAR system 100.
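The ranging operation itself reduces to time-of-flight arithmetic. Below is a minimal sketch, assuming the timing component 135 supplies an emission timestamp and a per-detector return timestamp in seconds; the function name and interface are hypothetical.

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def range_from_timing(t_emit_s: float, t_return_s: float) -> float:
        """Convert round-trip time of flight into one-way distance in meters."""
        return SPEED_OF_LIGHT_M_S * (t_return_s - t_emit_s) / 2.0

    # A return detected 400 ns after emission places the surface roughly 60 m away.
    print(round(range_from_timing(0.0, 400e-9), 1))  # 60.0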


Furthermore, typical LIDAR systems sequence the individual beams and detectors in order to decrease power and processing loads, which can constrain data quality. The PLADAR system 100 can utilize a single beam plane 118 at a single pulse rate, which may be increased or decreased accordingly depending on the situational environment (e.g., a crowded city environment, a rural road with little or no traffic, etc.). In certain aspects, PLADAR sensor data 129 from the photodetectors of the photodetector array 119 may be sampled simultaneously from all photodetectors by a local or external data processor 130. Thus, in some implementations, the data processor 130 can be included as a component of the PLADAR system 100. In other implementations, the data processor 130 may be remote, and/or can be included as a part of, for example, an on-board data processing system of an autonomous vehicle.


Each detector of the photodetector array 119 can include an analog-to-digital converter (ADC), which converts the detected light signal from the reflection/backscatter 126 of the beam plane 118 into a digital signal for processing. Accordingly, in many examples, the photodetector array 119 can include an ADC chain, similar to certain LIDAR systems. The combined data from the ADC chain (i.e., PLADAR sensor data 129) can be sampled by the data processor 130 (e.g., simultaneously or near-simultaneously for each pulse) to generate the point cloud map of the situational environment. Feedback 132 can be provided by the data processor 130 to a PLADAR controller 150, or adjustment controller, which can make adjustments to the configurable parameters of the PLADAR system 100.
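The per-pulse flow, one simultaneous sample per detector folded into the point cloud, can be pictured as in the sketch below. The NumPy representation, the fixed per-detector elevation angles, and the array shapes are assumptions made for illustration.

    import numpy as np

    def pulse_to_points(ranges_m: np.ndarray,
                        elevation_rad: np.ndarray,
                        azimuth_rad: float) -> np.ndarray:
        """Convert one pulse's simultaneous per-detector ranges into 3-D points.

        ranges_m:      (N,) ranges derived from the ADC chain, one per photodetector
        elevation_rad: (N,) fixed elevation angle of each detector along the beam plane
        azimuth_rad:   scanner azimuth at the time of the pulse
        """
        horizontal = ranges_m * np.cos(elevation_rad)
        x = horizontal * np.cos(azimuth_rad)
        y = horizontal * np.sin(azimuth_rad)
        z = ranges_m * np.sin(elevation_rad)
        return np.column_stack((x, y, z))

    # Example: an 8-detector row spanning a small vertical fan, scanner at 0.3 rad.
    points = pulse_to_points(np.full(8, 25.0), np.linspace(-0.2, 0.2, 8), 0.3)
    print(points.shape)  # (8, 3)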


According to examples described herein, the individual detectors of the photodetector array 119 can be adjusted in concert using the adjustable components 123. In certain examples, the photodetector array 119 can be pre-calibrated and aligned with the beam plane 118 during the manufacturing process. Additionally or alternatively, when a misalignment is detected (e.g., by the data processor 130), the photodetector array 119 may be manually calibrated during servicing. Additionally or alternatively still, the feedback 132 provided by the data processor 130 can indicate the misalignment, and can be processed by the PLADAR controller 150. The PLADAR controller 150 can determine a number of adjustments based on the feedback 132, and can utilize the adjustable components 123 to re-calibrate the photodetector array 119 automatically and on the fly.
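The closed loop described in this paragraph can be reduced to the skeleton below; the three callables stand in for whatever interfaces the data processor 130, the PLADAR controller 150, and the adjustable components 123 actually expose, and are hypothetical.

    def recalibration_cycle(sample_feedback, diagnose, apply_commands) -> bool:
        """One pass of the detect/diagnose/adjust loop sketched above."""
        feedback = sample_feedback()   # e.g., per-detector data-quality feedback 132
        commands = diagnose(feedback)  # map the feedback to adjustment commands
        if commands:
            apply_commands(commands)   # drive the adjustable components 123
        return bool(commands)

    # Toy usage with stand-in callables:
    adjusted = recalibration_cycle(
        sample_feedback=lambda: {"detector_7": "unreliable"},
        diagnose=lambda fb: ["nudge_alignment(+1)"] if fb else [],
        apply_commands=lambda cmds: print("applying:", cmds),
    )
    print(adjusted)  # True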


In certain aspects, the PLADAR controller 150 can further operate the PLADAR motor 120, which can, for example, control a rotational rate of the PLADAR system 100. The PLADAR controller 150 can further control the timing component 135 and the PLADAR scanner/optics 114 to increase or decrease the pulse rate when, for example, finer granularity in the generated point cloud is needed (e.g., in pedestrian-rich environments). Generally, however, the pulse rate of the beam plane 118 can be far lower (e.g., 100× lower) than that of typical LIDAR systems, since the PLADAR system 100 utilizes a single light source.


Other arrangements are contemplated. For example, the PLADAR scanner/optics 114 can generate the beam plane 118 along with one or more linear beams having dedicated detectors. As another example, the PLADAR scanner/optics 114 can generate multiple beam planes 118. The embodiment illustrated in FIG. 1 shows a rotational PLADAR system 100 operated by a PLADAR motor 120. However, example PLADAR systems 100 described herein can include a scanning motor that uses a beam plane 118 to scan a certain directional aspect (e.g., directly in front of an autonomous vehicle). Further, the beam plane 118 is collimated on a single axis, and can be collimated vertically, horizontally, or in a slanted manner (as shown), and can provide almost any desired vertical field of view (e.g., a 45° VFOV at 25 meters).
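For the field-of-view figure quoted above, simple trigonometry relates the vertical field of view to the height the beam plane spans at a given range; the short sketch below merely evaluates that relationship for the 45° example.

    import math

    def vertical_extent_m(vfov_deg: float, range_m: float) -> float:
        """Height spanned by a planar beam with the given VFOV at the given range."""
        return 2.0 * range_m * math.tan(math.radians(vfov_deg / 2.0))

    # A 45 degree VFOV at 25 meters spans roughly 20.7 m vertically.
    print(round(vertical_extent_m(45.0, 25.0), 1))  # 20.7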



FIG. 2 is a block diagram illustrating an example autonomous vehicle including a PLADAR system, as described herein. The PLADAR system 205 of the autonomous vehicle (AV) 200 can provide PLADAR data 202 to an on-board data processing system 210 of the autonomous vehicle 200. In some examples, the PLADAR system 205 can comprise a light source (e.g., a laser), a photodetector, scanner components (e.g., which can include one or more lens(es), mirror(s), motor(s), actuator(s), etc.), and circuitry to couple to various components of the autonomous vehicle 200. The data processing system 210 can utilize the PLADAR data 202 to detect the situational conditions of the autonomous vehicle 200 as the AV 200 travels along a current route. For example, the data processing system 210 can identify potential obstacles or road hazards—such as pedestrians, bicyclists, objects on the road, road cones, road signs, animals, etc.—in order to enable an AV control system 220 to react accordingly.


In certain implementations, the data processing system 210 can utilize sub-maps 233 stored in a database 230 of the autonomous vehicle 200 in order to perform localization and pose operations to determine a current location and orientation of the autonomous vehicle 200 in relation to a given region (e.g., a city). The sub-maps 233 can comprise previously recorded sensor data, such as stereo camera data, radar maps, and/or point cloud LIDAR maps that enable the data processing system 210 to compare the PLADAR data 202 from the PLADAR system 205 with a current sub-map 234 to identify such obstacles and potential road hazards in real time. The data processing system 210 can provide the processed sensor data 213—identifying such obstacles and road hazards—to the AV control system 220, which can react accordingly by operating the steering, braking, and acceleration systems 225 of the autonomous vehicle 200.
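As a toy illustration of the comparison step only, the sketch below diffs current measurements against a stored sub-map grid to surface candidate obstacles. Real localization and pose estimation are considerably more involved; the grid representation and the tolerance value are assumptions.

    import numpy as np

    def candidate_obstacles(current: np.ndarray,
                            sub_map: np.ndarray,
                            tolerance_m: float = 0.5) -> np.ndarray:
        """Boolean mask of cells where current data deviates from the sub-map.

        current, sub_map: 2-D grids of height (or range) values over the same area.
        """
        return np.abs(current - sub_map) > tolerance_m

    prior = np.zeros((4, 4))
    now = prior.copy()
    now[2, 1] = 0.9  # something ~0.9 m tall appeared where the map was empty
    print(np.argwhere(candidate_obstacles(now, prior)))  # [[2 1]]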


In some examples, the autonomous vehicle 200 further includes a number of stereo cameras 260 that generate dynamic image data 262 of the autonomous vehicle's 200 surroundings. For example, the autonomous vehicle 200 can include stereo cameras 260 with fields of view showing a 360° panorama (or forward and rearward directions) of the autonomous vehicle 200. The on-board data processing system 210 can further process the dynamic image data 262 to identify features or potential hazards along the current route traveled. The processed data 213 can include processed image data from the stereo cameras 260, which can be utilized by the AV control system 220 to perform low level maneuvering.


In many implementations, the AV control system 220 can receive a destination 219 from, for example, an interface system 215 of the autonomous vehicle 200. The interface system 215 can include any number of touch-screens, voice sensors, mapping resources, etc. that enable a passenger 239 to provide a passenger input 241 indicating the destination 219. For example, the passenger 239 can type the destination 219 into a mapping engine 275 of the autonomous vehicle 200, or can speak the destination 219 into the interface system 215. Additionally or alternatively, the interface system 215 can include a wireless communication module that can connect the autonomous vehicle 200 to a network 280 to communicate with a backend transport arrangement system 290 to receive invitations 282 to service a pick-up or drop-off request. Such invitations 282 can include the destination 219 (e.g., a pick-up location), and can be received by the autonomous vehicle 200 as a communication over the network 280 from the backend transport arrangement system 290. In many aspects, the backend transport arrangement system 290 can manage routes and/or facilitate transportation for users using a fleet of autonomous vehicles throughout a given region. The backend transport arrangement system 290 can be operative to facilitate passenger pick-ups and drop-offs to generally service pick-up requests, facilitate delivery of packages, food, or animals, and the like.


Based on the destination 219 (e.g., a pick-up location), the AV control system 220 can utilize the mapping engine 275 to receive route data 232 indicating a route to the destination 219. In variations, the mapping engine 275 can also generate map content 226 dynamically indicating the route traveled to the destination 219. The route data 232 and/or map content 226 can be utilized by the AV control system 220 to maneuver the autonomous vehicle 200 to the destination 219 along the selected route. For example, the AV control system 220 can dynamically generate control commands 221 for the autonomous vehicle's steering, braking, and acceleration systems 225 to actively drive the autonomous vehicle 200 to the destination 219 along the selected route. Optionally, the map content 226 showing the current route traveled can be streamed to the interior interface system 215 so that the passenger(s) 239 can view the route and route progress in real time.


In many examples, while the AV control system 220 operates the steering, braking, and acceleration systems 225 along the current route on a high level, the processed data 213 provided to the AV control system 220 can indicate low level occurrences, such as obstacles and potential hazards, to which the AV control system 220 can make decisions and react. For example, the processed data 213 can indicate a pedestrian crossing the road, traffic signals, stop signs, other vehicles, road conditions, traffic conditions, bicycle lanes, crosswalks, pedestrian activity (e.g., a crowded adjacent sidewalk), and the like. The AV control system 220 can respond to the processed data 213 by generating control commands 221 to reactively operate the steering, braking, and acceleration systems 225 accordingly.


According to examples described herein, the autonomous vehicle 200 can include a PLADAR controller 235 to receive feedback data 223 from the data processing system 210 in order to configure various adjustable parameters of the PLADAR system 205. The feedback data 223 can include information indicating data quality, such as errors or uncertainty in the data from certain individual photodetectors, which can be extrapolated by the PLADAR controller 235 to determine a number of adjustment commands 237 for the PLADAR system 205 that can correct the error(s). For example, the PLADAR controller 235 can identify a pattern in the feedback data 223 indicating a misalignment of the photodetector array with respect to the PLADAR beam 207. The PLADAR controller 235 can identify the misalignment and generate the adjustment commands 237 for execution on the adjustable components of the photodetector array to re-calibrate the PLADAR system 205. As discussed herein, the adjustment commands 237 can be executed on the adjustable components dynamically as the autonomous vehicle 200 travels along a current route, or during garage servicing of the autonomous vehicle 200.


Additionally or alternatively, the feedback data 223 can include requests from the data processing system 210 for the PLADAR controller 235 to configure the PLADAR system 205 for increased or decreased granularity. For example, the data processing system 210 can identify a substantial decrease in potential hazards (e.g., when the autonomous vehicle 200 leaves a city and enters open rural road with little traffic). The feedback data 223 can include a request to save power in such conditions by decreasing the pulse rate and/or scan rate of the PLADAR system 205. Accordingly, in some aspects, the adjustment commands 237 can be generated by the PLADAR controller 235 to adjust a rotational parameter 209 (e.g., decrease a rotational rate) and/or decrease the pulse rate of the PLADAR beam 207—thereby enabling a decrease in sample rate by the data processing system 210.


Conversely, the on-board data processing system 210 can identify an increase in potential hazards (e.g., entering an area of increased pedestrian activity) or an increased probability of experiencing hazards (e.g., when traveling at high speeds), and request that the PLADAR controller 235 generate adjustment commands 237 to increase the sample rate. Such commands 237 can be executed on the adjustable parameters of the PLADAR system 205 to increase a pulse rate of the PLADAR beam 207 and/or increase the rotational rate, thereby enabling the data processing system 210 to increase the sample rate and bolster point cloud granularity.
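A rate policy of the kind described in the preceding two paragraphs might look like the following sketch; the hazard-density metric, the speed thresholds, and the base rates are all invented for illustration.

    from typing import Tuple

    def target_rates(hazards_per_100m: float,
                     speed_mph: float,
                     base_pulse_hz: float = 10_000.0,
                     base_scan_hz: float = 10.0) -> Tuple[float, float]:
        """Scale pulse and scan rates with situational demand (illustrative policy)."""
        if hazards_per_100m > 5.0 or speed_mph > 55.0:
            factor = 2.0   # crowded or fast: bolster point cloud granularity
        elif hazards_per_100m < 0.5 and speed_mph < 30.0:
            factor = 0.5   # open rural road: save power and processing
        else:
            factor = 1.0
        return base_pulse_hz * factor, base_scan_hz * factor

    print(target_rates(hazards_per_100m=0.2, speed_mph=25.0))  # (5000.0, 5.0)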


Methodology



FIG. 3 is a flow chart describing an example method of processing PLADAR data, according to one or more examples described herein. In the below description of FIG. 3, reference may be made to like features represented by reference characters from FIGS. 1 and 2. Furthermore, the method described with respect to FIG. 3 may be performed by an example data processor 130 shown and described with respect to FIG. 1, or the on-board data processing system 210 shown and described with respect to FIG. 2. Referring to FIG. 3, the data processor 130 can sample data from each detector of the PLADAR system 100 simultaneously (300). For example, the data processor 130 can monitor each ADC of an ADC chain coupled to the photodetector array 119. For each beam plane 118 pulse, return light signals (e.g., reflection/backscatter 126) can be received by the photodetector array 119. Each detector can include an ADC that converts the detected light signal into a digital signal, and a timing component 135 can be utilized by the data processor 130 to precisely perform ranging for each ADC of the ADC chain and for every beam plane 118 pulse.


The data processor 130 (e.g., of an autonomous vehicle 200) can process the PLADAR data 129 to perform ranging and identify potential hazards (305). For example, the data processor 130 can be programmed to identify aspects of the autonomous vehicle's situational environment that the autonomous vehicle 200 must account for in order to operate safely on public roads. Such aspects can include pedestrians, bicyclists, hazardous objects on the road (e.g., rocks), stop lights, signs, other vehicles, and the like. The data processor 130 can identify such aspects by, for example, comparing the PLADAR data 129 to a stored sub-map including prerecorded data on the same current route, as described with respect to FIG. 2. In many implementations, the data processor 130 transmits the processed sensor data 213 to an AV control system 220, which can control the autonomous vehicle's 200 steering, braking, and acceleration systems 225 to make decisions and react to each processed object for low level maneuvering (310). Additionally, the AV control system 220 can further utilize dynamic image data 262 from a stereo camera system 260 of the autonomous vehicle 200 for low level maneuvering.


In certain implementations, the data processor 130 can identify, in the PLADAR data 129, a misalignment between the photodetector array 119 and the beam plane 118 (315). As an example, the PLADAR data 129 can indicate unreliable data for the top detectors and the bottom detectors, which can indicate a diagonal misalignment of the photodetector array 119. In some aspects, the data processor 130 can determine the nature of the misalignment based on the sampled PLADAR data 129. In other aspects, the data processor 130 can generally identify the error in the data, and generate feedback 132 requesting the PLADAR controller 150 to perform a diagnostics test. In either aspect, the data processor 130 can generate feedback 132 indicating the misalignment (320), and transmit the feedback to the PLADAR controller 150 to re-calibrate the photodetector array 119 to the planar beam 118 (325).
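A pattern check of the kind just described, treating unreliable detectors at both ends of the row as evidence of a diagonal misalignment, could be sketched as follows; the per-detector reliability flags are an assumed input format.

    from typing import Sequence

    def classify_misalignment(reliable: Sequence[bool]) -> str:
        """Classify array misalignment from per-detector reliability (top detector first)."""
        edge = max(len(reliable) // 4, 1)
        top_bad = not all(reliable[:edge])
        bottom_bad = not all(reliable[-edge:])
        if top_bad and bottom_bad:
            return "diagonal"   # both ends unreliable, as in the example above
        if top_bad:
            return "tilt toward top"
        if bottom_bad:
            return "tilt toward bottom"
        return "aligned"

    # Top and bottom detectors unreliable, middle fine -> diagonal misalignment.
    print(classify_misalignment([False, True, True, True, True, True, True, False]))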


According to some examples, the data processor 130 can determine a condition change in the situational environment of the autonomous vehicle 200 (330). As an example, the data processor 130 can identify that the autonomous vehicle 200 is traveling at higher speeds, and that more detailed data from a forward direction of the autonomous vehicle 200 is desired. The data processor 130 can generate a request to adjust the PLADAR system 100 configurations to, for example, increase a pulse rate, scan rate, detector sensitivity, and/or a laser intensity to increase the data quality (335). Conversely, to optimize power and processing resources, in certain circumstances (e.g., low speed operation), the data processor 130 can generate a request to decrease such configurable parameters when situational conditions are conducive to such decreases (335). These requests can be transmitted to the PLADAR controller 150 (340), which can execute adjustment commands 237 on the configurable components of the PLADAR system 100 accordingly.



FIGS. 4A and 4B are flow charts describing example methods of configuring a PLADAR system, as described herein. In the below description of FIGS. 4A and 4B, reference may be made to like features represented by reference characters from FIGS. 1 and 2. Furthermore, the methods described with respect to FIGS. 4A and 4B may be performed by an example PLADAR controller 150, 235 shown and described with respect to FIGS. 1 and 2. Referring to FIG. 4A, the PLADAR controller 150 can receive feedback 132 from the data processor 130 indicating a misalignment (400). In some examples, the feedback 132 identifies the specific misalignment (e.g., leftward, rightward, upward, downward, clockwise or counterclockwise diagonal misalignments, or any combination of the foregoing).


In other examples, the feedback 132 can include samplings of the PLADAR data 129, which the PLADAR controller 150 can analyze to identify a data pattern that describes or details the misalignment (405). For example, data from the photodetector array 119 can indicate a consistent pattern of bad or unreliable data from any number of individual detectors in the array 119. In some situations, the PLADAR controller 150 can perform an initial set of adjustments on the adjustable components 123 to diagnose the misalignment. In other situations, the misalignment may be readily identified by the PLADAR controller 150, and the calibration can be made directly. Accordingly, once the precise misalignment is identified, the PLADAR controller 150 can generate and execute adjustment commands 237 on the adjustable components 123 of the photodetector array 119 to realign or re-calibrate the photodetector array 119 to the beam plane 118 (410).


Referring to FIG. 4B, the PLADAR controller 150 can receive a request from the data processor 130 to adjust PLADAR system 100 configurations (450). For example, based on changing situational conditions (e.g., changing weather such as rain or snow, changing speed, changing environmental complexity or potential hazard count, etc.), the data processor 130 can determine that an increased or decreased pulse rate (451) and/or scan rate (452) is preferable. Additionally or alternatively, the data processor 130 may determine that conditions require an increase in laser intensity (453) to enhance reflectance, or an increase in detector sensitivity (454). Alternatively, the data processor 130 may determine that conditions are conducive to power savings (e.g., in low speed uncrowded situations), and may request to decrease such configurations.


In any case, the PLADAR controller 150 can generate adjustment commands 237 based on the requests from the data processor 130 (455). The PLADAR controller 150 can then execute the adjustment commands 237 on the relevant components of the PLADAR system 100 to configure the PLADAR system 100 accordingly (460). For example, the PLADAR controller 150 can execute commands 237 on the PLADAR motor 120 to increase or decrease the scan rate (461). As another example, the PLADAR controller 150 can execute commands 237 on the timing component 135 to increase or decrease a pulse rate of the laser (462). Further, the PLADAR controller 150 can execute commands 237 on the laser source itself to increase or decrease laser intensity (e.g., increase or decrease power or beam frequency) (463). Still further, in some implementations, the PLADAR controller 150 can execute commands 237 on the detector array 119 to increase or decrease detector sensitivity (464).
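The routing described in this paragraph, sending each request to the component that can act on it, is sketched below; the component class and its setter are hypothetical stand-ins rather than an actual PLADAR interface.

    class Component:
        """Stand-in for a configurable PLADAR component."""
        def __init__(self, name: str):
            self.name = name

        def set(self, value: float) -> None:
            print(f"{self.name} <- {value}")

    def execute(requests: dict, components: dict) -> None:
        """Route each adjustment request (451-454) to its component (461-464)."""
        routing = {
            "scan_rate": "motor",                 # PLADAR motor 120
            "pulse_rate": "timing",               # timing component 135
            "laser_intensity": "laser",           # the laser source itself
            "detector_sensitivity": "detectors",  # detector array 119
        }
        for request, value in requests.items():
            components[routing[request]].set(value)

    parts = {name: Component(name) for name in ("motor", "timing", "laser", "detectors")}
    execute({"pulse_rate": 20_000.0, "scan_rate": 15.0}, parts)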


While the data processor 130 and PLADAR controller 150 are shown as separate components in FIGS. 1 and 2, it is contemplated that certain embodiments can include a single component (e.g., one or more blade computers of an autonomous vehicle 200 that perform all of the operations described with respect to FIG. 3 and FIGS. 4A and 4B).


Hardware Diagram



FIG. 5 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. A computer system 500 can be implemented on, for example, a server or combination of servers. For example, the computer system 500 may be implemented as part of the data processor 130, which itself may be implemented as a part of the AV's on-board data processing system 210. In the context of FIG. 1, the data processor 130 may be implemented with the PLADAR controller 150 as a single computer system 500, or using a combination of multiple computer systems as described in connection with FIG. 5.


In one implementation, the computer system 500 includes processing resources 510, a main memory 520, a read-only memory (ROM) 530, a storage device 540, and a communication interface 550. The computer system 500 includes at least one processor 510 for processing information, and the main memory 520, such as a random access memory (RAM) or other dynamic storage device, stores information and instructions which are executable by the processor 510. The main memory 520 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 510. The computer system 500 may also include the ROM 530 or other static storage device for storing static information and instructions for the processor 510. The storage device 540, such as a magnetic disk or optical disk, is provided for storing information and instructions.


The communication interface 550 enables the computer system 500 to communicate with the PLADAR system 580 over a network link (e.g., a wireless or wired link). In accordance with examples, the computer system 500 receives PLADAR data 582 from the PLADAR system 580. The executable instructions stored in the main memory 520 can include configuration instructions 522, which the processor 510 executes to generate a set of adjustment commands 554 to configure the adjustable parameters of the autonomous vehicle's PLADAR system 580 based on the PLADAR data 582 and the situational conditions of the autonomous vehicle 200.


The processor 510 is configured with software and/or other logic to perform one or more processes, steps and other functions described with implementations, such as described by FIGS. 1 through 4B, and elsewhere in the present application.


Examples described herein are related to the use of the computer system 500 for implementing the techniques described herein. According to one example, those techniques are performed by the computer system 500 in response to the processor 510 executing one or more sequences of one or more instructions contained in the main memory 520. Such instructions may be read into the main memory 520 from another machine-readable medium, such as the storage device 540. Execution of the sequences of instructions contained in the main memory 520 causes the processor 510 to perform the process steps described herein. In alternative implementations, hard-wired circuitry may be used in place of or in combination with software instructions to implement examples described herein. Thus, the examples described are not limited to any specific combination of hardware circuitry and software.


It is contemplated for examples described herein to extend to individual elements and concepts described herein, independently of other concepts, ideas or systems, as well as for examples to include combinations of elements recited anywhere in this application. Although examples are described in detail herein with reference to the accompanying drawings, it is to be understood that the concepts are not limited to those precise examples. As such, many modifications and variations will be apparent to practitioners skilled in this art. Accordingly, it is intended that the scope of the concepts be defined by the following claims and their equivalents. Furthermore, it is contemplated that a particular feature described either individually or as part of an example can be combined with other individually described features, or parts of other examples, even if the other features and examples make no mention of the particular feature. Thus, the absence of describing combinations should not preclude claiming rights to such combinations.

Claims
  • 1. A light detection and ranging (LIDAR) system comprising: a laser configured to output a first laser beam at a first angle; a collimator configured to collimate the first laser beam on an axis to emit a second laser beam at a second angle along a beam plane, the second angle being different than the first angle; and a detector array configured to detect return light from the second laser beam, the return light reflecting off objects, the detector array comprising a plurality of photodetectors arranged on a circuit board; and one or more processors configured to: obtain data indicative of the return light from one or more of the photodetectors; determine a misalignment between the beam plane and one or more of the photodetectors based, at least in part, on the data; and generate one or more calibration parameters based on the misalignment, the one or more calibration parameters associated with adjusting a position of at least one of the photodetectors.
  • 2. The LIDAR system of claim 1, wherein the laser comprises a fiber laser that outputs the first laser beam.
  • 3. The LIDAR system of claim 1, wherein the collimator comprises one or more mirrors that produce the second laser beam from the first laser beam.
  • 4. The LIDAR system of claim 1, wherein the collimator comprises one or more lenses that produce the second laser beam from the first laser beam.
  • 5. The LIDAR system of claim 1, wherein the plurality of photodetectors include at least one row of photodetectors.
  • 6. The LIDAR system of claim 5, further comprising: an analog to digital converter (ADC) chain coupled to the at least one row of photodetectors.
  • 7. The LIDAR system of claim 6, wherein the one or more processors are configured to sample data from each ADC of the ADC chain simultaneously to obtain the data indicative of the return light.
  • 8. The LIDAR system of claim 1, further comprising: a plurality of adjustment components disposed on the circuit board, the plurality of adjustment components configured to adjust the position of one or more of the photodetectors.
  • 9. The LIDAR system of claim 8, wherein at least one adjustment component of the plurality of adjustment components comprises a screw.
  • 10. The LIDAR system of claim 8, wherein the one or more processors are configured to provide a command signal to one or more of the plurality of adjustment components, the command signal associated with adjusting the position of the at least one of the photodetectors.
  • 11. The LIDAR system of claim 1, wherein the beam plane comprises a two dimensional beam plane along which the second laser beam is emitted.
  • 12. The LIDAR system of claim 1, further comprising: an electric motor configured to rotate the detector array about an axis.
  • 13. The LIDAR system of claim 1, wherein the one or more processors are further configured to: provide one or more control signals associated with adjusting a pulse rate of the first laser beam.
  • 14. An autonomous vehicle comprising: a light detection and ranging (LIDAR) system, the LIDAR system comprising: a laser configured to output a first laser beam at a first angle; a collimator configured to collimate the first laser beam on an axis to emit a second laser beam at a second angle along a beam plane, the second angle being different than the first angle; and a detector array configured to detect return light from the second laser beam, the return light reflecting off objects, the detector array comprising a plurality of photodetectors arranged on a circuit board; and one or more processors configured to: obtain data indicative of the return light from one or more of the photodetectors; determine a misalignment between the beam plane and one or more of the photodetectors based, at least in part, on the data; and generate one or more calibration parameters based on the misalignment, the one or more calibration parameters associated with adjusting a position of at least one of the photodetectors.
  • 15. The autonomous vehicle of claim 14, further comprising: acceleration, braking, and steering systems; and a control system to autonomously operate the acceleration, braking, and steering systems.
  • 16. The autonomous vehicle of claim 14, further comprising: a control system that processes sensor data from the LIDAR system to dynamically identify features as the autonomous vehicle accelerates and maneuvers.
  • 17. An autonomous vehicle control system comprising: a light detection and ranging (LIDAR) system, the LIDAR system comprising: a laser configured to output a first laser beam at a first angle; a collimator configured to collimate the first laser beam on an axis to emit a second laser beam at a second angle along a beam plane, the second angle being different than the first angle; and a detector array configured to detect return light from the second laser beam, the return light reflecting off objects, the detector array comprising a plurality of photodetectors arranged on a circuit board; and one or more processors configured to: obtain data indicative of the return light from one or more of the photodetectors; determine a misalignment between the beam plane and one or more of the photodetectors based, at least in part, on the data; and generate one or more calibration parameters based on the misalignment, the one or more calibration parameters associated with adjusting a position of at least one of the photodetectors.
  • 18. The autonomous vehicle control system of claim 17, wherein the LIDAR system further comprises: a plurality of adjustment components disposed on the circuit board, the plurality of adjustment components configured to adjust the position of one or more of the photodetectors.
  • 19. The autonomous vehicle control system of claim 18, wherein the one or more processors are configured to provide a command signal to one or more of the plurality of adjustment components, the command signal associated with adjusting the position of the at least one of the photodetectors.
  • 20. The autonomous vehicle control system of claim 17, wherein the collimator comprises one or more mirrors that produce the second laser beam from the first laser beam.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/242,657, titled “PLANAR-BEAM, LIGHT DETECTION AND RANGING SYSTEM”, and filed on Jan. 8, 2019; which is a continuation of U.S. patent application Ser. No. 15/446,953, titled “PLANAR-BEAM, LIGHT DETECTION AND RANGING SYSTEM”, and filed on Mar. 1, 2017; which claims the benefit of priority to U.S. Provisional Application No. 62/303,013, titled “PLANAR-BEAM, LIGHT DETECTION AND RANGING SYSTEM,” and filed on Mar. 3, 2016; the aforementioned applications being hereby incorporated by reference in their respective entireties.

US Referenced Citations (171)
Number Name Date Kind
4567347 Ito Jan 1986 A
4706773 Reinaud Nov 1987 A
5012745 Yamashita May 1991 A
5170458 Aoyagi Dec 1992 A
5590604 Lund Jan 1997 A
5598783 Lund Feb 1997 A
5726647 Waffler Mar 1998 A
6122040 Arita Sep 2000 A
6434302 Fidric Aug 2002 B1
6657705 Sano Dec 2003 B2
6827265 Knowles Dec 2004 B2
6860350 Beuhler Mar 2005 B2
6956227 Miyazaki Oct 2005 B2
6957775 Tsikos Oct 2005 B2
7064817 Schmitt et al. Jun 2006 B1
7248342 Degnan Jul 2007 B1
7406220 Christensen Jul 2008 B1
7532311 Henderson May 2009 B2
7609922 Aoki Oct 2009 B2
7650239 Samukawa Jan 2010 B2
7961301 Earhart Jun 2011 B2
7993005 Peterson Aug 2011 B2
8208716 Choi Jun 2012 B2
8344940 Jeong Jan 2013 B2
8610881 Gammenthaler Dec 2013 B2
8692980 Gilliland Apr 2014 B2
8736818 Weimer May 2014 B2
8767186 Lu Jul 2014 B2
8797828 Lev Aug 2014 B1
8899903 Saad Dec 2014 B1
8909375 Larson Dec 2014 B2
8964028 Oggier Feb 2015 B2
8994581 Brown Mar 2015 B1
8996224 Louie Mar 2015 B1
9086273 Gruver Jul 2015 B1
9126595 Seo Sep 2015 B2
9201424 Ogale Dec 2015 B1
9224053 Ferguson Dec 2015 B1
9228833 Zeng Jan 2016 B2
9231998 Lu Jan 2016 B2
9247211 Zhang Jan 2016 B2
9328526 Shani May 2016 B2
9360554 Retterath Jun 2016 B2
9383753 Templeton Jul 2016 B1
9453914 Stettner Sep 2016 B2
9454150 Uehara Sep 2016 B2
9529079 Droz Dec 2016 B1
9625582 Gruver Apr 2017 B2
9637118 Yokota May 2017 B2
9669827 Ferguson Jun 2017 B1
9696722 Ulrich Jul 2017 B1
9719801 Ferguson Aug 2017 B1
9720415 Levinson Aug 2017 B2
9746323 Wood Aug 2017 B2
9804264 Villeneuve Oct 2017 B2
9823353 Eichenholz Nov 2017 B2
9840256 Valois Dec 2017 B1
9841495 Campbell Dec 2017 B2
9841763 Valois Dec 2017 B1
9857468 Eichenholz Jan 2018 B1
9869753 Eldada Jan 2018 B2
9869754 Campbell Jan 2018 B1
9874635 Eichenholz Jan 2018 B1
9880263 Droz Jan 2018 B2
9897687 Campbell Feb 2018 B1
10330922 Yasuda Jun 2019 B2
10338225 Boemke Jul 2019 B2
20020072869 Stiller Jun 2002 A1
20020135468 Bos Sep 2002 A1
20020145042 Knowles Oct 2002 A1
20020196424 Sano Dec 2002 A1
20030019931 Tsikos Jan 2003 A1
20030042303 Tsikos Mar 2003 A1
20030043058 Jamieson Mar 2003 A1
20030052169 Tsikos Mar 2003 A1
20040030474 Samuel Feb 2004 A1
20040148093 Tanaka Jul 2004 A1
20040174537 Ferger Sep 2004 A1
20050095092 Segal May 2005 A1
20050185846 Luo Aug 2005 A1
20050196015 Luo Sep 2005 A1
20050196035 Luo Sep 2005 A1
20060089765 Pack Apr 2006 A1
20060149134 Soper Jul 2006 A1
20060158423 Kern Jul 2006 A1
20060227317 Henderson Oct 2006 A1
20070200064 Remillard Aug 2007 A1
20070212006 Wysocki Sep 2007 A1
20070219720 Trepagnier Sep 2007 A1
20070237449 Aoki Oct 2007 A1
20080002427 Kropac Jan 2008 A1
20080039991 May Feb 2008 A1
20080112028 Peterson May 2008 A1
20080136626 Hudson Jun 2008 A1
20080161986 Breed Jul 2008 A1
20080215184 Choi Sep 2008 A1
20090103577 Maeno Apr 2009 A1
20090115994 Stettner May 2009 A1
20090312906 Bauer Dec 2009 A1
20090317035 Aoki Dec 2009 A1
20090319112 Fregene Dec 2009 A1
20100013615 Hebert Jan 2010 A1
20100020306 Hall Jan 2010 A1
20100053715 O'Neill Mar 2010 A1
20100110192 Johnston May 2010 A1
20100165323 Fiess Jul 2010 A1
20100182587 Fluckiger Jul 2010 A1
20100185335 Barwick Jul 2010 A1
20100185353 Barwick Jul 2010 A1
20100194890 Weller Aug 2010 A1
20100208034 Chen Aug 2010 A1
20100208244 Earhart Aug 2010 A1
20110050855 Nobis Mar 2011 A1
20110134249 Wood Jun 2011 A1
20110166757 Otanez Jul 2011 A1
20110184605 Neff Jul 2011 A1
20110241845 Sullivan Oct 2011 A1
20110245964 Sullivan Oct 2011 A1
20110301786 Allis Dec 2011 A1
20110317993 Weissler Dec 2011 A1
20120008129 Lu Jan 2012 A1
20120033196 Vanek Feb 2012 A1
20120038903 Weimer Feb 2012 A1
20120140061 Zeng Jun 2012 A1
20120154785 Gilliland Jun 2012 A1
20120239238 Harvey Sep 2012 A1
20130078063 Shanii Mar 2013 A1
20130166105 Wastel Jun 2013 A1
20130190963 Kuss Jul 2013 A1
20130226431 Lu Aug 2013 A1
20130317649 Larson Nov 2013 A1
20140041966 Healy Feb 2014 A1
20140111812 Baeg Apr 2014 A1
20140136414 Anker May 2014 A1
20140168631 Haslim Jun 2014 A1
20140176933 Haslim Jun 2014 A1
20140231647 Chinn Aug 2014 A1
20140277691 Jacobus Sep 2014 A1
20140286744 Shani Sep 2014 A1
20150025731 Uehara Jan 2015 A1
20150091374 Lenius Apr 2015 A1
20150202939 Stettner Jul 2015 A1
20150293225 Riley Oct 2015 A1
20150293228 Retterath Oct 2015 A1
20150334269 Yokota Nov 2015 A1
20160003946 Gilliland Jan 2016 A1
20160129917 Gariepy May 2016 A1
20160167514 Nishizaki Jun 2016 A1
20160170202 Yasuda Jun 2016 A1
20160170487 Saisho Jun 2016 A1
20160223671 Thayer Aug 2016 A1
20160227193 Osterwood Aug 2016 A1
20160231746 Hazelton Aug 2016 A1
20160282468 Gruver Sep 2016 A1
20160291134 Droz Oct 2016 A1
20160318415 Salasoo Nov 2016 A1
20160349746 Grau Dec 2016 A1
20170096138 Reiff Apr 2017 A1
20170146639 Carothers May 2017 A1
20170155225 Villenueve Jun 2017 A1
20170168146 Boehmke Jun 2017 A1
20170184399 Thayer Jun 2017 A1
20170226765 Wastel Aug 2017 A1
20170255199 Boehmke Sep 2017 A1
20170357260 Gilliland Dec 2017 A1
20180070804 Tesar Mar 2018 A1
20180149732 Droz May 2018 A1
20180329067 Boehmke Nov 2018 A1
20190146505 Boehmke May 2019 A1
20190235054 Ratner Aug 2019 A1
20210402477 Miyashita Dec 2021 A1
Foreign Referenced Citations (33)
Number Date Country
3027331 Feb 2017 CA
102230962 Nov 2011 CN
102494609 Jun 2012 CN
101959022 Nov 2012 CN
103152518 Jun 2013 CN
103179339 Jun 2013 CN
203353019 Dec 2013 CN
104133473 Nov 2014 CN
103001428 May 2015 CN
102857592 Dec 2015 CN
204990755 Jan 2016 CN
103146355 Feb 2016 CN
106313078 Jan 2017 CN
207908673 Sep 2018 CN
10244638 Apr 2004 DE
0185816 Jul 1986 EP
1808711 Jul 2007 EP
2500748 Sep 2012 EP
2940489 Nov 2015 EP
2767846 Feb 2017 EP
1816514 Aug 2017 EP
H09163197 Dec 1998 JP
H09326032 Jun 1999 JP
2011088623 Apr 2001 JP
2005024463 Jan 2005 JP
2007249632 Sep 2007 JP
2008104254 May 2008 JP
2011123078 Dec 2010 JP
2493988 Sep 2013 RU
2012154453 Jun 2014 RU
2014146890 Jun 2016 RU
WO 2011104706 Sep 2011 WO
PCTJP2014053408 Mar 2014 WO
Non-Patent Literature Citations (5)
Entry
Extended European Search Report for Application No. EP21208121, dated Feb. 18, 2022, 8 pages.
Extended European Search Report for EP17760835.3, dated May 2, 2019.
International Search Report and Written Opinion for PCT/US2017/020483 dated Jun. 29, 2017.
International Preliminary Report on Patentability for PCT/US2017/020483 dated Sep. 13, 2018.
Partial European Search Report in EP17760835.3, dated Jan. 24, 2019.
Related Publications (1)
Number Date Country
20210181746 A1 Jun 2021 US
Provisional Applications (1)
Number Date Country
62303013 Mar 2016 US
Continuations (2)
Number Date Country
Parent 16242657 Jan 2019 US
Child 17171660 US
Parent 15446953 Mar 2017 US
Child 16242657 US