The subject disclosure relates to Lidar systems and, in particular, to a method for depth imaging using a Lidar system having an optical quadrant detector.
A Lidar system can be used in a vehicle in order to aid in navigation of the vehicle. Often the Lidar system includes a mechanical system for scanning light across a field of view, so the resolution of the resulting images is limited by the scanning rate of the mechanical system. Such systems also generally require two-dimensional arrays of sensors, whose number of sensing pixels limits the system resolution. Additionally, such systems usually require relatively long pulses of light, raising a concern that such pulses may approach or exceed eye-safety limitations. Accordingly, it is desirable to provide a Lidar system for determining depth and angular parameters for targets in a field of view without the use of mechanical scanning devices and with reduced pulse duration and power.
In one exemplary embodiment, a method of imaging a field of interest is disclosed. The method includes illuminating, via a laser, a field of interest with a source pulse of light, receiving, at a quadrant detector, a reflected pulse that is a reflection of the source pulse from the field of interest, and determining a three-dimensional image of the field of interest from a location of the reflected pulse at the quadrant detector and a time-of-flight for the reflected pulse.
In addition to one or more of the features described herein, the method further includes sampling the reflected pulse a plurality of times at the quadrant detector to determine a parameter of the field of interest at a plurality of depths within the field of interest. The method further includes determining an angular location of a target within the field of interest from the location of the reflected pulse at the quadrant detector. The method further includes determining the location of the reflected pulse at the quadrant detector by comparing light intensities at quadrants of the quadrant detector. The method further includes determining a depth of a target within the field of interest from a time of flight associated with the reflected pulse. The method further includes synchronizing the laser with the quadrant detector. The method further includes navigating a vehicle through the field of interest using the three-dimensional image.
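By way of illustration only, the following sketch shows one way the above steps could be sequenced in software. The Sample structure and the fire_pulse and read_samples callables are hypothetical placeholders for hardware drivers and are not part of this disclosure; the centroid arithmetic anticipates Eqs. (1) through (4) presented later in this description.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One detector sample of the reflected pulse (hypothetical structure)."""
    i1: float  # photocurrent, quadrant Q1
    i2: float  # photocurrent, quadrant Q2
    i3: float  # photocurrent, quadrant Q3
    i4: float  # photocurrent, quadrant Q4
    t: float   # sample timestamp in seconds

def image_field(fire_pulse, read_samples):
    """Illuminate, receive, and locate: one pulse of the disclosed method."""
    t0 = fire_pulse()                  # laser fires, synchronized with the detector
    points = []
    for s in read_samples():           # sample the reflected pulse many times
        tof = s.t - t0                 # time of flight gives depth (Eq. (5) below)
        total = s.i1 + s.i2 + s.i3 + s.i4
        x = ((s.i1 + s.i4) - (s.i2 + s.i3)) / total  # angular location, x
        y = ((s.i1 + s.i2) - (s.i3 + s.i4)) / total  # angular location, y
        points.append((x, y, tof))
    return points                      # one pulse's contribution to the 3D image
```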
In another exemplary embodiment, a Lidar system is disclosed. The Lidar system includes a laser, a quadrant detector and a processor. The laser is configured to illuminate a field of interest with a source pulse of light. The quadrant detector is configured to receive a reflected pulse that is a reflection of the source pulse from the field of interest. The processor is configured to determine a three-dimensional image of the field of interest from a location of the reflected pulse at the quadrant detector and a time-of-flight for the reflected pulse.
In addition to one or more of the features described herein, the quadrant detector samples the reflected pulse a plurality of times to determine a parameter of the field of interest at a plurality of depths within the field of interest. The processor is further configured to determine an angular location of a target within the field of interest from the location of the reflected pulse at the quadrant detector. The processor is further configured to determine the location of the reflected pulse at the quadrant detector by comparing light intensities at the quadrants of the quadrant detector. The processor is further configured to determine a depth of a target within the field of interest from a time of flight associated with the reflected pulse. A laser driver synchronizes the laser with the quadrant detector. A spatial modulator is configured to filter out signals arising from two or more targets that are at a same distance from the quadrant detector and that are angularly distinguishable.
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes a laser, a quadrant detector and a processor. The laser is configured to illuminate a field of interest with a source pulse of light. The quadrant detector is configured to receive a reflected pulse that is a reflection of the source pulse from the field of interest. The processor is configured to determine a three-dimensional image of the field of interest from a location of the reflected pulse at the quadrant detector and a time-of-flight for the reflected pulse, and navigate the vehicle through the field of interest using the three-dimensional image.
In addition to one or more of the features described herein, the quadrant detector samples the reflected pulse a plurality of times to determine a parameter of the field of interest at a plurality of depths within the field of interest. The processor is further configured to determine an angular location of a target within the field of interest from the location of the reflected pulse at the quadrant detector. The processor is further configured to determine the location of the reflected pulse at the quadrant detector by comparing light intensities at the quadrants of the quadrant detector. The processor is further configured to determine a depth of a target within the field of interest from a time of flight associated with the reflected pulse. A laser driver synchronizes the laser with the quadrant detector.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings.
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with an exemplary embodiment, an autonomous vehicle 10 is shown.
The autonomous vehicle 10 generally includes at least a navigation system 20, a propulsion system 22, a transmission system 24, a steering system 26, a brake system 28, a sensor system 30, an actuator system 32, and a controller 34. The navigation system 20 determines a trajectory plan for automated driving of the autonomous vehicle 10. The propulsion system 22 provides power for creating a motive force for the autonomous vehicle 10 and may, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 24 is configured to transmit power from the propulsion system 22 to wheels 16 and 18 of the autonomous vehicle 10 according to selectable speed ratios. The steering system 26 influences a position of the wheels 16 and 18. While depicted as including a steering wheel 27 for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 26 may not include a steering wheel 27. The brake system 28 is configured to provide braking torque to the wheels 16 and 18.
The sensor system 30 includes a Lidar system 40 that senses targets in an exterior environment of the autonomous vehicle 10 and provides a depth image of the environment. In operation, the Lidar system 40 sends out a source pulse of light 48 that is reflected back at the autonomous vehicle 10 by one or more targets 50 in the field of view of the Lidar system 40 as a reflected pulse 52.
The actuator system 32 includes one or more actuators that control one or more vehicle features such as, but not limited to, the propulsion system 22, the transmission system 24, the steering system 26, and the brake system 28.
The controller 34 includes a processor 36 and a computer readable storage device or media 38. The computer readable storage medium 38 includes programs or instructions 39 that, when executed by the processor 36, operate the Lidar system 40 in order to obtain data such as location and depth data of a target 50. The computer readable storage medium 38 may further include programs or instructions 39 that, when executed by the processor 36, operate the navigation system 20 and/or the actuator system 32 according to data obtained from the Lidar system 40 in order to navigate the autonomous vehicle 10 with respect to the target 50.
In various embodiments, the controller 34 operates the Lidar system 40 in order to determine a parameter, such as angular location and depth, of the target 50 from the reflected pulse 52. These parameters can be used either alone or in combination with other parameters (e.g., Doppler) to obtain a predictive map of the environment for navigational purposes. The navigation system 20 builds a trajectory for the autonomous vehicle 10 based on data from the Lidar system 40 and any other parameters. The controller 34 can provide the trajectory to the actuator system 32 to control the propulsion system 22, the transmission system 24, the steering system 26 and/or the brake system 28 in order to navigate the autonomous vehicle 10 with respect to the target 50.
An x-coordinate of a center of the reflected pulse 52 can be determined by comparing the current generated by light incident on the right half (IR) of the optical quadrant detector 200 (i.e., Quadrants Q1 and Q4) to the current generated by light incident on the left half (IL) of the optical quadrant detector 200 (i.e., Quadrants Q2 and Q3), as expressed in Eq. (1):

x=(IR−IL)/(IR+IL) Eq. (1)

where IR=I1+I4 and IL=I2+I3. Expressed as a time-varying variable, the x-coordinate is given (in terms of the quadrant currents I1, I2, I3 and I4) by:

x(t)=[(I1(t)+I4(t))−(I2(t)+I3(t))]/[I1(t)+I2(t)+I3(t)+I4(t)] Eq. (2)
Similarly, the y-coordinate of the center of the beam of light 204 can be determined by comparing the current generated by light incident on the upper half (IU) of the optical quadrant detector 200 (i.e., Quadrants Q1 and Q2) to the current generated by light incident on the lower half (ID) of the optical quadrant detector 200 (i.e., Quadrants Q3 and Q4), as expressed by Eq. (3):

y=(IU−ID)/(IU+ID) Eq. (3)

where IU=I1+I2 and ID=I3+I4. Expressed as a time-varying variable, the y-coordinate is given (in terms of the quadrant currents I1, I2, I3 and I4) by:

y(t)=[(I1(t)+I2(t))−(I3(t)+I4(t))]/[I1(t)+I2(t)+I3(t)+I4(t)] Eq. (4)
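As a worked numerical example of Eqs. (1) through (4), the minimal sketch below computes the normalized centroid from the four quadrant currents; the function name and the sample currents are illustrative assumptions, not part of the disclosure.

```python
def pulse_centroid(i1, i2, i3, i4):
    """Normalized (x, y) centroid of a pulse on a quadrant detector.

    i1..i4 are the photocurrents from quadrants Q1..Q4, with Q1 in the
    upper-right, Q2 upper-left, Q3 lower-left, and Q4 lower-right.
    """
    total = i1 + i2 + i3 + i4
    if total == 0:
        raise ValueError("no light incident on the detector")
    x = ((i1 + i4) - (i2 + i3)) / total  # Eq. (2): right minus left
    y = ((i1 + i2) - (i3 + i4)) / total  # Eq. (4): upper minus lower
    return x, y

# Example: a pulse slightly up and to the right of center.
print(pulse_centroid(0.4, 0.3, 0.1, 0.2))  # -> (0.2, 0.4)
```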
In various embodiments, the optical quadrant detector 200 has a high degree of position or angular resolution. This resolution can be less than 0.01 degrees in various embodiments. The optical quadrant detector 200 further demonstrates a wide spectral response over the visible, near infrared (NIR), short wave infrared (SWIR), mid-wavelength infrared (MWIR) and long wave infrared (LWIR) regions of the electromagnetic spectrum. The optical quadrant detector 200 can be composed of silicon, germanium, InGaAs, mercury cadmium telluride (MCT) or other suitable materials. The optical quadrant detector 200 has a quick response rate in comparison to a duration of a reflected pulse 52 received at the optical quadrant detector 200.
When used in the Lidar system 40, the x-coordinate and y-coordinate of the reflected pulse 52 can be used to determine an angular location of the target 50 that produces the reflected pulse 52 as well as a depth image of the target 50, as discussed below.
The Lidar system 40 further includes receiver equipment that includes receiver optics 310 and the optical quadrant detector 200.
In an additional embodiment, the illumination optics 306, the receiver optics 310 or both can include a spatial modulator 320. The spatial modulator 320 can be used to filter out signals arising from two or more targets or objects 50 that are at a same distance from the Lidar system 40 or optical quadrant detector 200 and that are angularly distinguishable.
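The ambiguity that the spatial modulator 320 addresses can be illustrated numerically using the pulse_centroid sketch above; the specific currents below, and the modeling of one modulator state as simply blocking half the field, are assumptions for illustration only.

```python
# Two targets at the same range: their quadrant currents sum, so the
# centroid lands between them and neither angular location is recovered.
a = (0.5, 0.0, 0.0, 0.0)   # target A: upper-right of the field
b = (0.0, 0.5, 0.0, 0.0)   # target B: upper-left of the field
merged = tuple(ia + ib for ia, ib in zip(a, b))
print(pulse_centroid(*merged))   # -> (0.0, 1.0): x is ambiguous

# Blocking the left half of the field (one spatial-modulator state)
# leaves only target A, whose centroid is then unambiguous.
print(pulse_centroid(*a))        # -> (1.0, 1.0)
```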
It is noted that the optical quadrant detector 200 has a quick sampling response time in comparison to the time duration of the reflected pulse 52. In various embodiments, the response time of the optical quadrant detector 200 is less than a few hundred picoseconds. Therefore, the optical quadrant detector 200 can sample the reflected pulse 52 multiple times throughout the duration of the reflected pulse 52. Each sample of the reflected pulse 52 provides information about a reflective surface at a selected depth of the target 50, an angular location of the reflective surface and an intensity of light at the particular depth. A plurality of samples of these parameters can therefore be used to build a depth image of the field of interest 308.
A range r of the target 50 is determined from a time of flight (TOF) of the reflected pulse 52 according to Eq. (5):

r=c×TOF/2 Eq. (5)

where r is the range of the target and c is the speed of light. Thus, the time-dependent coordinates x(t) and y(t) can be rewritten to be dependent upon range or depth measurements.
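Continuing the illustrative sketches above, each sample can be converted to a 3D point by combining Eqs. (2), (4) and (5). The half_fov_deg calibration and the linear centroid-to-angle mapping are assumptions standing in for the actual mapping through the receiver optics 310.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def sample_to_point(i1, i2, i3, i4, tof, half_fov_deg=15.0):
    """Map one detector sample to a 3D point (illustrative only).

    tof is the round-trip time of flight in seconds for this sample;
    half_fov_deg maps the normalized centroid onto an assumed angular
    field of view. Reuses pulse_centroid from the earlier sketch.
    """
    x_n, y_n = pulse_centroid(i1, i2, i3, i4)  # Eqs. (2) and (4)
    r = C * tof / 2.0                          # Eq. (5): depth of the surface
    az = math.radians(x_n * half_fov_deg)      # assumed angle calibration
    el = math.radians(y_n * half_fov_deg)
    return (r * math.sin(az), r * math.sin(el), r)

# A 100 ns time of flight places the reflecting surface about 15 m away.
print(sample_to_point(0.4, 0.3, 0.1, 0.2, 100e-9))
```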
The image determined from the Lidar system 40 can be provided to the navigation system 20 in order to navigate the autonomous vehicle 10 through the field of interest 308.
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.