One or more embodiments relate to a sensor system with one or more light absorbing surfaces for reducing stray internal light reflections.
A photodetector is an optoelectronic device that converts incident light or other electromagnetic radiation in the ultraviolet (UV), visible, and infrared spectral regions into electrical signals. Photodetectors may be used in a wide array of applications, including, for example, fiber optic communication systems, process controls, environmental sensing, safety and security, and other imaging applications such as light detection and ranging applications. High photodetector sensitivity can enable detection of faint signals returned from distant objects; however, such sensitivity to optical signals can also make a device susceptible to optical noise. Accordingly, the proposed systems and methods of the present disclosure provide solutions to reduce optical noise sensitivity in photodetector devices.
In one embodiment, a focal plane assembly (FPA) is provided with an array of detectors. The FPA includes a micro-lens array (MLA) with an input surface that is configured to receive light, and an output surface arranged along a focal plane and configured to focus the light on the array of detectors. A mask is disposed over an outer portion of at least one of the input surface and the output surface. The mask is configured to absorb stray light and reduce optical noise within the MLA.
In another embodiment, a micro-lens array (MLA) is provided with an input surface that is configured to receive light, and an output surface arranged along a focal plane and configured to focus the light on an array of detectors. A mask is disposed over an outer portion of at least one of the input surface and the output surface. The mask is configured to absorb stray light.
In yet another embodiment, a lidar unit is provided with at least one emitter that is configured to emit light pulses away from a vehicle. A housing includes: an opening that is configured to receive light; and an outlet that is opposite the opening and aligned along an optical axis. At least one lens is supported by the housing and aligned along the optical axis to focus the light. The lidar unit also includes an array of detectors and a lens array. The lens array includes an input surface that is aligned with the at least one lens and configured to receive the light, and an array of optics that form an output surface configured to focus the light on the array of detectors. A mask is disposed over an outer portion of at least one of the input surface and the output surface, and the mask is configured to absorb stray light and reduce optical noise within the lens array.
In the drawings, like reference numbers generally indicate identical or similar elements. Additionally, generally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
As required, detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary and may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present disclosure.
As noted herein, an example use of photodetectors is in light detection and ranging applications (e.g., within lidar systems). For example, one of the most desirable attributes of a lidar system is long range detection, which depends on photodetector sensitivity. High photodetector sensitivity enables detection of faint signals returned from distant objects, thereby providing a lidar device that is capable of detecting objects at long ranges. However, sensitivity to optical signals may also correlate with sensitivity to optical noise. Due to this correlation, it is desirable for a device deploying a photodetector (e.g., lidar) to reduce its optical noise sensitivity.
One such example of noise sensitivity may be due to stray light reflections within a photodetector receiver (e.g., lidar receiver). Specifically, within a receiver's focal plane assembly (FPA), discrete micro-lens arrays that are used to enhance optical fill factor may produce total internal reflections that can cause a die to function as a light pipe by carrying stray light long distances from the source location. This can result in an optical blooming effect, a type of optical noise which can temporarily blind a particularly sensitive detector to other signals.
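As context for the light-pipe behavior described above, total internal reflection occurs for rays that strike an internal surface beyond the critical angle given by Snell's law. The sketch below is illustrative only; the refractive indices are assumed values for a silica-like lens material against air and are not taken from this disclosure:

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle (degrees) beyond which light is totally internally
    reflected at an interface, from Snell's law: sin(theta_c) = n_clad / n_core.
    Requires the denser medium on the incident side (n_core > n_clad)."""
    if n_core <= n_clad:
        raise ValueError("total internal reflection requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# Assumed values: silica-like lens material (~1.46) against air (1.0).
theta_c = critical_angle_deg(1.46, 1.0)
print(f"{theta_c:.1f} degrees")  # rays steeper than this remain trapped in the die
```

Rays incident on the die boundary at angles exceeding this critical angle cannot escape and are carried along the die, which is the light-pipe effect the masking layers are intended to mitigate.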
According to some aspects, the disclosed embodiments limit the stray light reflections that are trapped within a micro-lens array (MLA), thereby maintaining high photodetector sensitivity levels while reducing the optical blooming caused by stray light reflections. For example, aspects of the disclosure provide an MLA with optical characteristics designed to reduce the effects of stray light that is trapped within the MLA. According to some aspects, the MLA may include masking layers placed at an input surface, an output surface, or both surfaces of the MLA to reduce optical noise in the FPA by absorbing stray light. According to some aspects, the masking layers may further be placed at outer regions, inner regions, or both regions of either the input surface and/or output surface of the MLA.
Reduction of optical noise inside an FPA can improve the detection of an incident signal at the detector (e.g., a returned light signal). According to some aspects described herein, reduction of the optical noise by the disclosed MLA can lead to the reduction/attenuation of the bloom effect at the detector. This reduction can significantly improve the detection capability of the detector, especially when the returned light signal is received from a highly reflective object, e.g., a road sign (such as a directional, yield, or stop sign) that includes embedded reflective material, any other object that includes embedded reflective material, or another highly reflective object.
A lidar assembly (e.g., the lidar unit 200) including the disclosed FPA may improve its detection capabilities and detection accuracy due to the reduced noise affecting the signal. This performance improvement of the lidar provides further benefits for an autonomous vehicle (AV), such as AV 102, operating the lidar unit 200. For example, the lidar unit 200 of AV 102 may receive a light signal reflected from an object exhibiting high reflectivity (e.g., a mirror, glass, or objects embedded with highly reflective material such as stop signs and other road signs). In this case, the lidar unit 200 deploying the disclosed systems and methods may reduce the bloom effect such that objects at long distances can still be detected while attenuating noise signals produced at the MLA. In this example, the lidar unit 200 can achieve high detection sensitivity while reducing susceptibility to noisy signals and improving the signal-to-noise ratio (SNR) of the received signal prior to performing digital signal processing (DSP) applications.
According to some aspects, the performance improvement of the lidar unit 200 further translates to downstream improvement in DSP speed and accuracy. For example, by eliminating the noisy signals, an on-board computer device (e.g., the AV system 104) may utilize less bandwidth to filter out the noise using signal processing techniques. Additionally, the AV system 104 may utilize less bandwidth to check detection accuracy of the lidar output. By the same token, the AV system 104 may also utilize less bandwidth to compare detection accuracy of the lidar output with other sensor outputs in a detection stack (e.g., radar and camera data). According to some aspects, a lidar unit 200 implementing the disclosed embodiments may provide more accurate output signals that feed into the AV's detection, tracking, and prediction stacks. For example, improving the accuracy of the data stacks received by the AV system 104 can reduce the delay that the AV system 104 may exhibit in processing a compromised detection signal from the lidar unit 200 or in reconciling such a signal with other detection signals received at the same time instance (e.g., from cameras or radar systems (not shown)). Minimizing computational costs associated with processing detection signals can improve the processing capabilities of the AV system 104, reduce latency, and free up the AV system's 104 bandwidth to perform other downstream navigation tasks like prediction and motion planning tasks.
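As a minimal illustration of one kind of noise filtering such an on-board computer might perform (the function and threshold below are hypothetical examples, not part of this disclosure), an amplitude gate discards return samples below an assumed noise floor; when bloom-induced noise is lower, a lower floor suffices and fainter returns survive the gate:

```python
def gate_returns(samples: list[float], noise_floor: float) -> list[float]:
    """Keep only return samples whose amplitude clears an assumed
    noise floor (a hypothetical, simplistic noise-rejection step)."""
    return [s for s in samples if s >= noise_floor]

# Hypothetical normalized return amplitudes from one detection window.
returns = [0.02, 0.8, 0.05, 0.3]
print(gate_returns(returns, 0.1))  # [0.8, 0.3]
```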
According to some aspects, the lidar unit's 200 performance improvement may further translate to downstream improvement in detection and classification capabilities—thereby improving the navigation capabilities of the AV 102. For example, the AV 102 may be able to continuously navigate a geographic location without the lidar unit 200 being blinded by an optical bloom. Because of the reduced bloom effect described herein, the AV 102 may also navigate the geographic location while detecting and classifying highly reflective objects. It can be appreciated that the above benefits are merely exemplary and other benefits to detection, compute, and downstream applications such as autonomous driving and navigation may be within the scope of this disclosure as would be appreciated by those skilled in the art.
With reference to
The sensor system 100 includes a top sensor assembly 112 and multiple side sensor assemblies 114 for monitoring an environment external to the AV 102. The top sensor assembly 112 is mounted to a roof of the AV 102 and includes a light detection and ranging (lidar) unit according to one or more embodiments. The lidar unit includes one or more emitters 116 and one or more detectors 118. The emitters 116 transmit light pulses 120 away from the AV 102. The transmitted light pulses 120 are incident on one or more objects, e.g., a remote vehicle 122, a pedestrian 124, and a cyclist 126, and reflect back toward the top sensor assembly 112 as reflected light pulses 128. The top sensor assembly 112 guides the reflected light pulses 128 toward the detectors 118, which provide corresponding light signals 130 to the controller 106. The controller 106 processes the light signals 130 to determine a distance of each object 122, 124, 126 relative to the AV 102. The top sensor assembly 112 includes absorbing material that is disposed over one or more surfaces to absorb stray light to reduce optical noise incident on the detectors 118, thereby increasing the signal-to-noise ratio of the light signals 130.
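The distance determination performed by the controller 106 follows the standard pulsed time-of-flight relationship, distance = c·t/2, where t is the round-trip time of a pulse. The sketch below is illustrative of that relationship only and is not the controller's actual implementation:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_round_trip(t_round_trip_s: float) -> float:
    """Pulsed-lidar ranging: the pulse travels to the object and back,
    so the one-way distance is c * t / 2."""
    return C * t_round_trip_s / 2.0

# A pulse returning after 1 microsecond corresponds to roughly 150 m.
print(round(range_from_round_trip(1e-6), 1))  # 149.9
```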
The side sensor assemblies 114 include cameras, e.g., visible spectrum cameras, infrared cameras, etc., for monitoring the external environment. The top sensor assembly 112 and the side sensor assemblies 114 may each include a lidar unit, one or more cameras, and/or a radar unit.
The AV system 104 may communicate with a remote computing device 132 over a network 134. The remote computing device 132 may include one or more servers to process one or more processes of the technology described herein. The remote computing device 132 may also communicate with a database 136 over the network 134.
The lidar unit 200 includes one or more emitters 216 for transmitting light pulses 220 through the aperture 212 and away from the AV 102 that are incident on one or more objects and reflect back toward the lidar unit 200. The lidar unit 200 also includes one or more light detectors 218 for receiving reflected light pulses 228 that pass through the aperture 212. The detectors 218 also receive light from external light sources, e.g., the sun. The emitters 216 and the detectors 218 may be stationary, e.g., mounted to the base 202, or dynamic and mounted to the housing 208. The emitters 216 may include laser emitter chips or other light emitting devices and may include any number of individual emitters (e.g., 8 emitters, 64 emitters, or 128 emitters). The emitters 216 may transmit light pulses 220 of substantially the same intensity or of varying intensities, and in various waveforms, e.g., sinusoidal, square-wave, and sawtooth. The lidar unit 200 may include one or more optical elements 222 to focus and direct light that is passed through the aperture 212. The detectors 218 may include a photodetector or an array of photodetectors that are positioned to receive the reflected light pulses 228. In one or more embodiments, the detectors 218 include passive imagers.
The lidar unit 200 includes a controller 230 with a processor 232 and memory 234 to control various components, e.g., the motor 204, the emitters 216, and the detectors 218. The controller 230 also analyzes the data collected by the detectors 218, to measure characteristics of the light received, and generates information about the environment external to the AV 102. The controller 230 may be integrated with another controller, e.g., the controller 106 of the AV system 104. The lidar unit 200 also includes a power unit 236 that receives electrical power from a vehicle battery 238, and supplies the electrical power to the motor 204, the emitters 216, the detectors 218, and the controller 230.
Referring to
With reference to
The MLA 356 forms a plurality of apertures that extend between the planar input surface 358 and the output surface, including central apertures 364 and outer apertures 366 (shown in
Referring back to
With reference to
Aspects of the present disclosure provide an MLA with optical characteristics designed to reduce the effects of trapped stray light. According to some aspects, the MLA may include masking layers placed at an input surface, an output surface, or both surfaces to reduce optical noise in the FPA by absorbing stray light. According to some aspects, masking layers may be placed at outer regions, inner regions, or both regions of either the input surface and/or output surface of the MLA, as described herein with respect to
The term “vehicle” refers to any moving form of conveyance that is capable of carrying either one or more human occupants and/or cargo and is powered by any form of energy. The term “vehicle” includes, but is not limited to, cars, trucks, vans, trains, autonomous vehicles, aircraft, aerial drones and the like. An “autonomous vehicle” (or “AV”) is a vehicle having a processor, programming instructions and drivetrain components that are controllable by the processor without requiring a human operator. An autonomous vehicle may be fully autonomous in that it does not require a human operator for most or all driving conditions and functions, or it may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the vehicle's autonomous system and may take control of the vehicle. Notably, the present solution is being described herein in the context of an autonomous vehicle. However, the present solution is not limited to autonomous vehicle applications. The present solution may be used in other applications such as robotic applications, radar system applications, metric applications, and/or system performance applications.
Based on the teachings contained in this disclosure, it will be apparent to persons skilled in the relevant art(s) how to make and use embodiments of this disclosure using data processing devices, computer systems and/or computer architectures. In particular, embodiments can operate with software, hardware, and/or operating system implementations other than those described herein.
It is to be appreciated that the Detailed Description section, and not any other section, is intended to be used to interpret the claims. Other sections can set forth one or more but not all exemplary embodiments as contemplated by the inventor(s), and thus, are not intended to limit this disclosure or the appended claims in any way.
While this disclosure describes exemplary embodiments for exemplary fields and applications, it should be understood that the disclosure is not limited thereto. Other embodiments and modifications thereto are possible, and are within the scope and spirit of this disclosure. For example, and without limiting the generality of this paragraph, embodiments are not limited to the software, hardware, firmware, and/or entities illustrated in the figures and/or described herein. Further, embodiments (whether or not explicitly described herein) have significant utility to fields and applications beyond the examples described herein.
Embodiments have been described herein with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined as long as the specified functions and relationships (or equivalents thereof) are appropriately performed. Also, alternative embodiments can perform functional blocks, steps, operations, methods, etc. using orderings different than those described herein.
References herein to “one embodiment,” “an embodiment,” “an example embodiment,” or similar phrases, indicate that the embodiment described can include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it would be within the knowledge of persons skilled in the relevant art(s) to incorporate such feature, structure, or characteristic into other embodiments whether or not explicitly mentioned or described herein. Additionally, some embodiments can be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments can be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The breadth and scope of this disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the disclosure. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. Additionally, the features of various implementing embodiments may be combined to form further embodiments.