This disclosure relates generally to remote sensing systems, and more specifically to compact remote sensing systems.
Conventional scanning systems (e.g., LiDAR systems) are typically bulky and large in size. These conventional scanning systems typically include a large scanning mechanism and a high resolution focal plane array (FPA) with enough pixels to support the resolution of the scanning mechanism. Both angular (XY) and depth (Z) position may be determined from the detector array.
Embodiments of the present disclosure may provide a compact remote sensing device that includes a transmit component that scans a beam of light across a scene or object field, and a receive component that receives return light from the object field. The transmit component includes a small, fast scanning mechanism, for example a MEMS (microelectromechanical system) mirror or mirror array, or a piezoelectric steering mirror (referred to as a piezo mirror), that scans a beam of light emitted by a light source across a field of view (FOV). The receive component includes a focal plane array (FPA) with a FOV at least large enough to capture the FOV of the scanning mechanism. The FPA may be a low resolution FPA (i.e., with fewer pixels than the resolution of the scanning mechanism), and the light beam may be scanned and captured at multiple spots (or sub-pixels) within the pixels of the FPA. The FPA may be implemented according to avalanche photodiode (APD) technology, PIN diode technology, or any other suitable technology.
In some embodiments, angular (XY) position may be determined from feedback of the scanning mechanism, and depth or range (Z) position may be determined from feedback of the detector array. The angular and depth position information may be processed to generate 3D imaging data output.
Some embodiments may also include an optical shutter located in the optical receiver path that acts to block unwanted light. Some embodiments may instead or also include an optical band-pass filter on the optical receiver path to attenuate or reject light outside of a range of the beam.
Some embodiments may include a polygon rotating mirror with N facets (e.g., five facets, six facets, etc.), each facet with a different facet angle, that may be used to increase the vertical FOV of the scanning mechanism. In these embodiments, the horizontal component of the scan may be provided by the rotating mirror.
Embodiments of the remote sensing device may, for example, be implemented as compact LiDAR modules. Embodiments may be implemented as small form factor, compact modules that may be integrated into small, tight spaces. Embodiments may provide high resolution 3D imaging without the need for a high resolution FPA and its corresponding digital receiver chain. Embodiments may provide high resolution 3D imaging and remote sensing capabilities at lower power than conventional systems that use higher-resolution FPAs. Further, embodiments may be less sensitive to vibrations than conventional systems due to the relatively small size of the scanning mechanism.
This specification includes references to “one embodiment” or “an embodiment.” The appearances of the phrases “in one embodiment” or “in an embodiment” do not necessarily refer to the same embodiment. Particular features, structures, or characteristics may be combined in any suitable manner consistent with this disclosure.
“Comprising.” This term is open-ended. As used in the appended claims, this term does not foreclose additional structure or steps. Consider a claim that recites: “An apparatus comprising one or more processor units . . . ”. Such a claim does not foreclose the apparatus from including additional components (e.g., a network interface unit, graphics circuitry, etc.).
“Configured To.” Various units, circuits, or other components may be described or claimed as “configured to” perform a task or tasks. In such contexts, “configured to” is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs the task or tasks during operation. As such, the unit/circuit/component can be said to be configured to perform the task even when the specified unit/circuit/component is not currently operational (e.g., is not on). The units/circuits/components used with the “configured to” language include hardware—for example, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a unit/circuit/component is “configured to” perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112, sixth paragraph, for that unit/circuit/component. Additionally, “configured to” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. “Configured to” may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
“First,” “Second,” etc. As used herein, these terms are used as labels for nouns that they precede, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, a buffer circuit may be described herein as performing write operations for “first” and “second” values. The terms “first” and “second” do not necessarily imply that the first value must be written before the second value.
“Based On.” As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors. Consider the phrase “determine A based on B.” While in this case, B is a factor that affects the determination of A, such a phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.
Embodiments of a compact remote sensing device are described. The remote sensing device may include a transmit component that scans a beam of light across a scene or object field, and a receive component that receives return light from the object field. The transmit (TX) component includes a light source (e.g., one or more lasers) and a small, fast scanning mechanism, for example a MEMS (microelectromechanical system) mirror or mirror array, or a piezoelectric steering mirror (referred to as a piezo mirror), that scans a beam of light emitted by the light source across a field of view (FOV) at the object field. The receive (RX) component includes a focal plane array (FPA) with a FOV at least large enough to capture the FOV of the scanning mechanism. The FPA may be referred to herein as a detector array. However, instead of a high resolution FPA as used in conventional scanning systems, a lower resolution FPA (i.e., with fewer pixels than the resolution of the scanning mechanism) may be used, and the light beam may be scanned and captured at multiple spots (or sub-pixels) within the pixels of the FPA.
In some embodiments of the remote sensing device, angular (XY) position may be determined from feedback of the scanning mechanism, and depth or range (Z) position may be determined from feedback of the detector array. The angular and depth position information may be processed by a controller/processor component of the remote sensing system to generate three-dimensional (3D) imaging data output.
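The processing described above can be sketched as follows; the coordinate convention, function names, and the time-of-flight ranging model are illustrative assumptions for this sketch, not details taken from the disclosure:

```python
import math

# Illustrative sketch: combine angular (XY) feedback from the scanning
# mechanism with a time-of-flight measurement from the detector array to
# produce one 3D point. All names and conventions here are assumptions.
C = 299_792_458.0  # speed of light in vacuum, m/s

def point_from_feedback(azimuth_rad, elevation_rad, round_trip_s):
    """Return (x, y, z) for one sample.

    azimuth_rad, elevation_rad: angles from scanning-mechanism feedback.
    round_trip_s: round-trip pulse time from the activated detector pixel.
    """
    r = C * round_trip_s / 2.0  # one-way range from round-trip time
    x = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = r * math.sin(elevation_rad)
    z = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    return (x, y, z)
```

A controller/processor would apply such a conversion to every (angle, time) sample pair collected during a scan to assemble the 3D imaging data output.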
Embodiments of the remote sensing device may, for example, be implemented as compact LiDAR modules. Embodiments may be implemented as small form factor, compact modules that may be integrated into small, tight spaces. Embodiments may provide high resolution 3D imaging without the need for a high resolution FPA and its corresponding digital receiver chain. Embodiments may provide high resolution 3D imaging and remote sensing capabilities at lower power than conventional systems, since the lower-resolution FPAs require less power than the higher-resolution FPAs used in those systems. Further, embodiments may be less sensitive to vibrations than conventional systems due to the relatively small size of the scanning mechanism.
Light source 112 may, for example, be a laser that emits a collimated, narrow beam 120. In some embodiments, multiple lasers with different attack angles may be used to increase the scanning mechanism 116 FOV without impacting the scanner's mechanical motion. In some embodiments, other light emitting technologies may be used, for example infrared (IR) light emitting mechanisms.
Relay lens 114 may include one or more refractive lens elements, and may act to refract light (beam 120) emitted by the light source 112 on the optical path to the scanning mechanism 116.
Scanning mechanism 116 may be or may include a small, fast scanning mechanism such as a MEMS (microelectromechanical system) mirror or mirror array or a piezo mirror that scans the collimated, narrow beam 120 of light emitted by the light source 112 across a field of view (FOV) at the object field 199. In at least some embodiments, the scanning mechanism 116 is sized to support beam divergence; as a non-limiting example, the scanning mechanism 116 may include a 6 mm scanning mirror.
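As a rough illustration of why mirror size matters for divergence (this estimate and the 905 nm wavelength are assumptions for the sketch, not values from the disclosure), the diffraction-limited divergence of a beam scales inversely with the mirror diameter, so a mirror on the order of several millimeters can keep the scanned beam narrow:

```python
# Rule-of-thumb estimate using the Rayleigh criterion: the minimum
# achievable full divergence angle is roughly 1.22 * wavelength / D.
# Wavelength and mirror size below are illustrative assumptions.
def diffraction_limited_divergence(wavelength_m, mirror_diameter_m):
    """Approximate minimum beam divergence (radians) for a mirror of
    diameter mirror_diameter_m reflecting light of wavelength_m."""
    return 1.22 * wavelength_m / mirror_diameter_m

# e.g., a 6 mm mirror with an assumed 905 nm laser:
theta = diffraction_limited_divergence(905e-9, 6e-3)  # on the order of 0.2 mrad
```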
TX lens 118 may include one or more refractive lens elements, and may act to refract light (scanned beam 120) from the scanning mechanism 116 on the optical path to the object field 199.
RX lens 142 may include one or more refractive lens elements, and may act to collect and refract light (scanned beam 120) received from the object field 199 on the optical path to the detector array 144.
Detector array 144 may be selected to provide a FOV at least large enough to capture the full FOV of the scanning mechanism 116.
In some embodiments, RX 140 may also include an optical shutter 148 located at or near the object field side surface of the detector array 144 and in the optical path of the receiver 140 between the RX lens 142 and the detector array 144.
While not shown in
As shown in
A scene (the object field 299) is scanned by scanning mechanism 216 according to a pattern with a collimated, narrow beam 220 emitted from light source 212. The beam 220 strikes objects/surfaces in the object field 299 and is returned to the RX 240, where the return beam scans across the object-side surface of the detector array 244. Pixels of the detector array are activated according to the current scan angle of the beam 220. The beam 220 may be scanned and captured at multiple spots 250 (or sub-pixels) within each pixel of the detector array 244. Controller/processor 280 may determine angular (XY) position from feedback 260 of the scanning mechanism 216, and may determine depth or range (Z) position from feedback 270 from the detector array 244. The angular and depth position information may be processed by controller/processor 280 to generate 3D imaging data 290 as output.
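The bookkeeping that maps the current scan angle to an activated pixel and a sub-pixel spot within it can be sketched as follows; the linear mapping, the FOV width, and the pixel/spot counts are illustrative assumptions, not parameters from the disclosure:

```python
# Hypothetical sketch: the scanner's current angle selects which
# low-resolution detector-array pixel to activate, and the finer
# remainder identifies the sub-pixel spot within that pixel.
def locate_spot(scan_angle_deg, fov_deg=30.0, pixels=16, spots_per_pixel=8):
    """Map a scan angle in [0, fov_deg) to (pixel index, sub-pixel spot)."""
    frac = scan_angle_deg / fov_deg            # 0.0 .. 1.0 across the FOV
    pixel = min(int(frac * pixels), pixels - 1)
    within = frac * pixels - pixel             # fractional position in pixel
    spot = min(int(within * spots_per_pixel), spots_per_pixel - 1)
    return pixel, spot
```

Under this model the effective angular sampling is the pixel count times the spots per pixel, which is how a low-resolution FPA can support the higher resolution of the scanning mechanism.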
A scene (the object field 299) is scanned by scanning mechanism 216 according to a pattern with a collimated, narrow beam 220 emitted from light source 212. The beam 220 strikes objects/surfaces in the object field 299 and is returned to the RX 240, where the return beam passes through the current aperture 249 of the optical shutter as it scans across the object-side surface of the detector array 244. Pixels of the detector array are activated according to the current scan angle of the beam 220; the aperture 249 is also moved according to the current scan angle within the activated pixels. The beam 220 may be scanned and captured at multiple spots 250 (or sub-pixels) within each pixel of the detector array 244. Controller/processor 280 may determine angular (XY) position from feedback 260 of the scanning mechanism 216, and may determine depth or range (Z) position from feedback 270 from the detector array 244. The angular and depth position information may be processed by controller/processor 280 to generate 3D imaging data 290 as output.
In the remote sensing device 500 as illustrated in
As indicated at 1000, a transmit component of the remote sensing device scans a beam of light across a FOV at an object field. For example, the beam may be a collimated, narrow beam emitted from a light source such as one or more lasers. The FOV may, for example, be scanned according to a scan pattern by a scanning mechanism as illustrated in
As indicated at 1002, a pixel on a detector array of a receive component of the remote sensing device may be activated according to the scan angle of the beam.
As indicated at 1004, the receive component of the remote sensing device receives return light from the object field. As indicated at 1006, the beam returned from the object field is imaged, for example by an RX lens as illustrated in
As indicated at 1008, angular (XY) position is determined from feedback from the scanning mechanism. As indicated at 1010, depth (Z) position is determined from feedback from the detector array. The position information may be processed to generate 3D imaging data output.
As indicated at 1100, a light source of a transmit component of the remote sensing device emits a beam of light. For example, the beam may be a collimated, narrow beam emitted from a light source such as one or more lasers. In some embodiments, the light source may continuously emit the beam during the scan method of elements 1102-1118.
As indicated at 1102, the remote sensing device may begin a scan of a field of view (FOV) at an object field. The FOV may, for example, be scanned according to a scan pattern as illustrated in
As indicated at 1104, a scanning mechanism of the transmit component scans the beam across the FOV. The scanning mechanism may be a small, fast scanning mechanism such as a MEMS (microelectromechanical system) mirror or mirror array or a piezo mirror.
As indicated at 1106, a corresponding pixel on a detector array of a receive component of the remote sensing device may be activated according to the current scan angle of the scanning mechanism.
As indicated at 1108, return light from the object field is received at a receiver lens of the receive component. The receiver lens refracts the light to the detector array.
In some embodiments as shown in
In some embodiments, an optical band-pass filter may instead or also be included on the optical receiver path between the receiver lens and the detector array to attenuate or reject light outside of a range of the beam.
As indicated at 1112, the beam is imaged at a spot inside the currently activated pixel, underfilling the pixel. As shown in
As indicated at 1114, the angular (XY) position information for the imaged beam may be determined from feedback from the scanning mechanism indicating the current scan angle. As indicated at 1116, depth (Z) position for the imaged beam may be determined from feedback from the currently activated pixel of the detector array.
At 1118, if the current FOV scan of the object field is not done, the method returns to element 1102 to continue the scan. Otherwise, the method may return to element 1102 to begin a next scan, and the position information collected for the completed FOV scan may be processed to generate 3D imaging data output for that scan, as indicated at 1120.
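The scan method of elements 1102-1120 can be sketched as a loop; the `scanner`, `detector`, and `process_frame` interfaces below are hypothetical stand-ins for the hardware and controller/processor, not disclosed APIs:

```python
# Illustrative sketch of the FOV scan loop (elements 1102-1120).
def run_scans(scanner, detector, process_frame, num_scans):
    frames = []
    for _ in range(num_scans):                    # 1102: begin a FOV scan
        samples = []
        for angle in scanner.scan_pattern():      # 1104: scan beam across FOV
            pixel = detector.activate_for(angle)  # 1106: activate pixel
            depth = detector.read_depth(pixel)    # 1108-1116: image + range
            samples.append((angle, depth))        # 1114-1116: XY + Z sample
        frames.append(process_frame(samples))     # 1120: 3D imaging data
    return frames
```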
As indicated at 1200, a scanning mechanism of the remote sensing device scans a beam emitted from a light source such as one or more lasers. In some embodiments, the scanning mechanism may be a one-dimensional scanning mirror that scans the beam in one dimension (e.g., the vertical dimension or axis). In some embodiments, a scanning mechanism with a limited vertical field of view (VFOV) may be used.
As indicated at 1202, a pixel on the detector array is activated according to the current scan angle of the beam. In some embodiments, the detector array may be a one-dimensional array of pixels with a FOV corresponding to the scanning mechanism. Each pixel FOV (PFOV) may be significantly larger than the size of the beam. The beam may be scanned and captured at multiple spots (or sub-pixels) within the pixels of the detector array.
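The resulting resolution gain is simple arithmetic; the counts below are assumed for illustration, not values from the disclosure:

```python
# Illustrative arithmetic: when each pixel FOV is much larger than the
# beam, the effective angular sampling along the array is the pixel
# count times the sub-pixel spots captured per pixel. Assumed numbers.
pixels = 32            # 1D detector array size (assumed)
spots_per_pixel = 10   # beam spots captured within each pixel (assumed)
effective_samples = pixels * spots_per_pixel  # 320 angular samples
```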
As indicated at 1204, the beam is reflected off a current facet of a rotating polygon mirror to the object field. A rotating polygon mirror with N facets (e.g., five facets, six facets, etc.), each facet with a different facet angle, may be used to increase the VFOV of the scanning mechanism. Each facet corresponds to a different elevation angle. The horizontal component of the scan pattern may be provided by the rotating polygon mirror as it rotates.
As indicated at 1206, the return beam from the object field is reflected off the current facet of the polygon mirror to the 1D detector array. As indicated at 1208, the beam is imaged at spots inside the activated pixel.
As indicated at 1210, horizontal (X) position may be determined from the current encoder angle of the polygon rotating mirror. As indicated at 1212, vertical (Y) position may be determined from the scanning mechanism position/angle and the current facet number of the polygon rotating mirror. As indicated at 1214, depth or range (Z) position may be determined from feedback from the currently activated pixel of the detector array. The position information may be processed to generate 3D imaging data output.
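The position determinations of elements 1210-1212 can be sketched as follows; the per-facet elevation offsets and the simple additive model are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch of elements 1210-1212: horizontal pointing comes
# from the polygon mirror's encoder angle, and vertical pointing comes
# from the 1D scanner angle plus a per-facet elevation offset.
def xy_angles(encoder_deg, facet_index, scanner_deg, facet_offsets_deg):
    """Return (horizontal, vertical) pointing angles in degrees."""
    horizontal = encoder_deg                                  # 1210: X
    vertical = scanner_deg + facet_offsets_deg[facet_index]   # 1212: Y
    return horizontal, vertical

# e.g., a five-facet polygon whose facets step the elevation by 2 degrees
# (assumed offsets for illustration):
offsets = [-4.0, -2.0, 0.0, 2.0, 4.0]
```

Combined with the depth determination of element 1214, each (horizontal, vertical, depth) triple contributes one point to the 3D imaging data output.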
The methods described herein may be implemented in software, hardware, or a combination thereof, in different embodiments. In addition, the order of the blocks of the methods may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes may be made as would be obvious to a person skilled in the art having the benefit of this disclosure. The various embodiments described herein are meant to be illustrative and not limiting. Many variations, modifications, additions, and improvements are possible. Accordingly, plural instances may be provided for components described herein as a single instance. Boundaries between various components, operations and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the example configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of embodiments as defined in the claims that follow.
This application claims benefit of priority to U.S. Provisional Application No. 62/377,404, filed Aug. 19, 2016, titled “Remote Sensing Device”, which is hereby incorporated by reference in its entirety.