The description relates to systems for scanning a multidimensional (e.g., 2D or 3D) environment, such as laser scanners or LIDARs, for instance.
One or more embodiments may be used in robot or vehicle autonomous navigation systems, for instance.
An electronic system configured to measure a distance from the surroundings by measuring the time of flight (TOF) of a laser beam, that is the time taken for a light pulse to travel from a light source to a target and back (echo), is known as a light detection and ranging (briefly, LIDAR) or laser scanner system.
In order to obtain a TOF measurement, a time-to-digital converter (TDC) device may be employed, that is, a device configured to measure (e.g., with sub-nanosecond accuracy) a time interval elapsed between two digital signals, such as the transmitted light pulse and the light echo signals of the LIDAR, for instance.
A TOF measurement can be obtained in a direct or indirect way, for instance.
A direct time of flight (dTOF) measurement uses a phase detector device configured to detect a phase shift between the transmitted laser light pulse and the light echo signal, wherein the distance of the target is determined as the detected time shift multiplied by half the speed of light.
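The dTOF relation above can be sketched as follows; this is an illustrative example, not code from the source, with the function name chosen here for clarity:

```python
# Speed of light in vacuum, in meters per second.
C = 299_792_458.0

def distance_from_tof(tof_seconds: float) -> float:
    """Target distance for a measured round-trip time of flight:
    d = (c / 2) * t, since the light travels the distance twice."""
    return 0.5 * C * tof_seconds

# A round-trip time of 1 microsecond corresponds to roughly 150 meters.
d = distance_from_tof(1e-6)
```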
An indirect time of flight (iTOF) measurement does not measure the phase shift directly but it obtains it from detecting the number of photons (or light intensity) during the pulse/modulation period of the light signal.
Laser scanner systems can be used to build maps of objects and landscapes by scanning 2D or 3D environments and acquiring multiple TOF measurements of the surroundings.
Existing laser scanners use alternative and divergent technical solutions to scan a 2D or 3D target.
For instance, some solutions involve controlled steering of laser beams on the target and taking a distance measurement at every steering direction.
For instance, a rotating electrical connector (e.g., a gimbal) may be used to sequentially scan different directions, that is, an electromechanical device to facilitate transmission of power and electrical signals from a stationary structure to a rotating structure.
These solutions may be inconvenient for many applications due to, e.g.:
Single laser single sensor (briefly, SLSS) systems are also known. These systems comprise a pulsed laser source, an optical scanning arrangement (e.g., a first and a second mirror configured to rotate along a first and a second axis, respectively, with the first and the second axes orthogonal therebetween) and a light sensor configured to measure the TOF for each light pulse.
In SLSS systems the laser source 12 is pulsed with a pulse time interval greater than the maximum TOF (corresponding to the maximum distance) which can be measured, in order to prevent any ambiguity in the TOF measurement. This is a drawback of SLSS systems in that it limits the throughput and the applicability of the system 10, in particular for relatively high distances.
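The pulse-interval constraint of SLSS systems can be quantified as follows; this is an illustrative sketch (function names are chosen here, not from the source), expressing that a pulse interval T limits the unambiguous range to d_max = (c / 2) * T, since the echo must return before the next pulse is emitted:

```python
# Speed of light in vacuum, in meters per second.
C = 299_792_458.0

def max_unambiguous_distance(pulse_interval_s: float) -> float:
    """Largest distance whose echo arrives before the next pulse."""
    return 0.5 * C * pulse_interval_s

def min_pulse_interval(max_distance_m: float) -> float:
    """Shortest pulse interval that keeps a target at
    max_distance_m free of ambiguity in the TOF measurement."""
    return 2.0 * max_distance_m / C
```

For instance, measuring up to 300 meters unambiguously forces a pulse interval of about 2 microseconds, i.e., a pulse rate of at most 500 kHz, illustrating the throughput limitation noted above.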
An alternative solution is the so-called “flash LIDAR” arrangement. This involves illuminating (“flashing”) a full scene of the target by coupling a diffractive optics (briefly, DOE) arrangement to the laser source and using a grid-like sensing arrangement, with each sensor in the grid arrangement dedicated to calculating the ToF of the light beam echoed from a corresponding part of the illuminated full scene.
This flash LIDAR arrangement may be inconvenient for many applications due to, e.g.:
Existing sensors may involve additional reset signals and present a limited throughput.
One or more embodiments contribute in overcoming the aforementioned drawbacks.
According to one or more embodiments, a LIDAR apparatus including a laser source, a beam steering arrangement (e.g., MEMS lenses or mirrors, or optical phased arrays—OPAs) and an array of sensors, wherein each sensor of the array is focused on a determined region of the target or field of view, may be exemplary of such an apparatus.
One or more embodiments may relate to a corresponding method.
In one or more embodiments, sensor parallelism provides improved figures of merit (resolution, framerate and maximum distance, for instance).
In one or more embodiments, sensor resolution may be a fraction of the final resolution, facilitating reducing cost and power consumption.
In one or more embodiments, a transmission path may be simplified, for instance thanks to a single laser source.
One or more embodiments may have a reduced size and weight, leading to cost savings and reduced power consumption.
One or more embodiments can speed up the frequency of the laser beam pulses thanks to each sensor of the sensor array being focused on a certain region of the field of view. For instance, laser pulses can be emitted sequentially without waiting, between sequential laser pulses, for the echo signal to be received by the sensor.
One or more embodiments may provide an innovative dual scanning, thanks to a reduced aperture and speed of the second stage (a lens in the example).
One or more embodiments may facilitate combining different technologies for different applications.
In one or more embodiments, exploiting MEMS technology facilitates providing a small, lightweight and fast system.
One or more embodiments may extend a ranging distance with respect to existing solutions, for instance increasing it from about 10 to about 100 meters.
One or more embodiments will now be described, by way of non-limiting example only, with reference to the annexed Figures, wherein:
In the ensuing description, one or more specific details are illustrated, aimed at providing an in-depth understanding of examples of embodiments of this description. The embodiments may be obtained without one or more of the specific details, or with other methods, components, materials, etc. In other cases, known structures, materials, or operations are not illustrated or described in detail so that certain aspects of embodiments will not be obscured.
Reference to “an embodiment” or “one embodiment” in the framework of the present description is intended to indicate that a particular configuration, structure, or characteristic described in relation to the embodiment is comprised in at least one embodiment. Hence, phrases such as “in an embodiment” or “in one embodiment” that may be present in one or more points of the present description do not necessarily refer to one and the same embodiment.
Moreover, particular conformations, structures, or characteristics may be combined in any adequate way in one or more embodiments.
The drawings are in simplified form and are not to precise scale.
Throughout the figures annexed herein, like parts or elements are indicated with like references/numerals and a corresponding description will not be repeated for brevity.
The references used herein are provided merely for convenience and hence do not define the extent of protection or the scope of the embodiments.
As shown as an example in
As shown as an example in
For instance:
In one or more embodiments, the first 15a and/or second 15b optical element is/are configured to correctly focus the target region in each sensor 16ij, and/or to compensate geometrical distortions occurring in the light projecting process via the arrangement 14, e.g., Keystone-Pincushion deformation, known per se.
As shown as an example in
In some embodiments, the known geometrical distortion can be compensated using a dedicated method of laser projection that properly selects the (time and space) points at which the laser is pulsed. For instance, the control unit 20 may be configured to control the beam steering arrangement 14 and the laser source 12 in order to synchronize light pulse emission by the source 12 and the position of the mirrors 140, 142, obtaining a compensated projection of the light pulse on the target scene T.
As shown as an example herein, the apparatus comprises at least one of:
For instance, the first and/or second optical elements are configured to counter a Keystone-Pincushion deformation during projecting at least one beam spot (for instance, P, P1, P1′) per grid cell (for instance, gij) in the portioned FOV region (for instance, T).
For the sake of simplicity, in the following the target surface T is considered to correspond to an entire field of view (briefly, FOV) of the laser scanner system 10, that is the angular extent of the field which can be observed with an optical instrument. In the example considered, this FOV encompasses both the area projected by the beam steering arrangement 14 and the area viewed by the array of sensors 16.
As shown as an example in
It will be once again recalled that the discussion of the apparatus as per the present disclosure within the framework of a vehicle/robot navigation system VS is merely examples provided for illustrative purposes and not limitative of the embodiments. An apparatus and method as described herein can also be used independently of any navigation system arrangement, and—more generally—in any area other than the field of navigation systems, such as augmented reality, visual support and graphical effects, for instance.
As shown as an example in
As shown as an example in
As shown as an example in
In one or more embodiments, the beam steering arrangement 14 can also comprise multiple and/or different types of optical components, such as: prisms, lenses, diffraction gratings, beam-splitters, polarizers, expanders, and other components known per se, combined to allow the properties of the laser beam L to be controlled according to the present disclosure.
In one or more embodiments, the beam steering arrangement 14 may comprise biaxial MEMS mirrors, each suitable to rotate along two orthogonal axes.
As shown as an example herein, an apparatus (for instance, 10), comprises:
a laser light source (for instance, 12) configured to transmit at least one beam of light pulses (for instance, L) towards a target, projecting at least one corresponding beam spot (for instance, P) thereon,
As shown as an example herein, the beam steering arrangement comprises:
For instance, the first axis of oscillation of the first MEMS mirror and the second axis of oscillation of the second MEMS mirror are orthogonal therebetween.
In some embodiments, the beam steering arrangement 14 may comprise MEMS lenses in place of mirrors. For instance, MEMS lenses can be suitable for use and can provide a more compact solution with respect to those using mirrors.
As shown as an example herein, the beam steering arrangement further comprises a MEMS lens (for instance, 146) coupled to at least one of the first and second MEMS mirrors (140, 142), the MEMS lens configured to vary the direction of transmission of the light pulses (for instance, P1, P1′) within each grid cell (for instance, gij) in the portioned FOV of the array of sensors.
As shown as an example in
For instance, the array of optical sensors 16 may comprise processing circuitry configured to apply signal processing to the detected echo laser beam R, providing a (e.g., pulse-by-pulse) ToF measurement.
As shown as an example in
As shown as an example in
In other words, the field of view T of the system 10 is treated as a grid G where each grid cell gij is in a one-to-one correspondence with each sensor 16ij of the array of sensors 16, that is an ij-th grid cell gij is detected by a corresponding ij-th sensor 16ij of the array of sensors 16.
As shown as an example in
As shown as an example herein, the beam steering arrangement is configured to cyclically vary the direction of transmission of the light pulses according to a pattern, for example selected among a raster scan pattern and a Lissajous pattern.
α=A*sin(2π*fx*t+φx)
β=B*tri(2π*fy*t+φy)
where φx and φy are respective initial angular position or phase values and tri( ) denotes a triangular waveform (e.g., for a raster scan pattern). In a Lissajous pattern, for instance, both angles vary sinusoidally:
α=A*sin(2π*fx*t+φx)
β=B*sin(2π*fy*t+φy)
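The steering-angle laws above can be sketched as follows; this is an illustrative example (function names and the exact triangular-wave convention are assumptions made here, not taken from the source):

```python
import math

def tri(x: float) -> float:
    """Triangular wave with period 2*pi and range [-1, 1]
    (one assumed convention; peaks at x = 0)."""
    phase = (x / (2 * math.pi)) % 1.0   # normalized phase in [0, 1)
    return 4.0 * abs(phase - 0.5) - 1.0

def steering_angles(t, fx, fy, A, B, phix=0.0, phiy=0.0, lissajous=True):
    """Mirror angles (alpha, beta) at time t.

    lissajous=True  -> both angles sinusoidal (Lissajous pattern)
    lissajous=False -> triangular sweep on the second axis (raster-like)
    """
    alpha = A * math.sin(2 * math.pi * fx * t + phix)
    if lissajous:
        beta = B * math.sin(2 * math.pi * fy * t + phiy)
    else:
        beta = B * tri(2 * math.pi * fy * t + phiy)
    return alpha, beta
```

Varying the phases φx, φy between steering cycles shifts the trajectory, so that subsequent cycles illuminate different areas within each grid cell.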
For the sake of simplicity of illustration, an example case where the array of sensors comprises nine sensors arranged as a column vector, focusing the attention on a single angle variation, is used to illustrate principles of some embodiments. It is otherwise understood that this example is purely illustrative and in no way limiting.
As shown as an example in
As shown as an example in
As shown as an example in
For the sake of simplicity in illustration, in the example case of
In one or more embodiments, the variation of the function to vary the first angle α, and/or second angle β, can comprise, for instance, varying the waveform equations, the phase (φx or φy):
It is noted that, even when varying the phase, a same area in the i-th grid element gi may be illuminated multiple times over a number of cycles of the beam steering arrangement 14, without this substantially affecting the resolution of the system 10.
In one or more embodiments, performance parameters, e.g., resolution, frame-rate and maximum target distance, can be tuned varying the vertical size of the sensor array 16 and the time to reposition the laser spot between two different trajectories.
In one or more embodiments, steering light pulses so that one light spot P1 per grid cell gi is cyclically moved in a different area P1′ within the grid cell facilitates improving a global resolution of the apparatus. For instance, the resolution of the apparatus 10 is a function of the number of sensors 16i, 16ij in the sensor array 16 times a resolution of the cyclical variation of the position between subsequent steering cycles (“secondary” resolution).
For example, using a “secondary” resolution of about 30×17 in a sensor array 16 comprising a matrix of 64×64 sensors, the total system resolution reaches a value compatible with the ITU 709 standard or full high definition (briefly, FHD).
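The resolution product just mentioned can be checked with a short sketch (the helper name is chosen here for illustration): per axis, the overall resolution is the number of sensors times the number of sub-cell positions visited per steering cycle.

```python
def total_resolution(sensor_grid, secondary_grid):
    """Overall resolution as the per-axis product of the sensor
    count and the 'secondary' sub-cell scanning resolution."""
    return (sensor_grid[0] * secondary_grid[0],
            sensor_grid[1] * secondary_grid[1])

# 64x64 sensors with ~30x17 sub-positions per cell gives 1920x1088,
# i.e., a value compatible with full high definition (1920x1080).
res = total_resolution((64, 64), (30, 17))
```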
As shown as an example herein, at least one sensor (for instance, 16i) in the array of sensors comprises an avalanche photodiode, APD, or a single photon avalanche photodiode, SPAD.
In one or more embodiments, a single avalanche photodiode (briefly, APD) is found suitable for use as a sensor in the array of sensors 16.
An APD is a well-known semiconductor-based photodetector (that is, a photodiode) which is operated with a relatively high reverse voltage (e.g., just below breakdown) so that carriers excited by absorbed photons are strongly accelerated in the strong internal electric field, generating secondary carriers as a result. This triggers an avalanche process that effectively amplifies the photocurrent by a (controllable) amplification factor.
In some embodiments, arrays of Geiger-mode APDs, also referred to as single-photon avalanche diode, or briefly SPADs, may be suitable for use in the array of sensors 16. These are also semiconductor-based photodetectors (e.g., photodiodes), a detailed description of which is not provided herein for the sake of brevity.
For instance, the array of sensors 16 can comprise a column vector whose elements comprise arrays of (e.g., sixteen) SPADs grouped together in order to provide a single ToF measurement (as discussed herein with reference to
In one or more embodiments, a ST-Ed 256×256 SPAD imager produced by STMicroelectronics may be suitable for use as photodetector 16ij in the array of sensors 16.
One or more embodiments may perform multiple, partial scans of the target T with a given beam sweeping cycle and with a reduced sampling rate of the target scene, e.g., number of light spots used to illuminate it, per steering cycle.
As shown as an example in
Data points are obtained per each of these subframes; black dots represent missing data points in the images of
Subsequently, a method of obtaining an image of the target scene using these subframes may comprise:
As shown as an example in
It is noted that the example above is one of the possible ways of combining partial frames. In one or more embodiments, other combinations can be employed to reduce the size of the sensor array 16 by increasing the number of Lissajous sub-frames, e.g., a QQVGA sensor for sixteen frames or an 80×60 sensor for an amount of 256 frames.
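The recombination of interleaved partial frames into a full-resolution image can be sketched as follows; this is a hypothetical illustration (data layout and function name are assumptions made here), where each subframe holds every n-th sample along both axes at a given offset:

```python
def combine_subframes(subframes, n):
    """Recombine n*n interleaved partial frames into one full frame.

    subframes[(dy, dx)] is a (H/n x W/n) grid of samples acquired at
    row/column offsets (dy, dx); returns the full H x W frame.
    """
    h = len(subframes[(0, 0)]) * n
    w = len(subframes[(0, 0)][0]) * n
    frame = [[None] * w for _ in range(h)]
    for (dy, dx), sub in subframes.items():
        for i, row in enumerate(sub):
            for j, value in enumerate(row):
                frame[i * n + dy][j * n + dx] = value
    return frame
```

For instance, four 1×1 subframes at offsets (0,0), (0,1), (1,0), (1,1) recombine into one 2×2 frame; the same scheme scales to, e.g., sixteen QQVGA subframes forming a larger image.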
As shown as an example herein, the apparatus comprises a diffusive optical element (for instance, 160) coupled to the array of sensors and placed intermediate the target and the array of sensors,
groups of SPAD sensors (for instance, 16j) in the array of sensors configured to provide a joint signal indicative of a time of incidence of at least one light pulse (for instance, R, R′) in a joint area of respective grid cells (for instance, g11, g12, g13).
For instance, the diffusive optical element is configured to split the light pulse incident thereon into single photons and to direct each photon towards respective SPAD sensors in the groups of SPAD sensors.
As shown as an example in
In the example of
In one or more embodiments, an improved resolution may be obtained selecting an OPA 144 with a certain pulse-to-pulse interval, e.g., about 16 (sixteen) nanoseconds (1 nanosecond = 1 ns = 10⁻⁹ s), and with an array of sensors 16 having a certain size, e.g., sixteen rows and eight columns.
In the example considered, a FullHD (e.g., 1920×1080) resolution at 30 fps (frames per second) can be obtained, for instance using a sub-pixel scanning resolution of about 120×135. This may result in the maximum detectable distance being increased, e.g., from less than 3 meters to more than 300 meters, facilitating use of the instant solution in automotive applications (such as ADAS—Advanced Driver Assistance Systems, for instance).

In an embodiment, the beam steering arrangement 14 comprises a multi-stage (e.g., double stage) steering arrangement capable of steering the light beam L along multiple (e.g., two) axes.
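The FullHD-at-30-fps figure can be related to the pulse-to-pulse interval mentioned previously with a short sketch (an illustrative back-of-the-envelope computation, assuming one pulse per pixel per frame; the function name is chosen here):

```python
def required_pulse_rate(width: int, height: int, fps: int) -> int:
    """Pulses per second needed to emit one pulse per pixel per frame."""
    return width * height * fps

# FullHD at 30 fps needs 1920 * 1080 * 30 = 62_208_000 pulses/s,
# i.e., a pulse-to-pulse interval of roughly 16 nanoseconds.
rate = required_pulse_rate(1920, 1080, 30)
interval_ns = 1e9 / rate
```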
As shown as an example in
As shown as an example in
For instance, an initial position P11 of the light spot projected for each grid cell element may be varied to improve illumination coverage.
In the example considered in
In an embodiment, elements of the array of sensors 16 can be (sub)grouped together, e.g., column-wise or row-wise, so that a (sub)group is configured to provide a ToF measurement of a cell of the grid.
As shown as an example in
As shown as an example in
DOEs are optical elements which exploit diffraction and interference phenomena to generate an arbitrary distribution of (projected) light spots. A diffractive beam splitter or a diffractive lattice are exemplary DOEs.
As shown as an example in
As shown as an example herein, the apparatus comprises a diffractive optical element, DOE, (for instance, 130) intermediate the laser source and the beam steering arrangement, the DOE element configured to split the beam of light pulses, providing a plurality of beams of light pulses (for instance, L11, L23) to the beam steering arrangement.
As shown as an example in
As shown as an example in
For instance, the resolution of the system or apparatus 10 shown as an example in
In one or more embodiments, the beam shaping arrangement 13 with the DOE 130 can be further configured to compensate for geometrical distortion of the spot matrix due to the optical projection path (mainly MEMS lens or mirrors).
In one or more embodiments, exploiting MEMS technology facilitates providing a very small, lightweight and fast apparatus/system 10. For example, the lens/mirror can be a few millimeters wide with an oscillating frequency in the range 10-1000 Hz.
As shown as an example herein, a method of operating an apparatus (for instance, 10) as per the present disclosure, comprises:
As shown as an example herein, the method comprises driving the beam steering arrangement to vary the direction of transmission of the light pulses within each grid cell (for instance, gij) in the portioned FOV (for instance, T) of the array of sensors.
As shown as an example herein, the method comprises:
As shown as an example herein, the method comprises:
It will be otherwise understood that the various individual implementing options shown as an example throughout the figures accompanying this description are not necessarily intended to be adopted in the same combinations shown as an example in the figures. One or more embodiments may thus adopt these (otherwise non-mandatory) options individually and/or in different combinations with respect to the combination shown as an example in the accompanying figures.
Without prejudice to the underlying principles, the details and embodiments may vary, even significantly, with respect to what has been described by way of example only, without departing from the extent of protection. The extent of protection is defined by the annexed claims.
An apparatus (10) may be summarized as including a laser light source (12) configured to transmit at least one beam of light pulses (L) towards a target, projecting at least one corresponding beam spot (P) thereon, an array of sensors (16) with a plurality of sensors (16i, 16j, 16ij) distributed according to a grid (G), sensors (16i) in the array of sensors (16) configured to sense a light pulse incident thereon in response to reflection of at least one light pulse (P, P1, P1′) of the beam of light pulses (L) from a field of view, FOV, region (T) in the target, sensors (16i) in the array of sensors (16) further configured to provide a signal indicative of a time of incidence thereon of at least one light pulse (R, R′), wherein the FOV region (T) of the array of sensors (16) is portioned into grid cells (gij) according to the grid (G), each sensor (16i) in the array of sensors (16) is configured to sense at least one echo light pulse (R, R′) reflected from a respective grid cell (g1) in the portioned FOV region (T), the apparatus (10) comprises a beam steering arrangement (14) configured to cyclically vary a direction of transmission (α, β) of the beam of light pulses (L), projecting at least one beam spot (P, P1, P1′) per grid cell (gij) in the portioned FOV region (T).
The beam steering arrangement (14) may include a first microelectromechanical, MEMS, mirror (140) configured to oscillate around a first axis with a first oscillating angle (α), and a second MEMS mirror (142) configured to oscillate around a second axis with a second oscillating angle (β), wherein each of the first MEMS mirror (140) and the second MEMS mirror (142) may be coupled to a respective actuating device (A1, A2) configured to drive an oscillating movement of the respective mirror (140, 142).
The first axis of oscillation of the first MEMS mirror (140) and the second axis of oscillation of the second MEMS mirror (142) may be orthogonal therebetween.
The beam steering arrangement (14) may include a MEMS lens (146) coupled to at least one of the first (140) and second (142) MEMS mirrors (140, 142), the MEMS lens (146) configured to vary the direction of transmission (α, β) of the light pulses (P1, P1′) within each grid cell (gij) in the portioned FOV (T) of the array of sensors (16).
The apparatus (10) may include a diffractive optical element, DOE, (130) intermediate the laser source (12) and the beam steering arrangement (14), the DOE element (130) configured to split the beam of light pulses (L), producing a plurality of beams of light pulses (L11, L23) to the beam steering arrangement (14).
The beam steering arrangement (14) may include an optical phased array (144).
At least one sensor (16i) in the array of sensors (16) may include an avalanche photodiode, APD, or a single photon avalanche photodiode, SPAD.
The apparatus (10) may include a diffusive optical element (160) coupled to the array of sensors (16), the diffusive optical element (160) being intermediate the target and the array of sensors (16), groups of SPAD sensors (16j) in the array of sensors (16) configured to provide a joint signal indicative of a time of incidence of at least one light pulse (R, R′) in a joint area of respective grid cells (g11, g12, g13), wherein the diffusive optical element (160) may be configured to split the light pulse incident thereon into photons and to direct the photons towards respective SPAD sensors in the groups of SPAD sensors (16j).
The beam steering arrangement (14) may be configured to cyclically vary the direction of transmission (α, β) of the light pulses (P1, P1′) according to a pattern, for example selected among a raster scan pattern and a Lissajous pattern.
The apparatus (10) may include at least one of a first optical element (15a) coupled to the beam steering arrangement (14), the first optical element (15a) interposed the beam steering arrangement (14) and the target, and a second optical element (15b) coupled to the array of sensors (16), the second optical element interposed the target and the array of sensors (16), wherein the first and/or second optical elements may be configured to counter a Keystone-Pincushion deformation during projecting at least one beam spot (P, P1, P1′) per grid cell (gij) in the portioned FOV region (T).
A method of operating an apparatus (10) may be summarized as including driving (20) the beam steering arrangement (14) to cyclically vary the direction of transmission (α, β) of the light pulses (P1, P1′) and to transmit at least one light pulse (P1, P2, P3, P4, P5, P6, P7, P8) per each grid cell portion (g1, g2, g3, g4, g5, g6, g7, g8) of the partitioned FOV (T) of the array of sensors (16).
The method may include driving (20) the beam steering arrangement (14) to vary the direction of transmission (α, β) of the light pulses (P1, P1′) within each grid cell (gij) in the portioned FOV (T) of the array of sensors (16).
The method may include selecting a pattern among a raster scan pattern and a Lissajous pattern, driving (20) the beam steering arrangement (14) to cyclically vary the direction of transmission (α, β) of the light pulses (P1, P1′) according to the selected pattern.
The method may include collecting signals produced from sensors of the array of sensors (16), and calculating (20) a measurement of a distance of the target from the apparatus (10) based on the signals collected.
The various embodiments described above can be combined to provide further embodiments. Aspects of the embodiments can be modified, if necessary to employ concepts of the various embodiments to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
Number | Date | Country | Kind
---|---|---|---
102021000020006 | Jul 2021 | IT | national