The present invention relates generally to systems and methods for depth mapping, and particularly to sensor arrays used in time-of-flight sensing.
Time-of-flight (TOF) imaging techniques are used in many depth mapping systems (also referred to as 3D mapping or 3D imaging). In direct TOF techniques, a light source, such as a pulsed laser, directs pulses of optical radiation toward the scene that is to be mapped, and a high-speed detector senses the time of arrival of the radiation reflected from the scene. The depth value at each pixel in the depth map is derived from the difference between the emission time of the outgoing pulse and the arrival time of the reflected radiation from the corresponding point in the scene, which is referred to as the “time of flight” of the optical pulses.
Single-photon avalanche diodes (SPADs), also known as Geiger-mode avalanche photodiodes (GAPDs), are detectors capable of capturing individual photons with very high time-of-arrival resolution, of the order of a few tens of picoseconds. They may be fabricated in dedicated semiconductor processes or in standard CMOS technologies. Arrays of SPAD sensors, fabricated on a single chip, have been used experimentally in 3D imaging cameras.
U.S. Patent Application Publication 2017/0052065, whose disclosure is incorporated herein by reference, describes a sensing device that includes a first array of sensing elements, which output a signal indicative of a time of incidence of a single photon on the sensing element. A second array of processing circuits are coupled respectively to the sensing elements and comprise a gating generator, which variably sets a start time of the gating interval for each sensing element within each acquisition period, and a memory, which records the time of incidence of the single photon on each sensing element in each acquisition period. A controller controls the gating generator during a first sequence of the acquisition periods so as to sweep the gating interval over the acquisition periods and to identify a respective detection window for each sensing element, and, during a second sequence of the acquisition periods, to fix the gating interval for each sensing element to coincide with the respective detection window.
Embodiments of the present invention that are described hereinbelow provide improved apparatus and methods for optical sensing.
There is therefore provided, in accordance with an embodiment of the invention, optical sensing apparatus, including at least one semiconductor substrate and a first array of single-photon detectors, which are disposed on the at least one semiconductor substrate and are configured to output electrical pulses in response to photons that are incident thereon. A second array of counters are disposed on the at least one semiconductor substrate and are configured to count the electrical pulses output by the single-photon detectors. Routing and aggregation logic is configured, in response to a control signal, to connect the single-photon detectors to the counters in a first mode in which each of at least some of the counters aggregates and counts the electrical pulses output by a respective first group of one or more of the single-photon detectors, and in a second mode in which each of the at least some of the counters aggregates and counts the electrical pulses output by a respective second group of two or more of the single-photon detectors.
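By way of illustration only, the following Python sketch models the behavior of such routing and aggregation logic in software. The class name `RoutingAggregator`, the 4×4 array size, and the 2×2 grouping used for the second mode are assumptions made for this sketch and are not drawn from the disclosure.

```python
# Software model of the routing/aggregation logic (illustrative only, not the
# claimed circuit): electrical pulses from single-photon detectors are routed
# to counters either one-to-one (mode 1) or aggregated per 2x2 group (mode 2).
from collections import defaultdict

class RoutingAggregator:
    def __init__(self, rows=4, cols=4):
        self.rows, self.cols = rows, cols
        self.counters = defaultdict(int)  # counter index -> accumulated pulse count
        self.mode = 1

    def set_mode(self, mode):
        """Select mode 1 (one detector per counter) or mode 2 (2x2 aggregation)."""
        self.mode = mode
        self.counters.clear()

    def _counter_index(self, r, c):
        if self.mode == 1:
            return r * self.cols + c                   # each detector has its own counter
        return (r // 2) * (self.cols // 2) + (c // 2)  # four adjacent detectors share one

    def record_pulse(self, r, c):
        """Route one electrical pulse from detector (r, c) to its assigned counter."""
        self.counters[self._counter_index(r, c)] += 1

# The same detection events yield per-detector counts (a 2D intensity image) in
# mode 1 and aggregated super-pixel counts in mode 2.
events = [(0, 0), (0, 1), (1, 1), (2, 2), (3, 3)]
logic = RoutingAggregator()
for r, c in events:
    logic.record_pulse(r, c)
print("mode 1:", dict(logic.counters))   # {0: 1, 1: 1, 5: 1, 10: 1, 15: 1}
logic.set_mode(2)
for r, c in events:
    logic.record_pulse(r, c)
print("mode 2:", dict(logic.counters))   # {0: 3, 3: 2}
```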
In a disclosed embodiment, the single-photon detectors include single-photon avalanche diodes (SPADs).
In some embodiments, the control signal includes a gating signal, and the counters are configured to aggregate and count the electrical pulses over respective periods indicated by the gating signal. Typically, the gating signal causes different ones of the counters to aggregate and count the electrical pulses over different, respective gating intervals, so that the second array of counters outputs a histogram of the electrical pulses output by the single-photon detectors with bins defined responsively to the gating intervals. In disclosed embodiments, the apparatus includes a radiation source, which is configured to direct a series of optical pulses toward a target scene, and the single-photon detectors are configured to receive optical radiation that is reflected from the target scene, and the counters are configured to aggregate and count the electrical pulses while the gating intervals are synchronized with the optical pulses with a delay between the optical pulses and the gating intervals that is swept over a sequence of different delay times during the series of the optical pulses. In one embodiment, the counters are configured to aggregate and count the electrical pulses in first and second bins of the histogram while the gating intervals are swept over the sequence of different delay times, and the apparatus includes a processor, which is configured to compute a time of flight of the optical pulses by comparing respective first and second counts accumulated in the first and second bins.
Additionally or alternatively, in the first mode, each of the counters counts the electrical pulses that are output by a single, respective one of the single-photon detectors. In some embodiments, in the second mode, each of the at least some of the counters aggregates and counts the electrical pulses output by at least four of the single-photon detectors that are mutually adjacent in the first array. In disclosed embodiments, the control signal includes a gating signal, which causes the counters to aggregate and count the electrical pulses over respective gating intervals, and the apparatus includes a radiation source, which is configured to direct a series of optical pulses toward a target scene, wherein the single-photon detectors are configured to receive optical radiation that is reflected from the target scene, and a processor, which is configured to compute a time of flight of the optical pulses responsively to counts of the electrical pulses that are output by the counters over different gating intervals while operating in the second mode, and to apply the computed time of flight in setting a gating interval for the counters in the first mode. In one embodiment, the processor is configured to generate a three-dimensional (3D) map of the target scene responsively to the time of flight computed in the second mode, to identify an object of interest in the 3D map, and to set the gating interval for the counters in the first mode responsively to a depth of the object of interest in the 3D map so as to acquire a two-dimensional (2D) image of the object of interest.
There is also provided, in accordance with an embodiment of the invention, a method for optical sensing, which includes providing, on at least one semiconductor substrate, a first array of single-photon detectors, which are configured to output electrical pulses in response to photons that are incident on the single-photon detectors, and a second array of counters, which are configured to count the electrical pulses output by the single-photon detectors. In response to a control signal, the single-photon detectors are connected to the counters in a first mode in which each of at least some of the counters aggregates and counts the electrical pulses output by a respective first group of one or more of the single-photon detectors, and in a second mode in which each of the at least some of the counters aggregates and counts the electrical pulses output by a respective second group of two or more of the single-photon detectors.
There is additionally provided, in accordance with an embodiment of the invention, a method for optical sensing, which includes directing a series of optical pulses toward a target scene and imaging optical radiation that is reflected from the target scene onto an array of single-photon detectors, which output electrical pulses in response to photons that are incident thereon. The electrical pulses output by the single photon detectors are counted in multiple different gating intervals that are synchronized with each of the optical pulses, including at least first and second gating intervals at different, respective delays relative to the optical pulses, while the delays are swept over a sequence of different delay times during the series of the optical pulses. A time of flight of the optical pulses is computed by comparing respective first and second counts of the electrical pulses that were accumulated in the first and second gating intervals over the series of the optical pulses.
In some embodiments, counting the electrical pulses includes aggregating the pulses over groups of mutually-adjacent single-photon detectors in the array.
Additionally or alternatively, the first and second gating intervals are synchronized at respective first and second delays relative to the optical pulses, such that a difference between the first and second delays remains fixed while the first and second delays are swept over the sequence of different delay times during the series of the optical pulses. In one embodiment, the second gating interval begins upon termination of the first gating interval. Alternatively, an initial part of the second gating interval overlaps with the first gating interval. Additionally or alternatively, the first and second gating intervals have a common, predefined duration, and the sequence of the different delay times spans the predefined duration.
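To make the timing relationships concrete, the following Python sketch generates one possible gating schedule of this kind, in which the second interval begins upon termination of the first and their common delay is swept uniformly across the pulse series. The function name `gating_schedule` and the uniform sweep step are assumptions of this sketch, not details taken from the embodiments.

```python
# Illustrative gating schedule: two intervals of common duration T, the second
# beginning upon termination of the first, with their shared delay swept over
# the pulse series so that the sweep spans one interval duration T.
def gating_schedule(num_pulses, T, base_delay):
    """Yield (pulse index, first-gate start, second-gate start) for each emitted pulse."""
    for k in range(num_pulses):
        swept_delay = base_delay + T * k / num_pulses  # delay swept over [base, base + T)
        yield k, swept_delay, swept_delay + T          # difference between gates stays fixed at T

for k, g1, g2 in gating_schedule(num_pulses=4, T=1.0, base_delay=5.0):
    print(f"pulse {k}: gate 1 [{g1:.2f}, {g1 + 1.0:.2f}) ns, gate 2 [{g2:.2f}, {g2 + 1.0:.2f}) ns")
```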
In a disclosed embodiment, the gating intervals are selected responsively to a range of the target scene so that the photons in the series of the optical pulses that are reflected from the target scene are incident on the array of single-photon detectors only during the first and second gating intervals.
In some embodiments, the multiple different gating intervals include at least a third gating interval, such that the electrical pulses counted during the third gating interval are indicative of a background component of the optical radiation that is incident on the array of single-photon detectors, and computing the time of flight includes compensating for the background component in comparing the first and second counts. In one embodiment, the third gating interval is synchronized with the optical pulses so as to measure stray photons in the optical pulses that are incident on the array of single-photon detectors without having reflected from the target scene. Alternatively or additionally, the electrical pulses counted during the third gating interval are indicative of an ambient component of the optical radiation that is incident on the array of single-photon detectors. In a disclosed embodiment, computing the time of flight includes calculating a ratio of the first and second counts after subtraction of the background component counted during at least the third gating interval.
There is further provided, in accordance with an embodiment of the invention, apparatus for optical sensing, including a radiation source, which is configured to direct a series of optical pulses toward a target scene, and a first array of single-photon detectors, which are configured to receive optical radiation that is reflected from the target scene and to output electrical pulses in response to photons that are incident thereon. A second array of counters are configured to count the electrical pulses output by the single photon detectors in multiple different gating intervals that are synchronized with each of the optical pulses, including at least first and second gating intervals at different, respective delays relative to the optical pulses, while the delays are swept over a sequence of different delay times during the series of the optical pulses. A processor is configured to compute a time of flight of the optical pulses by comparing respective first and second counts of the electrical pulses that were accumulated in the first and second gating intervals over the series of the optical pulses.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
The speed and sensitivity of single-photon detectors, such as SPADs, make them a good choice for TOF imaging. SPAD arrays with integrated control logic and memory, such as those described above in the Background section, are starting to become commercially available. These integrated array devices, however, are still limited by the tradeoff of array pitch and power consumption against the spatial and depth resolution that they are capable of achieving.
Embodiments of the present invention that are described herein provide optical sensing apparatus and methods that address this tradeoff, and achieve more versatile SPAD array operation and more accurate depth mapping for a given array size and pitch.
Some embodiments provide optical sensing apparatus in which an array of single-photon detectors, such as SPADs, are disposed on a semiconductor substrate and output electrical pulses in response to incident photons. An array of counters, also disposed on the semiconductor substrate, count the electrical pulses output by the single-photon detectors. Routing and aggregation logic on the substrate is able to vary the configuration of the counters, relative to the detectors, in response to external control signals, and specifically to connect different groups of the single-photon detectors to different counters.
For example, in a first mode, each of the counters (or at least some of them) aggregates and counts the electrical pulses output by a respective first group of the single-photon detectors, which may even include only a single detector, meaning that each counter is connected to its own detector. In this mode it is also possible to create a two-dimensional (2D) image of a scene, in which the pixel values are given by the number of counts accumulated from each detector.
In a second mode, on the other hand, each of these counters aggregates and counts the electrical pulses output by a respective second group, which includes two or more of the detectors. Each counter can be gated to count the pulses it receives during a respective gating interval. In this manner, two or more counters with different gating intervals can be used together to construct a histogram of photon arrival times over the corresponding group of detectors. The gating intervals can be synchronized with optical pulses emitted by a radiation source in order to measure the times of flight of photons reflected from a target scene, and thus create a three-dimensional (3D) map of the scene.
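As a rough software analogy (not the disclosed circuit), the sketch below shows how counters gated on different intervals over a shared group of detectors amount to building a histogram of photon arrival times; the bin edges and arrival times are arbitrary example values chosen for this sketch.

```python
# Software analogy of gated counting over a super-pixel: each counter is "open"
# only during its own gating interval, so together the counters form a histogram
# of photon arrival times relative to the emitted pulse (all values illustrative).
def gated_histogram(arrival_times_ns, bin_edges_ns):
    """Count each arrival in the counter whose gating interval contains it."""
    counts = [0] * (len(bin_edges_ns) - 1)
    for t in arrival_times_ns:
        for i in range(len(counts)):
            if bin_edges_ns[i] <= t < bin_edges_ns[i + 1]:  # this counter's gate is open
                counts[i] += 1
                break
        # arrivals outside every gate (e.g., 9.7 ns below) are simply not counted
    return counts

# Arrival times aggregated over the detectors of one super-pixel, with four
# consecutive 2 ns gating intervals:
arrivals = [3.1, 3.4, 3.2, 5.0, 3.3, 9.7]
print(gated_histogram(arrivals, bin_edges_ns=[0.0, 2.0, 4.0, 6.0, 8.0]))  # -> [0, 4, 1, 0]
```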
If an object of interest (for example, a human face) is identified in such a 3D map, the gating interval for the counters in the first mode described above can then be set, relative to the optical pulses emitted toward the object, based on the depth of the object of interest in the 3D map. The detector array will thus acquire a 2D or 3D image of the object of interest with enhanced rejection of background radiation on account of the short, targeted gating interval that is applied.
In this sort of gated 3D acquisition, the gating intervals can be made shorter, within the range of interest, thus narrowing the histogram bins and enhancing the depth resolution of the apparatus. Yet another benefit of the range-gating capabilities of the apparatus is the elimination of interference due to multi-path reflections, which propagate over a longer range and thus will reach the detector after the gate has closed. (In the absence of range gating, both direct and multi-path reflections will be detected in the histogram.) When the range to the target scene is known, the intensity of the radiation source can also be controlled as a function of the range, to avoid saturation of the detectors at short range and to compensate for weaker signals at long range.
Other embodiments provide novel methods for TOF measurement using an array of single-photon detectors. These methods may be implemented advantageously using the aggregation and gated counting capabilities of the apparatus described above; but the methods may alternatively be performed using other sorts of single-photon detector arrays and gated counting logic.
In one of these embodiments, a series of optical pulses is directed toward a target scene, and optical radiation that is reflected from the target scene is imaged onto an array of single-photon detectors. Logic circuits associated with the array (such as the array of counters described above) count the electrical pulses output by the single photon detectors in multiple different gating intervals that are synchronized with each of the optical pulses, with each gating interval at a different, respective delay relative to the optical pulses. The delays of the gating intervals relative to the optical pulses are swept over a sequence of different delay times during the series of the optical pulses, and each counter accumulates the electrical pulses from the respective gating interval over the sequence of different delays. A processor computes the times of flight of the optical pulses simply by comparing the respective counts of the electrical pulses that were accumulated in two of the gating intervals over the series of the optical pulses.
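The sketch below simulates this acquisition for a single super-pixel under simple assumptions made for illustration only: two adjacent gates of equal duration, a noiseless echo at a fixed arrival time, and a uniform sweep of the delay. It shows how the split of the accumulated counts between the two gates shifts with the echo arrival time.

```python
# Simulation of the swept-gate acquisition for one super-pixel (illustrative
# assumptions: two adjacent gates of equal duration T, a noiseless echo at a
# fixed arrival time, and a uniform sweep of the delay over one gate duration).
def acquire(echo_time_ns, n_pulses, T, base_delay):
    """Accumulate echo detections into two adjacent gates while sweeping their delay."""
    count_first = count_second = 0
    for k in range(n_pulses):
        start = base_delay + T * k / n_pulses            # swept start of the first gate
        if start <= echo_time_ns < start + T:            # echo falls inside the first gate
            count_first += 1
        elif start + T <= echo_time_ns < start + 2 * T:  # ...or inside the second gate
            count_second += 1
    return count_first, count_second

# The split of the accumulated counts shifts with the echo arrival time:
for echo in (5.2, 5.5, 5.8):   # ns, with T = 1 ns and a 4 ns base delay
    print(echo, acquire(echo, n_pulses=1000, T=1.0, base_delay=4.0))
# -> 5.2 (799, 201), 5.5 (499, 501), 5.8 (199, 801)
```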
As will be explained further hereinbelow, this approach is able to achieve high resolution in time of flight using only a small number of different gating intervals, due to the modulation of the delays between the optical pulses and the gating intervals. In fact, only two such gating intervals are required, although additional gating intervals can advantageously be used in order to measure and subtract out background components of the optical radiation that is incident on the detector array, for example due to stray photons and ambient radiation, as well as to enhance the temporal resolution.
Illumination assembly 24 typically comprises at least one pulsed laser 28, which emits short pulses of light, with pulse duration in the picosecond to nanosecond range and high repetition frequency, for example 100 MHz or more. Collection optics 30 direct the light toward object 22. Alternatively, other source configurations, pulse durations and repetition frequencies may be used, depending on application requirements. For example, illumination assembly 24 may emit multiple pulsed beams of light along different, respective axes, so as to form a pattern of spots on object 22. In this case, although the spatial resolution of apparatus 20 in the transverse plane may be reduced, the depth resolution can be enhanced by concentrating the histogram capture and processing resources of imaging assembly 26 in the areas of the spots.
Imaging assembly 26 comprises objective optics 32, which image object 22 onto a sensing array 34, so that photons emitted by illumination assembly 24 and reflected from object 22 are incident on the sensing array. In the pictured embodiment, sensing array 34 comprises sensing circuits 36 and ancillary circuits 38. Sensing circuits 36 comprise an array of single-photon detectors 40, such as SPADs, each of which outputs electrical pulses indicative of the time of incidence of a single photon thereon. Ancillary circuits 38 comprise an array of processing circuits 42, which are coupled respectively to the detectors.
Circuits 36 and 38 are disposed on a semiconductor substrate, which may comprise a single chip or two or more separate chips, which are then coupled together, for example using chip stacking techniques that are known in the art. Circuits 36 and 38 may be formed on one or more silicon wafers using well-known CMOS fabrication processes, based on SPAD sensor designs that are known in the art, along with accompanying counters and logic as described hereinbelow. Alternatively, the designs and principles of detection that are described herein may be implemented, mutatis mutandis, using other materials and processes. All such alternative implementations are considered to be within the scope of the present invention.
Imaging device 20 is timed to capture TOF information continually over a series of image frames, for example at a rate of thirty frames/sec. In each frame, processing circuits 42 count photons that are incident on detectors 40 in one or more gating intervals and store the respective counts in histogram bins corresponding to the gating intervals. A system controller 44 reads out the individual counter values, computes the times of flight of the optical pulses responsively to the counter values, and generates an output depth map, comprising the measured TOF (or equivalently, the measured depth value) at each pixel. The depth map is typically conveyed to a receiving device 46, such as a display or a computer or other processor, which segments and extracts high-level information from the depth map. Controller 44 may also set imaging device 20 to capture two-dimensional images, as is described further hereinbelow.
System controller 44 typically comprises a programmable processor, such as a microprocessor or embedded microcontroller, which is programmed in software or firmware to carry out the functions that are described herein. This software or firmware may be stored in tangible, non-transitory computer-readable media, such as optical, magnetic, or electronic memory media.
Alternatively or additionally, at least some of the processing functions of controller 44 may be carried out by hard-wired or programmable digital logic circuits.
Reference is now made to
As shown in
Routing and control logic 52 also receives and decodes gating instructions from system controller 44, and accordingly outputs a gating signal to an AND gate 56 in each processing circuit 42. The gating signal controls the periods, i.e., the gating intervals, during which each counter 58 aggregates and counts the electrical pulses that are output from detectors 40 via multiplexer 54. System controller 44 typically synchronizes the gating intervals with the optical pulses emitted by illumination assembly 24. In a particular embodiment that is described below, with reference to
In some operating configurations, and particularly when operating in the second aggregation mode mentioned above, the gating signals cause different counters 58 in super-pixel 50 to aggregate and count the electrical pulses over different, respective gating intervals. As a result, the array of counters 58 will effectively output a histogram of the electrical pulses output by detectors 40 in super-pixel 50, with each bin of the histogram defined by a corresponding gating interval. Thus, in the present example, routing and control logic 52 may configure the histogram to have anywhere from two to sixteen bins. Alternatively, the gating signals may be set (particularly in the first aggregation mode) so that all of counters 58 share the same gating interval.
Based on the histogram generated by counters 58, system controller 44 computes the times of flight of the optical pulses that are emitted from illumination assembly 24 and reflected back to each super-pixel 50. The system controller combines the TOF readings from the various super-pixels in array 34 in order to generate a 3D map of the target scene. In one embodiment, the system controller identifies an object of interest in the 3D map, for example object 22 (
In the embodiment shown in
For example, the gating interval applied to gate 56′ can be synchronized with the optical pulses emitted by illumination assembly 24 so that counter 58′ counts stray photons in the optical pulses that are incident on super-pixel 50 without having reflected from the target scene (in some cases due to photons with very short times of flight as a result of internal reflections within apparatus 20). Alternatively or additionally, the gating interval applied to gate 56′ may be chosen to occur at a time during which optical pulses from illumination assembly 24 are not expected to reach sensing array 34, for example at a long delay after emission of the pulses. In this case, the electrical pulses counted by counter 58′ are indicative of the intensity of ambient optical radiation that is incident on super-pixel 50. Although only one background counter 58′ is shown in
Specifically,
Counters 58 are gated to count the electrical pulses output by the single photon detectors in multiple different gating intervals, which are synchronized with each of the optical pulses at different, respective delays relative to the optical pulses. The gating intervals are represented in
The delays of bins 76 and 78 relative to optical pulses 72 are not fixed, but rather are swept over a sequence of different delay times during series 70 of the optical pulses. (In
In the specific scheme that is shown in
Thus, as shown in
System controller 44 computes the TOF for super-pixel 50 by comparing the respective counts of the electrical pulses in bins 76 and 78 over series 70 of optical pulses 72. Specifically, in the present case, the system controller subtracts the ambient count in bin 80 from the counts in both of bins 76 and 78, and also subtracts the stray count in bin 82 from the count in bin 76, thus canceling out the background effects of ambient light and stray reflections. After subtracting these background components, system controller 44 computes the ratio R of the remainder of the count in bin 78 to the remainder of the count in bin 76. The TOF is proportional to the ratio, i.e., TOF=R*T. This approach enables system controller 44 to find depth coordinates with high resolution, even using only two bins for count accumulation.
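The following lines transcribe this computation directly; the counts are hypothetical, and T is taken here to be the common duration of the gating intervals.

```python
# Direct transcription of the computation described above (hypothetical counts;
# T is taken to be the common gating-interval duration).
def compute_tof(count_76, count_78, ambient_80, stray_82, T):
    """TOF from the two signal bins after subtracting the background bins."""
    net_76 = count_76 - ambient_80 - stray_82  # bin 76 also collects stray reflections
    net_78 = count_78 - ambient_80             # bin 78 is corrected for ambient light only
    R = net_78 / net_76                        # ratio of the background-corrected counts
    return R * T                               # TOF = R * T, relative to the bin timing

# Example with counts accumulated over a series of pulses (T = 2 ns):
print(compute_tof(count_76=900, count_78=400, ambient_80=100, stray_82=50, T=2e-9))
# -> approximately 8e-10 s, i.e. 0.8 ns
```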
In the embodiment of
In the pictured example, the range of the target scene is such that echoes 74 are divided between bin 3 and bin 4, and the ratio of the counts in these bins (after subtraction of the ambient background) gives the TOF relative to the start time of bin 3.
The overlap in this embodiment reflects the fact that optical pulses 72 have a finite width. The overlap between the gating intervals of successive bins is typically on the order of the pulse width. In this case, the simple ratio formula presented above is not strictly accurate. The precise relation between the numbers of counts in the bins and the corresponding time of flight can be estimated, for example, using a maximum likelihood analysis or a suitably trained neural network. System controller 44 compares the counts in the various bins 92 using this relation, and thus computes the TOF for each super-pixel 50.
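By way of illustration, one form such an estimate could take, offered as a sketch rather than as the disclosed method, is a maximum-likelihood grid search over candidate TOF values under an assumed Gaussian echo profile and Poisson counting statistics; the gate layout and all numerical values below are likewise assumptions.

```python
# Illustrative maximum-likelihood TOF estimate for overlapping gates. The Gaussian
# echo profile, Poisson statistics, grid search, and all numeric values are
# assumptions for this sketch, not details taken from the description above.
import math

def gate_fraction(tof, gate_start, gate_end, sigma):
    """Expected fraction of a Gaussian echo (center tof, width sigma) falling in one gate."""
    def cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return cdf((gate_end - tof) / sigma) - cdf((gate_start - tof) / sigma)

def ml_tof(counts, gates, sigma, n_pulses, grid):
    """Return the candidate TOF that maximizes a Poisson log-likelihood of the bin counts."""
    best_tof, best_ll = None, -math.inf
    for tof in grid:
        ll = 0.0
        for count, (g0, g1) in zip(counts, gates):
            lam = max(n_pulses * gate_fraction(tof, g0, g1, sigma), 1e-12)
            ll += count * math.log(lam) - lam   # Poisson log-likelihood, constant term dropped
        if ll > best_ll:
            best_tof, best_ll = tof, ll
    return best_tof

# Six overlapping 2 ns gates spaced 1.5 ns apart; synthesize counts for an echo
# centered at 6.2 ns and check that the grid search recovers a value close to it.
gates = [(1.5 * i, 1.5 * i + 2.0) for i in range(6)]
counts = [round(320 * gate_fraction(6.2, g0, g1, 0.4)) for g0, g1 in gates]
grid = [0.01 * i for i in range(900)]           # candidate TOF values, 0 to 8.99 ns
print(counts, ml_tof(counts, gates, sigma=0.4, n_pulses=320, grid=grid))
```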
It will be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application claims the benefit of U.S. Provisional Patent Application 62/942,761, filed Dec. 3, 2019, which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4623237 | Kaneda et al. | Nov 1986 | A |
4757200 | Shepherd | Jul 1988 | A |
5164823 | Keeler | Nov 1992 | A |
5270780 | Moran et al. | Dec 1993 | A |
5373148 | Dvorkis et al. | Dec 1994 | A |
5699149 | Kuroda et al. | Dec 1997 | A |
6301003 | Shirai et al. | Oct 2001 | B1 |
6384903 | Fuller | May 2002 | B1 |
6710859 | Shirai et al. | Mar 2004 | B2 |
7126218 | Darveaux et al. | Oct 2006 | B1 |
7193690 | Ossig et al. | Mar 2007 | B2 |
7303005 | Reis et al. | Dec 2007 | B2 |
7405812 | Bamji | Jul 2008 | B1 |
7508496 | Mettenleiter et al. | Mar 2009 | B2 |
7800067 | Rajavel et al. | Sep 2010 | B1 |
7800739 | Rohner et al. | Sep 2010 | B2 |
7812301 | Oike et al. | Oct 2010 | B2 |
7969558 | Hall | Jun 2011 | B2 |
8193482 | Itzler | Jun 2012 | B2 |
8259293 | Andreou | Sep 2012 | B2 |
8275270 | Shushakov et al. | Sep 2012 | B2 |
8279418 | Yee et al. | Oct 2012 | B2 |
8355117 | Niclass | Jan 2013 | B2 |
8405020 | Menge | Mar 2013 | B2 |
8594425 | Gurman et al. | Nov 2013 | B2 |
8675181 | Hall | Mar 2014 | B2 |
8736818 | Weimer et al. | May 2014 | B2 |
8766164 | Sanfilippo et al. | Jul 2014 | B2 |
8766808 | Hogasten | Jul 2014 | B2 |
8891068 | Eisele et al. | Nov 2014 | B2 |
8925814 | Schneider et al. | Jan 2015 | B1 |
8963069 | Drader et al. | Feb 2015 | B2 |
9002511 | Hickerson et al. | Apr 2015 | B1 |
9024246 | Jiang et al. | May 2015 | B2 |
9052356 | Chu et al. | Jun 2015 | B2 |
9076707 | Harmon | Jul 2015 | B2 |
9016849 | Duggal et al. | Aug 2015 | B2 |
9267787 | Shpunt et al. | Feb 2016 | B2 |
9335220 | Shpunt et al. | May 2016 | B2 |
9354332 | Zwaans et al. | May 2016 | B2 |
9465111 | Wilks et al. | Oct 2016 | B2 |
9516248 | Cohen et al. | Dec 2016 | B2 |
9709678 | Matsuura | Jul 2017 | B2 |
9736459 | Mor et al. | Aug 2017 | B2 |
9739881 | Pavek et al. | Aug 2017 | B1 |
9761049 | Naegle et al. | Sep 2017 | B2 |
9786701 | Mellot et al. | Oct 2017 | B2 |
9810777 | Williams et al. | Nov 2017 | B2 |
9874635 | Eichenholz et al. | Jan 2018 | B1 |
10063844 | Adam et al. | Aug 2018 | B2 |
10067224 | Moore et al. | Sep 2018 | B2 |
10132616 | Wang | Nov 2018 | B2 |
10215857 | Oggier et al. | Feb 2019 | B2 |
10269104 | Hannuksela et al. | Apr 2019 | B2 |
10386487 | Wilton et al. | Aug 2019 | B1 |
10424683 | Do Valle et al. | Sep 2019 | B1 |
10613203 | Rekow et al. | Apr 2020 | B1 |
10782393 | Dussan et al. | Sep 2020 | B2 |
10955552 | Fine | Mar 2021 | B2 |
11555900 | Barak | Jan 2023 | B1 |
20010020673 | Zappa et al. | Sep 2001 | A1 |
20020071126 | Shirai et al. | Jun 2002 | A1 |
20020131035 | Watanabe et al. | Sep 2002 | A1 |
20020154054 | Small | Oct 2002 | A1 |
20020186362 | Shirai et al. | Dec 2002 | A1 |
20040051859 | Flockencier | Mar 2004 | A1 |
20040135992 | Munro | Jul 2004 | A1 |
20040212863 | Schanz et al. | Oct 2004 | A1 |
20060044546 | Lewin et al. | Mar 2006 | A1 |
20060106317 | McConnell et al. | May 2006 | A1 |
20070145136 | Wiklof et al. | Jun 2007 | A1 |
20090009747 | Wolf et al. | Jan 2009 | A1 |
20090262760 | Krupkin et al. | Oct 2009 | A1 |
20090273770 | Bauhahn et al. | Nov 2009 | A1 |
20090275841 | Melendez et al. | Nov 2009 | A1 |
20100019128 | Itzler | Jan 2010 | A1 |
20100045965 | Meneely | Feb 2010 | A1 |
20100096459 | Gurevich | Apr 2010 | A1 |
20100121577 | Zhang et al. | May 2010 | A1 |
20100250189 | Brown | Sep 2010 | A1 |
20100286516 | Fan et al. | Nov 2010 | A1 |
20110006190 | Alameh et al. | Jan 2011 | A1 |
20110128524 | Vert et al. | Jun 2011 | A1 |
20110181864 | Schmitt et al. | Jul 2011 | A1 |
20120038904 | Fossum et al. | Feb 2012 | A1 |
20120075615 | Niclass et al. | Mar 2012 | A1 |
20120132636 | Moore | May 2012 | A1 |
20120153120 | Baxter | Jun 2012 | A1 |
20120154542 | Katz et al. | Jun 2012 | A1 |
20120176476 | Schmidt et al. | Jul 2012 | A1 |
20120249998 | Eisele et al. | Oct 2012 | A1 |
20120287242 | Gilboa et al. | Nov 2012 | A1 |
20120294422 | Cheung et al. | Nov 2012 | A1 |
20130015331 | Birk et al. | Jan 2013 | A1 |
20130079639 | Hoctor et al. | Mar 2013 | A1 |
20130092846 | Henning et al. | Apr 2013 | A1 |
20130107016 | Federspiel | May 2013 | A1 |
20130208258 | Eisele et al. | Aug 2013 | A1 |
20130236171 | Saunders | Sep 2013 | A1 |
20130258099 | Ovsiannikov et al. | Oct 2013 | A1 |
20130278917 | Korekado et al. | Oct 2013 | A1 |
20130300838 | Borowski | Nov 2013 | A1 |
20130342835 | Blacksberg | Dec 2013 | A1 |
20140027606 | Raynor et al. | Jan 2014 | A1 |
20140071433 | Eisele et al. | Mar 2014 | A1 |
20140077086 | Batkilin et al. | Mar 2014 | A1 |
20140078491 | Eisele et al. | Mar 2014 | A1 |
20140162714 | Kim et al. | Jun 2014 | A1 |
20140191115 | Webster et al. | Jul 2014 | A1 |
20140198198 | Geissbuehler et al. | Jul 2014 | A1 |
20140231630 | Rae et al. | Aug 2014 | A1 |
20140240317 | Go et al. | Aug 2014 | A1 |
20140240691 | Mheen et al. | Aug 2014 | A1 |
20140268127 | Day | Sep 2014 | A1 |
20140300907 | Kimmel | Oct 2014 | A1 |
20140321862 | Frohlich et al. | Oct 2014 | A1 |
20140353471 | Raynor et al. | Dec 2014 | A1 |
20150041625 | Dutton et al. | Feb 2015 | A1 |
20150062558 | Koppal et al. | Mar 2015 | A1 |
20150131080 | Retterath et al. | May 2015 | A1 |
20150163429 | Dai et al. | Jun 2015 | A1 |
20150192676 | Kotelnikov et al. | Jul 2015 | A1 |
20150200222 | Webster | Jul 2015 | A1 |
20150200314 | Webster | Jul 2015 | A1 |
20150260830 | Gosh et al. | Sep 2015 | A1 |
20150285625 | Deane et al. | Oct 2015 | A1 |
20150362585 | Gosh et al. | Dec 2015 | A1 |
20150373322 | Goma et al. | Dec 2015 | A1 |
20160003944 | Schmidtke et al. | Jan 2016 | A1 |
20160041266 | Smits | Feb 2016 | A1 |
20160072258 | Seurin et al. | Mar 2016 | A1 |
20160080709 | Viswanathan et al. | Mar 2016 | A1 |
20160259038 | Retterath et al. | Sep 2016 | A1 |
20160259057 | Ito | Sep 2016 | A1 |
20160274222 | Yeun | Sep 2016 | A1 |
20160334508 | Hall et al. | Nov 2016 | A1 |
20160344965 | Grauer | Nov 2016 | A1 |
20170006278 | Vandame et al. | Jan 2017 | A1 |
20170038459 | Kubacki et al. | Feb 2017 | A1 |
20170052065 | Sharma et al. | Feb 2017 | A1 |
20170067734 | Heidemann et al. | Mar 2017 | A1 |
20170131388 | Campbell et al. | May 2017 | A1 |
20170131718 | Matsumura et al. | May 2017 | A1 |
20170139041 | Drader et al. | May 2017 | A1 |
20170176577 | Halliday | Jun 2017 | A1 |
20170176579 | Niclass et al. | Jun 2017 | A1 |
20170179173 | Mandai et al. | Jun 2017 | A1 |
20170184450 | Doylend et al. | Jun 2017 | A1 |
20170184704 | Yang et al. | Jun 2017 | A1 |
20170184709 | Kenzler et al. | Jun 2017 | A1 |
20170188016 | Hudman | Jun 2017 | A1 |
20170219695 | Hall et al. | Aug 2017 | A1 |
20170242102 | Dussan et al. | Aug 2017 | A1 |
20170242108 | Dussan et al. | Aug 2017 | A1 |
20170257617 | Retterath | Sep 2017 | A1 |
20170269209 | Hall et al. | Sep 2017 | A1 |
20170303789 | Tichauer et al. | Oct 2017 | A1 |
20170329010 | Warke et al. | Nov 2017 | A1 |
20170343675 | Oggier et al. | Nov 2017 | A1 |
20170356796 | Nishio | Dec 2017 | A1 |
20170356981 | Yang et al. | Dec 2017 | A1 |
20180045816 | Jarosinski et al. | Feb 2018 | A1 |
20180059220 | Irish et al. | Mar 2018 | A1 |
20180062345 | Bills et al. | Mar 2018 | A1 |
20180081032 | Torruellas et al. | Mar 2018 | A1 |
20180081041 | Niclass et al. | Mar 2018 | A1 |
20180115762 | Bulteel et al. | Apr 2018 | A1 |
20180131449 | Kare et al. | May 2018 | A1 |
20180167602 | Pacala et al. | Jun 2018 | A1 |
20180203247 | Chen et al. | Jul 2018 | A1 |
20180205943 | Trail | Jul 2018 | A1 |
20180209846 | Mandai et al. | Jul 2018 | A1 |
20180259645 | Shu et al. | Sep 2018 | A1 |
20180299554 | Van Dyck et al. | Oct 2018 | A1 |
20180341009 | Niclass et al. | Nov 2018 | A1 |
20190004156 | Niclass et al. | Jan 2019 | A1 |
20190011556 | Pacala et al. | Jan 2019 | A1 |
20190011567 | Pacala et al. | Jan 2019 | A1 |
20190018117 | Perenzoni et al. | Jan 2019 | A1 |
20190018118 | Perenzoni et al. | Jan 2019 | A1 |
20190018119 | Laifenfeld et al. | Jan 2019 | A1 |
20190018143 | Thayer et al. | Jan 2019 | A1 |
20190037120 | Ohki | Jan 2019 | A1 |
20190056497 | Pacala et al. | Feb 2019 | A1 |
20190094364 | Fine et al. | Mar 2019 | A1 |
20190170855 | Keller et al. | Jun 2019 | A1 |
20190178995 | Tsai et al. | Jun 2019 | A1 |
20190257950 | Patanwala et al. | Aug 2019 | A1 |
20190277952 | Beuschel et al. | Sep 2019 | A1 |
20190361404 | Mautner et al. | Nov 2019 | A1 |
20200142033 | Shand | May 2020 | A1 |
20200233068 | Henderson et al. | Jul 2020 | A1 |
20200256669 | Roth et al. | Aug 2020 | A1 |
20200256993 | Oggier | Aug 2020 | A1 |
20200309955 | Laflaqueire et al. | Oct 2020 | A1 |
20200314294 | Schoenlieb et al. | Oct 2020 | A1 |
20200386890 | Oggier et al. | Oct 2020 | A1 |
20220244391 | Mandai | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
2605339 | Oct 1994 | CA |
201054040 | Apr 2008 | CN |
103763485 | Apr 2014 | CN |
104730535 | Jun 2015 | CN |
104914446 | Sep 2015 | CN |
105992960 | Oct 2016 | CN |
106405572 | Feb 2017 | CN |
110609293 | Dec 2019 | CN |
202013101039 | Mar 2014 | DE |
2157445 | Feb 2010 | EP |
2322953 | May 2011 | EP |
2469297 | Jun 2012 | EP |
2477043 | Jul 2012 | EP |
2827175 | Jan 2015 | EP |
3285087 | Feb 2018 | EP |
3318895 | May 2018 | EP |
3521856 | Aug 2019 | EP |
H02287113 | Nov 1990 | JP |
H0567195 | Mar 1993 | JP |
09197045 | Jul 1997 | JP |
H10170637 | Jun 1998 | JP |
H11063920 | Mar 1999 | JP |
2011089874 | May 2011 | JP |
2011237215 | Nov 2011 | JP |
2013113669 | Jun 2013 | JP |
2014059301 | Apr 2014 | JP |
101318951 | Oct 2013 | KR |
9008946 | Aug 1990 | WO |
2010149593 | Dec 2010 | WO |
2012154356 | Nov 2012 | WO |
2013028691 | Feb 2013 | WO |
2015199615 | Dec 2015 | WO |
2017106875 | Jun 2017 | WO |
2018122560 | Jul 2018 | WO |
2020101576 | May 2020 | WO |
2020109378 | Jun 2020 | WO |
2020201452 | Oct 2020 | WO |
Entry |
---|
CN Application # 201810571820.4 Office Action dated Sep. 9, 2022. |
KR Application # 1020220101419 Office Action dated Sep. 28, 2022. |
U.S. Appl. No. 17/026,365 Office Action dated Nov. 7, 2022. |
U.S. Appl. No. 16/532,513 Office Action dated Nov. 23, 2022. |
Charbon et al., “SPAD-Based Sensors”, TOF Range-Imaging Cameras, Springer-Verlag, pp. 11-38, year 2013. |
Niclass et al., “A 0.18 um CMOS SoC for a 100m range, 10 fps 200×96 pixel Time of Flight depth sensor”, IEEE International Solid-State Circuits Conference—(ISSCC), Session 27, Image Sensors, 27.6, pp. 488-490, Feb. 20, 2013. |
Walker et al., “A 128×96 pixel event-driven phase-domain ΔΣ-based fully digital 3D camera in 0.13μm CMOS imaging technology”, IEEE International Solid-State Circuits Conference—(ISSCC), Session 23, Image Sensors, 23.6, pp. 410-412, Feb. 23, 2011. |
Niclass et al., “Design and characterization of a 256×64-pixel single-photon imager in CMOS for a MEMS-based laser scanning time-of-flight sensor”, Optics Express, vol. 20, issue 11, pp. 11863-11881, May 21, 2012. |
Kota et al., “System Design and Performance Characterization of a MEMS-Based Laser Scanning Time-of-Flight Sensor Based on a 256×64-pixel Single-Photon Imager”, IEEE Photonics Journal, vol. 5, issue 2, pp. 1-15, Apr. 2013. |
Webster et al., “A silicon photomultiplier with >30% detection efficiency from 450-750nm and 11.6μm pitch NMOS-only pixel with 21.6% fill factor in 130nm CMOS”, Proceedings of the European Solid-State Device Research Conference (ESSDERC), pp. 238-241, Sep. 7-21, 2012. |
Bradski et al., “Learning OpenCV”, first edition, pp. 1-50, O'Reilly Media, Inc, California, USA, year 2008. |
Buttgen et al., “Pseudonoise Optical Modulation for Real-Time 3-D Imaging With Minimum Interference”, IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 54, Issue10, pp. 2109-2119, Oct. 1, 2007. |
Morbi et al., “Short range spectral lidar using mid-infrared semiconductor laser with code-division multiplexing technique”, Technical Digest, CLEO 2001, pp. 491-492, May 2001. |
Ai et al., “High-resolution random-modulation cw lidar”, Applied Optics, vol. 50, issue 22, pp. 4478-4488, Jul. 28, 2011. |
Chung et al., “Optical orthogonal codes: design, analysis and applications”, IEEE Transactions on Information Theory, vol. 35, issue 3, pp. 595-604, May 1989. |
Lin et al., “Chaotic lidar”, IEEE Journal of Selected Topics in Quantum Electronics, vol. 10, issue 5, pp. 991-997, Sep.-Oct. 2004. |
EP Application # 20177707.5 Search Report dated Nov. 12, 2020. |
U.S. Appl. No. 16/532,517 Office Action dated Oct. 14, 2020. |
IN Application # 202117029897 Office Action dated Mar. 10, 2022. |
IN Application # 202117028974 Office Action dated Mar. 2, 2022. |
International Application # PCT/US2020/058760 Search Report dated Feb. 9, 2021. |
TW Application # 109119267 Office Action dated Mar. 10, 2021. |
U.S. Appl. No. 16/752,653 Office action dated Apr. 5, 2021. |
U.S. Appl. No. 16/679,360 Office Action dated Jun. 29, 2022. |
EP Application # 22167103.5 Search Report dated Jul. 11, 2022. |
CN Application # 201780058088.4 Office Action dated Aug. 23, 2022. |
U.S. Appl. No. 16/885,316 Office Action dated Jun. 30, 2022. |
U.S. Appl. No. 16/532,513 Office Action dated Aug. 4, 2022. |
CN Application # 201680074428.8 Office Action dated Jun. 23, 2021. |
Zhu Jian, “Research of Simulation of Super-Resolution Reconstruction of Infrared Image”, abstract page, Master's Thesis, p. 1, Nov. 15, 2005. |
U.S. Appl. No. 16/752,653 Office Action dated Oct. 1, 2021. |
EP Application # 17737420.4 Office Action dated Oct. 28, 2021. |
KR Application # 1020200068248 Office Action dated Nov. 12, 2021. |
KR Application # 1020207015906 Office Action dated Oct. 13, 2021. |
JP Application # 2020001203 Office Action dated Feb. 4, 2021. |
U.S. Appl. No. 16/752,653 Office Action dated Feb. 4, 2021. |
U.S. Appl. No. 17/026,365 Office Action dated Jan. 26, 2023. |
CN Application # 201780097602.5 Office Action dated Mar. 15, 2023. |
CN Application # 202010063812.6 Office Action dated Mar. 18, 2023. |
KR Application # 1020217025136 Office Action dated Apr. 4, 2023. |
Number | Date | Country |
---|---|---|
20210165083 A1 | Jun 2021 | US |
Number | Date | Country |
---|---|---|
62942761 | Dec 2019 | US |