Embodiments relate generally to cameras and sensors that provide measurements of distances to points in an image of a scene. More particularly, embodiments relate to an active light source sensor system whereby images of scenes and objects are acquired within a broad range of ambient lighting conditions.
Three-dimensional (3D) photonic imaging systems, also referred to as 3D cameras, are capable of providing distance measurements and photonic measurements for physical objects in a scene. Applications for such 3D cameras include industrial inspection, selective robot vision, 3D modeling, geographic surveying, and forensic analysis. 3D cameras can be implemented with a variety of technologies, with each combination of technologies presenting certain limitations that render the cameras ineffective in broad-use applications. Stereo vision 3D cameras implement two or more imaging arrays in a fixed, highly calibrated configuration and utilize triangulation of common points within the fields of view to establish distances to each of the common points. Stereo vision systems suffer from image distance inaccuracies due to occlusion and parallax. Furthermore, distance accuracy suffers when the baseline distance between image arrays is small relative to the distances being measured. Lastly, stereo 3D cameras are expensive due to the need for multiple image arrays and the requirement for high precision in the baseline offset between the image arrays.
Time-of-flight (TOF) systems utilize light sources, such as lasers, that are pulsed or modulated so they provide pulses of light for illuminating scenes in conjunction with an imaging system for measuring the amplitude and timing of the light reflected from the objects in the scene. Distances to points in the scene are determined using the known speed of light for all of the reflected signals. The imaging systems for TOF devices comprise a camera with a photodetector array, typically fabricated using CCD or CMOS technology, and a method for rapidly gating the collection times for the photodetector elements. Reflected light is captured by the photodetector elements during the specific gating cycles.
Some TOF systems only utilize the timing between light pulses and gated photodetectors to determine 3D object distances. Other TOF systems utilize the amount of received light during a gated capture cycle to establish object distances. The accuracy of these systems depends on the uniformity of incident light and the speed of the gating mechanism for the photodetectors.
Utilizing gated photodetectors is an effective method for establishing distances to objects in a scene. By precisely controlling the timing between incident light pulses and gated photodetectors, the distances to objects in certain distance bands can be accurately determined. For establishing object distances in other distance bands, subsequent light and gated photodetector cycles are utilized while the objects and the camera remain stationary in their present configurations and orientations. Any movement of the camera and/or objects in the scene will result in distance measurement bands that are not registered with one another.
A 3D camera described in U.S. Pat. No. 4,935,616 utilizes a modulated source and imaging system. A preferred embodiment of this system uses a continuous-wave (CW) laser and utilizes the phase difference between the incident and reflected signals to establish the distances to objects.
Another 3D camera is described in U.S. Pat. No. 5,081,530. This system utilizes a pair of gates for each photodetector element. Distances to objects are determined from the ratio of differences between the sampled energy at the two gated elements.
U.S. Pat. Nos. 7,362,419 and 7,755,743 each utilize modulated light intensity sources and phase ranges to detect phase shifts between emitted and detected signals. An embodiment of U.S. Pat. No. 8,159,598 utilizes modulated light intensity and phase shift detection for time of flight determination. Other embodiments of U.S. Pat. No. 8,159,598 utilize a high resolution color path with a low resolution distance path to determine 3D information for a detector or a group of detectors.
U.S. Pat. No. 8,102,426 to Yahav describes 3D vision on a chip and utilizes an array of photodetector elements that are gated at operative times to establish object distances in a scene. Photodetector sites are utilized either for TOF distance measurement or for the determination of object color. Embodiments of Yahav describe utilizing groups or bands of photodetector elements to establish the various distance bands. Other embodiments of Yahav describe a single distance band for each capture cycle, with full scene distances established utilizing sequences of capture cycles. Although not specified in Yahav, the embodiments require that neither the camera nor the objects in the scene move throughout the sequence of capture cycles.
For real-world applications like autonomous vehicle navigation, mobile mapping, agriculture, mining, and surveillance, it is not practical to require little or no movement between a 3D camera and objects in a scene during a sequence of imaging cycles. Furthermore, most real-world situations occur in scenes that have widely varying ambient light conditions. Geiger mode avalanche photo diodes are solid-state photodetectors that are able to detect single photons. Such Geiger mode avalanche photo diodes are also referred to as single-photon avalanche diodes (SPADs). Arrays of SPADs can be used as a single detector element in an active sensing system, but camera/sensor systems based on SPAD arrays have at least two shortcomings due to ambient light. First, solar background light can hamper the ability to accurately determine depth. Second, ambient light limits reflectivity precision because of the challenge of differentiating reflected light from ambient light. It is desirable to have a 3D camera/sensor with one or more SPAD arrays that addresses these shortcomings.
In embodiments, an active sensor system is configured to generate a lighting-invariant image of a scene utilizing at least one emitter configured to emit a set of active light pulses toward the scene. A focal plane array of Geiger mode avalanche photo diode detectors is configured to receive light for a field of view that includes at least a portion of the scene, wherein each detector is biased to operate as a single-photon avalanche diode (SPAD) detector in an array of SPAD detectors. Control circuitry is operably coupled to the at least one emitter and the array of SPAD detectors and is configured to emit the set of light pulses, to capture a set of intensity values for at least three successive distance range bands, and to store the set of captured intensity values in a set of frame buffers. A processing system is operably coupled to the control circuitry and the set of frame buffers to generate the lighting-invariant image of the scene. In embodiments, the processing system is configured to analyze the at least three frames and determine a minimum intensity value due to ambient light, a maximum intensity value, and a frame of the at least three successive frames at which the maximum intensity value occurs, determine a depth based on the frame at which the maximum intensity value occurs, determine a reflectivity value based on the difference between the maximum intensity value and the minimum intensity value, and generate the lighting-invariant depth map of the scene based on the depths and the reflectivity values.
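A minimal sketch of this per-pixel analysis, assuming the captured intensity values are available as a K×M×N array and each of the K range bands maps to a nominal depth (the function and variable names are illustrative, not taken from the specification):

```python
import numpy as np

def lighting_invariant_image(frames, band_depths):
    # frames: (K, M, N) intensity values for K >= 3 successive range bands
    # band_depths: length-K nominal depth for each range band
    imin = frames.min(axis=0)        # minimum intensity: ambient-light level
    imax = frames.max(axis=0)        # maximum intensity: band with the pulse
    kmax = frames.argmax(axis=0)     # frame index where the maximum occurs

    depth = np.asarray(band_depths)[kmax]   # depth from the band index
    reflectivity = imax - imin              # ambient-subtracted intensity
    return depth, reflectivity
```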
In embodiments, the incident light is full-spectrum visible light in the energy band from roughly 400 nanometers to 700 nanometers. The photodetector sites are sensitive to radiation in this wavelength band. In embodiments, the photodetectors utilize a bandpass filter to reduce or eliminate radiation outside the desired energy band. The bandpass filter(s) can be applied as a global array filter in the case of an IR-eliminating filter, or as an individual filter for each photodetector in the case of a Bayer pattern for establishing RGB elements in the array.
In some embodiments, a photodetector element utilizes a photodiode coupled to a photodetector integration element whereby the current from the photodiode produces a charge that is collected or integrated during the gating cycle of the photodetector. The photodetector integration stage is emptied by rapidly transferring the integrated charge to the next processing element in the system, thus allowing the photodetector stage to begin the integration for the subsequent photodetector integration cycle.
In embodiments, each photodetector site in the array is connected to a dedicated charge transfer stage wherein each of the K stages facilitates the rapid transfer of charges from the photodetector site. Utilizing K charge transfer stages per photodetector allows for up to K gated emitter/detector cycles per imaging cycle.
In embodiments, the detector array and charge transfer sites are fabricated together on a focal plane array along with gating circuitry and charge transfer control circuitry. The number of photodetector sites will be sufficiently large that the focal plane array's interface to the downstream camera circuitry will have a lower throughput rate than that of the higher-speed charge transfer array.
In embodiments, the detector array is fabricated on a focal plane array along with gating circuitry. The signal interface for the focal plane array is sufficiently fast that the integrated charges can be transferred from the integration sites directly to a 4D frame buffer without the need for a charge transfer array.
In some embodiments, the 4D camera light source comprises one or more elements, such as LEDs, that provide uniform intensity throughout the desired frequency range. In other embodiments, the light source is a combination of light elements, such as LEDs, wherein the frequency responses of the separate light elements combine to form an output signal that is uniform throughout the desired frequency range.
In embodiments, the camera detects environmental conditions that attenuate emitted and reflected signals and utilizes detected information from environmental and ambient signals to establish non-attenuated signal strength, object distances and object color.
In embodiments, the light intensity is non-uniform throughout the frequency range. The non-uniform light is characterized to establish parameters for use in color correction applied during image post-processing. In embodiments the light intensity is spatially non-uniform throughout the field of view. The non-uniform spatial intensity is mapped to allow for scene intensity and color adjustments during image post-processing.
In some embodiments, the light energy is emitted and received as common laser wavelengths of 650 nm, 905 nm, or 1550 nm. In some embodiments, the light energy can be in the wavelength ranges of ultraviolet (UV)—100-400 nm, visible—400-700 nm, near infrared (NIR)—700-1400 nm, infrared (IR)—1400-8000 nm, long-wavelength IR (LWIR)—8-15 um, far IR (FIR)—15-1000 um, or terahertz—0.1-1 mm.
For a detector array 50 with an in-focus lens 52, the individual fields of view 62 corresponding to each detector 58 should align precisely with the fields of view of neighboring detectors. In practice, a lens 52 will almost never be perfectly in focus; thus, the fields of view 62 of the detectors 58 in a lensed system will typically overlap, though the field of view of each detector 58 remains different from that of every other detector 58 in the detector array 50. Detector arrays 50 may not have optimal density in their configuration due to semiconductor layout limitations, substrate heat considerations, electrical crosstalk avoidance, or other layout, manufacturing, or yield constraints. As such, sparse detector arrays 50 may experience a loss in photon detection efficiency within the device field of view 56 due to reflected photons striking the unutilized spaces between successive detector elements 58.
For non-lensed systems the field of view 62 of each detector 58 can be determined by a diffraction grating, an interferometer, a waveguide, a 2D mask, a 3D mask, or a variety of other aperture configurations designed to allow light within a specific field of view. These individual detector apertures will typically have overlapping fields of view 62 within the device field of view 56.
An element of various embodiments is the determination of an angle 60 for each detector 58.
Variations will occur in the fabrication of detector arrays 50 used in 4D cameras. In single-lens 52 detector array devices like that shown in
Due to the importance of accurate determination of the optical path, in situ calibration may be desirable for devices according to various embodiments. As an example, a 4D camera device according to an embodiment may be used as a sensor in an autonomous vehicle. In order to protect the device, it may be mounted inside a passenger vehicle, affixed to the windshield behind the rear-view mirror. Since the device faces the front of the vehicle, emitted light and reflected light will pass through the windshield on their way to and from external objects. Both components of light will undergo distortion when passing through the windshield due to reflection, refraction, and attenuation. In situ calibration for this autonomous vehicle 4D camera may include the device emitting pre-determined calibration patterns and measuring the intensity, location, and angle of the reflected signals. Device characterization parameters would then be updated to account for the modified optical path of the incident and/or reflected light based on the calibration.
In embodiments a photodetector array 72 is fabricated as a focal plane array that utilizes electrical connections 78 to interface with other camera 70 circuitry. This electrical interface 78 is typically of lower bandwidth than that required by the high-speed photodetection elements 72. The charge transfer array 80 is a collection of fast analog storage elements that takes information from the photodetector array 72 at a rate sufficient to allow the photodetector elements 72 to rapidly process subsequent emitter/detector events. The size of the charge transfer array 80 is typically M×N×K analog storage elements where M is the number of rows in the detector array 72, N is the number of columns in the detector array 72, and K is the number of emitter/detector cycles that constitute a 4D capture cycle for a single 4D camera 70 event.
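As a sketch, the storage that receives these transferred charges can be pictured as an M×N×K array indexed by pixel and emitter/detector cycle; the dimensions and names below are hypothetical, chosen only to make the indexing concrete:

```python
import numpy as np

M, N, K = 480, 640, 10   # rows, columns, emitter/detector cycles (hypothetical)

# One 4D capture cycle: K integrated intensity samples per photodetector site.
frame_buffer_4d = np.zeros((M, N, K), dtype=np.float32)

def store_stage(k, integrated_charge):
    # Record the M x N integrated charges transferred out for cycle k.
    frame_buffer_4d[:, :, k] = integrated_charge
```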
Information from a 4D frame buffer 74 is processed separately for color information and distance information. A controller 82 computes distance values from the TOF algorithm for each of the M×N pixels and stores the distance information in the depth map 84 memory. In embodiments a photodetector array 72 is fabricated with a color filter pattern like a Bayer pattern or some other red-green-blue (RGB) configuration. Each color from a detector filter pattern will require a corresponding color plane 86 in device 70 memory.
A controller 82 will assemble separate color planes 86 into an output image format and store a resulting file in device memory 88. An output file may be in a format such as TIFF, JPEG, BMP or any other industry-standard or other proprietary format. Depth map 84 information for an image may be stored in the image file or may be produced in a separate file that is associated with the image file. After completion of the creation of the output file(s) the controller 82 transmits information via the I/O 90 interface to an upstream application or device. A controller 82 configures all of the sequencing control information for the emitters 92, the photodetector 72 integration, the 4D frame buffer 74 transformation to color 86 and depth 84 information, and device 70 communication to other devices. A controller 82 can be a single CPU element or can be a collection of microcontrollers and/or graphics processing units (GPUs) that carry out the various control functions for the device 70.
Upon completion of the filling of a 4D frame buffer, a camera controller will create an M×N depth map 106 and will create the color plane(s) 108. In embodiments where a camera utilizes multiple color planes produced by multiple color filters on a detector array, a controller performs demosaicing for each of the sparse color planes to produce M×N color values for each color plane. A controller creates an output file for the present color image and will format 110 the file for transmission to the upstream device or application.
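A minimal stand-in for the demosaicing step, filling the missing sites of one sparse color plane by averaging populated neighbors (real devices may use more elaborate interpolation kernels; the names here are illustrative):

```python
import numpy as np

def demosaic_plane(sparse, mask):
    # sparse: M x N plane, valid only where mask is True
    # mask:   M x N boolean array marking detectors that carry this color
    filled = sparse.astype(float)
    M, N = sparse.shape
    for m in range(M):
        for n in range(N):
            if not mask[m, n]:
                vals = [sparse[i, j]
                        for i in range(max(0, m - 1), min(M, m + 2))
                        for j in range(max(0, n - 1), min(N, n + 2))
                        if mask[i, j]]
                filled[m, n] = sum(vals) / len(vals) if vals else 0.0
    return filled
```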
The frame rate of a 4D camera will typically be a function of the longest action in the processing sequence. For the
During a detector 122 integration cycle, the charge collected at the capacitor 132 is proportional to the number of incident photons 126 present during the gating time of the integrator 130. During photodetector 130 integration, the charge transfer switch 136 remains in the open position. Upon completion of an integration cycle, the integration switch 128 is opened and the collected charge remains at the integrator 130 stage. At the start of the charge transfer cycle, charge is migrated from the integration capacitor 132 to the capacitor 140 of charge transfer stage 0 138 by closing the gate switch 136 of charge transfer stage 0 138. At the exit line from charge transfer stage 0 138, another gate switch 142 enables the transfer of charge from stage 0 138 to stage 1 144. The input switch 136 and the output switch 142 for stage 0 are never in the "on" or closed position at the same time, thus allowing charge to be transferred to and stored at stage 0 138 prior to being transferred to stage 1 on a subsequent charge transfer cycle. Charge transfer stage K−1 144 represents the last charge transfer stage for K emitter/detector cycles. Charge is transferred from stage K−1 144 to a data bus 146 leading to a 4D frame buffer when the K−1 output switch 148 is closed. At the end of each of the K detector integration cycles, the grounding switch 149 can be closed to remove any excess charge that may have collected at the photodetector 124.
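The switch sequencing described above behaves like a bucket brigade: after each gated integration, every stored charge shifts one stage toward the output bus. A behavioral sketch of one photodetector's K-stage pipeline (purely illustrative; the real stages are analog circuit elements):

```python
from collections import deque

K = 8  # charge transfer stages per photodetector (illustrative)

def run_capture_cycle(integrate):
    # integrate() returns the charge collected in one gated integration
    stages = deque([0.0] * K, maxlen=K)   # stage 0 (left) .. stage K-1 (right)
    for _ in range(K):
        charge = integrate()              # gated photodetector integration
        stages.appendleft(charge)         # alternating gate switches: each
                                          # stored charge shifts one stage over
    # Stage K-1 now holds the first sample; draining the output bus K times
    # recovers the samples in capture order for the 4D frame buffer.
    return list(stages)[::-1]
```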
At the completion of the first 160 detector integration 156 period the integrated charge is transferred 168 from each of the M×N integration elements to each of the M×N charge transfer stage 0 elements. After the second detector integration 156 period is complete a second charge transfer 170 operation is performed that transfers charge from stage 0 to stage 1 and transfers charge from the integration stage to charge transfer stage 0. The detector input 172 signal shows times at which charge is being collected at integration stages for the M×N integration elements.
Distance = (TOF*c)/2   Eq. 1
Where TOF = time of flight and c = the speed of light
Using c = 0.3 m/ns as the speed of light, the Minimum Dist. (m) 200 and Maximum Dist. (m) 202 values are established for the lower and upper bounds of the range detected for each of the K stages in a 4D camera capture sequence. The Intensity (Hex) 204 column shows the digital hexadecimal value of the integrated intensity for each of the K stages. It is noted that each of the M×N elements in the detector array will have K intensity values corresponding to the integrated intensities of the K stages. The timing parameters and the TOF values from
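As an illustration of how Eq. 1 yields per-stage distance bounds, the sketch below assumes an emitter pulse of known width and a detector gate with known open and close times relative to the start of emission; the timing numbers and the exact band definition are invented for the example:

```python
C = 0.3  # speed of light in m/ns

def distance_band(gate_open_ns, gate_close_ns, pulse_width_ns):
    # A reflection is captured if any part of the returning pulse overlaps
    # the open detector gate: the earliest capturable leading edge sets the
    # band minimum, the latest capturable leading edge sets the band maximum.
    d_min = (gate_open_ns - pulse_width_ns) * C / 2
    d_max = gate_close_ns * C / 2
    return max(d_min, 0.0), d_max

# Example: 10 ns pulse, gate open from 40 ns to 60 ns after emission start
print(distance_band(40, 60, 10))   # -> (4.5, 9.0) meters
```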
A review of the intensity values 204 shows a minimum value 206 of 0x28 and a maximum value 208 of 0xF0. These values are designated Imin[m,n]=0x28 and Imax[m,n]=0xF0. For embodiments that utilize constant pulse width timing for all detector/emitter stages in a capture sequence, the intensity value inserted in the color plane buffer is determined by:
Icolor[m,n] = Imax[m,n] − Imin[m,n]   Eq. 2
By utilizing Eq. 2 for color plane intensity, the effects of ambient light are eliminated by subtracting out the photonic component of intensity Imin[m,n] that is due to ambient light on the scene or object; for the values above, Icolor[m,n] = 0xF0 − 0x28 = 0xC8. Eq. 2 is an effective approach for eliminating ambient light when the photodetector integration response has a linear relationship to the number of incident photons at the photodetector. For non-linear photonic/charge collection relationships, Eq. 2 would be modified to account for the second-order or N-order relationship between incident photons and integrated charge intensity.
For each photodetector m,n in embodiments that utilize multi-color filter elements, the Icolor[m,n] value is stored at location m,n in the color plane that corresponds to the color of the filter. As an example, an embodiment with a Bayer filter pattern (RGBG) will have M×N/2 green-filter detectors, M×N/4 blue-filter detectors, and M×N/4 red-filter detectors. At the completion of K integration stages, subsequent filling of the 4D frame buffer, and determination of the M×N color values, the controller will store the M×N/4 Ired[m,n] values at the appropriate locations in the red color plane memory. In turn, the controller will determine and store the M×N/4 blue values in the appropriate locations in the blue color plane memory and the M×N/2 green values in the appropriate locations in the green color plane memory.
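A sketch of this color-plane bookkeeping, scattering the Icolor[m,n] values of Eq. 2 into sparse red, green, and blue planes; the site indexing assumes an RGGB tiling, which is one common Bayer arrangement:

```python
import numpy as np

def fill_bayer_planes(icolor):
    # icolor: M x N array of Eq. 2 values, one per photodetector
    M, N = icolor.shape
    red, green, blue = (np.zeros((M, N)) for _ in range(3))
    red[0::2, 0::2] = icolor[0::2, 0::2]     # M*N/4 red-filter sites
    blue[1::2, 1::2] = icolor[1::2, 1::2]    # M*N/4 blue-filter sites
    green[0::2, 1::2] = icolor[0::2, 1::2]   # M*N/2 green-filter sites
    green[1::2, 0::2] = icolor[1::2, 0::2]
    return red, green, blue
```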
Referring again to
where i is the stage at which the leading-edge-clipped signal is detected
The embodiments from
For ground-based vehicle-mounted and low-altitude aircraft-mounted applications the 4D imaging cycle time should be no longer than 50 microseconds. For higher-altitude aircraft at higher speeds the 4D imaging cycle time should be no longer than 10 microseconds. One skilled in the art can envision embodiments where the relative movement between scene objects and the camera exceeds 0.05 pixels. These longer-image-cycle-time embodiments will utilize inter-sample trajectory techniques to account for information from subsequent emitter/detector stages that do not align within the structure of the detector array grid.
Embodiments described in
An embodiment in
Based on the selection of emitter and detector pulse widths for this embodiment, the control algorithm establishes that the intensity values transition from environmental values at stage 6 to ambient values at stage 12. Furthermore, the control algorithm determines that anywhere from one to three stages will contain an integrated signal that includes 100% of the object-reflected waveform. From the data in
I(clr,m,n,s−3) = Ienv(clr,m,n)   Eq. 5
I(clr,m,n,s−2) = E0*Ienv(clr,m,n) + (1−E0)*Iobj(clr,m,n)   Eq. 6
I(clr,m,n,s) = E1*Ienv(clr,m,n) + (1−E1−A1)*Iobj(clr,m,n) + A1*Iamb(clr,m,n)   Eq. 7
I(clr,m,n,s+2) = A0*Iamb(clr,m,n) + (1−A0)*Iobj(clr,m,n)   Eq. 8
I(clr,m,n,s+3) = Iamb(clr,m,n)   Eq. 9
E0 = E1 + (2*t_emitter-clock-cycle)/D   Eq. 10
A1 = A0 + (2*t_emitter-clock-cycle)/D   Eq. 11
Where s is the stage number identifier for the detector stage with a 100% reflected signal
Utilizing the five equations (Eqs. 6, 7, 8, 10 and 11) with five unknowns (Iobj( ), E0, E1, A0 and A1), the control algorithms determine Iobj( ) for each color and each pixel and assign the computed intensity values to the appropriate locations in the color frame buffers. The distance to the object is determined by computing the TOF to the object based on Eq. 7:
TOF(clr,m,n) = TOFmin(clr,m,n,s) + E1*t_detector-pulse-width(s)   Eq. 12
Where TOF( ) is the time of flight for a particular pixel
The identification of stage s for Eqs. 5-12 depends on knowledge of the emitter pulse width and the detector pulse width for each stage. The known pulse widths determine the duty cycle and how many stages are involved in the transition from environmental signals to ambient signals for each pixel. Eqs. 5-12 are applicable for embodiments where the emitter pulses are shorter in duration than the detector pulses. For embodiments where emitter pulses are longer than detector pulses, Eq. 7 will yield either E1 or A1 equal to zero. As a result, two more equations with two more unknowns are necessary to resolve the intensity values of the object. The first additional equation will describe two new unknowns (A2 and E2) as a function of the measured stage intensity, and the second additional equation will describe A2 and E2 as a function of the stage duty cycle.
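Under the stated conditions (emitter pulses shorter than detector pulses, with Ienv and Iamb taken directly from stages s−3 and s+3 per Eqs. 5 and 9), the five equations can be solved numerically. The sketch below feeds them to a generic root finder; the variable names, the initial guess, and the treatment of D and the emitter clock period as known timing constants are assumptions of the example:

```python
import numpy as np
from scipy.optimize import fsolve

def solve_object_intensity(I_sm2, I_s, I_sp2, I_env, I_amb, t_clk, D):
    # Solve Eqs. 6, 7, 8, 10, 11 for (Iobj, E0, E1, A0, A1).
    dc = 2.0 * t_clk / D   # common term from Eqs. 10 and 11

    def residuals(x):
        Iobj, E0, E1, A0, A1 = x
        return [
            E0 * I_env + (1 - E0) * Iobj - I_sm2,                  # Eq. 6
            E1 * I_env + (1 - E1 - A1) * Iobj + A1 * I_amb - I_s,  # Eq. 7
            A0 * I_amb + (1 - A0) * Iobj - I_sp2,                  # Eq. 8
            E0 - (E1 + dc),                                        # Eq. 10
            A1 - (A0 + dc),                                        # Eq. 11
        ]

    return fsolve(residuals, [I_s, 0.5, 0.25, 0.25, 0.5])
```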
In various embodiments, it will be appreciated that techniques for evaluating signal attenuation may require a minimum of five emitter/detector cycles: one cycle in which an environmental detected signal is determined, one cycle containing a leading-edge detected signal of the active pulsed signal, one cycle containing the full emitter waveform of the detected signal of the active pulsed signal, one cycle containing a trailing-edge detected signal, and one cycle containing an ambient detected signal. Depending upon timing, field of view, distances, and ambient and environmental conditions, additional emitter/detector cycles may be needed to obtain the necessary information to utilize the techniques for evaluating signal attenuation as described with respect to these embodiments.
For uniform emitter pulses, Eq. 3 and Eq. 4 will produce the same value of TOF for each pixel m,n. Due to signal noise and ambient light, TOF values based on higher integrated intensity values will produce more accurate distance computations than those based on lower integrated intensity values. In embodiments, the controller will utilize only one of the values from Eq. 3 or Eq. 4 to establish the TOF for the pixel, with the preferred TOF value being selected from the equation that utilizes the larger-amplitude integrated intensity value.
Objects farther from a 4D camera will receive less light from the emitters than objects closer to the camera, since emitter light incident on an object falls off with the square of the distance. As a result, reflected signals from far objects will have lower intensity than reflected signals from closer objects. One method to compensate for lower-intensity return signals is to increase the emitter pulse width and to increase the detector integration time, thus increasing the intensity of the integrated signal for a given object distance.
Stage 0 has a four-period emitter cycle 210 and a six-period detector integration cycle 212. Stage 1 has a five-period emitter cycle 214 and has a seven-period detector integration cycle 216. Stage 9 is a special cycle that has a very long emitter pulse 218 and a correspondingly long detector integration cycle 220. This special long emitter/detector cycle may not be used for distance determination but is used to establish accurate color values for objects that are not very retroreflective at the wavelengths of the camera emitter.
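One way to express such a schedule is a per-stage table of emitter and detector periods. In the sketch below, only stages 0, 1, and 9 come from the example above; the intermediate ramp and the clock period are invented placeholders:

```python
CLOCK_NS = 10  # hypothetical clock period

# (emitter periods, detector periods) per stage
STAGE_SCHEDULE = [
    (4, 6),                                    # stage 0 (from the example)
    (5, 7),                                    # stage 1 (from the example)
    (6, 8), (7, 9), (8, 10), (9, 11),          # stages 2-8: assumed ramp
    (10, 12), (11, 13), (12, 14),
    (64, 66),                                  # stage 9: long color-only cycle
]

for k, (e, d) in enumerate(STAGE_SCHEDULE):
    print(f"stage {k}: emitter {e * CLOCK_NS} ns, detector {d * CLOCK_NS} ns")
```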
In previous embodiments the distances to objects computed via TOF were dependent on distance ranges established by the multi-period detector integration cycles. It may be desirable to achieve greater precision for TOF distance measurements.
where i is the stage at which the leading-edge-clipped signal is detected
In practice f(t) will likely be a non-linear or higher-order relationship between cumulative intensity and time. As such, the inverse function f⁻¹(t) may be implemented in embodiments as a lookup table or some other numerical conversion function.
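A lookup-table realization of f⁻¹, under the assumption that f(t) has been characterized as a monotonic set of (time, cumulative intensity) samples; the sample curve below is invented for illustration:

```python
import numpy as np

# Characterized cumulative-intensity response vs. time (invented values;
# in practice these come from device characterization).
t_samples = np.linspace(0.0, 10.0, 101)        # ns
f_samples = 1.0 - np.exp(-0.4 * t_samples)     # a non-linear, monotonic f(t)

def f_inverse(intensity_fraction):
    # Interpolate the characterized curve to map a cumulative intensity
    # fraction back to the time at which it was reached.
    return np.interp(intensity_fraction, f_samples, t_samples)

print(f_inverse(0.5))   # time at which half the cumulative intensity accrues
```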
In embodiments the intensity determination for separate color planes is achieved with an unfiltered detector array and selective use of multi-colored emitters.
An example in Table 2 below shows multiple emitter/detector stages for a K-stage sequence with K=12, whereby each emitter wavelength is utilized for K/3 sequential stages.
A K-stage sequence with K=12 can also be allocated to a single wavelength emitter, with subsequent K-stage sequences allocated to other wavelengths in a round-robin fashion as shown in Table 3 below.
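The two allocation schemes just described can be sketched as stage-to-wavelength assignments; the three wavelength labels are placeholders:

```python
K = 12
WAVELENGTHS = ["lambda_1", "lambda_2", "lambda_3"]  # placeholder emitters

# Table 2 style: each wavelength used for K/3 sequential stages in a sequence
per_stage = [WAVELENGTHS[k // (K // 3)] for k in range(K)]

# Table 3 style: one wavelength per whole K-stage sequence, round-robin
def sequence_wavelength(sequence_index):
    return WAVELENGTHS[sequence_index % len(WAVELENGTHS)]

print(per_stage)                                    # 4 stages per wavelength
print([sequence_wavelength(i) for i in range(4)])   # rotates each sequence
```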
Embodiments that utilize individual detector filters will have certain advantages and disadvantages over embodiments that utilize separate wavelength emitters to achieve multi-color detected signals. Table 4 below compares the relative advantages of embodiments.
In-motion imaging applications have the advantage of imaging an object from multiple viewpoints and, more importantly, multiple angles. Physical objects possess light-reflecting characteristics that, when sensed properly, can be utilized to categorize objects and even uniquely identify objects and their surface characteristics.
Upon completion of the processing for n=0, the processing algorithm obtains the next image 370 in the sequence. The image is analyzed to determine if point P0 is present 372 in the image. If P0 is present, the loop counter is incremented 374 and the algorithm proceeds to the normal vector determination step 364. If P0 is not present, the algorithm establishes whether there are enough points 376 to identify the object based on angular intensity characteristics. If the minimum requirements are not met, the algorithm concludes 384 without identifying the object. If the minimum requirements are met, the algorithm creates a plot in 3D space 378 for each color from the intensity information determined for all of the n points. The algorithm defines the object by comparing the collected angular intensity profile to reference characteristic profiles that are stored in a library. The characteristic profiles are retrieved from the library 380, and a correlation is determined 382 for each characteristic profile and the P0 profile. The characteristic profile with the highest correlation to P0 is used to determine the object type, class, or feature for the object represented by P0.
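The library-matching step can be sketched as a correlation over sampled angular intensity profiles; the profile representation, the names, and the use of normalized cross-correlation as the specific correlation measure are assumptions of the example:

```python
import numpy as np

def best_profile_match(p0_profile, library):
    # p0_profile: 1-D array of intensity vs. viewing angle for point P0
    # library:    dict of profile name -> 1-D array sampled at the same angles
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))   # normalized cross-correlation

    scores = {name: ncc(p0_profile, ref) for name, ref in library.items()}
    best = max(scores, key=scores.get)  # highest-correlation profile wins
    return best, scores
```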
The algorithm from
In practice, the library of characteristic angular intensity profiles will contain hundreds or possibly thousands of profiles. Performing correlations on all profiles in real time is a computationally intensive operation. As a way of paring the challenge down to a more manageable size, the analysis functionality on the device can perform image analysis to classify detected objects. Once classified, angular intensity profiles from the detected objects can be compared to only the library profiles that are associated with the identified object class. As an example, the image analysis functionality in a vehicle-mounted application can identify roadway surfaces based on characteristics such as coloration, flatness, orientation relative to the direction of travel, etc. Having established that a profile for a point P0 is classified as a roadway surface point, the algorithm can access only those characteristic profiles from the library that are classified as road surface characteristics. Some road surface characteristic profiles could include, but are not limited to:
Road signs are another profile class that can be kept separate in the profile library. Some road sign characteristic profiles could include, but are not limited to:
The characteristic profile algorithm specifies correlation as the means to compare characteristic profiles and to select the most representative characteristic profile for the object represented by P0. Those reasonably skilled in the art can devise or utilize other methods to select the most representative characteristic profile based on the information collected and analyzed for the object represented by P0.
The rear-view mirror 408 displays an unobstructed view from a rear-facing camera (not shown) that is in a rear-facing orientation mounted at the rear of the vehicle 400 or inside the vehicle 400 projecting through the rear window. Environmental 416 obstructions for the side of the vehicle 400 are addressed with features in the side mirror 418. A rear oblique-angle camera 420 detects obstructed environmental conditions 416 and projects an obstruction-free image 422 on the mirror for use by the vehicle 400 operator. Alternately, or in addition, the obstruction-free image 422 is delivered to the vehicle control system for autonomous or semi-autonomous driving systems. An indicator 424 on the side mirror indicates the presence of objects within a certain space, thus assisting the vehicle 400 operator in maneuvers like lane changes.
In other embodiments, the processing system can include various engines, each of which is constructed, programmed, configured, or otherwise adapted to autonomously carry out a function or set of functions. The term engine as used herein is defined as a real-world device, component, or arrangement of components implemented using hardware, such as by an application specific integrated circuit (ASIC) or field-programmable gate array (FPGA), for example, or as a combination of hardware and software, such as by a microprocessor system and a set of program instructions that adapt the engine to implement the particular functionality, which (while being executed) transform the microprocessor or controller system into a special-purpose device. An engine can also be implemented as a combination of the two, with certain functions facilitated by hardware alone, and other functions facilitated by a combination of hardware and software. In certain implementations, at least a portion, and in some cases all, of an engine can be executed on the processor(s) of one or more computing platforms that are made up of hardware that execute an operating system, system programs, and/or application programs, while also implementing the engine using multitasking, multithreading, distributed processing where appropriate, or other such techniques.
Accordingly, it will be understood that each processing system can be realized in a variety of physically realizable configurations, and should generally not be limited to any particular implementation exemplified herein, unless such limitations are expressly called out. In addition, a processing system can itself be composed of more than one engine, sub-engines, or sub-processing systems, each of which can be regarded as a processing system in its own right. Moreover, in the embodiments described herein, each of the various processing systems may correspond to a defined autonomous functionality; however, it should be understood that in other contemplated embodiments, each functionality can be distributed to more than one processing system. Likewise, in other contemplated embodiments, multiple defined functionalities may be implemented by a single processing system that performs those multiple functions, possibly alongside other functions, or distributed differently among a set of processing systems than specifically illustrated in the examples herein.
Embodiments utilize high-speed components and circuitry whereby the relative movement of the device and/or scene could be defined as the movement of less than the inter-element spacing in the detector array. For embodiments wherein the relative movement is small the processing software can assume the axis of the 3D volumetric computations is normal to the detector elements in the array. For relative movement greater than the inter-element spacing in the detector array during the timeframe of the emitter cycles the frame buffer analysis software will need to perform 3D analysis of the sampled waveforms whereby the representations have an axis that is non-normal to the detector elements in the array.
The electrical circuitry of embodiments is described utilizing semiconductor nomenclature. In other embodiments, circuitry and control logic that utilizes optical computing, quantum computing, or similar miniaturized scalable computing platforms may be used to perform part or all of the necessary high-speed logic, digital storage, and computing aspects of the systems described herein. The optical emitter elements are described utilizing fabricated semiconductor LED and laser diode nomenclature. In other embodiments, the requirements for the various techniques described herein may be accomplished with the use of any controllable photon-emitting elements wherein the output frequency of the emitted photons is known or characterizable, is controllable with logic elements, and is of sufficient switching speed.
In some embodiments, the light energy or light packet is emitted and received as near-collimated, coherent, or wide-angle electromagnetic energy, such as common laser wavelengths of 650 nm, 905 nm, or 1550 nm. In some embodiments, the light energy can be in the wavelength ranges of ultraviolet (UV)—100-400 nm, visible—400-700 nm, near infrared (NIR)—700-1400 nm, infrared (IR)—1400-8000 nm, long-wavelength IR (LWIR)—8-15 um, far IR (FIR)—15-1000 um, or terahertz—0.1-1 mm. Various embodiments can provide increased device resolution, higher effective sampling rates, and increased device range at these various wavelengths.
Detectors as utilized in the various embodiments refer to discrete devices or a focal plane array of devices that convert optical energy to electrical energy. Detectors as defined herein can take the form of PIN photodiodes, avalanche photodiodes, photodiodes operating at or near Geiger mode biasing, or any other devices that convert optical to electrical energy whereby the electrical output of the device is related to the rate at which target photons are impacting the surface of the detector.
Persons of ordinary skill in the relevant arts will recognize that embodiments may comprise fewer features than illustrated in any individual embodiment described above. The embodiments described herein are not meant to be an exhaustive presentation of the ways in which the various features of the embodiments may be combined. Accordingly, the embodiments are not mutually exclusive combinations of features; rather, embodiments can comprise a combination of different individual features selected from different individual embodiments, as understood by persons of ordinary skill in the art. Moreover, elements described with respect to one embodiment can be implemented in other embodiments even when not described in such embodiments unless otherwise noted. Although a dependent claim may refer in the claims to a specific combination with one or more other claims, other embodiments can also include a combination of the dependent claim with the subject matter of each other dependent claim or a combination of one or more features with other dependent or independent claims. Such combinations are proposed herein unless it is stated that a specific combination is not intended. Furthermore, it is intended also to include features of a claim in any other independent claim even if this claim is not directly made dependent to the independent claim.
Any incorporation by reference of documents above is limited such that no subject matter is incorporated that is contrary to the explicit disclosure herein. Any incorporation by reference of documents above is further limited such that no claims included in the documents are incorporated by reference herein. Any incorporation by reference of documents above is yet further limited such that any definitions provided in the documents are not incorporated by reference herein unless expressly included herein.
For purposes of interpreting the claims, it is expressly intended that the provisions of Section 112, sixth paragraph of 35 U.S.C. are not to be invoked unless the specific terms “means for” or “step for” are recited in a claim.
This application is a continuation of U.S. patent application Ser. No. 18/524,180, filed Nov. 11, 2023, which is a continuation of U.S. patent application Ser. No. 17/967,365, filed on Oct. 17, 2022 and issued on Dec. 5, 2023 as U.S. Pat. No. 11,838,626, which is a continuation of U.S. patent application Ser. No. 17/127,461, filed on Dec. 18, 2020 and issued on Oct. 18, 2022 as U.S. Pat. No. 11,477,363, which is a continuation of U.S. patent application Ser. No. 16/537,305, filed on Aug. 9, 2019 and issued on Dec. 22, 2020 as U.S. Pat. No. 10,873,738, which is a division of U.S. patent application Ser. No. 16/167,196, filed on Oct. 22, 2018 and issued on Aug. 13, 2019 as U.S. Pat. No. 10,382,742, which is a continuation of U.S. patent application Ser. No. 15/853,222, filed on Dec. 22, 2017 and issued on May 21, 2019 as U.S. Pat. No. 10,298,908, which is a continuation of U.S. patent application Ser. No. 15/059,811, filed on Mar. 3, 2016 and issued on Jan. 9, 2018 as U.S. Pat. No. 9,866,816. All of the aforementioned applications are incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
1743835 | Stimson | Jan 1930 | A |
3971065 | Bayer | Jul 1976 | A |
4145112 | Crone et al. | Mar 1979 | A |
4185891 | Kaestner | Jan 1980 | A |
4663756 | Retterath | May 1987 | A |
4739398 | Thomas et al. | Apr 1988 | A |
4935616 | Scott | Jun 1990 | A |
5006721 | Cameron et al. | Apr 1991 | A |
5026156 | Bayston et al. | Jun 1991 | A |
5054911 | Ohishi et al. | Oct 1991 | A |
5081530 | Medina | Jan 1992 | A |
5084895 | Shimada et al. | Jan 1992 | A |
5090245 | Anderson | Feb 1992 | A |
5122796 | Beggs et al. | Jun 1992 | A |
5212706 | Jain | May 1993 | A |
5400350 | Galvanauskas | Mar 1995 | A |
5418359 | Juds et al. | May 1995 | A |
5420722 | Bielak | May 1995 | A |
5446529 | Stettner et al. | Aug 1995 | A |
5465142 | Krumes et al. | Nov 1995 | A |
5485009 | Meyzonnetie et al. | Jan 1996 | A |
5497269 | Gal | Mar 1996 | A |
5619317 | Oishi et al. | Apr 1997 | A |
5675326 | Juds et al. | Oct 1997 | A |
5682229 | Wangler | Oct 1997 | A |
5793491 | Wangler et al. | Aug 1998 | A |
5805275 | Taylor | Sep 1998 | A |
5831551 | Geduld | Nov 1998 | A |
5870180 | Wangler | Feb 1999 | A |
5892575 | Marino | Apr 1999 | A |
5914776 | Streicher | Jun 1999 | A |
5940170 | Berg et al. | Aug 1999 | A |
6054927 | Brickell | Apr 2000 | A |
6057909 | Yahav et al. | May 2000 | A |
6118518 | Hobbs | Sep 2000 | A |
6133989 | Stettner et al. | Oct 2000 | A |
6150956 | Laufer | Nov 2000 | A |
6181463 | Galvanauskas et al. | Jan 2001 | B1 |
6212480 | Dunne | Apr 2001 | B1 |
6266442 | Laumeyer et al. | Jul 2001 | B1 |
6323942 | Bamji | Nov 2001 | B1 |
6327090 | Rando et al. | Dec 2001 | B1 |
6363161 | Laumeyer et al. | Mar 2002 | B2 |
6370291 | Mitchell | Apr 2002 | B1 |
6373557 | Megel et al. | Apr 2002 | B1 |
6377167 | Juds et al. | Apr 2002 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6448572 | Tennant et al. | Sep 2002 | B1 |
6449384 | Laumeyer et al. | Sep 2002 | B2 |
6453056 | Laumeyer et al. | Sep 2002 | B2 |
6456368 | Seo | Sep 2002 | B2 |
6480265 | Maimon et al. | Nov 2002 | B2 |
6512892 | Montgomery et al. | Jan 2003 | B1 |
6522396 | Halmos | Feb 2003 | B1 |
6535275 | McCaffrey et al. | Mar 2003 | B2 |
6619406 | Kacyra et al. | Sep 2003 | B1 |
6625315 | Laumeyer et al. | Sep 2003 | B2 |
6646725 | Eichinger et al. | Nov 2003 | B1 |
6654401 | Cavalheiro Vieira et al. | Nov 2003 | B2 |
6665055 | Ohishi et al. | Dec 2003 | B2 |
6674878 | Retterath | Jan 2004 | B2 |
6683727 | Göring et al. | Jan 2004 | B1 |
6711280 | Stafsudd et al. | Mar 2004 | B2 |
6717972 | Steinle et al. | Apr 2004 | B2 |
6774988 | Stam et al. | Aug 2004 | B2 |
6828558 | Arnone | Dec 2004 | B1 |
6843416 | Swartz et al. | Jan 2005 | B2 |
6873640 | Bradburn et al. | Mar 2005 | B2 |
6881979 | Starikov et al. | Apr 2005 | B2 |
6891960 | Retterath et al. | May 2005 | B2 |
6906302 | Drowley | Jun 2005 | B2 |
6967053 | Mullen et al. | Nov 2005 | B1 |
6967569 | Weber et al. | Nov 2005 | B2 |
6975251 | Pavicic | Dec 2005 | B2 |
6987447 | Baerenweiler et al. | Jan 2006 | B2 |
7016519 | Nakamura et al. | Mar 2006 | B1 |
7026600 | Jamieson et al. | Apr 2006 | B2 |
7043057 | Retterath et al. | May 2006 | B2 |
7092548 | Laumeyer et al. | Aug 2006 | B2 |
7148974 | Schmitt et al. | Dec 2006 | B1 |
7149613 | Stam et al. | Dec 2006 | B2 |
7168815 | Shipman et al. | Jan 2007 | B2 |
7171037 | Mahon et al. | Jan 2007 | B2 |
7173707 | Retterath et al. | Feb 2007 | B2 |
7187452 | Jupp et al. | Mar 2007 | B2 |
7224384 | Iddan et al. | May 2007 | B1 |
7227459 | Bos et al. | Jun 2007 | B2 |
7236235 | Dimsdale | Jun 2007 | B2 |
7248342 | Degnan | Jul 2007 | B1 |
7248344 | Morcom | Jul 2007 | B2 |
7282695 | Weber et al. | Oct 2007 | B2 |
7294863 | Lee et al. | Nov 2007 | B2 |
7319777 | Morcom | Jan 2008 | B2 |
7319805 | Remillard et al. | Jan 2008 | B2 |
7348919 | Gounalis | Mar 2008 | B2 |
7362419 | Kurihara et al. | Apr 2008 | B2 |
7411681 | Retterath et al. | Aug 2008 | B2 |
7436494 | Kennedy et al. | Oct 2008 | B1 |
7444003 | Laumeyer et al. | Oct 2008 | B2 |
7451041 | Laumeyer et al. | Nov 2008 | B2 |
7453553 | Dimsdale | Nov 2008 | B2 |
7474821 | Donlagic et al. | Jan 2009 | B2 |
7515736 | Retterath | Apr 2009 | B2 |
7521666 | Tsang | Apr 2009 | B2 |
7534984 | Gleckler | May 2009 | B2 |
7542499 | Jikutani | Jun 2009 | B2 |
7544945 | Tan et al. | Jun 2009 | B2 |
7551771 | England, III | Jun 2009 | B2 |
7560680 | Sato et al. | Jul 2009 | B2 |
7579593 | Onozawa | Aug 2009 | B2 |
7590310 | Retterath et al. | Sep 2009 | B2 |
7607509 | Schmiz et al. | Oct 2009 | B2 |
7623248 | Laflamme | Nov 2009 | B2 |
7649654 | Shyu et al. | Jan 2010 | B2 |
7663095 | Wong et al. | Feb 2010 | B2 |
7689032 | Strassenburg-Kleciak | Mar 2010 | B2 |
7697119 | Ikeno | Apr 2010 | B2 |
7701558 | Walsh et al. | Apr 2010 | B2 |
7733932 | Faybishenko | Jun 2010 | B2 |
7755743 | Kumahara et al. | Jul 2010 | B2 |
7755809 | Fujita et al. | Jul 2010 | B2 |
7787105 | Hipp | Aug 2010 | B2 |
7787511 | Jikutani et al. | Aug 2010 | B2 |
7800739 | Rohner et al. | Sep 2010 | B2 |
7830442 | Griffis et al. | Nov 2010 | B2 |
7830532 | De Coi | Nov 2010 | B2 |
7873091 | Parent et al. | Jan 2011 | B2 |
7881355 | Sipes, Jr. | Feb 2011 | B2 |
7888159 | Venezia et al. | Feb 2011 | B2 |
7894725 | Holman et al. | Feb 2011 | B2 |
7900736 | Breed | Mar 2011 | B2 |
7911617 | Padmanabhan et al. | Mar 2011 | B2 |
7940825 | Jikutani | May 2011 | B2 |
7941269 | Laumeyer et al. | May 2011 | B2 |
7944548 | Eaton | May 2011 | B2 |
7945408 | Dimsdale et al. | May 2011 | B2 |
7957448 | Willemin et al. | Jun 2011 | B2 |
7957639 | Lee et al. | Jun 2011 | B2 |
7960195 | Maeda et al. | Jun 2011 | B2 |
7961328 | Austin et al. | Jun 2011 | B2 |
7969558 | Hall | Jun 2011 | B2 |
7979173 | Breed | Jul 2011 | B2 |
7983817 | Breed | Jul 2011 | B2 |
7986461 | Bartoschewski | Jul 2011 | B2 |
7991222 | Dimsdale et al. | Aug 2011 | B2 |
7994465 | Bamji et al. | Aug 2011 | B1 |
7995796 | Retterath et al. | Aug 2011 | B2 |
8027029 | Lu et al. | Sep 2011 | B2 |
8045595 | Ma | Oct 2011 | B2 |
8054203 | Breed et al. | Nov 2011 | B2 |
8054464 | Mathur et al. | Nov 2011 | B2 |
8072581 | Breiholz | Dec 2011 | B1 |
8072663 | O'Neill et al. | Dec 2011 | B2 |
8077294 | Grund et al. | Dec 2011 | B1 |
8089498 | Sato et al. | Jan 2012 | B2 |
8094060 | Beard et al. | Jan 2012 | B2 |
8098969 | Tolstikhin et al. | Jan 2012 | B2 |
8102426 | Yahav et al. | Jan 2012 | B2 |
8111452 | Butler et al. | Feb 2012 | B2 |
8115158 | Buettgen | Feb 2012 | B2 |
8120754 | Kaehler | Feb 2012 | B2 |
8125367 | Ludwig | Feb 2012 | B2 |
8125620 | Lewis | Feb 2012 | B2 |
8139141 | Bamji et al. | Mar 2012 | B2 |
8150216 | Retterath et al. | Apr 2012 | B2 |
8159598 | Watanabe et al. | Apr 2012 | B2 |
8194712 | Müller et al. | Jun 2012 | B2 |
8198576 | Kennedy et al. | Jun 2012 | B2 |
8199786 | Gaillard et al. | Jun 2012 | B2 |
8212998 | Rindle | Jul 2012 | B2 |
8213479 | Doerfel et al. | Jul 2012 | B2 |
8229663 | Zeng et al. | Jul 2012 | B2 |
8235416 | Breed et al. | Aug 2012 | B2 |
8235605 | Kim | Aug 2012 | B2 |
8238393 | Iwasaki | Aug 2012 | B2 |
8242428 | Meyers et al. | Aug 2012 | B2 |
8242476 | Mimeault et al. | Aug 2012 | B2 |
8249798 | Hawes et al. | Aug 2012 | B2 |
8259003 | Song | Sep 2012 | B2 |
8280623 | Trepagnier et al. | Oct 2012 | B2 |
8301027 | Shaw et al. | Oct 2012 | B2 |
8310654 | Weilkes et al. | Nov 2012 | B2 |
8319949 | Cantin et al. | Nov 2012 | B2 |
8325256 | Egawa | Dec 2012 | B2 |
8338900 | Venezia et al. | Dec 2012 | B2 |
8340151 | Liu et al. | Dec 2012 | B2 |
8354928 | Morcom | Jan 2013 | B2 |
8355117 | Niclass | Jan 2013 | B2 |
8363156 | Lo | Jan 2013 | B2 |
8363511 | Frank et al. | Jan 2013 | B2 |
8364334 | Au et al. | Jan 2013 | B2 |
8368005 | Wang et al. | Feb 2013 | B2 |
8368876 | Johnson et al. | Feb 2013 | B1 |
8378287 | Schemmann et al. | Feb 2013 | B2 |
8378885 | Cornic et al. | Feb 2013 | B2 |
8380367 | Schultz et al. | Feb 2013 | B2 |
8391336 | Chiskis | Mar 2013 | B2 |
8401046 | Shveykin et al. | Mar 2013 | B2 |
8401049 | Sato et al. | Mar 2013 | B2 |
8406992 | Laumeyer et al. | Mar 2013 | B2 |
8422148 | Langer et al. | Apr 2013 | B2 |
8426797 | Aull | Apr 2013 | B2 |
8437584 | Matsuoka et al. | May 2013 | B2 |
8442084 | Ungar | May 2013 | B2 |
8446470 | Lu et al. | May 2013 | B2 |
8451432 | Crawford et al. | May 2013 | B2 |
8451871 | Yankov | May 2013 | B2 |
8456517 | Spektor et al. | Jun 2013 | B2 |
8477819 | Kitamura | Jul 2013 | B2 |
8487525 | Lee | Jul 2013 | B2 |
8494687 | Vanek et al. | Jul 2013 | B2 |
8503888 | Takemoto et al. | Aug 2013 | B2 |
8508567 | Sato et al. | Aug 2013 | B2 |
8508720 | Kamiyama | Aug 2013 | B2 |
8508721 | Cates et al. | Aug 2013 | B2 |
8520713 | Joseph | Aug 2013 | B2 |
8531650 | Feldkhun et al. | Sep 2013 | B2 |
8538636 | Breed | Sep 2013 | B2 |
8558993 | Newbury et al. | Oct 2013 | B2 |
8570372 | Russell | Oct 2013 | B2 |
8587637 | Cryder et al. | Nov 2013 | B1 |
8594455 | Meyers et al. | Nov 2013 | B2 |
8599363 | Zeng | Dec 2013 | B2 |
8599367 | Canham | Dec 2013 | B2 |
8604932 | Breed et al. | Dec 2013 | B2 |
8605262 | Campbell et al. | Dec 2013 | B2 |
8619241 | Mimeault | Dec 2013 | B2 |
8633989 | Okuda | Jan 2014 | B2 |
8640182 | Bedingfield, Sr. | Jan 2014 | B2 |
8655513 | Vanek | Feb 2014 | B2 |
8660311 | Retterath et al. | Feb 2014 | B2 |
8675184 | Schmitt et al. | Mar 2014 | B2 |
8681255 | Katz et al. | Mar 2014 | B2 |
8687172 | Faul et al. | Apr 2014 | B2 |
8692980 | Gilliland | Apr 2014 | B2 |
8699755 | Stroila et al. | Apr 2014 | B2 |
8717417 | Sali et al. | May 2014 | B2 |
8717492 | McMackin et al. | May 2014 | B2 |
8723689 | Mimeault | May 2014 | B2 |
8724671 | Moore | May 2014 | B2 |
8736670 | Barbour et al. | May 2014 | B2 |
8736818 | Weimer et al. | May 2014 | B2 |
8742325 | Droz et al. | Jun 2014 | B1 |
8743455 | Gusev | Jun 2014 | B2 |
8754829 | Lapstun | Jun 2014 | B2 |
8760499 | Russell | Jun 2014 | B2 |
8767190 | Hall | Jul 2014 | B2 |
8773642 | Eisele et al. | Jul 2014 | B2 |
8781790 | Zhu et al. | Jul 2014 | B2 |
8797550 | Hays et al. | Aug 2014 | B2 |
8804101 | Spagnolia et al. | Aug 2014 | B2 |
8809758 | Molnar | Aug 2014 | B2 |
8810647 | Niclass et al. | Aug 2014 | B2 |
8810796 | Hays | Aug 2014 | B2 |
8811720 | Seida | Aug 2014 | B2 |
8820782 | Breed et al. | Sep 2014 | B2 |
8836921 | Feldkhun et al. | Sep 2014 | B2 |
8854426 | Pellman et al. | Oct 2014 | B2 |
8855849 | Ferguson | Oct 2014 | B1 |
8860944 | Retterath et al. | Oct 2014 | B2 |
8864655 | Ramamurthy et al. | Oct 2014 | B2 |
8885152 | Wright | Nov 2014 | B1 |
8903199 | Retterath et al. | Dec 2014 | B2 |
8908157 | Eisele et al. | Dec 2014 | B2 |
8908159 | Mimeault | Dec 2014 | B2 |
8908996 | Retterath et al. | Dec 2014 | B2 |
8908997 | Retterath et al. | Dec 2014 | B2 |
8918831 | Meuninck et al. | Dec 2014 | B2 |
8928865 | Rakuljic | Jan 2015 | B2 |
8933862 | Lapstun | Jan 2015 | B2 |
8934087 | Stobie et al. | Jan 2015 | B1 |
8947647 | Halmos et al. | Feb 2015 | B2 |
8963956 | Latta et al. | Feb 2015 | B2 |
8988754 | Sun et al. | Mar 2015 | B2 |
8995577 | Ullrich et al. | Mar 2015 | B2 |
9032470 | Meuninck et al. | May 2015 | B2 |
9066087 | Shpunt | Jun 2015 | B2 |
9069060 | Zbrozek | Jun 2015 | B1 |
9094628 | Williams | Jul 2015 | B2 |
9098931 | Shpunt et al. | Aug 2015 | B2 |
9102220 | Breed | Aug 2015 | B2 |
9103715 | Demers | Aug 2015 | B1 |
9113155 | Wu et al. | Aug 2015 | B2 |
9119670 | Yang et al. | Sep 2015 | B2 |
9131136 | Shpunt et al. | Sep 2015 | B2 |
9137463 | Gilboa et al. | Sep 2015 | B2 |
9137511 | LeGrand, III et al. | Sep 2015 | B1 |
9142019 | Lee | Sep 2015 | B2 |
9158375 | Maizels et al. | Oct 2015 | B2 |
9170096 | Fowler et al. | Oct 2015 | B2 |
9182490 | Velichko et al. | Nov 2015 | B2 |
9185391 | Prechtl | Nov 2015 | B1 |
9186046 | Ramamurthy et al. | Nov 2015 | B2 |
9186047 | Ramamurthy et al. | Nov 2015 | B2 |
9191582 | Wright et al. | Nov 2015 | B1 |
9194953 | Schmidt et al. | Nov 2015 | B2 |
9201501 | Maizels et al. | Dec 2015 | B2 |
9204121 | Marason et al. | Dec 2015 | B1 |
9219873 | Grauer et al. | Dec 2015 | B2 |
9228697 | Schneider et al. | Jan 2016 | B2 |
9237333 | Lee et al. | Jan 2016 | B2 |
9239264 | Demers | Jan 2016 | B1 |
9294754 | Billerbeck et al. | Mar 2016 | B2 |
9325920 | Van Nieuwenhove et al. | Apr 2016 | B2 |
9335255 | Retterath et al. | May 2016 | B2 |
9360554 | Retterath et al. | Jun 2016 | B2 |
9424277 | Retterath et al. | Aug 2016 | B2 |
9436880 | Bos et al. | Sep 2016 | B2 |
9513367 | David et al. | Dec 2016 | B2 |
9575184 | Gilliland | Feb 2017 | B2 |
9612153 | Kawada | Apr 2017 | B2 |
9671328 | Retterath et al. | Jun 2017 | B2 |
9723233 | Grauer | Aug 2017 | B2 |
9753141 | Grauer et al. | Sep 2017 | B2 |
9810785 | Grauer et al. | Nov 2017 | B2 |
9866816 | Retterath | Jan 2018 | B2 |
9880267 | Viswanathan et al. | Jan 2018 | B2 |
9921153 | Wegner et al. | Mar 2018 | B2 |
9958547 | Fu et al. | May 2018 | B2 |
9989456 | Retterath et al. | Jun 2018 | B2 |
9989457 | Retterath et al. | Jun 2018 | B2 |
10000000 | Marron | Jun 2018 | B2 |
10036801 | Retterath | Jul 2018 | B2 |
10055854 | Wan et al. | Aug 2018 | B2 |
RE47134 | Mimeault | Nov 2018 | E |
10139978 | Lindahl et al. | Nov 2018 | B2 |
10140690 | Chakraborty et al. | Nov 2018 | B2 |
10140956 | Ueda et al. | Nov 2018 | B2 |
10203399 | Retterath et al. | Feb 2019 | B2 |
10298908 | Retterath | May 2019 | B2 |
10302766 | Ito | May 2019 | B2 |
10359505 | Buettgen et al. | Jul 2019 | B2 |
10382742 | Retterath | Aug 2019 | B2 |
10397552 | Van Nieuwenhove et al. | Aug 2019 | B2 |
10481266 | Pei et al. | Nov 2019 | B2 |
10564267 | Grauer et al. | Feb 2020 | B2 |
10585175 | Retterath et al. | Mar 2020 | B2 |
10623716 | Retterath | Apr 2020 | B2 |
10873738 | Retterath | Dec 2020 | B2 |
10983197 | Zhu | Apr 2021 | B1 |
20020106109 | Retterath et al. | Aug 2002 | A1 |
20020179708 | Zhu et al. | Dec 2002 | A1 |
20020186865 | Retterath et al. | Dec 2002 | A1 |
20030043364 | Jamieson et al. | Mar 2003 | A1 |
20030085867 | Grabert | May 2003 | A1 |
20030155513 | Remillard | Aug 2003 | A1 |
20030016869 | Laumeyer et al. | Sep 2003 | A1 |
20040062442 | Laumeyer et al. | Apr 2004 | A1 |
20040133380 | Gounalis | Jul 2004 | A1 |
20040156531 | Retterath et al. | Aug 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20050249378 | Retterath et al. | Nov 2005 | A1 |
20050271304 | Retterath et al. | Dec 2005 | A1 |
20060132752 | Kane | Jun 2006 | A1 |
20060157643 | Bamji et al. | Jul 2006 | A1 |
20060262312 | Retterath et al. | Nov 2006 | A1 |
20060268265 | Chuang et al. | Nov 2006 | A1 |
20060279630 | Aggarwal | Dec 2006 | A1 |
20070055441 | Retterath et al. | Mar 2007 | A1 |
20070124157 | Laumeyer et al. | May 2007 | A1 |
20070154067 | Laumeyer et al. | Jul 2007 | A1 |
20070182949 | Niclass | Aug 2007 | A1 |
20070216904 | Retterath et al. | Sep 2007 | A1 |
20070279615 | Degnan | Dec 2007 | A1 |
20080180650 | Lamesch | Jul 2008 | A1 |
20090045359 | Kumahara et al. | Feb 2009 | A1 |
20090076758 | Dimsdale | Mar 2009 | A1 |
20090125226 | Laumeyer et al. | May 2009 | A1 |
20090128802 | Treado | May 2009 | A1 |
20090232355 | Minear | Sep 2009 | A1 |
20090252376 | Retterath et al. | Oct 2009 | A1 |
20100020306 | Hall | Jan 2010 | A1 |
20100045966 | Cauquy et al. | Feb 2010 | A1 |
20100082597 | Retterath et al. | Apr 2010 | A1 |
20100128109 | Banks | May 2010 | A1 |
20100231891 | Mase et al. | Sep 2010 | A1 |
20100265386 | Raskar et al. | Oct 2010 | A1 |
20100277713 | Mimeault | Nov 2010 | A1 |
20100301195 | Thor et al. | Dec 2010 | A1 |
20110007299 | Moench et al. | Jan 2011 | A1 |
20110037849 | Niclass et al. | Feb 2011 | A1 |
20110093350 | Laumeyer et al. | Apr 2011 | A1 |
20110101206 | Buettgen | May 2011 | A1 |
20110131722 | Scott et al. | Jun 2011 | A1 |
20110134220 | Barbour et al. | Jun 2011 | A1 |
20110216304 | Hall | Sep 2011 | A1 |
20110285980 | Newbury et al. | Nov 2011 | A1 |
20110285981 | Justice | Nov 2011 | A1 |
20110285982 | Breed | Nov 2011 | A1 |
20110295469 | Rafii et al. | Dec 2011 | A1 |
20110313722 | Zhu | Dec 2011 | A1 |
20120001463 | Breed et al. | Jan 2012 | A1 |
20120002007 | Meuninck et al. | Jan 2012 | A1 |
20120002025 | Bedingfield, Sr. | Jan 2012 | A1 |
20120011546 | Meuninck et al. | Jan 2012 | A1 |
20120023518 | Meuninck et al. | Jan 2012 | A1 |
20120023540 | Meuninck et al. | Jan 2012 | A1 |
20120062705 | Ovsiannikov et al. | Mar 2012 | A1 |
20120065940 | Retterath et al. | Mar 2012 | A1 |
20120086781 | Iddan | Apr 2012 | A1 |
20120098964 | Oggier et al. | Apr 2012 | A1 |
20120123718 | Ko et al. | May 2012 | A1 |
20120154784 | Kaufman et al. | Jun 2012 | A1 |
20120154785 | Gilliland et al. | Jun 2012 | A1 |
20120249998 | Eisele et al. | Oct 2012 | A1 |
20120261516 | Gilliland | Oct 2012 | A1 |
20120262696 | Eisele et al. | Oct 2012 | A1 |
20120274745 | Russell | Nov 2012 | A1 |
20120287417 | Mimeault | Nov 2012 | A1 |
20120299344 | Breed et al. | Nov 2012 | A1 |
20130044129 | Latta et al. | Feb 2013 | A1 |
20130060146 | Yang et al. | Mar 2013 | A1 |
20130070239 | Crawford et al. | Mar 2013 | A1 |
20130076861 | Sternklar | Mar 2013 | A1 |
20130083310 | Ramamurthy et al. | Apr 2013 | A1 |
20130085330 | Ramamurthy et al. | Apr 2013 | A1 |
20130085331 | Ramamurthy et al. | Apr 2013 | A1 |
20130085333 | Ramamurthy et al. | Apr 2013 | A1 |
20130085334 | Ramamurthy et al. | Apr 2013 | A1 |
20130085382 | Ramamurthy et al. | Apr 2013 | A1 |
20130085397 | Ramamurthy et al. | Apr 2013 | A1 |
20130090528 | Ramamurthy et al. | Apr 2013 | A1 |
20130090530 | Ramamurthy et al. | Apr 2013 | A1 |
20130090552 | Ramamurthy et al. | Apr 2013 | A1 |
20130100249 | Norita | Apr 2013 | A1 |
20130188043 | Decoster | Jul 2013 | A1 |
20130201288 | Billerbeck et al. | Aug 2013 | A1 |
20130215235 | Russell | Aug 2013 | A1 |
20130242283 | Bailey et al. | Sep 2013 | A1 |
20130242285 | Zeng | Sep 2013 | A1 |
20130271613 | Retterath et al. | Oct 2013 | A1 |
20130278917 | Korekado et al. | Oct 2013 | A1 |
20130300740 | Snyder et al. | Nov 2013 | A1 |
20130300838 | Borowski | Nov 2013 | A1 |
20130300840 | Borowski | Nov 2013 | A1 |
20130321791 | Feldkhun et al. | Dec 2013 | A1 |
20140035959 | Lapstun | Feb 2014 | A1 |
20140036269 | Retterath et al. | Feb 2014 | A1 |
20140152971 | James | Jun 2014 | A1 |
20140152975 | Ko | Jun 2014 | A1 |
20140160461 | Van Der Tempel et al. | Jun 2014 | A1 |
20140168362 | Hannuksela et al. | Jun 2014 | A1 |
20140211194 | Pacala et al. | Jul 2014 | A1 |
20140218473 | Hannuksela et al. | Aug 2014 | A1 |
20140240464 | Lee | Aug 2014 | A1 |
20140240469 | Lee | Aug 2014 | A1 |
20140240809 | Lapstun | Aug 2014 | A1 |
20140241614 | Lee | Aug 2014 | A1 |
20140253993 | Lapstun | Sep 2014 | A1 |
20140292620 | Lapstun | Oct 2014 | A1 |
20140313339 | Diessner | Oct 2014 | A1 |
20140313376 | Van Nieuwenhove et al. | Oct 2014 | A1 |
20140340487 | Gilliland et al. | Nov 2014 | A1 |
20140347676 | Velten et al. | Nov 2014 | A1 |
20140350836 | Stettner et al. | Nov 2014 | A1 |
20150002734 | Lee | Jan 2015 | A1 |
20150060673 | Zimdars | Mar 2015 | A1 |
20150077764 | Braker et al. | Mar 2015 | A1 |
20150082353 | Meuninck et al. | Mar 2015 | A1 |
20150116528 | Lapstun | Apr 2015 | A1 |
20150131080 | Retterath | May 2015 | A1 |
20150145955 | Russell | May 2015 | A1 |
20150153271 | Retterath et al. | Jun 2015 | A1 |
20150192677 | Yu et al. | Jul 2015 | A1 |
20150201176 | Graziosi et al. | Jul 2015 | A1 |
20150213576 | Meuninck et al. | Jul 2015 | A1 |
20150245017 | Di Censo | Aug 2015 | A1 |
20150256767 | Schlechter | Sep 2015 | A1 |
20150269736 | Hannuksela et al. | Sep 2015 | A1 |
20150292874 | Shpunt et al. | Oct 2015 | A1 |
20150293226 | Eisele et al. | Oct 2015 | A1 |
20150293228 | Retterath et al. | Oct 2015 | A1 |
20150296201 | Banks | Oct 2015 | A1 |
20150304534 | Kadambi et al. | Oct 2015 | A1 |
20150304665 | Hannuksela et al. | Oct 2015 | A1 |
20150309154 | Lohbihler | Oct 2015 | A1 |
20150319344 | Lapstun | Nov 2015 | A1 |
20150319355 | Lapstun | Nov 2015 | A1 |
20150319419 | Akin et al. | Nov 2015 | A1 |
20150319429 | Lapstun | Nov 2015 | A1 |
20150319430 | Lapstun | Nov 2015 | A1 |
20150378241 | Eldada | Dec 2015 | A1 |
20150379362 | Calmes et al. | Dec 2015 | A1 |
20160003946 | Gilliland et al. | Jan 2016 | A1 |
20160007009 | Offenberg | Jan 2016 | A1 |
20160047901 | Pacala et al. | Feb 2016 | A1 |
20160049765 | Eldada | Feb 2016 | A1 |
20160161600 | Eldada et al. | Jun 2016 | A1 |
20160259038 | Retterath et al. | Sep 2016 | A1 |
20160356881 | Retterath et al. | Dec 2016 | A1 |
20160377529 | Retterath et al. | Dec 2016 | A1 |
20170084176 | Nakamura | Mar 2017 | A1 |
20170103271 | Kawagoe | Apr 2017 | A1 |
20170115395 | Grauer et al. | Apr 2017 | A1 |
20170176578 | Rae | Jun 2017 | A1 |
20170230638 | Wajs et al. | Aug 2017 | A1 |
20170257617 | Retterath | Sep 2017 | A1 |
20170259753 | Meyhofer | Sep 2017 | A1 |
20170350812 | Retterath et al. | Dec 2017 | A1 |
20170358103 | Shao | Dec 2017 | A1 |
20180131924 | Jung | May 2018 | A1 |
20180295344 | Retterath | Oct 2018 | A1 |
20180372621 | Retterath et al. | Dec 2018 | A1 |
20190056498 | Sonn et al. | Feb 2019 | A1 |
20190058867 | Retterath | Feb 2019 | A1 |
20190079165 | Retterath et al. | Mar 2019 | A1 |
20190230297 | Knorr et al. | Jul 2019 | A1 |
20190285732 | Retterath et al. | Sep 2019 | A1 |
20190364262 | Retterath | Nov 2019 | A1 |
20200036958 | Retterath | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
101142822 | Mar 2008 | CN
101373217 | Feb 2009 | CN
102590821 | Jul 2012 | CN
103502839 | Jan 2014 | CN
103616696 | Mar 2014 | CN
103748479 | Apr 2014 | CN
103760567 | Apr 2014 | CN
105093206 | Nov 2015 | CN
1764835 | Mar 2007 | EP
1912078 | Apr 2008 | EP
2005172437 | Jun 2005 | JP
WO 1998010255 | Mar 1998 | WO |
WO 2000019705 | Apr 2000 | WO |
WO 2002015144 | Feb 2002 | WO |
WO 2002101340 | Dec 2002 | WO |
WO 2006121986 | Nov 2006 | WO |
WO 2013081984 | Jun 2013 | WO |
WO 2013127975 | Sep 2013 | WO |
WO 2015126471 | Aug 2015 | WO |
WO 2015156997 | Oct 2015 | WO |
WO 2015198300 | Dec 2015 | WO |
WO 2016190930 | Dec 2016 | WO |
WO 2017149370 | Sep 2017 | WO |
Entry |
---|
Harvey-Lynch, Inc., “Multibeam and Mobile LIDAR Solutions,” 2014, 2 pages. |
Krill et al., “Multifunction Array LIDAR Network for Intruder Detection, Tracking, and Identification,” IEEE ISSNIP, 2010, pp. 43-48. |
Levinson et al., “Unsupervised Calibration for Multi-Beam Lasers,” Stanford Artificial Intelligence Laboratory, 2010, 8 pages. |
Laurenzis et al., "Long-Range Three-Dimensional Active Imaging with Superresolution Depth Mapping," French-German Research Institute of Saint-Louis, vol. 32, No. 21, Nov. 1, 2007, 3 pages. |
Webpage http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan/specifications, Jul. 2015, 2 pages. |
Webpage, 3D LADAR & LIDAR Focal Planes and Instruments, Voxtelopto, 2007-2015, 3 pages. |
"ASC 3D Bringing 3D Alive!," Advanced Scientific Concepts, Inc., Feb. 9, 2010, 14 pages. |
Albota et al., "Three-Dimensional Imaging Laser Radar with a Photon-Counting Avalanche Photodiode Array and Microchip Laser," Dec. 20, 2002, 8 pages. |
Brazzel et al., "Flash LIDAR Based Relative Navigation," 2015 IEEE Aerospace Conference, 2015, 11 pages. |
Love et al., "Active Probing of Cloud Multiple Scattering, Optical Depth, Vertical Thickness, and Liquid Water Content Using Wide-Angle Imaging LIDAR," 2002, 11 pages. |
Itzler, "Focal-Plane Arrays: Geiger-Mode Focal Plane Arrays Enable SWIR 3D Imaging," 2011, 8 pages. |
"Superior Signal-to-Noise Ratio of a New AA1 Sequence for Random-Modulation Continuous-Wave LIDAR," Optics Letters, 2004, vol. 29, No. 15. |
"Frequency-Modulated Continuous-Wave LIDAR Using I/Q Modulator for Simplified Heterodyne Detection," Optics Letters, 2012, vol. 37, No. 11. |
Möller et al., “Robust 3D Measurement with PMD Sensors,” Proceedings of the First Range Imaging Research Day at ETH Zurich, 2005, 14 pages. |
Hussmann et al., "A Performance Review of 3D TOF Vision Systems in Comparison to Stereo Vision Systems," Stereo Vision, 2008, 20 pages. |
Al-Khafaji et al., “Spectral-Spatial Scale Invariant Feature Transform for Hyperspectral Images,” IEEE Transactions on Image Processing, vol. 27, Issue 2, Feb. 2018, 14 pages. |
Ling et al., “Deformation Invariant Image Matching,” Center for Automation Research, Computer Science Department, University of Maryland, College Park, 2005, 8 pages. |
Lindeberg, “Scale Invariant Feature Transform,” Scholarpedia, 7(5):10491, May 2012, 19 pages. |
McCarthy et al., “Long-Range Time-of-Flight Scanning Sensor Based on High-Speed Time-Correlated Single-Photon Counting,” School of Engineering and Physical Sciences, vol. 48, No. 32, Nov. 10, 2009, 11 pages. |
Foix et al., "Exploitation of Time-of-Flight (ToF) Cameras," IRI Technical Report, Institut de Robòtica i Informàtica Industrial (IRI), 2007, 22 pages. |
Dudek, “Adaptive Sensing and Image Processing with a General-Purpose Pixel-Parallel Sensor/Processor Array Integrated Circuit,” School of Electrical and Electronic Engineering, Sep. 2006, 6 pages. |
Dudek, “SCAMP Vision Sensor,” Microelectronics Design Lab, 2013, 6 pages. |
Dudek, “A General-Purpose CMOS Vision Chip with a Processor-Per-Pixel SIMD Array,” Department of Electrical Engineering and Electronics, Sep. 2001, 4 pages. |
Application and File history for U.S. Appl. No. 15/059,811, filed Mar. 3, 2016. Inventors: Retterath. |
Application and File history for U.S. Appl. No. 14/078,001, filed Nov. 12, 2013. Inventors: Retterath et al. |
Application and File history for U.S. Appl. No. 14/251,254, filed Apr. 11, 2014. Inventors: Retterath et al. |
Application and File history for U.S. Appl. No. 15/173,969, filed Jun. 6, 2016. Inventors: Retterath et al. |
Application and File history for U.S. Appl. No. 14/639,802, filed Mar. 5, 2015. Inventors: Retterath et al. |
Application and File history for U.S. Appl. No. 15/853,222, filed Dec. 22, 2017. Inventors: Retterath. |
Application and File history for U.S. Appl. No. 16/167,196, filed Oct. 22, 2018. Inventors: Retterath. |
Application and File history for U.S. Appl. No. 16/537,331, filed Aug. 9, 2019. Inventors: Retterath. |
Application and File history for U.S. Appl. No. 16/537,305, filed Aug. 9, 2019. Inventors: Retterath. |
Application and File history for U.S. Appl. No. 16/047,793, filed Jul. 27, 2018. Inventors: Retterath et al. |
Number | Date | Country |
---|---|---|
20240171857 A1 | May 2024 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16167196 | Oct 2018 | US |
Child | 16537305 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 18524180 | Nov 2023 | US |
Child | 18417282 | | US |
Parent | 17967365 | Oct 2022 | US |
Child | 18524180 | | US |
Parent | 17127461 | Dec 2020 | US |
Child | 17967365 | | US |
Parent | 16537305 | Aug 2019 | US |
Child | 17127461 | | US |
Parent | 15853222 | Dec 2017 | US |
Child | 16167196 | | US |
Parent | 15059811 | Mar 2016 | US |
Child | 15853222 | | US |