The present invention relates generally to methods and devices for projection and capture of optical radiation, and particularly to optical 3D mapping.
Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object. This sort of 3D profile is also referred to as a 3D map, depth map or depth image, and 3D mapping is also referred to as depth mapping.
U.S. Patent Application Publication 2011/0279648 describes a method for constructing a 3D representation of a subject, which comprises capturing, with a camera, a 2D image of the subject. The method further comprises scanning a modulated illumination beam over the subject to illuminate, one at a time, a plurality of target regions of the subject, and measuring a modulation aspect of light from the illumination beam reflected from each of the target regions. A moving-mirror beam scanner is used to scan the illumination beam, and a photodetector is used to measure the modulation aspect. The method further comprises computing a depth aspect based on the modulation aspect measured for each of the target regions, and associating the depth aspect with a corresponding pixel of the 2D image.
U.S. Pat. No. 8,018,579 describes a three-dimensional imaging and display system in which user input is optically detected in an imaging volume by measuring the path length of an amplitude modulated scanning beam as a function of the phase shift thereof. Visual image user feedback concerning the detected user input is presented.
U.S. Pat. No. 7,952,781, whose disclosure is incorporated herein by reference, describes a method of scanning a light beam and a method of manufacturing a microelectromechanical system (MEMS), which can be incorporated in a scanning device.
U.S. Patent Application Publication 2012/0236379 describes a LADAR system that uses MEMS scanning. A scanning mirror includes a substrate that is patterned to include a mirror area, a frame around the mirror area, and a base around the frame. A set of actuators operate to rotate the mirror area about a first axis relative to the frame, and a second set of actuators rotate the frame about a second axis relative to the base. The scanning mirror can be fabricated using semiconductor processing techniques. Drivers for the scanning mirror may employ feedback loops that operate the mirror for triangular motions. Some embodiments of the scanning mirror can be used in a LADAR system for a Natural User Interface of a computing system.
The “MiniFaros” consortium, coordinated by SICK AG (Hamburg, Germany), has supported work on a new laser scanner for automotive applications. Further details are available on the minifaros.eu Web site.
Embodiments of the present invention that are described hereinbelow provide improved optoelectronic modules and methods for production of such modules.
There is therefore provided, in accordance with an embodiment of the present invention, an optoelectronic module, which includes a micro-optical substrate and a beam transmitter, including a laser die mounted on the micro-optical substrate and configured to emit at least one laser beam along a beam axis. A receiver includes a detector die mounted on the micro-optical substrate and configured to sense light received by the module along a collection axis of the receiver. Beam-combining optics are configured to direct the laser beam and the received light so that the beam axis is aligned with the collection axis outside the module.
In some embodiments, the beam-combining optics include a beamsplitter, which is intercepted by both the beam axis and the collection axis. In certain of these embodiments, the beam axis and the collection axis are both perpendicular to the substrate, and the beam-combining optics include a reflector, which is configured to deflect one of the beam axis and the collection axis toward the beamsplitter, so that the beam axis and the collection axis are incident on the beamsplitter at different, respective angles. The beam-combining optics may include a transparent plate having opposing, first and second surfaces, wherein the beamsplitter is formed on the first surface, while the reflector is formed on the second surface. The plate may include a filter formed on one of the surfaces so as to exclude the received light that is outside an emission band of the beam transmitter.
Additionally or alternatively, the beam-combining optics include at least one lens, which is configured to collimate the at least one laser beam and to focus the received light onto the detector die. In one embodiment, the at least one lens includes a bifocal lens, which is configured to collimate the at least one laser beam through a first aperture, and is configured to collect the received light through a second aperture, which is larger than the first aperture.
In some embodiments, the laser die is an edge-emitting die, and the module includes a turning mirror, which is mounted on the substrate and is configured to reflect the at least one laser beam from the laser die so as to direct the laser beam away from the substrate. A groove may be formed in the substrate between the laser die and the turning mirror, wherein the module includes a ball lens, which is mounted in the groove and is configured to collimate the at least one laser beam. In another embodiment, the module includes a lens, which is mounted over the substrate so as to collimate the at least one laser beam after reflection from the turning mirror, wherein the lens has a focal length, which is measured prior to assembly of the laser die on the substrate, and wherein a distance of the laser die from the turning mirror on the substrate is adjusted responsively to the measured focal length.
In other embodiments, the laser die includes a first array of vertical-cavity surface-emitting lasers (VCSELs), and the beam transmitter includes a second array of microlenses, which are respectively aligned with the VCSELs so as to transmit respective laser beams generated by the VCSELs.
In disclosed embodiments, the at least one laser beam and the received light are directed to impinge on a scanning mirror outside the module, wherein the mirror scans both the at least one laser beam and a field of view of the receiver over a scene.
There is further provided, in accordance with an embodiment of the present invention, a method for producing an optoelectronic module. The method includes mounting a beam transmitter, including a laser die configured to emit at least one laser beam along a beam axis, on a micro-optical substrate. A receiver, including a detector die configured to sense light received by the module along a collection axis of the receiver, is mounted on the micro-optical substrate. Beam-combining optics are positioned with respect to the micro-optical substrate so as to direct the laser beam and the received light so that the beam axis is aligned with the collection axis outside the module.
There is moreover provided, in accordance with an embodiment of the present invention, a beam generating device, including a semiconductor substrate, such as GaAs, having an optical passband. A first array of vertical-cavity surface-emitting lasers (VCSELs) is formed on a first face of the semiconductor substrate, wherein the VCSELs are configured to emit respective laser beams through the substrate at a wavelength within the passband. A second array of microlenses is formed on a second face of the semiconductor substrate in respective alignment with the VCSELs so as to transmit the laser beams generated by the VCSELs.
The VCSELs may be offset inwardly relative to the microlenses, so as to cause the respective laser beams to spread apart. Alternatively, the VCSELs may be offset outwardly relative to the microlenses, so as to cause the respective laser beams to converge together to a focal waist.
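The lateral offset between each VCSEL and its microlens steers the corresponding beam, which is what makes the array spread or converge as described above. A minimal paraxial sketch of this relationship follows; the function name, the micron units, and the example values are illustrative assumptions, not taken from the patent.

```python
import math

def steering_angle_deg(offset_um: float, microlens_efl_um: float) -> float:
    """Approximate angular deflection of a VCSEL beam whose emitter is
    laterally offset from the axis of its microlens.

    Simple paraxial model: a point source offset by d from the axis of a
    lens with effective focal length f emerges steered by about atan(d/f).
    Inward offsets steer the beams apart; outward offsets steer them
    toward a common focal waist.
    """
    return math.degrees(math.atan(offset_um / microlens_efl_um))

# e.g., a 10 um offset behind a 500 um EFL microlens steers the beam ~1.1 degrees
angle = steering_angle_deg(10.0, 500.0)
```

The sign convention (inward vs. outward offset) simply flips the steering direction; the magnitude follows the same small-angle geometry.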
There is also provided, in accordance with an embodiment of the present invention, an optoelectronic module, which includes a micro-optical substrate, having a groove formed therein. A beam transmitter, including an edge-emitting laser die, is mounted on the micro-optical substrate adjacent to the groove and is configured to emit a laser beam along a beam axis parallel to the substrate. A ball lens is mounted in the groove and is configured to collimate the laser beam. A turning mirror is mounted on the substrate and is configured to reflect the collimated laser beam exiting the ball lens so as to direct the laser beam away from the substrate. A beam expander is configured to collect and expand the at least one laser beam after reflection from the turning mirror.
There is additionally provided, in accordance with an embodiment of the present invention, a method for producing an optoelectronic module. The method includes forming a groove in a micro-optical substrate and mounting a beam transmitter, including an edge-emitting laser die configured to emit a laser beam along a beam axis, on the micro-optical substrate adjacent to the groove so that the beam axis is parallel to the substrate. A ball lens is mounted in the groove so as to collimate the laser beam. A turning mirror is mounted on the substrate so as to reflect the collimated laser beam exiting the ball lens away from the substrate. A beam expander is mounted over the turning mirror so as to collect and expand the laser beam after reflection from the turning mirror.
In a disclosed embodiment, the beam transmitter, ball lens, turning mirror and beam expander are aligned and fastened in place in the module without powering on the laser die.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
PCT International Publication WO 2012/020380, which is assigned to the assignee of the present patent application and whose disclosure is incorporated herein by reference, describes apparatus for mapping, which includes an illumination module. This module includes a radiation source, which is configured to emit a beam of radiation, and a scanner, which is configured to receive and scan the beam over a selected angular range. Illumination optics project the scanned beam so as to create a pattern of spots extending over a region of interest. An imaging module captures an image of the pattern that is projected onto an object in the region of interest. A processor processes the image in order to construct a three-dimensional (3D) map of the object.
In contrast to such image-based mapping systems, some embodiments of the present invention that are described hereinbelow provide depth engines that generate 3D mapping data by measuring the time of flight of a scanning beam. A light transmitter, such as a laser, directs short pulses of light toward a scanning mirror, which scans the light beam over a scene of interest within a certain scan range. A receiver, such as a sensitive, high-speed photodiode (for example, an avalanche photodiode) receives light returned from the scene via the same scanning mirror. Processing circuitry measures the time delay between the transmitted and received light pulses at each point in the scan. This delay is indicative of the distance traveled by the light beam, and hence of the depth of the object at the point. The processing circuitry uses the depth data thus extracted in producing a 3D map of the scene.
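The time-of-flight depth computation described above reduces to halving the round-trip path length implied by the measured pulse delay. The following sketch illustrates the arithmetic; the function name is a hypothetical illustration, not part of any actual depth-engine firmware.

```python
# Speed of light in vacuum, m/s
C = 299_792_458.0

def depth_from_delay(delay_s: float) -> float:
    """Convert a measured round-trip pulse delay to a depth (Z) value
    in meters.

    The pulse travels from the transmitter to the object and back, so
    the one-way distance is half the path length implied by the delay.
    """
    return C * delay_s / 2.0

# A 10 ns round-trip delay corresponds to a depth of about 1.5 m.
z = depth_from_delay(10e-9)
```

The processing circuitry repeats this computation for each (X,Y) point in the scan to assemble the 3D map.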
Systems based on this sort of depth engine are able to provide dynamic, interactive zooming functionality. The scanner can be controlled so as to cause the beam to scan over a selected window within the scan range and thus generate a 3D map of a part of the scene that is within the selected window. A different window may be selected in each scan of the beam. For example, after first scanning over a wide angular range and creating a wide-angle, low-resolution 3D map of the scene of interest (possibly scanning the entire range), the depth engine may be controlled to zoom in on particular windows or objects that have been identified within the scene. Zooming in this manner enables the depth engine to provide data within the selected window at higher resolution or, alternatively or additionally, to increase the frame rate at which it scans.
Computer 24 processes data generated by engine 22 in order to reconstruct a depth map of VOI 30 containing users 28. In one embodiment, engine 22 emits pulses of light while scanning over the scene and measures the relative delay of the pulses reflected back from the scene. A processor in engine 22 or in computer 24 then computes the 3D coordinates of points in the scene (including points on the surface of the users' bodies) based on the time of flight of the light pulses at each measured point (X,Y) in the scene. This approach is advantageous in that it does not require the users to hold or wear any sort of beacon, sensor, or other marker. It gives the depth (Z) coordinates of points in the scene relative to the location of engine 22 and permits dynamic zooming and shift of the region that is scanned within the scene. Implementation and operation of the depth engine are described in greater detail hereinbelow.
Although computer 24 is shown in
For simplicity and clarity in the description that follows, a set of Cartesian axes is marked in
These dynamic zoom functions are implemented by controlling the scan range of engine 22. Typically, engine 22 scans VOI 30 in a raster pattern. For example, to generate window 32, the X-range of the raster scan is reduced, while the Y-range remains unchanged. This sort of windowing can be conveniently accomplished when the depth engine scans rapidly in the Y-direction, in a resonant scan with a fixed amplitude and frequency (such as 5-10 kHz), while scanning more slowly in the X-direction at the desired frame rate (such as 30 Hz). The X-direction scan is not at a resonant frequency of rotation. Thus, the speed of the X-direction scan can be varied over the scan range so that each frame contains multiple vertical windows, such as scanning a respective window over each of users 28 while skipping over the space between them. As another alternative, the Y-range of the scan may be reduced, thus reducing the overall vertical field of view.
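The variable-speed X scan described above can be pictured as emitting finely spaced samples inside the selected windows and coarse samples (or skips) between them. The following sketch illustrates that sampling pattern; the function, its parameters, and the angle values are hypothetical, chosen only to show the idea.

```python
def x_scan_profile(full_range, windows, slow_step, fast_step):
    """Return the successive X angles visited in one frame.

    full_range -- (x_min, x_max) of the whole horizontal field of view
    windows    -- list of (x_lo, x_hi) high-resolution windows
    slow_step  -- X increment inside a window (fine sampling)
    fast_step  -- X increment outside the windows (coarse sampling)

    The slow Y-resonant scan is orthogonal to this and is not modeled.
    """
    x, x_max = full_range
    positions = []
    while x <= x_max:
        positions.append(x)
        in_window = any(lo <= x <= hi for lo, hi in windows)
        x += slow_step if in_window else fast_step
    return positions

# Fine 0.5-degree steps over each of two users, coarse 2.5-degree steps between them.
profile = x_scan_profile((0.0, 30.0), [(5.0, 10.0), (20.0, 25.0)], 0.5, 2.5)
```

Each frame can use a different window list, which is how the windows track moving users from frame to frame.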
Additionally or alternatively, the Y-range of the scan may be controlled, as well, thus giving scan windows 34 and 36 with different ranges in both X and Y. Furthermore, the Y- and/or X-range and X-offset of the scan may be modulated during each frame, so that non-rectangular windows may be scanned.
Computer 24 may instruct depth engine 22 to change the zoom (i.e., to change the sizes and/or locations of the zoom windows) via a command interface provided by the depth engine. The computer may run an application program interface (API) and/or suitable middleware so that application programs running on the computer can invoke the command interface.
Various zoom control models can be implemented by the computer or, alternatively or additionally, by embedded software in depth engine 22. As noted earlier, the computer or depth engine may change the zoom on the fly based on analysis of the depth map. Initially, the depth engine and computer may operate in a wide-angle, low-resolution search mode, and may then zoom into a higher-resolution tracking mode when a user is identified in the scene. For example, when a user enters the scene, the computer may detect the presence and location of the user and instruct the depth engine to zoom in on his location. When the user then makes a certain gesture, the computer may detect the gesture and instruct the depth engine to zoom in further on the user's hand.
Scanning mirror designs and other details of the depth engine that support the above sorts of schemes are described with reference to the figures that follow.
Optical head 40 comprises a transmitter 44, such as a laser diode, whose output is collimated by a suitable lens. Transmitter 44 outputs a beam of light, which may comprise visible, infrared, and/or ultraviolet radiation (all of which are referred to as “light” in the context of the present description and in the claims). A laser driver, which may similarly be implemented in an ASIC 53, modulates the laser output, so that it emits short pulses, typically with sub-nanosecond rise time. The laser beam is directed toward a scanning micromirror 46, which may be produced and driven using MEMS technology, as described below. The micromirror scans beam 38 over the scene, typically via projection/collection optics, such as a suitable lens (shown in the figures below).
Pulses of light reflected back from the scene are collected by the optics and reflect from the scanning mirror onto a receiver 48. (Alternatively, in place of a single mirror shared by the transmitter and the receiver, a pair of synchronized mirrors may be used, one for the transmitter and the other for the receiver, while still supporting the interactive zooming capabilities of engine 22.) The receiver typically comprises a sensitive, high-speed photodetector, such as an avalanche photodiode (APD), along with a sensitive amplifier, such as a transimpedance amplifier (TIA), which amplifies the electrical pulses output by the photodetector. These pulses are indicative of the times of flight of the corresponding pulses of light.
The pulses that are output by receiver 48 are processed by controller 42 in order to extract depth (Z) values as a function of scan location (X,Y). For this purpose, the pulses may be digitized by a high-speed analog/digital converter (A2D) 56, and the resulting digital values may be processed by depth processing logic 50. The corresponding depth values may be output to computer 24 via a USB port 58 or other suitable interface.
In some cases, particularly near the edges of objects in the scene, a given projected light pulse may result in two reflected light pulses that are detected by receiver 48—a first pulse reflected from the object itself in the foreground, followed by a second pulse reflected from the background behind the object. Logic 50 may be configured to process both pulses, giving two depth values (foreground and background) at the corresponding pixel. These dual values may be used by computer 24 in generating a more accurate depth map of the scene.
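The dual-pulse behavior at object edges can be sketched as follows: when two return pulses are detected for one emitted pulse, the earlier one gives the foreground depth and the later one the background depth. The function name and structure are illustrative assumptions, not the patent's implementation.

```python
C = 299_792_458.0  # speed of light, m/s

def dual_depths(pulse_delays_s):
    """Return (foreground, background) depths in meters for one pixel,
    given the round-trip delays of the reflected pulses detected for a
    single emitted pulse.

    Near an object edge two returns arrive: the earlier from the
    foreground object, the later from the background behind it. With a
    single detection, both values coincide.
    """
    delays = sorted(pulse_delays_s)
    foreground = C * delays[0] / 2.0
    background = C * delays[-1] / 2.0
    return foreground, background

# Two returns 10 ns apart: foreground ~1.5 m, background ~3 m.
fg, bg = dual_depths([20e-9, 10e-9])
```

The downstream software can then keep both values at the corresponding pixel, sharpening edges in the reconstructed depth map.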
Controller 42 also comprises a power converter 57, to provide electrical power to the components of engine 22, and controls the transmit, receive, and scanning functions of optical head 40. For example, a MEMS control circuit 52 in controller 42 may direct commands to the optical head to modify the scanning ranges of mirror 46, as explained above. Position sensors associated with the scanning mirror, such as suitable inductive or capacitive sensors (not shown), may provide position feedback to the MEMS control function. A laser control circuit 54 and a receiver control circuit 55 likewise control aspects of the operation of transmitter 44 and receiver 48, such as amplitude, gain, offset, and bias.
The laser driver in ASIC 53 and/or laser control circuit 54 may control the output power of transmitter 44 adaptively, in order to equalize the level of optical power of the pulses that are incident on receiver 48. This adaptation compensates for variations in the intensity of the reflected pulses that occur due to variations in the distance and reflectivity of objects in different parts of the scene from which the light pulses are reflected. It is thus useful in improving signal/noise ratio while avoiding detector saturation. For example, the power of each transmitted pulse may be adjusted based on the level of the output from the receiver in response to one or more previous pulses, such as the preceding pulse or pulses emitted by the transmitter in the present scan, and/or the pulse at this X,Y position of mirror 46 in the preceding scan. Optionally, the elements of optical head 40 may be configured to transmit and receive “scout pulses,” at full or partial power, for the purpose of assessing returned power or object distance, and may then adjust the output of transmitter 44 accordingly.
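One simple closed-loop form of the adaptive power control described above scales the next pulse by the ratio of the target received level to the level actually measured for a previous pulse, clamped to the transmitter's limits. This is a hypothetical sketch of the control law, not the actual driver logic; all names and limit values are assumptions.

```python
def next_pulse_power(prev_power, received_level, target_level,
                     p_min=0.05, p_max=1.0):
    """Power (normalized, p_min..p_max) for the next transmitted pulse.

    Scales the previous pulse power so that the received level moves
    toward the target, compensating for object distance and reflectivity.
    A zero return means nothing usable came back, so probe at full power.
    """
    if received_level <= 0:
        return p_max
    scaled = prev_power * (target_level / received_level)
    return max(p_min, min(p_max, scaled))

# Received twice the target level: halve the next pulse.
p = next_pulse_power(0.5, received_level=2.0, target_level=1.0)
```

The same update can equally be driven by the return at this (X,Y) mirror position in the preceding scan, or by a dedicated scout pulse.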
Light pulses returned from the scene strike micromirror 46, which reflects the light via turning mirror 62 through beamsplitter 60. Receiver 48 senses the returned light pulses and generates corresponding electrical pulses. To enhance sensitivity of detection, the overall area of beamsplitter 60 and the aperture of receiver 48 are considerably larger than the area of the transmitted beam, and the beamsplitter is accordingly patterned, i.e., the reflective coating extends over only the part of its surface on which the transmitted beam is incident. The reverse side of the beamsplitter may have a bandpass coating, to prevent light outside the emission band of transmitter 44 from reaching the receiver. It is also desirable that micromirror 46 be as large as possible, within the inertial constraints imposed by the scanner. For example, the area of the micromirror may be about 10-15 mm2.
The specific mechanical and optical designs of the optical head shown in
Micromirror 46 is produced by suitably etching a semiconductor substrate 68 to separate the micromirror from a support 72, and to separate the support from the remaining substrate 68. After etching, micromirror 46 (to which a suitable reflective coating is applied) is able to rotate in the Y-direction relative to support 72 on spindles 70, while support 72 rotates in the X-direction relative to substrate 68 on spindles 74.
Micromirror 46 and support 72 are mounted on a pair of rotors 76, which comprise permanent magnets. (Only one of the rotors is visible in the figure.) Rotors 76 are suspended in respective air gaps of magnetic cores 78. Cores 78 are wound with respective coils 80 of conductive wire, thus creating an electromagnetic stator assembly. (Although a single coil per core is shown in
Specifically, coils 80 may be driven with high-frequency differential currents so as to cause micromirror 46 to rotate resonantly back and forth about spindles 70 at high speed (typically in the range of 5-10 kHz, as noted above, although higher or lower frequencies may also be used). This resonant rotation generates the high-speed Y-direction raster scan of the output beam from engine 22. At the same time, coils 80 are driven together at lower frequency to drive the X-direction scan by rotation of support 72 about spindles 74 through the desired scan range. The X- and Y-rotations together generate the overall raster scan pattern of micromirror 46.
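The two-coil drive scheme above superposes a differential current at the Y-axis resonant frequency on a common low-frequency current for the X-axis rotation. The sketch below illustrates that superposition; the function, its signature, and the sinusoidal waveforms are illustrative assumptions (actual drives may use non-sinusoidal, e.g. triangular, slow-axis waveforms).

```python
import math

def coil_currents(t, i_res, f_res, i_slow, f_slow):
    """Instantaneous currents for the two stator coils 80 at time t (s).

    The differential component at the resonant frequency f_res rotates
    micromirror 46 about spindles 70 (fast Y scan); the common component
    at f_slow rotates support 72 about spindles 74 (slow X scan).
    """
    diff = i_res * math.sin(2 * math.pi * f_res * t)
    common = i_slow * math.sin(2 * math.pi * f_slow * t)
    return common + diff, common - diff
```

Summing the two coil currents recovers the common (X-drive) component, while their difference recovers the resonant (Y-drive) component, which is how the two rotations are controlled independently through shared coils.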
The particular MEMS-based scanners shown in
Various scan modes can be enabled by applying appropriate drive signals to the sorts of micromirror-based scanners that are described above. The possibility of zooming in on particular windows was already mentioned above. As noted earlier, even when the entire field of view is scanned, the X-direction scan rate may be varied over the course of the scan to give higher resolution within one or more regions, by scanning the micromirror relatively slowly over these regions, while scanning the remainder of the scene at a faster rate. These high-resolution scans of particular regions can be interlaced, frame by frame, with low-resolution scans over the entire scene by maintaining a fixed X-direction scan rate as the micromirror scans over the scene in one direction (for example, scanning from left to right) to give the low-resolution depth map, and varying the X-direction scan rate between fast and slow while scanning in the reverse direction (on the return scan from right to left) to map the high-resolution window. Other sorts of variable, interlaced scan patterns may similarly be implemented by application of suitable drive signals.
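The interlacing described above alternates a constant-rate forward pass with a variable-rate return pass. The following sketch shows one such frame; the function and its parameters are hypothetical illustrations of the scan pattern, not drive-signal code.

```python
def interlaced_frame(x_min, x_max, coarse_step, window, fine_step):
    """One interlaced frame of X scan positions.

    The left-to-right pass runs at a fixed coarse rate, giving the
    low-resolution map of the whole scene. The right-to-left return
    pass slows down (fine steps) inside the high-resolution window and
    moves quickly (coarse steps) elsewhere.
    """
    forward = []
    x = x_min
    while x <= x_max:
        forward.append(x)
        x += coarse_step
    backward = []
    lo, hi = window
    x = x_max
    while x >= x_min:
        backward.append(x)
        x -= fine_step if lo <= x <= hi else coarse_step
    return forward, backward

# Coarse 2-degree forward pass; 0.5-degree sampling inside the 4-6 degree window on return.
fwd, bwd = interlaced_frame(0.0, 10.0, 2.0, (4.0, 6.0), 0.5)
```

Frame by frame, the window can move or change size, so the high-resolution region tracks objects of interest while the full-scene map stays current.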
Assembly of optical head 40 from discrete optical and mechanical components, as shown in
The laser typically has significantly lower numerical aperture (NA) than lens 110. Therefore, the laser beam at the lens will be much narrower than the return beam captured by the lens. (Optionally, a ball lens may be placed on SiOB 102 between laser die 104 and mirror 108, as shown in
Light returned from the scene via the scanning mirror is collected by lens 110, which focuses the light onto an avalanche photodiode (APD) die 114 on bench 102. The output of the APD is amplified by a transimpedance amplifier (TIA) 116, as explained above. Alternatively, other sorts of detectors and amplifiers may be used in module 100 (and in the alternative module designs that are described below), as long as they have sufficient sensitivity and speed for the application at hand. Lens 110 may present different or similar collimation properties to laser and APD, since transmission and reception use different portions of the lens.
Lens 110 may be produced by means of wafer-level optics or molding of polymeric materials or glass, for example. Such a lens may have “legs,” which create the side walls of module 100, thus sealing the module. Assembly of module 100 may be performed at wafer level, wherein a wafer of SiOB with mounted dies is bonded to a wafer of lenses, and then diced. Alternatively, a spacer wafer with appropriately-formed cavities may be bonded to the SiOB wafer, and the lens wafer bonded on top of it. Further alternatively, the assembly may be carried out using singulated silicon optical benches and lenses. In any case, the entire module 100 will have the form of a hollow cube, typically about 5-8 mm on a side. (Alternatively, the micro-optical bench and the components thereon may be sealed with a transparent cap, and lens 110 with other associated optics may then be assembled as a precision add-on, in both this embodiment and the other embodiments described below).
The angles of mirrors 108 and 118 in the foregoing figures are shown by way of example, and other angles, both greater than and less than 45°, may alternatively be used. It is generally desirable to shield APD die 114 from any stray light, including back-reflected light from the beam emitted by laser die 104. For this reason, the sharper reflection angle of mirror 118 (by comparison with mirror 108 in the embodiment of
The illumination beam emitted by laser die 104 is collimated by a ball lens 134, which is positioned in a groove 135 formed in SiOB 102. Groove 135 may be produced in silicon (and other semiconductor materials) with lithographic precision by techniques that are known in the art, such as wet etching. Alternatively or additionally, the ball lens may be attached directly to the SiOB by an accurate pick-and-place machine, even without groove 135. A turning mirror 136 reflects the collimated beam away from SiOB 102 and through a cover glass 137, which protects the optoelectronic components in module 130. As ball lens 134 typically achieves only partial collimation, a beam expander 138 may be used to expand the laser beam, typically by a factor of three to ten, and thus enhance its collimation. Although beam expander 138 is shown here as a single-element optical component, multi-element beam expanders may alternatively be used. The design of module 130 is advantageous in that it can be assembled accurately without requiring active alignment, i.e., assembly and alignment can be completed to within fine tolerance without actually powering on laser die 104.
The collimated beam that is output by beam expander 138 is turned by a reflector 144 in beam combiner 142, and is then turned back outward toward the scanning mirror by a beamsplitter 146. Assuming laser die 104 to output a polarized beam, beamsplitter 146 may advantageously be polarization-dependent, as explained above with reference to
Although beam combiner 142 is shown in
If lenses 164 and 166 have tight manufacturing tolerances, they can be assembled in place using machine vision techniques to align their optical centers with the appropriate axes of module 160, on top of cover glass 162. Such miniature lenses, however, typically have large manufacturing tolerances, commonly on the order of 1-5%, particularly when the lenses are mass-produced in a wafer-scale process. Such tolerance could, if not measured and accounted for, result in poor collimation of the beam from laser die 104.
To avoid this sort of situation, the actual effective focal length (EFL) of collimation lens 164 can be measured in advance. For example, when lenses 164 are fabricated in a wafer-scale process, the EFL of each lens can be measured precisely at the wafer level, before module 160 is assembled. The distance of laser die 104 from turning mirror 136 on the substrate in each module 160 can then be adjusted at the time of fabrication, as illustrated by the horizontal arrow in
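To first order, the placement adjustment described above simply moves the laser die along the substrate by the same amount that the measured EFL deviates from nominal, keeping the emitter at the focal distance of its particular lens. The sketch below illustrates this compensation under that simplifying assumption (unit-magnification folding at the turning mirror, no refractive elements in between); all names and values are hypothetical.

```python
def adjusted_die_distance(measured_efl_mm: float,
                          nominal_efl_mm: float,
                          nominal_distance_mm: float) -> float:
    """Distance from laser die 104 to turning mirror 136 (mm) that keeps
    the emitter at the focal distance of a collimation lens whose
    measured EFL deviates from its nominal design value.

    First-order model: a lens that focuses 0.5 mm longer than nominal
    requires the die to sit 0.5 mm farther from the mirror.
    """
    return nominal_distance_mm + (measured_efl_mm - nominal_efl_mm)

# A lens measured at 2.5 mm EFL against a 2.0 mm nominal design,
# with a 1.5 mm nominal die-to-mirror distance:
d = adjusted_die_distance(2.5, 2.0, 1.5)
```

Because the EFL of each lens is known at the wafer level before assembly, the pick-and-place machine can apply this per-module offset without ever powering on the laser.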
A pick-and-place machine may similarly be used to position collection lens 166. Because of the less stringent geometrical constraints of the collected beam and the relatively large size of APD 114, however, EFL variations of the collection lens are less critical. Thus, as an alternative to mounting collection lens 166 on cover glass 162 as shown in
Alternatively, as noted earlier, modules based on the principles of the embodiments described above may be fabricated on other sorts of micro-optical substrates, such as ceramic or glass substrates. Ceramic materials may be advantageous in terms of electrical performance.
In other alternative embodiments (not shown in the figures), the transmitting and receiving portions of the optoelectronic module may be mounted separately on two different micro-optical benches. This approach may be advantageous since the requirements for the receiver are high bandwidth, low loss for high-frequency signals, and low price, while for the transmitter the main requirements are thermal conductivity, as well as hermetic sealing for reliability of the laser diode.
Reference is now made to
Beam generator 172 comprises an array of surface-emitting devices 178, such as vertical-cavity surface-emitting lasers (VCSELs). The beams emitted by devices 178 are collected by a corresponding array of microlenses 176, which direct the beams toward a collimation lens 175. Devices 178 and microlenses 176 may conveniently be formed on opposing faces of a transparent optical substrate 180, such as a suitable semiconductor wafer, for example a GaAs wafer. (GaAs has an optical passband that begins at about 900 nm, i.e., it is transparent at wavelengths longer than about 900 nm, and will thus pass the radiation at such wavelengths that is emitted by devices 178 on the back side of substrate 180.) The thickness of substrate 180 is typically about 0.5 mm, although smaller or larger dimensions may alternatively be used. As shown most clearly in
Surface-emitting devices 178 in beam transmitters 170 and 186 may be driven individually or in predefined groups in order to change the characteristics of the beam that is output by lens 175. For example, all of devices 178 may be driven together to give a large-diameter, intense beam, or only the center device alone or the central group of seven devices together may be driven to give smaller-diameter, less intense beams. Although
In module 190, beam generator 172 (as illustrated in FIGS. 11B/C) is mounted on a micro-optical substrate 192, such as a SiOB, along with a receiver 194, which contains a suitable detector, such as an APD, for example, as described above. A beam combiner 196 combines the transmitted and received beams, which pass through lens 175 toward the scanning mirror (not shown in
The received beam collected by lens 175 enters beam combiner 196 through window 202, reflects internally from beamsplitter coating 200 and reflective coating 198, and then exits through a rear window 204 toward receiver 194. The thickness of the beam combiner plate is chosen to give the desired optical path length (which is longer than the back focal length of lens 175 would be otherwise). To reduce the amount of stray light reaching the receiver, window 204 may be located at the focus of lens 175 and thus can be made as small as possible. Window 204 (as well as window 202) may have a narrowband filter coating, so that ambient light that is outside the emission band of beam generator 172 is excluded.
Although in
A beam combiner 224 used in this embodiment has a front window 226 that is large enough to accommodate beam 232, but a much smaller window 228 in reflective coating 198 on the rear side. Window 228 need only be large enough to accommodate the narrow beam transmitted by beam generator 188. Consequently, most of the energy in beam 232 is reflected inside the beam combiner by reflective coating 198 and reaches receiver 194 via rear window 204 (which may be made small and coated with a narrowband coating, as described above). There is no need for a beamsplitter coating in this embodiment, and beam generator 188 may therefore comprise unpolarized, multimode surface-emitting devices.
Although the embodiments described above use a single detector element (such as an APD) to detect scanned light that is returned from the scene, other sorts of detector configurations may alternatively be used. For example, a linear array of photodetectors may be used for this purpose, in which case the mirror used in collecting light from the scene need scan in only a single direction, perpendicular to the axis of the array. This same one-dimensional scanning mirror can be used to project a line of laser radiation onto the instantaneous field of view of the detector array. Such a system is also capable of zoom functionality, which can be achieved on one axis by changing the scan pattern and amplitude along the one-dimensional scan.
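A minimal sketch of the one-axis zoom (the function and its parameters are illustrative assumptions): holding the number of depth-map columns fixed while narrowing the mirror amplitude increases the angular sampling density over the region of interest.

```python
from typing import List

def scan_angles(num_columns: int, amplitude_deg: float,
                center_deg: float = 0.0) -> List[float]:
    """Mirror angles for one frame of a one-dimensional scan.

    Each angle produces one column of the depth map (the detector array
    covers the other axis).  Narrowing amplitude_deg at a fixed
    num_columns zooms in: the same column count samples a smaller
    angular range more densely.
    """
    if num_columns == 1:
        return [center_deg]
    step = 2.0 * amplitude_deg / (num_columns - 1)
    return [center_deg - amplitude_deg + i * step for i in range(num_columns)]
```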
As another alternative, a 2D matrix of photo-detectors with a stationary collection lens may be used to collect scanned light from the scene, covering the entire field of view so that no mechanical scanning of the receiver is needed. The transmitting laser is still scanned in two dimensions using a MEMS mirror, for example. The pixel positions in the resulting depth map are determined by the high precision of the scan, rather than the relatively lower resolution of the detector matrix. This approach has the advantages that alignment is easy (since the detector matrix is stationary); the scanning mirror can be small since it is not used to collect light, only to project the laser; and the collection aperture can be large. For example, using a collection lens of 6 mm focal length and detectors with a pitch of 0.1 mm, the field of view of each detector is approximately 1°. Thus, 60×60 detectors are needed for a 60° field of view. The resolution, as determined by the scan accuracy, however, can reach 1000×1000 points.
Another variant of this scheme may use multiple beams (created, for example, by a beamsplitter in the optical path of the transmitted beam after it reflects from the MEMS mirror). These beams create simultaneous readings on different detectors in the matrix, thus enabling simultaneous acquisition of several depth regions and points. It is desirable for this purpose that the beams themselves not overlap and be far enough apart in angular space so as not to overlap on any single element of the matrix.
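The non-overlap condition can be expressed, for a single angular axis, as requiring that no two simultaneous beams map to the same matrix element (an illustrative sketch; the binning model is an assumption):

```python
from typing import Sequence

def beams_distinct(beam_angles_deg: Sequence[float],
                   detector_fov_deg: float) -> bool:
    """True if each beam falls on a different detector element.

    Each beam angle is binned to the matrix element covering it; the
    beams support simultaneous depth readings only if every bin is
    distinct (i.e., the beams are separated by more than one element's
    field of view).
    """
    bins = {round(a / detector_fov_deg) for a in beam_angles_deg}
    return len(bins) == len(beam_angles_deg)
```

For example, with a 1 degree per-element field of view, beams at 0, 2, and 4 degrees land on distinct elements, whereas beams 0.3 degrees apart do not.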
More generally, although each of the different optoelectronic modules and other system components described above has certain particular features, this description is not meant to limit the particular features to the specific embodiments with respect to which the features are described. Those skilled in the art will be capable of combining features from two or more of the above embodiments in order to create other systems and modules with different combinations of the features described above. All such combinations are considered to be within the scope of the present invention.
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application claims the benefit of U.S. Provisional Patent Application 61/598,921, filed Feb. 15, 2012, which is incorporated herein by reference. This application is related to another U.S. patent application, Ser. No. 13/766,801, filed on even date, entitled "Scanning Depth Engine."
Number | Name | Date | Kind |
---|---|---|---|
3401590 | Gail et al. | Sep 1968 | A |
3918068 | Reinke et al. | Nov 1975 | A |
4003626 | Reinke et al. | Jan 1977 | A |
4336978 | Suzuki | Jun 1982 | A |
4542376 | Bass et al. | Sep 1985 | A |
4802759 | Matsumoto | Feb 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
5075562 | Greivenkamp et al. | Dec 1991 | A |
5483261 | Yasutake | Jan 1996 | A |
5606181 | Sakuma et al. | Feb 1997 | A |
5630043 | Uhlin | May 1997 | A |
5636025 | Bieman et al. | Jun 1997 | A |
5712682 | Hannah | Jan 1998 | A |
5742419 | Dickensheets et al. | Apr 1998 | A |
5835218 | Harding | Nov 1998 | A |
5838428 | Pipitone et al. | Nov 1998 | A |
5856871 | Cabib et al. | Jan 1999 | A |
5909312 | Mendlovic et al. | Jun 1999 | A |
5938989 | Hambright | Aug 1999 | A |
6041140 | Binns et al. | Mar 2000 | A |
6081269 | Quarendon | Jun 2000 | A |
6084712 | Harding | Jul 2000 | A |
6088105 | Link | Jul 2000 | A |
6099134 | Taniguchi et al. | Aug 2000 | A |
6100517 | Yahav et al. | Aug 2000 | A |
6101269 | Hunter et al. | Aug 2000 | A |
6108036 | Harada et al. | Aug 2000 | A |
6140979 | Gerhard et al. | Oct 2000 | A |
6167151 | Albeck | Dec 2000 | A |
6259561 | George et al. | Jul 2001 | B1 |
6262740 | Lauer et al. | Jul 2001 | B1 |
6268923 | Michniewicz et al. | Jul 2001 | B1 |
6301059 | Huang et al. | Oct 2001 | B1 |
6377700 | Mack et al. | Apr 2002 | B1 |
6438263 | Albeck et al. | Aug 2002 | B2 |
6494837 | Kim et al. | Dec 2002 | B2 |
6495848 | Rubbert | Dec 2002 | B1 |
6517751 | Hambright | Feb 2003 | B1 |
6686921 | Rushmeier et al. | Feb 2004 | B1 |
6700669 | Geng | Mar 2004 | B1 |
6731391 | Kao et al. | May 2004 | B1 |
6741251 | Malzbender | May 2004 | B2 |
6750906 | Itani et al. | Jun 2004 | B1 |
6751344 | Grumbine | Jun 2004 | B1 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6759646 | Acharya et al. | Jul 2004 | B1 |
6803777 | Pfaff et al. | Oct 2004 | B2 |
6810135 | Berenz et al. | Oct 2004 | B1 |
6813440 | Yu et al. | Nov 2004 | B1 |
6825985 | Brown et al. | Nov 2004 | B2 |
6841780 | Cofer et al. | Jan 2005 | B2 |
6859326 | Sales | Feb 2005 | B2 |
6937348 | Geng | Aug 2005 | B2 |
7006952 | Matsumoto et al. | Feb 2006 | B1 |
7009742 | Brotherton-Ratcliffe et al. | Mar 2006 | B2 |
7013040 | Shiratani | Mar 2006 | B2 |
7076024 | Yokhin | Jul 2006 | B2 |
7112774 | Baer | Sep 2006 | B2 |
7120228 | Yokhin et al. | Oct 2006 | B2 |
7127101 | Littlefield et al. | Oct 2006 | B2 |
7194105 | Hersch et al. | Mar 2007 | B2 |
7231069 | Nahata | Jun 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7335898 | Donders et al. | Feb 2008 | B2 |
7369685 | DeLean | May 2008 | B2 |
7385708 | Ackerman et al. | Jun 2008 | B2 |
7433024 | Garcia et al. | Oct 2008 | B2 |
7551719 | Yokhin et al. | Jun 2009 | B2 |
7560679 | Gutierrez | Jul 2009 | B1 |
7659995 | Knighton et al. | Feb 2010 | B2 |
7700904 | Toyoda et al. | Apr 2010 | B2 |
7751063 | Dillon et al. | Jul 2010 | B2 |
7811825 | Fauver et al. | Oct 2010 | B2 |
7840031 | Albertson et al. | Nov 2010 | B2 |
7936450 | Hoersch et al. | May 2011 | B2 |
7952781 | Weiss et al. | May 2011 | B2 |
8018579 | Krah | Sep 2011 | B1 |
8035806 | Jin et al. | Oct 2011 | B2 |
8126261 | Medioni et al. | Feb 2012 | B2 |
8319172 | Klein et al. | Nov 2012 | B2 |
8326025 | Boughorbel | Dec 2012 | B2 |
8437063 | Weiss et al. | May 2013 | B2 |
20010016063 | Albeck et al. | Aug 2001 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020075456 | Shiratani | Jun 2002 | A1 |
20030048237 | Sato et al. | Mar 2003 | A1 |
20030057972 | Pfaff et al. | Mar 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20030162313 | Kim et al. | Aug 2003 | A1 |
20040001145 | Abbate | Jan 2004 | A1 |
20040004775 | Turner | Jan 2004 | A1 |
20040040648 | Harden et al. | Mar 2004 | A1 |
20040063235 | Chang | Apr 2004 | A1 |
20040070816 | Kato et al. | Apr 2004 | A1 |
20040105580 | Hager et al. | Jun 2004 | A1 |
20040130730 | Cantin et al. | Jul 2004 | A1 |
20040130790 | Sales | Jul 2004 | A1 |
20040174770 | Rees | Sep 2004 | A1 |
20040207744 | Bock | Oct 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20040218262 | Chuang et al. | Nov 2004 | A1 |
20040228519 | Littlefield et al. | Nov 2004 | A1 |
20040264764 | Kochi et al. | Dec 2004 | A1 |
20050018209 | Lemelin et al. | Jan 2005 | A1 |
20050052637 | Shaw et al. | Mar 2005 | A1 |
20050111705 | Waupotitsch et al. | May 2005 | A1 |
20050134582 | Claus et al. | Jun 2005 | A1 |
20050135555 | Claus et al. | Jun 2005 | A1 |
20050200838 | Shaw et al. | Sep 2005 | A1 |
20050200925 | Brotherton-Ratcliffe et al. | Sep 2005 | A1 |
20050231465 | DePue et al. | Oct 2005 | A1 |
20050271279 | Fujimura et al. | Dec 2005 | A1 |
20060017656 | Miyahara | Jan 2006 | A1 |
20060072851 | Kang et al. | Apr 2006 | A1 |
20060156756 | Becke | Jul 2006 | A1 |
20060221218 | Adler et al. | Oct 2006 | A1 |
20060221250 | Rossbach et al. | Oct 2006 | A1 |
20060252169 | Ashida | Nov 2006 | A1 |
20060269896 | Liu et al. | Nov 2006 | A1 |
20070057946 | Albeck et al. | Mar 2007 | A1 |
20070060336 | Marks et al. | Mar 2007 | A1 |
20070133840 | Cilia | Jun 2007 | A1 |
20070165243 | Kang et al. | Jul 2007 | A1 |
20070262985 | Watanabe et al. | Nov 2007 | A1 |
20080018595 | Hildreth et al. | Jan 2008 | A1 |
20080031513 | Hart | Feb 2008 | A1 |
20080037829 | Givon | Feb 2008 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080118143 | Gordon et al. | May 2008 | A1 |
20080143196 | Sprague et al. | Jun 2008 | A1 |
20080198355 | Domenicali et al. | Aug 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080225368 | Ciabattoni et al. | Sep 2008 | A1 |
20080240502 | Freedman et al. | Oct 2008 | A1 |
20080247670 | Tam et al. | Oct 2008 | A1 |
20080278572 | Gharib et al. | Nov 2008 | A1 |
20080285827 | Meyer et al. | Nov 2008 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090046152 | Aman | Feb 2009 | A1 |
20090060307 | Ghanem et al. | Mar 2009 | A1 |
20090096783 | Shpunt et al. | Apr 2009 | A1 |
20090183125 | Magal et al. | Jul 2009 | A1 |
20090183152 | Yang et al. | Jul 2009 | A1 |
20090185274 | Shpunt | Jul 2009 | A1 |
20090226079 | Katz et al. | Sep 2009 | A1 |
20090244309 | Maison et al. | Oct 2009 | A1 |
20090284817 | Orcutt | Nov 2009 | A1 |
20100007717 | Spektor et al. | Jan 2010 | A1 |
20100013860 | Mandella et al. | Jan 2010 | A1 |
20100020078 | Shpunt | Jan 2010 | A1 |
20100046054 | Jeong et al. | Feb 2010 | A1 |
20100118123 | Freedman et al. | May 2010 | A1 |
20100128221 | Muller et al. | May 2010 | A1 |
20100142014 | Rosen et al. | Jun 2010 | A1 |
20100177164 | Zalevsky et al. | Jul 2010 | A1 |
20100182406 | Benitez | Jul 2010 | A1 |
20100194745 | Leister et al. | Aug 2010 | A1 |
20100201811 | Garcia et al. | Aug 2010 | A1 |
20100225746 | Shpunt et al. | Sep 2010 | A1 |
20100243899 | Ovsiannikov et al. | Sep 2010 | A1 |
20100245826 | Lee | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100278384 | Shotton et al. | Nov 2010 | A1 |
20100284082 | Shpunt et al. | Nov 2010 | A1 |
20100290698 | Shpunt et al. | Nov 2010 | A1 |
20100302617 | Zhou | Dec 2010 | A1 |
20100303289 | Polzin et al. | Dec 2010 | A1 |
20110001799 | Rothenberger et al. | Jan 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110043403 | Loffler | Feb 2011 | A1 |
20110074932 | Gharib et al. | Mar 2011 | A1 |
20110096182 | Cohen et al. | Apr 2011 | A1 |
20110134114 | Rais et al. | Jun 2011 | A1 |
20110158508 | Shpunt et al. | Jun 2011 | A1 |
20110187878 | Mor et al. | Aug 2011 | A1 |
20110188054 | Mor et al. | Aug 2011 | A1 |
20110211044 | Shpunt et al. | Sep 2011 | A1 |
20110228251 | Yee et al. | Sep 2011 | A1 |
20110279648 | Lutian et al. | Nov 2011 | A1 |
20110285910 | Bamji et al. | Nov 2011 | A1 |
20110310125 | McEldowney et al. | Dec 2011 | A1 |
20120012899 | Jin et al. | Jan 2012 | A1 |
20120051588 | McEldowney | Mar 2012 | A1 |
20120140094 | Shpunt et al. | Jun 2012 | A1 |
20120140109 | Shpunt et al. | Jun 2012 | A1 |
20120236379 | DaSilva et al. | Sep 2012 | A1 |
20120249744 | Pesach et al. | Oct 2012 | A1 |
20120281240 | Cohen et al. | Nov 2012 | A1 |
20130127854 | Shpunt et al. | May 2013 | A1 |
20130206967 | Shpunt et al. | Aug 2013 | A1 |
20130207970 | Shpunt et al. | Aug 2013 | A1 |
20140291496 | Shpunt et al. | Oct 2014 | A1 |
Number | Date | Country |
---|---|---|
19736169 | Aug 1997 | DE |
19638727 | Mar 1998 | DE |
2333603 | Jun 2011 | EP |
2271436 | Apr 1994 | GB |
2352901 | Feb 2001 | GB |
62206684 | Sep 1987 | JP |
01-240863 | Sep 1989 | JP |
03-029806 | Feb 1991 | JP |
H03-040591 | Feb 1991 | JP |
06-273432 | Sep 1994 | JP |
H08-186845 | Jul 1996 | JP |
H10-327433 | Dec 1998 | JP |
2000131040 | May 2000 | JP |
2001141430 | May 2001 | JP |
2002122417 | Apr 2002 | JP |
2002-152776 | May 2002 | JP |
2002-213931 | Jul 2002 | JP |
2002-365023 | Dec 2002 | JP |
2004191918 | Jul 2004 | JP |
2006-128818 | May 2006 | JP |
9303579 | Feb 1993 | WO |
9827514 | Jun 1998 | WO |
9828593 | Jul 1998 | WO |
0247241 | Jun 2002 | WO |
03049156 | Jun 2003 | WO |
2005010825 | Feb 2005 | WO |
2012020380 | Feb 2012 | WO |
2012066501 | May 2012 | WO |
Entry |
---|
Abramson, N., “Holographic Contouring by Translation”, Applied Optics Journal, vol. 15, No. 4, pp. 1018-1976, Apr. 1976. |
Achan et al., “Phase Unwrapping by Minimizing Kikuchi Free Energy”, IEEE International Geoscience and Remote Sensing Symposium, pp. 1738-1740, Toronto, Canada, Jun. 2002. |
Theocaris et al., “Radial Gratings as Moire Gauges”, Journal of Scientific Instruments (Journal of Physics E), series 2, vol. 1, year 1968. |
Chinese Patent Application # 200780016625.5 Official Action dated Oct. 26, 2010. |
International Application PCT/IL2009/000285 Search Report dated Jun. 11, 2009. |
Brooks et al., “Moire Gauging Using Optical Interference Patterns”, Applied Optics Journal, vol. 8, No. 5, pp. 935-940, May 1969. |
Hovanesian et al., "Moire Contour-Sum, Contour-Difference, and Vibration Analysis of Arbitrary Objects", Applied Optics Journal, vol. 10, No. 12, pp. 2734-2738, Dec. 1971. |
Bryngdahl, O., “Characteristics of Superposed Patterns in Optics”, Journal of Optical Society of America, vol. 66, No. 2, pp. 87-94, Feb. 1976. |
International Application PCT/IL2008/000095 Search Report dated Jul. 24, 2008. |
Chen et al., “Overview of Three-Dimensional Shape Measurement Using Optical Methods”, Society of Photo-Optical Instrumentation Engineers Journal 39(1), pp. 10-22, Jan. 2000. |
Cohen et al., “High-Resolution X-ray Diffraction for Characterization and Monitoring of Silicon-On-Insulator Fabrication Processes”, Applied Physics Journal, vol. 93, No. 1, pp. 245-250, Jan. 2003. |
Zhang et al., “Shape from intensity gradient”, IEEE Transactions on Systems, Man and Cybernetics—Part A: Systems and Humans, vol. 29, No. 3, pp. 318-325, May 1999. |
Doty, J.L., “Projection Moire for Remote Contour Analysis”, Journal of Optical Society of America, vol. 73, No. 3, pp. 366-372, Mar. 1983. |
Ben Eliezer et al., “Experimental Realization of an Imaging System with an Extended Depth of Field”, Applied Optics Journal, vol. 44, No. 14, pp. 2792-2798, May 10, 2005. |
Tay et al., “Grating Projection System for Surface Contour Measurement”, Applied Optics Journal, vol. 44, No. 8, pp. 1393-1400, Mar. 10, 2005. |
Takeda et al., “Fourier Transform Methods of Fringe-Pattern Analysis for Computer-Based Topography and Interferometry”, Journal of Optical Society of America, vol. 72, No. 1, Jan. 1982. |
Takasaki, H., “Moire Topography”, Applied Optics Journal, vol. 12, No. 4, pp. 845-850, Apr. 1973. |
Takasaki, H., “Moire Topography”, Applied Optics Journal, vol. 9, No. 6, pp. 1467-1472, Jun. 1970. |
Hildebrand et al., "Multiple-Wavelength and Multiple-Source Holography Applied to Contour Generation", Journal of Optical Society of America, vol. 57, No. 2, pp. 155-162, Feb. 1967. |
Su et al., “Application of Modulation Measurement Profilometry to Objects with Surface Holes”, Applied Optics Journal, vol. 38, No. 7, pp. 1153-1158, Mar. 1, 1999. |
Btendo, “Two Uni-axial Scanning Mirrors Vs One Bi-axial Scanning Mirror”, Kfar Saba, Israel, Aug. 13, 2008. |
Hung et al., “Time-Averaged Shadow-Moire Method for Studying Vibrations”, Applied Optics Journal, vol. 16, No. 6, pp. 1717-1719, Jun. 1977. |
Idesawa et al., “Scanning Moire Method and Automatic Measurement of 3-D Shapes”, Applied Optics Journal, vol. 16, No. 8, pp. 2152-2162, Aug. 1977. |
Iizuka, K., "Divergence-Ratio Axi-Vision Camera (Divcam): A Distance Mapping Camera", Review of Scientific Instruments 77, 045111 (2006). |
Lim et al., “Additive Type Moire with Computer Image Processing”, Applied Optics Journal, vol. 28, No. 13, pp. 2677-2680, Jul. 1, 1989. |
Piestun et al., “Wave Fields in Three Dimensions: Analysis and Synthesis”, Journal of the Optical Society of America, vol. 13, No. 9, pp. 1837-1848, Sep. 1996. |
Post et al., “Moire Methods for Engineering and Science—Moire Interferometry and Shadow Moire”, Photomechanics (Topics in Applied Physics), vol. 77, pp. 151-196, Springer Berlin / Heidelberg, Jan. 1, 2000. |
Chinese Patent Application # 200780006560.6 Official Action dated Oct. 11, 2010. |
International Application PCT/IB2010/053430 Search Report dated Dec. 28, 2010. |
Scharstein et al., “High-Accuracy Stereo Depth Maps Using Structured Light”, IEEE Proceedings of the Conference on Computer Vision and Pattern Recognition, pp. 165-171, Jun. 18, 2003. |
Koschan et al., “Dense Depth Maps by Active Color Illumination and Image Pyramids”, Advances in Computer Vision, pp. 137-148, Springer 1997. |
Marcia et al., “Fast Disambiguation of Superimposed Images for Increased Field of View”, IEEE International Conference on Image Processing, San Diego, USA, Oct. 12-15, 2008. |
Microvision Inc., “Micro-Electro-Mechanical System (MEMS) Scanning Mirror”, years 1996-2009. |
Chinese Patent Application # 200780006560.6 Official Action dated Feb. 1, 2011. |
Yao Kun et al., “Measurement of Space Distribution of Laser Gaussian Beam by Speckles Displacement Method” High Power Laser and Particle Beams, vol. 12, No. 2, pp. 141-144, Apr. 30, 2000. |
Lavoie et al., “3-D Object Model Recovery From 2-D Images Using Structured Light”, IEEE Transactions on Instrumentation and Measurement, vol. 53, No. 2, pp. 437-443, Apr. 2004. |
Chinese Application # 200780016625.5 Office Action dated May 12, 2011. |
U.S. Appl. No. 11/899,542 Office Action dated Apr. 4, 2011. |
U.S. Appl. No. 11/724,068 Office Action dated Mar. 1, 2011. |
Chinese Application # 200780009053.8 Office Action dated Mar. 10, 2011. |
Japanese Application # 2008535179 Office Action dated Apr. 1, 2011. |
Kun et al., "Gaussian Laser Beam Spatial Distribution Measurement by Speckles Displacement Method", High Power Laser and Particle Beams, vol. 12, No. 2, Apr. 2000. |
Chinese Patent Application # 200680038004.2 Official Action dated Dec. 24, 2010. |
Judy et al., “Magnetic Microactuation of Polysilicon Flexure Structures,” Solid-State Sensor and Actuator Workshop, year 1994. |
Judy et al., “Magnetically Actuated, Addressable Microstructures,” Journal of Microelectromechanical Systems, vol. 6, No. 3, pp. 249-256, Sep. 1997. |
Cho et al., “A Scanning Micromirror Using a Bi-Directionally Movable Magnetic Microactuator,” Proceedings of SPIE, MOEMS and Miniaturized Systems, vol. 4178, pp. 106-115, USA 2000. |
Hamamatsu Photonics K.K., “Position sensitive detectors”, Japan, Feb. 2010. |
Gale, M.T., “Replication Technology for Diffractive Optical Elements”, Proceedings of SPIE, vol. 3010, pp. 111-123, May 15, 1997. |
Kolste et al., “Injection Molding for Diffractive Optics”, Proceedings of SPIE, vol. 2404, pp. 129-131, Feb. 9, 1995. |
Gale et al., "Replicated Microstructures for Integrated Optics", Proceedings of SPIE, vol. 2513, pp. 2-10, Aug. 29, 1994. |
Jahns et al., “Diffractive Optics and Micro-Optics: Introduction to the Feature Issue”, Applied Optics Journal, vol. 36, No. 20, pp. 4633-4634, Jul. 10, 1997. |
Nikolejeff et al., “Replication of Continuous Relief Diffractive Optical Elements by Conventional Compact Disc Injection-Molding Techniques”, Applied Optics Journal, vol. 36, No. 20, pp. 4655-4659, Jul. 10, 1997. |
Neyer et al., “New Fabrication Technology for Polymer Optical Waveguides”, Integrated Photonics Research, pp. 248-249, year 1992. |
Neyer et al., “Fabrication of Low Loss Polymer Waveguides Using Injection Moulding Technology”, Electronics Letters, vol. 29, No. 4, pp. 399-401, Feb. 18, 1993. |
Optical Society of America, “Diffractive Optics and Micro-Optics”, 1996 Technical Digest Series, vol. 5, Boston, USA, Apr. 29-May 2, 1996. |
Lintec Corporation, “Adwill D-510T Tape”, Japan, Apr. 4, 2006. |
EP Patent Application # 05804455.3 Official Action dated Feb. 15, 2010. |
International Application PCT/IL2005/001194 Search Report dated Jun. 6, 2006. |
U.S. Appl. No. 11/667,709 Official Action dated Apr. 30, 2010. |
Stark, B., “MEMS Reliability Assurance Guidelines for Space Applications”, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, USA, Jan. 1999. |
U.S. Appl. No. 13/100,312 Official Action dated Oct. 31, 2011. |
Mor et al., U.S. Appl. No. 12/723,644, filed Mar. 14, 2010. |
U.S. Appl. No. 12/723,644 Official Action dated Jan. 19, 2012. |
U.S. Appl. No. 12/723,644 Official Action dated Sep. 13, 2012. |
Sromin et al., PCT Application PCT/IB2013/056101 filed Jul. 25, 2013. |
Shpunt et al., U.S. Appl. No. 61/835,657, filed Jun. 17, 2013. |
Shpunt et al., U.S. Appl. No. 61/835,653, filed Jun. 17, 2013. |
Chayat et al., U.S. Appl. No. 13/798,251, filed Mar. 13, 2013. |
Erlich et al., U.S. Appl. No. 61/781,086, filed Mar. 14, 2013. |
Erlich et al., U.S. Appl. No. 61/717,427, filed Oct. 23, 2012. |
Fujita et al., “Dual-Axis MEMS Mirror for Large Deflection-Angle Using SU-8 Soft Torsion Beam,” Sensors and Actuators A: Physical, vol. 121, issue 1, pp. 16-21, May 2005. |
Shpunt et al., U.S. Appl. No. 61/786,711, filed Mar. 15, 2013. |
Stone et al., “Performance Analysis of Next-Generation LADAR for Manufacturing, Construction, and Mobility”, National Institute of Standards and Technology, document # NISTIR 7117, Gaithersburg, USA, May 2004. |
European Patent Application # 12155674.0 Search Report dated May 3, 2013. |
U.S. Appl. No. 12/844,864 Office Action dated Sep. 26, 2013. |
U.S. Appl. No. 13/921,224 Office Action dated Oct. 3, 2013. |
U.S. Appl. No. 12/958,427 Office Action dated Nov. 22, 2013. |
Minifaros, “D1.1-ProjectPresentation”, V3.0, 36 pages, Dec. 22, 2010. |
U.S. Appl. No. 12/522,171 Official Action dated Apr. 5, 2012. |
U.S. Appl. No. 12/397,362 Official Action dated Apr. 24, 2012. |
International Application PCT/IB2011/053560 Search Report dated Jan. 19, 2012. |
International Application PCT/IB2011/055155 Search Report dated Apr. 20, 2012. |
U.S. Appl. No. 13/036,023 Official Action dated Jan. 7, 2013. |
U.S. Appl. No. 12/522,176 Official Action dated Aug. 2, 2012. |
U.S. Appl. No. 12/758,047 Official Action dated Oct. 25, 2012. |
Richardson, W. H., “Bayesian-Based Iterative Method of Image Restoration”, Journal of the Optical Society of America, vol. 62, No. 1, pp. 55-59, Jan. 1972. |
Omnivision Technologies Inc., “OV2710 1080p/720p HD Color CMOS Image Sensor with OmniPixel3-HS Technology”, Dec. 2011. |
U.S. Appl. No. 12/844,864 Official Action dated Dec. 6, 2012. |
U.S. Appl. No. 12/282,517 Official Action dated Jun. 12, 2012. |
U.S. Appl. No. 12/522,172 Official Action dated Jun. 29, 2012. |
U.S. Appl. No. 12/703,794 Official Action dated Aug. 7, 2012. |
JP Patent Application # 2008558984 Office Action dated Jul. 3, 2012. |
Japanese Patent Application # 2011-517308 Official Action dated Dec. 5, 2012. |
U.S. Appl. No. 14/231,764 Office Action dated Dec. 12, 2014. |
International Application # PCT/IB2014/062245 Search Report dated Dec. 3, 2014. |
Hart, D., U.S. Appl. No. 09/616,606 "Method and System for High Resolution, Ultra Fast 3-D Imaging", filed Jul. 14, 2000. |
International Application PCT/IL2007/000306 Search Report dated Oct. 2, 2008. |
International Application PCT/IL2007/000262 Search Report dated Oct. 16, 2008. |
International Application PCT/IL2008/000458 Search Report dated Oct. 28, 2008. |
International Application PCT/IL2008/000327 Search Report dated Sep. 26, 2008. |
International Application PCT/IL2006/000335 Preliminary Report on Patentability dated Apr. 24, 2008. |
Sazbon et al., “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding”, Pattern Recognition Letters 26, pp. 1772-1781, year 2005. |
Sjodahl et al., "Measurement of shape by using projected random patterns and temporal digital speckle photography", Applied Optics, vol. 38, No. 10, Apr. 1, 1999. |
Garcia et al., “Three dimensional mapping and range measurement by means of projected speckle patterns”, Applied Optics, vol. 47, No. 16, Jun. 1, 2008. |
Chen et al., “Measuring of a Three-Dimensional Surface by Use of a Spatial Distance Computation”, Applied Optics, vol. 42, issue 11, pp. 1958-1972, Apr. 10, 2003. |
Ypsilos et al., “Speech-driven Face Synthesis from 3D Video”, 2nd International Symposium on 3D Processing, Visualization and Transmission, Thessaloniki, Greece, Sep. 6-9, 2004. |
Hanson et al., “Optics and Fluid Dynamics Department”, Annual Progress Report for 1997 (an abstract). |
Ypsilos et al., “Video-rate capture of Dynamic Face Shape and Appearance”, Sixth IEEE International Conference on Automatic Face and Gesture Recognition (FGR 2004), Seoul, Korea, May 17-19, 2004. |
Goodman, J.W., “Statistical Properties of Laser Speckle Patterns”, Laser Speckle and Related Phenomena, pp. 9-75, Springer-Verlag, Berlin Heidelberg, 1975. |
Dainty, J.C., “Introduction”, Laser Speckle and Related Phenomena, pp. 1-7, Springer-Verlag, Berlin Heidelberg, 1975. |
Avidan et al., "Trajectory triangulation: 3D reconstruction of moving points from a monocular image sequence", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 4, Apr. 2000. |
Leclerc et al., “The direct computation of height from shading”, Proceedings of Computer Vision and Pattern Recognition, pp. 552-558, year 1991. |
Zhang et al., “Height recovery from intensity gradients”, Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 508-513, year 1994. |
Zigelman et al., “Texture mapping using surface flattening via multi-dimensional scaling”, IEEE Transactions on Visualization and Computer Graphics, 8 (2), pp. 198-207, year 2002. |
Kimmel et al., "Analyzing and synthesizing images by evolving curves with the Osher-Sethian method", International Journal of Computer Vision, 24(1), pp. 37-56, year 1997. |
Koninckx et al., “Efficient, Active 3D Acquisition, based on a Pattern-Specific Snake”, Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 557-565, Springer 2002. |
Horn, B., “Height and gradient from shading”, International Journal of Computer Vision, No. 5, pp. 37-76, year 1990. |
Bruckstein, A., “On shape from shading”, Computer Vision, Graphics, and Image Processing, vol. 44, pp. 139-154, year 1988. |
Zhang et al., “Rapid Shape Acquisition Using Color Structured Light and Multi-Pass Dynamic Programming”, 1st International Symposium on 3D Data Processing Visualization and Transmission (3DPVT), Padova, Italy, Jul. 2002. |
Besl, P., “Active Optical Range Imaging Sensors”, Machine Vision and Applications, No. 1, pp. 127-152, USA 1988. |
Horn et al., “Toward optimal structured light patterns”, Proceedings of International Conference on Recent Advances in 3D Digital Imaging and Modeling, pp. 28-37, Ottawa, Canada, May 1997. |
Mendlovic, et al., “Composite harmonic filters for scale, projection and shift invariant pattern recognition”, Applied Optics, vol. 34, No. 2, pp. 310-316, Jan. 10, 1995. |
Asada et al., “Determining Surface Orientation by Projecting a Stripe Pattern”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 5, year 1988. |
Winkelbach et al., "Shape from Single Stripe Pattern Illumination", Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 240-247, Springer 2002. |
EZconn Czech A.S., “Site Presentation”, Oct. 2009. |
Zhu et al., “Fusion of Time-of-Flight Depth and Stereo for High Accuracy Depth Maps”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, USA, Jun. 24-26, 2008. |
Luxtera Inc., “Luxtera Announces World's First 10GBit CMOS Photonics Platform”, Carlsbad, USA, Mar. 28, 2005 (press release). |
Lee et al., “Variable Pulse Mode Driving IR Source Based 3D Robotic Camera”, MVA2005 IAPR Conference on Machine Vision Applications, pp. 530-533, Japan, May 16-18, 2005. |
Mordohai et al., “Tensor Voting: A Perceptual Organization Approach to Computer Vision and Machine Learning”, Synthesis Lectures on Image, Video and Multimedia Processing, issue No. 8, Publishers Morgan and Claypool, year 2006. |
Beraldin et al., “Active 3D Sensing”, Scuola Normale Superiore Pisa, vol. 10, pp. 22-46, Apr. 2000. |
Bhat et al., “Ordinal Measures for Image Correspondence”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, No. 4, pp. 415-423, Apr. 1998. |
Bradley et al., “Synchronization and Rolling Shutter Compensation for Consumer Video Camera Arrays”, IEEE International Workshop on Projector-Camera Systems—PROCAMS 2009 (Miami Beach, Florida, 2009). |
De Piero et al., “3D Computer Vision Using Structured Light: Design Calibration and Implementation Issues”, Advances in Computers, vol. 43, pp. 243-278, Academic Press 1996. |
Hongjun et al., “Shape Measurement by Digital Speckle Temporal Sequence Correlation Method”, Acta Optica Sinica Journal, vol. 21, No. 10, pp. 1208-1213, Oct. 2001 (with English abstract). |
Hongjun, D., “Digital Speckle Temporal Sequence Correlation Method and the Application in Three-Dimensional Shape Measurement”, Chinese Doctoral Dissertations & Master's Theses, Full-text Database (Master) Basic Sciences, No. 1, Mar. 15, 2004. |
Hsueh et al., "Real-time 3D Topography by Speckle Image Correlation", Proceedings of SPIE Conference on Input/Output and Imaging Technologies, vol. 3422, pp. 108-112, Taiwan, Jul. 1998. |
Chinese Patent Application # 200780009053.8 Official Action dated Apr. 15, 2010 (English translation). |
Chinese Patent Application # 200680038004.2 Official Action dated Mar. 30, 2010 (English translation). |
Chinese Patent Application # 200680038004.2 Official Action dated Aug. 3, 2011 (English translation). |
Engfield, N., “Use of Pseudorandom Encoded Grid in U.S. Appl. No. 11/899,542”, Andrews Robichaud, Jun. 22, 2011. |
International Application PCT/IB2013/051985 Search Report dated Jul. 22, 2013. |
Japanese Patent Application # 2008558981 Official Action dated Nov. 2, 2011. |
U.S. Appl. No. 12/522,171 Official Action dated Dec. 22, 2011. |
U.S. Appl. No. 12/522,172 Official Action dated Nov. 30, 2011. |
Japanese Patent Application # 2008558984 Official Action dated Nov. 1, 2011. |
U.S. Appl. No. 13/043,488 Official Action dated Jan. 3, 2012. |
Japanese Patent Application # 2008535179 Official Action dated Nov. 8, 2011. |
Chinese Patent Application # 200680038004.2 Official Action dated Nov. 24, 2011. |
Marcia et al., “Superimposed Video Disambiguation for Increased Field of View”, Optics Express 16:21, pp. 16352-16363, year 2008. |
Guan et al., "Composite Structured Light Pattern for Three Dimensional Video", Optics Express 11:5, pp. 406-417, year 2003. |
U.S. Appl. No. 13/856,444 Office Action dated Nov. 12, 2013. |
Korean Patent Application # 10-2008-7025030 Office Action dated Feb. 25, 2013. |
U.S. Appl. No. 12/707,678 Office Action dated Feb. 26, 2013. |
U.S. Appl. No. 12/758,047 Office Action dated Apr. 25, 2013. |
U.S. Appl. No. 12/844,864 Office Action dated Apr. 11, 2013. |
U.S. Appl. No. 13/036,023 Office Action dated Sep. 3, 2013. |
Japanese Patent Application # 2011517308 Office Action dated Jun. 19, 2013. |
U.S. Appl. No. 13/036,023 Office Action dated Jul. 17, 2013. |
U.S. Appl. No. 12/707,678 Office Action dated Jun. 20, 2013. |
International Application PCT/IB2013/051189 Search Report dated Jun. 18, 2013. |
Number | Date | Country | |
---|---|---|---|
20130206967 A1 | Aug 2013 | US |
Number | Date | Country | |
---|---|---|---|
61598921 | Feb 2012 | US |