The present invention relates generally to methods and systems for mapping of three-dimensional (3D) objects, and specifically to optical 3D mapping.
Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object.
Some methods are based on projecting a laser speckle pattern onto the object, and then analyzing an image of the pattern on the object. For example, PCT International Publication WO 2007/043036, whose disclosure is incorporated herein by reference, describes a system and method for object reconstruction, in which a coherent light source and a generator of a random speckle pattern project onto the object a coherent random speckle pattern. An imaging unit detects the light response of the illuminated region and generates image data. Shifts of the pattern in the image of the object relative to a reference image of the pattern are used in real-time reconstruction of a 3D map of the object.
Other methods of optical 3D mapping project different sorts of patterns onto the object to be mapped. For example, PCT International Publication WO 93/03579 describes a three-dimensional vision system in which one or two projectors establish structured light comprising two sets of parallel stripes having different periodicities and angles. As another example, U.S. Pat. No. 6,751,344 describes a method for optically scanning a subject in which the subject is illuminated with a matrix of discrete two-dimensional image objects, such as a grid of dots. Other methods involve projection of a grating pattern, as described, for example, in U.S. Pat. No. 4,802,759. The disclosures of the above-mentioned patents and publications are incorporated herein by reference.
In embodiments of the present invention, a pattern of spots is projected onto an object, and an image of the pattern on the object is processed in order to reconstruct a 3D map of the object. The pattern on the object is created by projecting optical radiation through a transparency containing the pattern. The embodiments disclosed herein differ in this respect from methods of 3D reconstruction that use laser speckle, in which the pattern is created by optical interference using a diffuser. At the same time, the novel patterns that are used in these embodiments make it possible to perform 3D reconstruction quickly and accurately, using a single, stationary transparency to project the pattern, and a single, stationary image capture assembly to capture images of the object.
There is therefore provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including:
an illumination assembly, including:
a single transparency containing a fixed pattern of spots; and
a light source, which is configured to transilluminate the single transparency with optical radiation so as to project the pattern onto the object;
an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the single transparency; and
a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
In a disclosed embodiment, the pattern is uncorrelated over a range of depths that is mapped by the apparatus.
In some embodiments, the image capture assembly is arranged to capture images of the pattern on the object from a single, fixed location and angle relative to the illumination assembly. Typically, the transparency and light source are fixed in respective positions in the illumination assembly, and the processor is configured to reconstruct the 3D map using the images that are captured only from the single, fixed location and angle with the transparency and light source only in the respective positions.
In one embodiment, the light source includes a point source of the optical radiation. Alternatively, the light source may include a light-emitting diode (LED).
In a disclosed embodiment, the processor is arranged to process a succession of images captured while the object is moving so as to map a 3D movement of the object, wherein the object is a part of a human body, and the 3D movement includes a gesture made by the part of the human body, and wherein the processor is coupled to provide an input to a computer application responsively to the gesture.
There is also provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including:
an illumination assembly, including:
a transparency containing an uncorrelated pattern of spots; and
a light source, which is configured to transilluminate the transparency with optical radiation so as to project the uncorrelated pattern onto the object;
an image capture assembly, which is configured to capture an image of the uncorrelated pattern that is projected onto the object; and
a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
In one embodiment, the uncorrelated pattern includes a pseudo-random pattern. In another embodiment, the uncorrelated pattern includes a quasi-periodic pattern, wherein the quasi-periodic pattern has an n-fold symmetry, with n=5 or n≧7.
Typically, the uncorrelated pattern has a duty cycle that is less than 1/e. Alternatively or additionally, the spots have a local duty cycle that varies across the pattern.
In an alternative embodiment, the transparency contains a plurality of parallel bands, repeating periodically in a first direction, each band containing a replica of the uncorrelated pattern extending across at least a part of the transparency in a second direction, perpendicular to the first direction.
In some embodiments, the processor is configured to derive the 3D map by finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly. In one embodiment, the spots have a local duty cycle that varies monotonically along an axis across the pattern, and the processor is configured to determine local gray levels of the multiple areas in the image responsively to the local duty cycle, and to estimate the respective offsets based on the local gray levels.
In an alternative embodiment, the spots in the transparency comprise micro-lenses arranged in the fixed or uncorrelated pattern.
There is furthermore provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including:
an illumination assembly, including:
a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern; and
a light source, which is configured to transilluminate the transparency with optical radiation so as to project the non-uniform pattern onto the object;
an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the transparency; and
a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
Typically, the micro-lenses are configured to focus the optical radiation to form respective focal spots at a focal plane in the non-uniform pattern, and the light source includes optics for projecting the non-uniform pattern of the focal spots from the focal plane onto the object. Alternatively, at least some of the micro-lenses have differing focal lengths, and the light source includes optics for projecting the non-uniform pattern of the focal spots so that the pattern that is projected on the object varies with distance from the illumination assembly.
There is additionally provided, in accordance with an embodiment of the present invention, a method for mapping an object, including:
transilluminating a single transparency containing a fixed pattern of spots so as to project the pattern onto the object;
capturing an image of the pattern that is projected onto the object using the single transparency; and
processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
There is further provided, in accordance with an embodiment of the present invention, a method for mapping an object, including:
transilluminating a transparency containing an uncorrelated pattern of spots so as to project the uncorrelated pattern onto the object;
capturing an image of the uncorrelated pattern that is projected onto the object; and
processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
There is moreover provided, in accordance with an embodiment of the present invention, a method for mapping an object, including:
transilluminating a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern so as to project the non-uniform pattern onto the object;
capturing an image of the non-uniform pattern that is projected onto the object; and
processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings.
In some embodiments, device 22 projects an uncorrelated pattern of spots onto object 28. In the context of the present patent application and in the claims, the term “uncorrelated pattern” refers to a projected pattern of spots (which may be bright or dark), whose positions are uncorrelated in planes transverse to the projection beam axis. The positions are uncorrelated in the sense that the auto-correlation of the pattern as a function of transverse shift is insignificant for any shift larger than the spot size and no greater than the maximum shift that may occur over the range of depths mapped by the system. Random patterns, such as a laser speckle pattern, are uncorrelated in this sense. Synthetic patterns, created by human or computer design, such as pseudo-random and quasi-periodic patterns, may also be uncorrelated to the extent specified by the above definition.
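By way of illustration (not part of the original disclosure), this criterion can be checked numerically: compute the pattern's normalized autocorrelation and verify that it is small at every transverse shift between the spot size and the maximum depth-induced shift. The sketch below assumes NumPy and uses circular FFT-based autocorrelation as an approximation; the function name and threshold are hypothetical choices.

```python
import numpy as np

def is_uncorrelated(pattern, spot_size, max_shift, threshold=0.1):
    """Check that the normalized autocorrelation of a spot pattern is
    insignificant for every transverse shift larger than the spot size
    and no greater than the maximum shift over the mapped depth range."""
    p = pattern.astype(float)
    p -= p.mean()                                   # remove DC component
    f = np.fft.fft2(p)
    acorr = np.fft.fftshift(np.fft.ifft2(f * np.conj(f)).real)
    acorr /= acorr.max()                            # zero-lag peak -> 1.0
    h, w = p.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h // 2, xx - w // 2)          # shift magnitude (pixels)
    ring = (r > spot_size) & (r <= max_shift)
    return float(np.abs(acorr[ring]).max()) < threshold
```

A random binary pattern (such as the pseudo-random generator shown further below) passes this test; a strictly periodic grating does not.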
An image processor 24 processes image data generated by device 22 in order to reconstruct a 3D map of object 28. The term “3D map” refers to a set of 3D coordinates representing the surface of the object. The derivation of such a map based on image data is referred to herein as “3D mapping” or equivalently, “3D reconstruction.” Image processor 24 computes the 3D coordinates of points on the surface of object 28 by triangulation, based on the transverse shifts of the spots in an image of the pattern that is projected onto the object relative to a reference pattern at a known distance from device 22. Methods for this sort of triangulation-based 3D mapping using a projected laser speckle pattern are described in the above-mentioned PCT publication WO 2007/043036 and in PCT Patent Application PCT/IL2007/000306, filed Mar. 8, 2007, and published as WO 2007/105205, which is assigned to the assignee of the present patent application, and whose disclosure is incorporated herein by reference. These methods may be implemented, mutatis mutandis, using synthetic uncorrelated patterns in system 20.
Image processor 24 may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow. The software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media. Alternatively or additionally, some or all of the functions of the image processor may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although processor 24 is shown, by way of example, as a separate unit from device 22, some or all of the processing functions of the processor may alternatively be performed by suitable dedicated circuitry within or associated with device 22.
The 3D map that is generated by processor 24 may be used for a wide range of different purposes. For example, the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object. When object 28 is a part of a human body, for instance, a succession of such maps may be used to detect gestures and provide input to a computer application, as noted above.
Transparency 36 may contain various sorts of fixed, uncorrelated patterns of spots. For example, the transparency may contain a pattern of binary (white/black) spots, distributed over the area of the transparency according to the values of a pseudo-random distribution function. Other examples of uncorrelated spot patterns are described hereinbelow.
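A minimal sketch of such a pseudo-random binary pattern, assuming NumPy; the dimensions, seed, and the 0.25 duty cycle (chosen below 1/e, per the summary above) are illustrative values only:

```python
import numpy as np

def make_pseudorandom_pattern(height, width, duty_cycle=0.25, seed=0):
    """Binary (bright/dark) spot pattern: each cell is bright
    independently with probability equal to the duty cycle."""
    rng = np.random.default_rng(seed)
    return (rng.random((height, width)) < duty_cycle).astype(np.uint8)

pattern = make_pseudorandom_pattern(480, 640)       # mean(pattern) ~ 0.25
```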
Light source 34 transilluminates transparency 36 with optical radiation so as to project an image of the spot pattern that is contained by the transparency onto object 28. (The terms “light” and “optical radiation” in the context of the present patent application refer to any band of optical radiation, including infrared and ultraviolet, as well as visible light. In some applications, however, near-infrared light is preferred on account of the availability of suitable, low-cost sources and detectors and the fact that the spot pattern is thus invisible to human viewers.) In this configuration, light source 34 and transparency 36 are fixed in respective positions within an illumination assembly 30.
An image capture assembly 32 captures an image of the pattern that is projected by illumination assembly 30 onto object 28. Assembly 32 comprises objective optics 40, which focus the image onto an image sensor 42. Typically, sensor 42 comprises a rectilinear array of detector elements 44, such as a CCD or CMOS-based image sensor array. Assembly 32 may also comprise a bandpass filter (not shown in the figures), chosen and positioned so that sensor 42 receives only light in the emission band of light source 34, while filtering out ambient light that might otherwise reduce the contrast of the image of the projected pattern that is captured by the sensor.
In this embodiment, illumination assembly 30 and image capture assembly 32 are held in a fixed spatial relation. To simplify the computation of the 3D map and of changes in the map due to motion of object 28 in this configuration, assemblies 30 and 32 may be mounted so that an axis passing through the centers of the entrance pupil of image capture assembly 32 and the spot formed by light source 34 on transparency 36 is parallel to one of the axes of sensor 42 (taken, for convenience, as the X-axis).
Specifically, by triangulation in this arrangement, a Z-direction shift of a point on the object, δZ, will engender a concomitant transverse shift δX in the spot pattern observed in the image. Z-coordinates of points on the object, as well as shifts in the Z-coordinates over time, may thus be determined by measuring shifts in the X-coordinates of the spots in the image captured by assembly 32 relative to a reference image taken at a known distance Z. Y-direction shifts may be disregarded. This sort of triangulation approach is appropriate particularly in 3D mapping using uncorrelated patterns of spots, although aspects of the approach may be adapted for use with other types of patterns, as well.
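In standard triangulation terms (a generic pinhole-camera relation, given here for orientation rather than quoted from the disclosure), with baseline $b$ between the illumination and image capture assemblies, effective focal length $f$, and the baseline parallel to the X-axis, the image-plane position of a spot at depth $Z$ shifts as

$$X_{\mathrm{shift}}(Z)=\frac{f\,b}{Z},\qquad \delta X \approx -\frac{f\,b}{Z^{2}}\,\delta Z,$$

so the measured shift $\delta X$ relative to a reference captured at known $Z$ determines $\delta Z$.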
Thus, to generate the 3D map of object 28, image processor 24 matches each area of the captured image against the reference image of the pattern, and the X-direction offset of the best-matching area yields, by triangulation, the Z-coordinate of the corresponding area of the object surface.
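A sketch of one such matching step, assuming NumPy; the window size, search range, and normalized cross-correlation score are illustrative choices, and Y-direction shifts are disregarded as described above:

```python
import numpy as np

def x_offset(img, ref, y, x, win=15, max_shift=40):
    """Estimate the X-direction shift of the projected pattern at
    window (y, x) by searching the reference image along X only."""
    w = img[y:y+win, x:x+win].astype(float)
    w = (w - w.mean()) / (w.std() + 1e-9)
    best_score, best_dx = -np.inf, 0
    for dx in range(-max_shift, max_shift + 1):
        x0 = x + dx
        if x0 < 0 or x0 + win > ref.shape[1]:
            continue                                 # outside the reference
        r = ref[y:y+win, x0:x0+win].astype(float)
        r = (r - r.mean()) / (r.std() + 1e-9)
        score = (w * r).mean()                       # normalized correlation
        if score > best_score:
            best_score, best_dx = score, dx
    return best_dx      # convert to Z by the triangulation relation above
```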
Transparency 36 may contain a quasi-periodic pattern of spots having n-fold symmetry. For n=5 or n≧7, such patterns are uncorrelated in the sense defined above. Alternatively, transparency 36 may contain uncorrelated quasi-periodic patterns of other types.
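One well-known way to synthesize such a pattern (an illustrative construction, not necessarily the one used in the disclosure) is to threshold a superposition of n plane waves with wavevectors spaced 2π/n apart; for n=5 or n≧7 the symmetry is non-crystallographic, so the pattern never repeats:

```python
import numpy as np

def quasiperiodic_pattern(size, n=5, freq=0.08, duty=0.25):
    """Quasi-periodic binary spot pattern with an n-fold symmetric
    spectrum, obtained by thresholding a sum of n plane waves."""
    y, x = np.mgrid[0:size, 0:size].astype(float)
    field = np.zeros((size, size))
    for k in range(n):
        a = 2 * np.pi * k / n                       # wavevector direction
        field += np.cos(2 * np.pi * freq * (x * np.cos(a) + y * np.sin(a)))
    cut = np.quantile(field, 1.0 - duty)            # keep brightest fraction
    return (field > cut).astype(np.uint8)
```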
The use of quasi-periodic patterns in system 20 is advantageous in that the pattern has a known spatial frequency spectrum, with distinct peaks (as opposed to random and pseudo-random patterns, whose spectrum is flat). Processor 24 may use this spectral information in filtering digital images of the pattern that are captured by image capture assembly 32, and may thus reduce the effects of noise and ambient light in the image correlation computation. On the other hand, because the pattern is uncorrelated over the range of depths mapped by the system, the likelihood of erroneous mapping results is reduced, since only a correct match between an area of the image of the object and a corresponding area of the reference image will give a high correlation value.
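Such filtering might be sketched as follows, assuming NumPy; the list of spectral peaks and the bandwidth are hypothetical calibration inputs. The idea is to keep only the energy near the pattern's known spatial-frequency peaks before running the correlation:

```python
import numpy as np

def spectral_filter(img, peak_freqs, bandwidth=0.01):
    """Suppress broadband noise and ambient light by retaining only
    frequencies near the known peaks of the quasi-periodic pattern.
    peak_freqs: list of (fy, fx) in cycles/pixel; peaks come in +/- pairs."""
    f = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    mask = np.zeros((h, w), dtype=bool)
    for (py, px) in peak_freqs:
        for s in (1, -1):
            mask |= np.hypot(fy - s * py, fx - s * px) < bandwidth
    return np.fft.ifft2(np.fft.ifftshift(f * mask)).real
```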
In another embodiment, transparency 36 contains a pattern 70 of spots whose local duty cycle varies monotonically along an axis across the pattern. When transparency 36 contains pattern 70, the gray level of the pattern projected onto object 28, when observed at low resolution, will likewise vary across the image of the object. Therefore, in an initial processing phase, processor 24 may process the image at low resolution in order to determine the gray level of each area in the image of the object. The processor may then compare this gray level to the distribution of gray levels across the reference image in order to make a rough estimate of the depth (Z-coordinate) of each area of the object. For some applications, this rough estimate may be sufficient.
Alternatively, the processor may use this initial estimate in choosing, for each area of the image of the object, the appropriate area of the reference image in which to search for a matching part of the spot pattern. By matching the spot pattern, the processor computes more accurate depth values. This two-step processing approach can be advantageous in avoiding erroneous mapping results and possibly in reducing the overall computation time.
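The two-step flow might look like the following sketch (assuming NumPy, a reference whose gray level ramps monotonically along X, and a block size chosen for illustration): block-average both images, take the closest reference gray level in the same row as a rough X-offset, then hand that estimate to a narrowed fine search such as x_offset above:

```python
import numpy as np

def coarse_offsets(img, ref, block=32):
    """Rough per-block X-offsets from local gray level (duty cycle):
    each image block is matched to the reference block in the same
    row whose mean gray level is closest."""
    def block_means(a):
        h, w = a.shape
        return a[:h - h % block, :w - w % block].astype(float).reshape(
            h // block, block, w // block, block).mean(axis=(1, 3))
    g_img, g_ref = block_means(img), block_means(ref)
    idx = np.abs(g_img[:, :, None] - g_ref[:, None, :]).argmin(axis=2)
    cols = np.arange(g_img.shape[1])[None, :]
    return (idx - cols) * block                     # offsets in pixels
```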
In a further embodiment, transparency 36 contains a pattern 80 of parallel bands 82, repeating periodically in the Y-direction, each band containing a replica of an uncorrelated spot pattern extending in the X-direction. Because bands 82 repeat periodically in the Y-direction, processor 24 may use the image of a single band 82 as a reference image in determining the X-direction shift of an area in the image of the object, regardless of the Y-coordinates of the area. Therefore the memory required to store the reference image is reduced. The complexity of the computation may be reduced, as well, since the range of the search for a matching area in the reference image is limited. Bands 82 may alternatively comprise other types of patterns that are uncorrelated in the X-direction, such as the types of patterns described above.
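A sketch of matching against a single stored band (assuming NumPy and that ref_band holds one Y-period of the pattern plus win extra rows of margin; all sizes are illustrative):

```python
import numpy as np

def x_offset_banded(img, ref_band, y, x, win=15, max_shift=40):
    """As x_offset above, but the reference is one band: the pattern
    repeats every `period` rows, so the image row is wrapped."""
    period = ref_band.shape[0] - win                # usable period of the band
    yb = y % period                                 # wrapped reference row
    w = img[y:y+win, x:x+win].astype(float)
    w = (w - w.mean()) / (w.std() + 1e-9)
    best_score, best_dx = -np.inf, 0
    for dx in range(-max_shift, max_shift + 1):
        x0 = x + dx
        if x0 < 0 or x0 + win > ref_band.shape[1]:
            continue
        r = ref_band[yb:yb+win, x0:x0+win].astype(float)
        r = (r - r.mean()) / (r.std() + 1e-9)
        score = (w * r).mean()
        if score > best_score:
            best_score, best_dx = score, dx
    return best_dx
```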
In an alternative embodiment, the spots on a transparency 90 comprise micro-lenses 92, which are distributed over a transparent substrate 94 in a non-uniform, uncorrelated pattern, such as a random or pseudo-random pattern. The duty cycle of the pattern is given by the density of the micro-lenses per unit area and the optical properties of the micro-lenses and other projection optics (which define the focal spot size). The duty cycle is typically (although not necessarily) less than 1/e, as explained above. Micro-lenses 92 may be formed on substrate 94 using a photolithographic process, for example, as is used to produce the uniform micro-lens grid arrays that are known in the art. Such processes are capable of fabricating micro-lenses with diameters on the order of 0.1 mm and focal lengths of 5-6 mm. Alternatively, micro-lenses 92 may have larger or smaller dimensions and focal lengths, depending on the process and application requirements.
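As a rough illustration of the duty-cycle relation (all numbers are assumptions for the sake of the example, not values from the disclosure):

```python
import math

def microlens_duty_cycle(lenses_per_mm2, focal_spot_diameter_mm):
    """Fraction of the focal plane covered by focal spots:
    lens density times the area of a single focal spot."""
    spot_area = math.pi * (focal_spot_diameter_mm / 2) ** 2
    return lenses_per_mm2 * spot_area

# e.g., 25 lenses/mm^2 forming 0.05 mm focal spots:
print(microlens_duty_cycle(25, 0.05))   # ~0.049, well below 1/e ~ 0.37
```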
When light source 34 transilluminates transparency 90, micro-lenses 92 focus the optical radiation to form respective focal spots at a focal plane, in the same non-uniform pattern as the micro-lenses themselves. Projection optics then image the pattern of focal spots from the focal plane onto object 28.
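In thin-lens terms (background optics, not quoted from the disclosure; $d$ and $D$ denote assumed distances from the projection optics to the focal plane and to the object, with the optics imaging one plane onto the other), each focal spot of diameter $a$ is projected with magnification

$$M=\frac{D}{d},$$

so it appears on the object with diameter approximately $aD/d$.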
As a further alternative, the micro-lenses may have non-uniform focal lengths. For example, different micro-lenses may have different focal lengths, so that the pattern that is projected on the object varies with distance from the illumination assembly. As another example, some or all of the micro-lenses may have multiple different focal lengths.
The patterns and transparencies described above are presented by way of example, and transparencies containing other sorts of fixed, uncorrelated patterns of spots may similarly be used in system 20.
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application claims the benefit of U.S. Provisional Patent Application 61/016,832, filed Dec. 27, 2007. This application is also a continuation-in-part of U.S. patent application Ser. No. 11/899,542, filed Sep. 6, 2007, which claims the benefit of U.S. Provisional Patent Application 60/909,487, filed Apr. 2, 2007. All of these related applications are incorporated herein by reference.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/IL2008/000458 | 4/2/2008 | WO | 00 | 12/10/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2008/120217 | 10/9/2008 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4336978 | Suzuki | Jun 1982 | A |
4542376 | Bass et al. | Sep 1985 | A |
4802759 | Matsumoto | Feb 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
5075562 | Greivenkamp et al. | Dec 1991 | A |
5483261 | Yasutake | Jan 1996 | A |
5630043 | Uhlin | May 1997 | A |
5636025 | Bieman et al. | Jun 1997 | A |
5835218 | Harding | Nov 1998 | A |
5838428 | Pipitone et al. | Nov 1998 | A |
5856871 | Cabib et al. | Jan 1999 | A |
5909312 | Mendlovic et al. | Jun 1999 | A |
6041140 | Binns et al. | Mar 2000 | A |
6081269 | Quarendon | Jun 2000 | A |
6084712 | Harding | Jul 2000 | A |
6088105 | Link | Jul 2000 | A |
6099134 | Taniguchi et al. | Aug 2000 | A |
6100517 | Yahav et al. | Aug 2000 | A |
6101269 | Hunter et al. | Aug 2000 | A |
6108036 | Harada et al. | Aug 2000 | A |
6167151 | Albeck | Dec 2000 | A |
6259561 | George et al. | Jul 2001 | B1 |
6262740 | Lauer et al. | Jul 2001 | B1 |
6268923 | Michniewicz et al. | Jul 2001 | B1 |
6301059 | Huang et al. | Oct 2001 | B1 |
6438263 | Albeck et al. | Aug 2002 | B2 |
6494837 | Kim et al. | Dec 2002 | B2 |
6495848 | Rubbert | Dec 2002 | B1 |
6686921 | Rushmeier et al. | Feb 2004 | B1 |
6700669 | Geng | Mar 2004 | B1 |
6731391 | Kao et al. | May 2004 | B1 |
6741251 | Malzbender | May 2004 | B2 |
6751344 | Grumbine | Jun 2004 | B1 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6803777 | Pfaff et al. | Oct 2004 | B2 |
6810135 | Berenz et al. | Oct 2004 | B1 |
6813440 | Yu et al. | Nov 2004 | B1 |
6825985 | Brown et al. | Nov 2004 | B2 |
6841780 | Cofer et al. | Jan 2005 | B2 |
6859326 | Sales | Feb 2005 | B2 |
6937348 | Geng | Aug 2005 | B2 |
7006952 | Matsumoto et al. | Feb 2006 | B1 |
7009742 | Brotherton-Ratcliffe et al. | Mar 2006 | B2 |
7013040 | Shiratani | Mar 2006 | B2 |
7076024 | Yokhin | Jul 2006 | B2 |
7112774 | Baer | Sep 2006 | B2 |
7120228 | Yokhin et al. | Oct 2006 | B2 |
7127101 | Littlefield et al. | Oct 2006 | B2 |
7194105 | Hersch et al. | Mar 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7335898 | Donders et al. | Feb 2008 | B2 |
7369685 | Delean | May 2008 | B2 |
7385708 | Ackerman et al. | Jun 2008 | B2 |
7433024 | Garcia et al. | Oct 2008 | B2 |
7551719 | Yokhin et al. | Jun 2009 | B2 |
7560679 | Gutierrez | Jul 2009 | B1 |
7659995 | Knighton et al. | Feb 2010 | B2 |
7751063 | Dillon et al. | Jul 2010 | B2 |
7840031 | Albertson et al. | Nov 2010 | B2 |
8126261 | Medioni et al. | Feb 2012 | B2 |
8326025 | Boughorbel | Dec 2012 | B2 |
20010016063 | Albeck et al. | Aug 2001 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020075456 | Shiratani | Jun 2002 | A1 |
20030048237 | Sato et al. | Mar 2003 | A1 |
20030057972 | Pfaff et al. | Mar 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20040130730 | Cantin et al. | Jul 2004 | A1 |
20040130790 | Sales | Jul 2004 | A1 |
20040174770 | Rees | Sep 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20040218262 | Chuang et al. | Nov 2004 | A1 |
20040228519 | Littlefield et al. | Nov 2004 | A1 |
20050018209 | Lemelin et al. | Jan 2005 | A1 |
20050052637 | Shaw et al. | Mar 2005 | A1 |
20050200838 | Shaw et al. | Sep 2005 | A1 |
20050200925 | Brotherton-Ratcliffe et al. | Sep 2005 | A1 |
20050231465 | DePue et al. | Oct 2005 | A1 |
20050271279 | Fujimura et al. | Dec 2005 | A1 |
20060017656 | Miyahara | Jan 2006 | A1 |
20060072851 | Kang et al. | Apr 2006 | A1 |
20060156756 | Becke | Jul 2006 | A1 |
20060221218 | Adler et al. | Oct 2006 | A1 |
20060221250 | Rossbach et al. | Oct 2006 | A1 |
20070057946 | Albeck et al. | Mar 2007 | A1 |
20070060336 | Marks et al. | Mar 2007 | A1 |
20070133840 | Cilia | Jun 2007 | A1 |
20070165243 | Kang et al. | Jul 2007 | A1 |
20080018595 | Hildreth et al. | Jan 2008 | A1 |
20080031513 | Hart | Feb 2008 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080118143 | Gordon et al. | May 2008 | A1 |
20080198355 | Domenicali et al. | Aug 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080240502 | Freedman et al. | Oct 2008 | A1 |
20080247670 | Tam et al. | Oct 2008 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090060307 | Ghanem et al. | Mar 2009 | A1 |
20090096783 | Shpunt et al. | Apr 2009 | A1 |
20090183125 | Magal et al. | Jul 2009 | A1 |
20090183152 | Yang et al. | Jul 2009 | A1 |
20090185274 | Shpunt | Jul 2009 | A1 |
20090226079 | Katz et al. | Sep 2009 | A1 |
20090244309 | Maison et al. | Oct 2009 | A1 |
20100007717 | Spektor et al. | Jan 2010 | A1 |
20100013860 | Mandella et al. | Jan 2010 | A1 |
20100020078 | Shpunt | Jan 2010 | A1 |
20100128221 | Muller et al. | May 2010 | A1 |
20100177164 | Zalevsky | Jul 2010 | A1 |
20100182406 | Benitez | Jul 2010 | A1 |
20100194745 | Leister et al. | Aug 2010 | A1 |
20100201811 | Garcia et al. | Aug 2010 | A1 |
20100225746 | Shpunt et al. | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100278384 | Shotton et al. | Nov 2010 | A1 |
20100290698 | Shpunt et al. | Nov 2010 | A1 |
20100303289 | Polzin et al. | Dec 2010 | A1 |
20110001799 | Rothenberger et al. | Jan 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110074932 | Gharib et al. | Mar 2011 | A1 |
20110096182 | Cohen et al. | Apr 2011 | A1 |
20110134114 | Rais et al. | Jun 2011 | A1 |
20110158508 | Shpunt et al. | Jun 2011 | A1 |
20110211044 | Shpunt et al. | Sep 2011 | A1 |
20110285910 | Bamji et al. | Nov 2011 | A1 |
20120051588 | Mceldowney | Mar 2012 | A1 |
Number | Date | Country |
---|---|---|
19736169 | Aug 1997 | DE |
19638727 | Mar 1998 | DE |
2352901 | Feb 2001 | GB |
62206684 | Sep 1987 | JP |
01-240863 | Sep 1989 | JP |
03-029806 | Feb 1991 | JP |
H03-040591 | Feb 1991 | JP |
06-273432 | Sep 1994 | JP |
H08-186845 | Jul 1996 | JP |
H10-327433 | Dec 1998 | JP |
2001141430 | May 2001 | JP |
2002122417 | Apr 2002 | JP |
2002-152776 | May 2002 | JP |
2002-213931 | Jul 2002 | JP |
2002-365023 | Dec 2002 | JP |
2006-128818 | May 2006 | JP |
199303579 | Feb 1993 | WO |
9827514 | Jun 1998 | WO |
9828593 | Jul 1998 | WO |
2005010825 | Feb 2005 | WO |
2007043036 | Apr 2007 | WO |
2007096893 | Aug 2007 | WO |
2007105205 | Sep 2007 | WO |
2007105215 | Sep 2007 | WO |
2008120217 | Oct 2008 | WO |
Entry |
---|
Mor et al., U.S. Appl. No. 12/762,373 “Synchronization of Projected Illumination with Rolling Shutter of Image Sensor” filed Apr. 19, 2010. |
Petronius et al., U.S. Appl. No. 61/300,465 “Integrated Photonics Module for Optical Projection” filed Feb. 2, 2010. |
Sali et al., U.S. Appl. No. 12/758,047 “Three-Dimensional Mapping and Imaging” filed Apr. 12, 2010. |
Garcia et al., U.S. Appl. No. 12/703,794 “Depth Ranging with Moire Patterns” filed Feb. 11, 2010. |
Shpunt et al., U.S. Appl. No. 12/707,678 “Reference Image Techniques for 3D sensing” filed Feb. 18, 2010. |
Luxtera Inc., “Luxtera Announces World's First 10GBit CMOS Photonics Platform”, Carlsbad, USA, Mar. 28, 2005 (press release). |
Lee et al., “Variable Pulse Mode Driving IR Source Based 3D Robotic Camera”, MVA2005 IAPR Conference on Machine Vision Applications, pp. 530-533, Japan, May 16-18, 2005. |
Mordohai et al., “Tensor Voting: A Perceptual Organization Approach to Computer Vision and Machine Learning”, Synthesis Lectures on Image, Video and Multimedia Processing, issue No. 8, Publishers Morgan and Claypool, year 2006. |
Beraldin et al., “Active 3D Sensing”, Scuola Normale Superiore Pisa, vol. 10, pp. 22-46, Apr. 2000. |
Bhat et al., “Ordinal Measures for Image Correspondence”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, No. 4, pp. 415-423, Apr. 1998. |
Bradley et al., “Synchronization and Rolling Shutter Compensation for Consumer Video Camera Arrays”, IEEE International Workshop on Projector-Camera Systems—PROCAMS 2009 (Miami Beach, Florida, 2009). |
De Piero et al., “3D Computer Vision Using Structured Light: Design Calibration and Implementation Issues”, Advances in Computers, vol. 43, pp. 243-278, Academic Press 1996. |
Hongjun et al., “Shape Measurement by Digital Speckle Temporal Sequence Correlation Method”, Acta Optica Sinica Journal, vol. 21, No. 10, pp. 1208-1213, Oct. 2001. |
Hongjun, D., “Digital Speckle Temporal Sequence Correlation Method and the Application in Three-Dimensional Shape Measurement”, Chinese Doctoral Dissertations & Master's Theses, Full-text Database (Master) Basic Sciences, No. 1, Mar. 15, 2004. |
Hsueh et al., “Real-time 3D Topography by Speckle Image Correlation”, Proceedings of SPIE Conference on Input/Output and Imaging Technologies, vol. 3422, pp. 108-112, Taiwan, Jul. 1998. |
Chinese Patent Application # 200780009053.8 Official Action dated Apr. 15, 2010 (with English translation). |
Chinese Patent Application # 200680038004.2 Official Action dated Mar. 30, 2010 (with English translation). |
Zhu et al., “Fusion of Time-of-Flight Depth and Stereo for High Accuracy Depth Maps”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, USA, Jun. 24-26, 2008. |
EZconn Czech A.S., “Site Presentation”, Oct. 2009. |
Zhang et al., “Shape from intensity gradient”, IEEE Transactions on Systems, Man and Cybernetics—Part A: Systems and Humans, vol. 29, No. 3, pp. 318-325, May 1999. |
Ben Eliezer et al., “Experimental Realization of an Imaging System with an Extended Depth of Field”, Applied Optics Journal, vol. 44, No. 14, pp. 2792-2798, May 10, 2005. |
Lavoie et al., “3-D Object Model Recovery From 2-D Images Using Structured Light”, IEEE Transactions on Instrumentation and Measurement, vol. 53, No. 2, pp. 437-443, Apr. 2004. |
Chinese Application # 200780016625.5 Office Action dated May 12, 2011. |
U.S. Appl. No. 11/899,542 Office Action dated Apr. 4, 2011. |
U.S. Appl. No. 11/724,068 Office Action dated Mar. 1, 2011. |
Chinese Application # 200780009053.8 Office Action dated Mar. 10, 2011. |
Japanese Application # 2008535179 Office Action dated Apr. 1, 2011. |
Kun et al., “Gaussian Laser Beam Spatial Distribution Measurement by Speckles Displacement Method”, High Power Laser and Particle Beams, vol. 12, No. 2, Apr. 2000. |
Chinese Patent Application # 200680038004.2 Official Action dated Dec. 24, 2010. |
Engfield, N., “Use of Pseudorandom Encoded Grid in U.S. Appl. No. 11/899,542”, Andrews Robichaud, Jun. 22, 2011. |
U.S. Appl. No. 61/471,215, filed Apr. 4, 2011. |
Chinese Patent Application # 200680038004.2 Official Action dated Aug. 3, 2011 (English translation). |
International Application PCT/IB2011/053560 filed on Aug. 10, 2011. |
Abramson, N., “Holographic Contouring by Translation”, Applied Optics Journal, vol. 15, No. 4, pp. 1018-1976, Apr. 1976. |
Achan et al., “Phase Unwrapping by Minimizing Kikuchi Free Energy”, IEEE International Geoscience and Remote Sensing Symposium, pp. 1738-1740, Toronto, Canada, Jun. 2002. |
Garcia et al., U.S. Appl. No. 61/151,853 “Depth Ranging with Moire Patterns” filed Feb. 12, 2009. |
Theocaris et al., “Radial Gratings as Moire Gauges”, Journal of Scientific Instruments (Journal of Physics E), series 2, vol. 1, year 1968. |
Hovanesian et al., “Moire Contour-Sum Contour-Difference, and Vibration Analysis of Arbitrary Objects”, Applied Optics Journal, vol. 10, No. 12, pp. 2734-2738, Dec. 1971. |
Brooks et al., “Moire Gauging Using Optical Interference Patterns”, Applied Optics Journal, vol. 8, No. 5, pp. 935-940, May 1969. |
International Application PCT/IL2008/000095 Search Report dated Jul. 24, 2008. |
Bryngdahl, O., “Characteristics of Superposed Patterns in Optics”, Journal of Optical Society of America, vol. 66, No. 2, pp. 87-94, Feb. 1976. |
Chen et al., “Measuring of a Three-Dimensional Surface by Use of a Spatial Distance Computation”, Applied Optics Journal, vol. 42, issue 11, pp. 1958-1972, 2003. |
Tay et al., “Grating Projection System for Surface Contour Measurement”, Applied Optics Journal, vol. 44, No. 8, pp. 1393-1400, Mar. 10, 2005. |
Cohen et al., “High-Resolution X-ray Diffraction for Characterization and Monitoring of Silicon-On-Insulator Fabrication Processes”, Applied Physics Journal, vol. 93, No. 1, pp. 245-250, Jan. 2003. |
Takeda et al., “Fourier Transform Methods of Fringe-Pattern Analysis for Computer-Based Topography and Interferometry”, Journal of Optical Society of America, vol. 72, No. 1, Jan. 1982. |
Doty, J.L., “Projection Moire for Remote Contour Analysis”, Journal of Optical Society of America, vol. 73, No. 3, pp. 366-372, Mar. 1983. |
Takasaki, H., “Moire Topography”, Applied Optics Journal, vol. 12, No. 4, pp. 845-850, Apr. 1973. |
Takasaki, H., “Moire Topography”, Applied Optics Journal, vol. 9, No. 6, pp. 1467-1472, Jun. 1970. |
Su et al., “Application of Modulation Measurement Profilometry to Objects with Surface Holes”, Applied Optics Journal, vol. 38, No. 7, pp. 1153-1158, Mar. 1, 1999. |
Spektor et al., U.S. Appl. No. 61/162,309 “Integrated Chip with Experience Understanding” filed Mar. 22, 2009. |
Shpunt et al., U.S. Appl. No. 61/229,754 “Pattern-Based Depth Mapping with Stereoscopic Assistance” filed Jul. 30, 2009. |
Hildebrand et al., “Multiple-Wavelength and Multiple-Source Holography Applied to Contour Generation”, Journal of Optical Society of America Journal, vol. 57, No. 2, pp. 155-162, Feb. 1967. |
Shpunt et al., U.S. Appl. No. 61/157,560 “Reference Image Techniques for 3D Sensing” filed Mar. 5, 2009. |
Post et al., “Moire Methods for Engineering and Science—Moire Interferometry and Shadow Moire”, Photomechanics (Topics in Applied Physics), vol. 77, pp. 151-196, Springer Berlin / Heidelberg, Jan. 1, 2000. |
Hung et al., “Time-Averaged Shadow-Moire Method for Studying Vibrations”, Applied Optics Journal, vol. 16, No. 6, pp. 1717-1719, Jun. 1977. |
Idesawa et al., “Scanning Moire Method and Automatic Measurement of 3-D Shapes”, Applied Optics Journal, vol. 16, No. 8, pp. 2152-2162, Aug. 1977. |
Iizuka, K., “Divergence-Ratio Axi-Vision Camera (Divcam): A Distance Mapping Camera”, Review of Scientific Instruments 77, 045111 (2006). |
Piestun et al., “Wave Fields in Three Dimensions: Analysis and Synthesis”, Journal of the Optical Society of America, vol. 13, No. 9, pp. 1837-1848, Sep. 1996. |
Lim et al., “Additive Type Moire with Computer Image Processing”, Applied Optics Journal, vol. 28, No. 13, pp. 2677-2680, Jul. 1, 1989. |
International Application PCT/IL2009/000285 Search Report dated Jun. 11, 2009. |
Hart, D., U.S. Appl. No. 09/616,606 “Method and System for High Resolution , Ultra Fast 3-D Imaging” filed Jul. 14, 2000. |
International Application PCT/IL2007/000306 Search Report dated Oct. 2, 2008. |
International Application PCT/IL2007/000306 Preliminary Report on Patentability dated Mar. 19, 2009. |
International Application PCT/IL2007/000262 Search Report dated Oct. 16, 2008. |
International Application PCT/IL2007/000262 Preliminary Report on Patentability dated Mar. 19, 2009. |
Ben Eliezer et al., “All optical extended depth of field imaging system”, Journal of Optics A: Pure and Applied Optics, vol. 5, pp. S164-S169, 2003. |
Sazbon et al., “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding”, Pattern Recognition Letters 26, pp. 1772-1781, year 2005. |
Sjodahl et al., “Measurement of shape by using projected random patterns and temporal digital speckle photography”, Applied Optics, vol. 38, No. 10, Apr. 1, 1999. |
Garcia et al., “Three dimensional mapping and range measurement by means of projected speckle patterns”, Applied Optics, vol. 47, No. 16, Jun. 1, 2008. |
Ypsilos et al., “Speech-driven Face Synthesis from 3D Video”, 2nd International Symposium on 3D Processing, Visualization and Transmission, Thessaloniki, Greece, Sep. 6-9, 2004. |
S.G. Hanson, et al. “Optics and Fluid Dynamics Department”, Annual Progress Report for 1997 (an abstract). |
International Application PCT/IL2008/000458 Search Report dated Oct. 28, 2008. |
International Application PCT/IL2007/000327 Search Report dated Sep. 26, 2008. |
International Application PCT/IL2007/000327 Preliminary Report on Patentability dated Mar. 12, 2009. |
Goodman, J.W., “Statistical Properties of Laser Speckle Patterns”, Laser Speckle and Related Phenomena, pp. 9-75, Springer-Verlag, Berlin Heidelberg, 1975. |
International Application PCT/IL2006/000335 Preliminary Report on Patentability dated Apr. 24, 2008. |
Avidan et al., “Trajectory triangulation: 3D reconstruction of moving points from a monocular image sequence”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 4, Apr. 2000. |
Leclerc et al., “The direct computation of height from shading”, Proceedings of Computer Vision and Pattern Recognition, pp. 552-558, year 1991. |
Zhang et al., “Height recovery from intensity gradients”, Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 508-513, year 1994. |
Horn, B., “Height and gradient from shading”, International Journal of Computer Vision, No. 5, pp. 37-76, year 1990. |
Bruckstein, A., “On shape from shading”, Computer Vision, Graphics, and Image Processing, vol. 44, pp. 139-154, year 1988. |
Zhang et al., “Rapid Shape Acquisition Using Color Structured Light and Multi-Pass Dynamic Programming”, 1st International Symposium on 3D Data Processing Visualization and Transmission (3DPVT), Padova, Italy, Jul. 2002. |
Besl, P., “Active Optical Range Imaging Sensors”, Machine Vision and Applications, No. 1, pp. 127-152, USA 1988. |
Horn et al., “Toward optimal structured light patterns”, Proceedings of International Conference on Recent Advances in 3D Digital Imaging and Modeling, pp. 28-37, Ottawa, Canada, May 1997. |
Mendlovic, et al., “Composite harmonic filters for scale, projection and shift invariant pattern recognition”, Applied Optics, vol. 34, No. 2, pp. 310-316, Jan. 10, 1995. |
Asada et al., “Determining Surface Orientation by Projecting a Stripe Pattern”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 5, year 1988. |
Winkelbach et al., “Shape from Single Stripe Pattern Illumination”, Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 240-247, Springer 2002. |
Koninckx et al., “Efficient, Active 3D Acquisition, based on a Pattern-Specific Snake”, Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 557-565, Springer 2002. |
Kimmel et al., “Analyzing and synthesizing images by evolving curves with the Osher-Sethian method”, International Journal of Computer Vision, 24(1), pp. 37-56, year 1997. |
Zigelman et al., “Texture mapping using surface flattening via multi-dimensional scaling”, IEEE Transactions on Visualization and Computer Graphics, 8 (2), pp. 198-207, year 2002. |
Dainty, J.C., “Introduction”, Laser Speckle and Related Phenomena, pp. 1-7, Springer-Verlag, Berlin Heidelberg, 1975. |
Ypsilos et al., “Video-rate capture of Dynamic Face Shape and Appearance”, Sixth IEEE International Conference on Automatic Face and Gesture Recognition (FGR 2004), Seoul, Korea, May 17-19, 2004. |
Japanese Patent Application # 2008558981 Official Action dated Nov. 2, 2011. |
Chinese Patent Application # 2006800038004.2 Official Action dated Nov. 24, 2011. |
U.S. Appl. No. 12/522,172 Official Action dated Nov. 30, 2011. |
Japanese Patent Application # 2008558984 Official Action dated Nov. 1, 2011. |
U.S. Appl. No. 13/043,488 Official Action dated Jan. 3, 2012. |
Japanese Patent Application # 2008535179 Official Action dated Nov. 8, 2011. |
International Application PCT/IB2011/055155 Search Report dated Apr. 20, 2012. |
U.S. Appl. No. 12/397,362 Official Action dated Apr. 24, 2012. |
Japanese Patent Application # 2011-517308 Official Action dated Dec. 5, 2012. |
U.S. Appl. No. 12/844,864 Official Action dated Dec. 6, 2012. |
U.S. Appl. No. 12/758,047 Official Action dated Oct. 25, 2012. |
U.S. Appl. No. 13/036,023 Official Action dated Jan. 7, 2013. |
U.S. Appl. No. 13/541,775, filed Jul. 5, 2012. |
U.S. Appl. No. 12/282,517 Official Action dated Jun. 12, 2012. |
U.S. Appl. No. 12/522,172 Official Action dated Jun. 29, 2012. |
U.S. Appl. No. 12/703,794 Official Action dated Aug. 7, 2012. |
U.S. Appl. No. 12/522,176 Official Action dated Aug. 2, 2012. |
JP Patent Application # 2008558984 Office Action dated Jul. 3, 2012. |
Korean Patent Application # 10-2008-7025030 Office Action dated Feb. 25, 2013. |
U.S. Appl. No. 12/707,678 Office Action dated Feb. 26, 2013. |
U.S. Appl. No. 12/758,047 Office Action dated Apr. 25, 2013. |
U.S. Appl. No. 12/844,864 Office Action dated Apr. 11, 2013. |
Number | Date | Country | |
---|---|---|---|
20100118123 A1 | May 2010 | US |
Number | Date | Country | |
---|---|---|---|
60909487 | Apr 2007 | US | |
61016832 | Dec 2007 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11899542 | Sep 2007 | US |
Child | 12522171 | US |