The present invention relates generally to methods and systems for mapping of three-dimensional (3D) objects, and specifically to optical 3D mapping.
Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object.
Some methods are based on projecting a laser speckle pattern onto the object, and then analyzing an image of the pattern on the object. For example, PCT International Publication WO 2007/043036, whose disclosure is incorporated herein by reference, describes a system and method for object reconstruction in which a coherent light source and a generator of a random speckle pattern project a coherent random speckle pattern onto the object. An imaging unit detects the light response of the illuminated region and generates image data. Shifts of the pattern in the image of the object relative to a reference image of the pattern are used in real-time reconstruction of a 3D map of the object.
In other methods of optical 3D mapping, different sorts of patterns are projected onto the object to be mapped. For example, PCT International Publication WO 93/03579 describes a three-dimensional vision system in which one or two projectors establish structured light comprising two sets of parallel stripes having different periodicities and angles. As another example, U.S. Pat. No. 6,751,344 describes a method for optically scanning a subject in which the subject is illuminated with a matrix of discrete two-dimensional image objects, such as a grid of dots. Other methods involve projection of a grating pattern, as described, for example, in U.S. Pat. No. 4,802,759. The disclosures of the above-mentioned patents and publications are incorporated herein by reference.
Other methods for 3D mapping and ranging use coded illumination. For example, Sazbon et al. describe a method of this sort for range estimation in “Qualitative Real-Time Range Extraction for Preplanned Scene Partitioning Using Laser Beam Coding,” Pattern Recognition Letters 26 (2005), pages 1772-1781, which is incorporated herein by reference. A phase-only filter codes the laser beam into M different diffraction patterns, corresponding to M different range segments in the workspace. Thus, each plane in the illuminated scene is irradiated with the pattern corresponding to the range of the plane from the light source. A common camera can be used to capture images of the scene, which may then be processed to determine the ranges of objects in the scene.
As another example, PCT International Publication WO 2007/105215 (published after the priority date of the present patent application), whose disclosure is incorporated herein by reference, describes a method for mapping in which a pattern of multiple spots is projected onto an object. The positions of the spots in the pattern are uncorrelated, but the shapes of the spots share a common characteristic. In some embodiments, the spot shape characteristic changes with distance from the illumination source. This distance-varying shape characteristic may be achieved by passing the illumination beam through one or more optical elements that are designed to superpose two optical constraints: one to split the beam into multiple spots, and another to create the distance-varying shape. An image of the spots on the object is captured and processed so as to derive a 3D map of the object.
Embodiments of the present invention that are described hereinbelow provide methods and systems for optical 3D mapping of an object. These methods operate by projection of a light pattern onto the object, capturing an image of the pattern on the object, and processing the image to detect features of the pattern that vary with distance.
There is therefore provided, in accordance with an embodiment of the present invention, a method for mapping, including:
projecting a pattern onto an object via an astigmatic optical element having different, respective focal lengths in different meridional planes of the element;
capturing an image of the pattern on the object; and
processing the image so as to derive a three-dimensional (3D) map of the object responsively to the different focal lengths.
In some embodiments, the astigmatic optical element causes the pattern that is projected on the object to be elongated with a direction of elongation that varies with a distance from the element, and processing the image includes finding the distance to the object responsively to the direction of the elongation. The pattern may include multiple spots, which are projected onto the object by the astigmatic optical element as ellipses, having respective major axes in the direction of elongation. The ellipses have respective minor axes, and the major and minor axes of each ellipse have respective lengths, and finding the distance may include comparing the respective lengths of the major and minor axes.
In one embodiment, the astigmatic optical element includes a combination of at least two cylindrical lens surfaces in different, respective orientations. Alternatively or additionally, the astigmatic optical element may include a diffractive optical element or an off-axis element.
There is also provided, in accordance with an embodiment of the present invention, a method for mapping, including:
directing light via an aperture so as to project onto an object a diffraction pattern characterized by a transition from near-field to far-field diffraction;
capturing an image of the diffraction pattern on the object; and
processing the image so as to derive a three-dimensional (3D) map of the object responsively to the transition of the diffraction pattern.
In a disclosed embodiment, directing the light includes projecting multiple spots onto the object, wherein each spot exhibits the diffraction pattern responsively to a distance from the aperture of a respective location on the object onto which the spot is projected.
There is additionally provided, in accordance with an embodiment of the present invention, a method for mapping, including:
capturing an image of an object, which has a surface with features having respective shapes at respective locations on the surface, using an optical objective that is configured to modify the shapes of the features in the image as a function of a distance of the respective locations from the objective; and
processing the image so as to derive a three-dimensional (3D) map of the object responsively to the modified shapes of the features.
In disclosed embodiments, the method includes projecting a pattern of multiple spots onto the surface of the object, wherein the features in the image include the spots, and wherein processing the image includes analyzing the shapes of the spots in the image.
In one embodiment, the optical objective includes an astigmatic optical element having different, respective focal lengths in different meridional planes of the element. Alternatively or additionally, the objective may include a diffractive optical element.
There is further provided, in accordance with an embodiment of the present invention, a method for mapping, including:
projecting a pattern having a size characteristic onto an object from an illumination assembly at a first distance from the object;
capturing an image of the pattern on the object using an image capture assembly at a second distance from the object, which is different from the first distance; and
processing the image so as to derive a three-dimensional (3D) map of the object responsively to the size characteristic of the pattern in the image and to a difference between the first and second distances.
Typically, the projected pattern includes multiple spots, and the size characteristic is selected from a set of characteristics consisting of sizes of the spots and distances between the spots.
There is moreover provided, in accordance with an embodiment of the present invention, apparatus for mapping, including:
an illumination assembly, which includes an astigmatic optical element having different, respective focal lengths in different meridional planes of the element and is configured to project a pattern onto an object via the astigmatic optical element;
an image capture assembly, which is configured to capture an image of the pattern on the object; and
an image processor, which is configured to process the image so as to derive a three-dimensional (3D) map of the object responsively to the different focal lengths.
There is furthermore provided, in accordance with an embodiment of the present invention, apparatus for mapping, including:
an illumination assembly, which includes an aperture and is configured to direct light via the aperture so as to project onto an object a diffraction pattern characterized by a transition from near-field to far-field diffraction;
an image capture assembly, which is configured to capture an image of the diffraction pattern on the object; and
an image processor, which is configured to process the image so as to derive a three-dimensional (3D) map of the object responsively to the transition of the diffraction pattern.
There is also provided, in accordance with an embodiment of the present invention, apparatus for mapping, including:
an image capture assembly, which is configured to capture an image of an object, which has a surface with features having respective shapes at respective locations on the surface, and includes an optical objective that is configured to modify the shapes of the features in the image as a function of a distance of the respective locations from the objective; and
an image processor, which is configured to process the image so as to derive a three-dimensional (3D) map of the object responsively to the modified shapes of the features.
There is additionally provided, in accordance with an embodiment of the present invention, apparatus for mapping, including:
an illumination assembly, which is located at a first distance from an object and is configured to project a pattern having a size characteristic onto the object;
an image capture assembly, which is located at a second distance from the object, which is different from the first distance, and is configured to capture an image of the pattern on the object; and
an image processor, which is configured to process the image so as to derive a three-dimensional (3D) map of the object responsively to the size characteristic of the pattern in the image and to a difference between the first and second distances.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
In some embodiments, the pattern that is projected by imaging device 22 comprises a pattern of spots, which may be either bright or dark and may have any suitable shapes, including shapes that exhibit random variations. In some embodiments, the pattern comprises a speckle pattern, meaning a projected pattern of bright spots whose positions are uncorrelated in planes transverse to the projection beam axis. Speckle patterns of this sort may be created by diffusion of a laser beam or by human or computer design, as described, for example, in the above-mentioned PCT International Publications WO 2007/043036 and WO 2007/105215. In other embodiments, the spots may be arranged in a regular, non-random pattern, such as the type of pattern that may be created by passing the illumination beam through a Damman grating or a suitable lenslet array.
An image processor 24 processes image data generated by device 22 in order to determine the distance to one or more points on the surface of object 28 and thus, typically, to reconstruct a 3D map of the object. The term “3D map” refers to a set of 3D coordinates representing the surface of the object. The derivation of such a map based on image data is referred to herein as “3D mapping” or equivalently, “3D reconstruction.” Image processor 24 computes the 3D coordinates of points on the surface of object 28 based on variations in the shapes and/or sizes of the spots appearing in the images captured by device 22, as described hereinbelow. The information provided by the shapes and/or sizes may optionally be supplemented by triangulation, based on the transverse shifts of the spots in the image relative to a reference pattern. Methods for this sort of triangulation-based 3D mapping using a projected laser speckle pattern are described in the above-mentioned PCT publications.
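By way of illustration only, and not as part of the disclosed embodiments, the flow that processor 24 might follow can be sketched as: segment the spots in the captured image, measure each spot's shape or size, convert that measurement to a distance, and collect the results into a sparse 3D map. The function names, the thresholding step, and the caller-supplied shape-to-distance model in the following Python sketch are assumptions, not elements of this disclosure.

```python
import numpy as np
from scipy import ndimage

def build_3d_map(image, shape_to_distance, threshold=0.5):
    """Illustrative pipeline: segment bright spots, measure each spot,
    and convert the measurement to a distance Z at the spot centroid."""
    mask = image > threshold * image.max()            # crude spot segmentation
    labels, n_spots = ndimage.label(mask)             # connected components
    points = []
    for i in range(1, n_spots + 1):
        ys, xs = np.nonzero(labels == i)
        cx, cy = xs.mean(), ys.mean()                 # spot centroid (pixels)
        z = shape_to_distance(xs, ys, image[ys, xs])  # caller-supplied model
        points.append((cx, cy, z))
    return np.array(points)                           # one (X, Y, Z) triple per spot
```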
Image processor 24 may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow. The software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media. Alternatively or additionally, some or all of the functions of the image processor may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although processor 24 is shown in
The 3D map that is generated by processor 24 may be used for a wide range of different purposes. For example, the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object. In the example shown in
The pattern created by light source 34 and transparency 36 is projected by an astigmatic optical element 38 onto object 28. Although transparency 36 is located in
Transparency 36 may be configured to create various sorts of patterns of spots. In some embodiments, the transparency comprises a diffuser, which creates a random speckle pattern when illuminated by a coherent light source. Alternatively, the transparency may contain a pattern of binary (white/black) spots, distributed over the area of the transparency according to the values of a pseudo-random distribution function. Details of these and other means for creating random and pseudo-random patterns are described in the above-mentioned PCT publications. Further alternatively, transparency 36 may contain a regular, non-random pattern of spots or possibly other shapes.
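By way of illustration only, a pseudo-random binary spot mask of the kind described above might be generated as in the following sketch; the fill factor and random seed are arbitrary assumptions rather than values taken from this disclosure.

```python
import numpy as np

def binary_spot_mask(height, width, fill=0.1, seed=0):
    """Pseudo-random binary (white/black) transparency: each cell is
    transparent with probability `fill`, opaque otherwise."""
    rng = np.random.default_rng(seed)
    return (rng.random((height, width)) < fill).astype(np.uint8)

mask = binary_spot_mask(480, 640)   # 1 = transparent (bright spot), 0 = opaque
```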
An image capture assembly 32 captures an image of the pattern that has been projected by illumination assembly 30 onto object 28. The image capture assembly comprises objective optics 40 (also referred to simply as an “objective”), which focus the image onto an image sensor 42. Typically, sensor 42 comprises an array of detector elements 44, such as a CCD or CMOS-based image sensor array. Assembly 32 may also comprise a bandpass filter (not shown in the figures), chosen and positioned so that sensor 42 receives only light in the emission band of light source 34, while filtering out ambient light that might otherwise reduce the contrast of the image of the projected pattern that is captured by the sensor.
In the embodiment shown in
Astigmatic optical element 38 causes the shapes of the spots that are projected on the object to be elongated with a direction of elongation that varies with distance from element 38. This phenomenon arises because the astigmatic element has different, respective focal lengths in different meridional planes, as shown below in
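One possible way to measure the direction of elongation and the axis lengths of each imaged spot, offered purely as an illustrative sketch and not as a method recited in this disclosure, is to compute the spot's intensity-weighted second moments and take the eigenvectors and eigenvalues of the resulting covariance matrix:

```python
import numpy as np

def spot_ellipse(xs, ys, intensities):
    """Fit an ellipse to one spot via its intensity-weighted second moments.
    Returns (angle, major, minor): the orientation of the major axis and the
    approximate lengths of the major and minor axes."""
    w = intensities / intensities.sum()
    mx, my = (w * xs).sum(), (w * ys).sum()
    cxx = (w * (xs - mx) ** 2).sum()
    cyy = (w * (ys - my) ** 2).sum()
    cxy = (w * (xs - mx) * (ys - my)).sum()
    evals, evecs = np.linalg.eigh([[cxx, cxy], [cxy, cyy]])
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])   # major-axis direction (radians)
    major, minor = 2.0 * np.sqrt(evals[::-1])      # largest extent first
    return angle, major, minor
```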
To generate the 3D map of object 28, image processor 24 (
The spot direction calculation may be used by itself or in combination with other methods of image analysis for distance determination. For example, image processor 24 may match the group of spots in each area of the captured image to a reference image in order to find the relative shift between the matching groups of spots. The image processor may then use this shift to find the Z-coordinate of the corresponding area of the object by triangulation, as described in the above-mentioned PCT international publications.
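By way of illustration only, the shift-matching and triangulation steps might be sketched as follows; the normalized cross-correlation search, the baseline, and the focal length expressed in pixels are standard stereo-ranging assumptions rather than details taken from the cited publications.

```python
import numpy as np

def best_shift(window, reference_strip, max_shift):
    """Horizontal shift (in pixels) at which `window` best matches the
    reference strip, found by normalized cross-correlation."""
    w = (window - window.mean()) / (window.std() + 1e-9)
    n_shifts = min(max_shift, reference_strip.shape[1] - window.shape[1]) + 1
    scores = []
    for s in range(n_shifts):
        ref = reference_strip[:, s:s + window.shape[1]]
        r = (ref - ref.mean()) / (ref.std() + 1e-9)
        scores.append((w * r).mean())
    return int(np.argmax(scores))

def depth_from_shift(shift_px, baseline_m, focal_px):
    """Standard triangulation: depth is inversely proportional to the shift."""
    return baseline_m * focal_px / shift_px
```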
Reference is now made to
In the embodiment shown in
As illustrated in
Assuming lens 50 to have height h1 (the dimension in the direction perpendicular to its meridional plane 54), and lens 52 to have height h2, the respective lengths of spots 46 and 60 will be l1 and l2, as given by:
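The equation referred to above is not reproduced in this text. Assuming simple geometric (similar-triangle) projection through each cylindrical lens, with focal lengths f1 and f2 in the respective meridional planes and the spot observed at a distance z from the element, a plausible form, offered only as an assumption, would be:

```latex
l_1 \approx h_1 \, \frac{\lvert z - f_1 \rvert}{f_1},
\qquad
l_2 \approx h_2 \, \frac{\lvert z - f_2 \rvert}{f_2}
```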
The major and minor axes of the ellipses between the foci will vary linearly between these limits and may thus be used by processor 24 in computing the distance to the spot on the object.
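As an illustrative sketch only, this linear variation between the two line foci might be inverted as follows; the calibration endpoints (the spot extents measured at the two foci) and the averaging of the two estimates are assumptions of this sketch, not a formula recited in this disclosure.

```python
def distance_between_foci(len1, len2, f1, f2, len1_at_f2, len2_at_f1):
    """Estimate the distance z of a spot lying between the two line foci
    (f1 < f2) from its measured extents along the two meridional directions.

    Assumes len1 grows linearly from ~0 at z = f1 to len1_at_f2 at z = f2,
    while len2 shrinks linearly from len2_at_f1 at z = f1 to ~0 at z = f2."""
    t1 = len1 / len1_at_f2            # fractional position: 0 at f1, 1 at f2
    t2 = 1.0 - len2 / len2_at_f1      # same quantity from the other meridian
    t = 0.5 * (t1 + t2)               # average the two independent estimates
    return f1 + t * (f2 - f1)
```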
In an alternative embodiment of the present invention, transparency 36 (
In the present embodiment, image capture assembly 32 captures an image of the diffraction pattern that is projected onto the object. Processor 24 compares the form of the pattern to the expected forms of the near- and far-field patterns (typically including intermediate transition patterns) in order to determine the distance of the object from the illumination assembly. As in the preceding embodiments, when an array of spots is projected onto the object, the processor typically examines the diffraction pattern exhibited by each spot in order to determine the 3D coordinates of the corresponding location on the object and thus create a 3D map of the object. This sort of pattern analysis may similarly be combined with triangulation-based depth information.
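By way of illustration only, the comparison of each observed spot with the expected near-field, transition, and far-field patterns might be carried out by normalized correlation against a pre-computed or pre-recorded bank of reference patterns; the bank and the correlation metric are assumptions of this sketch, not elements of this disclosure.

```python
import numpy as np

def classify_range(spot, reference_bank):
    """Pick the distance whose expected diffraction pattern best matches the
    observed spot. `reference_bank` is a list of (distance, pattern) pairs,
    each pattern having the same shape as `spot`."""
    s = (spot - spot.mean()) / (spot.std() + 1e-9)
    best_distance, best_score = None, -np.inf
    for distance, pattern in reference_bank:
        p = (pattern - pattern.mean()) / (pattern.std() + 1e-9)
        score = float((s * p).mean())              # normalized cross-correlation
        if score > best_score:
            best_distance, best_score = distance, score
    return best_distance
```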
In the preceding embodiments, optical elements with non-uniform response, such as astigmatic optics, are used in the illumination assembly to create spots on the object whose shape changes with distance, and this shape change is used in 3D mapping of the object. (In the context of the present patent application and in the claims, terms such as “changes of shape” and “to modify the shape” refer to changes other than the simple linear increase in spot size that normally occurs with distance from the illumination source or objective optics.) In alternative embodiments, on the other hand, objective optics with non-uniform response may be used to create depth-dependent shapes of features in images of the object. The features may comprise spots in a pattern that is projected onto the object or any other suitable type of projected pattern or even inherent features of the object surface itself.
An image capture assembly 94 comprises an astigmatic optical objective 98, which forms an image on sensor 42 of the spot pattern that is projected onto the object. In this simplified example, objective 98 comprises two cylindrical lenses 100 and 102 in different orientations, like the lenses shown in
Processor 24 analyzes the spot shapes in order to determine the 3D coordinates of the spot locations and thus derive a 3D map of the object. Again, this sort of shape analysis may be combined with triangulation-based shift analysis, as described above.
In other embodiments, objective 98 may comprise a diffractive optical element (DOE), which likewise modifies the shapes of the spots in the image as a function of object distance. The DOE may have an astigmatic response, like the DOE shown above in
Alternatively or additionally, objective 98 may comprise other sorts of refractive optical elements with sufficient astigmatism to create a suitable shape variation of the spots in the image. For example, an off-axis spherical lens may be used for this purpose.
In the embodiments shown in the preceding figures, it is convenient that the illumination and image capture assemblies be located at approximately the same distance from object 28. As a result, the size characteristics of the projected pattern (such as the sizes of the spots and distances between the spots) in the image of the pattern that is formed on the image sensor do not change significantly with distance of the spot locations on the object from the imaging assembly. The reason for the constant size characteristics in the image stems simply from geometrical optics: The projected pattern on the object is magnified linearly with the distance, and the respective images of this pattern are demagnified linearly in the same proportion. In other words, the angular extent of any given spot or distance between spots in both the projected pattern and in the image remains constant regardless of the distance to which the spots are projected.
In order to create a 3D map of object 28, processor 24 (
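By way of illustration only, and under the simple geometric model suggested by the preceding paragraph, the distance of an object point might be recovered from the measured spot pitch (or spot size) in the image as in the following sketch; the model and the calibration constants are assumptions of this sketch, not a formula recited in this disclosure.

```python
def distance_from_pitch(p_img, p_inf, delta):
    """Recover the illumination-to-object distance D from the spot pitch
    (or spot size) measured in the image.

    p_img : measured pitch in the image, in pixels.
    p_inf : calibrated pitch the same feature approaches at large distance.
    delta : fixed difference between the image-capture and illumination
            distances to the object (second distance minus first), in metres.

    Assumed model: p_img = p_inf * D / (D + delta), solved here for D."""
    return delta * p_img / (p_inf - p_img)
```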
Although
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application claims the benefit of U.S. Provisional Patent Application 60/944,807, filed Jun. 19, 2007, whose disclosure is incorporated herein by reference.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/IL2008/000838 | 6/19/2008 | WO | 00 | 7/6/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2008/155770 | 12/24/2008 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4336978 | Suzuki | Jun 1982 | A |
4542376 | Bass et al. | Sep 1985 | A |
4802759 | Matsumoto | Feb 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
5075562 | Greivenkamp et al. | Dec 1991 | A |
5630043 | Uhlin | May 1997 | A |
5636025 | Bieman et al. | Jun 1997 | A |
5835218 | Harding | Nov 1998 | A |
5838428 | Pipitone et al. | Nov 1998 | A |
5856871 | Cabib et al. | Jan 1999 | A |
5909312 | Mendlovic et al. | Jun 1999 | A |
6041140 | Binns et al. | Mar 2000 | A |
6081269 | Quarendon | Jun 2000 | A |
6084712 | Harding | Jul 2000 | A |
6088105 | Link | Jul 2000 | A |
6099134 | Taniguchi et al. | Aug 2000 | A |
6101269 | Hunter et al. | Aug 2000 | A |
6108036 | Harada et al. | Aug 2000 | A |
6167151 | Albeck | Dec 2000 | A |
6259561 | George et al. | Jul 2001 | B1 |
6262740 | Lauer et al. | Jul 2001 | B1 |
6268923 | Michniewicz et al. | Jul 2001 | B1 |
6301059 | Huang et al. | Oct 2001 | B1 |
6404553 | Wootton et al. | Jun 2002 | B1 |
6438263 | Albeck et al. | Aug 2002 | B2 |
6494837 | Kim et al. | Dec 2002 | B2 |
6495848 | Rubbert | Dec 2002 | B1 |
6631647 | Seale | Oct 2003 | B2 |
6686921 | Rushmeier et al. | Feb 2004 | B1 |
6700669 | Geng | Mar 2004 | B1 |
6731391 | Kao et al. | May 2004 | B1 |
6741251 | Malzbender | May 2004 | B2 |
6751344 | Grumbine | Jun 2004 | B1 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6759646 | Acharya et al. | Jul 2004 | B1 |
6803777 | Pfaff et al. | Oct 2004 | B2 |
6810135 | Berenz et al. | Oct 2004 | B1 |
6825985 | Brown et al. | Nov 2004 | B2 |
6841780 | Cofer et al. | Jan 2005 | B2 |
6859326 | Sales | Feb 2005 | B2 |
6937348 | Geng | Aug 2005 | B2 |
7006952 | Matsumoto et al. | Feb 2006 | B1 |
7009742 | Brotherton-Ratcliffe et al. | Mar 2006 | B2 |
7013040 | Shiratani | Mar 2006 | B2 |
7127101 | Littlefield et al. | Oct 2006 | B2 |
7194105 | Hersch et al. | Mar 2007 | B2 |
7231069 | Nahata | Jun 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7369685 | DeLean | May 2008 | B2 |
7385708 | Ackerman et al. | Jun 2008 | B2 |
7433024 | Garcia et al. | Oct 2008 | B2 |
7560679 | Gutierrez | Jul 2009 | B1 |
7659995 | Knighton et al. | Feb 2010 | B2 |
7700904 | Toyoda et al. | Apr 2010 | B2 |
7751063 | Dillon et al. | Jul 2010 | B2 |
7840031 | Albertson et al. | Nov 2010 | B2 |
7952781 | Weiss et al. | May 2011 | B2 |
8018579 | Krah | Sep 2011 | B1 |
8035806 | Jin et al. | Oct 2011 | B2 |
8126261 | Medioni et al. | Feb 2012 | B2 |
8326025 | Boughorbel | Dec 2012 | B2 |
20010016063 | Albeck et al. | Aug 2001 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20030048237 | Sato et al. | Mar 2003 | A1 |
20040001145 | Abbate | Jan 2004 | A1 |
20040130730 | Cantin et al. | Jul 2004 | A1 |
20040130790 | Sales | Jul 2004 | A1 |
20040174770 | Rees | Sep 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20040218262 | Chuang et al. | Nov 2004 | A1 |
20050007551 | Wakil et al. | Jan 2005 | A1 |
20050018209 | Lemelin et al. | Jan 2005 | A1 |
20050052637 | Shaw et al. | Mar 2005 | A1 |
20050200838 | Shaw et al. | Sep 2005 | A1 |
20050231465 | DePue et al. | Oct 2005 | A1 |
20060017656 | Miyahara | Jan 2006 | A1 |
20060221218 | Adler et al. | Oct 2006 | A1 |
20060221250 | Rossbach et al. | Oct 2006 | A1 |
20060269896 | Liu et al. | Nov 2006 | A1 |
20070057946 | Albeck et al. | Mar 2007 | A1 |
20070133840 | Cilia | Jun 2007 | A1 |
20070165243 | Kang et al. | Jul 2007 | A1 |
20080031513 | Hart | Feb 2008 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080118143 | Gordon et al. | May 2008 | A1 |
20080198355 | Domenicali et al. | Aug 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080240502 | Freedman et al. | Oct 2008 | A1 |
20080247670 | Tam et al. | Oct 2008 | A1 |
20080278572 | Gharib et al. | Nov 2008 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090060307 | Ghanem et al. | Mar 2009 | A1 |
20090096783 | Shpunt et al. | Apr 2009 | A1 |
20090226079 | Katz et al. | Sep 2009 | A1 |
20090244309 | Maison et al. | Oct 2009 | A1 |
20100007717 | Spektor et al. | Jan 2010 | A1 |
20100013860 | Mandella et al. | Jan 2010 | A1 |
20100020078 | Shpunt | Jan 2010 | A1 |
20100118123 | Freedman et al. | May 2010 | A1 |
20100128221 | Muller et al. | May 2010 | A1 |
20100142014 | Rosen et al. | Jun 2010 | A1 |
20100177164 | Zalevsky | Jul 2010 | A1 |
20100182406 | Benitez | Jul 2010 | A1 |
20100194745 | Leister et al. | Aug 2010 | A1 |
20100201811 | Garcia et al. | Aug 2010 | A1 |
20100225746 | Shpunt et al. | Sep 2010 | A1 |
20100243899 | Ovsiannikov et al. | Sep 2010 | A1 |
20100245826 | Lee | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100278384 | Shotton et al. | Nov 2010 | A1 |
20100303289 | Polzin et al. | Dec 2010 | A1 |
20110001799 | Rothenberger et al. | Jan 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110074932 | Gharib et al. | Mar 2011 | A1 |
20110096182 | Cohen et al. | Apr 2011 | A1 |
20110134114 | Rais et al. | Jun 2011 | A1 |
20110158508 | Shpunt et al. | Jun 2011 | A1 |
20110211044 | Shpunt et al. | Sep 2011 | A1 |
20110279648 | Lutian et al. | Nov 2011 | A1 |
20110285910 | Bamji et al. | Nov 2011 | A1 |
20120012899 | Jin et al. | Jan 2012 | A1 |
20120051588 | McEldowney | Mar 2012 | A1 |
Number | Date | Country |
---|---|---|
19736169 | Aug 1997 | DE |
19638727 | Mar 1998 | DE |
2352901 | Feb 2001 | GB |
62206684 | Sep 1987 | JP |
01-240863 | Sep 1989 | JP |
03-029806 | Feb 1991 | JP |
H03-040591 | Feb 1991 | JP |
06-273432 | Sep 1994 | JP |
H08-186845 | Jul 1996 | JP |
H10-327433 | Dec 1998 | JP |
2001141430 | May 2001 | JP |
2002122417 | Apr 2002 | JP |
2002-152776 | May 2002 | JP |
2002-213931 | Jul 2002 | JP |
2002-365023 | Dec 2002 | JP |
2006-128818 | May 2006 | JP |
9303579 | Feb 1993 | WO |
9827514 | Jun 1998 | WO |
9828593 | Jul 1998 | WO |
9828593 | Jul 1998 | WO |
2005010825 | Feb 2005 | WO |
2007043036 | Apr 2007 | WO |
2007096893 | Aug 2007 | WO |
2007105205 | Sep 2007 | WO |
2007105215 | Sep 2007 | WO |
2008120217 | Oct 2008 | WO |
2008155770 | Dec 2008 | WO |
Entry |
---|
Lavoie et al., “3-D Object Model Recovery From 2-D Images Using Structured Light”, IEEE Transactions on Instrumentation and Measurement, vol. 53, No. 2, pp. 437-443, Apr. 2004. |
Chinese Application # 200780016625.5 Office Action dated May 12, 2011. |
U.S. Appl. No. 11/899,542 Office Action dated Apr. 4, 2011. |
U.S. Appl. No. 11/724,068 Office Action dated Mar. 1, 2011. |
Chinese Application # 200780009053.8 Office Action dated Mar. 10, 2011. |
Japanese Application # 2008535179 Office Action dated Apr. 1, 2011. |
Kun et al., “Gaussian Laser Beam Spatial Distribution Measurement by Speckles Displacement Method”, High Power Laser and Particle Beams, vol. 12, No. 2, Apr. 2000. |
Chinese Patent Application # 200680038004.2 Official Action dated Dec. 24, 2010. |
Hongjun et al., “Shape Measurement by Digital Speckle Temporal Sequence Correlation Method”, Acta Optica Sinica Journal, vol. 21, No. 10, pp. 1208-1213, Oct. 2001. |
Japanese Patent Application # 2008558981 Official Action dated Nov. 2, 2011. |
U.S. Appl. No. 12/522,171 Official Action dated Dec. 22, 2011. |
U.S. Appl. No. 12/522,172 Official Action dated Nov. 30, 2011. |
Japanese Patent Application # 2008558984 Official Action dated Nov. 1, 2011. |
U.S. Appl. No. 13/043,488 Official Action dated Jan. 3, 2012. |
Japanese Patent Application # 2008535179 Official Action dated Nov. 8, 2011. |
Chinese Patent Application # 2006800038004.2 Official Action dated Nov. 24, 2011. |
Marcia et al., “Superimposed Video Disambiguation for Increased Field of View”, Optics Express 16:21, pp. 16352-16363, year 2008. |
Guan et al., “Composite Structured Light Pattern for Three Dimensional Video”, Optics Express 11:5, pp. 406-417, year 2003. |
U.S. Appl. No. 13/311,584, filed Dec. 6, 2011. |
PCT Application PCT/IB2011/055155 filed on Nov. 17, 2011. |
Engfield, N., “Use of Pseudorandom Encoded Grid in U.S. Appl. No. 11/899,542”, Andrews Robichaud, Jun. 22, 2011. |
U.S. Appl. No. 61/471,215, filed Apr. 4, 2011. |
Chinese Patent Application # 200680038004.2 Official Action dated Aug. 3, 2011 (English translation). |
International Application PCT/IB2011/053560 filed on Aug. 10, 2011. |
U.S. Appl. No. 61/419,891, filed Dec. 6, 2010. |
U.S. Appl. No. 61/415,352, filed Nov. 19, 2010. |
Hart, D., U.S. Appl. No. 09/616,606 “Method and System for High Resolution, Ultra Fast 3-D Imaging” filed on Jul. 14, 2000. |
International Application PCT/IL2007/000306 Search Report dated Oct. 2, 2008. |
International Application PCT/IL2007/000306 Preliminary Report on Patentability dated Mar. 19, 2009. |
International Application PCT/IL2007/000262 Search Report dated Oct. 16, 2008. |
International Application PCT/IL2007/000262 Preliminary Report on Patentability dated Mar. 19, 2009. |
Ben Eliezer et al., “All optical extended depth of field imaging system”, Journal of Optics A: Pure and Applied Optics, vol. 5, pp. S164-S169, 2003. |
Sazbon et al., “Qualitative real-time range extraction for preplanned scene partitioning using laser beam coding”, Pattern Recognition Letters 26, pp. 1772-1781, year 2005. |
Sjodahl et al., “Measurement of shape by using projected random patterns and temporal digital speckle photography”, Applied Optics, vol. 38, No. 10, Apr. 1, 1999. |
Garcia et al., “Three dimensional mapping and range measurement by means of projected speckle patterns”, Applied Optics, vol. 47, No. 16, Jun. 1, 2008. |
Chen et al., “Measuring of a Three-Dimensional Surface by Use of a Spatial Distance Computation”, Applied Optics, vol. 42, issue 11, pp. 1958-1972, 2003. |
Ypsilos et al., “Speech-driven Face Synthesis from 3D Video”, 2nd International Symposium on 3D Processing, Visualization and Transmission, Thessaloniki, Greece, Sep. 6-9, 2004. |
S.G. Hanson, et al. “Optics and Fluid Dynamics Department”, Annual Progress Report for 1997 (an abstract). |
International Application PCT/IL2008/000458 Search Report dated Oct. 28, 2008. |
International Application PCT/IL2007/000327 Search Report dated Sep. 26, 2008. |
International Application PCT/IL2007/000327 Preliminary Report on Patentability dated Mar. 12, 2009. |
Goodman, J.W., “Statistical Properties of Laser Speckle Patterns”, Laser Speckle and Related Phenomena, pp. 9-75, Springer-Verlag, Berlin Heidelberg, 1975. |
International Application PCT/IL2006/000335 Preliminary Report on Patentability dated Apr. 24, 2008. |
Avidan et al., “Trajectory triangulation: 3D reconstruction of moving points from a monocular image sequence”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 4, Apr. 2000. |
Leclerc et al., “The direct computation of height from shading”, Proceedings of Computer Vision and Pattern Recognition, pp. 552-558, year 1991. |
Zhang et al., “Shape from intensity gradient”, IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 29, No. 3, pp. 318-325, May 1999. |
Zhang et al., “Height recovery from intensity gradients”, Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 508-513, year 1994. |
Horn, B., “Height and gradient from shading”, International Journal of Computer Vision, No. 5, pp. 37-76, year 1990. |
Bruckstein, A., “On shape from shading”, Computer Vision, Graphics, and Image Processing, vol. 44, pp. 139-154, year 1988. |
Zhang et al., “Rapid Shape Acquisition Using Color Structured Light and Multi-Pass Dynamic Programming”, 1st International Symposium on 3D Data Processing Visualization and Transmission (3DPVT), Padova, Italy, Jul. 2002. |
Besl, P., “Active Optical Range Imaging Sensors”, Machine Vision and Applications, No. 1, pp. 127-152, USA 1988. |
Horn et al., “Toward optimal structured light patterns”, Proceedings of International Conference on Recent Advances in 3D Digital Imaging and Modeling, pp. 28-37, Ottawa, Canada, May 1997. |
Mendlovic, et al., “Composite harmonic filters for scale, projection and shift invariant pattern recognition”, Applied Optics, vol. 34, No. 2, pp. 310-316, Jan. 10, 1995. |
Asada et al., “Determining Surface Orientation by Projecting a Stripe Pattern”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, No. 5, year 1988. |
Winkelbach et al., “Shape from Single Stripe Pattern Illumination”, Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 240-247, Springer 2002. |
Koninckx et al., “Efficient, Active 3D Acquisition, based on a Pattern-Specific Snake”, Luc Van Gool (Editor), (DAGM 2002) Pattern Recognition, Lecture Notes in Computer Science 2449, pp. 557-565, Springer 2002. |
Kimmel et al., “Analyzing and synthesizing images by evolving curves with the Osher-Sethian method”, International Journal of Computer Vision, 24(1), pp. 37-56, year 1997. |
Zigelman et al., “Texture mapping using surface flattening via multi-dimensional scaling”, IEEE Transactions on Visualization and Computer Graphics, 8 (2), pp. 198-207, year 2002. |
Dainty, J.C., “Introduction”, Laser Speckle and Related Phenomena, pp. 1-7, Springer-Verlag, Berlin Heidelberg, 1975. |
Ypsilos et al., “Video-rate capture of Dynamic Face Shape and Appearance”, Sixth IEEE International Conference on Automatic Face and Gesture Recognition (FGR 2004), Seoul, Korea, May 17-19, 2004. |
International Application PCT/IL2008/000838 Search Report dated Nov. 12, 2008. |
U.S. Appl. No. 12/522,171 Official Action dated Apr. 5, 2012. |
U.S. Appl. No. 12/397,362 Official Action dated Apr. 24, 2012. |
International Application PCT/IB2011/053560 Search Report dated Jan. 19, 2012. |
International Application PCT/IB2011/055155 Search Report dated Apr. 20, 2012. |
U.S. Appl. No. 13/311,589, filed Dec. 6, 2011. |
U.S. Appl. No. 13/437,977, filed Apr. 3, 2012. |
U.S. Appl. No. 61/598,921, filed Feb. 15, 2012. |
Richardson, W. H., “Bayesian-Based Iterative Method of Image Restoration”, Journal of the Optical Society of America, vol. 62, No. 1, pp. 55-59, Jan. 1972. |
Omnivision Technologies Inc., “OV2710 1080p/720p HD Color CMOS Image Sensor with OmniPixel3-HS Technology”, Dec. 2011. |
U.S. Appl. No. 13/541,775, filed Jul. 5, 2012. |
U.S. Appl. No. 12/282,517 Official Action dated Jun. 12, 2012. |
U.S. Appl. No. 12/522,172 Official Action dated Jun. 29, 2012. |
U.S. Appl. No. 12/703,794 Official Action dated Aug. 7, 2012. |
JP Patent Application # 2008558984 Office Action dated Jul. 3, 2012. |
Japanese Patent Application # 2011-517308 Official Action dated Dec. 5, 2012. |
U.S. Appl. No. 12/844,864 Official Action dated Dec. 6, 2012. |
U.S. Appl. No. 12/758,047 Official Action dated Oct. 25, 2012. |
U.S. Appl. No. 13/036,023 Official Action dated Jan. 7, 2013. |
Korean Patent Application # 10-2008-7025030 Office Action dated Feb. 25, 2013. |
U.S. Appl. No. 12/707,678 Office Action dated Feb. 26, 2013. |
U.S. Appl. No. 12/758,047 Office Action dated Apr. 25, 2013. |
U.S. Appl. No. 12/844,864 Office Action dated Apr. 11, 2013. |
Number | Date | Country | |
---|---|---|---|
20100290698 A1 | Nov 2010 | US |
Number | Date | Country | |
---|---|---|---|
60944807 | Jun 2007 | US |