The present invention relates generally to methods and systems for mapping of three-dimensional (3D) objects, and specifically to optical 3D mapping.
Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object.
Some methods are based on projecting a laser speckle pattern onto the object, and then analyzing an image of the pattern on the object. For example, PCT International Publication WO 2007/043036, whose disclosure is incorporated herein by reference, describes a system and method for object reconstruction, in which a coherent light source and a generator of a random speckle pattern project a coherent random speckle pattern onto the object. An imaging unit detects the light response of the illuminated region and generates image data. Shifts of the pattern in the image of the object relative to a reference image of the pattern are used in real-time reconstruction of a 3D map of the object.
Other methods use pulsed modulation in order to measure time of flight of light from an object to a detector. For example, U.S. Pat. No. 6,100,517, whose disclosure is incorporated herein by reference, describes a camera for creating an image indicating distances to points in objects in a scene. A modulated source of radiation, having a first modulation function, directs radiation toward a scene. An array detector, having a second modulation function, different from the first modulation function, detects radiation from the scene. A processor forms an image having an intensity value distribution indicative of the distance to each of the points in the scene based on the detected radiation.
Yet another method for distance mapping is described by Iizuka in “Divergence-Ratio Axi-Vision Camera (Divcam): A Distance Mapping Camera,” Review of Scientific Instruments 77, 045111 (2006), which is incorporated herein by reference. Two similar infrared (IR) light-emitting diodes (LEDs) are installed for illuminating the same object, one in front of an IR charge-coupled device (CCD) camera and the other behind the camera. One snapshot of the object is taken with only the front LED lit, and another with only the back LED lit. The ratio of the intensities of the two images is used to calculate the distance to the object, based on the decay of the intensities with distance due to the divergence of the light.
The embodiments of the present invention that are described hereinbelow provide methods and apparatus for 3D mapping of an object, based on illuminating the object with two (or more) beams of radiation having different beam characteristics. An image capture assembly captures at least one image of the object while the object is under illumination by the beams. The local differences in the illumination cast on the object by the beams, as captured in the image, are indicative of the distance of each point on the object from a reference point, such as the location of the image capture assembly. These local differences may thus be analyzed in order to generate the 3D map of the object.
The use of multiple beams with different beam characteristics affords substantial flexibility in the design and deployment of the apparatus. This feature obviates the need for multiple light sources to be positioned at different distances from the object, as in some systems that are known in the art. Embodiments of the present invention permit the beams to be generated either by multiple illumination sources or by a single source, and to illuminate the object either sequentially, in separate image frames, or simultaneously in a single image frame, as explained hereinbelow.
There is therefore provided, in accordance with an embodiment of the present invention, a method for mapping an object, including:
illuminating the object with at least two beams of radiation having different beam characteristics;
capturing at least one image of the object under illumination with each of the at least two beams;
processing the at least one image to detect local differences in an intensity of the illumination cast on the object by the at least two beams; and
analyzing the local differences in order to generate a three-dimensional (3D) map of the object.
The at least two beams may have different respective divergence characteristics, different geometrical shapes, or different wavelengths, or may be configured so as to project different patterns on the object.
Illuminating the object may include generating the at least two beams using light originating from a single source. Alternatively, the at least two beams may be generated using light originating from different sources in different respective positions, wherein the different sources may be located at equal respective distances from a plane of the object.
Capturing the at least one image may include capturing a single image under the illumination of both beams or capturing first and second images under illumination of the object by first and second beams, respectively.
In a disclosed embodiment, the at least two beams include first and second beams having known, respective intensity distributions I1(x, y, z) and I2(x, y, z), and capturing the image includes detecting first and second intensity patterns on the object D1(x, y, z) and D2(x, y, z) using an image capture device, wherein z is a distance from the image capture device to each point (x,y) on the object, and analyzing the local differences includes inverting an equation of the form

D1(x, y, z)/D2(x, y, z) = I1(x, y, z)/I2(x, y, z)

in order to generate the 3D map of z as a function of (x,y).
In some embodiments, the image processor is configured to analyze the local differences in the intensity of the illumination cast on the object by the at least two beams in order to generate the 3D map while canceling ambiguities due to ambient light reflected from the object. In one embodiment, the at least two beams include three beams, and the image processor is configured to use the at least one image captured under the illumination with each of the three beams to cancel the ambiguity.
In some embodiments, capturing the at least one image includes capturing a succession of images while the object is moving, and analyzing the local differences includes mapping a 3D movement of the object. In one embodiment, the object is a part of a human body, and the 3D movement includes a gesture made by the part of the human body, and mapping the 3D movement includes providing an input to a computer application responsively to the gesture.
There is also provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including:
an illumination assembly, which is configured to illuminate the object with at least two beams of radiation having different beam characteristics;
an image capture assembly, which is configured to capture at least one image of the object under illumination with each of the at least two beams; and
an image processor, which is configured to process the at least one image to detect local differences in an intensity of the illumination cast on the object by the at least two beams, and to analyze the local differences in order to generate a three-dimensional (3D) map of the object.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
The multiple beams projected by device 22 have different beam characteristics, which result in distinguishably different aspects of the illumination that is cast on the object by the different beams. For example, the beams may have different divergence characteristics, meaning that the intensity of one beam as a function of distance from device 22 drops at a different rate than that of another beam. Additionally or alternatively, the beams may have different geometrical shapes, may otherwise project different sorts of patterned light, such as a speckle pattern, or may have different wavelengths. These differences make it possible to distinguish the illumination due to each of the beams even when they are projected simultaneously. It is not necessary for the operation of device 22 that the beams be generated by different light sources at different distances from the plane of object 28 (in contrast to the system described in the above-mentioned article by Iizuka, for example). In fact, the beams may be generated by a single source using suitable optics to create the multiple beams.
An image processor 24 processes image data generated by device 22 in order to reconstruct a 3D map of object 28. The term “3D map” refers to a set of 3D coordinates representing the surface of the object. The derivation of such a map based on image data is referred to herein as “3D mapping” or equivalently, “3D reconstruction.” Image processor 24 computes the 3D coordinates of points on the surface of object 28 by detecting differences in the intensity of the illumination that is cast on each point of the object by each of the illumination beams. The methods that may be used for this purpose are described hereinbelow with reference to
Image processor 24 may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow. The software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media. Alternatively or additionally, some or all of the functions of the image processor may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although processor 24 is shown in
The 3D map that is generated by processor 24 may be used for a wide range of different purposes. For example, the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object. In the example shown in
An image capture assembly 32 captures an image of the light that is cast by illumination assembly 30 onto object 28 and reflected back by the object. For convenience and clarity of illustration, illumination assembly 30 and image capture assembly 32 are positioned side by side in
Image capture assembly 32 comprises objective optics 40, which focus the image onto an image sensor 42, comprising an array of detector elements 44. Typically, sensor 42 comprises a rectilinear array of detector elements 44, such as a CCD or CMOS-based image sensor array. Assembly 32 may also comprise one or more bandpass filters (not shown in the figures), chosen and positioned so that sensor 42 receives only light in the emission band or bands of light sources 34 and 36, while filtering out ambient light that might otherwise reduce the contrast of the image of the projected pattern that is captured by the sensor.
In the embodiment shown in
For purposes of 3D mapping, image processor 24 determines the distance (z-coordinate) of different points (x,y) on object 28 based on the differences in the illumination cast on and reflected by the object due to sources 34 and 36. For this purpose, it is important that the respective illumination due to each of the sources be distinguishable from that due to the other source. One way to distinguish the sources is to operate them sequentially, so that each image captured by assembly 32 is due to only one of the two sources. Sequential operation, however, may be problematic when object 28 is moving from frame to frame.
Therefore, in some embodiments, sources 34 and 36 operate simultaneously, so that each image frame captures light reflected from the object due to each of the sources. In the context of the present patent application and in the claims, the terms “light” and “illumination” refer to optical radiation of any wavelength, including infrared and ultraviolet, as well as visible, radiation. To distinguish the light due to the two sources, each source may radiate light at a different wavelength, and imaging assembly 32 may comprise appropriate means (not shown), such as filters, for optically separating and detecting the images of object 28 at each wavelength. For example, each detector element 44 may have an appropriate color filter, or assembly 32 may comprise two or more image sensors, with a dichroic beamsplitter for casting the light due to each of the sources onto a different sensor.
As another example, one or both of the sources may emit patterned light with a known, respective pattern. Image processor 24 may then extract the pattern from the images captured by imaging assembly 32 and analyze the brightness of the pattern in order to measure the intensity of the illumination due to each source, even in the absence of wavelength discrimination. High-contrast patterns, such as a primary laser speckle pattern or structured light, are advantageous for this purpose. For instance, source 36 could comprise a diffuser generating a laser speckle pattern, while source 34 generates a pattern without spatial modulation, i.e., a constant pattern. The brightness of pixels in dark areas of the speckle pattern in the images captured by assembly 32 will be due almost entirely to source 34. Assuming the reflectivity of object 28 is roughly constant, this brightness may be measured and subtracted out of the pixels in the bright areas of the speckle pattern in order to determine the component of the brightness that is due to source 36. Alternatively or additionally, discrimination between the sources can be achieved by other computational methods known in the art, such as belief propagation or other methods of inference, or coding and decoding techniques similar to those used in some communication applications.
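By way of illustration only, the following Python sketch shows one way the dark areas of the speckle pattern might be used to split a single captured frame into the contribution of an unmodulated flood source and that of a speckle source. The function name, window size, and the local-minimum estimate of the flood level are assumptions made for this sketch, not part of the disclosure.

```python
import numpy as np
from scipy.ndimage import grey_erosion, uniform_filter

def separate_flood_and_speckle(frame, dark_window=15):
    """Split one captured frame into an estimate of the flood (unmodulated)
    contribution and the speckle-source contribution.

    Assumes, as in the text, that pixels in the dark areas of the speckle
    pattern are lit almost entirely by the flood source and that the
    object reflectivity varies slowly over `dark_window` pixels.
    """
    frame = np.asarray(frame, dtype=np.float64)
    # A local minimum approximates the flood-only level, since dark speckle
    # pixels contain essentially no light from the speckle source.
    flood = grey_erosion(frame, size=dark_window)
    # Smooth the estimate so it varies as slowly as the assumed reflectivity.
    flood = uniform_filter(flood, size=dark_window)
    # Whatever exceeds the flood level is attributed to the speckle source.
    speckle = np.clip(frame - flood, 0.0, None)
    return flood, speckle
```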
Thus, assuming beams 37 and 38 from sources 34 and 36 have suitable distinguishing characteristics, image processor 24 is able to extract, even from a single image, the respective light intensities D1(x, y, z) and D2(x, y, z) detected by imaging assembly 32 due to reflection of the two illumination beams from points (x, y, z) on the surface of object 28. The x and y coordinates correspond to pixel coordinates in the image, but the z-coordinate is implicit: it cannot be determined unambiguously from either reflected intensity alone. The z-coordinate is therefore inferred by the image processor from a comparison of the two reflected intensities, as explained below:
Assuming the known beam patterns in space due to sources 34 and 36 may be represented respectively as I1(x, y, z) and I2(x, y, z), the detected intensities will then be given by:

D1(x, y, z) = α(x, y)·I1(x, y, z)/R²(x, y)    (1)

D2(x, y, z) = α(x, y)·I2(x, y, z)/R²(x, y)    (2)
wherein α(x, y) is the reflectivity of each point on object 28 and R(x, y) is the distance from the imaging assembly to each such point. For z>>x,y, R(x, y)≅z(x, y), wherein z is the distance from the image capture assembly to each point (x,y) on the object. Because the area illuminated by beam 37 grows at a faster relative rate than the area illuminated by beam 38 (since source 34 is smaller than source 36), I1(x, y, z) decreases more rapidly with increasing z than does I2(x, y, z).
Comparing equations (1) and (2) permits the effects of the local reflectivity α(x, y) and the range factor R(x, y) to be factored out, resulting in the relation:

D1(x, y, z)/D2(x, y, z) = I1(x, y, z)/I2(x, y, z)    (3)
As long as I1(x, y, z) and I2(x, y, z) are sufficiently different, in terms of their functional dependence on z, image processor 24 can invert this equation to find, for each point (x, y), the value of z(x, y) that appears in equations (1) and (2). In other words, since the intensity of beam 37 in
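As an illustration of this inversion, the following Python sketch assumes the beam intensity distributions I1 and I2 have been calibrated and sampled on a grid of candidate depths (a hypothetical calibration not described here); for each pixel it selects the depth whose predicted ratio I1(z)/I2(z) best matches the measured ratio D1/D2. The grid-search strategy and all names are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def depth_from_ratio(d1, d2, i1_of_z, i2_of_z, z_samples, eps=1e-9):
    """Recover z(x, y) by inverting D1/D2 = I1(z)/I2(z) per pixel.

    d1, d2           : detected-intensity images for beams 1 and 2 (H x W)
    i1_of_z, i2_of_z : calibrated beam intensities, shape (len(z_samples), H, W)
    z_samples        : 1-D array of candidate depths (hypothetical calibration grid)
    """
    d1 = np.asarray(d1, dtype=np.float64)
    d2 = np.asarray(d2, dtype=np.float64)
    measured = d1 / (d2 + eps)                    # D1/D2 at every pixel
    predicted = i1_of_z / (i2_of_z + eps)         # I1(z)/I2(z) at every depth
    # Pick, for every pixel, the depth whose predicted ratio is closest to
    # the measured one; reflectivity and range cancel in the ratio.
    err = np.abs(predicted - measured[None, :, :])
    best = np.argmin(err, axis=0)                 # index into z_samples
    return z_samples[best]
```

For example, `depth_from_ratio(d1, d2, i1_of_z, i2_of_z, np.linspace(0.5, 3.0, 64))` would return a depth map sampled on a 64-step grid between 0.5 and 3.0, in whatever units the hypothetical calibration uses.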
The above derivation assumes that the effect of ambient light, i.e., light other than from sources 34 and 36, can be ignored (due to effective filtering, for example). More generally, the captured intensity is given by IS(x, y, z)+IA(x, y, z), wherein IS(x, y, z) is the reflected source intensity and IA(x, y, z) is the ambient intensity that is reflected from the same point on the object. When ambient light is taken into account, equation (3) becomes:

D1(x, y, z)/D2(x, y, z) = [α(x, y)·I1(x, y, z)/R²(x, y) + IA(x, y, z)] / [α(x, y)·I2(x, y, z)/R²(x, y) + IA(x, y, z)]    (4)
As a result, the ambient light may introduce ambiguities into the depth measurement.
The effect of the ambient light can be canceled out by various techniques. One possible method, for example, is to project three beams of light onto the object and solve the resulting system of two equations for R and IA(x, y, z).
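One way such a three-beam scheme might be realized numerically, assuming a single ambient term common to all three detected images, is sketched below: differencing the images removes IA(x, y, z), the ratio of the differences also removes the reflectivity and range factor, and the remaining single equation in z is inverted by a grid search over hypothetical calibrated depth samples. This is a sketch under those assumptions, not the disclosed method.

```python
import numpy as np

def depth_from_three_beams(d1, d2, d3, i1_of_z, i2_of_z, i3_of_z,
                           z_samples, eps=1e-9):
    """Cancel a common ambient term using images captured under three beams.

    Assumes each detected image obeys Dk = alpha*Ik(z)/R**2 + IA with the
    same ambient term IA for all three beams.  Differencing the images
    removes IA, and the ratio of the differences also removes alpha/R**2,
    leaving one equation in z per pixel, inverted here by a grid search.
    """
    d1, d2, d3 = (np.asarray(d, dtype=np.float64) for d in (d1, d2, d3))
    measured = (d1 - d2) / (d1 - d3 + eps)                       # per pixel
    predicted = (i1_of_z - i2_of_z) / (i1_of_z - i3_of_z + eps)  # per depth
    err = np.abs(predicted - measured[None, :, :])
    return z_samples[np.argmin(err, axis=0)]
```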
Alternatively, even with only two projected beams, it is still possible to discriminate depth in the presence of ambient light by applying suitable inference techniques and assumptions. For example, it can generally be assumed that the level of ambient light varies slowly over the image, and therefore that the ambient light level in one area of the image whose depth is to be computed is approximately equal to the ambient light level that has been inferred or is to be inferred in neighboring areas. The processor may use this assumption in choosing the depth values that result in maximal consistency of ambient levels over the entire image. Various computational techniques that may be used for this purpose are known in the art. One possibility is to apply belief propagation techniques, such as those described by Achan et al., in “Phase Unwrapping by Minimizing Kikuchi Free Energy,” IEEE International Geoscience and Remote Sensing Symposium (Toronto, Canada, June 2002), pages 1738-1740, which is incorporated herein by reference.
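The following sketch is not the belief-propagation method cited above; it merely illustrates the stated assumption with a simple fixed-point iteration: the ambient level implied by the current depth estimate is smoothed over the image, enforcing slow spatial variation, and fed back into the two-beam ratio inversion. The iteration count, window size, and array names are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def depth_with_slow_ambient(d1, d2, i1_of_z, i2_of_z, z_samples,
                            n_iter=5, smooth=31, eps=1e-9):
    """Two-beam depth recovery under slowly varying ambient light (sketch)."""
    d1 = np.asarray(d1, dtype=np.float64)
    d2 = np.asarray(d2, dtype=np.float64)
    ambient = np.zeros_like(d1)
    predicted = i1_of_z / (i2_of_z + eps)        # I1(z)/I2(z), shape (Z, H, W)
    rows, cols = np.indices(d1.shape)
    for _ in range(n_iter):
        # Remove the current ambient estimate and invert the ratio equation.
        ratio = (d1 - ambient) / (d2 - ambient + eps)
        idx = np.argmin(np.abs(predicted - ratio[None]), axis=0)
        i1 = i1_of_z[idx, rows, cols]
        i2 = i2_of_z[idx, rows, cols]
        # Given z, solve Dk = k_fac*Ik(z) + IA for k_fac and IA at every pixel.
        k_fac = (d1 - d2) / (i1 - i2 + eps)
        ambient = d1 - k_fac * i1
        # Enforce the "ambient varies slowly" assumption by smoothing it.
        ambient = uniform_filter(ambient, size=smooth)
    return z_samples[idx]
```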
On the other hand, DOE 58 is designed so that the output of laser 54 forms a beam 62 with a divergence characteristic in the far field that is different from the divergence characteristic in the near field. Thus, as shown in
Alternatively or additionally, a DOE or other suitable optical element may be used to give a change of shape of one of the beams illuminating the object relative to the other beam.
When assembly 70 is used in device 22, image processor 24 is able to compare the intensity of light reflected from object 28 in each of the spots due to beams 78, 80, . . . , to the mean intensity over beam 82 in order to measure the depth of different points on the object, as described above. Alternatively, if an axicon is used in place of lens 74, for example, beams 78, 80, . . . , will define rings, whose radii may be used to determine the distances at which they are incident on the object. Further alternatively or additionally, the distance between pairs of spots (or rings) due to pairs of beams 78, 80, . . . , may be measured in order to compute the depth. Further alternatively, other types of optical elements and filters may be used in modifying the beam characteristics of the beams generated by grating 76.
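As a rough illustration of the spot-spacing variant, the sketch below locates two bright spots in an image and converts their separation to depth by interpolating a calibration table of spacing versus known depth. The text does not specify how such a calibration would be obtained, so the table, the threshold, and the crude spot-splitting scheme are all assumptions for this sketch.

```python
import numpy as np

def depth_from_spot_spacing(image, threshold, calib_spacing, calib_depth):
    """Estimate depth from the spacing between two projected spots.

    `calib_spacing` (strictly increasing, in pixels) and `calib_depth` form
    a hypothetical calibration table measured beforehand.
    """
    image = np.asarray(image, dtype=np.float64)
    ys, xs = np.nonzero(image > threshold)
    # Split the bright pixels into two spots by x position (crude, but
    # adequate when the spots are well separated horizontally).
    split = np.median(xs)
    left = (xs[xs <= split].mean(), ys[xs <= split].mean())
    right = (xs[xs > split].mean(), ys[xs > split].mean())
    spacing = np.hypot(right[0] - left[0], right[1] - left[1])
    # Interpolate the calibration table to convert spacing to depth.
    return float(np.interp(spacing, calib_spacing, calib_depth))
```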
Although a number of specific optical setups are shown and described above in order to illustrate how beams with different beam characteristics may be created for the purpose of 3D mapping, other schemes for creation of multiple, distinguishable beams with different behavior as a function of distance from the beam source will be apparent to those skilled in the art. All such schemes, when combined with the principles of intensity-based 3D mapping that are described above, are considered to be within the scope of the present invention.
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application claims the benefit of U.S. Provisional Patent Application 60/885,899, filed Jan. 21, 2007, whose disclosure is incorporated herein by reference.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/IL2008/000095 | 1/21/2008 | WO | 00 | 7/5/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2008/087652 | 7/24/2008 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4336978 | Suzuki | Jun 1982 | A |
4542376 | Bass et al. | Sep 1985 | A |
4802759 | Matsumoto et al. | Feb 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
5075562 | Greivenkamp, Jr. et al. | Dec 1991 | A |
5483261 | Yasutake | Jan 1996 | A |
5630043 | Uhlin | May 1997 | A |
5636025 | Bieman | Jun 1997 | A |
5835218 | Harding | Nov 1998 | A |
5838428 | Pipitone et al. | Nov 1998 | A |
5856871 | Cabib et al. | Jan 1999 | A |
5909312 | Mendlovic et al. | Jun 1999 | A |
6041140 | Binns et al. | Mar 2000 | A |
6081269 | Quarendon | Jun 2000 | A |
6084712 | Harding | Jul 2000 | A |
6088105 | Link | Jul 2000 | A |
6099134 | Taniguchi et al. | Aug 2000 | A |
6100517 | Yahav et al. | Aug 2000 | A |
6101269 | Hunter et al. | Aug 2000 | A |
6108036 | Harada et al. | Aug 2000 | A |
6167151 | Albeck et al. | Dec 2000 | A |
6259561 | George et al. | Jul 2001 | B1 |
6262740 | Lauer et al. | Jul 2001 | B1 |
6268923 | Michniewicz et al. | Jul 2001 | B1 |
6301059 | Huang et al. | Oct 2001 | B1 |
6438263 | Albeck et al. | Aug 2002 | B2 |
6494837 | Kim et al. | Dec 2002 | B2 |
6495848 | Rubbert | Dec 2002 | B1 |
6686921 | Rushmeier et al. | Feb 2004 | B1 |
6731391 | Kao et al. | May 2004 | B1 |
6741251 | Malzbender | May 2004 | B2 |
6751344 | Grumbine | Jun 2004 | B1 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6803777 | Pfaff et al. | Oct 2004 | B2 |
6813440 | Yu et al. | Nov 2004 | B1 |
6825985 | Brown et al. | Nov 2004 | B2 |
6841780 | Cofer et al. | Jan 2005 | B2 |
6859326 | Sales | Feb 2005 | B2 |
6937348 | Geng | Aug 2005 | B2 |
7006952 | Matsumoto et al. | Feb 2006 | B1 |
7009742 | Brotherton-Ratcliffe et al. | Mar 2006 | B2 |
7013040 | Shiratani | Mar 2006 | B2 |
7076024 | Yokhin | Jul 2006 | B2 |
7112774 | Baer | Sep 2006 | B2 |
7120228 | Yokhin et al. | Oct 2006 | B2 |
7127101 | Littlefield et al. | Oct 2006 | B2 |
7194105 | Hersch et al. | Mar 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7335898 | Donders et al. | Feb 2008 | B2 |
7369685 | DeLean | May 2008 | B2 |
7433024 | Garcia et al. | Oct 2008 | B2 |
7551719 | Yokhin et al. | Jun 2009 | B2 |
7659995 | Knighton et al. | Feb 2010 | B2 |
7751063 | Dillon et al. | Jul 2010 | B2 |
7840031 | Albertson et al. | Nov 2010 | B2 |
20010016063 | Albeck et al. | Aug 2001 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020075456 | Shiratani | Jun 2002 | A1 |
20030048237 | Sato et al. | Mar 2003 | A1 |
20030057972 | Pfaff et al. | Mar 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20040130730 | Cantin et al. | Jul 2004 | A1 |
20040130790 | Sales | Jul 2004 | A1 |
20040174770 | Rees | Sep 2004 | A1 |
20040213463 | Morrison | Oct 2004 | A1 |
20040218262 | Chuang et al. | Nov 2004 | A1 |
20040228519 | Littlefield et al. | Nov 2004 | A1 |
20050052637 | Shaw et al. | Mar 2005 | A1 |
20050200838 | Shaw et al. | Sep 2005 | A1 |
20050200925 | Brotherton-Ratcliffe et al. | Sep 2005 | A1 |
20050231465 | DePue et al. | Oct 2005 | A1 |
20050271279 | Fujimura et al. | Dec 2005 | A1 |
20060072851 | Kang et al. | Apr 2006 | A1 |
20060156756 | Becke | Jul 2006 | A1 |
20070057946 | Albeck et al. | Mar 2007 | A1 |
20070060336 | Marks et al. | Mar 2007 | A1 |
20070165243 | Kang et al. | Jul 2007 | A1 |
20080018595 | Hildreth et al. | Jan 2008 | A1 |
20080031513 | Hart | Feb 2008 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080198355 | Domenicali et al. | Aug 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080240502 | Freedman et al. | Oct 2008 | A1 |
20080247670 | Tam et al. | Oct 2008 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090096783 | Shpunt | Apr 2009 | A1 |
20090183125 | Magal et al. | Jul 2009 | A1 |
20090183152 | Yang et al. | Jul 2009 | A1 |
20090185274 | Shpunt | Jul 2009 | A1 |
20100007717 | Spektor et al. | Jan 2010 | A1 |
20100013860 | Mandella et al. | Jan 2010 | A1 |
20100118123 | Freedman et al. | May 2010 | A1 |
20100128221 | Muller et al. | May 2010 | A1 |
20100177164 | Zalevsky | Jul 2010 | A1 |
20100194745 | Leister et al. | Aug 2010 | A1 |
20100201811 | Garcia et al. | Aug 2010 | A1 |
20100225746 | Shpunt et al. | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100290698 | Shpunt et al. | Nov 2010 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110074932 | Gharib et al. | Mar 2011 | A1 |
20110096182 | Cohen et al. | Apr 2011 | A1 |
20110134114 | Rais et al. | Jun 2011 | A1 |
20110158508 | Shpunt et al. | Jun 2011 | A1 |
20110211044 | Shpunt et al. | Sep 2011 | A1 |
Number | Date | Country |
---|---|---|
19736169 | Aug 1997 | DE |
19638727 | Mar 1998 | DE |
2352901 | Feb 2001 | GB |
62206684 | Sep 1987 | JP |
01-240863 | Sep 1989 | JP |
03-029806 | Feb 1991 | JP |
06-273432 | Sep 1994 | JP |
2001141430 | May 2001 | JP |
2002122417 | Apr 2002 | JP |
2002-213931 | Jul 2002 | JP |
2002-365023 | Dec 2002 | JP |
9303579 | Feb 1993 | WO |
9827514 | Jun 1998 | WO |
9828593 | Jul 1998 | WO |
2005010825 | Feb 2005 | WO |
2007043036 | Apr 2007 | WO |
2007096893 | Aug 2007 | WO |
2007105205 | Sep 2007 | WO |
2007105215 | Sep 2007 | WO |
2008120217 | Oct 2008 | WO |
Number | Date | Country | |
---|---|---|---|
20100020078 A1 | Jan 2010 | US |
Number | Date | Country | |
---|---|---|---|
60885899 | Jan 2007 | US |