Three-dimensional (3D) capture systems can be used to capture the surface structure of a wide range of 3D objects, which can vary from the very small to the medium or large.
Traditional 3D capture systems may use a digital light processing (DLP) projector to project a sequence of phase shifted, high spatial frequency sine waves which are captured by a single camera. Analysis of the intensities observed at each point in the image sequence allows the overall phase to be recovered. Such projectors use large fans to dissipate heat, along with control boards for managing the projection sequence and high-power illumination sources.
Various features of certain examples will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example only, a number of features, and wherein:
In the following description, for purposes of explanation, numerous specific details of certain examples are set forth. Reference in the specification to “an example” or similar language means that a particular feature, structure, or characteristic described in connection with the example is included in at least that one example, but not necessarily in other examples.
According to an example, there is provided a low-cost method to capture 3D surface structure data based on phase shift structured light. Use of a DLP projector is replaced by a single low-cost binary mask, such as a square wave grating, one or more illumination sources, such as LEDs, and a defocusing element, such as a defocused lens.
In an example, phase shifting can be achieved by either moving the grating or switching/moving the illumination source. This results in a significantly lower-cost system with the potential to reduce the overall device size, weight and power consumption, which makes the 3D scanner more portable and more energy efficient. Temperature instability in a DLP projector changes the projection characteristics, which increases scanner error. As the projector used in the present approach is much simpler and more efficient than a DLP, this effect is negligible.
A binary mask 105 is used to generate a binary pattern. In the example of
Multiple such assemblies can be provided according to an example in order to illuminate the surface 111 of an object 113 from different angles.
In the example of
In the example of
As an example, a 16 mm radius fringe with 16 line pairs can be generated.
According to an example, a series of fringe patterns can be projected on to an object and captured by a camera 115. The phase distribution that contains the object's height information can then be calculated by analysis of the images. Thus, according to an example, the generated fringe pattern is shifted in order to use a phase shifting method to determine a surface structure of an object.
In an example, a fringe pattern can be shifted by moving the binary mask 105 in order to generate shifted fringes. For example, a manual XYZ stage 117 can be used to generate the required fringe and move the mask 105.
As an illustrative example, the area of the generated fringe pattern can be 4 mm×4 mm with 16 line pairs in this area. After (or before) defocusing by the element 107, one period of the fringe is calculated as 250 um (4 mm/16). Four images, which are 90 degrees phase shifted from each other, can be generated. As one period of the fringe is 250 um, three 62.5 um movements of the mask 105 in the X direction using stage 117 provide four successive 90° phase shifted images, I1 to I4. Other phase increments can be used, e.g. 45°, 22.5° and so on.
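By way of illustration only, the following sketch computes the fringe period and the mask translation per phase step for the example above; the values are illustrative and not tied to any particular hardware.

```python
# Illustrative calculation of the mask translation per phase step for
# N-step phase shifting, using the example values given above.

fringe_area_mm = 4.0   # width of the patterned area on the mask 105
line_pairs = 16        # number of line pairs across that width
n_steps = 4            # number of phase shifted images (90 degree increments)

period_um = fringe_area_mm * 1000.0 / line_pairs   # one fringe period
step_um = period_um / n_steps                       # mask movement per phase step
phase_increment_deg = 360.0 / n_steps

print(f"fringe period: {period_um:.1f} um")                                          # 250.0 um
print(f"mask step for a {phase_increment_deg:.0f} degree shift: {step_um:.1f} um")   # 62.5 um
```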
According to an example, the intensity at a pixel location c in the nth phase shifted image captured by the camera can be expressed as:
Inc = Ac + Bc cos(ϕ + δn)
where Ac is the background intensity, Bc is the intensity modulation, ϕ is the unknown depth-dependent phase value for each pixel and δn is the user-imposed phase shift value for the nth image.
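To make the above intensity model concrete, the following minimal sketch (Python, for illustration only) generates a set of synthetic phase shifted fringe images from that expression; the background, modulation and phase values are arbitrary.

```python
import numpy as np

# Synthetic phase shifted images following I_n = A + B*cos(phi + delta_n) per pixel.
# A (background), B (modulation) and phi (depth dependent phase) are illustrative.
h, w = 480, 640
A = 0.5 * np.ones((h, w))                                # background intensity
B = 0.4 * np.ones((h, w))                                # intensity modulation
carrier = np.tile(np.linspace(0, 8 * np.pi, w), (h, 1))  # fringes across the image
phi = carrier + 0.3 * np.random.rand(h, w)               # phase to be recovered later

deltas = np.deg2rad([0, 90, 180, 270])                   # user imposed phase shifts
images = [A + B * np.cos(phi + d) for d in deltas]       # I1, I2, I3, I4
```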
In an example, the desired phase information can be calculated by a processor according to:
ϕ = −arctan [ Σn Inc sin(δn) / Σn Inc cos(δn) ]
where the sums run over the N equally spaced phase shifted images.
Referring to the above where four phase shifted fringe patterns at δn = 0°, 90°, 180° and 270° are provided, this means that, in that example:
ϕ = arctan [ (I4 − I2) / (I1 − I3) ]
evaluated at each pixel.
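A minimal implementation sketch of the four-step relation above is given below (Python, for illustration only); it returns the wrapped phase, and any phase unwrapping step is omitted. It can be applied per pixel to image arrays such as those generated in the previous sketch.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four images shifted by 0, 90, 180 and 270 degrees.

    Implements phi = arctan((I4 - I2) / (I1 - I3)); arctan2 keeps the correct
    quadrant and tolerates a zero denominator. Works per pixel on image arrays.
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Quick self-check against the intensity model I_n = A + B*cos(phi + delta_n):
phi_true, A, B = 1.0, 0.5, 0.4
i1, i2, i3, i4 = (A + B * np.cos(phi_true + d) for d in np.deg2rad([0, 90, 180, 270]))
assert abs(four_step_phase(i1, i2, i3, i4) - phi_true) < 1e-9
```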
According to an example, a direct mapping between the recovered phase ϕ and the 3D coordinates of the object can be derived through a calibration process using a planar calibration plate measured at a sequence of carefully controlled depths.
That is, in an example, a flat calibration surface (e.g. diffuse white) can be used and moved through small increments in the Z axis to capture the phase at incremental changes in height. An alternative is to use a stepped calibration surface that can be moved laterally to simulate effective changes in Z height.
In this way, a given measurement of the phase at a point on the surface of an object will correspond to a specific height of the object surface structure. Accordingly, by determining the phase as described above, it is possible to determine the surface structure of an object.
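A minimal per-pixel calibration sketch is shown below (Python, for illustration only), assuming phase images of the flat plate have been recorded at a set of known heights and that the phase varies monotonically with height at each pixel over the calibrated range; the loop is unoptimised.

```python
import numpy as np

def phase_to_height(phase_stack, heights_mm, measured_phase):
    """Map a measured phase image to height using a per-pixel lookup table.

    phase_stack    : (K, H, W) phase images of a flat plate at K known heights
    heights_mm     : (K,) plate heights at which those phases were captured
    measured_phase : (H, W) phase image of the object being measured

    Assumes phase changes monotonically with height at every pixel.
    """
    heights = np.asarray(heights_mm, dtype=float)
    k, h, w = phase_stack.shape
    height_map = np.empty((h, w))
    for r in range(h):
        for c in range(w):
            p = phase_stack[:, r, c]
            order = np.argsort(p)                # np.interp needs increasing x values
            height_map[r, c] = np.interp(measured_phase[r, c], p[order], heights[order])
    return height_map
```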
The results show that moving the fringes generated by the mask 105, with an illumination source 103 and a defocusing element 107, can replace the DLP projector for 3D capture. The phase error is directly related to 3D capture resolution and accuracy.
After generating a fringe pattern 109, the shifted images can, in an example, be generated by moving the illumination source 103 instead of moving the mask 105. The illumination source 103 can be shifted using the stage 117. Moving the illumination source 103 by the appropriate amount will shift the images 90° in phase. The desired phase can be calculated using the same formulas as given above.
The phase error obtained when moving the illumination source is compared with that of a DLP projector as shown in
Thus, according to an example, moving the illumination source to generate phase shifted images can replace the traditional projector for 3D capture.
According to an example, another way of changing the position of the illumination source 103 is to use different illumination sources. For example, illumination source 103 can comprise multiple independently addressable sources. Each of these can be switched on/off in sequence to generate phase shifted images. For example, four different illumination sources (LEDs) can be placed in a row to obtain four 90° phase shifted images. The first illumination source can be switched on while the other three are switched off to obtain the first image. A second image which is 90° phase shifted from the first image can be obtained by switching on the second illumination source and switching off the rest of the illumination sources. The third and fourth images can be obtained in a similar way.
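The capture sequence for the switched-LED variant can be sketched as follows (Python, for illustration only); the led_on, led_off and grab_image functions are hypothetical placeholders for whatever LED driver and camera interface the system uses.

```python
def led_on(index):     # hypothetical placeholder: switch LED `index` on
    print(f"LED {index} on")

def led_off(index):    # hypothetical placeholder: switch LED `index` off
    print(f"LED {index} off")

def grab_image():      # hypothetical placeholder: trigger the camera, return a frame
    return None

def capture_phase_shifted_images(led_ids=(0, 1, 2, 3)):
    """Return I1..I4, lighting one LED at a time so each image is 90 degrees shifted."""
    images = []
    for active in led_ids:
        for led in led_ids:
            (led_on if led == active else led_off)(led)  # only one LED lit at a time
        images.append(grab_image())
    return images
```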
When moving the illumination source, the movement is kept small relative to the object distance, as any movement of the source changes the angle of illumination, which in turn has a surface-orientation-dependent effect on the reflectance seen by the camera. For example, the reflectance of a Lambertian surface depends directly on the cosine of the angle between the illumination direction and the surface normal.
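As a small worked check of the Lambertian argument above (illustrative numbers only), a one degree change in illumination angle at a nominal 30° angle to the surface normal changes the reflectance by only about one percent:

```python
import numpy as np

# Lambertian reflectance is proportional to cos(theta), where theta is the angle
# between the illumination direction and the surface normal.
theta_deg = 30.0   # nominal illumination angle for some surface patch (illustrative)
delta_deg = 1.0    # change in illumination angle caused by moving the source

r0 = np.cos(np.deg2rad(theta_deg))
r1 = np.cos(np.deg2rad(theta_deg + delta_deg))
print(f"relative reflectance change: {(r1 - r0) / r0 * 100:.2f}%")   # about -1%
```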
In block 505, the binary pattern is modified using a defocusing element to generate a continuously modulated fringe pattern for illuminating the surface of the object. For example, as described above, a lens can be used so that a substantially sinusoidal wave pattern is provided.
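The following sketch (Python, for illustration only) indicates why defocusing a binary square-wave pattern yields a substantially sinusoidal fringe: modelling the defocus as a Gaussian blur, the third and higher harmonics of the square wave are attenuated far more strongly than the fundamental.

```python
import numpy as np

# Square-wave fringe (binary mask) sampled over several periods.
n, periods = 4096, 16
x = np.arange(n)
square = (np.sin(2 * np.pi * periods * x / n) >= 0).astype(float)

# Defocus modelled as a Gaussian blur with sigma of about a quarter period,
# applied in the frequency domain so the periodic fringe has no edge artefacts.
sigma = (n / periods) / 4.0
freqs = np.fft.rfftfreq(n)                         # cycles per sample
otf = np.exp(-2.0 * (np.pi * sigma * freqs) ** 2)  # Gaussian transfer function
spec_square = np.fft.rfft(square)
blurred = np.fft.irfft(spec_square * otf, n)
spec_blurred = np.fft.rfft(blurred)

# Ratio of third harmonic to fundamental, before and after defocus.
f1, f3 = periods, 3 * periods
print("3rd/1st harmonic, square :", abs(spec_square[f3]) / abs(spec_square[f1]))    # ~0.33
print("3rd/1st harmonic, blurred:", abs(spec_blurred[f3]) / abs(spec_blurred[f1]))  # ~2e-5
```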
In block 603, the intensity of light at multiple positions on the surface of the object at respective different fringe pattern positions is detected. In block 605, a phase value at respective ones of the multiple positions is calculated using the detected intensities of light. In block 607, the calculated phase values are used to generate a measure for the height of the object at respective ones of the multiple positions.
A system as described herein is therefore low cost and lightweight. It also reduces the size of a 3D scanner system, as high-resolution projectors are bulky; the resulting scanner is therefore more portable, enabling small robotic arms, for example, to carry it for automated scanning. Furthermore, power consumption is significantly reduced, allowing the 3D scanner to be hand-held, battery operated and wireless.
The scan area can be increased without increasing the cost significantly, as a larger grating and a more powerful illumination source do not add considerably to the cost. In an example, a dual projector solution (one horizontal and one vertical) can be used to reduce asymmetry of the recovered 3D data at small incremental cost and to overcome occlusions.
A system as described herein in which multiple sources of illumination are used has no moving parts, and provides the potential for high resolution. Multiple devices can be combined to build a scan bar that would enable surface scanning of large areas at high resolution e.g. an MJF (Multi Jet Fusion) print bed.
While the method, apparatus and related aspects have been described with reference to certain examples, various modifications, changes, omissions, and substitutions can be made without departing from the spirit of the present disclosure. In particular, a feature or block from one example may be combined with or substituted by a feature/block of another example.
The word “comprising” does not exclude the presence of elements other than those listed in a claim, “a” or “an” does not exclude a plurality, and a single processor or other unit may fulfil the functions of several units recited in the claims.
The features of any dependent claim may be combined with the features of any of the independent claims or other dependent claims.