The invention belongs to the field of three-dimensional imaging technology, and in particular to a highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and the epipolar constraint.
In the field of three-dimensional imaging, the fast acquisition of high-precision three-dimensional data of target objects has always been an important technical difficulty. In the early days, mechanical three-coordinate measuring machines were used to detect target objects point by point so as to obtain three-dimensional images, but on the one hand this point-by-point contact acquisition technique is extremely inefficient, and on the other hand it may damage the measured object. These disadvantages make the technique difficult to apply in areas such as human body detection and cultural relics protection. Compared with the traditional mechanical three-dimensional image acquisition technique, optical three-dimensional image acquisition has been widely used in scientific research, industrial inspection and other fields due to its non-contact nature and high efficiency. In recent years, with the development of digital projection equipment, fringe projection techniques among optical three-dimensional imaging methods can realize full-field imaging and have become a research hot spot (S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Eng. 48, 133-140 (2010).). At present, the two mainstream techniques in the field of fringe projection are Fourier profilometry (M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Applied Optics 22, 3977-3982 (1983).) and phase shift profilometry (V. Srinivasan, H.-C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Applied Optics 23, 3105-3108 (1984).).
Compared with Fourier profilometry, phase shift profilometry is more suitable for automated high-precision three-dimensional image acquisition due to its insensitivity to ambient light and noise and its computational simplicity. Phase shift profilometry obtains the phase of a measured object by projecting a plurality of (at least three) phase-shifted grating fringes onto the measured object and synchronously acquiring the grating fringes modulated by the measured object with a camera; finally, the three-dimensional image of the measured object is obtained by means of a phase-height mapping relationship. How to ensure higher imaging precision at higher imaging efficiency has always been a focus and difficulty in the field of phase shift profilometry. In general, in phase shift profilometry, the more grating fringes are projected, the more favorable it is for precise phase acquisition, and thereby the more precise the three-dimensional image of the measured object that can be obtained. However, too many grating fringes greatly reduce the efficiency of phase acquisition and thus the speed (efficiency) of three-dimensional image acquisition (Chen Qian; Feng Shijie; Gu Guohua; Zuo Chao; Sun Jiayu; Yu Shiling; Shen Guochen; Li Rubin. A time phase unwrapping method based on dual-frequency three-gray-scale sinusoidal grating fringe projection: China, 201410027275.4. 2013-04-30.). For three-dimensional imaging of dynamic (quasi-static) objects, high imaging efficiency is required in order to ensure correct imaging. While imaging efficiency has no effect on the final imaging accuracy when measuring static objects, high imaging efficiency still has absolute advantages in the three-dimensional data collection of pipelined bulk objects (Long Jiale; Zhang Jianmin; Fan Zhihui. A fast three-dimensional measurement system based on three-wavelength fringe projection: China, 201620177719.7. 2016-09-07.).
Compared with other methods, three-dimensional imaging technology based on phase shift profilometry has great advantages in measurement mode, imaging efficiency and imaging precision, and it has been widely used in fields such as cultural relic protection and human body detection; however, the imaging efficiency and imaging precision must be further improved to overcome the contradiction between the two so that the technology can be more widely applied.
The object of the present invention is to provide a highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and the epipolar constraint, which improves the precision and efficiency of three-dimensional imaging by composite coding and the epipolar constraint.
The technical solution for achieving the object of the present invention is: a highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and the epipolar constraint, which respectively uses a fast imaging mode or a high-precision imaging mode, wherein in the fast imaging mode, two phase maps having different frequencies are obtained by means of four stripe gratings, a high-frequency absolute phase is obtained by means of the epipolar constraint and a left-right consistency test, and the three-dimensional image is obtained by means of a mapping relationship between the phase and three-dimensional coordinates; and in the high-precision imaging mode, two phases having different frequencies are obtained by means of N+2 stripe gratings, a low-frequency absolute phase is obtained by means of the epipolar constraint, the unwrapping of the high-frequency phase is assisted by means of the low-frequency absolute phase so as to obtain the high-frequency absolute phase, and finally the three-dimensional image is obtained by means of the mapping relationship between the phase and the three-dimensional coordinates.
Compared with the prior art, the present invention has significant advantages: (1) The fast imaging mode of the present invention utilizes four grating fringes to obtain a three-dimensional image of the measured object; compared with the prior art, the combination of the four composite coding grating fringes ensures the high efficiency of three-dimensional image measurement. At the same time, the introduction of the epipolar constraint from binocular vision enables the technique to obtain high-precision phases with up to 64 fringe periods by using only the four composite grating fringes, so the high precision of the three-dimensional image is ensured. Finally, since the high-frequency absolute phase is solved directly by means of the epipolar constraint, it does not rely on the low-frequency absolute phase, which avoids the inaccuracy of the three-dimensional image caused by the difference in modulation degree between the two frequency stripe gratings, as shown in the accompanying drawings.
The invention is further described in detail below with reference to the accompanying drawings.
With reference to the accompanying drawings, the fast imaging mode comprises the following steps:
step one, imaging system calibration;
step two, generating, projecting and collecting four dual-frequency grating fringes;
step three, analyzing the grating fringes collected by the left camera and the right camera respectively so as to obtain a set of high-frequency phases and a set of low-frequency phases;
step four, for each point (original point) on the high-frequency phase map of the left camera, searching for its corresponding points in space by using the epipolar constraint, and removing some error points by the depth constraint;
step five, projecting the remaining spatial corresponding points onto the high-frequency and low-frequency phase maps of the right camera, and determining the final corresponding point by the phase difference between the original point and the candidate corresponding points, so as to obtain a high-frequency absolute phase;
step six, acquiring a three-dimensional image according to the absolute phase, so as to realize efficient and precise acquisition of three-dimensional images of dynamic scenes.
the high-precision imaging mode comprises the following steps:
step one, imaging system calibration;
step two, generating, projecting, and acquiring N+2 double-frequency grating stripe patterns;
step three, analyzing the grating fringes collected by the left camera and the right camera respectively so as to obtain a set of high-frequency phases and a set of low-frequency phases;
step four, using the epipolar constraint to search for the corresponding spatial points of each point (original point) on the low-frequency phase map of the left camera, and removing some error points by the depth constraint;
step five, projecting the remaining spatial corresponding points onto the low-frequency phase map of the right camera, and determining the final corresponding point by the phase difference between the original point and the candidate corresponding points, so as to obtain the high-frequency absolute phase through the low-frequency absolute phase;
step six, acquiring a three-dimensional image according to the absolute phase, so as to realize efficient and precise acquisition of three-dimensional images of static scenes.
The process of the two imaging modes is described in detail below.
The flow diagram of the steps of the fast imaging mode of the present invention is shown in the accompanying drawings.
Step one, the imaging system is calibrated.
The imaging system comprises a computer, a left camera, a right camera, and a projector, wherein the left camera, the right camera and the projector are respectively connected to the computer through data lines, and the projector is connected with the left camera and the right camera through trigger lines. After the imaging system is built, the calibration method described in “A flexible new technique for camera calibration” by Z. Zhang (Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330-1334 (2000).) is used for the imaging system calibration so as to obtain calibration parameters of the left camera, the right camera and the projector in a world coordinate system, wherein the calibration parameters comprise a scaling parameter, a translation parameter, a rotation parameter and a distortion parameter between the pixel coordinate system and the world coordinate system.
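As a brief illustration only (not part of the original disclosure), Zhang's calibration of the two cameras can be carried out with a standard library such as OpenCV; the checkerboard geometry, file names and variable names below are assumptions, and projector calibration is omitted from this sketch.

```python
# Illustrative sketch: stereo calibration of the left/right cameras with
# Zhang's method via OpenCV. Board geometry and image file names are assumed.
import cv2
import numpy as np
import glob

board_size = (9, 6)      # inner corners of the checkerboard (assumption)
square_size = 20.0       # checkerboard square size in mm (assumption)

# 3-D coordinates of the board corners in the board's own frame
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

obj_points, left_points, right_points = [], [], []
image_size = None
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    img_l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    img_r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(img_l, board_size)
    ok_r, corners_r = cv2.findChessboardCorners(img_r, board_size)
    if ok_l and ok_r:
        obj_points.append(objp)
        left_points.append(corners_l)
        right_points.append(corners_r)
        image_size = (img_l.shape[1], img_l.shape[0])

# Intrinsics and distortion of each camera (Zhang's method)
_, K_l, d_l, _, _ = cv2.calibrateCamera(obj_points, left_points, image_size, None, None)
_, K_r, d_r, _, _ = cv2.calibrateCamera(obj_points, right_points, image_size, None, None)
# Rotation R and translation T between the two cameras
_, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
    obj_points, left_points, right_points, K_l, d_l, K_r, d_r, image_size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```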
Step two, four dual-frequency grating fringes are generated, projected and collected. The four dual-frequency grating fringes generated by the computer through MatLab are two sinusoidal stripe gratings and two triangular-wave stripe gratings, and the stripe gratings are as follows:
I1(x,y)=A(x,y)+B(x,y)sin [πFH(2x/W−1)]
I2(x,y)=A(x,y)+B(x,y)cos [πFH(2x/W−1)]
I3(x,y)=A(x,y)+B(x,y)tri[(2FLx/W−1)]
I4(x,y)=A(x,y)−B(x,y)tri[(2FLx/W−1)]
where Ii(x,y) represents the intensity of the grating fringe at the pixel coordinates (x,y) of the generated image, with i=1, 2, 3, 4 denoting the i-th grating fringe image; A is the image DC component, B is the amplitude, and tri is the triangular wave function with values in the interval [−1, 1]; FH and FL are the numbers of fringe periods included in I1, I2 and in I3, I4 respectively; W is the pixel width of the entire grating fringe image; A=B=127.5, the values of FH and FL are respectively 64 and 9, and the range of values for x is 0 to W−1 (an illustrative generation sketch is given at the end of this step). The gratings are synchronously collected by the left camera and the right camera after being projected by the projector. For the sake of simplicity, only the left camera is analyzed here; the analysis process of the right camera is the same as that of the left camera. The grating fringes collected by the left camera are as follows:
I1c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)sin ΦH(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
I2c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)cos ΦH(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
I3c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)ΦL(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
I4c(xc,yc)=α(xc,yc)[A(xc,yc)−B(xc,yc)ΦL(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
where Iic(xc,yc) is the grating fringe image actually captured by the left camera, i=1, 2, 3, 4, (xc,yc) is the pixel coordinates of the image captured by the camera, α is the surface reflectance of the measured object, β1 is the reflected ambient light, β2 is the ambient light directly entering the camera, ΦH(xc,yc) is the phase included in the grating fringe diagrams I1c and I2c, and ΦL(xc,yc) is the phase included in the grating fringe diagrams I3c and I4c; assuming
Ac=α(xc,yc)A(xc,yc)+α(xc,yc)β1(xc,yc)+β2(xc,yc),Bc=α(xc,yc)B(xc,yc),
and leaving out (xc,yc) the above four equations can be reduced to:
I1c=Ac+Bc sin ΦH
I2c=Ac+Bc cos ΦH
I3c=Ac+BcΦL
I4c=Ac−BcΦL
The acquisition process of the right camera is the same as the acquisition process of the left camera.
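As a brief illustration only (not part of the original disclosure), the four composite patterns I1 to I4 of this step can be generated, for example, with NumPy, mirroring the formulas above; the projector resolution and the exact triangular-wave convention are assumptions.

```python
# Illustrative sketch: generating the four dual-frequency composite patterns.
import numpy as np

W, H = 912, 1140          # projector resolution (assumption)
A = B = 127.5
F_H, F_L = 64, 9          # fringe periods of the high/low frequency patterns

x = np.arange(W)                      # x ranges over 0 .. W-1
u = 2.0 * x / W - 1.0                 # normalized coordinate in [-1, 1)

def tri(t):
    """Symmetric triangular wave with values in [-1, 1] and period 2 (assumed convention)."""
    return 2.0 * np.abs(2.0 * (t / 2.0 - np.floor(t / 2.0 + 0.5))) - 1.0

row1 = A + B * np.sin(np.pi * F_H * u)
row2 = A + B * np.cos(np.pi * F_H * u)
row3 = A + B * tri(2.0 * F_L * x / W - 1.0)
row4 = A - B * tri(2.0 * F_L * x / W - 1.0)

# Each pattern is constant along y, so replicate the row over H lines
patterns = [np.tile(r, (H, 1)).astype(np.float32) for r in (row1, row2, row3, row4)]
```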
Step three, the grating fringes collected by the left camera and the right camera are analyzed respectively to obtain a set of high-frequency phases and a set of low-frequency phases.
According to the image acquired by the left camera in step two, two sets of phases are obtained as follows:
where ϕH is a high-frequency wrapped phase and ϕL is a low-frequency wrapped phase, and the same steps as those of the left camera can be used to determine that the phases corresponding to the stripe gratings collected by the right camera are ϕH′,ϕL′.
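The exact phase-recovery formulas are not reproduced in the text above; as one plausible reconstruction derived only from the simplified model I1c to I4c given in step two, the wrapped quantities can be computed as in the following sketch. The handling of the low-frequency triangular code is an assumption.

```python
# Illustrative sketch: recovering the wrapped phases from the four captured
# images, assuming I1c = Ac + Bc*sin(PhiH), I2c = Ac + Bc*cos(PhiH),
# I3c = Ac + Bc*PhiL and I4c = Ac - Bc*PhiL.
import numpy as np

def wrapped_phases(I1c, I2c, I3c, I4c):
    Ac = 0.5 * (I3c + I4c)                       # DC term from the +/- triangular pair
    phi_H = np.arctan2(I1c - Ac, I2c - Ac)       # high-frequency wrapped phase in (-pi, pi]
    Bc = np.sqrt((I1c - Ac) ** 2 + (I2c - Ac) ** 2) + 1e-12
    # Normalized low-frequency code in [-1, 1]; converting it into the
    # low-frequency wrapped phase depends on the triangular-wave convention.
    phi_L_code = (I3c - I4c) / (2.0 * Bc)
    return phi_H, phi_L_code
```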
Step four, the epipolar constraint is used to search, for each point (original point) on the high-frequency phase map of the left camera, for its corresponding points in space, and some error points are removed by means of the depth constraint.
For any point p on ϕH, that is, the original point, its cycle order k has FH possibilities, which means that its absolute phase
ΦH=ϕH+2kπ,k∈[0,FH−1]
has FH different values and from the following formula:
It can be seen that p corresponds to FH points in the three-dimensional space, that is, FH corresponding points, wherein MZ, NZ and CZ are derived from the calibration parameters obtained in step one (K. Liu, Y. Wang, et al., “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Optics Express 18(5), 5229-5244 (2010).). There is at most one correct corresponding point among the FH corresponding points, and the key to three-dimensional imaging is to identify the only correct corresponding point among them. Considering the limited measurement space range in an actual three-dimensional imaging system, the effective range of the left camera, the right camera and the projector is preset as (Zmin, Zmax), for example Zmin=−200 mm and Zmax=200 mm; every k which makes Z exceed this preset range, together with its corresponding spatial point, is confirmed as an error point and eliminated. After this step, the number of possible cycle orders of p and of its spatial corresponding points is reduced to FH′, where FH′<<FH.
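As a brief illustration only, the depth-constraint filtering of this step might look like the following sketch; phase_to_depth stands in for the calibration-based mapping (MZ, NZ, CZ) referenced above, whose exact form is not reproduced here.

```python
# Illustrative sketch: keeping only the cycle orders whose candidate depth
# falls inside the preset measurement range (Zmin, Zmax).
import numpy as np

def candidate_cycle_orders(phi_H, F_H, phase_to_depth, z_min=-200.0, z_max=200.0):
    """Return (k, Phi_H, Z) for every cycle order k that survives the depth constraint."""
    survivors = []
    for k in range(F_H):
        Phi_H = phi_H + 2.0 * np.pi * k      # candidate absolute phase
        Z = phase_to_depth(Phi_H)            # candidate depth along the epipolar line
        if z_min < Z < z_max:
            survivors.append((k, Phi_H, Z))
    return survivors
```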
Step five, the remaining spatial corresponding points are projected onto the high-frequency and low-frequency phase maps of the right camera, and the final corresponding point is determined by the phase difference between the original point and the corresponding points, thus obtaining the high-frequency absolute phase.
First of all, the remaining spatial points in (Zmin, Zmax) of step four are projected onto the imaging surface of the right camera, and FH′ two-dimensional corresponding points on the imaging surface of the right camera are obtained. In fact, the wrapped phases ϕH′, ϕL′ of the correct corresponding point p′ and the wrapped phases ϕH, ϕL of the original point should be very close, so that the two-dimensional corresponding points whose ϕdiff exceeds the threshold of 0.5 rad are further excluded by the formula
ϕdiff=ϕH(p)−ϕH′(p′)
(rad is the radian unit, and the threshold is determined in advance), where p′ is the corresponding point of p on the right camera and ϕdiff is the difference between the original point and the corresponding point in the wrapped phases ϕH and ϕH′; the range of the correct corresponding points is thus reduced from FH′ to FH″. Finally, the corresponding point that makes |ϕL(p)−ϕL′(p′)| the smallest is selected among the remaining FH″ corresponding points and considered to be the correct corresponding point; the cycle order k corresponding to this point is then the correct cycle order, and thus the unique ΦH of the original point is confirmed.
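As a brief illustration only, the selection of the final corresponding point can be sketched as follows; project_to_right (a 3-D point to right-image pixel mapping) and the phase-map lookups are hypothetical helpers, while the 0.5 rad threshold follows the text.

```python
# Illustrative sketch: left-right consistency test on the high-frequency phase,
# then smallest low-frequency mismatch among the survivors.
import numpy as np

def wrapped_diff(a, b):
    """Difference of two wrapped phases, folded into (-pi, pi]."""
    return np.angle(np.exp(1j * (a - b)))

def select_correspondence(phi_H_p, phi_L_p, candidates, project_to_right,
                          phi_H_right, phi_L_right, thresh=0.5):
    best, best_err = None, np.inf
    for k, Phi_H, point_3d in candidates:        # output of the depth-constraint step
        xr, yr = project_to_right(point_3d)      # assumed to return integer pixel indices
        if abs(wrapped_diff(phi_H_p, phi_H_right[yr, xr])) > thresh:
            continue                             # high-frequency consistency test failed
        err = abs(phi_L_p - phi_L_right[yr, xr])
        if err < best_err:
            best, best_err = (k, Phi_H), err     # smallest low-frequency mismatch wins
    return best                                  # correct cycle order and absolute phase
```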
Step six, acquiring a three-dimensional image according to the absolute phase.
The three-dimensional image coordinates are obtained by combining the absolute phase ΦH obtained in step five with the following formula:
where EX, FX, EY, FY are obtained from the calibration parameters in step one (K. Liu, Y. Wang, et al., “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Optics Express 18(5), 5229-5244 (2010).), and Xp, Yp, Zp are the three-dimensional coordinates of the measured object; thus the three-dimensional image of the measured object can be obtained. It can be seen from the above steps that the fast imaging mode of the present invention utilizes four grating fringes to obtain a three-dimensional image of the measured object; compared with the prior art, the combination of the four composite coding grating fringes ensures the high efficiency of three-dimensional image measurement. At the same time, the introduction of the epipolar constraint from binocular vision enables the technique to obtain high-precision phases with up to 64 fringe periods by using only the four composite grating fringes, so the high precision of the three-dimensional image is ensured. Finally, since the high-frequency absolute phase is solved directly by means of the epipolar constraint, it does not rely on the low-frequency absolute phase, which avoids the inaccuracy of the three-dimensional image caused by the difference in modulation degree between the two frequency stripe gratings.
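The exact phase-to-coordinate mapping using EX, FX, EY, FY is not reproduced in the text above. As a generic alternative illustration only (not the formula of the disclosure), once the correct corresponding point p′ is known, the three-dimensional coordinates can also be recovered by standard stereo triangulation with the calibrated projection matrices:

```python
# Illustrative sketch: generic stereo triangulation of the matched pair
# (p in the left image, p' in the right image). P_left and P_right are the
# 3x4 projection matrices from the calibration of step one.
import cv2
import numpy as np

def triangulate(p_left, p_right, P_left, P_right):
    pl = np.asarray(p_left, dtype=np.float64).reshape(2, 1)
    pr = np.asarray(p_right, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pl, pr)   # 4x1 homogeneous point
    return (X_h[:3] / X_h[3]).ravel()                       # (Xp, Yp, Zp)
```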
In order to test the effect of a highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and the epipolar constraint, the present invention gives two sets of test results.
It can be seen from these experimental results that the fast imaging mode of the present invention retains more detailed measurement results than the prior art, and the imaging precision is greatly improved while ensuring high-efficiency three-dimensional imaging.
The flow diagram of the steps of the high-precision imaging mode of the present invention is shown in the accompanying drawings.
Step one, the imaging system is calibrated.
The imaging system comprises a computer, a left camera, a right camera, and a projector, wherein the left camera, the right camera and the projector are respectively connected to the computer through data lines, and the projector is connected with the left camera and the right camera through trigger lines. After the imaging system is built, the calibration method described in “A flexible new technique for camera calibration” by Z. Zhang (Z. Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330-1334 (2000).) is used for the imaging system calibration so as to obtain calibration parameters of the left camera, the right camera and the projector in a world coordinate system.
Step two, N+2 double-frequency grating stripe patterns are generated, projected, and acquired.
The N+2 stripe gratings are two low-frequency sinusoidal stripe gratings plus N high-frequency sinusoidal stripe gratings, where N≥3; for the sake of simplicity, taking N=3 as an example, the stripe gratings are as follows:
J1(x,y)=A(x,y)+B(x,y)cos [πNH(2x/W−1)]
J2(x,y)=A(x,y)+B(x,y)cos [πNH(2x/W−1)+2π/3]
J3(x,y)=A(x,y)+B(x,y)cos [πNH(2x/W−1)+4π/3]
J4(x,y)=A(x,y)+B(x,y)sin [πNL(2x/W−1)]
J5(x,y)=A(x,y)+B(x,y)cos [πNL(2x/W−1)]
where Ji(x,y) represents the intensity of the grating fringe at the pixel coordinates (x,y) of the generated image, with i=1, 2, 3, 4, 5 denoting the i-th grating fringe image; A is the DC component of the image, B is the amplitude, NH and NL are the numbers of fringe periods included in J1˜J3 and in J4, J5 respectively, W is the pixel width of the entire grating fringe image, A=B=127.5, the values of NH and NL are respectively 128 and 8, and the range of values for x is 0 to W−1. The gratings are synchronously collected by the left camera and the right camera after being projected by the projector. For the sake of simplicity, only the left camera is analyzed here; the analysis process of the right camera is the same as that of the left camera. The grating fringes collected by the left camera are as follows:
J1c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)cos ΨH(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
J2c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)cos(ΨH(xc,yc)+2π/3)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
J3c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)cos(ΨH(xc,yc)+4π/3)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
J4c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)sin ΨL(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
J5c(xc,yc)=α(xc,yc)[A(xc,yc)+B(xc,yc)cos ΨL(xc,yc)]+α(xc,yc)β1(xc,yc)+β2(xc,yc)
where Jic(xc,yc) is a grating fringe image actually captured by the left camera, i=1, 2, 3, 4, 5, (xc,yc) is the pixel coordinates of the image captured by the camera, α is the surface reflectance of the measured object, β1 is the reflected ambient light, β2 is the ambient light directly entering the camera, ΨH(xc,yc) is the phase included in the grating fringe diagrams J1c to J3c, and ΨL(xc,yc) is the phase included in the grating fringe diagrams J4c and J5c; assuming
Ac=α(xc,yc)A(xc,yc)+α(xc,yc)β1(xc,yc)+β2(xc,yc),Bc=α(xc,yc)B(xc,yc),
and leaving out (xc,yc), the above five equations can be reduced to:
J1c=Ac+Bc cos ΨH
J2c=Ac+Bc cos(ΨH+2π/3)
J3c=Ac+Bc cos(ΨH+4π/3)
J4c=Ac+Bc sin ΨL
J5c=Ac+Bc cos ΨL
The acquisition process of the right camera is the same as that of the above left camera.
Step three, the grating fringes collected by the left camera and the right camera respectively are analyzed so as to obtain a set of high-frequency phases and a set of low-frequency phases, where the phases of the left camera are as follows:
where ψH is a high-frequency wrapped phase and ψL is a low-frequency wrapped phase; the same steps as those of the left camera can be used to determine that the phases corresponding to the stripe gratings collected by the right camera are ψH′, ψL′.
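The phase-recovery formulas are not reproduced in the text above; one plausible reconstruction, given as an assumption-laden sketch, follows the standard three-step phase-shift formula for J1c to J3c and reuses their DC term for the two low-frequency frames.

```python
# Illustrative sketch: wrapped phases of the high-precision mode, assuming the
# captured model J1c..J3c (shifts 0, 2*pi/3, 4*pi/3) and J4c = Ac + Bc*sin,
# J5c = Ac + Bc*cos given above.
import numpy as np

def wrapped_phases_hp(J1c, J2c, J3c, J4c, J5c):
    # Three-step phase shift: psi_H = atan2(sqrt(3)*(J3c - J2c), 2*J1c - J2c - J3c)
    psi_H = np.arctan2(np.sqrt(3.0) * (J3c - J2c), 2.0 * J1c - J2c - J3c)
    Ac = (J1c + J2c + J3c) / 3.0                 # DC term (assumed shared by J4c, J5c)
    psi_L = np.arctan2(J4c - Ac, J5c - Ac)       # low-frequency wrapped phase
    return psi_H, psi_L
```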
Step four, the epipolar constraint is used to search for the corresponding spatial points of each point (original point) on the low-frequency phase map of the left camera, and some error points are eliminated by means of the depth constraint.
For any point q on ψL, that is, the original point, its cycle order l has NL possibilities, which means that its absolute phase
ΨL=ψL+2lπ,l∈[0,NL−1]
has NL different values and from the following formula:
It can be seen that q corresponds to NL points in the three-dimensional space, that is, NL corresponding points, wherein MZ, NZ and CZ are derived from the calibration parameters obtained in step one (K. Liu, Y. Wang, et al., “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Optics Express 18(5), 5229-5244 (2010).); the key to three-dimensional imaging is to identify the only correct corresponding point among the NL corresponding points. Considering the limited measurement space range in an actual three-dimensional imaging system, the effective range of the left camera, the right camera and the projector is preset as (Zmin, Zmax), for example Zmin=−200 mm and Zmax=200 mm; every l which makes Zq exceed this preset range, together with its corresponding spatial point, is confirmed as an error point and eliminated. After this step, the number of possible cycle orders of q and of its spatial corresponding points is reduced to NL′, where NL′<<NL.
Step five, the remaining spatial corresponding points are projected onto the low-frequency phase map of the right camera, the final corresponding point is determined by the phase difference between the original point and the corresponding points, and the high-frequency absolute phase is obtained by means of the low-frequency absolute phase.
First of all, the remaining spatial points in (Zmin, Zmax) of step four are projected onto the imaging surface of the right camera, and NL′ two-dimensional corresponding points on the imaging surface of the right camera are obtained.
In fact, the wrapped phase ψL′ of the correct corresponding point q′ and the wrapped phase ψL of the original point should be very close; since NL′<<NL, the corresponding point making |ψL(q)−ψL′(q′)| the smallest can be selected from the NL′ corresponding points and considered to be the correct corresponding point. The cycle order l of this correct point is then the correct cycle order, so that the unique ΨL of the original point is confirmed.
Finally, the final high-frequency absolute phase ΨH is obtained by means of the following equation:
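The equation itself is not reproduced in the text above; the commonly used rounding form of this low-frequency-assisted unwrapping, given here only as an illustrative sketch, is:

```python
# Illustrative sketch: unwrapping the high-frequency phase with the
# low-frequency absolute phase (NH = 128, NL = 8 per step two).
import numpy as np

def unwrap_high(psi_H, Psi_L, N_H=128, N_L=8):
    # Scale the low-frequency absolute phase up to the high-frequency fringe
    # density, then pick the integer cycle order that best explains psi_H.
    k = np.round((Psi_L * (N_H / N_L) - psi_H) / (2.0 * np.pi))
    return psi_H + 2.0 * np.pi * k               # high-frequency absolute phase Psi_H
```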
Step six, a three-dimensional image is acquired according to the absolute phase.
The coordinates of the three-dimensional image are obtained by combining the absolute phase ΨH obtained in step five with the following formula:
where EX, FX, EY, FY are obtained from the calibration parameters in step one (K. Liu, Y. Wang, et al., “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Optics Express 18(5), 5229-5244 (2010).), and Xq, Yq, Zq are the three-dimensional coordinates of the measured object; thus the three-dimensional image of the measured object is obtained.
From the above steps, it can be seen that the high-precision imaging mode of the invention acquires the three-dimensional image of the measured object through N+2 grating fringes. Compared with the prior art, the introduction of the epipolar constraint enables the 8-period low-frequency absolute phase to be obtained directly through only two low-frequency grating fringes, which greatly reduces the redundancy of prior-art approaches that use multi-frame grating fringes (usually far more than two) to obtain the low-frequency absolute phase, thus improving measurement efficiency. On the other hand, the N coding patterns with 128 fringe periods ensure that the precision of the finally acquired three-dimensional image is not lower than that of the prior art.
In order to test the effect of a highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and the epipolar constraint, one set of experimental results is presented.