This invention relates to a method for determining the 3D coordinates of an object and to an apparatus for carrying out such a method.
In the method, a pattern is projected onto the object. The light reflected by the object is captured and evaluated. The apparatus for carrying out such a method comprises a projector for projecting a pattern onto the object, a camera for recording the object, and an evaluating means for evaluating the recorded shots.
Methods and apparatuses of this kind are already known, for example from DE 10 2006 048 234 A1 and DE 10 2007 042 963 A1. In these methods, a stripe pattern can be projected onto the object. Usually, the stripe pattern is projected onto the object by the method of white-light stripe projection, i.e. with white light.
Since, for determining the 3D coordinates of the object, a single shot is generally not sufficient to satisfy the measurement requirements and/or to completely cover the object, it is necessary to position the pattern projection system at various recording positions in space and to transfer the shots made there into a common, superordinate coordinate system, which can also be referred to as the absolute coordinate system. This process, frequently referred to as “global registration”, requires high accuracy.
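Purely by way of illustration (this sketch is not part of the disclosure, and the names used are assumptions), transferring the points of one shot into the superordinate coordinate system amounts to applying the rigid pose of the sensor for that shot, provided this pose is known:

```python
# Illustrative sketch only: map the points of one shot from the local sensor
# frame into the superordinate (absolute) coordinate system, assuming the rigid
# pose (rotation R, translation t) of the sensor for that shot is known.
import numpy as np

def to_absolute(points_local, R, t):
    """Transform an Nx3 array of points from the sensor frame into the absolute frame."""
    return points_local @ R.T + t

# Hypothetical example: sensor rotated 90 degrees about the z-axis and shifted along x.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([100.0, 0.0, 0.0])
scan = np.array([[10.0, 0.0, 0.0]])   # one measured point in the sensor frame
print(to_absolute(scan, R, t))        # -> [[100.  10.   0.]]
```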
In a previously known method of this kind, shots are made which partly overlap. These shots can be oriented relative to each other via an optimization over the overlap regions. However, with larger objects having little surface structure, this method may not be sufficiently accurate.
Furthermore, methods are known in which reference marks are used which are mounted on and/or beside the object and/or on one or more set-ups surrounding the object. The reference marks are initially calibrated, preferably by the method of photogrammetry. By means of the reference marks, which are detected by a pattern projection system, in particular a stripe projection system, the various shots of the object can be transformed onto the calibrated points, so that a global registration is possible.
From EP 2 273 229 A1 a method is known in which, for determining the 3D coordinates of an object, a stripe pattern is projected onto the object by a projector. The stripe pattern reflected by the object is recorded by a camera which comprises an optical system and an area sensor, in particular a CCD sensor or CMOS sensor. The projector and the camera form a stripe projection system. In the vicinity of the object a plurality of reference set-ups are arranged, each of which includes a plurality of reference marks. The reference set-ups are initially surveyed. Subsequently, the 3D coordinates of the object are determined by the stripe projection system.
From WO 2004/011876 A1 an apparatus according to the generic part of claim 8 is known. The apparatus comprises a projector for projecting a pattern onto the object, a camera for recording the object, and an evaluating means for evaluating the shots of the object. The apparatus furthermore comprises a projection means for projecting reference marks onto the object and a further camera for recording the reference marks.
From DE 195 36 297 A1 a method for the geometrical calibration of optical 3D sensors for the three-dimensional surveying of objects is known, in which a pattern is projected onto the object and the light reflected by the object is recorded and evaluated by a camera. Located in the image field of the camera is a calibration means with at least four calibrated signal marks and a plurality of further signal marks or reference marks, which are recorded by the camera.
From DE 10 2005 051 020 A1 a method for the three-dimensional digitization of bodies is known, in which a camera is moved around the body to be digitized. The body to be digitized is surrounded by a plurality of photogrammetrically evaluable marks. Furthermore, a plurality of light pattern projectors are stationarily mounted around the body and are switched on one after the other.
US 2010/0092041 A1 discloses a method for determining the three-dimensional shape of an object, in which a pattern is projected onto the object and the light reflected by the object is captured and evaluated by a camera. The camera furthermore detects a reference mark beside the object.
It is the object of the invention to propose an improved method as indicated above and an improved apparatus as indicated above.
In a method as indicated above, this object is solved by the features herein. Reference marks on and/or beside the object are recorded by a reference camera. The reference camera has a larger field of view than the camera. The reference camera is connected with the camera or with a 3D sensor which comprises the projector and the camera. The reference marks may already be present on the object. In particular, characteristic regions of the object can be used as reference marks. Alternatively or in addition, reference marks can be mounted on the object and/or positioned beside the object. In particular, those reference marks can be used which are described in DE 10 2009 032 262 A1, to which reference is expressly made here. Thus, reference marks can be used which are inherently coded and/or reference marks which are not inherently coded but which are spatially arranged relative to each other such that this spatial arrangement includes a coding. Several reference marks can be combined into one or more reference set-ups.
It is possible to take several shots of the object with the camera. Several or all shots can overlap. Putting the individual shots together in a superordinate coordinate system can be effected by means of photogrammetry. For this purpose, the reference marks are surveyed by the reference camera.
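Merely as a non-binding sketch of how such a photogrammetric joining of shots might be computed (the function name is an assumption, not part of the disclosure), the rigid transform that maps the reference-mark coordinates measured in one shot onto the calibrated coordinates of the same marks in the superordinate coordinate system can be estimated by a standard least-squares fit (Kabsch/Horn method):

```python
# Illustrative sketch only (Kabsch/Horn fit): estimate the rotation R and translation t
# that map the reference-mark coordinates measured in one shot onto the calibrated
# coordinates of the same marks in the superordinate coordinate system.
import numpy as np

def rigid_transform(marks_local, marks_global):
    """Least-squares R, t such that marks_global ~ marks_local @ R.T + t (both Nx3)."""
    c_l = marks_local.mean(axis=0)
    c_g = marks_global.mean(axis=0)
    H = (marks_local - c_l).T @ (marks_global - c_g)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against an improper (reflected) solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_g - R @ c_l
    return R, t
```

The rotation and translation obtained in this way can then be applied to all measured points of the corresponding shot in order to place them in the superordinate coordinate system.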
In an apparatus as indicated above, the object of the invention is solved by the features herein. The apparatus comprises a reference camera for recording reference marks on and/or beside the object. The reference camera has a larger field of view than the camera. The reference camera is connected with the camera or with a 3D sensor which comprises the projector and the camera.
Preferably, a stripe pattern is projected onto the object. The pattern, in particular the stripe pattern, can be projected onto the object by an image-forming element, in particular by a transparency or by a digital pattern generator, for example a DLP display, an LCD display and/or an LCOS display. Furthermore, it is possible to project a pattern onto the object which consists of one or more laser lines. The laser lines can be oriented parallel to each other and/or at an angle to each other.
The camera can comprise an optical system and an area sensor, in particular a CCD sensor and/or a CMOS sensor. The evaluating means can comprise or consist of a computer, in particular a PC.
The recorded shot data can be stored temporarily in a camera memory. The data can be passed on before or after this temporary storage, or forwarded directly without any temporary storage.
The projector and the camera can be structurally integrated into a so-called 3D sensor, which then comprises both the projector and the camera.
The invention also makes it possible to supplement an existing apparatus for determining the 3D coordinates of an object according to the generic part of claim 8 in a simple way, so that the 3D coordinates of an object which is larger than the field of view of the camera can be determined with this apparatus. Furthermore, it is possible to provide several cameras. For example, a so-called stereo camera, i.e. a combination of two cameras, can be used. It is, however, also possible to provide more than two cameras. By using more than one camera, the allocation of the image points recorded by the cameras to the projected lines can be improved.
Advantageous developments are described herein.
Advantageously, the reference marks are projected onto the object.
According to a further advantageous development, the shots of the camera and the shots of the reference camera are effected with a temporal offset. Alternatively or in addition, the shots of the camera and the shots of the reference camera can be effected at the same time.
Advantageously, several shots of the camera are put together. In addition, matching methods can be employed for putting the shots together.
A further advantageous development is characterized in that the positions of the reference marks are detected and stored in a first measurement run. The first measurement run can be carried out without surveying the object, i.e. without determining its 3D coordinates. It is, however, also possible to survey an object in this first measurement run. The detected positions of the reference marks can be transformed into the superordinate coordinate system. The detected and/or transformed positions can be stored.
It may be advantageous when the 3D coordinates of an object are determined on the basis of stored positions of the reference marks. This is advantageous in particular when the positions of the reference marks have been detected, possibly transformed, and stored in a first measurement run. It is, however, also possible to determine the 3D coordinates of an object on the basis of positions of the reference marks which have been stored beforehand. By using the stored positions of the reference marks, measurement time can be saved. It may, however, also be advantageous to determine the positions of the reference marks anew during one or each determination of the 3D coordinates of an object.
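As a purely illustrative sketch of such a workflow (the file name and data layout are assumptions made for this example, not part of the disclosure), the mark positions found in the first measurement run could be stored and later reloaded as follows:

```python
# Illustrative sketch only: persist the reference-mark positions of a first measurement
# run and reload them in later runs. The file name and the mark-id -> (x, y, z) layout
# are assumptions made for this example.
import json

def store_marks(marks, path="reference_marks.json"):
    """marks: dict mapping mark id to (x, y, z) in the superordinate coordinate system."""
    with open(path, "w") as f:
        json.dump({str(k): list(v) for k, v in marks.items()}, f, indent=2)

def load_marks(path="reference_marks.json"):
    with open(path) as f:
        return {int(k): tuple(v) for k, v in json.load(f).items()}
```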
An advantageous development of the apparatus according to the invention is characterized by a projector for projecting the reference marks onto the object.
An exemplary embodiment of the invention will be explained in detail below with reference to the attached drawing.
A pattern, namely a stripe pattern 6, is projected onto the object 1 by the projector 3. The light reflected by the object 1 in the region of the stripe pattern 6 is captured by the camera 4.
The shot of the camera 4 is forwarded to an evaluating means (not shown in the drawing), by which it is evaluated. From the shot, the 3D coordinates of the object 1 can be determined in the field of view 7 of the camera 4. The contour 7′ of the stripe pattern 6 is larger than the field of view 7 of the camera 4. The contour 7′ encloses the field of view 7 on all sides. The camera 4 includes an area sensor which captures the object 1 preferably in the entire field of view 7 of the camera 4.
Reference marks 8 are projected onto the object 1 and onto the surroundings of the object 1 by a further projector (not shown in the drawing). The reference marks which are located within the field of view 9 of the reference camera 5 are captured by the reference camera 5.
The field of view 9 of the reference camera 5 is larger than the field of view 7 of the camera 4. Furthermore, the field of view 9 of the reference camera 5 encloses the field of view 7 of the camera 4 on all sides. From the reference marks 8 located within its field of view 9, the position and the orientation of the reference camera 5 can be determined.
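By way of illustration only (OpenCV is merely one possible tool and is not prescribed by the disclosure; the function name below is an assumption), the position and orientation of the reference camera 5 could be recovered from the detected reference marks by solving a perspective-n-point problem:

```python
# Illustrative sketch only, using OpenCV's perspective-n-point solver: recover the pose
# of the reference camera from reference marks whose 3D coordinates are known and whose
# image positions were detected in the shot. Calibration data are placeholders.
import numpy as np
import cv2

def reference_camera_pose(object_points, image_points, camera_matrix, dist_coeffs=None):
    """object_points: Nx3 mark coordinates in the superordinate frame (float64),
       image_points:  Nx2 detected mark centres in the reference-camera image (float64)."""
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                 # rotation: superordinate frame -> camera frame
    camera_position = (-R.T @ tvec).ravel()    # camera centre in the superordinate frame
    return R, tvec, camera_position
```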
After the part of the object 1, which is located within the field of view 7 of the camera 4, has been recorded by the camera 4, the apparatus comprising the projector 3, the camera 4 and the reference camera 5 is positioned at another point. This positioning is made such that the field of view 7 of the camera 4 is located at another point of the object 1, preferably at a point beside the original field of view 7. The field of view which the camera 4 covers in its new position can overlap with the original field of view 7 of the camera 4. Due to the fact that the field of view 9 of the reference camera 5 is larger than the field of view 7 of the camera 4, it is ensured that the reference camera 5 in the original position and in the new position can detect a sufficient number of reference marks 8. The number and the positions of the reference marks 8 are chosen such that a sufficient number of reference marks 8 is present, which the reference camera 5 can detect in both positions. In this way, a global registration of the shots of the camera 4 and hence of the 3D coordinates of the object 1 is ensured.
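Again purely as a sketch (4x4 homogeneous matrices; the names are assumptions, not part of the disclosure), the global registration described above can be thought of as follows: because the reference camera 5 is rigidly connected with the camera 4, each pose of the reference camera determined from the reference marks 8 also fixes the pose of the camera 4 in the superordinate coordinate system, so that the points measured at the different positions can be mapped into that common frame:

```python
# Illustrative sketch only (4x4 homogeneous matrices): the fixed, calibrated transform
# between the rigidly connected reference camera and camera turns each reference-camera
# pose into a camera pose, so points measured by the camera at any position can be
# mapped into the superordinate coordinate system.
import numpy as np

def register_shot(points_cam, T_world_refcam, T_refcam_cam):
    """points_cam: Nx3 points in the camera frame; returns the points in the superordinate frame."""
    T_world_cam = T_world_refcam @ T_refcam_cam
    pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (pts_h @ T_world_cam.T)[:, :3]
```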
Due to the fact that the projector 3, which projects the stripe pattern 6 onto the object 1, is connected with the camera 4 and the reference camera 5, the stripe pattern 6 is moved along with a movement of the apparatus which comprises the projector 3, the camera 4 and the reference camera 5.
The size of the field of view of the camera 4 and/or of the reference camera 5 can be determined by the focal length of the associated optics. Alternatively or in addition, the field of view can be determined by the sensor size. The shots of the reference camera 5 can be transmitted to an evaluation system, in particular a PC. From these shots, the 3D coordinates of the reference marks 8 can be calculated mathematically. In this way, photogrammetry points can be obtained. Subsequently, these photogrammetry points and the positions of the reference camera 5 and of the camera 4 calculated by the photogrammetry can be used for accurately aligning and registering the data, i.e. the 3D coordinates.
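For illustration (the numerical values are arbitrary and not taken from the disclosure), the angular field of view follows from the focal length and the sensor size as fov = 2·atan(sensor size / (2·focal length)); a shorter focal length and/or a larger sensor therefore yield the larger field of view required for the reference camera 5:

```python
# Illustrative sketch only: angular field of view from sensor size and focal length.
import math

def field_of_view_deg(sensor_size_mm, focal_length_mm):
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

# Arbitrary example values: the same sensor with a shorter focal length gives the
# larger field of view needed for the reference camera.
print(field_of_view_deg(24.0, 50.0))   # ~27 degrees
print(field_of_view_deg(24.0, 12.0))   # ~90 degrees
```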
For determining the 3D coordinates of the object 1, the positions of the reference marks 8 can be detected and stored in a first measurement run. In a second measurement run, these positions then can be used for determining the 3D coordinates of the object 1. In further measurement runs they can be used for determining the 3D coordinates of the object 1 or for determining the 3D coordinates of further objects.
Number | Date | Country | Kind |
---|---|---|---|
10 2011 114 674 | Sep 2011 | DE | national |
Number | Name | Date | Kind |
---|---|---|---|
4753569 | Pryor | Jun 1988 | A |
5083073 | Kato | Jan 1992 | A |
5175601 | Fitts | Dec 1992 | A |
5198877 | Schulz | Mar 1993 | A |
6026172 | Lewis, Jr. | Feb 2000 | A |
6078846 | Greer et al. | Jun 2000 | A |
6175647 | Schick et al. | Jan 2001 | B1 |
6542249 | Kofman et al. | Apr 2003 | B1 |
8243123 | Geshwind | Aug 2012 | B1 |
20030025788 | Beardsley | Feb 2003 | A1 |
20030079360 | Ziegler | May 2003 | A1 |
20030112448 | Maidhof | Jun 2003 | A1 |
20040143359 | Yogo | Jul 2004 | A1 |
20040234122 | Kochi | Nov 2004 | A1 |
20050174581 | Liu | Aug 2005 | A1 |
20060265177 | Steinbichler | Nov 2006 | A1 |
20070081714 | Wallack | Apr 2007 | A1 |
20080112700 | Foxenland | May 2008 | A1 |
20080201101 | Hebert | Aug 2008 | A1 |
20090067706 | Lapa | Mar 2009 | A1 |
20090080766 | Daxauer | Mar 2009 | A1 |
20090323121 | Valkenburg | Dec 2009 | A1 |
20100092041 | Kim et al. | Apr 2010 | A1 |
20100134598 | St-Pierre | Jun 2010 | A1 |
20100231711 | Taneno | Sep 2010 | A1 |
20100283842 | Guissin | Nov 2010 | A1 |
20110007326 | Daxauer et al. | Jan 2011 | A1 |
20110134225 | Saint-Pierre | Jun 2011 | A1 |
20120120072 | Se et al. | May 2012 | A1 |
20130050410 | Steinbichler | Feb 2013 | A1 |
20130100282 | Siercks | Apr 2013 | A1 |
20130293684 | Becker | Nov 2013 | A1 |
20140168370 | Heidemann | Jun 2014 | A1 |
20140186218 | Sweet et al. | Jul 2014 | A1 |
20140198185 | Haugen | Jul 2014 | A1 |
Number | Date | Country |
---|---|---|
3712958 | Oct 1988 | DE |
19536294 | Apr 1997 | DE |
19536297 | Apr 1997 | DE |
19840334 | Aug 1999 | DE |
102005020844 | Jul 2006 | DE |
102005051020 | Apr 2007 | DE |
102006048234 | Apr 2008 | DE |
102007042963 | Mar 2009 | DE |
102009032771 | Jan 2011 | DE |
102011011360 | Aug 2012 | DE |
1134546 | Sep 2001 | EP |
1189732 | May 2003 | EP |
1593930 | Nov 2005 | EP |
2034269 | Mar 2009 | EP |
2273229 | Jan 2011 | EP |
2400261 | Dec 2011 | EP |
2489977 | Feb 2012 | EP |
201069634 | Aug 2010 | JP |
2011017700 | Jan 2011 | JP |
2012031979 | Feb 2012 | JP |
0227264 | Apr 2002 | WO |
2004011876 | Feb 2004 | WO |
2009086495 | Sep 2009 | WO |
2011160962 | Dec 2011 | WO |
Entry |
---|
S. Jecic, N. Drvar: The Assessment of Structured Light and Laser Scanning Methods in 3D Shape Measurements: 4th International Congress of Croatian Society of Mechanics, Sep. 18-20, 2003, Bizovac, Croatia, pp. 237-244. |
A. Georgopoulos, CH. Ioannidis, A. Valanis: Assessing the Performance of a Structured Light Scanner: International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVIII, Part 5 Commission V Symposium, Newcastle upon Tyne, UK 2010, pp. 250 to 255. |
K. Al-Manasirr, C.S. Fraser: Registration of Terrestrial Laser Scanner Data Using Imagery: ASPRS 2006, Annual Conference Reno, Nevada, May 1-5, 2006. |
Annex bundle regarding prior public use of ATOS III+ Triple Scanner—MV700 as per order No. GOMI 10401A, delivered on Apr. 29, 2011. |
Opposition to DE 10 2011 114 674.5 B4 dated Sep. 30, 2016. |
Opposition dated Oct. 28, 2015. |
Addendum to the Summons in Respect of File Reference 10 2011 114 674.5—Date of Hearing Sep. 21, 2017. |
Summons dated Jul. 16, 2014. |
Communication enclosing the European Search Report from the European Patent Office dated Mar. 6, 2013. |
Communication Pursuant to Article 94(3) EPC. |
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC dated Nov. 30, 2016. |
Examination Request dated Oct. 25, 2011. |
Japanese Office Action dated May 9, 2017. |
Decision announced at the hearing on Sep. 21, 2017. |
Minutes of the public hearing in the opposition proceedings dated Sep. 21, 2017. |
Related U.S. Appl. No. 13/397,056. |
Opposition dated Sep. 8, 2018 of EP2574876B1 with English translation thereof. |
Correspondence Table between documents cited in the Opposition dated Sep. 8, 2018 regarding the parallel EP patent and the Opposition dated Oct. 28, 2015 regarding the parallel DE patent. |
Stjepan Jecić, Nenad Drvar: The Assessment of Structured Light and Laser Scanner Methods in 3D Shape Measurements, in: 4th International Congress of Croatian Society of Mechanics, September 18-20, 2003, Bizovac, Croatia, pp. 237-244. |
Annex bundle relating to alleged prior public use ATOS III+ Triple Scanner MV700; Invoice # D1104291 of 29.04.2011 relating to order GOM110401A, Packing List of 29.04.2011 relating to order GOM110401A; Photograph of the delivered ATOS III+ Triple Scanners MV700. |
Number | Date | Country | |
---|---|---|---|
20130271573 A1 | Oct 2013 | US |