The disclosure of Japanese Patent Application No. 2016-095682 filed on May 11, 2016, including the specification, drawings and claims, is incorporated herein by reference in its entirety.
The present invention relates to a non-contact 3D measuring system, and particularly, to a non-contact 3D measuring system which is suitable for use in a vision measuring system or a measuring microscope that includes an interlace camera, and which enables a combined frame image to be provided with high accuracy, thereby enabling highly accurate shape measurement.
A non-contact 3D combining method employed for a conventional vision measuring system may be classified into the provision of a combined three-dimensional shape by focal position detection (Point From Focus; hereinafter referred to as PFF) and the provision of a combined three-dimensional shape by white light interference (hereinafter referred to as WLI). Both methods provide a combined three-dimensional shape on the basis of successive images (frame images) captured while the measuring head is being scanned in the direction of an optical axis, and of the information on the positions at which the images are acquired.
Here, when an interlace camera is used to capture successive images, field images of odd fields and even fields are output alternately while the measuring head keeps moving, so that the odd field image and the even field image that make up one frame are captured at different scanning positions.
A combined frame image has therefore conventionally been produced by simply averaging the brightness values I1ij and I2ij of the pixel at each position (i, j) in the two adjacent field images:
Iij=(I1ij+I2ij)/2.
However, in practice, the measuring head does not move at a constant speed, owing to speed variations during its acceleration and deceleration or to speed ripples during low-speed movement, so images are not captured at the constant distance intervals that the simple average presupposes. In particular, when a servo motor is used to move the measuring head, the head is accelerated and decelerated gradually at the start and end of a movement, which makes it difficult to acquire images at accurate, constant spatial pitches by acquiring them at constant time pitches. As a result, properly combined frame images cannot be acquired, resulting in reduced measurement accuracy.
Note that Japanese Patent Application Laid-Open No. 2000-270211 discloses combining images of odd data and even data; Japanese Patent Application Laid-Open No. 2009-49547 discloses weighting the image data of each of a plurality of images before combining them; and Japanese Patent Application Laid-Open No. Hei. 9-274669 discloses producing a combined image without determining in advance the distance to an object of interest. However, none of them successfully addresses the aforementioned problem.
The present invention has been developed to address the aforementioned conventional problems. An object of the present invention is to provide an improved method for acquiring a combined frame image, thereby improving measurement accuracy.
The present invention has solved the aforementioned problems by providing a non-contact 3D measuring system configured to provide a combined three-dimensional shape of an object to be imaged based on a frame image captured by a camera while scanning a measuring head thereof in an optical axis direction and information of a position at which the image is acquired, the non-contact 3D measuring system including: a position detector for detecting scanning positions of the measuring head while a plurality of raw images are captured; and a computer for generating interpolation images by linear interpolation using information of the scanning positions for the captured raw images and for generating a combined frame image using the interpolation images.
Here, when the camera is an interlace camera, the plurality of raw images are employed as raw images of respective odd and even fields, and interpolation images of even and odd fields at the same respective positions are generated by linear interpolation of the captured raw images using the information of the scanning positions, so that the raw image of the even field and the interpolation image of the odd field at each position, and the raw image of the odd field and the interpolation image of the even field are combined to produce the combined frame image.
Alternatively, when the camera is a noninterlace camera, it is possible to produce interpolation images between the plurality of raw images.
Furthermore, the non-contact 3D measuring system can include at least any one of an image optical measuring head that includes an objective lens, a camera, and an illumination unit and is capable of performing the Point From Focus (PFF) measurement, and a White Light Interference (WLI) optical measuring head that includes an interference objective lens, a camera, and an illumination unit.
Furthermore, the position detector can be a Z-axis scale.
Furthermore, in performing the Point From Focus (PFF) measurement with a non-contact 3D measuring system using an image optical measuring head that includes an objective lens, a camera, and an illumination unit, it is possible to include the following steps of:
scanning the objective lens along a Z-axis column in a Z-axis direction over a work;
acquiring a raw image from the camera mounted on the image optical measuring head and also acquiring a Z coordinate value from a Z-axis scale mounted on the Z-axis column, thereby stacking images and Z coordinate values at a constant pitch;
generating an interpolation image by linear interpolation using information of the scanning positions for the captured raw images and generating a combined frame image using the interpolation image;
generating a contrast curve at each pixel position from the stacked images; and
combining a 3D shape with a contrast peak position of each pixel being employed as a Z position.
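By way of illustration only, the following is a minimal sketch of the last two steps above: a contrast curve is built for every pixel from the stacked combined frame images, and the Z position of its peak is taken as the surface height. The focus measure (a Laplacian response), the parabolic peak refinement, and all names such as pff_height_map are assumptions made for this example rather than the specific processing of the system described here.

```python
import numpy as np
from scipy.ndimage import laplace

def pff_height_map(frames, z_values):
    """Estimate a height map from an image stack by Point From Focus (PFF).

    frames:   array of shape (N, H, W); combined frame images stacked along Z
    z_values: array of shape (N,); Z coordinate at which each frame was acquired
    Returns an (H, W) array of estimated Z positions (contrast peak per pixel).
    Assumes N >= 3 so that a parabolic refinement around the peak is possible.
    """
    frames = np.asarray(frames, dtype=float)
    z_values = np.asarray(z_values, dtype=float)
    n = len(z_values)

    # Local contrast (focus measure) of each frame; a Laplacian response is one
    # common choice. The actual focus metric of the system may differ.
    contrast = np.array([np.abs(laplace(f)) for f in frames])  # (N, H, W)

    # Index of the maximum-contrast frame for every pixel.
    peak_idx = np.argmax(contrast, axis=0)                      # (H, W)

    # Refine the peak by fitting a parabola through the three samples around
    # the maximum, then map the fractional index back to a Z coordinate.
    i0 = np.clip(peak_idx, 1, n - 2)
    rows, cols = np.indices(peak_idx.shape)
    c_m = contrast[i0 - 1, rows, cols]
    c_0 = contrast[i0, rows, cols]
    c_p = contrast[i0 + 1, rows, cols]
    denom = c_m - 2.0 * c_0 + c_p
    denom = np.where(np.abs(denom) < 1e-12, 1e-12, denom)       # avoid division by zero
    frac = np.clip(0.5 * (c_m - c_p) / denom, -0.5, 0.5)
    idx = i0 + frac

    # Map the (possibly non-uniformly spaced) frame index to a Z coordinate.
    return np.interp(idx.ravel(), np.arange(n), z_values).reshape(peak_idx.shape)
```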
Alternatively, in performing the WLI measurement with a non-contact 3D measuring system using a White Light Interference (WLI) optical measuring head that includes an interference objective lens, a camera, and an illumination unit, it is possible to include the following steps of:
scanning the interference objective lens in a Z-axis direction;
acquiring a raw image from the camera mounted on the WLI optical measuring head and also acquiring a Z coordinate value from a Z-axis scale mounted on a Z-axis column, thereby stacking images and Z coordinate values at a constant pitch;
generating an interpolation image by linear interpolation using information of the scanning positions for the captured raw images and generating a combined frame image using the interpolation image;
generating an interference signal of each pixel from interference fringes of the stacked images; and
combining a 3D shape with the peak position of the interference fringes of each pixel employed as a Z position.
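Likewise purely as an illustration, the sketch below extracts the fringe-peak position per pixel from the stacked WLI images: the interference signal along Z is reduced to its envelope, and the Z value at the envelope maximum is taken as the surface height. Estimating the envelope with the analytic signal (Hilbert transform) and the name wli_height_map are assumptions made for this example, not necessarily the processing actually used.

```python
import numpy as np
from scipy.signal import hilbert

def wli_height_map(frames, z_values):
    """Estimate a height map from a white light interference (WLI) image stack.

    frames:   array of shape (N, H, W); frame images stacked along Z
    z_values: array of shape (N,); Z coordinate of each frame (assumed increasing)
    Returns an (H, W) array of Z positions at the peak of the fringe envelope.
    """
    frames = np.asarray(frames, dtype=float)
    z_values = np.asarray(z_values, dtype=float)

    # Interference signal of each pixel along the Z axis, with the slowly
    # varying background (mean along Z) removed.
    signal = frames - frames.mean(axis=0, keepdims=True)       # (N, H, W)

    # Fringe envelope estimated as the magnitude of the analytic signal.
    envelope = np.abs(hilbert(signal, axis=0))                  # (N, H, W)

    # Z position of the envelope maximum for every pixel.
    peak_idx = np.argmax(envelope, axis=0)                      # (H, W)
    return z_values[peak_idx]
```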
In the present invention, for example, when the interlace camera is used, an interpolation field image is generated at the same scanning position as each captured raw field image, with the brightness value Iij of each of its pixels given by
Iij={(dZ2−dZ1)*I1ij+dZ1*I2ij}/dZ2,
where the brightness value Iij is acquired by linear interpolation using the position information (Z coordinate values, dZ1 and dZ2) of the field image captured by the measuring head 10, the position information being acquired, for example, by the Z-axis scale 12.
Thus, higher precision can be expected than with a combined frame image obtained by the conventional simple average. Furthermore, even when a field image acquired by the interlace camera is dropped or missing, the present invention can be employed to produce an interpolation field image accurately from the adjacent field images that are not missing, so that degradation of precision can be prevented.
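As a concrete illustration of the interpolation above, the short sketch below computes an interpolation field image at a target Z position from two captured field images. It assumes, for the purpose of the example only, that dZ2 is the Z spacing between the two captured field images and that dZ1 is the offset of the target position from the first of them; the function name interpolate_field is likewise illustrative.

```python
import numpy as np

def interpolate_field(img1, img2, z1, z2, z_target):
    """Linearly interpolate a field image at z_target from two field images.

    img1, img2: field images (same shape) captured at Z coordinates z1 and z2
    z_target:   Z coordinate at which the interpolation field image is wanted
    Implements Iij = {(dZ2 - dZ1) * I1ij + dZ1 * I2ij} / dZ2 for every pixel,
    with dZ2 = z2 - z1 and dZ1 = z_target - z1 (an assumed reading of the
    formula; the specification defines dZ1 and dZ2 with reference to its drawings).
    """
    img1 = np.asarray(img1, dtype=float)
    img2 = np.asarray(img2, dtype=float)
    dZ2 = z2 - z1
    dZ1 = z_target - z1
    if dZ2 == 0:
        return img1.copy()  # degenerate case: both fields at the same position
    return ((dZ2 - dZ1) * img1 + dZ1 * img2) / dZ2
```

With dZ1 = 0 the result reduces to the first field image and with dZ1 = dZ2 to the second, while dZ1 = dZ2/2 reproduces the conventional simple average; the weighting therefore corrects for unevenly spaced field images.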
Note that the application of the present invention is not limited to systems using an interlace camera; the invention is also applicable to the case where interpolation images are generated between a plurality of raw images captured by a noninterlace progressive camera.
These and other novel features and advantages of the present invention will become apparent from the following detailed description of preferred embodiments.
The preferred embodiments will be described with reference to the drawings, wherein like elements have been denoted throughout the figures with like reference numerals.
Embodiments of the present invention will be described below in detail with reference to the drawings. It should be noted that the present invention is not limited to the contents described in the following embodiments and practical examples. The components of the embodiments and practical examples described below may include ones easily conceivable by those skilled in the art, substantially identical ones, and ones within the range of equivalency. The components disclosed in the embodiments and practical examples described below may be combined as appropriate, and may be selected and used as appropriate.
As illustrated in the entire configuration of
The Z-axis column 24 is provided with a Z-axis scale (not shown; symbol 12 referred to above), from which the Z coordinate value of the measuring head is acquired.
As shown in detail in the drawings, the image optical measuring head includes an objective lens, a CCD camera, and an illumination unit, and is used for the Point From Focus (PFF) measurement.
As also shown in detail in the drawings, the White Light Interference (WLI) optical measuring head 40 includes an interference objective lens 42, a CCD camera 44, and an illumination unit including a white light source 49, and is used for the WLI measurement.
Note that in the case of the 3D measuring system for the PFF measurement, the WLI optical measuring head 40, the interference objective lens 42, and the frame grabber 76 for the WLI optical measuring head are eliminated. On the other hand, in the case of the 3D measuring system for the WLI measurement, the PFF measurement is not performed.
Now, the procedure of the PFF measurement using an interlace camera will be described.
First, in step 100, the objective lens of the image optical measuring head is scanned along the Z-axis column 24 in the Z-axis direction over the work 8.
Next, in step 110, images and Z coordinate values are stacked at a constant pitch. Here, a raw image is acquired from the camera mounted on the image optical measuring head, and a Z coordinate value is acquired from the Z-axis scale mounted on the Z-axis column 24.
Next, in step 120, according to the present invention, interpolation images are generated by linear interpolation using the information of the scanning positions for the captured raw images, and combined frame images are generated using the interpolation images.
Next, in step 140, a contrast curve is generated at each pixel position from the stacked images.
Next, in step 150, the contrast peak position of each pixel is employed as a Z position to provide a combined 3D shape.
In this manner, it is possible to increase the shape measurement accuracy for the PFF image measurement using an interlace camera.
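Purely for illustration, the following sketch shows one way the combined frame images of step 120 might be assembled: for each captured field image, the opposite-parity field is interpolated at the same Z position from its two neighbouring fields (using the same weighting as the interpolation formula above) and interleaved with the raw field line by line. The array shapes, the parity convention for which field carries which frame lines, and all names are assumptions made for the example.

```python
import numpy as np

def build_combined_frames(field_images, field_z):
    """Assemble combined frame images from an interlaced field sequence.

    field_images: array of shape (M, H/2, W); the lines belonging to each field,
                  captured alternately (even field, odd field, even field, ...)
    field_z:      array of shape (M,); Z coordinate of each field image
    For every interior field k, the opposite-parity field is interpolated at
    Z = field_z[k] from its two neighbours and interleaved with the raw field
    line by line. Returns (frames, frame_z) with shapes (M-2, H, W) and (M-2,).
    """
    field_images = np.asarray(field_images, dtype=float)
    field_z = np.asarray(field_z, dtype=float)
    m, half_h, w = field_images.shape
    frames, frame_z = [], []
    for k in range(1, m - 1):
        # Linear interpolation of the opposite-parity field at field_z[k],
        # with the same weighting as the interpolation formula above.
        dZ2 = field_z[k + 1] - field_z[k - 1]
        dZ1 = field_z[k] - field_z[k - 1]
        prev_f, next_f = field_images[k - 1], field_images[k + 1]
        interp = prev_f if dZ2 == 0 else ((dZ2 - dZ1) * prev_f + dZ1 * next_f) / dZ2

        frame = np.empty((2 * half_h, w), dtype=float)
        # Assumed line convention: even-indexed fields carry the even frame lines.
        if k % 2 == 0:
            frame[0::2, :], frame[1::2, :] = field_images[k], interp
        else:
            frame[0::2, :], frame[1::2, :] = interp, field_images[k]
        frames.append(frame)
        frame_z.append(field_z[k])
    return np.stack(frames), np.asarray(frame_z)
```

The resulting stack of frames and their Z values can then be processed as in steps 140 and 150, for example with a routine like the pff_height_map sketch above.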
Now, the WLI measurement also using an interlace camera will be described.
In this case, the interference objective lens 42 is configured, for example, as a Mirau type optical system.
In any of the configurations, the illumination beam emitted from a white light source 49 of the illumination unit is split into two beams of light, i.e., one to a reference mirror 50 and the other to the work 8, by a beam splitter 52 for the interference objective lens 42. Here, scanning the interference objective lens 42 in the Z-axis direction causes interference fringes to occur about the position at which the optical-path difference between the beam of light reflected on the reference mirror 50 and the beam of light reflected on the surface of the work 8 is zero. In this context, the position of the peak strength of the interference fringes is detected at each pixel position of the CCD camera 44, thereby making it possible to acquire a three-dimensional shape of the surface of the work 8. The figure shows a collimator lens 54, a beam splitter 56, and a tube lens 58.
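For reference, the localized fringes described above can be expressed with a standard textbook model of white light interference, given here only for illustration with symbols chosen for this example:

\[ I(z) = I_0\left\{ 1 + \gamma(z - h)\,\cos\!\left[ \frac{4\pi}{\bar{\lambda}}\,(z - h) + \varphi \right] \right\} \]

where h is the surface height at the pixel, \(\bar{\lambda}\) is the mean wavelength of the white light source, \(\varphi\) is a phase offset, and \(\gamma\) is the coherence envelope, which peaks where the optical-path difference 2(z - h) between the two beams is zero. Detecting the peak of this envelope at each pixel therefore yields the surface height h.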
Now, the procedure of the WLI measurement will be described.
First, in step 200, the interference objective lens 42 is scanned in the Z-axis direction.
Next, in step 210, images and Z coordinate values are stacked at a constant pitch. Here, a raw image is acquired from the CCD camera 44 mounted on the WLI optical measuring head 40, and a Z coordinate value is acquired from the Z-axis scale (not illustrated) mounted on the Z-axis column 24.
Next, in step 220, according to the present invention, interpolation images are generated by linear interpolation using the information of the scanning positions for the captured raw images, and combined frame images are generated using the interpolation images.
Next, in step 240, an interference signal of each pixel is generated from the interference fringes of the stacked images.
Next, in step 250, the peak position of the interference fringes of each pixel is employed as a Z position to provide a combined 3D shape.
It is thus possible to improve the shape measurement accuracy of the WLI image measurement using an interlace camera.
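For completeness, and again only as an assumed end-to-end chaining of the sketches above, the WLI flow can be run in the same way as the PFF flow; field_images and field_z stand for the captured field stack and the corresponding Z-scale readings.

```python
# Assumed end-to-end chaining of the earlier sketches for the WLI flow.
frames, frame_z = build_combined_frames(field_images, field_z)  # steps 200 to 220
height_map = wli_height_map(frames, frame_z)                    # steps 240 to 250
```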
Note that in each of the above embodiments, the present invention is applied to a 3D measuring system using an interlace camera. However, the invention is not limited thereto; in principle, it is equally applicable to the case where interpolation images are generated between a plurality of raw images captured by a noninterlace progressive camera.
It should be apparent to those skilled in the art that the above-described embodiments are merely illustrative which represent the application of the principles of the present invention. Numerous and varied other arrangements can be readily devised by those skilled in the art without departing from the spirit and the scope of the present invention.
Foreign Application Priority Data:

Number | Date | Country | Kind
---|---|---|---
2016-095682 | May 2016 | JP | national
Foreign Patent Documents:

Number | Date | Country
---|---|---
09-274669 | Oct 1997 | JP
2000-270211 | Sep 2000 | JP
2009-049547 | Mar 2009 | JP