The present invention relates generally to the field of interferometry and, in particular, to an interferometric profilometer structured to measure a curvature and roughness of a free-form surface.
High-quality precision surfaces are often mechanically machined with the use of, for example, diamond turning lathes or other numerically controlled machine tools that are equipped with natural and/or synthetic diamond-tipped tool bits, or grinding and magneto-rheological finishing. The initial stages of this multi-stage machining process (referred to as “diamond turning”) are carried out using a series of lathes characterized by increasing accuracy. The diamond tip lathe tool is then used in the final stage to achieve sub-nanometer level surface finishes and sub-micrometer form accuracies.
As even minuscule errors in the diamond turning can result in defective parts, the quality of the fabricated elements (such as optical elements, for example) must be checked after each stage of the process. Presently, such quality checks are performed by removing the optical elements from the lathe and measuring them in a laboratory environment. The optical elements are then returned to the machine for further refinement, if the results of the measurements show that the elements do not meet specifications. The remounting of some optical elements is difficult, if not impossible, and such optical elements are often scrapped. (Indeed, elements with free-form surfaces may have no planes or axes of symmetry and, therefore, have to be aligned with all six degrees of freedom of a rigid-body motion with micron precision with respect to translation and a few seconds of arc with respect to an angle.) Additionally, as the process is done manually, it is slow and costly, making diamond turning expensive and unsuitable for mass production. While metrological devices for measuring the finish and profile of optical elements exist (such as, for example, Zygo New View, described at www.zygo.com/?/met/profilers/newview7000; or Bruker ContourGT-1, described at http://www.bruker.com/fileadmin/user_upload/8-PDF-Docs/SurfaceAnalysis/3D-OpticalMicroscopy/DataSheets/ContourGTI—3D_Optical_Microscope-Datasheet-_DS553-RevA1.pdf), these devices are not necessarily suitable for in situ metrology, because they are either too large or too heavy, require the use of a contact probe, or have a geometry that prevents them from being mounted on a lathe during the machining process.
Therefore, there remains a need for a device enabled to perform non-contact in situ profiling metrology of an optical element that is being machined (such that errors in the finish or contour of the optical element can be quickly and automatically detected and corrected before the element is taken off the lathe) or that is being characterized in the laboratory environment.
Embodiments of the invention provide a method for optically determining a descriptor of a surface of an object. The method includes receiving, with an optical detector, light that has been reflected by the object and that has traversed an interference objective having an axis. The received light is characterized by three distinct wavelengths. The method further includes acquiring, with a programmable processor, first, second, and third data from the optical detector, wherein said first, second, and third data represent interferometric images of the object formed in light at respectively corresponding wavelengths from the three distinct wavelengths. The method additionally includes determining the descriptor of the surface based on a product of said first, second, and third data. The descriptor of the surface of the object under test may contain a contour of the surface of the object derived from a fringe of said interferometric images and/or a figure of merit characterizing roughness of the surface of the object based on deviation of a line corresponding to a center of said fringe from a straight line.
Embodiments of the invention additionally provide a method for optically determining a descriptor of a surface of an object. Such method includes (i) receiving, with an optical detector, light that has been reflected by the object and that has traversed an interference objective having an axis (where the received light is characterized by three distinct wavelengths); (ii) acquiring, with a programmable processor, first, second, and third data from the optical detector (wherein said first, second, and third optical data represent interferometric images of the object formed in light at respectively corresponding wavelengths from the three distinct wavelengths); and (iii) forming, on a display device, an image of the object based on the product of said first, second, and third data. The method additionally includes determining a fringe of said image corresponding to light that has traversed substantially equal optical paths at the three distinct wavelengths; and axially moving the interference objective with respect to the surface of the object to keep the above-defined fringe substantially in a center of a field-of-view (FOV) of the optical detector. Furthermore, the method includes determining the descriptor of the surface based on the product of the first, second, and third data.
Alternatively or in addition, the embodiments of the invention provide a non-contact interferometric sensor for optical testing of an object. Such sensor includes an interference objective having an optical axis; and an optical detection system positioned to receive light transmitted through the interference objective (wherein the received light is characterized by three distinct wavelengths). The optical detection system is operably cooperated with a data-processing unit containing a programmable processor and a tangible, non-transitory computer-readable medium with a computer-program product contained thereon. The computer program product contains program code that, when loaded onto the processor, enables the data-processing unit to (i) acquire, independently from one another, three sets of data from the optical detection system, said sets respectively representing interferometric light distributions formed in light transmitted through the interference objective at respectively corresponding distinct wavelengths; and (ii) form a product of the three sets of data to produce sensor data.
Implementations of the invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings. The present application contains at least one drawing executed in color. Copies of this patent application with color drawings will be provided by the Office upon request and payment of the necessary fee. The drawings include:
The implementations of the present invention provide an active, non-contact interferometric sensor that is small and light enough to be mounted on a turning or milling machine and used to measure the finish and contour of a specular free-form surface, and in particular, a surface of a chosen (for example, optical) element fixed in the turning/milling or precision CNC machine, so that errors (such as those resulting from tool wear, distortion of the element due to its rotation in the machine, or poor finish) can be detected and corrected before the element is taken off the machine. The use of the embodiments for testing the roughness of a surface of the element is also within the scope of the invention. While for convenience the description of the invention is presented primarily in reference to one-dimensional profiling of the optical element, one of ordinary skill in the art will appreciate that two-dimensional scanning and profiling (resulting, for example, in a two-dimensional topographic map of the scanned free-form specularly-reflecting surface of the optical element) is also within the scope of the invention.
The term “active” is used to denote that an embodiment of the sensor is enabled to operably communicate with the system or machine on which it is mounted. The present sensor is also a null sensor and, therefore, exerts an influence on the measured system so as to oppose the effect of the measurand. Specifically, the present sensor generates an output (in the form of an electrical signal, for example, accepted by the machine), based on which a data-processing unit of the system enables the system to maintain the sensor oriented at a fixed distance from, and facing towards, the surface being measured along the normal to the surface during the process of mutual repositioning between the surface and the sensor (for example, during the optical scanning of the surface). For example, as discussed below in more detail, an optical portion of the sensor can be configured to be repositionable and/or moveable along an optical axis, for example with the use of a piezoelectric transducer juxtaposed to and enabling the repositioning of the optical objective. The output of the sensor (required to maintain the distance and direction from the surface under test) is generated in response to an optical wavefront caused by reflection of light from an area of the specularly-reflecting surface under test and interfering with another wavefront associated with the reference mirror of the interferometric sensor. The reflected, interfering beams use light of different frequencies to unambiguously maintain a fixed distance and direction of the sensor from the surface being profiled.
References throughout this specification to “one embodiment,” “an embodiment,” “a related embodiment,” or similar language mean that a particular feature, structure, or characteristic described in connection with the referred to “embodiment” is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment. It is to be understood that no portion of disclosure, taken on its own and in possible connection with a figure, is intended to provide a complete description of all features of the invention.
In the drawings like numbers are used to represent the same or similar elements wherever possible. The depicted structural elements are generally not to scale, and certain components are enlarged relative to the other components for purposes of emphasis and understanding. It is to be understood that no single drawing is intended to support a complete description of all features of the invention. In other words, a given drawing is generally descriptive of only some, and generally not all, features of the invention. A given drawing and an associated portion of the disclosure containing a description referencing such drawing do not, generally, contain all elements of a particular view or all features that can be presented is this view, for purposes of simplifying the given drawing and discussion, and to direct the discussion to particular elements that are featured in this drawing. A skilled artisan will recognize that the invention may possibly be practiced without one or more of the specific features, elements, components, structures, details, or characteristics, or with the use of other methods, components, materials, and so forth. Therefore, although a particular detail of an embodiment of the invention may not be necessarily shown in each and every drawing describing such embodiment, the presence of this detail in the drawing may be implied unless the context of the description requires otherwise. In other instances, well known structures, details, materials, or operations may be not shown in a given drawing or described in detail to avoid obscuring aspects of an embodiment of the invention that are being discussed. Furthermore, the described single features, structures, or characteristics of the invention may be combined in any suitable manner in one or more further embodiments.
Moreover, if the schematic flow chart diagram is included, it is generally set forth as a logical flow-chart diagram. As such, the depicted order and labeled steps of the logical flow are indicative of one embodiment of the presented method. Other steps and methods may be conceived that are equivalent in function, logic, or effect to one or more steps, or portions thereof, of the illustrated method. Additionally, the format and symbols employed are provided to explain the logical steps of the method and are understood not to limit the scope of the method. Although various arrow types and line types may be employed in the flow-chart diagrams, they are understood not to limit the scope of the corresponding method. Indeed, some arrows or other connectors may be used to indicate only the logical flow of the method. For instance, an arrow may indicate a waiting or monitoring period of unspecified duration between enumerated steps of the depicted method. Without loss of generality, the order in which processing steps or particular methods occur may or may not strictly adhere to the order of the corresponding steps shown.
The invention as recited in claims appended to this disclosure is intended to be assessed in light of the disclosure as a whole, including features disclosed in prior art to which reference is made.
Examples of a System. A schematic diagram of an embodiment 100 of the invention is presented in
The objective sub-system 101 is characterized by an entrance pupil 110, a focusing lens 112, a reference mirror 114, and a beamsplitter 116. In one embodiment, various components of the interference objective 101 are all housed separately from the rest of the sensor 100 (as shown, in a cell 118) to keep them permanently optically aligned with respect to each other. In the illustrative embodiment of
As shown in the specific example of the embodiment 100, the illumination system 102 is (preferably rigidly) attached to the interference objective 101 and includes a light source 120, a diffuser 122, a lens 124, and a beamsplitter 126. Here, the lens 124 is positioned to focus light emanating from the diffuser 122 through the beamsplitter 126 onto the entrance pupil 110 of the interference objective 101, to provide Kohler illumination for the interference objective 101. Generally, however, a different type of illumination (such as, for example, uniform illumination at the entrance pupil 110) can be used. The light source 120 is configured to emit quasi-monochromatic light at three wavelengths (corresponding, in a specific embodiment, to red, green, and blue portions of the visible spectrum). In one implementation, the light source 120 includes three operationally independent sources of quasi-monochromatic light, the radiant outputs from which are spatially combined for delivery along the same optical path through the objective 101 towards the surface under test. In a related embodiment, the source 120 includes a source of spectrally broad-band light (for example, the source 120 may include a white-light LED) and a set of three spectrally-distinct optical filters defining spectrally-distinct throughputs when placed in front of the optical source 120. In a specific embodiment, the output from the light source 120 may include light at wavelengths of approximately 480 nm, 540 nm, and 635 nm.
The detection system 103 includes a focusing lens 130 and a color array detector 132. In one implementation, the color array detector 132 is a three-color array sensor. Alternatively, the color array detector 132 may include a color CCD (charged-coupled device), a CMOS (complementary metal-oxide semiconductor) device, or another color image sensor known in the art, operating without optical filter(s) that define spectral band(s) of light incident onto the detector 132.
The analysis/data-processing system 104 is in communication with the color array detector 132 via a communication link 144 and includes a computing device 140 equipped with a programmable processor 141 that is in operable communication with tangible, non-transitory computer-readable medium 143. The medium 143 has a computer program product (with a program code and instructions) 145 encoded thereon. In one implementation, the computing device 140 includes one or more of an application server, a web server, a work station, a host computer, or other like device from which information can be stored and/or processed. The computer readable medium 143 may include, for example, at least one of a magnetic information storage medium (such as a hard disk drive, floppy disk drive, or magnetic tape), an optical information storage medium (such as a Digital Versatile Disk, or DVD; a High-Definition DVD, or HD-DVD; a Blu-Ray Disk, or BD; a Magneto-optical (MO) disk, and a Phase-Change medium, to name just a few), and an electronic information storage medium (a device such as PROM, EPROM, EEPROM, Flash PROM, compactflash, and smartmedium).
The analysis system 104 may further include a display device 142, optionally in operable communication with a computing device 140 via a communication link 146. In certain embodiments, analysis system 104 may further comprise other peripheral devices in communication with computing device 140 such as, and without limitation, input devices, printers, and other computing devices. All in all, the data-processing system 104 is equipped with a sub-system enabled to generate a visually-perceivable representation of the results of operation of the data-processing system 104.
Alternatively or in addition, the analysis system 104 is further in operable communication with the equipment or machine (not shown) upon which sensor 100 is mounted (such as, for example, and without limitation, a diamond turning lathe). In a related implementation, the analysis system 104 is in communication with a separate moveable stage (not shown) attached to a machine, such as a diamond turning lathe, which moveable stage is configured to change the orientation and/or position of the sensor 100. The analysis/data-processing system 104 is structured to generate an output to be transmitted to the machine or a greater system which is used to define and/or maintain the position and/or orientation of the sensor 100 in space. Such output may include, for example, an electrical signal (such as, for example, a voltage pulse or a pulse of current) or a set of computer-readable instructions that cause the programmable processor 141 to move the sensor 100. In a related implementation, the output may include data based on which the programmable processor 141 is programmed to calculate how to move the sensor 100.
While the computing device 140 is illustrated as being in communication with the color array detector 132 and display 142 via the communication links 144 and 146, respectively, generally the computing device 140 may employ wireless communication means (such as, for example, means including source(s) and detector(s) of radio, microwave, and infrared waves).
In one implementation, the objective sub-system 101 may be configured to be repositionable by cooperating it with a piezoelectric actuator (PZT) 101A (shown with a dashed line), the operation of which is governed by the computing device 140, whether through an electric and/or data-transmitting cable or wirelessly.
In a specific implementation, the color array detector 132 is configured to produce an output that represents a distance between the interference objective 101 and the measurand 105 (for example, an output the value of which is proportional to such distance) and that is proportional to the dihedral angle formed by the reference mirror 114 of interference objective 101 and the measurand 105. The generated output is then analyzed by the computing device 140 with the use of instructions 145 to detect errors in the finish and/or contour of measurand 105.
Where the display device 142 is part of the system, the computing device 140 may be employed to generate an image of measurand 105. In such embodiments, all or at least a portion of the image of the measurand 105 may be displayed. In preferred embodiments, the displayed image corresponds to a portion of measurand 105 that is as large as the area of the detecting, photo-sensitive surface of the color array detector 132 divided by a coefficient substantially equal to the image magnification produced by the combination of the interference objective 101 and the detection system 103 (provided the objective 101 is well focused on the measurand 105).
In a related embodiment, the sensor of the invention has the configuration 160 illustrated in the schematic diagram of
By way of example, if the light source 120 is replaced with a source of substantially monochromatic light, then when the objective 101 is focused on the measurand 105, the image of the measurand 105 on the video screen 142 would show a series of substantially straight and approximately equally-spaced interference fringes (such as fringes 204 illustrated in
Interferometric Output. The interference fringes resulting in a system of the invention that utilizes the light source 120 generating quasi-monochromatic light at three distinct wavelengths are shown schematically in
As is illustrated in
As can be seen in
As can be seen in
When the measurand 105 and the reference mirror 114 are perfectly parallel to one another, as shown in
When the surface of the measurand and the surface of the reference mirror form a non-zero dihedral angle with respect to one another, the zero order fringe 210 is localized in the middle portion of the reference mirror 114 (and, therefore, in the middle portion of the detector 132, as depicted in
Because of the small angle α between the measurand 105 and the reference mirror 114, a set of interferometric fringes at each of the three wavelengths of the light source 120 will be formed at the detector 132 of the embodiment 100, creating a zero-order fringe 210 at the area corresponding to optical path lengths equal at all three wavelengths. The number of the interferometric fringes across the clear aperture of the detector 132 depends on the tilt angle α. Indeed, the spacing of the interferometric fringes is tilt-angle dependent according to
y=λ/(2*tan(α)), (1)
where y is the spacing between fringes along the y-axis, and λ is the wavelength of the light producing the set of fringes. The axial sensitivity of measuring the displacement between the embodiment of the sensor and the object under test can be determined based on Eq. (1). As the wavelength of light increases, the spacing of the fringes increases as shown in
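As a quick numerical illustration of Eq. (1), the sketch below computes the fringe spacing for the three source wavelengths; the 1-mrad tilt is a hypothetical value chosen only for the example:

```python
import math

def fringe_spacing(wavelength_nm: float, tilt_rad: float) -> float:
    """Eq. (1): fringe spacing y = lambda / (2 * tan(alpha))."""
    return wavelength_nm / (2.0 * math.tan(tilt_rad))

# Hypothetical 1-mrad tilt between the measurand and the reference mirror.
alpha = 1e-3
spacings_nm = {lam: fringe_spacing(lam, alpha) for lam in (480, 540, 635)}
```

Consistent with the text, the spacing grows with wavelength, so the red fringes are farther apart than the blue ones.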
If the distance d between the measurand 105 and the reference mirror 114 changes (in comparison to that between the beamsplitter 116 and the reference mirror 114), the position of the zero-order fringe 210 changes in the xy-plane, as shown in
Δy=e/tan(α), (2)
which is the distance separating points 913 and 210 in
As the change of the distance e between the measurand 105 and the reference mirror 114 is not the same across the measurand 105 (i.e., it is not the same for every point of the measurand's surface) due to the measurand's curvature and the roughness of its surface, the separation Δy of locations of the zero-order fringe (point 210) and point 913 in
e_i=y_i*tan(α), (3)
where i is the index of the columns along the x-direction in
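Eq. (3) can be illustrated with a short sketch; the fringe-shift and tilt values below are hypothetical, chosen only for the example:

```python
import math

def height_from_fringe_shift(shift_y_m: float, tilt_rad: float) -> float:
    """Eq. (3): e_i = y_i * tan(alpha); the surface-height change at
    column i that displaces the zero-order fringe by y_i."""
    return shift_y_m * math.tan(tilt_rad)

# Hypothetical: a 100-um fringe shift at a 1-mrad tilt maps to ~0.1 um of height.
e_m = height_from_fringe_shift(100e-6, 1e-3)
```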
By separating the sub-sets of data, acquired at the detector 132 and corresponding to the red, blue, and green light, the profiles of irradiance represented by such sub-sets of data appear similar to those empirically acquired and depicted in
Irradiance(new)=A+B*Irradiance(old) (4)
where the coefficients A and B are defined to ensure zero mean and no tilt.
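One plausible reading of the detrending step of Eq. (4) is a least-squares removal of the mean and tilt from each irradiance column; the pure-Python sketch below implements that reading (an assumption for illustration, not the patent's literal code):

```python
def detrend(profile):
    """Remove the mean and the best-fit (least-squares) tilt from a 1-D
    irradiance profile, so the samples lie about zero on a tilt-free line."""
    n = len(profile)
    mx = (n - 1) / 2.0                      # mean of the pixel indices
    my = sum(profile) / n                   # mean irradiance
    num = sum((x - mx) * (y - my) for x, y in enumerate(profile))
    den = sum((x - mx) ** 2 for x in range(n))
    slope = num / den if den else 0.0       # least-squares tilt
    return [y - my - slope * (x - mx) for x, y in enumerate(profile)]
```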
In certain embodiments, the sensor 100 (and/or the sensor 160) is mounted on a movable stage, such as stage 300 shown in wire-frame view in
In operation, the stage 300 is mounted on the diamond turning machine. As will be appreciated by one of ordinary skill in the art, the means for mounting of the stage 300 on the machine depends on the particular diamond turning machine and requires only routine skill. Furthermore, one of ordinary skill in the art will appreciate that stage 300 is meant to be illustrative and not limiting and that other stages having other configurations that would allow the sensor 100 to move in the Z, X, and N directions are also within the scope of the present invention.
In a specific case when stages 310 and 320 are already a part of the diamond turning machine, the embodiments of the sensor could be mounted directly to these stages and used to scan parts if they are planar or substantially planar. If the parts are substantially curved, then the N axis stage must be used to keep the sensor oriented perpendicularly to the part.
In the preferred embodiment, however, the sensor 100 of
Whether the stage 300 is employed or the movement of the diamond turning machine is used, the sensor 100 is preferably centered on the rotary stage to minimize the overhung moments, as illustrated in
Example of Algorithm. Turning now to
With respect to
As is indicated by steps 406, 408 and 410, the sensor's 3-color detector is used to capture an interferogram at position (n) and to read out and detrend it to make the intensities lie about their mean on a line free of tilt, such as is illustrated in
As is illustrated in
y=A*sin(B*x+C). (5)
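A minimal way to fit Eq. (5) is a coarse grid search over the spatial frequency B and phase C, with the amplitude A solved in closed form for each candidate pair. This is an illustrative sketch only; a production implementation would refine the result with nonlinear least squares:

```python
import math

def fit_sine(samples, b_grid, c_steps=64):
    """Fit y = A*sin(B*x + C) (Eq. (5)) by grid search over B and C,
    solving for the least-squares amplitude A at each candidate pair."""
    best = None
    for b in b_grid:
        for k in range(c_steps):
            c = 2.0 * math.pi * k / c_steps
            s = [math.sin(b * x + c) for x in range(len(samples))]
            ss = sum(v * v for v in s)
            if ss == 0.0:
                continue
            # Amplitude minimizing the squared error for this (B, C) pair.
            a = sum(y * v for y, v in zip(samples, s)) / ss
            err = sum((y - a * v) ** 2 for y, v in zip(samples, s))
            if best is None or err < best[0]:
                best = (err, a, b, c)
    return best[1], best[2], best[3]

# Demo on a clean synthetic fringe profile.
data = [2.0 * math.sin(0.3 * x + 1.0) for x in range(50)]
a_fit, b_fit, c_fit = fit_sine(data, b_grid=[0.1, 0.2, 0.3, 0.4])
```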
Alternatively or in addition, and in a related embodiment, the zero-order fringe of an interferometric pattern produced by the detector 132 can be identified by forming a product of the three sets of data acquired from the detector (which sets respectively correspond to light received by the detector at the three distinct wavelengths) and finding the extrema of the distribution of this product across the surface of the detector. This can be done on a column-by-column basis or on a row-by-row basis (i.e., for each linear arrangement of pixels across the detector).
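The product-based identification of the zero-order fringe can be sketched with synthetic single-column data; the OPD-per-pixel scale below is an assumed value for illustration only:

```python
import math

# Synthetic fringe profiles along one detector column; the zero-order
# fringe is where all three channels peak together (equal optical paths).
WAVELENGTHS_NM = (480.0, 540.0, 635.0)
OPD_PER_PIXEL_NM = 2.0      # assumed optical-path-difference scale per pixel

def channel(lam_nm, pixel):
    """Irradiance of one quasi-monochromatic channel at a given pixel."""
    return 0.5 * (1.0 + math.cos(2.0 * math.pi * OPD_PER_PIXEL_NM * pixel / lam_nm))

positions = range(-200, 201)    # pixel 0 corresponds to equal optical paths
product = []
for p in positions:
    v = 1.0
    for lam in WAVELENGTHS_NM:
        v *= channel(lam, p)    # product of the three color channels
    product.append(v)
zero_order_pixel = max(zip(product, positions))[1]
```

Only at the equal-path pixel do all three cosines peak simultaneously, so the product has a unique global maximum there.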
In further reference to
For each color (i), once the amplitude, spatial frequency, and phase are determined, the differences between the current values for the spatial frequency and phase (B(n)(i), C(n)(i)) and the original values (B(0)(i), C(0)(i)) are determined, as indicated by steps 416 and 418. As will be appreciated, as the sensor is moved outwardly along the X-axis, either using a preprogrammed trajectory or by receiving instruction from an operator, both the distance between the objective and the measurand and the angle between them will change slightly. The difference between the current values (B(n)(i), C(n)(i)) and the original values (B(0)(i), C(0)(i)) can be used as feedback to the Z and N axes of the sensor stage to restore the original alignment conditions even though the sensor is in a new position along the X axis. Thus, once the difference is calculated, the sensor's position along the Z and N axes is adjusted to restore the original intensity pattern, as is indicated by step 422.
Once the process represented by steps 414-422 has been repeated for each color (i) (i.e., for i = 1, 2, 3), the N(n), Z(n), and X(n) positions are recorded. The sensor is then moved to the next X position (i.e., X(n+1)) and the process repeated until the sensor has scanned across the measurand. The N, Z, and X positions for n=0 through n=N can then be plotted, wherein the plot represents a contour that is strictly proportional to the contour of the measurand. The constant of proportionality is determined by the distance between the sensor zero-order fringe position and the point about which the sensor is rotated in N. In certain embodiments this distance is pre-calibrated before the sensor is used.
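The scan loop of steps 406-422 can be summarized in code; the `capture`, `fit`, and `adjust` callbacks below are hypothetical placeholders for machine-specific interfaces, not part of the disclosed hardware:

```python
def scan_contour(x_positions, capture, fit, adjust):
    """Sketch of the scan loop (steps 406-422): at each X position,
    capture an interferogram, fit the fringe parameters, and feed the
    parameter drift back to the Z and N axes; record the axis positions."""
    trace = []
    reference = None
    for x in x_positions:
        params = fit(capture(x))            # e.g. (B, C) per color channel
        if reference is None:
            reference = params              # alignment at the start of the scan
        z, n = adjust(params, reference)    # corrective Z and N moves
        trace.append((x, z, n))             # positions proportional to contour
    return trace
```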
The measurement process has been described above in reference to a rotationally symmetric measurand. However, it can also be applied, with minor variation, where the measurand is not rotationally symmetric. In such situations, if the asymmetry is small (for example, it has a slope variation in azimuth of roughly 10 minutes of arc or less) the stage illustrated in
For larger slopes, the stage illustrated in
While the movement of the sensor in X is described in
In one example, the process described in reference to the flow-chart of
Turning now to
Once the sensor is positioned, the first intensity map is captured, read out, and recorded, as indicated by step 430. Next, using a combination of X and Z motion to move the sensor (away from or towards the measurand along a line normal to the surface under test) by a distance corresponding to one-quarter of an interferometric fringe, a series of three more interferograms are captured, read out, and recorded at each position, as indicated by steps 432, 434, and 436. Since the sensor was initially positioned at an angle N with respect to the normal to the surface of the measurand, the ratio of the extent of the X motion to that of the Z motion is represented by tan(N).
The total extent of the motion of the sensor corresponding to a change between consecutively acquired interferograms is substantially 480 nm for the blue light, divided in half to account for reflection and further by one-quarter for the quarter-fringe step, or about 60 nm.
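The decomposition of the 60-nm quarter-fringe step into X and Z machine moves can be sketched as follows (the angle N used in the demo is arbitrary):

```python
import math

BLUE_NM = 480.0

def quarter_fringe_step(angle_n_rad):
    """Split the ~60-nm quarter-fringe motion along the surface normal
    into X and Z machine moves; by construction dx/dz == tan(N)."""
    step_nm = BLUE_NM / 2.0 / 4.0       # half for reflection, quarter fringe
    dx = step_nm * math.sin(angle_n_rad)
    dz = step_nm * math.cos(angle_n_rad)
    return dx, dz
```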
Once all four maps of irradiance are taken, the surface roughness can be determined by using any one of a number of known algorithms, the most common of which is the so-called 4-bucket algorithm. The 4-bucket algorithm is well known to one of ordinary skill in the art; among other sources, phase-shifting algorithms for surface roughness measurement and contouring in general are discussed in D. Malacara, Optical Shop Testing, Ch. 14-15 (3rd ed., Wiley 2007).
More specifically, as is indicated by step 438, the surface roughness is calculated by finding the phase θ over the entire viewing field of the objective (i.e., at every pixel) for the blue intensity curve, using the standard four-bucket relation θ = arctan[(I_4 - I_2)/(I_1 - I_3)],
where I_m is the intensity of the pixel from the mth interferogram (m = 1, 2, 3, 4). The phase is unwrapped, meaning that the 2π steps are removed, and then the mean and tilt are removed from the phase map. At this point, the RMS (root-mean-square) phase over all the pixels is the RMS surface roughness. In certain embodiments, the surface roughness is determined on the sub-nanometer scale.
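A minimal single-pixel sketch of the four-bucket phase recovery and the RMS roughness figure follows; the intensity model and numbers are synthetic, chosen only to exercise the relation:

```python
import math

def four_bucket_phase(i1, i2, i3, i4):
    """Four-step phase estimate for quarter-fringe steps:
    theta = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

def rms(values):
    """Root-mean-square deviation about the mean (the roughness figure)."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))

# Synthetic pixel: I_m = A + B*cos(theta + m*pi/2), m = 0..3, theta = 0.7 rad.
theta_true = 0.7
frames = [10.0 + 4.0 * math.cos(theta_true + m * math.pi / 2.0) for m in range(4)]
theta_rec = four_bucket_phase(*frames)
```

In the full algorithm this recovery runs at every pixel, followed by unwrapping and mean/tilt removal before the RMS is taken.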
Using the example of a one megapixel camera and a 10× objective operating, in an embodiment of the sensor, at approximately 5× magnification, as described above, at the end of the process 400B of
In certain embodiments, only one fringe pattern (that corresponding to light at a blue wavelength) is used because it is of about the same intensity as the red but has a shorter wavelength, thereby having approximately 30% (thirty-percent) greater sensitivity. Also, the red and green fringe patterns would corrupt the phase calculation.
While in the described embodiment 400 of the method the sensor is moved in steps corresponding to a quarter of the interferometric fringe to produce four fringe (irradiance) maps that are used to calculate the phase, in a related embodiment more or fewer than four interferograms may be captured. Further, while in the preferred embodiments the diamond turning machine is used to move the sensor toward the measurand, in other embodiments, a separate stage is used with the sensor to move the sensor towards the measurand. As will be readily apparent to one of ordinary skill in the art, the benefit of using the machine upon which the sensor is mounted to move the sensor, as opposed to a separate piezo-electric device attached to the sensor, is that it is a far less expensive way to measure contour and surface roughness.
It should be pointed out that since there is the ability to obtain both contour along a diameter and roughness on either side of the diameter, a complete mapping of the band of the surface of the measurand about the diameter is possible. In certain embodiments, the mapping may be performed at a resolution of about 1 μm spatially in both directions.
Program code(s), such as that containing instructions 145 (
An embodiment of the invention may include an article of manufacture containing a programmed processor governed by computer-readable instructions executed by a computer external to, or internal to, a computing system to perform one or more of blocks 402-438 recited in
Additional Considerations. As follows from the discussion provided in reference to
Referring now to
In further reference to
It is worth noting that both the hardware and methodology of optical testing effectuated by an embodiment of the present invention significantly differ from those known as the white-light interferometry modality. In applications employing white-light techniques (WLTs), a monochromatic camera is used without the hardware differentiation of any particular color. The coherence function (represented by a fringe visibility envelope of the acquired interferometric data) is typically fit to the data, based on which the peak of the interferometric fringe distribution is located. When the white-light interferometer is correctly aligned, the location of the peak is in focus and the height of the peak is known (because the scan height is known). The surface of the object under test is scanned until all points within the field of view are mapped out.
In contradistinction with the WLTs, an implementation of the proposed method requires a color-sensitive imaging system (for example, a system distinguishing among three wavelengths, such as RGB wavelengths). The three color channels of the imaging system of the embodiment 100, for example, are equivalent to using color filters for wavelength separation at the surface of the detector 132. The data corresponding to these three color channels are easily stored separately or together in a single color image and are easily separable for subsequent processing.
In addition, the data processing required by the present invention involves separating the three channels (RGB) and then multiplying them together (as discussed above, for example in reference to
Moreover, a map of height of the surface under test can be built up, according to an embodiment of the invention, by thresholding the resulting image as the test object is scanned. The WLTs do not involve thresholding of acquired data and often require complex fitting techniques to find the visibility envelope peak. At least for the reasons discussed above, a typical white-light interferometric system is simply not operable for the purpose intended by an embodiment of the present invention. At least for the same reasons, the adaptation of a typical white-light interferometric system for the purpose of the present invention would require hardware and/or data-processing modifications that would change the principle of operation of the white-light interferometric system.
While the preferred embodiments of the present invention have been illustrated in detail, it should be apparent that modifications and adaptations to those embodiments may occur to one skilled in the art without departing from the scope of the present invention.
The present application claims benefit of and priority from the U.S. provisional patent application No. 61/645,278 filed on May 10, 2012 and titled “Noncontact Interferometric Sensor and Method of Use”. The disclosure of the above-mentioned provisional patent application is incorporated herein by reference in its entirety.