Generally, the disclosure relates to optical measurements and imaging. More particularly, though not exclusively, the disclosure relates to the determination of a position and optical properties of an object.
Non-contact optoelectronic measuring devices provide a non-destructive and relatively high-speed approach to surface profiling and surface characterisation. A variety of techniques exists for acquiring depth information for measuring distances and orientations of objects and surfaces, such as confocal imaging, structured light or laser triangulation, white light interferometry, fringe projection, and depth-of-focus imaging techniques.
Triangulation devices are widely used to add depth information to different kinds of industrial machine vision systems, partly due to their relative simplicity and ease of use. In triangulation-based devices, light from a laser or an LED is typically projected onto the object under measurement, and light reflected from the object point is detected by a light sensor at an imaging angle that depends on the distance between the light source and the object point. The imaging angle, the baseline between the light source and the light sensor, and the angle of the light projected from the light source define a triangulation geometry from which the depth information and surface profile can be extracted. It is also common to extract the intensity of the detected light to provide the reflectance of the surface, revealing information about important material optical properties.
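The triangulation geometry described above can be sketched as a short numerical example. The function below is an illustrative simplification (a planar triangle with both angles measured from the baseline), not the calibration model of any particular device:

```python
import math

def triangulated_depth(baseline, projection_angle, imaging_angle):
    """Perpendicular distance of the object point from the baseline,
    for a planar triangle whose base is the source-to-sensor baseline
    and whose base angles are the projection and imaging angles
    (in radians). Standard triangle-height identity:
    h = b * sin(A) * sin(C) / sin(A + C)."""
    a, c = projection_angle, imaging_angle
    return baseline * math.sin(a) * math.sin(c) / math.sin(a + c)

# Symmetric 60-degree geometry with a 100 mm (0.100 m) baseline:
depth = triangulated_depth(0.100, math.radians(60), math.radians(60))
```

With the baseline and projection angle fixed, a change in the imaging angle maps to a change in the computed depth; this is the relation a triangulation device inverts when converting a detected spot position on the sensor into a distance.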
One known problem in present triangulation devices is that the measurement speed is limited by the frame rate of the light sensor.
One known problem in present laser-triangulation devices is the inevitable speckle noise, seen as interference patterns on the light sensor due to the coherence of the illuminating light, which impairs the extraction of the surface position and surface optical properties.
Yet another known problem in present laser-triangulation devices arises from the restriction to small numerical-aperture optics, which severely limits the capability to produce surface profiles of glossy, sloped surfaces.
The aim of the invention is to at least alleviate one or more problems associated with the existing solutions in the context of determining a surface position and surface optical properties of an object under measurement. Especially, an aim of the invention is to provide a measurement device for determining surface position and surface optical properties that allows the measurement speed to be increased. Additionally, an aim of the invention is to provide such a measurement device that is also suitable for determining the surface position and surface optical properties of glossy, sloped surfaces. Additionally, an aim of the invention is to simplify the arrangement of the measurement device. A further aim of the invention is to provide a device and a method capable of simultaneously determining the surface position and at least one property of the surface selected from the group consisting of reflectance, reflectivity, transmittance and transmission, with one or more different wavelengths of light.
The following presents a simplified summary to provide a basic understanding of some concepts of the invention in a simplified form as a prelude to a more detailed description of exemplifying and non-limiting embodiments of the invention.
The object of the invention can be achieved by the features of independent claims.
In accordance with one aspect of the present invention, there is provided a new device for determining a position and/or optical properties of an object. A device according to the invention comprises:
The illuminating optics is configured to focus light from a location point of the output element on a plurality of illuminating focus points or focus areas positioned at different distances from the illuminating optics along an illuminating coordinate axis associated with a principal ray of the illuminating optics for the location point of the output element, wherein the principal ray is common to the plurality of illuminating focus points or focus areas focused from the location point of the output element, and wherein each of the illuminating focus points or focus areas along the same illuminating coordinate axis differs from the others at least in the dominant wavelength or shape and/or is formed with a different optical aperture of the illuminating optics.
The imaging optics is configured to form from each of the location points of the light sensor a plurality of imaging focus points or focus areas positioned at different distances from the imaging optics along an imaging coordinate axis associated with a corresponding principal ray of the imaging optics for the corresponding location point of the light sensor, wherein the corresponding principal ray is common to the plurality of imaging focus points formed from the corresponding location point of the light sensor, and wherein each of the imaging focus points or focus areas along the same imaging coordinate axis differs from the others at least in the dominant wavelength or shape and/or is focused with a different optical aperture of the imaging optics.
The illuminating optics and the imaging optics are configured to form a plurality of coincident focus points or focus areas so that each of the various focus points or focus areas from the plurality of illuminating focus points or focus areas along the same illuminating coordinate axis coincides at a coincident focus point or focus area with an imaging focus point or focus area positioned along a different imaging coordinate axis, where the orientation of the illuminating coordinate axis is different from the orientations of the imaging coordinate axes. Furthermore, the illuminating optics and the imaging optics are configured to form the plurality of coincident focus points or focus areas so that each of the coincident focus points or focus areas consists of an illuminating and imaging focus point or focus area associated with the common dominant wavelength or shape and/or is formed with the correlated optical apertures of the illuminating optics and the imaging optics.
The device is configured to determine the position and/or optical properties of an object point of the object from the local maximum of the intensity distribution of the light detected by the light sensor so that:
In accordance with the present invention, there is provided also a new method for determining a position and/or optical properties of an object. A method according to the invention comprises:
In accordance with the present invention, there is provided also the use of the device for determining a thickness between a first surface and a second surface of the object, wherein the object is at least partially transparent or translucent for the light being directed to the object.
The present invention offers advantages over the prior art, for example, that the invention:
Various exemplifying and non-limiting embodiments are described in the accompanying dependent claims. The features recited in the dependent claims are mutually freely combinable unless otherwise explicitly stated.
In this document, the word “plurality” refers to a quantity of two or more. Furthermore, it is to be understood that the use of “a” or “an”, i.e. a singular form, throughout this document does not exclude a plurality.
In this document, the word “axis” means an imaginary line and the word “axes” is the plural form of the word “axis”.
The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of unrecited features.
The novel features which are considered as characteristic of the invention are set forth in particular in the appended claims. The invention itself, however, both as to its construction and its method of operation, together with additional objects and advantages thereof, will be best understood from the following description of specific example embodiments when read in connection with the accompanying drawings.
The considerations concerning the various embodiments of the device may be flexibly applied to the embodiments of the method mutatis mutandis, and vice versa, as being appreciated by a skilled person.
The embodiments in the following detailed description are given as examples only, and someone skilled in the art can carry out the basic idea of the invention also in some other way than that described here. Most embodiments can be realized in a variety of combinations with other embodiments. Though the description may refer to a certain embodiment or embodiments in several places, this does not imply that the reference is directed towards only one described embodiment or that the described characteristic is usable only in one described embodiment. The individual characteristics of a plurality of embodiments may be combined, and new embodiments of the invention may thus be provided.
Next the present invention will be described in greater detail with reference to the accompanying drawings, in which:
5b1, 5b2 illustrate exemplary intensity distribution diagrams in accordance with
In the following, some embodiments of the invention are disclosed. In the described embodiments, illuminating and imaging of the surface of the object are realized biaxially from different directions. Thus, a virtual measuring space can be created by coinciding the image of the projected illuminating source or sources with the image of the light sensor in such a way that numerous coincident focus points (measuring lines and/or groups of measuring points of coinciding focus points or focus areas) are common to both of said optics. As the object to be measured intersects the virtual measuring space, one or more coincident focus points or focus areas of the virtual measuring space intersect said object at certain object points. A strong reflection generated from this intersection of the object point and a coincident focus point or focus area is detected by the light sensor and converted to a surface position of the object under measurement.
In this document, reflection refers to specular reflection and diffuse reflection in which reflection may take place from a smooth or rough surface. In addition, reflection also refers herein to scattering, refracting and reflecting radiation from inside the object to be measured.
In order to intersect the object point, which can be, for example, a point on the surface of the object to be measured, the geometric shape of the light-providing output element, or at least part of it, is advantageously projected via the illuminating optics, and this image is monitored with the imaging, wherein light reflected from the object is collected via the imaging optics onto the light sensor. In the intensity distribution of the light detected by the light sensor, local intensity maxima result at those points that correspond to intersections of the object point with the images of the location point of the projected light-source geometry and the light-sensor geometry. The position of the object point is determined from the place of the local intensity maximum of the light distribution registered by the light sensor, and the optical properties of the object point are determined from the intensity or the wavelength of the local intensity maximum.
In this document, the location or place of a local maximum of the intensity distribution is the location on the light sensor 8 where the intensity value of the detected light reaches a local maximum value.
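Localizing such a maximum on a pixelated sensor can be illustrated with a small sketch. The three-point parabolic interpolation below is a generic, widely used peak-localization technique, given here only as an assumption of how the place of the maximum might be found with sub-pixel precision:

```python
def subpixel_peak(intensity):
    """Locate a local maximum of a 1-D intensity profile with sub-pixel
    precision by fitting a parabola through the highest sample and its
    two neighbours. Returns (position, peak sample value)."""
    # Index of the strongest interior sample.
    i = max(range(1, len(intensity) - 1), key=lambda k: intensity[k])
    y0, y1, y2 = intensity[i - 1], intensity[i], intensity[i + 1]
    denom = y0 - 2.0 * y1 + y2
    # Vertex offset of the fitted parabola, in the range (-0.5, 0.5).
    offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    return i + offset, y1

# A symmetric peak centred exactly on pixel 2:
position, peak_value = subpixel_peak([0.0, 1.0, 2.0, 1.0, 0.0])
```

An asymmetric profile shifts the interpolated position between pixel centres, which is what makes sub-pixel surface positioning possible.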
If the object to be measured (i.e. the target) consists of several partly light-permeable and reflecting surfaces, such as, for example, a plastic film or a glass plate, the method creates an individual local intensity maximum for each of these surfaces, and the thickness of the film or plate can be determined on the basis of the difference between the positions of these intensity maxima when the refractive indexes of the materials involved in the measurement and the measurement geometry are known. The measurement geometry comprises the illuminating angles and imaging angles and the geometry of the object 2 under measurement with respect to said angles.
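As a minimal numerical illustration of the refractive-index correction mentioned above, assuming near-normal incidence on a plane-parallel plate (a deliberate simplification; the actual correction depends on the full measurement geometry):

```python
def plate_thickness(apparent_separation, refractive_index):
    """At near-normal incidence, refraction makes the second-surface
    reflection appear closer than it really is (apparent depth =
    true depth / n), so the true thickness is the measured peak
    separation multiplied by the refractive index n."""
    return apparent_separation * refractive_index

# A 0.40 mm apparent peak separation in glass (n = 1.5) corresponds
# to a true thickness of 0.60 mm.
thickness = plate_thickness(0.40, 1.5)
```

For the oblique illuminating and imaging angles of a real triangulation geometry, the scaling factor is angle-dependent rather than simply n, which is why the source requires the measurement geometry to be known.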
In some cases, it is also possible to determine positions of object points on sub-surfaces of the object, i.e. internal structures of the object depending on optical properties (like transparency and translucency) of the object for the light being directed to the object.
In respective figures, the same or corresponding parts are denoted by the same reference numerals, and in most cases duplicate textual description will be omitted as well.
Both the optical illuminating means and the optical imaging means of the device are configured to form their focus points or focus areas in the virtual measuring space such that the illuminating optics 11 focuses different points of the illuminating, comprising one or more output elements 4.1, and the imaging optics 13 focuses different points of the imaging area, comprising the light sensor 8, in a known way at different positions in the virtual measuring space. Each of the focus points or focus areas formed by the illuminating optics 11 from a certain location point of the illuminating (i.e. output element) differs from the others at least in the dominant wavelength or shape and/or is formed with a different optical aperture of the illuminating optics 11. Correspondingly, each of the focus points or focus areas formed by the imaging optics 13 from a certain location point of the imaging area (i.e. light sensor) differs from the others at least in the dominant wavelength or shape and/or is formed with a different optical aperture of the imaging optics 13. Coincident focus points or focus areas are formed such that at each coincident focus point or coincident focus area an illuminating focus point or focus area (i.e. a focus point or focus area formed by the illuminating optics 11) coincides (overlaps) with an imaging focus point or focus area (i.e. a focus point or focus area formed by the imaging optics 13) with the same common dominant wavelength or shape. Additionally, or alternatively, said focus points or focus areas are formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13. In other words, each of the coincident focus points or focus areas consists of an illuminating and an imaging focus point or focus area associated with the common dominant wavelength or shape and/or is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13.
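The pairing rule above (an illuminating focus and an imaging focus count as coincident only when they share the dominant wavelength and overlap in space) can be sketched as follows. The data layout and tolerance are hypothetical, chosen only to make the matching condition concrete:

```python
def coincident_points(illum_foci, imag_foci, tol=1e-6):
    """illum_foci and imag_foci are lists of (wavelength, (x, y, z))
    tuples. A coincident focus point exists wherever a pair shares
    the dominant wavelength and the same spatial position (within
    the tolerance tol)."""
    out = []
    for wl_i, pos_i in illum_foci:
        for wl_j, pos_j in imag_foci:
            same_position = all(abs(a - b) <= tol
                                for a, b in zip(pos_i, pos_j))
            if wl_i == wl_j and same_position:
                out.append((wl_i, pos_i))
    return out

# One shared 450 nm focus; the 650 nm foci do not overlap in space.
illum = [(450, (0.0, 0.0, 1.0)), (650, (0.0, 0.0, 2.0))]
imag = [(450, (0.0, 0.0, 1.0)), (650, (0.5, 0.0, 2.0))]
shared = coincident_points(illum, imag)
```

The set of all such matched pairs is, in effect, the virtual measuring space of the source text.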
Thereby, it is possible to form a plurality of coincident focus points or focus areas (i.e. common focus points or common focus areas), each of which is associated with the common dominant wavelength of the formed focus points or focus areas and/or is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13. When an object point of the surface of the object 2 to be measured coincides with one of the coincident focus points or coincident focus areas of the virtual measuring space so formed, the reflection generated from it is very strong in comparison with the light reflected from other object points of the surface of the object 2. With the imaging optics 13, the light reflected from the surface of the object 2 is directed to the light sensor 8, where the position of the local intensity maximum of the propagated light is detected and the position and intensity data of the detected light are formed into an electrical signal. The light sensor 8 can be, for example, a CCD or CMOS matrix, a position sensitive detector (PSD) or the like. The device determines by the processing means (not shown in
In various embodiments, the output element 4.1 may generally comprise or consist of an LED or laser based light source for illuminating and providing light.
In various embodiments, an individual output element may advantageously comprise a separate light source and a separate slit or pinhole through which light (i.e. illumination) from the separate light source is provided.
The provided light (i.e. optical radiation) from the output element can be electromagnetic radiation whose wavelength band lies between ultraviolet radiation (wavelength ca. 50 nm) and infrared radiation (wavelength ca. 1 mm).
The provided light can be white light (i.e. comprising all visible wavelengths) or can comprise either one wavelength or many different wavelengths.
In various embodiments, the provided light can be either coherent or incoherent light.
The illuminating optics 11 and imaging optics 13 may comprise a lens or lenses, mirrors or other optical components. Furthermore, the illuminating optics 11 and imaging optics 13 may be identical or non-identical.
In the device of the embodiment 100, longitudinal chromatic aberration is utilized in the illuminating optics 11 and the imaging optics 13. Due to the longitudinal chromatic aberration, different illuminating focus points from the location point P1(4.1) of the output element 4.1 at wavelengths λ1(4.1), λ2(4.1), λ3(4.1), λ4(4.1), λm(4.1), and different imaging focus points from the different location points between the end points P1(8) and Pn(8) of the light sensor 8 at wavelengths λ1(8), λ2(8), λ3(8), λ4(8), λm(8), are formed at different distances from said optics in the direction of the corresponding principal rays, see
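The wavelength dependence of the focus distance that this embodiment exploits can be sketched with a simple singlet-lens model. The Cauchy dispersion coefficients and radii below are illustrative assumptions (typical crown-glass values), not parameters of the described device:

```python
def singlet_focal_length(wavelength_um, r1, r2, a=1.5046, b=0.0042):
    """Thin-lens focal length versus wavelength: lensmaker's equation
    1/f = (n - 1)(1/R1 - 1/R2), with a Cauchy dispersion model
    n(lambda) = a + b / lambda**2 (lambda in micrometres). Shorter
    wavelengths see a higher index and therefore focus closer."""
    n = a + b / wavelength_um ** 2
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

# Biconvex lens with |R| = 100 mm (0.1 m): the blue focus lies closer
# to the lens than the red focus, spreading the foci along the
# principal ray exactly as longitudinal chromatic aberration requires.
f_blue = singlet_focal_length(0.45, 0.1, -0.1)
f_red = singlet_focal_length(0.65, 0.1, -0.1)
```

This spread of wavelength-dependent focus distances along a common principal ray is what forms the chain of illuminating (or imaging) focus points λ1 through λm described above.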
In this document, a set of different objects is denoted as "the first object"→"the last object", such that P1(8)→Pn(8) means a set of different location points between the extreme location points P1(8) and Pn(8) of the light sensor, wherein n can be any integer greater than 1.
The point-like output element 4.1 of the device in the embodiment 100 may advantageously be an LED based light source radiating white light.
In the embodiment 100, by adjusting the distance of the output element 4.1 from the illuminating optics 11, the displacement of the output element 4.1 from the optical axis of the illuminating optics 11, the distance of the light sensor 8 from the imaging optics 13 and the inclination of the light sensor 8 from the optical axis of the imaging optics 13, said optics are configured to form coincident focus points from the illuminating focus points of the location point P1(4.1) of the output element 4.1 and the imaging focus points of the different location points P1(8)→Pn(8) of the light sensor 8. At each coincident focus point, one illuminating focus point of the location point P1(4.1) coincides (i.e. meets) with one imaging focus point of one location point P1(8)→Pn(8) of the light sensor 8 with the same wavelength. In other words, the coincident focus point is associated with the common dominant wavelength of the coinciding focus points. Furthermore, in order to measure different positions of the intersecting object 2, the coincident focus points are formed so that each of the illuminating focus points of the location point P1(4.1) of the output element 4.1 which forms a coincident focus point coincides with such an imaging focus point of the location points P1(8)→Pn(8) of the light sensor 8 that lies along a different imaging coordinate axis A1(8)→An(8). The different imaging coordinate axes A1(8)→An(8) and the imaging focus points along these axes correspond to different location points P1(8)→Pn(8) of the light sensor 8.
Therefore, there are advantageously numerous (various) coincident focus points formed from the various illuminating focus points of the location point P1(4.1) of the output element 4.1, each of which coincides with an imaging focus point of the location points P1(8)→Pn(8) of the light sensor 8 positioned along a different imaging coordinate axis A1(8)→An(8).
In the embodiment 100, longitudinal (axial) chromatic aberration is produced by the lens or lenses of the illuminating optics 11 and imaging optics 13. Thereby, in the embodiment 100, the components λm(4.1), λm(8) of the red end of the light are in focus in the virtual measuring space under the surface of the object 2, and, correspondingly, the components λ1(4.1), λ1(8) of the blue end of the light are in focus above the object 2. In the embodiment 100, both the blue ends λ1(4.1), λ1(8) of the illuminating and imaging are in focus at the upper edge of the virtual measuring space. Accordingly, the red ends λm(4.1), λm(8) of the spectra are in focus at the bottom edge of the measuring space (under the object 2 at height h1 in
In the example of
In the device, the position of the received local maximum on the light sensor 8 and the location point of the output element from which the light reflected from the intersection of the object point and the coincident focus point originated are calibrated, based on triangulation, to correspond to a certain position of the object 2.
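Such a calibration could, for example, be realized as a lookup from the detected peak position to previously measured reference heights. The class, names, and interpolation scheme below are hypothetical, given only to illustrate the idea of a triangulation-based calibration table:

```python
import bisect

class TriangulationCalibration:
    """Hypothetical calibration: sensor peak positions recorded while a
    reference target sits at known heights; unknown heights are then
    recovered by linear interpolation between the recorded points."""

    def __init__(self, pixel_refs, height_refs):
        self.px = list(pixel_refs)  # peak pixel positions, ascending
        self.h = list(height_refs)  # corresponding known heights

    def height(self, pixel):
        # Find the bracketing calibration interval and interpolate.
        i = bisect.bisect_left(self.px, pixel)
        i = min(max(i, 1), len(self.px) - 1)
        p0, p1 = self.px[i - 1], self.px[i]
        t = (pixel - p0) / (p1 - p0)
        return self.h[i - 1] + t * (self.h[i] - self.h[i - 1])

# Calibrated at heights 0, 1 and 2 mm giving peaks at pixels 0, 100, 200:
cal = TriangulationCalibration([0.0, 100.0, 200.0], [0.0, 1.0, 2.0])
```

A real device would likely store one such mapping per output element location, since the text above pairs the sensor position with the originating output element.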
In the embodiment 100, both the illuminating and imaging are shown at symmetrical angles with respect to the surface of the object 2. The invention is, however, not restricted to this kind of symmetrical measuring situation; the angles of the illuminating and imaging with respect to the object surface can be of different sizes without affecting the measuring event. Although the angle between illuminating and imaging changes, coincident focus points defining the measuring space can nevertheless be formed such that at each coincident focus point an individual illuminating focus point coincides with an imaging focus point associated with a common dominant wavelength.
In the embodiment 100, the output element 4.1 is on the optical axis of the illuminating optics 11. The position of the illuminating and the output element 4.1 is, however, not restricted to this position; the output element 4.1 can also be located otherwise, as seen in the further embodiments.
In the embodiment 100, the spectrum of the light provided by the output element 4.1 does not need to be continuous, but can comprise, for example, sequences of different wavelengths or otherwise discontinuous light.
Switching over to the second embodiment 200, various features of the embodiment 200 are generally similar to those of the embodiment 100 and are not repeated here to avoid unnecessary redundancy.
However, the embodiment 200 differs from the embodiment 100 in that spherical aberration is provided in the lens or lenses of the illuminating optics 11 and imaging optics 13. In that case, said optics refract the light to different illuminating focus points θ1(4.1)→θm(4.1) and imaging focus points θ1(8)→θm(8) depending on where the light propagates through the illuminating optics 11 or imaging optics 13. Light beams propagating through the centre of the optics have their focus points θ1→m(4.1), θ1→m(8) further away than light beams propagating through the outer edges of the lens.
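The aperture dependence of the focus distance described above can be sketched with a simple third-order aberration model. The coefficient and numeric values are illustrative assumptions only:

```python
def focus_distance(aperture_height, paraxial_focus, sa_coefficient):
    """Illustrative third-order longitudinal spherical aberration model:
    a ray entering the lens at aperture height h focuses closer to the
    lens than a paraxial ray, with the focus shift growing as h**2."""
    return paraxial_focus - sa_coefficient * aperture_height ** 2

# Paraxial (central) rays focus at 100 mm (0.100 m); marginal rays at
# h = 20 mm focus measurably closer, spreading the focus points
# theta_1 ... theta_m along the principal ray by aperture zone.
f_central = focus_distance(0.000, 0.100, 5.0)
f_marginal = focus_distance(0.020, 0.100, 5.0)
```

Each aperture zone thus plays the role that each wavelength plays in the embodiment 100: it selects one focus point along the common principal ray.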
This is illustrated in
In the embodiment 200, the measuring is based on the spherical aberration of the lens or lenses, or generally, the spherical aberration of the optical components, of the illuminating optics 11 and imaging optics 13. The coinciding focus points can be arranged in the same way as illustrated in the example of
A third embodiment 300 differs from the embodiment 200 in that astigmatism is provided in the lens or lenses of the illuminating optics 11 and imaging optics 13. In that case, said optics refract the light to different focus areas φ1(4.1)→φm(4.1) and φ1(8)→φm(8) so that the focus from a certain location point appears as a line or ellipse shape instead of a point shape at different distances from said optics. Due to astigmatism, light beams lying in the tangential and sagittal planes are refracted by said optics differently, and both sets of beams intersect the corresponding principal ray for the corresponding location point at different focus areas depending on where the light propagates through the illuminating optics 11 or imaging optics 13. These light beams of the tangential and sagittal planes fail to produce a focused image point, but rather produce a series of elongated focus areas ranging from linear to elliptical, depending upon the distance from said optics.
This is shown in
In the embodiment 300, the measuring is based on the astigmatism of the lens or lenses of the illuminating optics 11 and imaging optics 13. The coinciding focus points can be arranged in the same way as illustrated in the example of
The illuminating optics 11 and the imaging optics 13 are arranged to form coincident focus points or coincident focus areas in the manner described above. Now, when the output elements 4.1 and 4.2 both have a line-like geometric shape, it is possible to arrange coincident focus points or focus areas by the illuminating optics 11 and imaging optics 13 into lines at different depths, which lines are formed from different location points of both said output elements and different location points of the light sensor 8b.
When longitudinal chromatic aberration is utilized in the illuminating optics 11 and imaging optics 13, there are lines of coincident focus points, with common dominant wavelengths of the coinciding illuminating and imaging focus points, at different depths in the virtual measuring space. Correspondingly, if spherical aberration is utilized, there are lines of coincident focus points at different depths, where each corresponding line of coincident focus points consists of coinciding illuminating and imaging focus points formed with the corresponding correlated (equal) optical apertures of the illuminating optics 11 and the imaging optics 13. Correspondingly, if astigmatism is utilized in the illuminating optics 11 and the imaging optics 13, there are lines of coincident focus areas at different depths, where each corresponding line of coincident focus areas consists of coinciding illuminating and imaging focus areas associated with the same shape and formed with the corresponding correlated (equal) optical apertures of the illuminating optics 11 and the imaging optics 13.
In
The intersection of the surface of the object 2 at the location of the coincident focus point Az(4.1)Axy(8) generates a reflection of the light originating from the output element location Pz(4.1), from which part of the reflected light is collected onto the location point Px(8b)Ky(8b) of the light sensor 8b, generating a local maximum in the formed reflection data. Respectively, the intersection of the surface of the object 2 at the location of the coincident focus point Az(4.2)Axx(8) generates a reflection of the light originating from the output element location Pz(4.2), from which part of the reflected light is collected onto the location point Px(8b)Kx(8b) of the light sensor 8b, generating a local maximum in the formed reflection data. In the same manner, a local maximum is formed from the other intersections of L1a and L1b, as illustrated by the corresponding lines on the light sensor 8b.
In various embodiments, the device comprises a plurality of line-like or point-like output elements 4.1→4.n or combinations thereof.
In embodiments which comprise various output elements, the illuminating optics 11 is configured to focus light from each of the location points P1→Pn of each of the output elements 4.1→4.n on a plurality of illuminating focus points or focus areas positioned at different distances from the illuminating optics 11 along the corresponding illuminating coordinate axis A1→An. Each of these axes is associated with the corresponding principal ray F1→Fn of the illuminating optics 11 for the corresponding location point P1→Pn of the corresponding output element 4.1→4.n. The corresponding principal ray F1→Fn is common to the plurality of illuminating focus points or focus areas focused from the corresponding location point P1→Pn of the corresponding output element 4.1→4.n. Each of the illuminating focus points or focus areas along the same corresponding illuminating coordinate axis A1→An for the corresponding location point P1→Pn of the corresponding output element 4.1→4.n differs from the others at least in the dominant wavelength or shape and/or is formed with a different optical aperture of the illuminating optics 11.
Furthermore, the illuminating optics 11 and imaging optics 13 are configured to form a plurality of coincident focus points or focus areas so that each of the various focus points or focus areas from the plurality of illuminating focus points or focus areas along the same corresponding illuminating coordinate axis A1→An for each of the output elements 4.1→4.n coincides at a coincident focus point or focus area with an imaging focus point or focus area positioned along a different imaging coordinate axis A1(8)→An(8), where the orientations of the illuminating coordinate axes A1→An for each of the output elements 4.1→4.n are different from the orientations of the imaging coordinate axes A1(8)→An(8).
In various embodiments, longitudinal chromatic aberration is utilized in the illuminating optics 11 to focus light from each of the location points P1→Pn of each of the output elements 4.1→4.n on a plurality of illuminating focus points positioned at different distances from the illuminating optics 11 along the corresponding illuminating coordinate axis A1→An associated with the corresponding principal ray F1→Fn of the illuminating optics 11 for the corresponding location point P1→Pn of the corresponding output element 4.1→4.n, so that each of the illuminating focus points along the same illuminating coordinate axis A1→An differs in the dominant wavelength. Correspondingly, longitudinal chromatic aberration is provided in the imaging optics 13 to form from each of the location points P1(8)→Pn(8) of the light sensor 8 a plurality of imaging focus points positioned at different distances from the imaging optics 13 along the corresponding imaging coordinate axis A1(8)→An(8) associated with the corresponding principal ray F1(8)→Fn(8) of the imaging optics 13 for the corresponding location point P1(8)→Pn(8) of the light sensor 8, so that each of the focus points along the same imaging coordinate axis A1(8)→An(8) differs in the dominant wavelength.
In those embodiments where a matrix light sensor is used, each imaging coordinate axis, principal ray and location point may be associated with the coordinate system of location points of the light sensor 8b like described for the embodiment 400.
When longitudinal chromatic aberration is provided in the illuminating optics 11 and the imaging optics 13, each of the coincident focus points consists of an illuminating focus point and an imaging focus point associated with the common dominant wavelength.
In various embodiments, spherical aberration is provided in the illuminating optics 11 to focus light from each of the location points P1→Pn of each of the output elements 4.1→4.n on a plurality of illuminating focus points positioned at different distances from the illuminating optics 11 along the corresponding illuminating coordinate axis A1→An associated with the corresponding principal ray F1→Fn of the illuminating optics 11 for the corresponding location point P1→Pn of the corresponding output element 4.1→4.n, so that each of the illuminating focus points along the same illuminating coordinate axis A1→An is formed with a different optical aperture of the illuminating optics 11. Correspondingly, spherical aberration is provided in the imaging optics 13 to form from each of the location points P1(8)→Pn(8) of the light sensor 8 a plurality of imaging focus points positioned at different distances from the imaging optics 13 along the corresponding imaging coordinate axis A1(8)→An(8) associated with the corresponding principal ray F1(8)→Fn(8) of the imaging optics 13 for the corresponding location point P1(8)→Pn(8) of the light sensor 8, so that each of the focus points along the same imaging coordinate axis A1(8)→An(8) is formed with a different optical aperture of the imaging optics 13.
When spherical aberration is provided in the illuminating optics 11 and the imaging optics 13, each of the coincident focus points is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13.
In various embodiments, astigmatism is provided in the illuminating optics 11 to focus light from each of the location points P1→Pn of each of the output elements 4.1→4.n on a plurality of illuminating focus areas with different shapes positioned at different distances from the illuminating optics 11 along the corresponding illuminating coordinate axis A1→An associated with the corresponding principal ray F1→Fn of the illuminating optics 11 for the corresponding location point P1→Pn of the corresponding output element 4.1→4.n, so that each of the illuminating focus areas along the same illuminating coordinate axis A1→An is formed with a different optical aperture of the illuminating optics 11. Correspondingly, astigmatism is provided in the imaging optics 13 to form from each of the location points P1(8)→Pn(8) of the light sensor 8 a plurality of imaging focus areas with different shapes positioned at different distances from the imaging optics 13 along the corresponding imaging coordinate axis A1(8)→An(8) associated with the corresponding principal ray F1(8)→Fn(8) of the imaging optics 13 for the corresponding location point P1(8)→Pn(8) of the light sensor 8, so that each of the imaging focus areas along the same imaging coordinate axis A1(8)→An(8) is formed with a different optical aperture of the imaging optics 13.
When astigmatism is provided in the illuminating optics 11 and the imaging optics 13, each of the coincident focus areas consists of an illuminating focus area and an imaging focus area associated with the common shape, and is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13.
In the measurement of optical properties, the intensity of the local maximum is determined at a certain object point of the object under measurement. The intensity of the local maximum is proportional to the reflectivity of the object point from which the local maximum originates. Furthermore, in cases where the object point is on the surface of the object, determining the intensity of the local maximum yields information about the reflectivity of the surface.
When longitudinal chromatic aberration is utilized in the illuminating optics 11 and the imaging optics 13, it is possible to calibrate the location of each coincident focus point in the measuring space together with the wavelength of the associated local maximum. By measuring the position of the object, such as the surface height, information is thereby obtained about the reflectivity of the object at that position at a specific, known wavelength. Furthermore, when the device comprises several output elements there are several measuring geometries, whereby the object can be measured simultaneously using a diversity of wavelengths.
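The calibration described above can be illustrated with a minimal sketch. The table, the function name and the simple intensity-ratio reflectivity estimate are hypothetical, not taken from the disclosure; the sketch only shows how a calibrated peak location on the light sensor resolves both a surface height and a known wavelength, so that the measured peak intensity yields a reflectivity at that wavelength.

```python
# Hypothetical calibration for one output element: sensor pixel of a local
# maximum -> (surface height z in mm, dominant wavelength in nm).
CALIBRATION = {
    100: (1.00, 450.0),
    150: (1.50, 500.0),
    200: (2.00, 550.0),
}

def reflectivity_at_known_wavelength(peak_pixel, peak_intensity, emitted_intensity):
    """Resolve height and wavelength from the calibrated peak location, then
    estimate reflectivity as the ratio of detected to emitted intensity."""
    z_mm, wavelength_nm = CALIBRATION[peak_pixel]
    reflectivity = peak_intensity / emitted_intensity
    return z_mm, wavelength_nm, reflectivity
```

A peak at pixel 150 carrying 40 % of the emitted intensity would thus report the surface at 1.5 mm with reflectivity 0.4 at 500 nm.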
When the object 2 has a surface height hb, the surface intersects the virtual measuring space at the position of object point 2.1b and coincident focus point A1(4.1)Aw(8). This intersection generates a local maximum, shown in
In one embodiment, the wavelength of the local maximum can be determined by using an RGB camera as the light sensor 8.
In an embodiment which comprises more than one output element 4.1→4.n and utilizes longitudinal chromatic aberration in the illuminating optics 11 and the imaging optics 13, the device is configured to determine the wavelength of the local maximum of the intensity distribution of the light detected by the light sensor 8, in order to distinguish from which output element 4.1→4.n the light of the local maximum is provided and thereby to determine the position of the intersected object point 2.1. The light sensor 8 may be an RGB camera configured to determine the wavelength of the local maximum.
In various embodiments the light sensor 8 may be a CCD or CMOS sensor or the like.
The device may also be called a measuring device, an imaging device, a triangulating device or an optical device.
The output element of the device may also be called an illuminating element or a light source element.
The distribution of the coincident focus points defines a measuring space for determining the position of an object point 2.1 of the object 2 being measured.
In an embodiment the position of an object point of the object (2) being measured is determined for at least one of the x, y or z coordinates of the xyz coordinate system associated with the distribution of the coincident focus points.
In one embodiment, a thickness between a first surface (2f) and a second surface (2s) of the object (2), which is at least partially transparent or translucent to the light being directed to the object (2), is determined from the position difference between the local maximum of the intensity distribution of the detected light on the light sensor 8 resulting from the first surface (2f) and the local maximum resulting from the second surface (2s).
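A sketch of this thickness determination, under the assumption that the two peak locations have already been converted to calibrated heights; the optional refractive-index correction for the apparent optical depth is an added assumption, not stated in the disclosure.

```python
def thickness_between_surfaces(z_first_mm, z_second_mm, refractive_index=1.0):
    """Thickness of a transparent or translucent object from the calibrated
    heights of the two local maxima (first and second surface). For light
    refracted inside the material, the apparent depth can be scaled by the
    material's refractive index (an illustrative correction)."""
    return abs(z_first_mm - z_second_mm) * refractive_index
```

For example, peaks calibrated to heights 2.0 mm and 1.5 mm give an apparent thickness of 0.5 mm; with an assumed refractive index of 1.5 the corrected thickness would be 0.75 mm.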
At 802, light is focused from each of the location points of each of the output elements on illuminating focus points or focus areas positioned at different distances from the illuminating optics along the corresponding illuminating coordinate axis associated with the corresponding principal ray of the illuminating optics for the corresponding location point of the corresponding output element. The corresponding principal ray is common to the plurality of illuminating focus points or focus areas focused from the corresponding location point of the corresponding output element. Each of the illuminating focus points or focus areas along the same corresponding illuminating coordinate axis for the corresponding location point of the corresponding output element differs from the others at least in the dominant wavelength or shape and/or is formed with a different optical aperture of the illuminating optics.
Further at 802, a plurality of different imaging focus points or focus areas are formed from each of the location points of the light sensor at different distances from the imaging optics along an imaging coordinate axis associated with a corresponding principal ray of the imaging optics for the corresponding location point of the light sensor. The corresponding principal ray is common to the plurality of imaging focus points formed from the corresponding location point of the light sensor. Each of the imaging focus points or focus areas along the same imaging coordinate axis differs from the others at least in the dominant wavelength or shape and/or is focused with a different optical aperture of the imaging optics.
Further at 802, a plurality of coincident focus points or focus areas are formed so that each of the illuminating focus points or focus areas along the same corresponding illuminating coordinate axis for each of the output elements coincides, at a coincident focus point or focus area, with an imaging focus point or focus area positioned along a different imaging coordinate axis. The orientations of the illuminating coordinate axes for each of the output elements are configured to be different from the orientations of the imaging coordinate axes. Each of the coincident focus points or focus areas consists of an illuminating and an imaging focus point or focus area associated with the common dominant wavelength or shape and/or is formed with the correlated optical apertures of the illuminating optics and the imaging optics.
At 804, the intensity values of the light collected from the object by the imaging optics are detected.
At 806, the positions of the object points are determined from the locations of the local maxima of the intensity distribution of the detected light.
At 808, the optical properties of the object points are determined from the intensities or the wavelengths of the local maxima of the intensity distribution of the detected light.
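Steps 804 to 808 can be sketched as a plain one-dimensional peak search over the detected intensity values. This is only an illustration: a real implementation would use sub-pixel fitting and the device calibration, and the noise threshold parameter is an assumption, not from the disclosure.

```python
def find_local_maxima(intensity, threshold=0.0):
    """Return indices of strict local maxima of the detected intensity
    distribution that exceed a noise threshold."""
    peaks = []
    for i in range(1, len(intensity) - 1):
        # A strict local maximum: higher than both neighbours and above noise.
        if intensity[i] > threshold and intensity[i - 1] < intensity[i] > intensity[i + 1]:
            peaks.append(i)
    return peaks
```

Each returned index would then be mapped through the calibration to an object-point position (806) and, together with its intensity or wavelength, to optical properties (808).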
Steps 806 and 808 are optional steps, of which either 806 or 808, or alternatively both 806 and 808, can be accomplished.
It is obvious to someone skilled in the art that the measuring process can return to step 802, whereby a continuous measuring process is produced.
In a method according to an embodiment, the method comprises a step of determining a thickness between a first surface and a second surface of the object, which is at least partially transparent or translucent to the light being directed to the object, from the position difference between an object point on the first surface of the object and an object point on the second surface of the object.
In a method according to an embodiment, the method comprises a step where the illuminating optics focuses each of the illuminating focus points along the same corresponding illuminating coordinate axis for the corresponding location point of the corresponding output element, differing from each other at least in the dominant wavelength, by using longitudinal chromatic aberration. Further, the imaging optics focuses each of the imaging focus points or focus areas along the same imaging coordinate axis, differing from each other at least in the dominant wavelength, by using longitudinal chromatic aberration.
In a method according to an embodiment, the method comprises a step where the illuminating optics focuses each of the illuminating focus points along the same corresponding illuminating coordinate axis for the corresponding location point of the corresponding output element with different optical apertures of the illuminating optics by using spherical aberration. Further, the imaging optics focuses each of the imaging focus points along the same imaging coordinate axis with different optical apertures of the imaging optics by using spherical aberration.
In a method according to an embodiment, the method comprises a step where the illuminating optics focuses each of the illuminating focus areas along the same corresponding illuminating coordinate axis for the corresponding location point of the corresponding output element, differing from each other in shape, by using astigmatism. Further, the imaging optics focuses each of the imaging focus areas along the same imaging coordinate axis, differing from each other in shape, by using astigmatism.
In a method according to an embodiment, the method comprises steps of arranging the light sensor substantially perpendicular to an optical axis of the imaging optics, and disposing a plurality of output elements on a plane surface. The plane surface forms an oblique angle to an optical axis of the illuminating optics; preferably, the plurality of output elements consists of a plurality of line-like output elements.
If the output elements 4.1→4.5 are line-like output elements, they extend in the x-direction in the coordinate system 199 of the device.
The output elements are disposed on a plane surface 5, aligned with each other. In this embodiment 900, the plane surface 5 comprising said more than one output elements is disposed at an oblique angle α with respect to an optical axis 11′ of the illuminating optics 11. The light sensor 8, which can be either a line sensor or a matrix sensor depending on the application, is disposed at a substantially perpendicular angle β with respect to an optical axis 13′ of the imaging optics 13, as shown in
Illuminating and imaging focus points, and the coincident focus points formed from them, are provided similarly as in the previous embodiments from the location points of each output element 4.1→4.5 by providing longitudinal chromatic aberration in the illuminating optics 11 and the imaging optics 13. Alternatively, spherical aberration or astigmatism, or any combination of these three alternatives for forming focus points and focus areas, can be used in the device 900.
In the device 900 of
As in other embodiments of this disclosure, in the device 900 each output element 4.1→4.n is calibrated separately such that each position of the coincident focus points associated with the different location points P1→Pn of the output elements 4.1→4.n corresponds to a certain wavelength λ1→λm and a certain location point on the light sensor 8.
In the device 900 the coincident focus points are restricted between the focal plane FP8(λ1) of the shortest wavelength λ1 and the focal plane FP8(λm) of the longest wavelength λm. This restriction of the coincident focus points between those focal planes can be done in various ways. For example, a (band-pass) filter can be disposed in front of the output elements 4.1→4.n to restrict the light emitted therefrom. Alternatively or additionally, a (band-pass) filter can be disposed in front of the light sensor 8 to restrict the light detected thereon. Alternatively, the output elements 4.1→4.n can be configured to emit light only in wavelengths which are within the restricted range of λ1-λm.
Since the total Z-measurement range of the device 900 consists of subranges associated with the individual output elements, the total Z-measurement range can be controlled by the number of the output elements 4.1→4.n.
It is possible to adjust the spacing of the adjacent output elements in such a way that the coincident focus points cover the whole Z-range of the measuring space. Further it is possible to adjust the spacing of the adjacent output elements in such a way that there are at least two different coincident focus points for each height position (i.e. Z-coordinate) of the measuring space.
There are various advantages that can be achieved with the device 900. One advantage is that it may simplify the imaging optics 13 and may also improve the quality of the imaging, since the light to be detected is arranged to reach the sensor surface substantially at a right angle.
Another advantage is that the coincident focus points can be formed with relatively large numerical apertures of the illuminating and imaging optics. This makes it possible to measure glossy, tilted surfaces. As a reference, with a typical laser-triangulation 3D sensor that utilizes collimated laser light to illuminate the object under measurement, already a slightly tilted glossy surface can direct all the light outside the imaging aperture, making the 3D measurement of a tilted glossy surface impossible.
Another advantage is that the total Z-measurement range of the device can be increased without increasing longitudinal chromatic aberration in the imaging and illuminating optics, since the measurement range can be increased by increasing the number of the output elements. Consequently, it is possible to increase the Z-measurement range of a device whose output elements (i.e. light sources) are configured to emit light in a narrow wavelength band. For example, the longitudinal chromatic aberration of the imaging optics can be as short as 200 μm while still providing a Z-measurement range of e.g. 4 mm.
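As an arithmetic illustration of the paragraph above: if each output element contributes a Z-subrange set by the short longitudinal chromatic aberration, the element count needed for a target total range follows directly. The function name and the overlap parameter (for partially overlapping subranges) are illustrative assumptions.

```python
import math

def output_elements_needed(total_z_range_mm, subrange_mm, overlap=0.0):
    """Number of output elements whose Z-subranges tile the total
    Z-measurement range; `overlap` is the fraction of each subrange
    shared with its neighbour."""
    effective = subrange_mm * (1.0 - overlap)
    return math.ceil(total_z_range_mm / effective)
```

With the example figures above, a 200 μm chromatic subrange tiling a 4 mm range calls for 20 output elements; requiring every height to fall in at least two subranges (50 % overlap) doubles that.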
For example, the output elements of the device can be configured to emit only blue light and hence longitudinal chromatic aberration of the optics of such a device can be adjusted for those blue wavelengths.
As described hereinabove, alternatively or additionally to the longitudinal chromatic aberration, spherical aberration and/or astigmatism can be used in the optics of the device 900. In these cases, the device can be realized similarly as shown in the example of
As in other embodiments of this disclosure utilizing spherical aberration or astigmatism, each output element 4.1→4.n of the device can be calibrated separately such that each position of the coincident focus points or areas associated with the different location points P1→Pn of the output elements 4.1→4.n is calibrated to correspond to certain correlated optical apertures of the illuminating and imaging optics and a certain location point on the light sensor 8.
In an embodiment, the light sensor 8 is a line scan camera or a matrix camera disposed substantially perpendicular to an optical axis 13′ of the imaging optics 13, and the plurality of output elements 4.1→4.n are disposed on a plane surface 5, wherein the plane surface 5 forms an oblique angle to an optical axis 11′ of the illuminating optics 11.
In order to acquire a surface profile or topography for the object under measurement, the object has to be scanned through the measuring space of the device. The surface profile is then producible as a reconstruction of position measurements (i.e. height or Z-coordinate) of individual object points of the object 2. If a 3D measurement of the object is desired, the device advantageously comprises at least one line-like output element 4.1→4.5. How many and which kind of output elements 4.1→4.5 the device includes depends on the application. What is relevant in acquiring a surface measurement is that the output elements 4.1→4.5 of the device are designed such that all object points which are to be measured intersect at least one coincident focus point or area.
The scanning measurement can be actualized in such a way that the object to be measured is moved (i.e. scanned) through the measuring space defined by the coincident focus points or areas of the device. In this kind of scanning the object can be moved by a linear actuator or a conveyor or the like, for example. Alternatively, the object can be kept in place while the scanning is actualized by moving the device, for example on a rail or with the aid of any other kind of moving mechanism or means, so that the object points of the surface of the object are intersected with at least one coincident focus point of the device.
The predetermined scanning path and the Z-measurement sub-ranges of the device can be adjusted so that each surface point of the object being measured will pass through at least one coincident focus point regardless of the Z-position of the object point within the Z-measurement range.
In a method according to an embodiment where the coincident focus points are formed by utilizing longitudinal chromatic aberration in the illuminating 11 and imaging optics 13, the method comprises the following step: moving the object 2 with respect to the device so that the object point 2.1 is intersected with at least one coincident focus point, wherein each of the at least one coincident focus point has the same common dominant wavelength.
In a method according to an embodiment where the coincident focus points are formed by utilizing spherical aberration in the illuminating 11 and imaging optics 13, the method comprises the following step: moving the object 2 with respect to the device so that the object point 2.1 is intersected with at least one coincident focus point, wherein each of the at least one coincident focus point is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13.
In a method according to an embodiment where the coincident focus areas are formed by utilizing astigmatism in the illuminating 11 and imaging optics 13, the method comprises the following step: moving the object 2 with respect to the device so that the object point 2.1 is intersected with at least one coincident focus area, wherein each of the at least one coincident focus area consists of illuminating and imaging focus areas associated with the common shape and is formed with the correlated optical apertures of the illuminating optics 11 and the imaging optics 13.
Determining intensity values for the light reflected from a certain point of the object 2 at a plurality of different wavelengths with the device according to the invention can be done when the device comprises more than one output element and when the coincident focus points are formed by utilizing longitudinal chromatic aberration in the illuminating optics 11 and the imaging optics 13. In other words, there are a plurality of coincident focus points which are formed from more than one output element. Said coincident focus points are distributed such that there are lines of coincident focus points that are spatially separated from each other and are each associated with a certain location point of a certain output element 4.1→4.5.
Furthermore, in each line of coincident focus points, the coincident focus points are arranged in increasing order of wavelength, at different distances from the illuminating optics 11 and at different heights (i.e. Z-coordinates).
When determining intensity values for the light reflected from a certain object point of the object 2 at a plurality of different wavelengths, the object point is scanned through the measuring space so that it is intersected with different coincident focus points, each having a different dominant wavelength and being associated with a different output element as compared to the other intersected coincident focus points.
As described elsewhere herein, the light reflected from the intersection of the object point with a coincident focus point produces a local intensity maximum, from whose location on the light sensor 8 the position of the object point can be determined. The processor (not shown in the figures) of the device can also be configured to determine the wavelength of the light reflected from that intersection, corresponding to the light emitted to said coincident focus point involved in the intersection. This can be determined based on the location of the local maximum and on information about which output element 4.1→4.5 provided the light of the local maximum. Therefore, the device can be configured to determine the intensity value and the corresponding wavelength of the light reflected from the object point that intersected a certain coincident focus point. The information about which output element 4.1→4.5 provided the light of the local maximum can be acquired when the object 2 is moved through the measuring space along a predetermined path, whereby the coordinate of the object point in the movement or scanning direction is known at different time instances. This can, for example, be encoded in the means actualizing the scanning of the object 2.
Alternatively, the information about which output element 4.1→4.5 provided the light of the local maximum can be acquired by switching the output elements emitting the light associated with the intersected coincident focus points on and off in a manner synchronized with the scanning of the object.
As described hereinabove, a device utilizing longitudinal chromatic aberration in its illuminating 11 and imaging optics 13 to form coincident focus points can be calibrated such that each position of the coincident focus points associated with the different location points P1→Pn of the individual output elements 4.1→4.n corresponds to a certain wavelength λ1→λm and a certain location point on the light sensor 8. Therefore, it is possible to determine the wavelength of the local maximum based on such a calibration, on the location information of the local maximum on the light sensor 8, and on information about which output element 4.1→4.n provided the light of the local maximum.
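A minimal sketch of such a calibration lookup. The two-level table (output element, then sensor peak location) and all numeric values are hypothetical; the point it illustrates is that the same sensor location resolves to a different wavelength and height depending on which output element provided the light, which is why the element identity must be known.

```python
# Hypothetical calibration: output element -> {sensor pixel of a local
# maximum: (wavelength in nm, surface height in mm)}. Note that the same
# pixel means different things for different output elements.
CALIBRATION = {
    "4.1": {100: (450.0, 1.0), 110: (500.0, 1.2)},
    "4.2": {100: (470.0, 2.0), 110: (520.0, 2.2)},
}

def resolve_local_maximum(element_id, peak_pixel):
    """Wavelength and surface height of a local maximum, given which output
    element provided its light (known from scan timing or source switching)."""
    return CALIBRATION[element_id][peak_pixel]
```

Here a peak at pixel 110 reports 500 nm at height 1.2 mm if element 4.1 produced it, but 520 nm at 2.2 mm if element 4.2 did.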
Consequently, within the same time as the surface profile of the object is acquired in the scanning measurement, it is advantageously possible to determine intensity values for the light reflected from the surface points at a plurality of different wavelengths. Furthermore, it is advantageously possible to increase the number of intensity values determined at different wavelengths for the light reflected from a certain object point by increasing the number of coincident focus points intersected with that object point, which can be done by increasing the number of output elements.
One further advantage is that when the same surface point is measured at several wavelengths, this also improves the resolution of the acquired surface profile measurement due to a reduced uncertainty caused by noise, compared to a measurement where each of the measured object points is intersected with only one coincident focus point.
In a method according to an embodiment, the method comprises a step of determining an intensity value for the light reflected from the object point 2.1 with at least two different wavelengths, the step comprising the following phases:
When the adjacent output elements 4.1→4.n are spaced far enough from each other, the local intensity maxima (i.e. peaks) of detected light provided from these adjacent, different output elements 4.1→4.n cannot overlap on the light sensor 8. In this case the calibration of the device can be realized unambiguously, and for each local intensity maximum on the light sensor 8 there is just one known output element 4.1→4.n from which the light of the local intensity maximum may be provided. This kind of case is illustrated in
One way to space or dispose adjacent output elements 4.1→4.n more densely compared to the design shown in
In the figure, the reference sign D depicts the distance or spacing between adjacent lines of coincident focus points, which in this example is adjusted to be the same between each pair of adjacent lines. This means that the output elements in this example are disposed at equal distances from each other. It has to be noted that the distances between adjacent output elements of the device do not need to be the same but can be varied depending on the application.
When it is known how much the object has to be moved along the predetermined path 1010 in the scanning direction between two successive time instances, and the corresponding spacing D of the adjacent lines of coincident focus points is known (i.e. the distances between the output elements are known), the period of time dT between two successive time instances tn-1 and tn (i.e. dT = tn − tn-1) is also known.
At the first time instance t1, the object point of the object 2 intersects with a coincident focus point of one line of coincident focus points S4.n(λ1→m). This intersection causes a local intensity maximum at a certain location (e.g. an XY-location) of the light sensor 8. Because the trajectory of the object is known, due to the predetermined path 1010, the same physical object point causes another local intensity maximum at a new location (e.g. a new XY-location) of the light sensor at the second time instance t2 = t1 + dT1, which new location can be calculated from the preceding location. This procedure can be repeated at new time instances t1 + dT1 + dT2 + dT3 . . . until the same object point has moved through the whole measuring space. When the object 2 has passed through the whole measuring space, no more signal (i.e. intensity maximum) is detected. In this example there are five intersections of the object point with five coincident focus points of different lines of coincident focus points, and hence five local intensity maxima are detected at five different time instances t1→t5. Each output element 4.1→4.n associated with these intersections can then be identified by fitting the measured intersections to a proper set of five corresponding output elements that provide the light of the detected successive local intensity maxima on the light sensor 8. The surface height at the position of the object point can be determined from the calibration, based on the location of the local intensity maximum on the light sensor 8 and the information about which output element provided the light that caused said local intensity maximum. The wavelength of said local maximum can be determined based on the same location and output element information.
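The timing relation above (one period dT per pair of adjacent lines of coincident focus points, dT = D / scan speed) can be sketched as follows; the uniform-speed scan and the function name are assumptions for illustration.

```python
def intersection_times(t_first_s, line_spacings_mm, scan_speed_mm_s):
    """Predicted time instances at which one object point intersects
    successive lines of coincident focus points, given the first detection
    time, the spacings D between adjacent lines, and the scan speed."""
    times = [t_first_s]
    for spacing in line_spacings_mm:
        # Each spacing D contributes a period dT = D / v to the next instant.
        times.append(times[-1] + spacing / scan_speed_mm_s)
    return times
```

For example, five lines spaced 4 mm apart scanned at 2 mm/s give predicted detections 2 s apart; local maxima found at those instants can then be fitted to the corresponding set of output elements.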
The position of the object point can thus be determined, along with other properties such as the wavelength and intensity of a certain local intensity maximum, as many times as the object point intersects a line of coincident focus points, which improves e.g. the determination of the height (i.e. Z-coordinate) of the object point. In this example the height of the object point is therefore determined five times.
As mentioned hereinabove, the spacing between a certain pair of adjacent output elements 4.1→4.n can be different from the spacing between the other adjacent output elements in the device. This dissimilarity in the spacing of the adjacent output elements 4.1→4.n can be used in identifying which output element provided the light of a certain local maximum. For example, if there were one dissimilar spacing of the adjacent output elements in the device of the example of
In this case, four local intensity maxima are determined at time instances t1→t5, whereby the dissimilarity in the spacing of the adjacent output elements 4.1→4.n identifies the missing local intensity maximum as the first or the last in the order of the detected local intensity maxima. In the case where a local intensity maximum is missing between the first and the last local intensity maximum (i.e. the reflectivity of the surface of the object is low at the corresponding wavelength), the location of the missing local intensity maximum can be determined based on the determined periods of time between successive time instances and on the total time between the first and last time instances.
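The identification of a missing local intensity maximum from one dissimilar spacing can be sketched as an interval-matching search; the function, its matching strategy and the tolerance are illustrative assumptions, not from the disclosure.

```python
def locate_missing_peak(observed_times, expected_dts, tol=1e-6):
    """Given the times of the detected local maxima with exactly one peak
    missing, and the expected inter-peak periods (one of them dissimilar),
    return the index of the missing peak, or None if no candidate matches."""
    # Expected peak times relative to the first expected peak.
    full = [0.0]
    for dt in expected_dts:
        full.append(full[-1] + dt)
    observed_dts = [b - a for a, b in zip(observed_times, observed_times[1:])]
    # Try removing each candidate peak and compare the remaining intervals.
    for missing in range(len(full)):
        remaining = full[:missing] + full[missing + 1:]
        dts = [b - a for a, b in zip(remaining, remaining[1:])]
        if len(dts) == len(observed_dts) and all(
            abs(x - y) < tol for x, y in zip(dts, observed_dts)
        ):
            return missing
    return None
```

With expected periods [1, 1, 1, 2] (the last spacing dissimilar), four detections one period apart can only mean the last of the five peaks is missing; a gap in the middle is likewise located by its longer observed interval. With fully uniform spacings the search would be ambiguous, which is the motivation for the dissimilar spacing.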
In an embodiment, at least one spacing between the adjacent output elements 4.1→4.n is dissimilar to other spacings between the adjacent line-like output elements 4.1→4.n.
The invention has been explained above with reference to the above embodiments, and several advantages of the invention have been demonstrated. The disclosure is not limited to the embodiments described above, but the inventive idea can be applied in numerous ways within the scope of the claims.
The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated.
Number | Date | Country | Kind |
---|---|---|---|
20205942 | Sep 2020 | FI | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/FI2021/050644 | 9/29/2021 | WO |