I. Technical Field
The instant disclosure relates generally to a system for imaging-based profile and/or dimensional measurement of an object.
II. Description of Related Art
This background description is set forth below for the purpose of providing context only. Therefore, any aspects of this background description, to the extent that they do not otherwise qualify as prior art, are neither expressly nor impliedly admitted as prior art against the instant disclosure.
Systems for imaging-based profile measurement of two-dimensional (2D) and three-dimensional (3D) objects are in wide use. Such imaging-based profile measurement systems can be found in many applications, ranging from measurement and process control to modeling and guidance.
Among many technical approaches, a very common technique is known as triangulation, in which a structured light pattern, such as a bright dot, a bright line, a cross, a circle, a plurality of them, or a combination of them, is projected onto an object surface of interest from one angle, and one or more imaging sensors are used to view the reflected light pattern(s) from a different angle. The angular difference forms the basis for solving for the distance(s) from the object surface that reflects the light pattern to the light-pattern-generating source. The imaging sensor is typically a camera, although other photosensitive sensors may be used. The most common light source is a laser, for generating a laser dot, a laser line, or other patterns. Other light sources generating a similar lighting effect, known as structured lighting, can be used as well. A similar imaging-sensor setup may also be applied in situations where mechanical interference with another object, such as an ink-dispensing needle, must be avoided while measurements are taken of patterns in the plane perpendicular to that object (e.g., the needle). A uniform area illumination, either directional (with predetermined incident angles) or non-directional (i.e., cloudy-day illumination), may be used in this case.
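For example only, the following minimal sketch (in Python, with hypothetical names and values that are not part of this description) illustrates the basic triangulation relation for a single projected dot, under the simplifying assumption that the light source axis is parallel to the camera's optical axis at a known baseline:

    # Single-point laser triangulation sketch (hypothetical values).
    # With the laser axis parallel to the optical axis at baseline b,
    # similar triangles give x / f = b / Z for the dot's image offset x,
    # so the distance to the reflecting surface is Z = f * b / x.

    def triangulate_depth(pixel_offset_px, pixel_pitch_mm, focal_len_mm, baseline_mm):
        """Distance Z (mm) to the surface that reflects the projected dot."""
        x_mm = pixel_offset_px * pixel_pitch_mm  # dot offset on the sensor
        if x_mm == 0:
            raise ValueError("zero offset: surface at infinity")
        return focal_len_mm * baseline_mm / x_mm

    # Example: 5-micron pixels, 16 mm lens, 100 mm baseline, dot imaged
    # 200 px from the principal point -> Z = 16 * 100 / 1.0 = 1600 mm.
    print(triangulate_depth(200, 0.005, 16.0, 100.0))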
The approach of placing the imaging sensor at an angle to the lighting or lighted plane is well known and well practiced. However, it has some undesired characteristics. First, the angular difference between the projection of the light or light pattern (from the light source) and the viewing of the reflected light or light pattern by the imaging sensor, referred to as the measurement angle, is critical to the measurement resolution. Known approaches that use an angle of less than 90 degrees enlarge the image scale in one portion of the imaging plane and reduce it in another; this results in reduced overall image resolution. Second, the measurement angle also determines the complexity of the mathematical model in the triangulation calculations. For measurement angles of less than 90 degrees, the model is complex in that it involves at least trigonometric transformations.
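For example only, the following sketch (Python; all names and numbers are hypothetical) contrasts the trigonometric model required at an oblique measurement angle with the single scale factor that suffices when the sensor is parallel to the measurement plane:

    # Mapping a sensor coordinate u to a position on the measurement
    # plane (pinhole model, hypothetical values). An oblique view needs
    # trigonometric transformations; a parallel view needs only one
    # magnification factor.
    import math

    def plane_pos_oblique(u_mm, focal_mm, standoff_mm, theta_deg):
        """Camera axis tilted theta_deg from the plane normal."""
        t = math.radians(theta_deg)
        phi = math.atan2(u_mm, focal_mm)  # ray angle within the camera
        return standoff_mm * (math.tan(t + phi) - math.tan(t))

    def plane_pos_parallel(u_mm, focal_mm, standoff_mm):
        """Sensor parallel to the plane: a single scale factor suffices."""
        return u_mm * standoff_mm / focal_mm

    print(round(plane_pos_oblique(1.0, 16.0, 500.0, 45.0), 2))  # ~66.7 mm
    print(round(plane_pos_parallel(1.0, 16.0, 500.0), 2))       # 31.25 mm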
There is therefore a desire for an improved profile and dimension measurement system and method that overcomes one or more of the problems set forth above.
The foregoing discussion is intended only to illustrate the present field and should not be taken as a disavowal of claim scope.
Embodiments consistent with the teachings of the instant disclosure provide a system and method for profile and dimension measurement of an object that feature, at least, parallelism between a light plane (or a lighted plane) and an imaging plane (in which an image is captured), by virtue of an offset image acquisition assembly having, at least, a shifted lens.
In an embodiment, a system is provided for determining a profile of an object. The system includes an illumination assembly, an image acquisition assembly, and a data unit. The illumination assembly is configured to project a light plane onto an outer surface of the object. The image acquisition assembly includes an imaging sensor and a lens. The imaging sensor is configured to capture an image of an imaging plane wherein the imaging plane is substantially parallel to and lies in the light plane. The lens has a principal axis and is disposed between the light plane and the imaging sensor. The lens is positioned relative to the imaging sensor such that the principal axis is offset from a sensor axis wherein the sensor axis is substantially perpendicular to the imaging sensor and passes through a center of the imaging sensor. The data unit, typically a computer with a display or a display alone, is configured to receive the captured image and form the profile using at least the captured image.
A method of forming a profile of an outer surface of an object is also presented.
In another embodiment, a system is provided for determining planar features of an object, such as 2D projection dimensions or shapes, when the principal axis (the axis perpendicular to and passing through the center of the object) is occupied by another object. The system includes an illumination assembly, an image acquisition assembly, and a data unit. The illumination assembly is configured to project planar light onto the surface of the object to form a lighted plane. The image acquisition assembly includes an imaging sensor and a lens. The imaging sensor is configured to capture an image of an imaging plane wherein the imaging plane is substantially parallel to and lies in the lighted plane. The lens has a principal axis and is disposed between the lighted plane and the imaging sensor. The lens is positioned relative to the imaging sensor such that the principal axis is offset from a sensor axis wherein the sensor axis is substantially perpendicular to the imaging sensor and passes through a center of the imaging sensor. The data unit, typically a computer with a display or a display alone, is configured to receive the captured image and form the profile, contour, or other planar features using at least the captured image.
The foregoing and other aspects, features, details, utilities, and advantages of the present disclosure will be apparent from reading the following description and claims, and from reviewing the accompanying drawings.
Before proceeding to a detailed description of the embodiments, a general overview of a system and method for profile measurement will first be set forth. As described in the Background, it is known to use a measurement angle of less than 90 degrees in profile measurement systems. As set forth herein, however, the measurement resolution can be maximized when the measurement angle is 90 degrees; in addition, the mathematical model(s) used to determine the object profile and dimension(s) are simplified at that angle. Known systems for profile measurement, however, particularly those with a scanning function, do not employ a measurement angle of 90 degrees. The reasons are at least two-fold.
First, it is more desirable to have either the light projection or the imaging sensor normal to the object surface being measured, which simplifies the mathematics of the scanning. Second, a large measurement angle may cause the hardware to interfere with the object being scanned, unless a good portion of the field of view is wasted. As a result, a typical profile measurement device based on triangulation is designed with a measurement angle of about 30 to 60 degrees; some may even have an angle outside this range. Consequently, the optical resolution in a measurement plane (i.e., the plane in which the projected light pattern travels), as viewed by an imaging sensor, will be location dependent: if the object surface intercepts the projected light pattern at different locations in the measurement plane, the measurement result in pixel space will differ. Furthermore, a greater depth of focus is required as the measurement plane deviates from the imaging plane; high optical resolution (typically accompanied by a shallower depth of focus) may limit the measurement range or introduce additional measurement variations. For objects that are steady, this may not be very critical. However, in applications where the object to be measured moves substantially in the measurement plane, true three-dimensional (3D) calibration may be an issue.
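For example only, a brief sketch (Python; hypothetical values, simple pinhole assumption) shows how the in-plane footprint of a pixel depends on the measurement angle; the stretch factor of roughly 1/sin(theta) disappears at 90 degrees, where the resolution becomes uniform across the measurement plane:

    # In-plane resolution vs. measurement angle (hypothetical numbers).
    # A pixel's footprint in the measurement plane is stretched by
    # roughly 1 / sin(theta) for a view at angle theta; at 90 degrees
    # the factor is 1.
    import math

    def in_plane_mm_per_pixel(range_mm, pixel_pitch_mm, focal_mm, theta_deg):
        gsd = range_mm * pixel_pitch_mm / focal_mm  # head-on footprint
        return gsd / math.sin(math.radians(theta_deg))

    for theta in (30, 45, 60, 90):
        print(theta, round(in_plane_mm_per_pixel(1000, 0.005, 16, theta), 4))
    # 30 deg -> 0.625 mm/px; 90 deg -> 0.3125 mm/px (finest, and uniform).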
Embodiments according to the teachings of the instant disclosure employ a measurement angle of substantially 90 degrees so that the pixel resolution in the measurement plane will be substantially uniform. In addition, a greater portion of the active imaging area on an imaging sensor will be usable for imaging in the measurement plane, thereby increasing resolution (i.e., the pixels used to capture the object, for example, all or at least most of the pixels). Embodiments according to the instant teachings are thus characterized by a system for profile measurement, using triangulation, that employs a measurement angle of 90 degrees (or substantially so) while fully utilizing the imaging sensor's active imaging area.
Embodiments of a system for determining a profile of an object (e.g., a three-dimensional object) may be used for a number of useful purposes, for example, in a manufacturing process to confirm or validate that the manufacture of an object conforms to a predetermined shape and/or predetermined dimensional specifications. For example only, such a profiling system may be used to determine the “roundness” of a round object, or to determine the “shape” of a non-round object, like an H-beam or a rail. Embodiments of a profiling system according to the present teachings may be used for determining the actual shape of a steel object.
Illumination assembly 18 includes at least a line source 20, such as a laser line source or another light line source having the same effect, configured to project a light plane 22 onto surface 12 of object 10. Line source 20 may be of any wavelength, or a combination of multiple wavelengths, in the infrared, visible, or ultraviolet regions, or generally in the range from 0.01 micron to 1000 microns, suitable for the selected imaging sensor and lens. The instant disclosure, without loss of generality, adopts the term “laser” for line source 20. Light plane 22 is also known as the measurement plane described above. In the illustrated embodiment, source 20 is arranged relative to object 10 such that light plane 22 is substantially perpendicular to outer surface 12 of object 10, and thus to the longitudinal axis “A”. In this case, the light plane and the measurement plane (i.e., the plane in which the light plane impinges onto the object's surface) are the same. Light plane 22 interacts with surface 12 of object 10, where it can be imaged by image acquisition assembly 28 as described below. It should be understood that light plane 22 may be formed by more than one line source 20 if the cross-section geometry of object 10 requires lighting from multiple angles of illumination to extract the profile, or a segment of the profile, of object 10, provided the light emitted by the multiple line sources 20 lies substantially in light plane 22.
Image acquisition assembly 28 comprises a lens 30 (e.g., a converging lens or a lens of similar function) and an imaging sensor 40, both of which may be of conventional construction. For example only, imaging sensor 40 may comprise a charge-coupled device (CCD), a complementary metal-oxide semiconductor (CMOS) device, or a video camera tube, to name a few. Image acquisition assembly 28 is configured to capture an image 24 (best shown in the accompanying drawings).
In addition, lens 30 is off-centered relative to imaging sensor 40, as if imaging sensor 40 were larger (illustrated as dotted line 41 in the accompanying drawings).
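For example only, under a thin-lens assumption (an illustration only, not the actual construction of assembly 28, and with hypothetical values), the lateral lens shift s that re-centers a region of interest offset by H in the object plane, at magnification m, works out to s = m * H / (1 + m), as the following sketch computes:

    # Shifted-lens sketch (thin-lens assumption, hypothetical values).
    # With the sensor kept parallel to the measurement plane, shifting
    # the lens by s off the sensor axis images an object region at
    # lateral offset H onto the sensor center, where s = m * H / (1 + m).

    def lens_shift_mm(object_offset_mm, magnification):
        m = magnification
        return m * object_offset_mm / (1.0 + m)

    # Example: region of interest 200 mm off the sensor axis at m = 0.02
    # -> shift the lens ~3.92 mm toward the region.
    print(round(lens_shift_mm(200.0, 0.02), 2))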
Data unit 48 is configured to receive and process image 24 from image acquisition assembly 28 and to form profile 16. In an embodiment, data unit 48 is a video signal organizing device, such as a video multiplexer, that can distribute the image from an image acquisition assembly 28 to the corresponding display (not shown in the accompanying drawings).
However, those skilled in the art shall appreciate advancements in the electronic processing field and understand that an alternative embodiment, as illustrated in the accompanying drawings, may also be employed.
In further embodiments, a profiling system (hereinafter system 14a) incorporates a modified illumination assembly 18 configured to project a light plane around the circumference of the object, as well as multiple image acquisition assemblies 28 configured to image around the circumference of the object. System 14a, thus configured, is able to profile the entire circumference of the object.
System 14a further includes a plurality of image acquisition assemblies 28 like that shown in the accompanying drawings.
It should be understood, however, that while three image acquisition assemblies 281, 282, and 283 are used in the illustrated embodiment for complete profiling of a round object, more or fewer image acquisition assemblies may be used in other embodiments, depending on at least (i) the shape of the object and (ii) the desired output profile. For example, in some embodiments, six, seven, eight, or more image acquisition assemblies may be used, for example, for certain complex shapes, such as an “H”-shaped beam or rail. The number of image acquisition assemblies should be optimized based on the cross-section geometry of object 10. The positions of the assemblies 28 may or may not be evenly spaced around the circumference of object 10, and the second predetermined distances 38 may be individually selected for each of the assemblies 28 and need not be the same.
Each of the image acquisition assemblies 281, 282, and 283 captures a respective image 241, 242, 243 (best shown in the accompanying drawings).
One advantage of the offset image acquisition assemblies relates to the ease with which multiple pieces of data (images 241, 242, 243) can be processed and integrated to form a composite profile 16. In conventional arrangements, each image acquisition assembly has its own 3D trigonometric calibration function; in such arrangements, having three, four, or even eight imagers greatly complicates the overall system calibration process.
In embodiments consistent with the teachings of the instant disclosure, the profile data (the profile segments) obtained from each image 241, 242, 243 can be more easily tied together or mapped to form a composite profile, due to the availability of two-dimensional (2D) uniform mapping. In these embodiments, the integration of multiple (e.g., three) data sets from multiple image acquisition assemblies requires only a two-dimensional planar calibration of the laser plane, not the three-dimensional non-linear calibration of the conventional art. The overlapping of at least certain portions of the focused imaging planes, designated 421, 422, 423, allows data unit 48 to perform the calibration in 2D, based on at least the overlapping portions of the profile segments from the image acquisition assemblies.
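For example only, the following sketch (Python with NumPy; the point values and function names are hypothetical) illustrates such a 2D planar calibration: a planar (affine) transform is fitted from points observed in the overlapping portion of two profile segments and then used to map one imager's coordinates into the other's frame in the laser plane:

    # 2D planar calibration sketch (hypothetical data). Each imager's
    # pixel coordinates are mapped into a common laser-plane frame by a
    # planar affine transform estimated from matched points in the
    # overlap; no 3D non-linear model is required.
    import numpy as np

    def fit_affine_2d(src_pts, dst_pts):
        """Least-squares 2x3 affine A such that dst ~ A @ [x, y, 1]."""
        src = np.asarray(src_pts, float)
        dst = np.asarray(dst_pts, float)
        M = np.hstack([src, np.ones((len(src), 1))])  # N x 3
        A, *_ = np.linalg.lstsq(M, dst, rcond=None)
        return A.T                                    # 2 x 3

    def apply_affine_2d(A, pts):
        pts = np.asarray(pts, float)
        return pts @ A[:, :2].T + A[:, 2]

    # Matched points from the overlapping region of two segments:
    seen_by_cam2 = [(10.0, 5.0), (40.0, 7.0), (25.0, 30.0)]
    same_in_cam1 = [(110.0, 6.0), (140.0, 8.0), (125.0, 31.0)]
    A = fit_affine_2d(seen_by_cam2, same_in_cam1)
    print(apply_affine_2d(A, [(10.0, 5.0)]))  # ~[[110.  6.]]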
Step 62 involves projecting a light plane onto an outer surface 12 of an object 10. Step 62 may be performed substantially as described in the embodiments set forth above, for example, by operating at least a light line source to produce a light plane 22 (e.g., as in system 14), or expanding the approach to produce a light plane 22 that impinges onto the object around the object's entire circumference (e.g., as in system 14a). The process proceeds to step 64.
Step 64 involves capturing an image of an imaging plane using an offset image acquisition assembly. Step 64 may be performed substantially as described in the embodiments set forth above. The imaging plane 42 will be substantially parallel to the light plane/measurement plane, and the principal axis of the lens will be offset from the sensor axis, all to achieve the above-described beneficial effects. In some embodiments, a single image acquisition assembly may be used to capture an image, while in other embodiments, multiple image acquisition assemblies may be used to capture plural images. The process then proceeds to step 66.
Step 66 involves forming the profile of an object, which can be a three-dimensional object, using the captured image or images from step 64. Step 66 may be performed substantially as described in the embodiments set forth above. For example, in an embodiment of system 14a, step 66 may be performed by profile generator 54 by (i) determining the profile segments in the captured images, (ii) applying the transformation models obtained from the calibration process to the profile segments, and (iii) combining the profile segments.
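For example only, steps (i) through (iii) may be sketched as follows (Python with NumPy; extract_segment and the calibration inputs are hypothetical placeholders):

    # Sketch of step 66 (hypothetical helpers). Each captured image
    # yields a profile segment; the per-imager planar calibration maps
    # it into a common frame; the mapped segments are then combined.
    import numpy as np

    def form_profile(images, calibrations, extract_segment):
        """images: captured frames; calibrations: one 2x3 planar affine
        per imager; extract_segment(image): N x 2 pixel points on the
        imaged light line."""
        pieces = []
        for image, A in zip(images, calibrations):
            seg_px = np.asarray(extract_segment(image), float)  # (i) detect
            seg_mm = seg_px @ A[:, :2].T + A[:, 2]              # (ii) map
            pieces.append(seg_mm)
        return np.vstack(pieces)                                # (iii) combine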
System 14a can be configured in a different embodiment in which each image acquisition assembly 28 is individually coupled with one illumination assembly 18 to form a profile scanner. Within a profile scanner, the relationship between the illumination assembly 18 and the image acquisition assembly 28 is fixed. Multiple profile scanners can be used to form a system having the same function as system 14a. To avoid interference between light planes, those skilled in the art shall understand that each profile scanner may be equipped with a light plane of a unique wavelength, with a corresponding optical filter used to select the light plane of interest to the specific profile scanner. Another approach is to offset the different profile scanners along the axis “A”.
The profiling functionality of the instant teachings may be used by itself and/or in combination with additional optical imaging functionality, such as surface inspection, such as performed by a surface inspection apparatus, for example, like the inspection apparatus described in U.S. application Ser. No. 10/331,050, filed 27 Dec. 2002 (the '050 application), now U.S. Pat. No. 6,950,546, and U.S. application Ser. No. 12/236,886, filed 24 Sep. 2008 (the '886 application), now U.S. Pat. No. 7,627,163. The '050 application and the '886 application are both hereby incorporated by reference as though fully set forth herein.
It should be understood that system 14 (and system 14a), particularly a main electronic control unit (i.e., data unit 48), as described herein, may include conventional processing apparatus known in the art, capable of executing pre-programmed instructions stored in an associated memory, all performing in accordance with the functionality described herein. Such an electronic control unit may further be of the type having both ROM and RAM, or a combination of non-volatile and volatile (modifiable) memory, so that any software may be stored while still allowing storage and processing of dynamically produced data and/or signals. It should be further understood that the terms “top”, “bottom”, “up”, “down”, and the like are for convenience of description only and are not intended to be limiting in nature.
While one or more particular embodiments have been shown and described, it will be understood by those of skill in the art that various changes and modifications can be made without departing from the spirit and scope of the present teachings.
The teaching of the instant invention is also advantageous in other applications, such as in-line measurement and monitoring for micro-printing or 3D printing. Without loss of generality, the printing of a micro-antenna on a substrate using an aerosol needle is employed as an example.
To verify the printing quality, in terms of whether the antenna pattern conforms to the design specification, an image acquisition assembly 328 is typically employed. While the illumination can be well designed in this application to project uniform illumination onto the area of interest on the substrate 312, using either a directional (e.g., dark-field or coaxial lighting) or non-directional (e.g., cloudy-day lighting) approach, the assembly 328 is disposed such that its principal axis is at an angle to the principal axis of the aerosol needle, to avoid mechanical interference. Due to this angle, the distance from the substrate 312 to the image acquisition assembly 328 is location dependent and may vary significantly.
The instant invention may be employed to solve this issue.
Those skilled in the art shall appreciate that external illumination may not be necessary. Self-emitted radiation, whether infrared, visible light, or ultraviolet, depending on the material and temperature of the object to be imaged, may be captured by an image acquisition assembly properly selected to receive such radiation. For example, a CCD sensor may be used for short-wavelength infrared and visible light, and a microbolometer sensor may be used for long-wavelength infrared.
This application claims the benefit of U.S. provisional application No. 61/732,292, filed 1 Dec. 2012 (the '292 application), and U.S. provisional application No. 61/793,366, filed 15 Mar. 2013 (the '366 application). The '292 application and the '366 application are both hereby incorporated by reference as though fully set forth herein.