This invention relates to a method and apparatus for gaging the conformance of the surface of a contoured panel having a specular surface to a pre-defined specification.
Manufacturers of panels having specular surfaces, particularly panels formed into various curved shapes, are interested in measuring the shape of the formed panel to assess conformity of the actual formed shape to design specification.
It is therefore desirable to develop a system and method for comparing the formed surface of a particular panel to a pre-defined surface specification.
It is also desirable to develop a panel gaging system which may be integrated into, and operated in-line with, panel forming and processing systems.
It is thus also desirable, for at least the above purposes, to develop a system and method for quickly inspecting a curved panel to acquire data corresponding to the surface of the panel, particularly as the panel is being transported on a conveyor between or after other processing operations.
The disclosed system and method for inspecting and gaging the shape of a contoured panel includes, as components, (1) a system and method for acquiring three-dimensional surface data corresponding to the panel, and (2) a system and method for receiving the acquired surface data, comparing the acquired surface to a pre-defined surface description, and developing indicia of the level of conformance of the contoured panel to the pre-defined specification.
The surface data acquisition system may include a conveyor for conveying the panel in a first direction generally parallel to the first dimension of the panel, at least one display projecting a preselected contrasting pattern, and at least one camera, each one of the cameras uniquely paired with one of the displays, wherein each display and camera pair is mounted in a spaced-apart relationship at a known distance and angle from the surface of the panel such that the camera detects the reflected image of the pattern projected onto the surface of the panel from its associated display.
The surface data acquisition system may, in one embodiment, include two or more cameras, each one of the cameras being uniquely paired with one of the displays as described above, wherein each of the display and camera pairs is spaced apart from the others at least in a second direction across the second dimension of the panel such that each camera detects the reflected image of the pattern projected on the surface of the panel from only its associated display, and wherein the patterns detected by the two or more cameras together cover the entire surface in the direction of the second dimension of the panel.
The surface data acquisition system may also include a programmable control including at least one processor programmed to execute logic for controlling each of the cameras to acquire at least one image of the reflected pattern of the associated display on the panel as the panel is conveyed across the path of the projected pattern in the first direction, and logic for analyzing and combining the data acquired by the two or more cameras to construct surface data representative of the surface of the panel.
The disclosed gaging system may include at least one processor including logic for receiving the surface data, comparing the data to a pre-defined exemplary surface, and displaying or otherwise reporting selected information associated with the comparison.
The gaging system may utilize a single computer which controls the conveyor and the operation of the cameras, and includes the previously described surface data acquisition logic, as well as the optical distortion processing logic. Alternatively, the conveyor control, camera controls, surface data acquisition and optical processing may be integrated but implemented on separate or multiple processors or computers.
As required, detailed embodiments of the present invention are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
Referring to
The inspection system 10 depicted in
Referring still to
Referring again to
Where, as in the depicted embodiment, the cameras are mounted with their viewing apertures extending through an aperture in the display, the system control 42 may be programmed to acquire multiple images as the glass is conveyed in the first direction to ensure that, for any portion of the panel surface that falls within the area of reflection of the display aperture in one captured image, an image of the reflected pattern is obtained in a previous or subsequent image of the moving sheet. Again, it will be appreciated that the number of images acquired by each camera should also be sufficient that the surface information developed from each image (as hereinafter described) can be combined to form a description of the entire surface across the height of the panel, including the area in which a single image might include an image of the display aperture rather than the reflected pattern.
The surface descriptions for each of the cameras are similarly combined to form a description of the entire surface across the width (or across the area of interest in the direction of the width) of the panel.
Referring to
The sinusoidal patterns are chosen and combined to ensure that the portion of the resultant pattern appearing on the display is non-repetitive, thereby ensuring that, for the image data collected, each pixel in the camera's field of view will correspond uniquely to a single point on the display. The three frequencies may be relatively prime to one another, and are selected such that they are spaced apart within the envelope of frequencies bounded by the minimum and maximum frequency limits of the camera's optics.
The image of this three-frequency pattern reflected from the surface of the panel may then be mathematically deconstructed into three single-frequency images in each of the x and y directions. Phase information corresponding to each of the three frequencies can then be isolated and utilized as hereinafter described to develop an accurate description of the panel surface.
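By way of illustration only, the following sketch (using NumPy; the particular cycle counts, display resolution, and the simple additive combination are assumptions of this example rather than values taken from the disclosure) generates a three-frequency sinusoidal pattern in each of the x and y directions and shows how the phase of one constituent frequency can be isolated with a Fourier transform:

```python
import numpy as np

# Assumed display resolution and pairwise relatively prime cycle counts (illustrative only).
WIDTH, HEIGHT = 1920, 1080
FREQS = (7, 11, 13)   # cycles across the display

def three_frequency_pattern(width=WIDTH, height=HEIGHT, freqs=FREQS):
    """Combine three sinusoids in x and three in y into one non-repeating pattern."""
    x = np.linspace(0.0, 1.0, width, endpoint=False)
    y = np.linspace(0.0, 1.0, height, endpoint=False)
    xx, yy = np.meshgrid(x, y)
    pattern = np.zeros_like(xx)
    for f in freqs:
        pattern += np.sin(2 * np.pi * f * xx) + np.sin(2 * np.pi * f * yy)
    # Normalize to displayable 8-bit intensity values.
    pattern -= pattern.min()
    pattern /= pattern.max()
    return (255 * pattern).astype(np.uint8)

def isolate_phase(image_row, freq):
    """Recover the phase of one constituent frequency along a row via the FFT."""
    spectrum = np.fft.rfft(image_row.astype(float))
    return np.angle(spectrum[freq])   # phase (radians) of the selected frequency bin

if __name__ == "__main__":
    pattern = three_frequency_pattern()
    print("pattern shape:", pattern.shape)
    print("phase of 11-cycle component in first row:", isolate_phase(pattern[0], 11))
```

In practice the phase would be decoded from the camera image of the reflected pattern rather than from the displayed pattern itself, but the same frequency-isolation idea applies.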
In another embodiment, illustrated in
Thus, in the illustrated embodiment, the orthogonal directions of the sinusoidal patterns are skewed from the x and y axes of the display. It will be appreciated, however, that any other convenient orientation may be chosen for the axes that are used by the system to separate the analysis into orthogonal components, so long as the sinusoidal patterns are rotated relative to those axes so as to yield phase information in both the x and y directions.
Again, the sinusoidal patterns are chosen (relatively prime frequencies, spaced apart as described above) and combined to ensure that the portion of the resultant pattern appearing on the display is non-repetitive, thereby ensuring that, for the image data collected, each pixel in the camera's field of view will correspond uniquely to a single point on the display.
The image of this two-frequency pattern reflected from the surface of the panel may then be similarly mathematically deconstructed. Again, phase information corresponding to each of the two frequencies can be isolated and utilized as hereinafter described to develop an accurate description of the panel surface.
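A corresponding sketch for the rotated two-frequency arrangement (again with assumed frequencies and an assumed 45-degree skew, chosen purely for illustration) might look like:

```python
import numpy as np

def two_frequency_rotated_pattern(width=1920, height=1080, freqs=(11, 13), angle_deg=45.0):
    """Two relatively prime sinusoids along each of two orthogonal axes that are
    rotated (skewed) relative to the display's x and y axes."""
    theta = np.radians(angle_deg)
    x = np.linspace(0.0, 1.0, width, endpoint=False)
    y = np.linspace(0.0, 1.0, height, endpoint=False)
    xx, yy = np.meshgrid(x, y)
    # Coordinates along the rotated orthogonal axes.
    u = xx * np.cos(theta) + yy * np.sin(theta)
    w = -xx * np.sin(theta) + yy * np.cos(theta)
    pattern = sum(np.sin(2 * np.pi * f * u) + np.sin(2 * np.pi * f * w) for f in freqs)
    pattern -= pattern.min()
    pattern /= pattern.max()
    return (255 * pattern).astype(np.uint8)
```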
It will be appreciated by those skilled in the art that, by employing a multi-frequency, non-repeating pattern and the deflectometry techniques hereinafter described, an accurate mathematical description of the panel surface may be obtained from a single image for each point on the surface of the panel from which the camera detects the reflected pattern. It is thus unnecessary to capture or utilize multiple patterns and/or multiple images, except as described herein where multiple images are acquired as the panel is moved on the conveyor to construct a surface for that portion of the panel that does not reflect the projected pattern in any single acquired image (e.g., (1) that portion of the panel directly below the aperture in the screen, or (2) that portion of the panel that is not in the viewing area of the camera because the height of the panel is greater than the projected pattern from the screen in the direction of conveyance).
Referring now to
The panel inspection system 10 includes a surface data acquisition system which employs the above-described camera and display pairs and acquired images, as well as logic for developing an accurate three-dimensional description of the surface from the reflected patterns from each image, and logic for combining the surface descriptions developed from the images as hereinafter described to obtain an accurate mathematical description of the entire surface of the panel.
The panel inspection system 10 may also, in addition to the surface data acquisition system, include one or more computers and/or programmable controls including logic for processing the acquired surface data to determine the conformance of the curved sheet with a pre-defined shape specification.
The inspection system 10 may, in turn, be incorporated into a system for fabricating panels including one or more processing stations and one or more conveyors for conveying the panels from station to station during processing, such as fabrication systems 200 and 300 schematically shown in
As indicated at 84, one or more additional images may be obtained from each camera, as required, as the panel moves on the conveyor. As previously described, the number of images acquired by each camera is determined by at least two considerations. First, in embodiments of the system wherein the cameras are mounted within an aperture of their associated displays, a sufficient number of images must be acquired to ensure that the system acquires a reflected image of the pattern for all of the points in the viewing area, including those points from which the display pattern is not reflected in a particular image due to the fact that it is located within the area that includes a reflection of the aperture. Second, multiple images may be required as the glass is conveyed across the viewing area of the camera in embodiments of the system where the field of view of the camera is not large enough to acquire a reflection of the display pattern from the surface of the panel across its entire first dimension (i.e., the entire height) in one image.
For each of the acquired images, the system, at 86, must determine the precise location in three-space of each point on the surface of the panel based upon the reflected pattern in the image. As previously described, the use of a pattern which is non-repeating in the camera's viewing area ensures that each point on the display screen that is reflected within the viewing area of the camera will be uniquely associated with a pixel that detects the reflected pattern. Conventional image processing techniques may be employed to determine the x and y locations (i.e., in the focal plane of the camera) for each point on the surface of the panel that is in the viewing area of the camera for that image. Other known processing techniques may be employed to determine the z location (i.e., the elevation) of each point. In the disclosed embodiment, a mapping vector technique is employed (as depicted in
In one embodiment, the x, y, and z values developed for each point in the viewing area of a particular camera are typically developed in a coordinate system associated with that camera. In one embodiment, for example, the origin of the coordinate system for each camera is set at that camera's origin 98 (as shown in
The system, at 88, then combines the developed surface data for each of the images acquired from all of the cameras to obtain the surface definition which identifies the location of each point in three-space for the entire surface of the panel. In one embodiment, the point clouds for each camera are converted to a common (“global”) coordinate system, and the point clouds are then combined to form the entire surface.
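As an illustrative sketch of this step (the rotation matrices and translation vectors would come from a calibration of each camera's pose; the function and variable names below are hypothetical), per-camera point clouds can be mapped into a common frame and concatenated:

```python
import numpy as np

def to_global(points_cam, rotation, translation):
    """Transform an (N, 3) point cloud from a camera's coordinate system into the
    global coordinate system, given that camera's calibrated pose."""
    return points_cam @ rotation.T + translation

def combine_point_clouds(clouds, poses):
    """Merge per-camera point clouds (each an (N, 3) array) into one surface
    description in the global frame; `poses` holds (rotation, translation) pairs."""
    return np.vstack([to_global(c, R, t) for c, (R, t) in zip(clouds, poses)])

# Hypothetical usage with two cameras:
# surface = combine_point_clouds([cloud_cam1, cloud_cam2], [(R1, t1), (R2, t2)])
```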
It will be appreciated that one or more other coordinate systems/origins may be selected and employed based upon a particular system's camera/display architecture and/or for computational convenience. Similarly, the combination of the surface developed from the individual acquired images may be performed using other conventional image data processing techniques.
The system then, at 90, performs one or more known processing techniques to compare the surface to a pre-defined exemplary surface contour. In one embodiment, a statistical correlation of the points in the surface data of the panel with the points in the predefined surface shape is conducted. The variance in locations of the correlated points may be examined, for the entire surface, and/or for selected areas of concern, such as, for example, around the perimeter of the panel. Selected indicia of conformance of the panel to the pre-defined surface, including, for example, color coding or numerical values, may then be developed and displayed or otherwise reported.
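A minimal sketch of such a comparison (assuming the pre-defined exemplary surface is also available as a point cloud, correlating points by nearest neighbor with SciPy; the tolerance value and the reported indicia are illustrative assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

def conformance_report(measured, nominal, tolerance=0.5):
    """Correlate each measured point with its nearest nominal point and report
    simple indicia of conformance (units follow the input data)."""
    tree = cKDTree(nominal)
    deviations, _ = tree.query(measured)   # distance from each measured point to nearest nominal point
    return {
        "mean_deviation": float(deviations.mean()),
        "max_deviation": float(deviations.max()),
        "std_deviation": float(deviations.std()),
        "fraction_within_tolerance": float((deviations <= tolerance).mean()),
    }
```

The per-point deviations computed this way can also drive a color-coded display of the surface, or be restricted to selected regions such as the panel perimeter.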
Referring still to
The geometric optical equation is:
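As a sketch of one common deflectometric form of this relation (assuming m is taken from the surface point toward its decoded display point, v from the camera toward the surface point, and hats denote unit vectors; this notation is illustrative rather than taken from the drawings):

$$\hat{n}\;=\;\frac{\hat{m}-\hat{v}}{\left\lVert\hat{m}-\hat{v}\right\rVert},\qquad \hat{m}=\frac{m}{\lVert m\rVert},\qquad \hat{v}=\frac{v}{\lVert v\rVert},\qquad P=s\,v$$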
Where n is the surface normal, v is the camera pixel vector, m is the mapping vector, and s is the distance from the camera to the surface (along the camera vector), so that the surface point is located at the distance s along the vector v.
The differential geometry describes the points on the surface of the panel:
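As a sketch under the same parameterization (with (u, w) denoting image or pixel coordinates, an assumed notation), the surface and its tangent vectors may be written:

$$P(u,w)=s(u,w)\,v(u,w),\qquad \frac{\partial P}{\partial u}=\frac{\partial s}{\partial u}\,v+s\,\frac{\partial v}{\partial u},\qquad \frac{\partial P}{\partial w}=\frac{\partial s}{\partial w}\,v+s\,\frac{\partial v}{\partial w}$$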
Since n is the cross product of the two differentials, it is by definition orthogonal to both, yielding:
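In the assumed notation, the two orthogonality conditions read:

$$n\cdot\frac{\partial P}{\partial u}=0,\qquad n\cdot\frac{\partial P}{\partial w}=0$$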
Solving these for the elevation, s:
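One way the resulting relations can be written, offered as a sketch rather than the precise form used here, follows from substituting the tangent expressions into the orthogonality conditions:

$$\frac{\partial s}{\partial u}\,(n\cdot v)+s\left(n\cdot\frac{\partial v}{\partial u}\right)=0,\qquad \frac{\partial s}{\partial w}\,(n\cdot v)+s\left(n\cdot\frac{\partial v}{\partial w}\right)=0$$

in which n is itself related to s through the reflection relation above (since the mapping vector depends on where along v the surface point lies), yielding equations that can be solved for the elevation s at each point on the surface.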
It will be appreciated by those skilled in the art that other known methods may be utilized to develop unambiguous locations in three dimensions for each of the points on the surface of the panel based upon the unambiguous x and y locations of the reflected patterns at each pixel location of the image, and the geometrical relationship between the focal plane of the camera, the display screen, and the panel. However, it has been determined that the elevation of each point on the surface of the panel can be quickly determined using the principles described above and illustrated in
Referring again to
The system 10 may also be programmed by the user to graphically and numerically display various indicia of conformance to the desired shape, including those indicia most relevant to industry standards, or other indicia considered relevant in the industry to the analysis of the shape of the formed panels.
The digital cameras 28-40 are each connected via a conventional data line to one or more computers, such as computer 42, which may be suitably programmed to acquire the digital image data from the camera, process the image data to obtain the desired surface definition for the panel, and analyze the data to develop various indicia of shape conformance. The computer 42 may also be programmed to present the derived information in both graphical (e.g., color-coded images) and statistical forms. If desired, various other statistical data which may be of interest can be derived and reported for predefined areas of the panel.
As will be appreciated by those skilled in the art, the inspection system 10 may additionally or alternatively employ other known image processing techniques to collect and analyze the acquired image data, develop a definition of the surface, and provide various indicia of the shape conformance for each panel.
In one embodiment, the displays 14-26 are light boxes that utilize conventional lighting (such as fluorescent lights) behind a translucent panel upon which the contrasting pattern is printed, painted, or otherwise applied using conventional methods. The digital cameras 28-40 are connected to the computer 60 using known methods, preferably so that the acquisition of the image by the camera may be controlled by the computer 42.
Selected data output by the disclosed in-line inspection system 10 may also be provided as input to the control logic for the associated glass sheet heating, bending, and tempering system 200 (or automotive windshield fabrication system 300, or other panel fabrication/processing system) to allow the control(s) associated with one or more of the stations of the system to modify its (their) operating parameters as a function of the gaging data developed from previously processed glass sheets (or panels).
It will be appreciated that the inspection system 10 of the present invention could alternatively be mounted in-line at various other points in the above-described and other panel fabrication systems as desired to maximize the production rate of the system, so long as the shape conformance measurements are taken after the panel has been formed to its final shape.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.