A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
This invention relates to the acquisition and subsequent construction of an extended dynamic range image using a high-speed imaging system.
High-speed optical image acquisition systems are used in a variety of environments to analyze the physical characteristics of one or more targets. Generally, such systems include an image acquisition system, such as a camera, that can acquire one or more images of the target. The images are then analyzed to assess the target. In many cases, the inspection is performed while there is relative motion between the target and the image acquisition system. As used herein, a “high-speed” optical image acquisition system is any system that acquires images of one or more targets while there is relative motion between the target and an image acquisition system. One particular example of such a system includes a phase profilometry inspection system such as that used in some modern solder paste inspection systems.
Phase profilometry inspection systems are currently used to inspect three-dimensional aspects of target surfaces. The concept of phase profilometry is relatively simple. A pattern, or series of patterns, of structured light is projected upon a target at an angle relative to the direction of an observer. This can be likened to sunlight passing through a Venetian blind and falling upon a three-dimensional object, where the object is viewed from an angle that differs from that of the sunlight. The pattern is distorted as a function of the object's shape. Knowledge of the system geometry and analysis of the distorted image or images can provide a map of the object in three dimensions.
Generally, phase profilometry systems employ a source of structured, patterned light; optics for directing the structured, patterned light onto a three-dimensional object; and a sensor for sensing an image of that light as it is scattered, reflected or otherwise modified by its interaction with the three-dimensional object.
Phase profilometry inspection can be performed while there is relative motion between the target and the inspection system. This feature is highly beneficial for automated inspection machines where throughput is very important. One example of a phase profilometry inspection system is the Model SE 300 Solder Paste Inspection System available from CyberOptics Corporation of Golden Valley, Minn. This system uses phase profilometry to measure height profiles of solder paste deposited upon a circuit board prior to component placement. The SE 300 System is able to acquire the images that it uses for three-dimensional phase profilometry while allowing continuous relative motion between the sensor and the target. This is because a strobe illuminator is operated to provide short exposure times thus essentially freezing motion. Specifics of the illumination and image acquisition are set forth in greater detail in the parent application.
Solder paste, itself, presents a relatively favorable target in that it is composed of a number of tiny solder spheres. Because there are so many small, spherical reflectors in each solder deposit, in the aggregate, each solder deposit provides a substantially diffuse optical surface that can be illuminated and imaged by a sensor configured to receive diffusely scattered light. The relatively uniform reflectivity of the solder paste deposits facilitates imaging.
High-speed inspection systems such as the phase profilometry inspection system described above provide highly useful inspection functions without sacrificing system throughput. Theoretically, such inspection systems would be useful for any inspection operation that requires height information as well as two-dimensional information, without adversely affecting system throughput. However, there are real-world hurdles that hinder the ability to extend the advantages of high-speed inspection beyond targets, such as solder paste, that have relatively diffuse reflectivity.
Surfaces such as the underside of a ball grid array (BGA) or chip scale package (CSP) are difficult to inspect using current optical phase profilometry systems. Specifically, the balls on a BGA are constructed from reheated solder, rendering them substantially hemispherical and shiny. Illuminating the hemispherical shiny balls with a structured or directional light source will generate a bright glint from a portion of the surface near the specular angle. Conversely, there will be nearly no light returned by diffuse scattering from the remainder of the hemispherical surface. Imaging the ball with a sensor that is not configured to deal with this wide dynamic range of returned light levels would result in erroneous data.
When light falls upon any surface, some light will be specularly reflected and some will be diffusely scattered. Some surfaces (like a mirror) are predominantly specular in their reflections, and some surfaces (like a piece of paper) predominantly scatter light. Imaging any such surface, even highly specular surfaces, may be possible by extending the dynamic range of a sensor configured to receive diffusely scattered light. One way in which dynamic range has been extended in the past is by taking or acquiring multiple images of a target with different illumination levels and processing the images to discard image data such as saturated pixels or dark pixels. However, all dynamic range extension techniques currently known are believed to be applied solely to systems in which the target and inspection system do not move relative to each other while the multiple images are acquired. Thus, as defined herein, such systems are not “high-speed.” In inspection systems where throughput cannot be sacrificed by pausing relative movement between the inspection system and the target, dynamic range extension itself is not easy to implement.
A high-speed image acquisition system includes a source of light; a sensor for acquiring a plurality of images of the target; a system for determining relative movement between the source and the target; and an image processor for processing the acquired images to generate inspection information relative to the target. The system has an extended dynamic range provided by controlling illumination such that a plurality of images is acquired at two or more different illumination levels. In some embodiments, the high-speed image acquisition system is used to perform three dimensional phase profilometry inspection.
Embodiments of the invention are applicable to any high-speed optical image acquisition system. While much of the disclosure is directed to inspection systems, particularly phase profilometry inspection systems, embodiments of the invention can be practiced in any image acquisition system where image acquisition occurs without pausing relative motion between a target and image acquisition system.
In the phase profilometry embodiments, the number of images in each image set is preferably equal to the number of relevant unknowns in the target. For example, if the target has three substantial unknowns, which might include height, reflectivity and fringe contrast, then three images would be preferred, obtained with three differing projected patterns, three phases of a single pattern, or a suitable mix of the two. Thus, embodiments with more than three unknowns would preferably use more images, and embodiments with fewer unknowns would preferably use fewer.
Once the image sets are registered, a composite image with extended dynamic range can be created. Extended dynamic range is provided because, if one image set has saturated or black portions, data from the other image set can be used and normalized based on the measured illumination intensities or the measured ratio between the two intensities.
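By way of illustration only, the following sketch shows one way such a compositing rule might be expressed, assuming the two exposures are already registered and held as NumPy arrays, that the ratio of the two illumination levels has been measured, and that a 10-bit detector saturates at 1023; the function and parameter names are illustrative rather than part of any particular embodiment.

```python
import numpy as np

def composite_extended_range(bright, dim, intensity_ratio, saturation_level=1023):
    """Combine registered bright and dim exposures into one extended-range image.

    bright, dim      -- registered images of the same scene (same shape).
    intensity_ratio  -- measured ratio of the bright to the dim illumination level.
    saturation_level -- pixel value at which the detector saturates (10-bit here).
    """
    bright = bright.astype(np.float64)
    dim = dim.astype(np.float64)
    # Use the bright exposure wherever it is below saturation; elsewhere,
    # substitute the dim exposure scaled up by the measured intensity ratio.
    saturated = bright >= saturation_level
    return np.where(saturated, dim * intensity_ratio, bright)
```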
Target 104 may be a two-dimensional or three-dimensional object. Further, the illumination generated during either or both flashes can be of any suitable wavelength, including that of the visible spectrum.
CCD array 220 provides data representative of the acquired images to image processor 202 which computes a height map of target 216 based upon the acquired images. The height map is useful for a number of inspection criteria.
Typically, throughout inspection, there is continuous relative motion between sensor 201 and target 216. At block 306, a second set of at least one image is obtained while target 216 is illuminated at a second illumination level that differs from the first illumination level. Preferably, the second set consists of three images obtained with differing phases of a sinusoidal fringe pattern. The second set is also obtained within an interval of preferably one millisecond, but need not be acquired within any specific period from the acquisition of the first set, as long as the delay is not so great that the two sets cannot be related to one another. Preferably, however, if the first set of images is acquired at p, p+120, and p+240 degrees, the second set is acquired at p+360n, p+360n+120 and p+360n+240 degrees, where n is an integer. In this manner, the images of the fringe pattern on target 216 will be identical except for differences due to the intentional illumination level change and the relative motion between sensor 201 and target 216 corresponding to 360n degrees.
At block 308, the first and second sets of at least one image are registered. This can be performed by adjusting one or both sets of images based upon measured, or otherwise known movement that occurred between acquisition of the first and second sets. Additionally, in some embodiments, it is possible to apply image processing techniques to the sets of images themselves to recognize patterns or references in the images and manipulate the images such that the references are aligned. Further still, measured motion between the first and second sets of acquisitions can be used to perform a first stage manipulation where one or both sets are manipulated for a coarse alignment. Then, the results of the coarse alignment are processed to identify patterns or references in the coarse-aligned images in order to finely adjust the images. Once the images are registered, the dynamic range can be extended as discussed above.
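The two-stage registration described above might be sketched as follows, assuming the measured motion is available as a pixel offset and using a simple correlation search for the fine adjustment; the names and the search strategy are illustrative only, and a normalized correlation or feature-based method could be substituted.

```python
import numpy as np

def register_to_reference(image, measured_shift, reference, search=3):
    """Coarse-then-fine registration of an image from the second set.

    measured_shift -- (rows, cols) motion, in pixels, measured between the
                      first and second acquisitions (coarse stage).
    reference      -- corresponding image from the first set (fine stage).
    search         -- half-width, in pixels, of the fine search window.
    """
    # Coarse stage: undo the known relative motion.
    coarse = np.roll(image, (-int(round(measured_shift[0])),
                             -int(round(measured_shift[1]))), axis=(0, 1))
    # Fine stage: small search for the residual offset that best matches the
    # reference (a normalized correlation would be more robust in practice).
    best_offset, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            candidate = np.roll(coarse, (dy, dx), axis=(0, 1))
            score = float(np.sum(candidate * reference))
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return np.roll(coarse, best_offset, axis=(0, 1))
```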
Generally, the brighter set of images (those obtained using the greater of the illumination levels) should be usable until saturation, presuming only that the detector and analog circuitry are linear, or at least stable (in which case non-linearity can be calibrated out). However, any suitable technique or algorithm can be used.
Some embodiments of the present invention require that the difference in illumination levels be precisely controlled. Although it may be technically possible to provide an illumination circuit with a precisely-controlled intensity, it is preferred that the illumination levels be only approximately controlled, but then measured to the required precision during use. As an example, the brighter illumination level may be one hundred times brighter than the dimmer level. In one embodiment, illumination levels are measured during use and used to correct for variations from one set to the next with sufficient precision. Preferably, the required precision is approximately equal to the smallest relative error detectable by the analog to digital converter. Thus, with a 10-bit converter, it would be sufficient to measure the relative light levels to a precision of one part in 2¹⁰; with a ratio of light levels of 100 to 1, the resulting dynamic range would be 100 times 2¹⁰, or about 100,000 to 1.
This approach is preferred due to the difficulty of controlling the strobe discharge with the required precision.
Alternatively, each set of images can be constructed into height maps before dynamic range extension and then three dimensional pixels (zetels) from the bright map can be replaced with zetels from the dim map depending on whether there was saturation in the images used to construct the bright map. The strobe intensities must still be well controlled or measured, since such intensity errors affect the height reconstruction, but the task is easier since only the nominally-equal strobe intensities within the set used to construct each height map need be measured or controlled; the magnitude of the large ratio between sets need not be known or controlled accurately.
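A minimal sketch of this height-map merging approach follows, assuming the two height maps are already reconstructed and registered and that the bright-set phase images are available for flagging saturation; all names and the 10-bit threshold are illustrative.

```python
import numpy as np

def merge_height_maps(bright_height, dim_height, bright_images, saturation_level=1023):
    """Replace zetels of the bright-set height map wherever saturation occurred.

    bright_height, dim_height -- height maps reconstructed from the bright and
                                 dim image sets, respectively.
    bright_images             -- the phase images used to build the bright map.
    """
    # A zetel is suspect if any bright phase image saturated at that location.
    saturated = np.any(np.stack(bright_images) >= saturation_level, axis=0)
    return np.where(saturated, dim_height, bright_height)
```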
Other normalization techniques are also possible. For instance, each image in the set can be normalized (that is, divided) by its corresponding measured strobe intensity. Then the normalized pixels can be freely chosen from one set or the other, based on their positions in their respective dynamic ranges, with the height reconstruction done later on the composite image set thus produced. This technique relies on the fact that phase reconstruction is not affected by a scaling that is applied uniformly to all the phase images. The suitability of a particular technique may depend on the details of the hardware or software used to make the selection.
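This per-image normalization might be sketched as follows, again assuming measured strobe intensities for each image and a 10-bit saturation threshold; the selection rule shown, preferring the bright exposure unless it saturated, is merely one simple choice among several.

```python
import numpy as np

def normalize_and_select(bright_set, dim_set, bright_intensities, dim_intensities,
                         saturation_level=1023):
    """Normalize each phase image by its measured strobe intensity, then build a
    composite set choosing, pixel by pixel, between the two exposures."""
    composite = []
    for b_raw, d_raw, b_i, d_i in zip(bright_set, dim_set,
                                      bright_intensities, dim_intensities):
        bright_norm = b_raw / b_i
        dim_norm = d_raw / d_i
        # Prefer the bright (better signal-to-noise) image except where it saturated.
        composite.append(np.where(b_raw >= saturation_level, dim_norm, bright_norm))
    return composite
```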
At block 310, a height map is reconstructed using the composite set of image data from block 308. The manner in which the height map is generated can include any of the well-known phase profilometry height reconstruction techniques. Once the height map is reconstructed, block 312 is executed, where profilometry inspection is performed on the reconstructed height map. This inspection can include extracting summary measures such as height, volume, area, registration, coplanarity, maximum height range, surface contours, surface roughness, etc.
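One of the well-known techniques referenced above is the three-step phase-shift method; a minimal sketch follows, assuming composite images at nominal phases of 0, 120 and 240 degrees and a system-dependent scale factor relating fringe phase to height (phase unwrapping, which a complete reconstruction would require, is omitted).

```python
import numpy as np

def reconstruct_height(i0, i1, i2, height_per_radian):
    """Three-step phase-shift height reconstruction.

    i0, i1, i2        -- composite images at nominal phases 0, 120 and 240 degrees.
    height_per_radian -- scale set by the projection geometry and fringe period.
    """
    # Wrapped phase of the sinusoidal fringe at each pixel.
    phase = np.arctan2(np.sqrt(3.0) * (i1 - i2), 2.0 * i0 - i1 - i2)
    # A complete implementation would unwrap the phase before scaling.
    return phase * height_per_radian
```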
Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.
This application is a Continuation-in-Part application of U.S. patent application Ser. No. 09/522,519, filed Mar. 10, 2000, now U.S. Pat. No. 6,549,647, and entitled “Inspection System with Vibration Resistant Video Capture,” which claims priority benefits from U.S. Provisional patent application Ser. No. 60/175,049, filed Jan. 7, 2000 and entitled “Improved Inspection Machine.”
| Number | Date | Country |
|---|---|---|
| 20020191834 A1 | Dec 2002 | US |

| Number | Date | Country |
|---|---|---|
| 60175049 | Jan 2000 | US |

|  | Number | Date | Country |
|---|---|---|---|
| Parent | 09522519 | Mar 2000 | US |
| Child | 10217910 |  | US |