The invention relates to an inspection system and to a method for analyzing defects in a product, in particular a printed circuit board product, a semiconductor wafer or the like, the inspection system comprising a projection device, an optical detection device and a processing device, the projection device having at least one spectrometer member configured to split white light into its spectral components and project a multichromatic light beam thus formed from monochromatic light beams onto a product at an angle of incidence β, the optical detection device having a detection unit comprising an area scan camera and an objective, the area scan camera being configured to detect the multichromatic light beam reflected on the product in a detection plane of the detection unit, the detection plane being perpendicular, preferably orthogonal, to a product surface of the product.
In the case of known inspection systems or methods for analyzing defects in a product, a height profile, for example of a printed circuit board or of components disposed on a printed circuit board, can be determined by means of a spectroscopic method. The product to be inspected may be what is known as a PCB (printed circuit board) or a photovoltaic cell. PCBs can be formed, for example, by printing a conductive paste onto a printed circuit board substrate to form conductive tracks or electronic components, such as resistors. Various materials, such as copper, gold, aluminum, titanium, etc., can be applied to the substrate. The printed circuit board product may further be fitted with wired components or what is referred to as SMD components, coated with solder resist and soldered as part of the manufacturing process in question, for example by wave soldering. In these manufacturing processes, a number of defects can occur, such as an incompletely printed conductor track, a defective solder joint or a missing component, which can lead to the printed circuit board product being unable to function. It is therefore regularly necessary for the printed circuit board product to be analyzed for possible defects during its manufacture.
Such a defect analysis is regularly carried out using an inspection system that has a camera for capturing images of the printed circuit board product. The camera may be what is referred to as a line scan camera, which has optical sensors disposed in a line or row or three rows or lines each having different sensors (RGB sensors), or an area scan camera having more than three rows or lines. A disadvantage of the known inspection systems is that the inspection system is able to analyze products with sufficient image quality only up to a certain size, as the line or row length of the optical sensors of the known cameras is limited due to the manufacturing process. For defect analysis, it may therefore be necessary to scan a product in several passes in order to capture the entire product surface or its relevant sections.
EP 2 307 852 B1 describes an inspection system and a method for analyzing defects in PCBs that can be used to optically measure a PCB surface. A polychromatic light beam emanating from a white light source is passed through a prism to generate a multichromatic light beam which is projected onto the printed circuit board product at an angle of incidence. Depending on a difference in height of a surface of the printed circuit board product, the surface appears, as a function of its height, in a spectral color of the multichromatic light beam directed at this height. A monochromatic light beam reflected from the surface in question in this manner is detected by means of an optical detection device, in particular a line scan camera, or a line of an area scan camera. By means of a processing device or a device for data processing, height information of the relevant surface of the printed circuit board product can then be calculated from the set angle of incidence at which the multichromatic light beam is directed at the detection plane and from the position of the printed circuit board product relative to the line scan camera.
A disadvantage of the inspection system or method described above is that a surface of the PCB cannot always be reliably detected. Depending on the reflectance or the color of the product surface, there may be deviations in a reflection of the relevant light beam on the product surface. For a reliable defect analysis, multiple images of the printed circuit board product or the PCB are therefore regularly required, or the product is illuminated with differently colored light, or the RGB channels that appear most suitable for a defect analysis are processed further using the processing device. To create a height profile and to obtain a good-quality analysis image of the product, however, different illumination devices, each with associated detection units, are regularly required, which can then scan the product surface sequentially or in separate passes. This makes the defect analysis itself and the constructive design of the inspection system complex. Another disadvantage is that the height information obtained for the surface in question may be inaccurate. Depending on the height of the surface, there may be blurring in an image plane of the optical detection device or on the line scan camera or the area scan camera. This makes a height measurement less accurate, and this blurring cannot be compensated for by using a camera chip with a higher resolution.
The object of the present invention is to propose an inspection system and a method for analyzing defects in a product that enable a reliable defect analysis by simple means.
This object is attained by an inspection system having the features of claim 1 and a method having the features of claim 13.
The inspection system according to the invention for analyzing defects in a product, in particular a printed circuit board product, a semiconductor wafer or the like, comprises a projection device, an optical detection device and a processing device, the projection device having at least one spectrometer member configured to split white light into its spectral components and project a multichromatic light beam thus formed from monochromatic light beams onto a product at an angle of incidence β, the optical detection device having a detection unit comprising an area scan camera and an objective, the area scan camera being configured to detect a multichromatic light beam reflected on the product in a detection plane of the detection unit, the detection plane being perpendicular, preferably orthogonal, to a product surface of the product, wherein the detection unit has a dispersive or diffractive element disposed in the detection plane in the objective or between the objective and the product, the reflected multichromatic light beam being projectable onto an image plane of the area scan camera, the processing device being configured to derive a height information of the product surface from a spatial distribution of saturation values of the reflected multichromatic light beam in the image plane, a position of the optical detection unit relative to the product and the angle of incidence β.
Depending on a difference in height of a product surface, a reflection image of the multichromatic light beam on the product surface shifts relative to an optical axis of the objective or to the detection plane of the detection unit or the area scan camera due to the multichromatic light beam directed at the detection plane at the angle of incidence β, the wavelength of the multichromatic light beam varying with a height in the detection plane. As a result, the reflection image in question is projected into the image plane of the area scan camera at an offset relative to the optical axis or the detection plane as a function of the wavelength or the height, as well. By determining a location of the image in the image plane, it is then possible for the processing device to calculate the height information of the product surface. Optionally or additionally, the dispersive or diffractive element of the detection unit may also be disposed between the objective and the area scan camera.
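Purely by way of illustration, the following Python sketch shows one possible way to convert such a lateral image shift into a height value, assuming a simple triangulation model in which a height change displaces the point of incidence by dh·tan(β) in the object plane (β measured relative to the detection plane, as described above) and the objective images this displacement with a constant magnification; the function name, parameters and numerical values are hypothetical and not taken from the embodiment described here.

```python
import math

def height_from_image_shift(shift_px: float, pixel_pitch_mm: float,
                            magnification: float, beta_deg: float) -> float:
    """Estimate a surface height change from the lateral shift of the
    reflection image in the image plane of the area scan camera.

    Assumes simple triangulation geometry: a height change dh displaces the
    point of incidence of the obliquely projected multichromatic beam by
    dx = dh * tan(beta) in the object plane, and the objective images this
    displacement into the image plane with a constant magnification.
    """
    shift_obj_mm = shift_px * pixel_pitch_mm / magnification   # back-project shift to the object plane
    return shift_obj_mm / math.tan(math.radians(beta_deg))     # dh = dx / tan(beta)

# Illustrative numbers only: a 3-pixel shift on a chip with 5.5 um pixel pitch,
# 0.5x magnification, beam projected at beta = 45 degrees
print(height_from_image_shift(3, 0.0055, 0.5, 45.0))  # ~0.033 mm
```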
Since the multichromatic light beam, which is formed from monochromatic light beams, is projected onto the image plane of the detection device or the area scan camera via the dispersive or diffractive element, disposed in the detection plane in the objective or between the objective and the product, the light beam formed from monochromatic light beams and reflected by the object can be split again by the dispersive or diffractive element. This splitting on a detection path, i.e., in the objective or between the objective and the product, allows multiple lines of the area scan camera to be used for color recognition. In particular when a grayscale chip is used as the area scan camera, a higher resolution and a more precise color determination can be effected compared to an RGB chip. Also, a grayscale chip does not require color filters, meaning a maximum sensitivity of the grayscale chip can be used. Overall, this enables improved resolution of a height measurement and a more precise determination of the height information of the product surface.
Furthermore, line images may be captured with above-average saturation values by at least two sensor lines, preferably three sensor lines, particularly preferably four to a maximum of five sensor lines, of the area scan camera using the processing device. A shift of the reflection image in the image plane is then determined by evaluating a spatial distribution or an apparent width of saturation values using the processing device. The location of the shift is determined from the spatial distribution or the apparent width of the saturation values, since the sensor lines of the area scan camera with the above-average saturation values reproduce it. It has been found that a signal spread over more than four sensor lines, in particular five to a maximum of six sensor lines, becomes imprecise, meaning height information can no longer be determined with sufficient accuracy. Furthermore, a signal obtained from a single sensor line is dominated by noise, in which case the height information cannot be determined with sufficient accuracy, either. It is therefore sufficient to determine no more than two sensor lines with above-average saturation values using the processing device and to capture the line images imaged on these sensor lines in the image plane. This not only makes it possible to determine the height or a topography but also to capture line images of the product surface at the same time. By using the dispersive or diffractive element, which is positioned in the detection plane in the objective or between the objective and the product, line images in different color gradations are obtained.
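As a non-binding sketch of how such a localization of the shift might be carried out by the processing device, the following Python fragment determines a sub-line position of the reflection image as the saturation-weighted centroid of the few sensor lines with the highest saturation values; the array sizes and values are hypothetical.

```python
import numpy as np

def saturation_centroid(saturation_per_line: np.ndarray, top_k: int = 3) -> float:
    """Locate the reflection image across the sensor lines of the area scan
    camera as the saturation-weighted centroid of the top_k lines with the
    highest (above-average) saturation values."""
    idx = np.argsort(saturation_per_line)[-top_k:]           # indices of the top_k sensor lines
    weights = saturation_per_line[idx]
    return float(np.sum(idx * weights) / np.sum(weights))    # sub-line position of the shift

# Illustrative: saturation concentrated around sensor lines 17-19 of a 36-line chip
sat = np.zeros(36)
sat[16:20] = [0.2, 0.9, 1.0, 0.4]
print(saturation_centroid(sat, top_k=3))   # ~17.8
```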
Advantageously, a center of gravity or an expected value of a Gaussian distribution of the light in the image plane of the area scan camera may correspond to a full width at half maximum, preferably 50%, 60%, 70%, 80% or 90% of the Gaussian distribution; the center of gravity may cover at least one sensor line. “Cover” is understood to mean that the sensor line is fully covered in a perpendicular direction relative to the sensor line. Since different surfaces of a product form different luminances, different intensities of light in the image plane of the area scan camera may have less influence on a measurement result. If at least one sensor line is covered, at least one adjacent second sensor line is illuminated with the result that line images can already be captured with two sensor lines. The height information can thus be determined more clearly.
The center of gravity for a first height information may cover a first sensor line; a center of gravity for a second height information may cover a second sensor line adjacent to the first sensor line. In this manner, two pieces of height information can be obtained with just three adjacent sensor lines. Respective Gaussian distributions of different wavelengths or different color gradations of the light can then be superimposed in the area of at least one sensor line, preferably no more than one sensor line. For example, height information can have a width of 1.5 to 2 sensor lines. A shift in the height information or a change between two sensor lines can thus be detected particularly easily. Height information that differs from one another can thus be determined clearly and very accurately.
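For orientation only, the relationship between the standard deviation of a Gaussian intensity profile and its full width at half maximum, FWHM = 2·sqrt(2·ln 2)·σ, can be used to check whether the half-maximum region covers at least one sensor line; the following Python sketch uses hypothetical pixel values.

```python
import math

def fwhm(sigma_px: float) -> float:
    """Full width at half maximum of a Gaussian intensity profile, in pixels."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_px

def covers_one_line(sigma_px: float, line_pitch_px: float = 1.0) -> bool:
    """Check whether the half-maximum region of the Gaussian spans at least one
    full sensor line, so that an adjacent sensor line is also illuminated."""
    return fwhm(sigma_px) >= line_pitch_px

print(fwhm(0.7), covers_one_line(0.7))   # ~1.65 px at half maximum -> True
```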
Height information of the product surface can be derived depending on the position of the spatial distribution of saturation values in the image plane of the area scan camera. The fact that the dispersive or diffractive element splits the light beam passing through the objective means that a comparatively larger area of the area scan camera can be used. This results in a wider spatial distribution of saturation values in the image plane of the area scan camera. The wider spatial distribution of the light beam in the image plane also makes it possible to obtain even more precise height information of the product surface. In this manner, a wavelength or wavelength range of the light beam can be analyzed even more precisely based on the distribution of the saturation values and more accurate color information and thus height information can be obtained.
The processing device can be used to derive an analysis image of the product from a plurality of line images. The processing device can then combine the captured line images in the course of a scanning process to form an analysis image of the product surface. In particular, height information and a high-quality analysis image can be obtained simultaneously in a single scan of the product surface.
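A minimal Python sketch of this assembly step, assuming that each motion step of the product yields one line image of fixed length, could look as follows; the dimensions are purely illustrative.

```python
import numpy as np

def assemble_analysis_image(line_images: list[np.ndarray]) -> np.ndarray:
    """Stack the line images captured during a scanning pass (one line per
    motion step of the product) into a two-dimensional analysis image."""
    return np.stack(line_images, axis=0)

# Illustrative scan: 500 motion steps, each yielding a 2048-pixel line image
scan = [np.random.rand(2048) for _ in range(500)]
analysis_image = assemble_analysis_image(scan)
print(analysis_image.shape)   # (500, 2048)
```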
By means of the objective, a line image can be projected from an object plane of the product surface onto an image plane of the area scan camera; the area scan camera may be disposed transversely, preferably orthogonally, to a direction of movement of a product. For example, the area scan camera may have a rectangular sensor surface, allowing a detection width in relation to a product surface to be optimally utilized. In this case, the objective serves to project a line image, which corresponds to a multichromatic light beam reflected on the product and which can be captured by the objective, into the image plane of the area scan camera. Due to the refraction at the dispersive or diffractive element disposed between the objective and the product, the beam path is divergent, and the projection is spectrally distributed across the image plane according to the spectrum of the spectrometer member and the multichromatic light beam reflected on the product.
The area scan camera may be formed by an RGB chip or a grayscale chip which has 32 to 128 sensor lines, preferably 32 to 64 sensor lines, transverse, preferably orthogonal, to a direction of movement of a product. For example, the area scan camera may have 1024 pixels×36 sensor lines, 2048 pixels×72 sensor lines, 4096 pixels×128 sensor lines, 16,384 pixels×256 sensor lines or more. In principle, a higher resolution can be achieved with a grayscale chip, as all sensor lines of the grayscale chip can be used regardless of a wavelength of the light. Nevertheless, it is also possible to use an RGB chip.
The projection device may emit light in the red, green, blue (RGB), infrared (IR) and/or ultraviolet (UV) wavelength ranges, preferably in a wavelength range of 400 to 700 nm, and the area scan camera can detect this light.
The inspection system may have a further projection device, and the further projection device may emit light with a different wavelength range than the projection device or a matching wavelength range. Alternatively, the further projection device and the projection device may be disposed at different heights relative to the product surface in order to illuminate the product surface at a height with different wavelength ranges. The further projection device may be configured identically to the projection device. Nevertheless, it is possible for the further projection device to emit light of a different wavelength range or a matching wavelength range. Thus, it is also possible to determine, from the image captured by the area scan camera, which projection device the detected light originates from. This may also result in multiple spaced-apart areas with above-average saturation values in the image plane of the area scan camera. This also makes it possible to capture many different features of the product with one image acquisition. If the projection device and the further projection device use light in different wavelength ranges, the product can be illuminated with light of a shorter wavelength and with light of a longer wavelength at the same time. The dispersive element can then split this light, which is reflected by the product, again. In this manner, even more precise height information can be determined overall. A total of three or more matching or different projection devices may also be provided. In this manner, a plurality of projection devices can be used to generate multiple images in a single scan. This can significantly speed up the inspection of products. The product can also be inspected simultaneously for a variety of defect types, which may require different lighting for optimum detection. The projection device and the further projection device may be arranged symmetrically opposite each other relative to the detection plane. This makes it possible to take essentially shadow-free images of a product.
The dispersive or diffractive element may be a prism or a diffraction grating. In a particularly simple embodiment, the prism may be a dispersion prism in the form of an isosceles triangle, for example.
The objective may be a telecentric objective with a diaphragm on the image side. The beam path of the objective may then run at least partially parallel within the objective. The diaphragm or aperture diaphragm may be disposed in an image-side focal plane of the objective. Consequently, the objective may also be a partially telecentric objective.
The diaphragm may be a slit diaphragm. Since an area scan camera is used, it is sufficient if the diaphragm is a slit diaphragm that is disposed or aligned in the direction of the longitudinal axis of the area scan camera.
The objective may have a front lens that is formed by a converging lens and/or by an achromatic lens. The achromatic lens may be composed of at least two lenses, the lenses of the achromatic lens being glued together or merely placed directly against one another, with a very thin air gap being formed. The air gap between the lenses may be up to 100 μm, for example. This makes it possible to form an achromatic lens particularly cost-effectively and to dispense with gluing the lenses together. The use of an achromatic lens makes it possible to correct chromatic aberrations or distortion errors.
The front lens may be a cylindrical or spherical lens. It may be sufficient to use a cylindrical lens alone as the front lens, as only a light slit in the image plane or the line image needs to be imaged. By using the cylindrical lens, it is possible to save costs in the manufacture of the inspection system.
The front lens may be in the shape of a circle segment with two coaxial and parallel boundary surfaces. If the front lens is a cylindrical lens, the edges of the front lens may be cut off so that a rod-shaped lens which includes a main axis of the cylindrical lens remains. If the front lens is a spherical lens, boundary surfaces at the outer ends of the front lens may also be parallel, in which case the front lens may only approximately correspond to a circle segment. Using a rod-shaped front lens configured in this manner, it is possible to save installation space next to the detection unit compared to a completely circular front lens and then use this installation space for placing the projection device or an illumination device. The inspection system becomes particularly compact in this case.
A circle segment of the front lens may have a depth of <20 mm, preferably 10 mm, and the front lens may have a width of >250 mm, preferably 180 mm. The width of the front lens may also be understood as a diameter of the front lens. The depth of the front lens may correspond to a relative distance between the parallel boundary surfaces.
The optical detection device may have at least one second detection unit with a second area scan camera, a second dispersive or diffractive element and a second objective, and the area scan cameras may be disposed in alignment relative to one another in the direction of their longitudinal axes. The area scan cameras may be disposed in such a manner at a distance relative to one another that the respective line images of the area scan cameras which may be captured from the object plane by the area scan cameras in a common detection plane may partially overlap, the processing device being configured to assemble or combine the overlapping line images into a combination line image, the processing device being configured to derive an analysis image of the product from a plurality of combination line images.
The second detection unit of the optical detection device may be disposed in such a manner with the second area scan camera and the second objective that the area scan cameras are positioned in a row with respect to their longitudinal axes and thus transversely to the direction of movement. The area scan cameras may be disposed at a distance relative to each other but adjacent to each other. The line images in the object plane are comparatively larger than the line images in the image plane of the area scan camera, which makes it possible to arrange the detection units or area scan cameras in such a manner that the respective line images of the area scan cameras partially overlap at their longitudinal ends, i.e., that the area scan cameras can each capture matching sections (b) of the product surface. The processing device may be configured in such a manner that it can assemble or combine these overlapping line images into a combination line image. This makes it possible in principle to form an inspection system with an almost arbitrarily large detection width (E) by forming a row of area scan cameras or detection units. A total width of a product may then be captured in one scan pass, and the combination line images may be combined to form the analysis image of the product using the processing device.
The combination line image may have a detection width (E) in the object plane that is greater than a detectable width (B) of the line image in the object plane. Consequently, the respective line images do not overlap completely but only in an overlapping section (b). For example, the overlapping section may be 1 mm to 3 mm in size.
A detectable width (B) of the line image in the object plane may be greater than a diameter (D) of the objective or a length (L) of the area scan camera. This makes it possible to arrange area scan cameras with aligned longitudinal axes next to each other and still obtain an undivided combination line image in the first place. The diameter of the objective is understood to be a maximum external dimension of the objective, and the length of the area scan camera is understood to be a maximum external length of the area scan camera.
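Assuming that each of n detection units contributes a detectable width B and that adjacent line images overlap by a section b, the resulting total detection width would be E = n·B − (n−1)·b; the following Python sketch evaluates this hypothetical relation with purely illustrative numbers.

```python
def total_detection_width(n_cameras: int, b_line_mm: float, overlap_mm: float) -> float:
    """Total detection width E achievable by n detection units whose line
    images (detectable width B each) overlap by a section b at their ends."""
    return n_cameras * b_line_mm - (n_cameras - 1) * overlap_mm

# Illustrative: four detection units, 180 mm detectable width each, 2 mm overlap
print(total_detection_width(4, 180.0, 2.0))   # 714.0 mm
```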
An optical axis of the objective may be disposed orthogonally relative to the object plane and transversely to the direction of movement, and a convergent beam path of the objective may run at an angle α > 80° and < 90° relative to the object plane. The convergent beam path of the objective makes it possible to partially overlap the line images of the object plane and at the same time to arrange the detection units next to each other, possibly with a space in between. The optical axes of the respective objectives may then be disposed in parallel in the detection plane.
It is particularly advantageous if the line images overlap by 1 mm to 3 mm. In this case, the processing device may be configured to combine the line images to form the combination line image based on matching pixels in the overlapping areas. By adding the line images in the overlap area, it is possible to achieve a particularly good image quality in the overlap area. Imaging errors of the objective in the edge area of its detectable width can then also be tolerated.
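By way of example only, the following Python sketch combines two partially overlapping line images by refining the overlap around a nominal value via pixel matching (here a normalized cross-correlation) and averaging the overlap section; the function, its parameters and the search strategy are assumptions, not the specific implementation of the processing device.

```python
import numpy as np

def combine_line_images(left: np.ndarray, right: np.ndarray,
                        nominal_overlap_px: int, search_px: int = 10) -> np.ndarray:
    """Combine two partially overlapping line images into one combination line
    image. The exact overlap is refined around the nominal value by matching
    pixels (normalized cross-correlation); the overlap section is averaged to
    improve image quality at the seam."""
    best_overlap, best_score = nominal_overlap_px, -np.inf
    for ov in range(max(1, nominal_overlap_px - search_px),
                    nominal_overlap_px + search_px + 1):
        a, b = left[-ov:], right[:ov]
        score = np.corrcoef(a, b)[0, 1]          # similarity of the candidate overlap
        if score > best_score:
            best_overlap, best_score = ov, score
    ov = best_overlap
    seam = 0.5 * (left[-ov:] + right[:ov])       # average the overlap section
    return np.concatenate([left[:-ov], seam, right[ov:]])

# Illustrative: two 2048-pixel line images overlapping by ~40 pixels
base = np.random.rand(4056)
li_left, li_right = base[:2048], base[2008:]
print(combine_line_images(li_left, li_right, 40).shape)   # (4056,)
```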
The projection device may have an illumination device, and the illumination device may be configured to project diffuse light onto the product, and the area scan camera may be configured to detect light of the diffuse light reflected on the product in the detection plane. The illumination device may have a diffuser, by means of which a homogeneous distribution of the light on the product surface may be achieved while avoiding strong contrasts. The illumination device may emit light in the red, green and blue (RGB), infrared (IR) and/or ultraviolet (UV) wavelength ranges. Also, color components of the light may be selectable as desired in order to mix certain wavelength ranges. The illumination device may be composed of a number of light-emitting diodes (LEDs) disposed in rows or a matrix. Also, the illumination device may have a polarization filter.
The projection device may have a second illumination device, and the first and second illumination devices may be disposed coaxially relative to the optical detection device. The illumination devices may also be disposed transversely or orthogonally to the direction of movement of the product. This avoids a possible formation of shadows on the product surface. The structure of the second illumination device may correspond to that of the first illumination device.
In the method according to the invention for analyzing defects in a product, in particular a printed circuit board product, a semiconductor wafer or the like, using an inspection system comprising a projection device, an optical detection device and a processing device, a spectrometer member of the projection device splits white light into its spectral components and projects a multichromatic light beam thus formed from monochromatic light beams onto a product at an angle of incidence β, the optical detection device having a detection unit comprising an area scan camera and an objective, a multichromatic light beam being reflected on the product in a detection plane of the detection unit, the detection plane being perpendicular, preferably orthogonal, to a product surface of the product, the area scan camera detecting said multichromatic light beam, wherein a dispersive or diffractive element of the detection unit disposed in the detection plane in the objective or between the objective and the product projects the reflected multichromatic light beam onto an image plane of the area scan camera, the processing device deriving a height information of the product surface from a spatial distribution of saturation values of the reflected multichromatic light beam in the image plane, a position of the optical detection unit relative to the product and the angle of incidence β. With regard to the advantageous effects of the method according to the invention, reference is made to the description of advantages of the inspection system according to the invention.
By means of the processing device, line images may be captured simultaneously by at least three, preferably five, sensor lines of the area scan camera with the highest saturation values. For example, it is then also possible to superimpose at least two of the sensor lines or their line images in order to obtain a single quality-optimized line image. At the same time, the amount of data to be processed remains small, which enables fast processing of the image data and therefore a fast scan of the product. By superimposing the line images, a single line image of the reflected multichromatic light beam can be obtained with the area scan camera. The area scan camera is therefore used simultaneously for determining the topography and for selecting usable line images.
A further projection device of the inspection system may emit light in a different wavelength range than the projection device or in a matching wavelength range, the area scan camera being configured to simultaneously capture line images in the wavelength ranges of the projection device and the further projection device. The image plane or an image in the image plane of the area scan camera may therefore be analyzed in terms of hue, brightness and/or saturation using the processing device.
The image information may be used in particular to analyze the material type and the distribution, as different materials have different H, S and V values. A color space may be selected as a function of the materials to be analyzed; an RGB color space may serve as the basis.
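As a purely illustrative sketch of such a material analysis, the following Python fragment converts an RGB pixel of the analysis image to HSV and applies rough, hypothetical hue and saturation thresholds; real thresholds would have to be calibrated for the materials and illumination of the specific inspection setup.

```python
import colorsys

def classify_material(r: float, g: float, b: float) -> str:
    """Very rough material guess from the hue/saturation/value of a pixel of
    the analysis image (RGB channels normalized to 0..1). The thresholds are
    purely illustrative."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if s < 0.15 and v > 0.7:
        return "bare solder / tinned surface (bright, unsaturated)"
    if 0.02 < h < 0.12 and s > 0.4:
        return "copper (orange hue)"
    if 0.25 < h < 0.45 and s > 0.3:
        return "solder resist (green hue)"
    return "unclassified"

print(classify_material(0.75, 0.45, 0.25))   # orange-ish pixel -> copper
```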
The processing device may be used to determine a material, a material property and/or a geometric structure of the product from the analysis image and/or to compare the analysis image with a reference image. Further material structure information of a product may be obtained if line images of the analysis image are superimposed and evaluated using the processing device. For example, two line images of a matching product surface may be combined using the image processing of the processing device. Optionally, sequential illumination by means of the spectrometer member may also take place. In this case, a first analysis image is first recorded with illumination in a single wavelength range, and then a second analysis image is captured with illumination in a single different wavelength range of a matching product surface. Finally, the analysis images are combined using image processing. The processing device may further be used to compare analysis image information or analysis images with reference images as part of a defect analysis. The reference images of the product may include CAD data and material distribution data of the product. The comparison may be carried out by image processing, wherein difference images may be analyzed separately for different structures of the product. Furthermore, this material information may be combined with height information in order to clearly identify a conductor track, for example. The reference image information may therefore include all geometric data of the product, material information, component information and also height information. If the analysis image information deviates from the reference image information, a defect may be signaled.
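A minimal sketch of such a comparison, assuming the analysis image and the reference image have already been registered to one another, could compute a thresholded difference image as follows; the threshold and image contents are hypothetical.

```python
import numpy as np

def defect_map(analysis_image: np.ndarray, reference_image: np.ndarray,
               threshold: float) -> np.ndarray:
    """Compare an analysis image with a reference image of the product and flag
    pixels whose absolute difference exceeds a threshold as defect candidates.
    Registration of the two images is assumed to have been done beforehand."""
    diff = np.abs(analysis_image.astype(float) - reference_image.astype(float))
    return diff > threshold

# Illustrative: a reference image and an analysis image with one deviating region
ref = np.zeros((100, 100))
ana = ref.copy()
ana[40:45, 60:70] = 0.8          # e.g. an incompletely printed conductor track
print(defect_map(ana, ref, 0.5).sum())   # 50 flagged pixels
```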
As noted above, at least two or more line images of a matching product surface can be combined using the image processing of the processing device. This may result in what is referred to as an HDR (high dynamic range) image. A graduated series of exposures may be achieved by adjusting the illumination using the spectrometer member and/or a correspondingly different detection of the sensor lines of the area scan camera using the processing device. For example, a 12-bit analysis image of the area scan camera in which several brightnesses are integrated in 8-bit format may be read out by means of the processing device. Overall, a color location of a color space of a pixel of the analysis image in question may thus be determined even more precisely.
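One possible way to merge such a graduated series of exposures into an HDR image is sketched below in Python, assuming 8-bit input images and known relative exposure factors; the weighting scheme and the numbers are illustrative assumptions rather than the method actually used by the processing device.

```python
import numpy as np

def merge_exposures(exposures: list[np.ndarray], exposure_factors: list[float]) -> np.ndarray:
    """Merge a graduated series of 8-bit line or analysis images into one
    high-dynamic-range image. Each image is normalized by its relative exposure
    factor; saturated and underexposed pixels receive a low weight."""
    acc = np.zeros(exposures[0].shape, dtype=float)
    wsum = np.zeros_like(acc)
    for img, factor in zip(exposures, exposure_factors):
        x = img.astype(float)
        w = 1.0 - np.abs(x / 255.0 - 0.5) * 2.0      # hat-shaped weight: favor mid-range pixels
        acc += w * (x / factor)                       # radiance estimate per exposure
        wsum += w
    return acc / np.maximum(wsum, 1e-6)

# Illustrative: three exposures of the same line image at 1x, 2x and 4x exposure
base = np.linspace(0, 120, 2048)
series = [np.clip(base * f, 0, 255).astype(np.uint8) for f in (1.0, 2.0, 4.0)]
hdr = merge_exposures(series, [1.0, 2.0, 4.0])
print(hdr.shape, hdr.max())
```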
Further advantageous embodiments of the method are apparent from the descriptions of features of the claims dependent on the device claim.
Hereinafter, a preferred embodiment of the invention is explained in more detail with reference to the attached Figure. The Figure shows a simplified principle representation of an embodiment of an inspection system 26 in a side view. The inspection system 26 has an area scan camera 27. Furthermore, a detection device 28 having a detection unit 29 comprising the area scan camera 27, an objective 30 and a dispersive element 31 is shown together with a projection device 32. The projection device 32 has a light source 33, which emits white light, a diaphragm 34 and a further dispersive element 35. The further dispersive element 35 is formed by a further prism 36, by means of which a multichromatic light beam 37 is projected onto a product surface 38 of a product (not shown) transversely to a direction of movement indicated by an arrow 39.
The objective 30 comprises a lens assembly 40, which is shown schematically here, a front lens 41 and a diaphragm 42 disposed on the image side. The diaphragm 42 is configured in particular as a slit diaphragm 43. The front lens 41 is in the shape of a circle segment and has two parallel boundary surfaces 45 coaxial with an optical axis 44. The optical axis 44 runs through a detection plane 46 of the detection unit 29, the detection plane 46 being orthogonal to the product surface 38, which corresponds to an object plane 47. The multichromatic light beam 37 strikes the product surface 38 or the object plane 47 at the angle of incidence β relative to the detection plane 46 and is reflected from there into the objective 30. The dispersive element 31, which is formed by a prism 48, is disposed between the objective 30 and the product surface 38. The prism 48 disperses the light entering the objective 30 and projects it onto the area scan camera 27 or the image plane 49 thereof via the objective 30. Alternatively or additionally, a further optical element (not shown) which corrects the longitudinal chromatic aberration may further be disposed on the optical axis 44 between the objective 30 and the image plane 49.
A processing device (not shown) derives height information of the product surface 38 relative to the area scan camera 27 based on a spatial distribution of the reflected multichromatic light beam 37 on the area scan camera 27. For this purpose, sensor lines (not shown) of the area scan camera 27 which run parallel to the detection plane 46 are evaluated, five sensor lines with the highest or maximum saturation values being detected. The height information may then be calculated from a position of the sensor lines relative to the detection plane 46 and the angle of incidence β. Furthermore, the processing device is used to superimpose the sensor lines or their line images. These line images are in turn combined to form an analysis image of the product.
Furthermore, a further projection device 50 of the inspection system 26, shown here by way of indication, may be provided. The further projection device 50 is identical in structure to the projection device 32 and is disposed symmetrically to the projection device 32 relative to the detection plane 46. In particular, the further projection device 50 is also positioned at the angle of incidence β relative to the detection plane 46. The further projection device 50 emits light with a wavelength range that differs from that of the projection device 32 onto the product surface 38. The area scan camera 27 can then simultaneously capture line images in the respective wavelength ranges of the projection device 32 and the further projection device 50. In this manner, at least two three-dimensional images of the product surface 38 can be generated with a single image acquisition. Since both three-dimensional images are based on light with different wavelength ranges, further features of the product surface and even more precise height information can be obtained.
Priority application: 10 2021 124 505.2, filed September 2021, DE (national).
International filing: PCT/EP2022/076258, filed September 21, 2022 (WO).