The present invention generally concerns optical inspection installations, particularly those comprising three-dimensional (3D) image determination systems intended for the on-line analysis of objects, particularly of electronic circuits. The invention more particularly concerns optical inspection installations comprising digital cameras.
An optical inspection installation is generally used to verify the sound condition of an object, for example, an electronic circuit, before it is released to the market. The optical inspection installation may provide a 3D image of the object which is analyzed by a computer and/or by an operator to search for possible defects. A 3D image of an object corresponds to a cloud of points, for example, several million points, of at least a portion of the external surface of the object, where each point of the surface is located by its coordinates determined with respect to a three-dimensional space reference system.
The optical inspection installation generally comprises a processing unit capable of performing an automatic analysis of the images of the object to search for possible defects. This is for example done by comparing the image of the object with a reference image. In the case of an electronic circuit comprising, for example, a printed circuit having electronic components affixed thereto, the images of the electronic circuit may be used, in particular, to inspect the sound condition of the solder joints of the electronic components on the printed circuit.
A method of determining a 3D image comprises the projection of light patterns onto the object to be inspected, for example, fringes, the acquisition of images by cameras while the light patterns are projected onto the object to be inspected, and the determination of the 3D image based on the acquired images. In particular, in the case where the object is laid on a horizontal reference plane, each point of the 3D image may comprise a height coordinate relative to the reference plane.
The object to be inspected may comprise portions made of a translucent material. This may in particular be the case when the object comprises a printed circuit whose board, to which electronic components are soldered, is made of a translucent material.
A disadvantage of a method of determining a 3D image by projection of light patterns onto such an object is that the projected patterns may partially penetrate into the translucent portions of the object. The 3D image of the translucent portions may then be incorrectly determined. In particular, in the case where, for each point of the object, a height coordinate relative to a reference plane is determined, the height coordinate of a point of a translucent portion may be smaller than the value that should have been determined.
An object of an embodiment is to at least partly overcome the disadvantages of the previously-described 3D image determination methods and 3D image determination systems.
Another object of an embodiment is to detect the presence of the translucent portions of an object.
Another object of an embodiment is for the 3D image of an object comprising translucent portions to be correctly determined.
Another object of an embodiment is to cause few modifications with respect to a known 3D image determination method.
Thus, an embodiment provides a method of determining a three-dimensional image of an object, comprising:
the projection by at least one projector of a plurality of first images onto the object, each first projected image comprising first light patterns spaced apart by a first period;
the acquisition, for each first projected image, of at least one first two-dimensional image of the object by at least one image sensor;
the projection by said at least one projector of a plurality of second images onto the object, each second projected image comprising second light patterns spaced apart by a second period different from the first period;
the acquisition, for each second projected image, of at least one second two-dimensional image of the object by said at least one image sensor; and
the detection of at least one translucent area of the object by comparison of first signals obtained from the first images and of second signals obtained from the second images and, for the translucent area, the determination of the height of each point of the translucent area based on the first and second signals.
According to an embodiment, the method further comprises:
the projection by said at least one projector of a plurality of third images onto the object, each third projected image comprising third light patterns spaced apart by a third period different from the first period and different from the second period;
the acquisition, for each third projected image, of at least one third two-dimensional image of the object by said at least one image sensor; and
the determination, for the translucent area, of the height of each point of the translucent area based on the first and second signals and on third signals obtained from the third images.
According to an embodiment, the first patterns are periodic along a given direction, with a period equal to the first period, the first period being in the range from 1 mm to 15 mm.
According to an embodiment, the first light patterns comprise first light fringes.
According to an embodiment, the second patterns are periodic along the given direction, with a period equal to the second period, the second period being in the range from 1 mm to 15 mm.
According to an embodiment, the second light patterns comprise second light fringes.
According to an embodiment, the first fringes are straight and parallel and the second fringes are straight and parallel.
According to an embodiment, the first patterns are not periodic, the first period corresponding to the average interval between the first patterns.
According to an embodiment, the method comprises determining a first height for each point of the object based on the first images, determining a second height for each point of the object based on the second images, detecting at least one translucent area of the object by comparison of the first and second heights and determining, for each point of the translucent area, a third height for said point based on the first and second heights for said point and on the first and second periods.
According to an embodiment, the first light patterns are phase-shifted from a first projected image to the next one and the second light patterns are phase-shifted from a second projected image to the next one.
An embodiment also provides a system for determining three-dimensional images of an object, comprising:
at least one projector configured to project a plurality of first images onto the object, each first projected image comprising first light patterns spaced apart by a first period, and a plurality of second images onto the object, each second projected image comprising second light patterns spaced apart by a second period different from the first period;
at least one image sensor configured to acquire, for each first projected image, at least one first two-dimensional image of the object and, for each second projected image, at least one second two-dimensional image of the object; and
a unit configured to detect at least one translucent area of the object by comparison of first signals obtained from the first images and of second signals obtained from the second images and, for the translucent area, to determine the height of each point of the translucent area based on the first and second signals.
According to an embodiment, the system comprises a unit for supplying digital images and the projector is capable of projecting said plurality of images onto the object, each of said images being formed by the projector from one of said digital images.
The foregoing and other features and advantages will be discussed in detail in the following non-limiting description of specific embodiments in connection with the accompanying drawings, in which:
For clarity, the same elements have been designated with the same reference numerals in the various drawings and, further, the various drawings are not to scale. Unless otherwise specified, expressions “about”, “approximately”, and “substantially” mean to within 10%, preferably to within 5%. Further, only those elements which are useful to the understanding of the present description have been shown and will be described.
In the following description, embodiments will be described in the case of the optical inspection of electronic circuits. However, these embodiments may apply to the determination of three-dimensional images of all types of objects, particularly for the optical inspection of mechanical parts. Let (OX) and (OY) be two perpendicular directions. As an example, direction (OX) is horizontal.
Electronic circuit Board is placed on a conveyor 12, for example, a planar conveyor. Conveyor 12 is capable of displacing circuit Board parallel to direction (OY). As an example, conveyor 12 may comprise an assembly of straps and of rollers driven by a rotating electric motor 14. As a variation, conveyor 12 may comprise a linear motor displacing a carriage supporting electronic circuit Board. Circuit Board for example corresponds to a rectangular card having a length and a width varying from 50 mm to 550 mm.
Optical inspection installation 10 comprises a system 15 for determining a 3D image of electronic circuit Board. According to an embodiment, system 15 is capable of determining a 3D image of circuit Board by projection of images, for example, fringes, onto the circuit to be inspected. System 15 may comprise an image projection device P comprising at least one projector, a single projector P being shown in
System 15 further comprises an image acquisition device C comprising at least one camera, for example, a digital camera. As an example, two cameras C are shown in
The means for controlling conveyor 12, camera C, and projector P of previously-described optical inspection installation 10 are within the abilities of those skilled in the art and are not described in further detail. As a variant, the displacement direction of circuit Board may be a horizontal direction perpendicular to the direction (OY) shown in
System 15 is capable of determining a 3D image of circuit Board. A 3D image of circuit Board corresponds to a cloud of points, for example, of several million points, of at least a portion of the external surface of circuit Board, where each point of the surface is located by its coordinates (x, y, z) determined with respect to a three-dimensional space reference system RREF (OX, OY, OZ). In the following description, plane (OX, OY) is called reference plane PlREF. The z coordinate of a point of the surface of the object then corresponds to the height of the point measured with respect to reference plane PlREF. As an example, reference plane PlREF corresponds to the plane containing the upper surface or the lower surface of the printed circuit. Plane PlREF may be horizontal. Preferably, direction (OZ) is perpendicular to plane (OX, OY), that is, perpendicular to the upper or lower surface of the printed circuit.
The inventors have shown the existence of a dependency relationship between the bias E which occurs during the determination of the points of the 3D image belonging to a translucent portion of the circuit and the period T of the images projected onto the circuit for the determination of the 3D image.
The inventors have carried out many tests and have shown that for the translucent materials used in electronics and microelectronics, there is a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected.
Further, the inventors have shown by many tests that a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected is obtained whatever the type of periodic patterns used.
According to an embodiment, each image projected for the determination of a 3D image comprises periodic patterns along a preferred direction. In particular, when patterns correspond to periodic fringes, the period of the patterns corresponds to the distance between two successive fringes. In the examples shown in
Further, the inventors have shown by many tests that a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected is also obtained, even when the projected light patterns do not have a periodic character but comprise spaced apart light patterns, the average space between adjacent light patterns, possibly along a preferred direction, then corresponding to the previously-described period T.
At step 30, first images are projected onto the object to be inspected, each first image comprising light patterns having a first period T1. Period T1 may be in the range from 1 mm to 15 mm. In the present embodiment of a method of determining a 3D image, at step 30, a plurality of first images are successively projected onto circuit Board. The first images differ from one another by an offset of the patterns along a preferred direction. As an example, for the image 24 shown in
According to an embodiment, processing unit 16 comprises a unit for determining a digital image and projector P is capable of projecting an image obtained from the digital image. According to an embodiment, projector P is of the type comprising a lamp emitting a beam which is directed towards an optical motor. The optical motor modulates the beam, according to the digital image, to form an image which is projected onto circuit Board. The optical motor may comprise an active area. As an example, the optical motor may comprise an array of liquid crystal (LCD) shutters which operates in transmission, the light beam crossing the LCD array. As a variant, the optical motor may implement the DLP (digital light processing) technology, which relies on the use of a device comprising an array of adjustable micro-mirrors, the light beam reflecting on the mirrors. As a variant, the optical motor may implement the LCoS (liquid crystal on silicon) technology, which relies on the use of a liquid crystal device, the light beam reflecting on the device. According to another variant, the optical motor may implement the GLV (grating light valve) technology, which relies on the use of a dynamically adjustable diffraction grating based on reflecting bands. According to another embodiment, projector P may implement at least one laser beam which is modulated according to the digital image, the image being obtained by a raster scanning of the modulated laser beam.
Advantageously, when projector P is capable of projecting an image obtained from a digital image, the projected images may be simply obtained by modifying the digital image which controls projector P.
At step 32, second images are projected onto the object to be inspected, each second image comprising the same type of light patterns as the first images but with a second period T2 different from first period T1. Period T2 may be in the range from 1 mm to 15 mm. In the present embodiment of a method of determining a 3D image, at step 32, a plurality of second images with the patterns having the second period are successively projected onto circuit Board. The second images differ from one another by an offset of the patterns having the second period along a preferred direction. A 2D image is acquired during the projection of each new second image with light patterns onto circuit Board.
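As an illustration of steps 30 and 32, the phase-shifted fringe images can be sketched as follows. This is a minimal sketch, not the projector's actual control code; the sampling density, background level a, and amplitude b are assumed example values:

```python
import numpy as np

def fringe_image(width_mm, period_mm, shift_index, n_shifts,
                 samples_per_mm=10, a=0.5, b=0.5):
    """One sinusoidal fringe profile along the preferred direction,
    offset by shift_index * 2*pi / n_shifts from the first image."""
    x = np.arange(int(width_mm * samples_per_mm)) / samples_per_mm
    phase = 2 * np.pi * x / period_mm + 2 * np.pi * shift_index / n_shifts
    return a + b * np.cos(phase)

# Step 30: first images with period T1 = 4 mm; step 32: second images
# with a different period T2 = 8 mm (example values), N = 8 shifts each.
images_t1 = [fringe_image(100, 4.0, d, 8) for d in range(8)]
images_t2 = [fringe_image(100, 8.0, d, 8) for d in range(8)]
```

A 2D image of the circuit would be acquired during the projection of each of these sixteen images.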
Generally, the larger the period T, the greater the reconstruction depth, that is, the size of the height interval over which the 3D image may be determined by the method. Thereby, at least one of periods T1 and T2 is selected to obtain the desired reconstruction depth.
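This dependency can be illustrated with a small numerical sketch, under the simplified geometry described further below (vertical projection, camera lines at angle θ to the reference plane); the model and the example values are assumptions, not measured data:

```python
import math

def unambiguous_depth(period_mm, theta_deg):
    """Height interval over which the projected phase stays within one
    2*pi period: a height z shifts the fringes seen by the camera by
    z / tan(theta), so the phase wraps when z = period * tan(theta)."""
    return period_mm * math.tan(math.radians(theta_deg))

# Doubling the period doubles the reconstruction depth (theta = 45 deg).
print(unambiguous_depth(4.0, 45.0))   # ~ 4.0 mm
print(unambiguous_depth(8.0, 45.0))   # ~ 8.0 mm
```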
Step 32 may be repeated once or more than once with different periods.
At step 34, processing unit 16 determines a corrected 3D image of circuit Board.
According to an embodiment, processing unit 16 determines a first 3D image from the images acquired at step 30 and a second 3D image from the images acquired at step 32. Processing unit 16 then compares the first and second 3D images, for example, by determining, for each point of the 3D image, the difference between the height Z1 of the first 3D image and the height Z2 of the second 3D image. For the opaque portions of circuit Board, the difference between heights Z1 and Z2 is substantially null, for example, smaller than a given threshold. For the translucent portions of circuit Board, the difference between heights Z1 and Z2 is not null, for example, greater than the given threshold. Processing unit 16 thus determines the translucent portions of circuit Board. For each point of the translucent portions, processing unit 16 may determine the real height Z, for example, by extrapolation, from heights Z1 and Z2 and periods T1 and T2, considering that the relation between the height and the period is substantially linear.
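A minimal sketch of this comparison-and-extrapolation step, assuming (as the description states) that the measured height decreases roughly linearly with the projected period T, so that the true height is recovered by extrapolating to T = 0; the function name and threshold value are hypothetical:

```python
import numpy as np

def correct_heights(z1, z2, t1, t2, threshold_mm=0.05):
    """Detect translucent points where the two height maps disagree by
    more than threshold_mm, then extrapolate z(T) = z_true - c*T back
    to T = 0 for those points."""
    z1, z2 = np.asarray(z1, float), np.asarray(z2, float)
    translucent = np.abs(z1 - z2) > threshold_mm
    c = (z1 - z2) / (t2 - t1)          # per-point bias slope
    z = np.where(translucent, z1 + c * t1, z1)
    return z, translucent

# Opaque point (1.0 mm at both periods) and translucent point
# (1.8 mm at T1 = 4 mm, 1.6 mm at T2 = 8 mm -> extrapolated 2.0 mm).
z, mask = correct_heights([1.0, 1.8], [1.0, 1.6], 4.0, 8.0)
```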
Another embodiment of determination of a corrected 3D image will now be described. In this embodiment, the determination of the presence of translucent portions is carried out before the end of the method of determination of the first and second 3D images, which would normally be obtained with the first images and the second images, based on first intermediate data used for the determination of the first 3D image and on second intermediate data used for the determination of the second 3D image. As an example, the difference between the first intermediate data and the second intermediate data is determined. For the opaque portions of circuit Board, the difference between the first and second intermediate data is substantially null, for example, smaller than a given threshold. For the translucent portions of circuit Board, the difference between the first and second intermediate data is not null, for example, greater than a given threshold. Processing unit 16 thus determines the translucent portions of circuit Board. A determination of intermediate data corrected for the translucent portions is then performed and a 3D image corrected for the translucent portions is directly determined from the corrected intermediate data.
A more detailed embodiment will now be described for a specific example of a method of determining a 3D image.
Each point Qi of the scene has a corresponding point Cqi in the image plane of camera C and a corresponding point Pqi in the image plane of projector P. A reference frame RC(OC, X′, Y′, Z′) associated with camera C is considered, where OC is the optical center of camera C, direction Z′ is parallel to the optical axis of camera C, and directions X′ and Y′ are perpendicular to each other and perpendicular to direction Z′. In reference frame RC, to simplify the following description, it can approximately be considered that point Cqi has coordinates (Cui, Cvi, fC), where fC is the focal distance of camera C. A reference frame RP(OP, X″, Y″, Z″) associated with projector P is considered, where OP is the optical center of projector P, direction Z″ is parallel to the optical axis of projector P, and directions X″ and Y″ are perpendicular to each other and perpendicular to direction Z″. In reference frame RP, to simplify the following description, it can be approximately considered that point Pqi has coordinates (Pui, Pvi, fP), where fP is the focal distance of projector P.
Generally, calling PP the projection matrix of projector P and PC the projection matrix of camera C, one has the following equation system (1) for each point Qi, noted in homogeneous coordinates:
Each point Qi corresponds to the intersection of a line DC associated with camera C and of a line DP associated with projector P.
Each point Pqi of the image projected by projector P is associated with a phase φi(zi). Light intensity IC(Cqi(zi)), measured by the pixel at point Cqi of the image acquired by the camera and corresponding to point Qi, follows relation (2) hereafter:
IC(Cqi(zi)) = A(zi) + B(zi)·cos φi(zi)   (2)
where A(zi) is the light intensity of the background at point Qi of the image and B(zi) is the amplitude between the minimum and maximum intensities at point Qi of the projected image.
According to an example, projector P successively projects N different images onto the circuit, where N is an integer greater than 1, preferably greater than or equal to 4, for example, equal to 8.
A 2π/N phase-shift is applied for each new first or second image projected with respect to the previous first or second projected image. Light intensity IdC(Cqi (zi)), measured by the pixel at point Cqi for the d-th image acquired by the camera corresponding to point Qi, follows relation (3) hereafter:
where d is an integer which varies from 0 to N−1.
Vector iC(zi) is defined according to relation (4) hereafter:
This forms a linear equation system. It can be demonstrated that phase φi(zi) is given by relation (5) hereafter:
According to the previously-described embodiment where intermediate data are used for the determination of the translucent portions, phase φi(zi) may correspond to the intermediate data used.
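A minimal sketch of this phase retrieval, using the classical N-step least-squares solution of the above linear system, which is the standard form such a relation (5) takes in phase-shifting profilometry; the synthetic values are assumed for illustration:

```python
import numpy as np

def retrieve_phase(intensities):
    """Recover phi from N >= 3 samples I_d = A + B*cos(phi + 2*pi*d/N),
    via the least-squares solution of the linear system in the unknowns
    (A, B*cos(phi), B*sin(phi))."""
    i = np.asarray(intensities, float)
    n = len(i)
    delta = 2 * np.pi * np.arange(n) / n
    return np.arctan2(-np.sum(i * np.sin(delta)), np.sum(i * np.cos(delta)))

# Synthetic check: A = 0.5, B = 0.4, phi = 1.2 rad, N = 8 images.
samples = [0.5 + 0.4 * np.cos(1.2 + 2 * np.pi * d / 8) for d in range(8)]
print(retrieve_phase(samples))  # ~ 1.2
```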
A literal expression of height zi can generally be obtained.
An example of expression of height zi will be described in a specific configuration where projector P and camera C are of telecentric type and where the following conditions are fulfilled:
the optical axes of projector P and of camera C are coplanar;
the projected images are of the type shown in
lines DP are perpendicular to plane PlREF and lines DC form an angle θ with plane PlREF.
In this configuration, equation system (1) may then be simplified according to the following equation system (6):
considering that point QiREF of coordinates (xiREF, yiREF, 0) is the point of reference plane PlREF associated with point Cqi of camera C.
In the image plane of projector P, abscissa Pui of point Pqi follows, for example, relation (7) hereafter:
P
u
i
=aφ
i(zi)+b (7)
where a and b are real numbers, a being equal to p1/2π with p1 corresponding to the pitch of sinusoidal fringes 25.
Based on relations (6) and (7), the following relation (8) is obtained:
where φi(QiREF) is equal to the phase at point QiREF of reference plane PlREF, that is, to the phase in the absence of circuit Board.
According to the previously-described embodiment where the 3D images are used for the determination of the translucent portions, height zi may be used.
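As a sketch of the resulting height computation in this telecentric configuration, the phase difference relative to the reference plane yields the height; one plausible form of relation (8), consistent with relations (6) and (7), is zi = p1·tan θ·(φi(QiREF) − φi(zi))/2π. The sign convention and the example values below are assumptions:

```python
import math

def height_from_phase(phi_ref, phi, p1_mm, theta_deg):
    """Height above the reference plane from the phase difference
    between the reference plane and the measured point (telecentric
    model, sign convention assumed)."""
    return p1_mm * math.tan(math.radians(theta_deg)) * (phi_ref - phi) / (2 * math.pi)

# A phase difference of pi/2 with p1 = 4 mm and theta = 45 degrees:
print(height_from_phase(math.pi / 2, 0.0, 4.0, 45.0))  # ~ 1.0 mm
```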
Specific embodiments have been described. Various alterations and modifications will occur to those skilled in the art. In particular, although an embodiment has been described where the determination of the 3D image is performed from an algorithm using the camera and the projector, it should be clear that the 3D image determination method may be implemented by a triangulation method using at least two cameras.
Number | Date | Country | Kind
---|---|---|---
1800513 | May 2018 | FR | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/FR2019/050837 | 4/9/2019 | WO | 00