1. Technical Field
The present invention relates to bonding and more particularly to a bonding method, bonding apparatus and bonding program in which the positions of a plurality of positioning patterns disposed on a chip that is the object of bonding are respectively detected, the positions of bonding pads that are in a specified positional-relationship with the plurality of positioning patterns are calculated, and bonding is performed in the calculated bonding positions.
2. Description of the Related Art
In wire bonding between a plurality of bonding pads disposed on a chip and a plurality of bonding leads disposed on a circuit board, etc., on which the chip is mounted, the bonding of wires is accomplished by moving a bonding tool to the positions of the respective bonding pads and the positions of the respective bonding leads. As chips have become smaller and more highly integrated, the dimensions of bonding pads have become smaller and the spacing of such bonding pads has become narrower; accordingly, the accurate specification of the positions of the respective bonding pads has become necessary. For this reason, position detection of the bonding pads and of the positioning patterns used in bonding is practiced.
However, when chips are disposed so that these chips are shifted in the direction of rotation, an inclination is generated in the positioning patterns, etc., so that accurate position detection cannot be accomplished, and wire bonding cannot be performed correctly. Japanese Patent Application Laid-Open (Kokai) No. S63-56764 discloses a method for cases where such rotation is present: pattern matching between a reference-image prepared beforehand and the object-image is repeated while the reference-image is successively rotated from 0 to 360°, and the location and angle that show the best match as a result are judged to be the detected position and angle.
Furthermore, in the method of Japanese Patent No. 2864735, square regions that are to be compared are extracted from object-image signals obtained by imaging, and the image signals contained in the extracted square regions are converted into image signals in polar coordinates using the corners of the square regions as the origin; radial-direction patterns for respective specified angles are then successively compared with the radial-direction patterns at the reference angles of reference-images that were prepared beforehand and subjected to a polar coordinate conversion, and the angle of the object-image is thus calculated.
Furthermore, in Japanese Patent Application Laid-Open (Kokai) No. 2002-208010, an approach using a rotation-resistant reference point is disclosed as a means for performing high-precision position detection without performing pattern matching in the rotational direction (which tends to involve an increase in the quantity of calculations) even in cases where the object of comparison is disposed in an attitude that includes positional deviation in the rotational direction. Here, according to Japanese Patent Application Laid-Open (Kokai) No. 2002-208010, the term “rotation-resistant reference point” refers to a point which is such that the error in the position of the object of comparison that is detected in pattern matching of the reference-image and an image of the object of comparison that is obtained by imaging the object of comparison disposed in an attitude that includes positional deviation in the direction of rotation shows a minimum value.
Furthermore, in Japanese Patent Application Laid-Open (Kokai) No. 2002-208010, it is indicated that normalized correlation calculations can be used as one method of pattern matching. Moreover, the following embodiment is indicated as a method for calculating the rotation-resistant reference point.
In the first embodiment, the rotation-resistant reference point is calculated as follows. Specifically, with one corner of the reference-image taken as the center, a rotated image that is rotated +Q° is produced, and the coordinates (X1, Y1) of the point showing the best match as a result of pattern matching between this rotated image and the reference-image are determined. Similarly, a rotated image that is rotated −Q° is produced, and the coordinates (X2, Y2) of the point showing the best match as a result of pattern matching between this rotated image and the reference-image are determined. The coordinates (AX1, AY1) of the rotation-resistant reference point are expressed by the following Equations (1) through (4) using the coordinates (X1, Y1), (X2, Y2) of these two points, the angle Q° and the coordinates (XC1, YC1) of the corner point taken as the center of rotation.
AX1=XC1+r·cos α (1)
AY1=YC1+r·sin α (2)
Here, α=tan⁻¹{(X2−X1)/(Y1−Y2)} (3)
r=√{(X2−X1)²+(Y1−Y2)²}/(2 sin Q) (4)
The rotation-resistant reference point determined by this method is the center of the object in cases where the pattern used is the shape of the object. For example, in the case of a circle, the center point of the circle is the rotation-resistant reference point, and in the case of a square, the center point of the square is the rotation-resistant reference point.
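As a numerical sketch of Equations (1) through (4) (the Python function and variable names below are illustrative and are not part of the cited reference), the rotation-resistant reference point can be computed from the two best-match points as follows:

```python
import math

def rotation_resistant_point(x1, y1, x2, y2, xc1, yc1, q_deg):
    """Equations (1)-(4): locate the rotation-resistant reference point
    from the best-match points (X1, Y1) and (X2, Y2) found by pattern
    matching after rotating the reference-image by +Q and -Q degrees
    about the corner point (XC1, YC1)."""
    q = math.radians(q_deg)
    # Equation (3): direction from the rotation center to the fixed point
    alpha = math.atan2(x2 - x1, y1 - y2)
    # Equation (4): distance from the rotation center to the fixed point
    r = math.hypot(x2 - x1, y1 - y2) / (2.0 * math.sin(q))
    # Equations (1) and (2)
    return xc1 + r * math.cos(alpha), yc1 + r * math.sin(alpha)
```

For a circular pattern whose center lies at distance d from the corner, this recovers the circle's center, consistent with the statement above that the rotation-resistant reference point coincides with the center of the object's shape.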
The second embodiment is a simpler method for calculating the rotation-resistant reference point. Specifically, a plurality of rotational center points are set within the reference-image. Then, the reference-image is rotated +Q° about each rotational center point. The amounts of matching between the respective rotated images thus obtained and the reference-image are respectively calculated. Then, a rotational center point with a relatively large amount of matching (among the plurality of rotational center points) is taken as the rotation-resistant reference point. In this case, a rotational center point that is set in the vicinity of the center of the pattern used is taken as the rotation-resistant reference point.
It is indicated that the coordinates of points used in bonding can be determined with high precision, without any need to perform pattern matching in the rotational direction, by thus calculating the coordinates of the rotation-resistant reference point, and taking this point as a bonding alignment point, i.e., a bonding positioning point.
In regard to the position detection of positioning patterns, etc., new techniques such as those described below have been developed as the demand for increased wire bonding speed has grown; along with these new techniques, new problems have arisen.
For example, as the scale of LSI has increased, the number of bonding pads has increased, and detection of the individual positions of all of these pads takes time. Accordingly, a method is practiced in which only the positions of positioning patterns, at least two of which are spaced as widely as possible on the surface of the chip, are detected, the positions of the other bonding pads are obtained by calculation based upon these positions, and the calculated positions are taken as the bonding target positions.
Furthermore, in cases where wire bonding is performed for numerous types of chips, storing in memory and reading out the disposition of the positioning patterns and bonding pads on the chip for each type involved is also bothersome. Accordingly, the following method is also practiced: when there is a change in the chip type, a reference chip that acts as a reference for the type of chip involved is prepared, and the disposition relationship of the positioning patterns and bonding pads is stored in memory as training for this reference chip; next, in the running state, the positioning patterns of the chip that is the object of bonding are imaged, and position detection is accomplished by pattern matching with the acquired image of the reference chip.
When an even greater increase in speed is required, the pattern recognition of a plurality of positioning patterns within the same visual field requires a considerable processing time; accordingly, a method is performed in which only the areas in the vicinity of the respective positioning patterns are imaged, and position detection is accomplished by performing pattern matching based upon these images.
Thus, as the speed progressively increases, the question of how to discriminate a plurality of positioning patterns and detect the positions of these patterns in a short time becomes an important performance factor in wire bonding apparatuses.
In this case, it has been demonstrated that if there is a deviation of the chip in the rotational direction, i.e., an inclination, this hinders an increase in the speed of position detection. The conditions of this problem will be described with reference to
The training in regard to the reference chip 200 may be described as follows. A first reference-image 204 that includes the first positioning pattern 202 disposed on the upper left corner is acquired, and a second reference-image 214 that includes the second positioning pattern 212 disposed on the lower right corner is similarly acquired. These imaging-positions are detected by the wire bonding apparatus, and are respectively stored in memory as the center position 206 of the first reference-image and the center position 216 of the second reference-image. These center positions are treated as the position of the first positioning pattern 202 and the position of the second positioning pattern 212, and the respective positions of numerous bonding pads (not shown in the drawings) are calculated based upon these positions.
When training is completed, detection of the positioning patterns on the bonding object chip 230 is performed as a running step. First, the bonding object chip 230 is imaged in the position where the first positioning pattern 202 of the reference chip 200 was imaged. As shown in
Next, the camera must be moved in order to image the second positioning pattern of the bonding object chip 230. Besides the information that was acquired in training, i.e., the position 206 of the first positioning pattern 202 and position 216 of the second positioning pattern 212 of the reference chip 200, and the positions of the respective bonding pads calculated based upon these positions, the information that is obtained in this case is only the position 236 of the first positioning pattern of the bonding object chip 230. Accordingly, as shown in
In the case of
Thus, in cases where there is a deviation of the chip in the direction of rotation, i.e., an inclination, this hinders the increase in the speed of movement to the imaging-position of the next positioning pattern, so that the productivity of the wire bonding apparatus cannot be improved. On the other hand, in the conventional techniques described in the above-described Japanese Patent Application Laid-Open (Kokai) No. S63-56764, Japanese Patent No. 2864735, and Japanese Patent Application Laid-Open (Kokai) No. 2002-208010, inclination-angle detection and position detection are performed with the acquired image as an object. Accordingly, there is no description of the capturing of the positioning pattern in the next imaging-position in any of these techniques.
Furthermore, in Japanese Patent Application Laid-Open (Kokai) No. S63-56764 and Japanese Patent No. 2864735, a detection of the inclination-angle is described; however, there is no description of the relationship between this inclination-angle and the capturing of the positioning pattern in the next imaging-position. In the method in which angle detection is performed using a polar coordinate conversion in Japanese Patent No. 2864735, the precision of inclination-angle detection is greatly influenced by the manner in which the origin is set. For example, in cases where the positioning patterns are circular, polar coordinate development can be performed with good reproducibility if the center of each circle is taken as the origin for polar coordinate development. However, there is no angular dependence of the development pattern, so that angle detection is in fact impossible. In cases where the positioning patterns have an asymmetrical shape, the conditions of the development pattern differ according to the location of the origin of polar coordinate development; as a result, the precision of angle detection is affected. Japanese Patent No. 2864735 discloses a method in which respective polar coordinate conversions are performed for the four corners of square regions, and the inclination-angle is determined based upon these conversions. In this case, however, the processing time is long.
Furthermore, in the method of Japanese Patent Application Laid-Open (Kokai) No. 2002-208010, a rotation-resistant reference point is calculated, and the use of this point as a bonding positioning point makes it possible to determine positions with a high degree of precision. However, the inclination-angles of the positioning patterns cannot be determined.
Thus, in the prior art, in cases where the positioning patterns have an inclination-angle, problems remain in terms of how to move quickly to the next imaging-position.
The object of the present invention is to solve such problems encountered in the prior art, and to provide a bonding method, bonding apparatus and bonding program that make it possible to move to the next imaging-position at a higher speed even in cases where the positioning patterns have an inclination-angle.
The present invention was devised by focusing on a means for clarifying the relationship between the inclination-angle of one positioning pattern and the imaging-position of the next positioning pattern, and on the fact that the positioning processing time can be shortened as the precision of this inclination-angle is improved; it is based on the results of an investigation of the question of where to locate the origin so as to improve the precision of inclination-angle detection by way of a polar coordinate conversion. First, the means for clarifying the relationship between the inclination-angle and the next imaging-position will be described; then, the principle involved in the improvement of the precision of the inclination-angle will be described.
The above object is accomplished by unique steps of the present invention for a bonding method that detects each one of positions of a plurality of positioning patterns disposed on a chip that is the object of bonding, calculates positions of bonding pads that are in a specified positional-relationship with the plurality of positioning patterns, and performs bonding in calculated positions of the bonding pads; and in the present invention, the bonding method includes:
In the above bonding method of the present invention, it is preferable that
Furthermore, in the above-described bonding method of the present invention, it is preferable that:
In addition, it is preferable that the above-described second object-image acquisition step image the second positioning pattern in an imaging range that is narrower than the imaging range used in the first object-image acquisition step.
The above-described object is also accomplished by a unique structure of the present invention for a bonding apparatus that detects each one of positions of a plurality of positioning patterns disposed on a chip that is the object of bonding, calculates positions of bonding pads that are in a specified positional-relationship with the plurality of positioning patterns, and performs bonding in calculated positions of the bonding pads; and in the present invention, the bonding apparatus includes:
In the above structure of the bonding apparatus of the present invention, it is preferable that
Furthermore, the above-described object is accomplished by unique processes of the present invention for a bonding program that detects each one of positions of a plurality of positioning patterns disposed on a chip that is the object of bonding, calculates positions of bonding pads that are in a specified positional-relationship with the plurality of positioning patterns, and performs bonding in calculated positions of the bonding pads; and in the present invention, the bonding program includes:
In the above bonding program of the present invention, it is preferable that
As seen from the above, in the present invention, the inclination-angle of the first positioning pattern of the bonding object chip is calculated, and the imaging range of the next second positioning pattern is calculated based upon this inclination-angle. Accordingly, even in cases where the positioning patterns have an inclination-angle, more rapid movement of the camera to the next imaging-position is possible, and the speed of bonding increases even further.
Furthermore, in the detection of the inclination-angle, a point which is such that the error in the position of the object of comparison that is detected by pattern matching between the reference-image and a comparison object-image obtained by imaging an object of comparison that is disposed in an attitude that includes positional deviation in the direction of rotation shows a minimum value is used as the origin of the polar coordinate conversion; as a result, the inclination of the bonding positioning pattern can be detected more quickly and with greater precision. Accordingly, even in cases where the positioning patterns have an inclination-angle, the camera can be moved even more quickly to the next imaging-position, and the speed of bonding can be increased even further.
Furthermore, since the inclination of the bonding positioning pattern can be detected with greater precision, the next imaging range can be made narrower. Accordingly, the processing time can be reduced even further.
1. Relationship Between Inclination-Angle and Imaging-Position
Here, it is assumed that the length L of a line segment connecting the position coordinates (X1, Y1) of the first positioning pattern 202 and the position coordinates (X2, Y2) of the position 216 of the second positioning pattern 212 in the reference chip 200, and the angle θ2 of this line segment with respect to the X axis, are already known; furthermore, it is assumed that the inclination-angle Δθ of the first positioning pattern 232 of the bonding object chip 230 is determined using a conventional technique.
Accordingly, if the angle with respect to the X axis of a line segment connecting the coordinates (X3, Y3) of the position 236 of the first positioning pattern 232 and the coordinates (X4, Y4) of the position 256 of the second positioning pattern 252 in the bonding object chip 230 is designated as θ1, the following relationship equations hold true:
θ1=θ2+Δθ (5)
θ2=arctan{(Y2−Y1)/(X2−X1)} (6)
L=√{(X2−X1)²+(Y2−Y1)²} (7)
X4=X3+L·cos θ1 (8)
Y4=Y3+L·sin θ1 (9)
Thus, by detecting the inclination-angle Δθ of the first positioning pattern 232 of the bonding object chip 230, and performing the calculations of Equations (5) through (9) based upon this detected angle, it is possible to obtain (X4, Y4), which are the center position coordinates of the next imaging range 250. By moving the camera to this position at a high speed, it is possible to capture the next positioning pattern 252 in the center of the imaging range of the camera even if the first positioning pattern 232 of the bonding object chip 230 is inclined.
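The calculation of Equations (5) through (9) can be sketched as follows (an illustrative Python function; the name and argument order are chosen for this sketch only):

```python
import math

def next_imaging_position(x1, y1, x2, y2, x3, y3, dtheta_deg):
    """Equations (5)-(9): from the reference-chip positions (X1, Y1) and
    (X2, Y2), the detected first-pattern position (X3, Y3) of the bonding
    object chip, and the detected inclination-angle dtheta, compute the
    center (X4, Y4) of the next imaging range."""
    theta2 = math.atan2(y2 - y1, x2 - x1)        # Equation (6)
    length = math.hypot(x2 - x1, y2 - y1)        # Equation (7)
    theta1 = theta2 + math.radians(dtheta_deg)   # Equation (5)
    return (x3 + length * math.cos(theta1),      # Equation (8)
            y3 + length * math.sin(theta1))      # Equation (9)
```

Note that `atan2` is used in place of the plain arctangent of Equation (6) so that the full range of line-segment directions is handled without a quadrant ambiguity.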
2. Principle of Improvement of Precision of Inclination-Angle
The next positioning pattern 252 can be captured in the center of the imaging range of the camera by way of using the above-described relationship between the inclination-angle and the imaging-position; in this case, this positioning pattern can be captured further toward the center of the imaging range as the detection precision of the inclination-angle is improved. Specifically, even if the imaging range is narrowed, the next positioning pattern 252 can be captured in this range, and the processing time required for positioning can be shortened even further. Accordingly, the improvement of the precision of the inclination-angle will be described next.
The present invention is based on the results of an investigation of the question of where to locate the origin in the detection of angles by way of using a polar coordinate conversion in order to achieve greater precision. In the prior art, as described above, the precision is influenced by the manner in which the origin is established in performing a polar coordinate conversion. Accordingly, when a point showing relative immunity to the effects of rotation was investigated, the rotation-resistant reference point of Japanese Patent Application Laid-Open (Kokai) No. 2002-208010 attracted attention. The rotation-resistant reference point in the same reference is not used to determine angles. However, as described above, this point is “a point which is such that the error in the position of the object of comparison that is detected in pattern matching of the reference-image and an image of the object of comparison that is obtained by imaging the object of comparison disposed in an attitude that includes positional deviation in the direction of rotation shows a minimum value”.
Accordingly, it was thought that this point could be viewed as a point that is relatively unaffected even if there is rotation, and that it might be possible to perform a stable polar coordinate conversion even in the case of asymmetrical patterns by using this point as the origin of the polar coordinate conversion. As a result of this investigation, it was found that inclination-angles can be detected with good precision, while reducing the quantity of calculations required, by taking this rotation-resistant reference point as the origin of the polar coordinate conversion. This will be described in detail below with reference to the accompanying drawings.
First, the fact that it is difficult to determine inclination-angles with sufficient precision even if a positional reference point determined by ordinary pattern matching is used as the origin of a polar coordinate conversion will be illustrated using
As shown in
The conditions of the polar coordinate conversion of the positioning patterns P0 and P2 are shown in
The relative inclination-angle between the positioning patterns P0 and P2 is determined from a comparison of the positioning patterns P6 and P8 following the polar coordinate conversion. Specifically, both positioning patterns P6 and P8 are caused to move in relative terms along the angular axis, and this movement is stopped in the position where the positioning patterns show the greatest degree of overlapping; the inclination-angle can be determined based upon the movement angle. However, the positioning pattern P6 shown in
Thus, the pattern obtained by a polar coordinate conversion varies greatly according to the placement of the origin of the polar coordinate conversion, and the precision of angle detection is influenced by this. Furthermore, as described above, in cases where the positioning pattern that is the object of comparison is disposed at an inclination, the inclination-angle cannot be determined with satisfactory precision even if the position determined by ordinary pattern matching is used “as is” as the origin of the polar coordinate conversion.
Next, a case in which a rotation-resistant reference point is taken as the origin of the polar coordinate conversion will be described. In order to facilitate comparison, the positioning patterns used are the same as those shown in
Furthermore, polar coordinate conversion for the object of comparison is performed over 360°. However, as seen from
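The angle detection described above can be sketched as follows (an illustrative Python implementation; the sampling resolutions and function names are chosen for this sketch, and the origin passed to `polar_profile` would be the rotation-resistant reference point when the method described here is used):

```python
import math

def polar_profile(img, oy, ox, n_angles=72, n_radii=10):
    """Develop a 2-D image (list of lists) into an (angle, radius)
    profile around the origin (oy, ox) by sampling along rays."""
    prof = []
    for a in range(n_angles):
        ang = 2 * math.pi * a / n_angles
        row = []
        for r in range(1, n_radii + 1):
            y = round(oy + r * math.sin(ang))
            x = round(ox + r * math.cos(ang))
            inside = 0 <= y < len(img) and 0 <= x < len(img[0])
            row.append(img[y][x] if inside else 0)
        prof.append(row)
    return prof

def inclination_angle(ref_prof, obj_prof):
    """Slide the object profile along the angular axis relative to the
    reference profile and return the shift (in degrees) that gives the
    greatest overlap; this shift is the detected inclination-angle."""
    n = len(ref_prof)
    def score(shift):
        return sum(p * q for i in range(n)
                   for p, q in zip(ref_prof[i], obj_prof[(i + shift) % n]))
    best = max(range(n), key=score)
    return 360.0 * best / n
```

Because the comparison is a one-dimensional shift along the angular axis rather than repeated two-dimensional pattern matching at every candidate angle, the quantity of calculations is reduced, as stated above.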
As described above, it was found that the inclination-angle can be detected with good precision while reducing the amount of calculation required by taking a rotation-resistant reference point as the origin of the polar coordinate conversion. The bonding pattern discrimination of the present invention was devised based upon this result.
The present invention will be described in detail below with reference to the accompanying drawings. Below, the present invention will be described using a wire bonding apparatus that bonds wires to common semiconductor chips. However, the wire bonding apparatus used may also be a wire bonding apparatus that is used for stacked ICs in which chips have other chips stacked on top. In the following description, furthermore, dedicated positioning patterns disposed on the surfaces of the chips are used as the positioning patterns utilized in bonding. However, bonding pads disposed on the diagonal lines of the chips may also be used as positioning patterns.
The wire bonding apparatus 100 comprises an apparatus main body 102 and a control section 120. The apparatus main body includes a bonding head 104, a table 106 which moves the bonding head 104 within the XY plane shown in
The control section 120 has the function of controlling the overall operation of the elements that constitute the apparatus main body 102. In particular, this control section 120 has the functions of calculating the positions of the positioning patterns, and performing wire bonding based upon the results of this calculation. Such a control section 120 can be constructed from a general computer, or a computer that is especially meant for use in a bonding apparatus.
The control section 120 includes a CPU 122, an input means 124 such as a keyboard or input switch, etc., an output means 126 such as a display, etc., a memory 128 which stores image data, etc., and the above-described bonding head I/F 130, camera I/F 132 and table I/F 134; and these elements are connected to each other by an internal bus.
The CPU 122 includes a positioning pattern position calculating section 136 which has the function of performing processing that calculates the positions of the positioning patterns, and a bonding processing section 138 which has the function of setting the wire bonding conditions and performing wire bonding processing based upon the calculated positions of the positioning patterns. Software can be used to perform such processing; specified processing can be performed by executing corresponding bonding pattern discrimination programs and bonding programs. Furthermore, some of the processing may also be performed by hardware.
Details of the functions from the first reference-image acquisition module 140 to the second positional-relationship calculating module 154 of the positioning pattern position calculating section 136, and the function of the bonding processing section 138, will be described with reference to the flow chart shown in
First, the reference chip 200 is set (S100). More specifically, a chip used as a reference for the discrimination of the inclination of positioning pattern is set as the reference chip 200, and this chip is held on the stage 108. Next, the camera 112 is moved, and is brought to a position in which the imaging field can capture the first positioning pattern 202 of the reference chip 200 (S102).
Then, an image including the first positioning pattern 202 is acquired, and this image is stored in memory as the first reference-image (S104). More specifically, the first reference-image acquisition module 140 sends instructions to the camera 112 via the camera I/F 132, the first positioning pattern 202 of the reference chip 200 is imaged, and this data is stored in the memory 128. The acquired image corresponds to the first reference-image 204 shown in
Next, the camera 112 is moved, and is brought to a position where the imaging visual field can capture the second positioning pattern 212 of the reference chip 200 (S106).
Then, an image including the second positioning pattern 212 is acquired, and this image is stored in memory as the second reference-image (S108). More specifically, the second reference-image acquisition module 148 sends instructions to the camera 112 via the camera I/F 132, the second positioning pattern 212 of the reference chip 200 is imaged, and this data is stored in the memory 128. The acquired image corresponds to the second reference-image 214 in
The steps up to this point are a training process that uses the reference chip 200; and running steps that use the bonding object chip 230 are performed next.
In the running steps, the bonding object chip 230 is first set (S110). Specifically, the reference chip 200 is removed from the stage 108, and the chip 230 that is the object of bonding work is set on the stage 108. Then, the camera 112 is moved (S112), imaging is performed in the same visual field position as that in which the first reference-image 204 was imaged, and this image is stored in the memory 128 as the first object-image (S114). More specifically, the first object-image acquisition module 142 sends instructions to the camera 112 via the camera I/F 132, the first positioning pattern 232 of the bonding object chip 230 is imaged, and this data is stored in the memory 128. The acquired first object-image corresponds to an image of the imaging field 220 including the bonding object chip 230 in
Next, pattern matching is performed between the first reference-image 204 and first object-image, and a first positional-relationship which is the relative positional-relationship between the first positioning pattern 202 of the reference chip 200 and the first positioning pattern 232 of the bonding object chip 230 is calculated (S116). More specifically, the first positional-relationship calculating module 144 reads out the first reference-image 204 and first object-image from the memory 128, disposes the first object-image and first reference-image with the origins of the imaging visual fields aligned, and moves both images parallel to each other so that the overlapping of the first positioning pattern of the first reference-image and the first positioning pattern of the first object-image shows a maximum value. For example, normalized correlation calculations can be used as a pattern matching method. As a result of this pattern matching, the center position of the first reference-image moves from the original center position 206 to the center position 236; this moved position is determined. This moved position indicates the relative position of the first positioning pattern 232 of the first object-image with reference to the center position 206 of the first reference-image 204. The position of the first positioning pattern 232 of the first object-image obtained as a result of pattern matching corresponds to the coordinates (X3, Y3) shown in
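A normalized correlation search of the kind mentioned for step S116 can be sketched as follows (an illustrative, unoptimized Python implementation; real bonding apparatuses would use accelerated image-processing routines):

```python
import math

def normalized_correlation(template, window):
    """Normalized correlation coefficient between a template and an
    equally sized image window (both 2-D lists of numbers)."""
    t = [v for row in template for v in row]
    w = [v for row in window for v in row]
    mt, mw = sum(t) / len(t), sum(w) / len(w)
    num = sum((a - mt) * (b - mw) for a, b in zip(t, w))
    den = math.sqrt(sum((a - mt) ** 2 for a in t) *
                    sum((b - mw) ** 2 for b in w))
    return num / den if den else 0.0

def match_position(image, template):
    """Slide the template over the image and return the (row, col)
    offset of the window with the highest normalized correlation;
    this offset gives the relative positional-relationship."""
    th, tw = len(template), len(template[0])
    best, best_score = (0, 0), -2.0
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            window = [row[c:c + tw] for row in image[r:r + th]]
            s = normalized_correlation(template, window)
            if s > best_score:
                best, best_score = (r, c), s
    return best
```

The normalization by mean and variance makes the score insensitive to uniform brightness and contrast changes between the reference-image and the object-image.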
Next, using the first positioning pattern 202 of the first reference-image 204 as a reference, the inclination-angle of the first positioning pattern 232 of the first object-image is calculated (S118). The calculation of this inclination-angle can be accomplished using a conventional technique. For example,
The next imaging-position is calculated based upon the calculated inclination-angle Δθ and the coordinates (X1, Y1), (X2, Y2) and (X3, Y3) (S120). More specifically, the imaging-position calculating module 150 performs the calculations of the above-described Equations (5) through (9) using the above-described data, thus calculating the coordinates (X4, Y4), and these coordinates are taken as the center position of the next imaging-position. The calculation results correspond to the imaging range 250 shown in
If the respective coordinates and inclination-angles Δθ are calculated without error, the center position of the imaging range 250 in
Next, pattern matching is performed between the second reference-image 214 and second object-image, so that a second positional-relationship which is the relative positional-relationship between the second positioning pattern 212 of the reference chip 200 and the second positioning pattern 252 of the bonding object chip 230 is calculated (S126). More specifically, the second positional-relationship calculating module 154 reads out the second reference-image 214 and second object-image from the memory 128, and performs pattern matching by the same method as that described for the first positional-relationship calculation step (S116). The position of the second positioning pattern 252 of the second object-image that is obtained as a result of pattern matching corresponds to the point 256 in
Thus, by detecting the inclination-angle Δθ and calculating Equations (5) through (9), it is possible to move the camera at a high speed to the next imaging-position, and to obtain the position of the second positioning pattern for the bonding object chip 230. Once the positions of the first positioning pattern and second positioning pattern have thus been determined in the running steps, the processing required for wire bonding is performed by the function of the bonding processing section 138. For example, the positions of the respective bonding pads of the bonding object chip are calculated based upon the positions of the first positioning pattern and second positioning pattern (S128). Then, an instruction to move the tool 110 to the corrected bonding pad positions is sent to the table 106 via the table I/F 134, and when the tool 110 is moved to these positions, instructions are sent to the bonding head 104 via the bonding head I/F 130 so that the operations of the tool that are required for wire bonding are performed, thus resulting in the performance of wire bonding (S130).
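The text does not spell out how the bonding pad positions are derived from the two positioning-pattern positions in step S128. A common approach, shown here as an assumption and not as the patent's specific method, is to apply to the taught pad coordinates the translation and rotation that map the reference fiducial pair onto the measured pair:

```python
import math

def pad_positions(ref_p1, ref_p2, obj_p1, obj_p2, ref_pads):
    """Map taught pad coordinates onto the actual chip using the two
    measured positioning-pattern positions (translation + rotation)."""
    # Rotation between the reference fiducial axis and the measured one.
    a_ref = math.atan2(ref_p2[1] - ref_p1[1], ref_p2[0] - ref_p1[0])
    a_obj = math.atan2(obj_p2[1] - obj_p1[1], obj_p2[0] - obj_p1[0])
    d = a_obj - a_ref
    cos_d, sin_d = math.cos(d), math.sin(d)
    out = []
    for (px, py) in ref_pads:
        # Rotate each pad's offset from the first fiducial, then translate.
        dx, dy = px - ref_p1[0], py - ref_p1[1]
        out.append((obj_p1[0] + dx * cos_d - dy * sin_d,
                    obj_p1[1] + dx * sin_d + dy * cos_d))
    return out
```

Scale could also be corrected from the ratio of fiducial distances; the sketch assumes the chip dimensions are unchanged.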
An embodiment in which a polar coordinate conversion is used as the method of detecting the inclination-angle of the positioning patterns will be described with reference to the flow chart shown in
Furthermore, since it is convenient to use the description in
First, the reference chip 200 is set (S10). More specifically, a chip that acts as a reference for the discrimination of the inclination of the positioning patterns is used as a reference chip 200, and this chip is held on the stage 108. Next, the camera 112 is moved, and is brought to a position which is such that the imaging visual field can capture the positioning pattern P0 of the reference chip 200 (S12).
Then, an image including the positioning pattern P0 is acquired, and this image is stored in memory as a reference-image (S14). More specifically, the CPU 122 sends instructions to the camera 112 via the camera I/F 132 so that the positioning pattern P0 of the reference chip 200 is imaged, and this data is stored in the memory 128. The acquired image corresponds to the reference-image 10 in
Next, the reference conversion origin which is the origin used for the polar coordinate conversion is specified for this reference-image 10 (S16). More specifically, the CPU 122 calculates the rotation-resistant reference point described in the “Principle of Improvement of Precision of Inclination-angle” of the present invention based upon the data of the reference-image 10 stored in the memory 128, and specifies these coordinates as the reference conversion origin. Furthermore, as described above, the coordinates of the reference conversion origin are specified using the center position 20 as a reference. The more detailed content of this process will be described later in Embodiment 3 and Embodiment 4. The specified reference conversion origin corresponds to the origin 26 in
The reference-image 10 is subjected to a polar coordinate conversion using the specified reference conversion origin (S18), and the resulting image is stored in the memory 128 as a post-conversion reference-image. More specifically, the CPU 122 reads out the reference-image 10 from the memory 128, determines the coordinates of the specified reference conversion origin using the center position 20 as a reference, and takes this origin as the origin of the polar coordinate conversion. Then, for example, calculations are performed in which the image is varied by an angle of θ in the clockwise direction, and the brightness data of the reference-image are converted as a function of the radius r for each angle θ. Accordingly, in the reference-image following this conversion, the brightness data are disposed with the horizontal axis taken as the angle θ, and the vertical axis taken as the radius r. Such a post-conversion reference-image corresponds to the image shown in
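The polar coordinate conversion of step S18 can be sketched as follows, assuming a grayscale NumPy array and nearest-neighbour sampling; the resolution parameters `n_theta` and `n_r` are illustrative choices, not values from the text.

```python
import numpy as np

def polar_convert(image, origin, r_max, n_theta=360, n_r=64):
    """Resample image brightness into (theta, r) form: horizontal axis is
    the angle theta, vertical axis is the radius r, as in the
    post-conversion images described in the text."""
    ox, oy = origin
    h, w = image.shape
    out = np.zeros((n_r, n_theta))
    for ti in range(n_theta):
        t = 2.0 * np.pi * ti / n_theta
        for ri in range(n_r):
            r = r_max * ri / n_r
            # Image y grows downward, so increasing t sweeps clockwise.
            x = int(round(ox + r * np.cos(t)))
            y = int(round(oy + r * np.sin(t)))
            if 0 <= x < w and 0 <= y < h:
                out[ri, ti] = image[y, x]
    return out
```

A rotation of the chip about the conversion origin appears in this representation as a pure shift along the horizontal (theta) axis, which is what makes the subsequent one-dimensional matching possible.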
The process up to this point is a training process that uses the reference chip 200; next, a running process that uses the bonding object chip 230 is performed.
In the running process, the bonding object chip 230 is first set (S20). Specifically, the reference chip 200 is removed from the stage 108, and the chip 230 that is the object of bonding work is set on the stage 108. Then, imaging is performed in the same visual field position as that in which the reference-image 10 was acquired, and the resulting image is stored in the memory 128 as an object-image (S22). More specifically, the CPU 122 sends instructions to the camera 112 via the camera I/F 132 so that the positioning pattern P2 of the bonding object chip 230 is imaged, and this data is stored in the memory 128. The acquired object-image corresponds to
Next, pattern matching is performed between the reference-image 10 and the object-image, and the relative positional-relationship between the positioning pattern P0 of the reference chip 200 and the positioning pattern P2 of the bonding object chip is calculated (S24). More specifically, the CPU 122 reads out the reference-image 10 and object-image from the memory 128, disposes the object-image and reference-image with the origins of the imaging visual fields aligned, and moves both images parallel to each other so that the overlapping of the positioning pattern of the reference-image and the positioning pattern of the object-image shows a maximum value. For example, normalized correlation calculations can be used as the pattern matching method. As a result of this pattern matching, the center position of the reference-image moves from the original center position 20 to the center position 24; the amount of this movement (ΔX, ΔY) is determined. This movement amount (ΔX, ΔY) indicates the relative position of the positioning pattern of the object-image with reference to the center position 20 of the reference-image 10. The conditions of this pattern matching correspond to
Next, the object conversion origin which is the origin that is used in order to subject the object-image to a polar coordinate conversion is specified (S26). More specifically, the CPU 122 performs the following calculations. Specifically, if the coordinates of the reference conversion origin 26 with reference to the center position 20 are designated as (X26, Y26), and the coordinates of the object conversion origin are designated as (X28, Y28), then the coordinates of this origin are (X28=X26+ΔX, Y28=Y26+ΔY). The object conversion origin corresponds to the origin 28 in
The object-image is subjected to a polar coordinate conversion using the object conversion origin thus specified (S28), and the resulting image is stored in the memory as a post-conversion object-image. More specifically, the CPU 122 reads out the object-image from the memory 128, determines the coordinates of the specified object conversion origin, takes these coordinates as the origin of the polar coordinate conversion, and performs calculations in which, for example, the image is varied by an angle of θ in the clockwise direction, and the brightness data of the object-image are converted as a function of the radius r for each angle θ. Such a post-conversion object-image corresponds to the image shown in
Pattern matching is performed for the post-conversion reference-image and post-conversion object-image thus determined, and the inclination-angle is calculated (S30). More specifically, the CPU 122 reads out the post-conversion reference-image and post-conversion object-image from the memory 128, disposes both images with the origins of the angular axes aligned, moves both images parallel to each other along the angular axis, and determines the movement amount Δθ which is such that the overlapping of the positioning pattern of the post-conversion reference-image and the positioning pattern of the post-conversion object-image shows a maximum value. This movement amount Δθ indicates the relative inclination-angle of the positioning pattern P2 of the bonding object chip 230 with reference to the positioning pattern P0 of the reference chip 200. The conditions of the determination of the inclination-angle Δθ are shown in
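The angular-axis matching of step S30 reduces to a one-dimensional cyclic shift search over the post-conversion images. The sketch below uses a sum-of-products overlap as the matching score (an assumption; normalized correlation could equally be used, as elsewhere in the text):

```python
import numpy as np

def inclination_angle(polar_ref, polar_obj):
    """Shift the post-conversion object image cyclically along the angular
    axis and return the shift (in angle steps) giving the best overlap
    with the post-conversion reference image.  This shift is the
    inclination-angle delta-theta."""
    n_theta = polar_ref.shape[1]
    best_shift, best_score = 0, -np.inf
    for s in range(n_theta):
        # A rotation of the chip appears as a pure translation on the
        # theta axis, so only a 1-D search is needed.
        score = float((polar_ref * np.roll(polar_obj, -s, axis=1)).sum())
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift
```

The returned shift, multiplied by the angular step size, gives Δθ in degrees.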
Thus, the inclination-angles of the positioning patterns can be determined with good precision by calculating the rotation-resistant reference point described in the “Principle of Improvement of Precision of Inclination-angle” of the present invention, specifying the coordinates of this point as the reference conversion origin, and performing a polar coordinate conversion using this origin.
Accordingly, the imaging-position in which the next positioning pattern is imaged can be determined with a higher degree of precision, so that the next positioning pattern can be captured more securely even if the imaging range is narrowed. An example of this is indicated by the narrower imaging range 270 shown in
The more detailed content of the reference conversion origin specifying step (S16) will be described. Embodiment 3 corresponds to an embodiment in which the first embodiment in the above-described Japanese Patent Application Laid-Open (Kokai) No. 2002-208010 is applied to the bonding positioning patterns. The detailed content of this reference conversion origin specifying step will be described with reference to the internal flow chart shown in
The reference-image 10 acquired by the steps S10 through S14 illustrated in
Using this reference-image 10, a rotated image obtained by rotating the image +Q° about one corner of the reference-image 10 is produced (S40).
Next, the point of best matching is determined by pattern matching between the reference-image 10 and the rotated image 40 (S42). More specifically, the reference-image 10 is moved parallel to the rotated image 40 so that maximum overlapping is obtained between the positioning pattern of the reference-image 10 and the positioning pattern of the rotated image 40. The center position in the reference-image 50 in the case of maximum overlapping is the point of best matching 42 of both images. The conditions of this matching are shown in
Similarly, as shown in
The coordinates of the reference conversion origin are calculated using the coordinates (X1, Y1) of the point of best matching 42 thus determined, the coordinates (X2, Y2) of the point of best matching 72, the rotational angle Q°, and the coordinates (XC1, YC1) of the corner point 30 taken as the center of rotation (S52). The coordinates (AX1, AY1) of the reference conversion origin are expressed by Equations (1) through (4) as described above.

AX1 = XC1 + r·cos α    (1)

AY1 = YC1 + r·sin α    (2)

Here, α = tan⁻¹{(X2 − X1)/(Y1 − Y2)}    (3)

r = √{(X2 − X1)² + (Y1 − Y2)²}/(2·sin Q)    (4)
Equation (3) can be explained utilizing the fact that the angle in
Equation (4) can be explained utilizing the fact that the distance between the tip ends of mutually equal line segments of length r on both sides of the angle Q can be approximated by r·sin Q in cases where the rotational angle Q is very small. Specifically, the length of the line segment (point A1 − point Am1) = r·sin Q = the length of (point 20 − point 42) = √{(X1)² + (Y1)²}; accordingly, r = √{(X1)² + (Y1)²}/sin Q is obtained from this. Equation (4) rewrites this relationship using (X1, Y1) and (X2, Y2).
When r and α are thus determined from the coordinates (X1, Y1) and (X2, Y2) and the rotational angle Q, the coordinates (AX1, AY1) of the reference conversion origin in the reference-image 10 can be calculated by Equations (1) and (2) using the coordinates (XC1, YC1) of the point 30.
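Equations (1) through (4) can be written out directly as follows. As an implementation choice not fixed by the text, atan2 is used in place of tan⁻¹ so that the quadrant of α is preserved.

```python
import math

def reference_conversion_origin(x1, y1, x2, y2, q_deg, xc1, yc1):
    """Rotation-resistant reference point from the two best-match points
    (X1, Y1) and (X2, Y2) found after rotating +Q and -Q degrees about
    the corner (XC1, YC1), following Equations (1) through (4)."""
    q = math.radians(q_deg)
    alpha = math.atan2(x2 - x1, y1 - y2)                     # Equation (3)
    r = math.hypot(x2 - x1, y1 - y2) / (2.0 * math.sin(q))   # Equation (4)
    ax1 = xc1 + r * math.cos(alpha)                          # Equation (1)
    ay1 = yc1 + r * math.sin(alpha)                          # Equation (2)
    return ax1, ay1
```

Because Equations (3) and (4) use only coordinate differences, (X1, Y1) and (X2, Y2) may be given in any common frame; only (XC1, YC1) fixes the absolute position of the result.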
The method used to specify the reference conversion origin in Embodiment 3 can be used even if the positioning pattern is asymmetrical. Accordingly, the inclination-angle can be calculated with better precision, without being influenced by the shape of the positioning pattern.
Embodiment 4 corresponds to an embodiment in which the second embodiment in the above-described Japanese Patent Application Laid-Open (Kokai) No. 2002-208010 is applied to the bonding positioning pattern with regard to the reference conversion origin specification step. The detailed content of this reference conversion origin specification step will be described with reference to the internal flow chart shown in
The reference-image 10 acquired in the steps S10 through S14 illustrated in
A plurality of rotational center points are set within this reference-image 10 (S60). The conditions of this setting are shown in
Next, for one rotational center point, a rotated image obtained by rotating the reference-image 10 +Q° about this point is produced (S62). Then, the amount of matching between the positioning pattern of the rotated image thus produced and the positioning pattern of the reference-image 10 is determined (S64). Assuming that each positioning pattern is a bonding pad, and that the brightness data are the same for all pixels, it may be predicted that the amount of matching will be proportional to the overlapping area of the bonding pads. The steps S62 through S64 are performed for each rotational center point (S66 through S68).
Then, when the amounts of matching have been respectively determined for all of the rotational center points, the rotational center point showing the maximum amount of matching among these rotational center points is determined (S70). Generally, the rotational center point that is closest to the center of the positioning pattern P0 shows the maximum amount of matching. The coordinates of this rotational center point showing the maximum amount of matching are specified as the coordinates of the reference conversion origin (S72). In the example shown in
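Steps S60 through S72 can be sketched as follows. Grayscale NumPy images, nearest-neighbour rotation, and a sum-of-products overlap as the matching amount are all implementation choices not fixed by the text.

```python
import numpy as np

def rotate_about(image, cx, cy, q_rad):
    """Nearest-neighbour rotation of a grayscale image about (cx, cy)."""
    h, w = image.shape
    out = np.zeros_like(image)
    cos_q, sin_q = np.cos(q_rad), np.sin(q_rad)
    for y in range(h):
        for x in range(w):
            # Inverse mapping: sample the source pixel that lands on (x, y).
            dx, dy = x - cx, y - cy
            sx = int(round(cx + dx * cos_q + dy * sin_q))
            sy = int(round(cy - dx * sin_q + dy * cos_q))
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = image[sy, sx]
    return out

def best_rotational_center(image, centers, q_rad):
    """Rotate the image +Q about each candidate center and keep the center
    whose rotated pattern overlaps the original the most (S60 to S70);
    its coordinates become the reference conversion origin (S72)."""
    best_c, best_match = None, -1.0
    for (cx, cy) in centers:
        rotated = rotate_about(image, cx, cy, q_rad)
        match = float((image * rotated).sum())  # overlap as matching amount
        if match > best_match:
            best_match, best_c = match, (cx, cy)
    return best_c
```

As the text predicts, the candidate closest to the center of the positioning pattern wins, because rotation about the pattern's own center disturbs the overlap the least.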
Depending on the manner in which the rotational center points are set, there may be instances in which no single matching amount stands out, and the amounts of matching are nearly uniform. In such cases, the rotational center point showing the maximum amount of matching may be tentatively extracted, the matching amounts of the surrounding rotational center points may be compared, and the coordinates of a position recognized as being close to the center of the positioning pattern may be specified as the reference conversion origin. For example, a specified range below the maximum matching amount may be set, and a rotational center point whose matching amount falls within or near this range may be specified as the reference conversion origin.
Compared to the method of Embodiment 3, the method of Embodiment 4 is easier in terms of the determination of the reference conversion origin, and can greatly reduce the calculation time.
Foreign Application Priority Data: Japanese Patent Application No. 2003-348888, filed Oct. 2003 (JP).
This application is a divisional of Ser. No. 10/959,242, filed Oct. 6, 2004, now U.S. Pat. No. 7,209,583.
Foreign Patent Documents:
JP 63-56764, Mar. 1988
JP 2864735, Mar. 1999
JP 2002-208010, Jul. 2002
Publication: US 2007/0041632 A1, Feb. 2007.
Related U.S. Application Data: parent application Ser. No. 10/959,242, filed Oct. 2004 (US); child application Ser. No. 11/583,105 (US).