The present invention relates to a manufacturing method for an inspection device which inspects a surface of a test object represented by a semiconductor wafer.
An inspection device to inspect the surface of such a test object as a wafer has a line sensor that picks up an image of the test object and outputs image signals, and an image processing unit that creates two-dimensional image data of the test object based on image signals which are acquired by relatively moving the line sensor with respect to the test object in a sub-scanning direction (a direction perpendicular to the longitudinal direction of the line sensor), so that the surface of the test object is inspected based on the two-dimensional image data of the test object (e.g. see Patent Document 1). When a surface of a large test object is inspected using such an inspection device, either a method of picking up the image of the test object by scanning with the line sensor a plurality of times, or a method of picking up an image of the test object, reduced by a reduction optical system, by scanning with the line sensor once, is used. In the former case, however, it takes time to pick up the image of the test object, and in the latter case, a complicated optical system is required for the reduction optical system, and the pixel count of the image is limited by the pixel count of one line sensor. Therefore increasing the length of the line sensor is under consideration, so that the image of the entire test object can be picked up in one scan.
When a two-dimensional image acquired by the line sensor is used for inspection and measurement, distortion of the image causes errors in the inspection and measurement, so the distortion must be measured and corrected when the device is manufactured or adjusted (e.g. see Patent Document 2). In the case of measurement of the distortion using a two-dimensional camera image, an image of a lattice pattern, of which the pattern positions are known, is picked up, the positions of the intersections of the lattice pattern are measured, and the parameters for correction are calculated. In other words, the distortion amount is determined at the intersections of the lattice pattern, that is, at discrete positions, and based on these distortion amounts, the distortion correction parameters for the entire screen are decided. For the distortion of a two-dimensional camera image, the above-mentioned method can be used, since it can be assumed, based on the factors which generate the distortion, that the distortion amount within the screen is smooth and continuous.
Patent Document 1: National Publication of Translated Version No. 2001-524205
Patent Document 2: Japanese Patent Application Laid-Open No. H10-311705
In the case of a long line sensor, however, which is constructed by linearly connecting relatively short line sensor elements, such as one-dimensional photo-electric transfer elements, the distortion amount in the two-dimensional image is in some cases not smooth or continuous between adjacent pixels. In order to correct such a distortion of the two-dimensional image, the distortion amount must be measured for all the pixels of the line sensor, which is not supported by the conventional method using a lattice pattern.
With the foregoing in view, it is an object of the present invention to provide manufacturing steps for an inspection device using a line sensor, which allows measurement of the distortion amount of all the pixels of the line sensor.
To achieve this object, a manufacturing method for an inspection device according to a first aspect of the invention is a manufacturing method for an inspection device which has: a line sensor that picks up an image of a test object and outputs image signals, and is constructed by a plurality of line sensor elements connected in a line; and an image processing unit that creates two-dimensional image data of the test object based on the image signals acquired by relatively moving the line sensor in a sub-scanning axis direction that is perpendicular to a main scanning axis direction extending in a longitudinal direction of the line sensor, and which inspects a surface of the test object based on the two-dimensional image data of the test object, the method comprising a distortion measurement step of measuring a distortion amount of the two-dimensional image.
This distortion measurement step has: a first step of the line sensor picking up an image of a linear pattern extending substantially in parallel with the main scanning axis direction to create two-dimensional image data of the linear pattern, and calculating a distortion amount of the sub-scanning axis direction in the two-dimensional image from the created two-dimensional image data of the linear pattern; and a second step of the line sensor picking up an image of an oblique pattern extending in an inclined direction with respect to the main scanning axis direction to create two-dimensional image data of the oblique pattern, and calculating a distortion amount in the main scanning axis direction in the two-dimensional image from the two-dimensional image data of the oblique pattern and the distortion amount in the sub-scanning axis direction calculated in the first step.
In the above mentioned manufacturing method, it is preferable that in the second step, the distortion of the two-dimensional image of the oblique pattern is corrected using the distortion amount in the sub-scanning axis direction calculated in the first step, to create corrected image data, and the distortion amount in the main scanning axis direction is calculated from the created corrected image data.
It is preferable that the above mentioned manufacturing method further has: a third step of calculating and storing the distortion amounts in both the main scanning axis direction and the sub-scanning axis direction for each pixel of the line sensor.
A manufacturing method for an inspection device according to a second aspect of the invention is a manufacturing method similar to the manufacturing method for an inspection device according to the first aspect of the invention, and the distortion measurement step has: a first step of the line sensor picking up an image of a linear pattern extending substantially in parallel with the main scanning axis direction to create two-dimensional image data of the linear pattern; a second step of the line sensor picking up an image of an oblique pattern extending in an inclined direction with respect to the main scanning axis direction to create two-dimensional image data of the oblique pattern; a third step of setting a main scanning axis distortion amount, which is a distortion amount in the main scanning axis direction in the two-dimensional image, to zero, and setting a sub-scanning axis distortion amount, which is a distortion amount in the sub-scanning axis direction in the two-dimensional image, to zero; and a fourth step of copying the two-dimensional image data of the linear pattern created in the first step and setting the copied data in virtual two-dimensional image data.
The distortion measurement step further has: a fifth step of calculating a distortion amount in the sub-scanning axis direction in the virtual two-dimensional image from the virtual two-dimensional image data, and adding the calculated distortion amount in the sub-scanning axis direction to the sub-scanning axis distortion amount; a sixth step of correcting distortion of the two-dimensional image of the oblique pattern created in the second step, using the main scanning axis distortion amount and the sub-scanning axis distortion amount, to create corrected image data; a seventh step of calculating a distortion amount in the main scanning axis direction in the corrected image from the corrected image data and adding the calculated distortion amount in the main scanning axis direction to the main scanning axis distortion amount; an eighth step of correcting the distortion of the two-dimensional image of the linear pattern created in the first step, using the main scanning axis distortion amount and the sub-scanning axis distortion amount, and setting the corrected two-dimensional image data in the virtual two-dimensional image data; and a ninth step of repeating the fifth to eighth steps for a predetermined number of times or until the distortion amount in the sub-scanning axis direction in the virtual two-dimensional image calculated in the fifth step and the distortion amount in the main scanning axis direction in the corrected image calculated in the seventh step become smaller than a predetermined value respectively.
It is preferable that the above mentioned manufacturing method further has: a tenth step of calculating and storing the distortion amounts in both the main scanning axis direction and the sub-scanning axis direction for each pixel of the line sensor.
In each of the above mentioned manufacturing methods, it is preferable that in the distortion measurement step, the line sensor is relatively moved in the sub-scanning axis direction so as to pick up the images of the linear pattern and the oblique pattern all at once.
According to the present invention, in manufacturing steps of an inspection device using a line sensor, the distortion amount can be measured for all the pixels of the line sensor, including joints between line sensor elements.
Embodiments of the present invention will now be described with reference to the drawings.
The stage 15 supports the wafer 10 in a substantially horizontal state, and also supports the wafer 10 such that the wafer 10 can move in a sub-scanning axis direction, which is perpendicular to a main scanning axis direction extending along the longitudinal direction of the line sensor 20. Hereafter in the present embodiment, it is assumed that the main scanning axis direction is the X direction, and the sub-scanning axis direction is the Y direction, as shown in
As
The image processing unit 25 creates two-dimensional image data of the wafer 10 based on the (one-dimensional) image signals which are sequentially acquired by relatively moving the line sensor 20 in the Y direction. When the two-dimensional image data of the wafer 10 is created, the image processing unit 25 performs predetermined image processing on the created two-dimensional image data and inspects the surface of the wafer 10 (detects defects on the surface of the wafer 10), and the inspection result from the image processing unit 25 and the corresponding two-dimensional image of the wafer 10 are output to and displayed on the image display device 26.
Operation of the inspection device 1 constructed like this will be described with reference to the flow chart in
Then in step S102, the image processing unit 25 corrects the shading of the two-dimensional image of the wafer 10 created in step S101. The shading is corrected based on the shading data which was input to the storage unit (not illustrated) of the inspection device 1 in advance. The shading data is determined by the line sensor 20 picking up an image of a shading correction target 16, whose reflectance is uniform, and measuring the non-uniformity of brightness based on the two-dimensional image data of the picked-up image of the shading correction target 16.
In step S103, the image processing unit 25 corrects the distortion of the two-dimensional image of the wafer 10. The distortion is corrected based on the distortion data which was input to the storage unit (not illustrated) of the inspection device 1 in advance. The distortion data is determined by the line sensor 20 picking up an image of the later-mentioned distortion correction target 17, and measuring the distortion amount in the two-dimensional image from the two-dimensional image data of the picked-up image of the distortion correction target 17.
In step S104, the image processing unit 25 removes noise from the two-dimensional image of the wafer 10.
In step S105, the image processing unit 25 performs a predetermined image processing on the two-dimensional image data corrected in steps S102 to S104, and inspects the surface of the wafer 10 (detects defects on the surface of the wafer 10).
In step S106, the inspection result by the image processing unit 25 and the two-dimensional image of the wafer 10 at this time are output to and displayed on the image display device 26, and processing ends.
The distortion data is determined by measuring the distortion amount using the distortion correction target 17 when the inspection device 1 is manufactured or when the line sensor 20 is replaced, and is input to the storage unit (not illustrated) of the inspection device 1.
When the inspection device 1 according to the present embodiment is manufactured, the distortion amount in the two-dimensional image is measured in the distortion measurement step after the assembly step, and the distortion data determined based on this distortion amount is input to the storage unit of the inspection device 1. After the inspection step, the inspection device 1 is completed. Measurement of the distortion amount is not limited to after the assembly step; the distortion amount may instead be measured when assembly of the line sensor 20 is completed, using the single line sensor 20 and a dedicated test machine, and the resulting distortion data may be input separately to the storage unit of the inspection device 1 in which the line sensor 20 is installed.
As
The distortion can be measured as long as the oblique pattern 19 is inclined with respect to the linear pattern 18, but in terms of measurement accuracy, it is preferable that the inclination angle of the oblique pattern 19 with respect to the X direction is between 45° and 90°. In the distortion correction target 17, the accuracy of the linearity and angle of the patterns is assumed to be guaranteed by measurement means, which are not illustrated.
As mentioned above, the line sensor 20 is comprised of a plurality of one-dimensional photo-electric transfer elements 21 which are connected in a line, but the one-dimensional photo-electric transfer elements 21 are not always connected in a perfectly straight line, and may be in a somewhat distorted state, as shown in
The distortion measurement step using this distortion correction target 17 will now be described with reference to the flow chart in
In this way, the images of the linear pattern 18 and the oblique pattern 19 can be picked up all at once by picking up the image of the distortion correction target 17 while relatively moving the line sensor 20 in the Y direction (sub-scanning axis direction). Thereby the time for measuring the distortion amount can be decreased. The input image at this time, that is, the two-dimensional image of the distortion correction target 17 in
Then in step S202, all zeros are set in the distortion table, which holds the distortion amount measurement result, so that the measurement result is initialized. In the distortion table, the distortion amount in the X direction (hereafter called the “main scanning axis distortion amount”) and the distortion amount in the Y direction (hereafter called the “sub-scanning axis distortion amount”) are regarded as a set, and there are the same number of sets as the number of pixels of the line sensor 20. The distortion amounts are measured by adding the distortion amounts calculated in the subsequent steps to the distortion table, and as a result, the measurement accuracy is improved.
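As an illustration only, the distortion table described above can be pictured as one pair of values per line sensor pixel. The sketch below is a hypothetical representation in Python and is not part of the device; the array layout and names are assumptions.

```python
# Hypothetical sketch of the distortion table: one (main scanning axis,
# sub-scanning axis) distortion pair per line sensor pixel, initialized to
# zero as in step S202.
import numpy as np

def init_distortion_table(num_pixels):
    # Column 0: main scanning axis (X) distortion amount per pixel.
    # Column 1: sub-scanning axis (Y) distortion amount per pixel.
    return np.zeros((num_pixels, 2), dtype=float)
```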
In step S203, the image data of the input image A (linear pattern 18 and oblique pattern 19) created in step S201 is copied, and the image A is set in the image B, which is a virtual two-dimensional image.
Then in step S204, the distortion amount in the Y direction in the image B is calculated from the image data of the image B, using the linear pattern 18 of the image B. Specifically, the edge positions in the Y direction, with respect to the X coordinates, are measured at the pattern edges of the linear pattern 18 in the image B, that is, portions where the brightness level of the image changes from black to white in the top to bottom directions in
In this case, there is no pixel which has exactly scale 110 in most areas, so an edge position is measured not in integral pixel units, but with decimal accuracy, by interpolating between the pixels before and after scale 110. For example, if the threshold scale (110 in this case) is regarded as a, the brightness of the pixel at Y=64 (the coordinate at this time is regarded as b) is scale 90 (the scale at this time is regarded as c), and the brightness of the pixel at Y=65 (coordinate b+1) is scale 170 (the scale at this time is regarded as d), then the edge position is expressed as follows by linear interpolation between these two points.

Edge position = b + (a − c)/(d − c) = 64 + (110 − 90)/(170 − 90) = 64.25
Then the distortion amount in the Y direction is calculated based on the difference (shift amount or deviation) between the measured edge position in the Y direction and the ideal linear edge position. In other words, the distortion appears as bumps and inclinations in the image of the pattern 18, which should be a straight line. For the ideal linear edge position, the designed pattern position is used. If the designed position information is not available, a straight line calculated from the measured edge positions in the Y direction by the least squares method may be regarded as the ideal straight line. In this way, the distortion amount in the Y direction is calculated for each pixel of the line sensor 20.
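Purely as an illustration of the measurement in step S204, the following Python sketch reproduces the sub-pixel edge search and the deviation from a least-squares ideal line. It is not the device's implementation; the array layout (image indexed [Y, X], one column per line sensor pixel) and the threshold value are assumptions taken from the example above.

```python
# Illustrative sketch of step S204 (hypothetical names, assumed conventions).
import numpy as np

def edge_position(profile, threshold=110.0):
    # Sub-pixel position where brightness first rises through the threshold.
    for i in range(len(profile) - 1):
        c, d = profile[i], profile[i + 1]
        if c <= threshold < d:                     # crossing between i and i+1
            return i + (threshold - c) / (d - c)   # linear interpolation
    return None

def y_distortion_per_pixel(image_b, threshold=110.0):
    # Deviation of each column's edge from a least-squares ideal straight line.
    w = image_b.shape[1]                           # one column per line sensor pixel
    edges = np.full(w, np.nan)
    for x in range(w):
        pos = edge_position(image_b[:, x], threshold)
        if pos is not None:
            edges[x] = pos
    xs = np.arange(w)[~np.isnan(edges)]
    slope, intercept = np.polyfit(xs, edges[xs], 1)   # ideal line when design data is absent
    deviation = np.zeros(w)
    deviation[xs] = edges[xs] - (slope * xs + intercept)
    return deviation                               # Y-direction distortion per pixel
```

If the designed pattern position is available, it would simply replace the fitted line in this sketch.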
In the next step S205, the distortion amount in the Y direction calculated in step S204 is added to the sub-scanning axis distortion amount in the distortion table. At this point, the main scanning axis distortion amount in the distortion table remains zero.
In step S206, the distortion of the image A is corrected using the distortion table, and the image C, which is a corrected image, is created. In this correction, coordinate transformation is performed based on the main scanning axis distortion amount and the sub-scanning axis distortion amount saved in the distortion table (in concrete terms, the brightness of the pixel shifted by the distortion amount from a coordinate is regarded as the brightness at that coordinate), and the image C is created from the image A. The image at this stage is as shown in
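The coordinate transformation of step S206 can be sketched as follows. This is only an illustration under an assumed sign convention and nearest-pixel sampling; the device may well resample with sub-pixel interpolation, and the names are hypothetical.

```python
# Simplified sketch of the step S206 correction (assumed sign convention,
# nearest-pixel sampling). table[x] = (main-axis dx, sub-axis dy) for pixel x.
import numpy as np

def correct_distortion(image_a, table):
    h, w = image_a.shape
    image_c = np.zeros_like(image_a)
    for x in range(w):
        dx, dy = table[x]
        for y in range(h):
            src_x = int(round(x + dx))         # pixel shifted by the distortion amount
            src_y = int(round(y + dy))
            if 0 <= src_x < w and 0 <= src_y < h:
                image_c[y, x] = image_a[src_y, src_x]
    return image_c
```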
In step S207, the distortion amount in the X direction in the image C is calculated from the image data of the image C, using the oblique pattern 19 of the image C. In the image C, the distortion amount in the Y direction has already been corrected, so it is a safe assumption that the remaining distortion of the oblique pattern 19 is caused by the distortion in the X direction. Specifically, for the oblique pattern 19 of the image C, the edge position in the X direction, with respect to each Y coordinate, is measured in the same way as in step S204, at a portion where the brightness level changes from black to white in the left to right direction (X direction) in
If the inclination angle of the oblique pattern 19 with respect to the X direction is 45°, then approximately one edge exists at the X coordinate corresponding to each pixel of the line sensor 20, and one measurement value of the edge position is acquired per pixel. If the inclination angle of the oblique pattern 19 with respect to the X direction is smaller than 45°, there are X coordinates where no edge position measurement value can be acquired, which is not desirable.
Then the distortion amount in the X direction is calculated based on the difference (shift amount or deviation) between the measured edge position in the X direction and the edge position of an ideal straight line. The designed pattern position is used for the edge position of the ideal straight line. If the designed position information is not available, a straight line may be calculated from the measured edge positions in the X direction by the least squares method and used as the ideal straight line. In this way, the distortion amount in the X direction is calculated for each pixel of the line sensor 20.
The edge position is measured for each Y coordinate (pixel) of the image C, at a portion where an edge exists, scanning from left to right (X direction) in
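The X-direction measurement of step S207 mirrors the sketch given for step S204. A hedged illustration under the same assumptions is shown below, reusing the edge_position function from that sketch and attributing each residual to the line sensor pixel nearest the edge crossing; the names and the assignment rule are assumptions, not the device's method.

```python
# Illustrative sketch of step S207 on the corrected image C (assumed layout).
import numpy as np

def x_distortion_per_pixel(image_c, num_pixels, threshold=110.0):
    ys, edges = [], []
    for y in range(image_c.shape[0]):
        pos = edge_position(image_c[y, :], threshold)   # same 1-D search, row-wise
        if pos is not None:
            ys.append(y)
            edges.append(pos)
    ys, edges = np.asarray(ys), np.asarray(edges)
    slope, intercept = np.polyfit(ys, edges, 1)         # ideal oblique edge line
    residuals = edges - (slope * ys + intercept)
    dx = np.zeros(num_pixels)
    counts = np.zeros(num_pixels)
    for pos, r in zip(edges, residuals):
        px = int(round(pos))                            # nearest line sensor pixel
        if 0 <= px < num_pixels:
            dx[px] += r
            counts[px] += 1
    return np.divide(dx, counts, out=np.zeros_like(dx), where=counts > 0)
```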
In the next step S208, the distortion amount in the X direction calculated in step S207 is added to the main scanning axis distortion amount in the distortion table. By this, the distortion amounts in both the X and Y directions are acquired. The distortion amounts at this stage are already meaningful as measurement values, but a slight error may remain in the measured distortion amount in the Y direction because of the distortion in the X direction, which results in a measurement error, even if a very minor one. Therefore, in order to improve the measurement accuracy of the distortion amounts and to confirm that accuracy, the following steps are subsequently executed.
In step S209, the distortion in the image A is corrected using the distortion table, in the same way as step S206, and the corrected two-dimensional image (image data) is set in the image B (image data).
In step S210, steps S204 to S209 are repeated until a predetermined condition is satisfied. The distortion amounts (in the Y direction and the X direction) acquired in step S204 and step S207 are small, since they are now just residuals. By adding these values to the distortion table, the measurement accuracy of the distortion amounts can be improved. An example of the predetermined condition is that the distortion amounts (in the Y direction and the X direction) newly acquired in step S204 and step S207 become smaller than a predetermined value (threshold). At this point, it is judged that sufficient measurement accuracy has been acquired, and the measurement ends. The end of measurement is not limited to this; the measurement may also end when steps S204 to S209 have been repeated a predetermined number of times (set such that sufficient measurement accuracy can be achieved).
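Tying the above steps together, the overall loop of steps S202 to S210 could be pictured as in the following sketch, which reuses the illustrative helper functions from the earlier sketches. The iteration limit and the convergence threshold are placeholders, not values taken from the embodiment, and the image width is assumed to equal the number of line sensor pixels.

```python
# Hedged sketch of the measurement loop (steps S202 to S210), built on the
# illustrative helper functions defined in the earlier sketches.
import numpy as np

def measure_distortion(image_a, num_pixels, max_iter=10, tol=0.05):
    table = init_distortion_table(num_pixels)            # step S202: all zeros
    image_b = image_a.copy()                             # step S203
    for _ in range(max_iter):
        dy = y_distortion_per_pixel(image_b)             # step S204: linear pattern
        table[:, 1] += dy                                # step S205
        image_c = correct_distortion(image_a, table)     # step S206
        dx = x_distortion_per_pixel(image_c, num_pixels) # step S207: oblique pattern
        table[:, 0] += dx                                # step S208
        image_b = correct_distortion(image_a, table)     # step S209
        if max(np.max(np.abs(dy)), np.max(np.abs(dx))) < tol:
            break                                        # step S210: residuals small enough
    return table                                         # distortion table (distortion data)
```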
According to the procedure in
The distortion table acquired in this way shows the deviation amount of each pixel of the line sensor 20, in the X direction and the Y direction, from the position where the pixel should be. If the distortion table is determined in this way when the inspection device 1 in which the line sensor 20 is installed is manufactured, and is input to the storage unit (not illustrated) of the inspection device 1 in advance as the distortion data, the two-dimensional image acquired by the line sensor 20 can be easily corrected (distortion correction).
As a result, according to the manufacturing method for the inspection device 1 of the present embodiment, the distortion amount in the Y direction (sub-scanning axis direction) is calculated from the two-dimensional image data of the linear pattern 18, and the distortion amount in the X direction (main scanning axis direction) is calculated from the two-dimensional image data of the oblique pattern 19 and from the distortion amount in the Y direction calculated in the previous step. Therefore the distortion amounts in the X direction and the Y direction can be measured independently, and in the manufacturing steps of the inspection device 1 using the line sensor 20, the distortion amount can be measured for all the pixels of the line sensor 20, including the joints between the one-dimensional photo-electric transfer elements 21. Since the linear pattern 18 extends in the X direction, only the distortion amount in the Y direction can be calculated first, and then, using the distortion amount in the Y direction calculated in this way and the two-dimensional image data of the oblique pattern 19, only the distortion amount in the X direction can be calculated.
When the distortion amount in the X direction (main scanning axis direction) is calculated, the distortion of the two-dimensional image (image A) of the oblique pattern 19 is corrected using the distortion amount in the Y direction (sub-scanning axis direction) calculated in the previous step, to create corrected image data (image C), and the distortion amount in the X direction is calculated from the created corrected image data, so that only the distortion amount in the X direction can be easily measured.
In the above embodiment, the distortion correction target 57 may have a plurality of linear patterns 58 extending in parallel, and a plurality of oblique patterns 59 extending in parallel, as shown in
By increasing the number of linear patterns 58 and the number of oblique patterns 59 in this way, the deviation amounts of the edge positions in the X direction and the Y direction measured with the respective patterns can be averaged, for example, and thereby the measurement accuracy of the distortion amount can be further improved. Furthermore, the influence on measurement accuracy of foreign substances that happen to be on the distortion correction target 57 can be decreased by comparing the deviation amounts of the edge positions in the X and Y directions measured with the respective patterns, and removing abnormal values, or removing values that are greater or smaller than predetermined thresholds.
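One conceivable way to combine the measurements from a plurality of patterns, in the spirit of the averaging and outlier removal described above, is sketched below; the outlier limit and the use of the median are assumptions chosen only for illustration.

```python
# Hypothetical sketch: averaging per-pixel edge deviations measured with several
# patterns after discarding abnormal values (e.g. caused by foreign substances).
import numpy as np

def combine_pattern_measurements(deviations_per_pattern, limit=0.5):
    stacked = np.vstack(deviations_per_pattern)     # shape: (patterns, pixels)
    median = np.median(stacked, axis=0)
    outlier = np.abs(stacked - median) > limit      # values far from the others
    masked = np.where(outlier, np.nan, stacked)
    return np.nanmean(masked, axis=0)               # average of the remaining values
```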
In
If the inclination angle of the oblique pattern with respect to the X direction is θ, then the length L obtained when the portion of the oblique pattern 19 corresponding to one pixel of the line sensor 20 is projected in the Y direction is L = tan θ, as shown in
Generally, the greater the inclination angle of the oblique pattern with respect to the X direction, the better the measurement accuracy of the distortion amount, since the number of measurement values increases; however, the size of the distortion correction target in the Y direction also increases. Therefore an appropriate angle should be selected by balancing the measurement accuracy against the available space. Making L equal to 1 or a higher integer is preferable, since the number of edge position measurements acquired per pixel of the line sensor 20 then becomes the same for all the line sensor pixels. In the example in
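For concreteness, the relation between the inclination angle and L can be written out; the specific angles below are merely illustrative values for which L becomes an integer.

```latex
L = \tan\theta ,\qquad
\theta = 45^{\circ} \;\Rightarrow\; L = 1 ,\qquad
\theta = \arctan 2 \approx 63.4^{\circ} \;\Rightarrow\; L = 2 ,\qquad
\theta = \arctan 3 \approx 71.6^{\circ} \;\Rightarrow\; L = 3 .
```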
In the distortion measurement step according to the present embodiment, it is preferable that the sub-scanning, that is, the scanning by the line sensor 20, is performed accurately. Specifically, it is preferable that the errors in the linearity of the scanning by the line sensor 20 and in the positional accuracy of the line sensor 20 with respect to the distortion correction target 17 are controlled to be 1/10 or less of the distortion amount to be measured. It is also preferable that the actual relative position of the line sensor 20 with respect to the distortion correction target 17 is accurately matched with the relative position in the picked-up image, by counting the pulses of a linear encoder (not illustrated) to generate an image pickup pulse at each pixel.
In the above embodiment, the edge positions are determined from the brightness data of two adjacent pixels by linear interpolation in step S204 and step S207, but the present invention is not limited to this; for example, a cubic function (with respect to the coordinate) may be approximated using the brightness data of four adjacent pixels, and the edge positions may be determined using the approximated cubic function, whereby the measurement accuracy can be improved compared with linear interpolation.
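A hedged sketch of this cubic alternative is given below: a cubic in the coordinate is fitted through the four pixels surrounding the coarse crossing, and the edge position is taken as the coordinate at which the cubic reaches the threshold. The fitting and root-selection details are assumptions for illustration only.

```python
# Illustrative sketch of the cubic-interpolation variant for edge detection.
import numpy as np

def edge_position_cubic(profile, threshold=110.0):
    for i in range(1, len(profile) - 2):
        c, d = profile[i], profile[i + 1]
        if c <= threshold < d:                                    # coarse crossing found
            coords = np.arange(i - 1, i + 3)                      # four adjacent pixels
            coeffs = np.polyfit(coords, profile[i - 1:i + 3], 3)  # exact cubic fit
            coeffs[-1] -= threshold                               # roots where cubic == threshold
            roots = np.roots(coeffs)
            real = roots[np.isreal(roots)].real
            inside = real[(real >= i) & (real <= i + 1)]
            if inside.size:
                return float(inside[0])
            return i + (threshold - c) / (d - c)                  # fall back to linear
    return None
```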
Also in the above embodiment, the distortion correction target 17 has a linear pattern 18 and an oblique pattern 19, but the present invention is not limited to this configuration; a linear pattern and an oblique pattern may be created separately, and the two-dimensional images of the linear pattern and the oblique pattern may be picked up independently. In this case, the two-dimensional image of the linear pattern is used when the distortion amount in the Y direction is measured, and the two-dimensional image of the oblique pattern is used when the distortion amount in the X direction is measured.
This is a continuation of PCT International Application No. PCT/JP2008/053419, filed on Feb. 27, 2008, which is hereby incorporated by reference. This application also claims the benefit of Japanese Patent Application No. 2007-048479, filed in Japan on Feb. 28, 2007, which is hereby incorporated by reference.