This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2022-122623, filed Aug. 1, 2022, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an inspection apparatus, an inspection method, and a storage medium.
To inspect a specimen for anomalies, an apparatus configured to compare a plurality of images is used. Examples of the specimen to be inspected include a mask for manufacturing a semiconductor device. In the manufacture of a semiconductor device, a pattern is formed on a wafer by exposing the wafer through a mask in which a circuit pattern is formed on a material such as glass. Thus, anomalies such as pattern defects present in the mask and foreign matter adhering to the mask may lead to a decrease in the yield.
As a method of inspecting mask pattern defects, a die-and-die comparison inspection and a die-and-database comparison inspection, for example, are known. In a die-and-die comparison inspection, images of two dies on a reticle, namely, a reference image of a die to be a reference and an inspection image of a die to be an inspection target, are compared. In a die-and-database comparison inspection, a reference image generated from CAD data showing a design pattern and an inspection image of a die to be an inspection target are compared. As a method of inspecting defects and foreign matter, an inspection method of comparing images of a die to be an inspection target captured by different methods, for example, is known. Representative examples of such an inspection method include a method of comparing a transmission image generated with light that has transmitted through a specimen and a reflection image generated with light reflected from the specimen.
In recent years, with the reduction in size of semiconductor devices, the optical magnification at the time of imaging of an inspection target has been increasing. Accordingly, the effect of, for example, a position gap between comparison targets increases, and a difference arises between inspection images even in a normal region that does not include a defect or foreign matter to be detected, causing the problem of erroneous detection.
For example, an inspection method is known in which a parameter for correcting a position gap between a reference image and an inspection image is calculated by simultaneous equations, a correction image is generated by correcting the position of the reference image using the parameter, and a comparison is made between the correction image and the inspection image. However, in such a method in which a position gap is precisely calculated from actual images and then corrected, complicated processing, such as solving simultaneous equations and retaining a large number of items of correction information for each pattern shape, is required. Moreover, with the improvement in optical magnification, correcting pattern fluctuations not resulting from defects and/or uneven displacements between images resulting from optical characteristics, in addition to a simple position gap, will require even more complicated processing.
In general, according to one embodiment, an inspection apparatus includes processing circuitry. The processing circuitry is configured to: acquire a first image and a second image for inspecting an inspection target; generate a plurality of deformed images by applying a plurality of deformation processes to at least one of the first image or the second image; calculate, for each pixel, a difference value between a pixel value of the first image and a pixel value of the second image, using the deformed images; calculate a pixel-by-pixel integrated difference value by integrating a plurality of difference values calculated for the respective deformed images; and detect an anomaly of the inspection target based on the pixel-by-pixel integrated difference value.
Hereinafter, an inspection apparatus, an inspection method, and a program according to an embodiment will be described in detail with reference to the drawings. In the following description, constituent elements having substantially the same function and configuration will be assigned a common reference symbol, and redundant descriptions will be given only where necessary.
<Description of Apparatus>
The inspection apparatus 10, as shown in
It is to be noted that the image acquisition unit 101, the image deformation unit 102, the difference generation unit 103, the difference integration unit 104, and the anomaly detection unit 105 may be either realized by software executed by a processor such as a CPU or realized by dedicated hardware. That is, the functions of the image acquisition unit 101, the image deformation unit 102, the difference generation unit 103, the difference integration unit 104, and the anomaly detection unit 105 may be either realized by a single processing circuit, or realized by a combination of independent processors configuring processing circuitry and executing the respective programs. Also, the functions of the image acquisition unit 101, the image deformation unit 102, the difference generation unit 103, the difference integration unit 104, and the anomaly detection unit 105 may be implemented as individual hardware circuits.
The storage medium stores processing programs used in the processor, as well as parameters, tables, etc. used in computation at the processor. The storage medium is a storage device such as a hard disk drive (HDD), a solid-state drive (SSD), or an integrated circuit configured to store various types of information. Besides an HDD, an SSD, etc., the storage medium may be a portable storage medium such as a compact disc (CD), a digital versatile disc (DVD), or a flash memory, or may be a drive device configured to read and write various types of information from and to a semiconductor memory device such as a flash memory or a random-access memory (RAM).
The image acquisition unit 101 acquires an image 1 and an image 2 for inspecting an inspection target. The image 1 corresponds to a first image. The image 2 corresponds to a second image. The image acquisition unit 101 outputs the image 1 to the image deformation unit 102, and outputs the image 2 to the difference generation unit 103. The images 1 and 2 may be stored in advance in the memory of the inspection apparatus 10, or may be acquired from an external device or a database connected to the inspection apparatus 10 in a wired or wireless manner.
The images 1 and 2 are two types of images used to inspect a mask for anomalies. In the case of, for example, detecting foreign matter adhering to a mask, a transmission image generated based on light that has transmitted through the mask to be inspected and a reflection image generated based on light reflected from the mask to be inspected are used as the images 1 and 2. For example, if the image 1 is a transmission image, the image 2 is a reflection image, and if the image 1 is a reflection image, the image 2 is a transmission image.
In the case of the inspection apparatus 10 being an inspection apparatus configured to detect a defect of a mask, a mask image to be a reference (hereinafter referred to as a “reference image”) and a mask image to be inspected (hereinafter referred to as an “inspection image”) are respectively used as the images 1 and 2. For example, if the image 1 is a reference image, the image 2 is an inspection image, and if the image 1 is an inspection image, the image 2 is a reference image. The reference image is, for example, an image generated from CAD data showing a design pattern of a mask, or an image generated by photographing a normal mask.
The image deformation unit 102 generates a plurality of deformed images by applying a plurality of deformation processes to at least one of the images 1 and 2. In the present embodiment, the image deformation unit 102 generates a plurality of deformed images by performing a plurality of different deformation processes on the image 1 output from the image acquisition unit 101. Thereafter, the image deformation unit 102 outputs the generated deformed images to the difference generation unit 103.
In the case of using a transmission image and a reflection image as the images 1 and 2, dilation processing or erosion processing, for example, is used as the deformation processing. The dilation processing refers to processing of replacing a pixel value of each pixel with a largest pixel value of its neighboring pixels. The erosion processing refers to processing of replacing a pixel value of each pixel with a smallest pixel value of its neighboring pixels. In the case of using, for example, dilation processing as the deformation processing, a plurality of dilation processes in which different ranges of neighboring pixels are referred to are used as the plurality of different deformation processes. In the case of using erosion processing as the deformation processing, a plurality of erosion processes in which different ranges of neighboring pixels are referred to are used as the plurality of different deformation processes.
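As a concrete illustration, the dilation and erosion processes described above can be sketched as follows. This is a minimal NumPy sketch; the function names and the square (2r+1)-by-(2r+1) neighborhood shape are assumptions for illustration, not part of the embodiment:

```python
import numpy as np

def dilate(img, r):
    """Dilation: replace each pixel value with the largest pixel value in
    its (2r+1) x (2r+1) neighborhood.  r = 0 leaves the image unchanged."""
    if r == 0:
        return img.copy()
    p = np.pad(img, r, mode="edge")
    h, w = img.shape
    return np.array([[p[y:y + 2*r + 1, x:x + 2*r + 1].max()
                      for x in range(w)] for y in range(h)])

def erode(img, r):
    """Erosion: replace each pixel value with the smallest pixel value in
    its neighborhood (equivalently, dilation of the sign-negated image)."""
    return -dilate(-img, r)
```

Dilation processes that refer to different ranges of neighboring pixels then correspond simply to different values of `r`.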
The difference generation unit 103 calculates, for each pixel, a difference value (hereinafter also referred to as a “pixel value difference”) between a pixel value of the image 1 and a pixel value of the image 2, using the plurality of deformed images. In the present embodiment, the difference generation unit 103 acquires the plurality of deformed images output from the image deformation unit 102 and the image 2 output from the image acquisition unit 101, and calculates, for each pixel, a difference value between a pixel value of each of the deformed images and a pixel value of the image 2. Thereafter, the difference generation unit 103 outputs a plurality of difference values obtained for each deformed image to the difference integration unit 104.
A transmission image and a reflection image are light-dark inverted relative to each other. Accordingly, in the case of using a transmission image and a reflection image as the images 1 and 2, to calculate a difference value, the difference generation unit 103 generates an inverted image by applying inversion processing that performs light-dark inversion on one of the transmission image and the reflection image, and calculates the difference value using the inverted image and the normal image, namely, the one of the transmission image and the reflection image to which the inversion processing is not applied.
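For 8-bit images, the light-dark inversion can be sketched as below; the full-scale value of 255 is an assumption for illustration:

```python
import numpy as np

def invert(img, full_scale=255):
    """Light-dark inversion: a bright pixel becomes dark and vice versa,
    making an inverted reflection image directly comparable with a
    transmission image."""
    return full_scale - img
```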
Also, to calculate a difference value, the difference generation unit 103 subjects the inverted image or the normal image to a pixel value correction so that a range of pixel values of the inverted image and a range of pixel values of the normal image match each other, and calculates the difference value using the image subjected to the pixel value correction. By thus calculating a difference value after adjusting pixel values of two images between which the difference value is calculated, a difference value in a normal region in which an anomaly is not present can be kept small.
The difference integration unit 104 integrates difference values output from the difference generation unit 103, and obtains a final difference value. In other words, the difference integration unit 104 generates a single difference value by merging a plurality of difference values. Specifically, the difference integration unit 104 calculates, for each pixel, a difference value (hereinafter referred to as an “integrated difference value”) obtained by integrating a plurality of difference values calculated for each of the deformed images. For example, for each pixel, a difference value whose absolute value is smallest is selected from among a plurality of difference values, and an integrated difference value is calculated as the absolute value of the selected difference value. By thus calculating the integrated difference value, a difference between the pixel value of the image 1 and the pixel value of the image 2 resulting from a small displacement in pattern can be kept small. It is to be noted that the difference values may be integrated by any other method. The difference integration unit 104 outputs the calculated integrated difference value to the anomaly detection unit 105.
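The integration described above, which selects, per pixel, the difference whose absolute value is smallest, can be sketched as follows; the function name and array layout are assumptions:

```python
import numpy as np

def integrate_differences(diff_stack):
    """diff_stack has shape (n_deformed, H, W): one signed per-pixel
    difference map per deformed image.  The integrated difference value
    at each pixel is the smallest absolute difference over the stack."""
    return np.abs(diff_stack).min(axis=0)
```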
The anomaly detection unit 105 detects an anomaly of an inspection target based on the integrated difference value. Specifically, the anomaly detection unit 105 performs, for each pixel, normal/anomalous determination based on the integrated difference value output from the difference integration unit 104. If there are any anomalies such as adhesion of foreign matter or existence of a defect, the integrated difference value of a pixel in a region to which the foreign matter adheres or a region in which the defect exists becomes large. Accordingly, by performing, for example, threshold processing on the integrated difference value, it is possible to perform determination as to whether each pixel is normal or anomalous. The threshold value is set in advance and is stored in, for example, the memory of the inspection apparatus 10. It is to be noted that the anomaly detection may be performed by any other method using the integrated difference value. The anomaly detection unit 105 outputs anomaly information containing an anomaly detection result to an external device, etc.
As described above, in processing of the difference integration unit 104, by selecting a difference value whose absolute value is smallest from among a plurality of difference values, a difference between the pixel values resulting from a small displacement in pattern can be kept small. It is thereby possible to detect only differences between the pixel values resulting from anomalies of the mask, resulting in improvement in anomaly detection properties. That is, since the effect of a small pattern displacement between the images 1 and 2 is suppressed by the processing of the difference integration unit 104, the integrated difference value becomes small in a normal region in which an anomaly is not present. Accordingly, in the processing of the anomaly detection unit 105, it is possible to correctly detect an anomaly by determining only a pixel with a difference value equal to or higher than a threshold value to be anomalous.
<Description of Operation>
Next, an operation of processing executed by the inspection apparatus 10 will be described. Herein, processing of detecting foreign matter adhering to a mask using a transmission image and a reflection image will be described as an example of anomaly detection processing. Also, a case will be described where the image 1 is a transmission image and the image 2 is a reflection image.
General characteristics of a transmission image and a reflection image will be described with reference to
In general, a displacement resulting from optical characteristics occurs at the pattern edge between a transmission image and a reflection image. Accordingly, a pattern of a light portion in the transmission image is photographed in a smaller size compared to a pattern of a dark portion in the reflection image, as shown in
(Step S1)
At step S1, the image acquisition unit 101 acquires a transmission image and a reflection image. The transmission image and the reflection image are acquired by, for example, actually capturing the images. In this case, an inspection target specimen is actually irradiated with light such as visible light, ultraviolet light, etc., and transmitted light, which has transmitted through the specimen, and reflected light, which has been reflected from the specimen, are received by a sensor, thereby generating the transmission image and the reflection image based on a result of the reception. Alternatively, the transmission image and the reflection image may be acquired by reading, from a storage medium such as a memory, a transmission image and a reflection image captured in advance. After acquiring the transmission image and the reflection image, the image acquisition unit 101 outputs the transmission image to the image deformation unit 102, and outputs the reflection image to the difference generation unit 103.
(Step S2)
At step S2, the image deformation unit 102 applies deformation processing to the transmission image output from the image acquisition unit 101. Since the light-portion pattern in the transmission image is photographed in a smaller size compared to the dark-portion pattern of the reflection image at an identical region, as described above, the size of the light-portion pattern of the transmission image can be made closer to that of the dark-portion pattern of the reflection image by applying dilation processing to the transmission image.
In the present embodiment, an example has been described in which three types of dilation images are generated as a plurality of deformed images; however, the number of deformed images that are generated may be freely set according to the properties of the image of the inspection target. For example, two types of deformed images may be generated by dilation processes in which different numbers of neighboring pixels are referred to; furthermore, four or more types of deformed images may be generated. Also, the number of pixels by which the dilation is performed may be set according to the number of pixels by which the pattern edge is displaced between the transmission image and the reflection image. For example, the number of pixels by which the pattern edge is displaced may be studied in advance, and dilation processing of dilating the light-portion pattern by that number may be applied.
(Step S3)
At step S3, the difference generation unit 103 calculates a difference value between a pixel value of a reflection image output from the image acquisition unit 101 and a pixel value of each of the deformed images output from the image deformation unit 102. The difference value is a difference between the pixel values (brightness values). In the present embodiment, since three types of dilation images are output by the image deformation unit 102, the difference generation unit 103 calculates, for each pixel, three difference values, namely, a difference value between the pixel value of the 0-pixel dilated image and the pixel value of the reflection image, a difference value between the pixel value of the 1-pixel dilated image and the pixel value of the reflection image, and a difference value between the pixel value of the 2-pixel dilated image and the pixel value of the reflection image.
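Collecting the three per-pixel differences can be sketched as follows; here the comparison image is assumed to be the inverted (and range-corrected) reflection image whose preparation is described in the paragraphs that follow, and the function name is an assumption:

```python
import numpy as np

def difference_stack(deformed_images, comparison):
    """Signed per-pixel difference between each deformed image (here the
    0-, 1-, and 2-pixel dilated transmission images) and the comparison
    image, stacked along a new leading axis."""
    return np.stack([d.astype(np.int32) - comparison.astype(np.int32)
                     for d in deformed_images])
```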
Herein, a method of calculating the difference value will be described in detail with reference to
The pixel values of the light and dark portions may differ between the deformed image and the inverted reflection image. Accordingly, the difference generation unit 103 performs correction processing of correcting pixel values of one or both of the deformed image and the inverted reflection image to make the ranges of the pixel values (pixel value ranges) of the deformed image and the inverted reflection image match. By calculating difference values using the corrected images, a difference value in the normal region can be kept small.
Herein, a case will be described, as an example, where a pixel value R (x, y) at a pixel position (x, y) of the inverted reflection image is corrected by Formula (1).
The RL, RD, TL, and TD in Formula (1) respectively denote a representative pixel value of a light portion of the inverted reflection image, a representative pixel value of a dark portion of the inverted reflection image, a representative pixel value of a light portion of the deformed image, and a representative pixel value of a dark portion of the deformed image. Representative values may be determined in advance using results of calibration at the time of imaging, or may be directly calculated from images. In the case of directly calculating representative values from images, an average or median value of pixel values in a partial region of a light or dark portion, or a peak pixel value in a histogram of pixel values, for example, may be used as a representative value. By applying the linear transformation of Formula (1) to the inverted reflection image, the pixel values of the light/dark portions of the inverted reflection image and the deformed image are made to match. Accordingly, by performing correction processing, it is possible to suppress differences other than a difference resulting from a pattern displacement between the transmission image and the reflection image. In the example of
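The exact expression of Formula (1) is not reproduced in this text; the sketch below assumes the standard linear map consistent with the description above, namely, one that sends the representative dark value RD to TD and the representative light value RL to TL:

```python
import numpy as np

def correct_pixel_values(R, RL, RD, TL, TD):
    """Linear pixel-value correction of the inverted reflection image R:
    RD maps to TD and RL maps to TL, so the light/dark pixel value ranges
    of the two images match.  This linear form is an assumption; the
    embodiment's Formula (1) itself is not shown in this text."""
    return (R - RD) * (TL - TD) / (RL - RD) + TD
```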
A case has been described, as an example, where a reflection image is light-dark inverted through inversion processing, and the inverted reflection image is pixel-value corrected through correction processing, thereby correcting pixel values of the inverted reflection image in accordance with those of the deformed image; however, such processing may be performed on a deformed image of a transmission image. For example, a deformed image may be light-dark inverted by applying inversion processing to the deformed image. Moreover, pixel values of an inverted deformed image may be corrected in accordance with those of a reflection image. Furthermore, one of inversion processing and correction processing may be applied to a deformed image, and the other processing may be applied to a reflection image.
Moreover, inversion processing may be executed on a transmission image prior to generation of a deformed image of the transmission image. In this case, erosion processing is applied, instead of dilation processing, as deformation processing to an inverted transmission image obtained by inverting the transmission image. Thereby, an image similar to the image obtained by performing inversion processing on a dilation image of a transmission image can be generated.
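The equivalence stated here, that inverting a dilated transmission image gives the same result as eroding an inverted transmission image, can be checked on a toy 1-D profile; the helper names and values are illustrative assumptions:

```python
import numpy as np

def dilate1d(row, r):
    """1-D grey-scale dilation (max over a 2r+1 window, edge padding)."""
    p = np.pad(row, r, mode="edge")
    return np.array([p[i:i + 2*r + 1].max() for i in range(len(row))])

def erode1d(row, r):
    """1-D grey-scale erosion (min over a 2r+1 window, edge padding)."""
    p = np.pad(row, r, mode="edge")
    return np.array([p[i:i + 2*r + 1].min() for i in range(len(row))])

t = np.array([10, 10, 200, 10, 10])   # toy transmission profile
a = 255 - dilate1d(t, 1)              # dilate, then invert
b = erode1d(255 - t, 1)               # invert, then erode
assert np.array_equal(a, b)           # the two orderings agree
```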
(Step S4)
At step S4, the difference integration unit 104 calculates an integrated difference value by integrating a plurality of difference values output from the difference generation unit 103. The integrated difference value is a final difference value. The difference integration unit 104 outputs the calculated integrated difference value to the anomaly detection unit 105. Example methods of integrating the difference values include a method of comparing absolute values of the difference values calculated for each deformed image, and selecting the smallest value as the final integrated difference value. With such a method, it is possible to suppress a difference value resulting from a pattern displacement.
Processing of calculating difference values and processing of integrating the difference values will be described in detail with reference to
It can be seen that, since the pattern width of the light portion of the transmission image is smaller than that of the inverted reflection image, the difference value between the deformed image 1 (0-pixel dilated image), which is the same as the transmission image, and the inverted reflection image is large in the peripheral region A2 of the pattern edge, as shown in
Also, in the present embodiment, processing (hereinafter referred to as "edge specification processing") of specifying a position of a pattern edge and making an integrated difference value at the specified position zero may be executed to further suppress the difference values at the pattern edge. The edge specification processing need not necessarily be executed.
However, in the case where the edge specification processing is applied, a small anomaly present in the periphery of the pattern edge, if any, may fail to be detected. Accordingly, it is preferable to switch whether the edge specification processing is applied according to the characteristics of the anomaly to be detected.
(Step S5)
At step S5, the anomaly detection unit 105 performs threshold processing on the integrated difference value output from the difference integration unit 104, and performs determination as to whether the inspection target is normal or anomalous. As described above, with the processing at steps S1-S4, a difference value that has occurred by a cause other than an anomaly such as foreign matter or a defect is suppressed. Accordingly, the integrated difference value becomes large only at a pixel at which an anomaly is present. Thus, by specifying the pixel at which the integrated difference value is large by the threshold processing, the presence of the anomaly can be detected. The threshold value may be a value set in advance, or may be relatively (dynamically) set using a histogram, etc. of the integrated difference values of the entire image.
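The threshold determination can be sketched as follows. The dynamic variant derives the threshold from statistics of the whole map; the mean-plus-k-sigma rule here is an illustrative assumption, not the embodiment's specific method:

```python
import numpy as np

def detect_anomalies(integrated_diff, threshold=None, k=6.0):
    """Per-pixel normal/anomalous decision on the integrated difference
    map.  If no fixed threshold is given, one is set relative to the
    statistics of the whole map (mean + k * std; an assumed heuristic)."""
    if threshold is None:
        threshold = integrated_diff.mean() + k * integrated_diff.std()
    return integrated_diff >= threshold
```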
In
If an anomaly is detected at a plurality of continuous pixels, further determination may be performed, depending on the size and shape of the area including the plurality of continuous pixels, as to the type of the anomaly, whether or not the detection of the anomaly is erroneous, etc.
(Effects)
The effects of the inspection apparatus 10 according to the present embodiment will be described.
As described above, as the position gap at the pattern edge between the transmission image and the reflection image increases with the increase in optical magnification at the time of imaging of an inspection target, the difference value (pixel value difference) between the pixel values increases even in a normal region in which an anomaly such as foreign matter or a defect is not present, possibly causing erroneous anomaly detection.
The inspection apparatus 10 according to the present embodiment includes an image acquisition unit 101, an image deformation unit 102, a difference generation unit 103, a difference integration unit 104, and an anomaly detection unit 105. The image acquisition unit 101 acquires a transmission image and a reflection image. The image deformation unit 102 generates a plurality of deformed images (a 0-pixel dilated image, a 1-pixel dilated image, and a 2-pixel dilated image) by applying a plurality of dilation processes with different dilation numbers to the transmission image. The difference generation unit 103 calculates, for each pixel, a difference value between the pixel value of the transmission image and the pixel value of the reflection image using the plurality of deformed images. At this time, the difference generation unit 103 generates an inverted image (inverted reflection image) by applying inversion processing of light-dark inverting the reflection image, and calculates a difference value using a normal image (a deformed image of the transmission image) to which inversion processing is not applied and the inverted image (inverted reflection image). The difference integration unit 104 calculates, for each pixel, an integrated difference value obtained by integrating a plurality of difference values calculated for each of the deformed images. For example, for each pixel, the difference integration unit 104 selects, as the integrated difference value, a difference value whose absolute value is smallest from a plurality of difference values. The anomaly detection unit 105 detects foreign matter adhering to a mask based on the integrated difference value.
With the above-described configuration of the inspection apparatus 10 according to the present embodiment, since the position of the pattern edge in the dilation image of the transmission image is close to that in the inverted reflection image, a difference value resulting from a position gap at the pattern edge can be made small by calculating the difference value using the dilation image and the inverted reflection image. Also, by integrating difference values calculated using a plurality of deformed images, it is possible to select, for each pixel, an optimum difference value without the need to strictly correct a position gap. Thereby, the effect of a difference resulting from the position gap at the pattern edge is suppressed, and only foreign matter adhering to the mask has a large effect on the integrated difference value. Accordingly, by performing anomaly detection using the integrated difference value, it is possible to suppress erroneous detection at the time of detection of foreign matter, and to improve the detection precision of foreign matter.
It suffices that the deformation processing is applied to at least one of a transmission image and a reflection image. For the deformation processing, either dilation processing or erosion processing is applied according to the image to which it is applied, the order of the other processes, etc. In the present embodiment, a case has been described where dilation processing is applied only to a transmission image and deformation processing is not applied to a reflection image; however, deformation processing may be applied only to the reflection image, or applied to both the transmission image and the reflection image.
For example, if only deformed images of a transmission image are generated, as in the present embodiment, the difference generation unit 103 calculates, as a difference value, a difference between the pixel value of each of the deformed images and the pixel value of the inverted image. If only deformed images of an inverted image are generated, the difference generation unit 103 calculates, as a difference value, a difference between the pixel value of each of the deformed images and the pixel value of the transmission image. If both deformed images of a transmission image and deformed images of an inverted image are generated, the difference generation unit 103 calculates, as a difference value, a difference between the pixel value of each of the deformed images of the transmission image and the pixel value of each of the deformed images of the inverted image.
It suffices that the inversion processing is applied to one of a transmission image and a reflection image. In the present embodiment, a case has been described where inversion processing is applied to a reflection image; however, inversion processing may be applied to a transmission image. Also, both inversion processing and deformation processing may be applied to either a transmission image or a reflection image; in this case, the deformation processing may be applied after application of the inversion processing, or the inversion processing may be applied after application of the deformation processing.
As described in the present embodiment, in the case where deformation processing is to be applied to a transmission image and inversion processing is to be applied to a reflection image, dilation processing is used as the deformation processing. In the case where deformation processing is to be applied to a reflection image and inversion processing is to be applied to a transmission image, dilation processing is used as the deformation processing. In the case of applying both deformation processing and inversion processing to either a transmission image or a reflection image, the inversion processing is performed after performing dilation processing, or erosion processing is performed after performing inversion processing.
In the present embodiment, to calculate a difference value, the difference generation unit 103 subjects an inverted image or a normal image to pixel value correction so that a range of pixel values of the inverted image and a range of pixel values of the normal image match each other, and calculates the difference value using the image subjected to the pixel value correction. By thus adjusting pixel values of two images based on which difference values are to be calculated and then calculating the difference values, a difference value in a normal region in which an anomaly is not present can be kept small.
In the present embodiment, edge specification processing of specifying a position of a pattern edge and making an integrated difference value at the specified position zero may be executed. At this time, the difference integration unit 104 sets, to zero, an integrated difference value of a pixel for which the calculated difference values have different signs. As the degree of deformation in the deformation processing increases, the positional relationship at the pattern edge between the deformed image and the normal image to which deformation processing is not applied is reversed; utilizing this characteristic, the position of the pattern edge can be specified by finding a pixel at which the magnitude relationship between the pixel value of each deformed image and the pixel value of the normal image is reversed. By replacing an integrated difference value at a pixel estimated to be located at the pattern edge with zero, it is possible to suppress an increase of the difference value due to the effect of a pattern displacement in the periphery of the pattern edge.
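The sign-based edge specification can be sketched on top of the min-absolute-value integration as follows; the function name and array layout are assumptions:

```python
import numpy as np

def integrate_with_edge_suppression(diff_stack):
    """Min-|diff| integration with edge specification: a pixel at which
    the signed differences of the deformed images disagree in sign is
    taken to lie on a pattern edge, and its integrated value is zeroed."""
    integrated = np.abs(diff_stack).min(axis=0)
    sign_flip = (diff_stack.max(axis=0) > 0) & (diff_stack.min(axis=0) < 0)
    return np.where(sign_flip, 0, integrated)
```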
(First Modification)
A first modification will be described. In the present modification, the configurations of the embodiment are modified in the manner described below. Descriptions of configurations, operations, and effects similar to those of the embodiment will be omitted. An inspection apparatus 10 according to the present modification differs from that of the present embodiment in that determination as to whether or not dilation processing is to be applied is performed for each pixel, in view of the concern that dilation processing may erase an anomaly and thereby prevent its detection.
In the present modification, in the case where dilation processing is to be applied as the deformation processing, the image deformation unit 102 determines, for each pixel, whether or not to apply the dilation processing according to an amount of change between a pixel value of an image to which erosion processing has been applied after the dilation processing and a pixel value of an image not subjected to deformation processing. Likewise, in the case where erosion processing is to be applied as the deformation processing, the image deformation unit 102 determines, for each pixel, whether or not to apply the erosion processing according to an amount of change between a pixel value of an image to which dilation processing has been applied after the erosion processing and a pixel value of an image not subjected to deformation processing. At this time, the image deformation unit 102 does not apply deformation processing to a pixel at which the change amount is large, and applies deformation processing to a pixel at which the change amount is small.
In the case where, for example, dilation processing is to be applied to a transmission image, the image deformation unit 102 determines, for each pixel, whether or not to apply the dilation processing according to an amount of change between a pixel value of an image to which erosion processing has been applied after dilation processing and a pixel value of a transmission image not subjected to dilation processing. At this time, the image deformation unit 102 does not apply deformation processing to a pixel at which the change amount is large, and applies deformation processing to a pixel at which the change amount is small.
Also, in the case where erosion processing is to be applied to an inverted transmission image obtained by performing inversion processing on a transmission image, the image deformation unit 102 determines, for each pixel, whether or not to apply the erosion processing according to an amount of change between a pixel value of an image to which dilation processing has been applied after erosion processing and a pixel value of an inverted transmission image not subjected to erosion processing. At this time, the image deformation unit 102 does not apply erosion processing to a pixel at which the change amount is large, and applies erosion processing to a pixel at which the change amount is small.
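Applying erosion after dilation is the morphological closing, so the change amount used above can be computed as the difference between the closing of an image and the image itself; dilation is then applied only where that change is small. The following sketch illustrates this under assumptions not stated in the text: SciPy morphology, a 3×3 structuring element, and an arbitrary threshold of 30.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def selective_dilation(img, size=(3, 3), threshold=30):
    """Apply dilation only to pixels whose value barely changes when
    erosion is applied after dilation (morphological closing). Pixels
    with a large change, such as small dark foreign matter that dilation
    would erase, are left untouched so the anomaly remains detectable."""
    dilated = grey_dilation(img, size=size)
    closed = grey_erosion(dilated, size=size)  # erosion after dilation
    change = np.abs(closed.astype(np.int32) - img.astype(np.int32))
    return np.where(change <= threshold, dilated, img)

img = np.full((5, 5), 200, dtype=np.uint8)
img[2, 2] = 20                    # small dark foreign matter
out = selective_dilation(img)
assert out[2, 2] == 20            # change was large: dilation skipped here
```

Without the per-pixel determination, plain dilation would replace the dark pixel with 200 and the anomaly would disappear from the deformed image.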
A pixel value of a deformed image 1 (0-pixel dilated image) in the case where small-size foreign matter is contained in a transmission image is shown in
A pixel value of a comparison image obtained by applying a 1-pixel erosion process to the deformed image 2 is shown in
A pixel value of a deformed image 2 obtained by applying dilation processing to a pixel with a change amount equal to or smaller than a predetermined value and not applying dilation processing to a pixel with a change amount greater than the predetermined value, and a pixel value of the inverted reflection image are shown in
(Second Modification)
A second modification will be described. In the present modification, the configurations of the embodiment are modified in the manner described below. Descriptions of configurations, operations, and effects similar to those of the embodiment will be omitted. An inspection apparatus 20 according to the present modification differs from the inspection apparatus 10 in that the image input to the image deformation unit is an image 2 and that the processing of the difference generation unit is varied.
<Description of Apparatus>
The image deformation unit 202 generates a plurality of deformed images by performing deformation processing on the image 2 output from the image acquisition unit 101. Thereafter, the image deformation unit 202 outputs the generated deformed images to the difference generation unit 203. That is, the inspection apparatus 20 differs from the inspection apparatus 10 in that deformation processing is applied not to the image 1 but to the image 2.
The difference generation unit 203 calculates, for each pixel, a difference value between the image 1 output from the image acquisition unit 101 and each of the deformed images output from the image deformation unit 202. The difference generation unit 203 outputs the obtained difference values to the difference integration unit 104.
<Description of Operation>
Next, an operation of processing executed by the inspection apparatus 20 will be described.
(Step S1)
At step S1, the image acquisition unit 101 acquires a transmission image and a reflection image, similarly to the embodiment. After acquiring the transmission image and the reflection image, the image acquisition unit 101 outputs the reflection image to the image deformation unit 202, and outputs the transmission image to the difference generation unit 203.
(Step S2)
At step S2, the image deformation unit 202 applies deformation processing to the reflection image output from the image acquisition unit 101. In the present modification, too, dilation processing is used as the deformation processing. The image deformation unit 202 applies three types of different dilation processes, namely, a 0-pixel dilation process, a 1-pixel dilation process, and a 2-pixel dilation process, to the reflection image, and generates three types of dilation images, namely, a 0-pixel dilated image, a 1-pixel dilated image, and a 2-pixel dilated image, as deformed images. The 0-pixel dilated image is the same as the original reflection image. The 1-pixel dilated image is an image in which the light portion of the original reflection image is dilated by one pixel by the 1-pixel dilation process. The 2-pixel dilated image is an image in which the light portion of the original reflection image is dilated by two pixels by the 2-pixel dilation process. The image deformation unit 202 outputs the generated three types of deformed images to the difference generation unit 203.
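The three dilation levels can be realized with flat structuring elements of growing size: a 0-pixel dilation corresponds to a 1×1 element (the identity), a 1-pixel dilation to a 3×3 element, and a 2-pixel dilation to a 5×5 element. The SciPy usage and square structuring elements in the sketch below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import grey_dilation

def make_dilated_images(img):
    """Return the 0-, 1-, and 2-pixel dilated images of img. An n-pixel
    dilation grows the light portion by n pixels in every direction,
    which a flat (2n+1) x (2n+1) structuring element achieves."""
    return [grey_dilation(img, size=(2 * n + 1, 2 * n + 1)) for n in range(3)]

img = np.zeros((7, 7), dtype=np.uint8)
img[3, 3] = 255                       # a single light pixel
d0, d1, d2 = make_dilated_images(img)
assert np.array_equal(d0, img)        # 0-pixel dilation is the identity
assert np.count_nonzero(d1) == 9      # light portion grown to 3x3
assert np.count_nonzero(d2) == 25     # light portion grown to 5x5
```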
Differences between the processing by the image deformation unit 102 of the inspection apparatus 10 according to the embodiment and the processing by the image deformation unit 202 of the inspection apparatus 20 according to the present modification will be described with reference to
As shown in
(Step S3)
At step S3, the difference generation unit 203 calculates a difference value between a pixel value of the transmission image output from the image acquisition unit 101 and a pixel value of each of the deformed images output from the image deformation unit 202. Since the three types of dilation images are output by the image deformation unit 202, the difference generation unit 203 calculates, for each pixel, three difference values, namely, a difference value between the 0-pixel dilated image and the transmission image, a difference value between the 1-pixel dilated image and the transmission image, and a difference value between the 2-pixel dilated image and the transmission image. The difference generation unit 203 outputs the three calculated difference values to the difference integration unit 104.
At this time, the difference generation unit 203 generates three inverted deformed images by executing the above-described inversion processing of performing light-dark inversion on each of the deformed images, executes correction processing on each of the inverted deformed images to make the ranges of pixel values (pixel value ranges) of each of the inverted deformed images and the transmission image match, and calculates a difference value using a pixel value of each of the corrected inverted deformed images and a pixel value of the transmission image. The correction processing of the present modification can be executed by using, for example, the formula obtained by replacing RL, RD, TL, and TD in the formula (1) with a representative pixel value of a light portion of the inverted deformed image, a representative pixel value of a dark portion of the inverted deformed image, a representative pixel value of a light portion of the transmission image, and a representative pixel value of a dark portion of the transmission image, respectively.
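The per-deformation processing at this step, i.e. inverting each deformed image, correcting its pixel value range to that of the transmission image, and taking the per-pixel difference, can be sketched as follows. The linear form of the correction stands in for formula (1), which is not reproduced here; the 8-bit inversion and the sample values are likewise assumptions.

```python
import numpy as np

def difference_values(deformed_images, trans, rl, rd, tl, td):
    """For each deformed reflection image: light-dark invert it, linearly
    map the representative light/dark values of the inverted image
    (rl, rd) onto those of the transmission image (tl, td), then take
    the per-pixel difference against the transmission image."""
    diffs = []
    for dimg in deformed_images:
        inv = 255.0 - dimg                        # light-dark inversion
        corr = (inv - rd) * (tl - td) / (rl - rd) + td
        diffs.append(corr - trans)                # difference per pixel
    return diffs

trans = np.array([[20.0, 210.0]])
deformed = [np.array([[235.0, 45.0]])]    # inverts to exactly trans
(d,) = difference_values(deformed, trans, rl=210.0, rd=20.0, tl=210.0, td=20.0)
assert np.allclose(d, 0.0)    # normal region: the difference stays small
```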
A case has been described, as an example, where deformed images of a reflection image are light-dark inverted through inversion processing, thereby correcting pixel values of each of the deformed images of the reflection image in accordance with those of the transmission image; however, such processing may instead be performed on the transmission image. For example, the transmission image may be light-dark inverted by applying inversion processing to it. Moreover, a pixel value of the transmission image may be corrected in accordance with that of a deformed image.
Furthermore, a reflection image may be light-dark inverted in advance, and then erosion processing may be performed on the generated image. Thereby, an image similar to the image obtained by light-dark inverting the deformed images after dilation processing can be generated.
(Step S4)
At step S4, the difference integration unit 104 calculates an integrated difference value by integrating a plurality of difference values output from the difference generation unit 203. Since a method of integrating difference values by the difference integration unit 104 is similar to that of the inspection apparatus 10 of the embodiment, a description thereof will be omitted. The difference integration unit 104 outputs the calculated integrated difference value to the anomaly detection unit 105.
(Step S5)
At step S5, the anomaly detection unit 105 performs threshold processing on the integrated difference value output from the difference integration unit 104, and performs determination as to whether an inspection target is normal or anomalous. Since a method of determination by the anomaly detection unit 105 is similar to that of the inspection apparatus 10 of the embodiment, a description thereof will be omitted.
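The threshold processing referred to above can be as simple as flagging pixels whose integrated difference value exceeds a fixed threshold; the threshold value and the sample values below are illustrative assumptions, not the embodiment's actual parameters.

```python
import numpy as np

def detect_anomaly(integrated, threshold=50.0):
    """Return a per-pixel anomaly map and an overall normal/anomalous
    verdict: a pixel is anomalous when its integrated difference value
    exceeds the threshold."""
    anomaly_map = integrated > threshold
    return anomaly_map, bool(anomaly_map.any())

integrated = np.array([[3.0, 180.0], [7.0, 2.0]])
amap, is_anomalous = detect_anomaly(integrated)
assert is_anomalous and amap.sum() == 1   # only the 180.0 pixel is flagged
```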
According to the inspection apparatus 20 of the present modification, an integrated difference value in the periphery of the pattern edge can be kept small, as shown in
(Third Modification)
A third modification will be described. In the present modification, the configurations of the embodiment are modified in the manner described below. Descriptions of configurations, operations, and effects similar to those of the embodiment will be omitted. In an inspection apparatus of the present modification, by combining integrated difference values obtained from both the inspection apparatus 10 of the embodiment and the inspection apparatus 20 of the second modification, a difference value in the periphery of the pattern edge can be made even smaller.
<Description of Apparatus>
The difference generation unit 103 calculates a plurality of difference values 1 between a pixel value of each deformed image of an image 1 and a pixel value of an image 2. The difference value 1 corresponds to a first difference value. Since processing of calculating the difference value 1 is similar to that of the inspection apparatus 10, a description thereof will be omitted. The difference generation unit 103 outputs the calculated difference value 1 to the difference integration unit 304.
The difference generation unit 203 calculates a plurality of difference values 2 between a pixel value of each deformed image of the image 2 and a pixel value of the image 1. The difference value 2 corresponds to a second difference value. Since processing of calculating the difference values is similar to that of the inspection apparatus 20, a description thereof will be omitted. The difference generation unit 203 outputs the calculated difference value 2 to the difference integration unit 304.
The difference integration unit 304 calculates an integrated difference value 1 by integrating a plurality of difference values 1 output from the difference generation unit 103, and calculates an integrated difference value 2 by integrating a plurality of difference values 2 output from the difference generation unit 203. The integrated difference value 1 corresponds to a first integrated difference value, and the integrated difference value 2 corresponds to a second integrated difference value. Since processing of integrating the difference values is similar to those of the inspection apparatuses 10 and 20, a description thereof will be omitted. The difference integration unit 304 calculates an integrated difference value 3 by integrating the integrated difference value 1 and the integrated difference value 2. The integrated difference value 3 corresponds to a third integrated difference value. For the calculation of the integrated difference value 3, a method similar to the method of calculating the integrated difference value 1 and the integrated difference value 2 can be used. For example, either the integrated difference value 1 or the integrated difference value 2, whichever is smaller, is selected as the integrated difference value 3. The difference integration unit 304 outputs the calculated integrated difference value 3 to the anomaly detection unit 105.
The anomaly detection unit 105 performs threshold processing on the integrated difference value 3 output from the difference integration unit 304, and performs determination as to whether an inspection target is normal or anomalous. Since a method of the determination by the anomaly detection unit 105 is similar to that of the inspection apparatus 10 or 20 of the embodiment, a description thereof will be omitted.
<Description of Operation>
Next, an operation of processing executed by the inspection apparatus 30 will be described.
(Step S1)
At step S1, the image acquisition unit 101 acquires a transmission image and a reflection image. After acquiring the transmission image and the reflection image, the image acquisition unit 101 outputs the transmission image to both of the image deformation unit 102 and the difference generation unit 203, and outputs the reflection image to both of the image deformation unit 202 and the difference generation unit 103.
(Step S2-1)
At step S2-1, the image deformation unit 102 applies three types of different dilation processes, namely, a 0-pixel dilation process, a 1-pixel dilation process, and a 2-pixel dilation process, to the transmission image output from the image acquisition unit 101, and generates three types of dilation images, namely, a 0-pixel dilated image, a 1-pixel dilated image, and a 2-pixel dilated image as deformed images, similarly to the inspection apparatus 10 of the embodiment. The image deformation unit 102 outputs the generated three types of deformed images to the difference generation unit 103.
(Step S3-1)
At step S3-1, the difference generation unit 103 calculates, for each pixel, three difference values, namely, a difference value between the 0-pixel dilated image and the reflection image, a difference value between the 1-pixel dilated image and the reflection image, and a difference value between the 2-pixel dilated image and the reflection image, as a difference value between the pixel value of the reflection image output from the image acquisition unit 101 and the pixel value of each deformed image output from the image deformation unit 102, similarly to the inspection apparatus 10 according to the embodiment. At this time, the difference generation unit 103 generates an inverted reflection image by executing inversion processing on the reflection image, executes the above-described correction processing on the inverted reflection image to make the ranges of pixel values (pixel value ranges) of each of the deformed images and the inverted reflection image match, and then calculates difference values using a pixel value of the corrected inverted reflection image and a pixel value of each of the deformed images. The difference generation unit 103 outputs the difference value 1 including the three calculated difference values to the difference integration unit 304.
(Step S2-2)
At step S2-2, the image deformation unit 202 applies three types of different dilation processes, namely, a 0-pixel dilation process, a 1-pixel dilation process, and a 2-pixel dilation process, to the reflection image output from the image acquisition unit 101, and generates three types of dilation images, namely, a 0-pixel dilated image, a 1-pixel dilated image, and a 2-pixel dilated image as deformed images, similarly to the inspection apparatus 20 of the second modification. The image deformation unit 202 outputs the generated three types of deformed images to the difference generation unit 203.
(Step S3-2)
At step S3-2, the difference generation unit 203 calculates, for each pixel, three difference values, namely, a difference value between the 0-pixel dilated image and the transmission image, a difference value between the 1-pixel dilated image and the transmission image, and a difference value between the 2-pixel dilated image and the transmission image, as a difference value between the pixel value of the transmission image output from the image acquisition unit 101 and the pixel value of each deformed image output from the image deformation unit 202, similarly to the inspection apparatus 20 according to the second modification. At this time, the difference generation unit 203 generates three inverted deformed images by executing inversion processing on each of the deformed images, executes the above-described correction processing on each of the inverted deformed images to make the ranges of pixel values (pixel value ranges) of each of the inverted deformed images and the transmission image match, and calculates difference values using pixel values of each of the corrected inverted deformed images and the transmission image. The difference generation unit 203 outputs the difference value 2 including the three calculated difference values to the difference integration unit 304.
Through the above-described processing, the difference value 1 and the difference value 2 each including three difference values are output to the difference integration unit 304, and therefore six different difference values are output to the difference integration unit 304.
(Step S4-1)
At step S4-1, the difference integration unit 304 calculates an integrated difference value 1 by integrating three difference values included in the difference value 1 output from the difference generation unit 103, similarly to the inspection apparatus 10 of the embodiment.
(Step S4-2)
At step S4-2, the difference integration unit 304 calculates an integrated difference value 2 by integrating three difference values included in the difference value 2 output from the difference generation unit 203, similarly to the inspection apparatus 20 of the second modification.
The integration processing at steps S4-1 and S4-2 is executed by, for example, selecting a smallest value from among the absolute values of the difference values. At this time, the above-described edge specification processing of specifying a position of a pattern edge and making an integrated difference value at the specified position zero may be executed. In this case, since the position of a pixel determined to be located at the pattern edge differs between the case where the deformation processing is applied to a transmission image and the case where the deformation processing is applied to a reflection image, it is preferable that edge specification processing be applied to both the integrated difference value 1 and the integrated difference value 2.
(Step S5)
At step S5, the difference integration unit 304 calculates an integrated difference value 3 by further integrating the integrated difference value 1 and the integrated difference value 2 obtained by the processing at steps S4-1 and S4-2. In this integration processing, either the integrated difference value 1 or the integrated difference value 2, whichever is smaller, is selected.
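This final integration keeps, per pixel, whichever of the two integrated difference values is smaller; since the preceding steps already take absolute values, an elementwise minimum suffices. The sample values below are illustrative assumptions.

```python
import numpy as np

# Integrated difference values from steps S4-1 and S4-2
# (illustrative values; both are already absolute-valued).
integrated1 = np.array([[ 0.0, 40.0, 120.0],
                        [30.0,  5.0, 110.0]])
integrated2 = np.array([[25.0,  0.0, 115.0],
                        [ 0.0, 60.0, 130.0]])

# Step S5: keep the smaller value per pixel, so a large difference
# survives only where BOTH deformation directions leave it large.
integrated3 = np.minimum(integrated1, integrated2)
```

In this sketch, pattern-edge pixels that are large in only one of the two integrated difference values are suppressed, while a pixel that is large in both (e.g. a genuine anomaly) remains large.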
Effects of integration by the difference integration unit 304 will be described with reference to
As shown in
If the above-described edge specification processing is not performed, the six difference values included in the difference value 1 and the difference value 2 may be integrated at once, instead of steps S4-1, S4-2, and S5. Even in this case, results similar to the case where the integrated difference value 1 and the integrated difference value 2 are integrated can be obtained.
(Step S6)
At step S6, the anomaly detection unit 105 performs threshold processing on the integrated difference value 3 output from the difference integration unit 304, and performs determination as to whether an inspection target is normal or anomalous. Since a method of determination by the anomaly detection unit 105 is similar to that of the inspection apparatus 10 of the embodiment, a description thereof will be omitted.
The embodiment described above provides an inspection method in which a difference value generated in the periphery of the pattern edge as a result of a pattern displacement is suppressed, thus allowing a large difference to occur only in a region in which an anomaly is present. As described in the second modification, a change of the image to which deformation processing is to be applied causes a change in the position at which a relatively large difference value occurs. In the inspection apparatus 30 according to the present modification, by further integrating the integrated difference value 1 obtained from the inspection apparatus 10 of the embodiment and the integrated difference value 2 obtained from the inspection apparatus 20, an integrated difference value in the periphery of the pattern edge can be made even smaller, as shown in
It is to be noted that the configuration of the inspection apparatus 30 may be suitably varied to achieve similar effects. For example, difference values between an image obtained by applying dilation processing to a transmission image generated by the image deformation unit 102 and an image obtained by applying dilation processing and inversion processing to the reflection image generated by the image deformation unit 202 in order may be calculated, and the calculated difference values may be integrated in a similar method.
(Fourth Modification)
The inspection apparatus 40 includes, as hardware, a central processing unit (CPU) 401, a random-access memory (RAM) 402, a program memory 403, an auxiliary storage device 404, and an input/output interface 405. The CPU 401 communicates with the RAM 402, the program memory 403, the auxiliary storage device 404, and the input/output interface 405 via a bus. That is, the inspection apparatus of each of the embodiments described above is realized by a computer with such a hardware configuration.
The CPU 401 is an example of a general-purpose processor. The RAM 402 is used by the CPU 401 as a working memory. The RAM 402 includes a volatile memory such as a synchronous dynamic random access memory (SDRAM). The program memory 403 stores a data analysis program for realizing components corresponding to each embodiment. The data analysis program may be a program for causing a computer to realize the functions of, for example, the image acquisition unit 101, the image deformation units 102 and 202, the difference generation units 103 and 203, the difference integration units 104 and 304, and the anomaly detection unit 105. A part of the auxiliary storage device 404 or a read-only memory (ROM), or a combination thereof is used as the program memory 403. The auxiliary storage device 404 stores data in a non-transitory manner. The auxiliary storage device 404 includes a nonvolatile memory such as a hard disk drive (HDD) or a solid-state drive (SSD).
The input/output interface 405 is an interface for connecting to another device. The input/output interface 405 is used for, for example, connection with a keyboard, a mouse, a database, and a display.
The data analysis program stored in the program memory 403 includes computer-executable instructions. When executed by the CPU 401, which is processing circuitry, the data analysis program (computer-executable instructions) causes the CPU 401 to execute predetermined processing. When executed by the CPU 401, the data analysis program causes the CPU 401 to execute a series of processing described with reference to
The data analysis program may be provided to the inspection apparatus 40, which is a computer, in a state of being stored in a computer-readable storage medium. In this case, the inspection apparatus 40 may further include, for example, a drive (not illustrated) configured to read data from the storage medium, and to acquire the data analysis program from the storage medium. Examples of the storage medium that may be suitably used include a magnetic disc, an optical disc (CD-ROM, CD-R, DVD-ROM, DVD-R, etc.), a magneto-optical disc (e.g., MO), and a semiconductor memory. The storage medium may be referred to as a non-transitory computer-readable storage medium. The data analysis program may instead be stored in a server on a communication network, in such a manner that the inspection apparatus 40 downloads the data analysis program from the server using the input/output interface 405.
The processing circuitry configured to execute the data analysis program is not limited to a general-purpose hardware processor such as the CPU 401, and a dedicated hardware processor such as an application-specific integrated circuit (ASIC) may be used. The term “processing circuitry” includes at least one general-purpose hardware processor, at least one dedicated hardware processor, or a combination of at least one general-purpose hardware processor and at least one dedicated hardware processor. In the example shown in
(Other Modifications)
A case has been described where a transmission image and a reflection image are compared to detect foreign matter adhering to a mask, which is an inspection target; however, the configuration of the present application can be similarly applied to an apparatus configured to compare a reference image and an inspection image to detect a defect of a mask. In this case, either the reference image or the inspection image is used as the image 1, and the other one is used as the image 2. Deformed images obtained by applying deformation processing such as enlargement processing, reduction processing, parallel movement processing, and rotation processing to at least one of the reference image and the inspection image are generated.
For the deformation processing, any type of deformation processing may be used, according to the characteristics of the images 1 and 2. For example, if a transmission image and a reflection image are used as the images 1 and 2, as in the above-described embodiment, dilation processing or erosion processing is used as the deformation processing, since, in general, a transmission image and a reflection image are equal in optical magnification and are already aligned in position. If the optical magnifications of the images 1 and 2 are different, enlargement processing or reduction processing is used as the deformation processing. If the positions of the images 1 and 2 are displaced from one another, parallel movement processing is used as the deformation processing. If one of the images 1 and 2 is rotated, rotation processing is used as the deformation processing. The deformation processing may be processing in which the above-described multiple types of processing are combined.
In this manner, even if a position gap occurs between two images used for a comparison inspection as a result of the characteristics of the images or the method of their acquisition, occurrence of a difference due to the effect of the position gap can be suppressed, and the precision of anomaly determination using the difference can be improved, by generating a plurality of deformed images through a type of deformation processing suited to the characteristics of the comparison images, calculating a plurality of difference values (pixel value differences) using the deformed images, and integrating the calculated difference values.
Thus, according to one of the embodiments described above, it is possible to provide an inspection apparatus, an inspection method, and a program capable of detecting only anomalies such as defects and foreign matter, while permitting a pattern displacement between the images to be compared, using only simple image processing.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2022-122623 | Aug 2022 | JP | national |