The present invention relates to inspection that compares an inspection object image (detected image), obtained by using light, laser or an electron beam, with a reference image and detects a fine pattern defect, extraneous material and the like based on the result of the comparison, and more particularly, to a defect inspecting apparatus, and a method for the apparatus, suitable for visual examination of a semiconductor wafer, a TFT, a photo mask and the like.
As a conventional technique of defect detection by comparing a detected image with a reference image, a method disclosed in Japanese Published Unexamined Patent Application No. Hei 05-264467 (Patent Reference 1) is known. In this method, image pickup is sequentially performed with a line sensor on an object to be inspected on which a repetitive pattern is regularly arrayed, the obtained image is compared with an image time-delayed by the pitch of the repetitive pattern, and a mismatching part is detected as a defect.
In a semiconductor wafer as an object to be inspected, even between adjacent chips, a slight film thickness difference occurs in patterns due to flattening by CMP or the like, and a local brightness difference (luminance difference) occurs in the images between the chips. When a part where the luminance difference is equal to or higher than a predetermined threshold value th is determined to be a defect, as in the conventional method, such an area where a brightness difference occurs due to the film thickness difference is also detected as a defect. This should not be detected as a defect, i.e., such detection is a false alarm. Conventionally, one method to avoid false alarms is to set the threshold value for defect detection to a high value. However, this degrades the sensitivity, and a defect having a difference value equal to or lower than the threshold value cannot be detected.
Further, the brightness difference due to film thickness difference may occur only between particular chips among the arrayed chips in a wafer, or may occur only in a particular pattern within a chip. When the threshold value is set to accommodate these local areas, the overall inspection sensitivity is seriously lowered. Further, requiring the user to set the threshold value for each local area in accordance with its brightness difference is undesirable since the operation becomes complicated.
Further, a factor that impairs sensitivity is the brightness difference between chips due to variation of pattern film thickness. In the conventional comparative inspection based on brightness, this brightness variation acts as noise during inspection.
On the other hand, there are various types of defects, and they are broadly classified into defects which do not necessarily have to be detected (regarded as normal pattern noise) and defects which should be detected. In the present application, a part which is not a defect but has been erroneously detected as one (a false report), normal pattern noise and the like will be referred to as a non-defect. In visual examination, it is necessary to extract only the defects desired by a user from a large number of defects. However, it is difficult to realize such extraction by the above-described comparison between the luminance difference and the threshold value. Further, because factors depending on the object to be inspected, such as material quality, surface roughness, size and depth, combine with factors depending on the detection system, such as an illumination condition, the appearance of a defect often differs by type, and it is difficult to set conditions that extract only a desired defect.
The purpose of the present invention is to provide a defect inspecting apparatus and a defect inspecting method by which a defect that a user desires to detect, but that is hidden in noise or among unnecessarily detected defects, can be detected with high sensitivity and at high speed without requiring complicated threshold setting.
To attain the above-described purpose, the present invention provides a defect inspecting apparatus including: an illumination optical system to illuminate an object to be inspected on a predetermined optical condition; a detection optical system to detect scattered light from the object to be inspected, illuminated on the predetermined optical condition by the illumination optical system, on a predetermined detection condition, to obtain image data; and an image processing section having a feature calculating section to calculate a feature from inputted design data of the object to be inspected, a defect candidate detecting section to detect a defect candidate using image data in a corresponding position on the object to be inspected obtained by the detection optical system and the feature calculated by the feature calculating section, and a defect extracting section to extract a highly critical defect from the defect candidates detected by the defect candidate detecting section based on the feature of the design data calculated by the feature calculating section.
Further, in the present invention, the image data used in the defect candidate detecting section is a plurality of image data pieces obtained by the detection optical system on different optical conditions or on different image data acquisition conditions. Further, in the present invention, in the defect candidate detecting section, a plurality of different defect candidate detection processes are performed in parallel in correspondence with the shape of the pattern formed on the object to be inspected. Further, in the present invention, in the defect candidate detecting section, any one of the plurality of defect candidate detection processes is applied to each area of the image data obtained by the detection optical system, in correspondence with the shape of the pattern formed on the object to be inspected as extracted from the design data of the object to be inspected.
Further, the present invention provides a defect inspecting apparatus including: an illumination optical system to illuminate an object to be inspected on a predetermined optical condition; a detection optical system to detect scattered light from the object to be inspected, illuminated on the predetermined optical condition by the illumination optical system, on a predetermined detection condition, to obtain image data; and an image processing section having: a feature calculating section to calculate a feature from inputted design data of the object to be inspected and calculate feature quantities from a plurality of image data pieces obtained by the detection optical system on different optical conditions or on different image data acquisition conditions, a defect candidate detecting section to perform integration processing between the feature from the design data calculated by the feature calculating section and the feature quantities from the plurality of image data pieces to detect a defect candidate, and a defect extracting section to extract a highly critical defect from the defect candidates detected by the defect candidate detecting section based on the feature of the design data calculated by the feature calculating section.
Further, in the present invention, in the defect candidate detecting section, the integration processing between the feature from the design data and the feature quantities from the plurality of image data pieces is performed by determining a corresponding point from the design data.
Further, the present invention provides a defect inspecting apparatus including: an illumination optical system to illuminate an object to be inspected on a predetermined optical condition; a detection optical system to detect scattered light from the object to be inspected, illuminated on the predetermined optical condition by the illumination optical system, on a predetermined detection condition, to obtain image data; and an image processing section having: a feature calculating section to calculate a feature from inputted design data of the object to be inspected and calculate feature quantities from a plurality of image data pieces obtained by the detection optical system on different optical conditions or on different image data acquisition conditions, a defect candidate detecting section to perform integration processing between the feature from the design data in a corresponding position on the object to be inspected calculated by the feature calculating section and the feature quantities from the plurality of image data pieces to detect a defect candidate, and a defect extracting section to extract a highly critical defect from the defect candidates detected by the defect candidate detecting section based on the feature of the design data calculated by the feature calculating section.
Further, in the present invention, the defect inspecting apparatus further includes a simulator to calculate, by simulation, the image data obtained by irradiating the object to be inspected on a predetermined optical condition and detecting scattered light from the object to be inspected on a predetermined detection condition. The defect candidate detecting section establishes correspondence in the integration processing between the feature from the design data and the feature quantities from the plurality of image data pieces based on the result of the simulation by the simulator. Further, in the present invention, the simulator uses the design data in the simulation of the image data obtained from the object to be inspected.
According to the present invention, it is possible to detect a critical defect with high sensitivity, without complicated setting, by utilizing design data.
Embodiments of a defect inspecting apparatus and of a method for the apparatus according to the present invention will be described below with reference to the drawings.
The scattered light 3a and the scattered light 3b represent the scattered light distributions caused in correspondence with the respective illuminating sections 15a and 15b. When the optical condition of the illumination light of the illuminating section 15a differs from that of the illuminating section 15b, the scattered light 3a and the scattered light 3b caused by the respective illuminating sections also differ from each other. In the present specification, the optical characteristics and features of the scattered light caused by given illumination light will be referred to as the scattered light distribution of that scattered light. More particularly, the scattered light distribution means the distribution of optical parameter values such as intensity, amplitude, phase, polarization, wavelength, coherency and the like with respect to the emitted position, emitted direction and emitted angle of the scattered light.
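As a purely illustrative data structure (the field names below are assumptions made for this sketch, not terms from the present specification), one sample of such a scattered light distribution could be represented in Python as follows:

from dataclasses import dataclass

@dataclass
class ScatteredRaySample:
    # one sample of the scattered light distribution: optical parameter
    # values recorded against the emitted position, direction and angle
    position: tuple       # (x, y) emission position on the object
    direction: tuple      # (azimuth, elevation) of the emitted ray
    angle: float          # emission angle
    intensity: float
    amplitude: float
    phase: float
    polarization: str
    wavelength: float
    coherency: float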
As the illumination light sources of the illuminating sections 15a and 15b, lasers or lamps may be used. The light emitted from each illumination light source may have a short wavelength or may be broad-band light (white light). When a short-wavelength light source is used, light in the ultraviolet region (UV light) may be employed to increase the resolution of the detected image (to detect a fine defect). When a single-wavelength laser is used as the light source, the illuminating sections 15a and 15b may each be provided with a section (not shown) to reduce coherency.
The optical path of the scattered light from the semiconductor wafer 11 is branched; one branch is converted into an image signal by the detecting section 17 via the detection optical system 16, and the other branch is converted into an image signal by the detecting section 131 via the detection optical system 130.
In the detecting sections 17 and 131, a time delay integration (TDI) image sensor, in which plural one-dimensional image sensors are arrayed two-dimensionally, is employed as the image sensor. By transferring the signal detected by each one-dimensional image sensor of the TDI image sensor to the one-dimensional image sensor of the next stage and adding it there in synchronization with the movement of the X-Y-Z-θ stage 12, a two-dimensional image is obtained at comparatively high speed and with high sensitivity. By using a parallel-output type TDI image sensor having plural output taps, the outputs from the detecting sections 17 and 131 can be processed in parallel, and detection can be performed at a higher speed.
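To illustrate the shift-and-add operation described above, the following minimal Python sketch simulates TDI readout; the array layout, function name and the assumption that the scene advances by exactly one stage per exposure are illustrative choices, not a specification of the sensor used in the apparatus.

import numpy as np

def tdi_readout(exposures):
    # exposures[t, s, :] is the charge collected by sensor stage s during
    # exposure t (shape: n_exposures x n_stages x n_pixels)
    n_exposures, n_stages, n_pixels = exposures.shape
    registers = np.zeros((n_stages, n_pixels))
    output_lines = []
    for t in range(n_exposures):
        registers += exposures[t]                   # integrate this exposure
        output_lines.append(registers[-1].copy())   # last stage is read out
        registers = np.roll(registers, 1, axis=0)   # shift charge by one stage
        registers[0] = 0.0                          # first stage starts empty
    # each fully accumulated output line is the sum of n_stages exposures of
    # the same scene line, which raises the signal-to-noise ratio
    return np.array(output_lines)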
The image processing section 18 extracts defects on the semiconductor wafer 11 by processing the signals output from the detecting sections 17 and 131. The image processing section 18 includes a preprocessing section 18-1 to perform image correction such as shading correction and dark level correction on the image signals inputted from the detecting sections 17 and 131 and to divide the corrected images into images of a predetermined unit size, a defect candidate detecting section 18-2 to detect defect candidates from the corrected and divided images, a defect extracting section 18-3 to extract critical defects, excluding user-designated unnecessary defects and noise, from the detected defect candidates, a defect classification section 18-4 to classify the extracted critical defects in accordance with defect type, and a parameter setting section (teaching data setting section) 18-5 to receive an externally input parameter or the like and set it in the defect candidate detecting section 18-2 and the defect extracting section 18-3. In the image processing section 18, e.g. the parameter setting section 18-5 is connected to a database 1102.
The overall control section 19, which has a CPU to perform various kinds of control, is connected to a user interface section (GUI section) 19-1, having a display section and an input section, to receive parameters and the like from the user and to display a detected defect candidate image, an image of a finally-extracted defect and the like, and to a storage device 19-2 to hold the feature quantities of the defect candidates detected by the image processing section 18, images and the like. The mechanical controller 13 drives the X-Y-Z-θ stage 12 based on a control command from the overall control section 19. Note that the image processing section 18, the detection optical systems 16 and 130 and the like are also driven based on commands from the overall control section 19.
Note that in the present invention, in addition to the image signals as scattered light images of the semiconductor wafer 11, the design data 30 of the semiconductor wafer 11 is also inputted into the image processing section 18. In the image processing section 18, the design data is integrated with the two image signals to perform the defect extraction processing. On the semiconductor wafer 11 as the object to be inspected, a large number of chips with the same pattern, each having a memory mat part and a peripheral circuit part, are regularly arrayed. The overall control section 19 continuously moves the semiconductor wafer 11 with the X-Y-Z-θ stage 12 and, in synchronization with this movement, sequentially inputs chip images from the detecting sections 17 and 131. Then, with respect to the obtained two types of scattered light images, the overall control section 19 compares images in the same position in the regularly arrayed chips with the image feature from the design data 30 in the corresponding position to extract defects.
Assuming that a chip n is the inspection object chip, numerals 41a, 42a, . . . , 46a denote divided images obtained by dividing the image of the chip n obtained from the detecting section 17 into six. Further, numerals 31a, 32a, . . . , 36a denote divided images obtained by dividing the image of an adjacent chip m obtained from the detecting section 17 into six, as in the case of the chip n. These divided images obtained from the same detecting section 17 are illustrated as vertical-striped images.
On the other hand, numerals 41b, 42b, . . . , 46b denote divided images similarly obtained by dividing the image of the chip n obtained from the detecting section 131 into six. Further, numerals 31b, 32b, . . . , 36b denote divided images similarly obtained by dividing the image of the adjacent chip m obtained from the detecting section 131 into six. These divided images obtained from the same detecting section 131 are illustrated as vertical-striped images. Further, numerals 1d, 2d, . . . , 6d denote data in positions corresponding to the six divided images with respect to the design data 30.
In the present invention, the images from the two detecting systems and the design data inputted into the image processing section 18 are divided such that all the data pieces correspond to the same positions on the chips. The defect inspecting apparatus according to the present invention converts the design data 30 into image features to be described later. The image processing section 18 has plural processors which operate in parallel. The corresponding images (e.g., the corresponding divided images 41a; 41b of the chip n obtained by the detecting sections 17 and 131 and the corresponding divided images 31a; 31b of the chip m) and the corresponding design data (1d) are inputted into the same processor 1, and the defect extraction processing is performed. For another corresponding position, the divided images (42a; 42b) of the chip n obtained from the different detecting sections 17; 131, the corresponding divided images (32a; 32b) of the adjacent chip m and the corresponding design data (2d) are inputted into the processor 2, and the defect extraction processing is performed in parallel with the processor 1.
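A minimal sketch of this parallel assignment of divided-image sets to processors is given below in Python; the worker body is a placeholder, and the function names and data layout are assumptions for illustration rather than part of the described apparatus.

from multiprocessing import Pool

def inspect_one_division(division):
    # placeholder worker: in the actual apparatus this would run the defect
    # candidate detection and critical-defect extraction for one set
    det_a, det_b, ref_a, ref_b, design_piece = division
    return []   # dummy result (list of detected defect candidates)

def inspect_chip(divided_sets, n_processors=6):
    # divided_sets holds one tuple per divided area, e.g.
    # (41a, 41b, 31a, 31b, 1d), (42a, 42b, 32a, 32b, 2d), ...
    with Pool(n_processors) as pool:
        return pool.map(inspect_one_division, divided_sets)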
Next, the flow of processing in, e.g., the defect candidate detecting section 18-2 of the image processing section 18 will be described for the case where the head divided images 41a; 41b of the chip n obtained by the two different detecting sections 17; 131 are processed.
As described above, the defect candidate detecting processing and the defect extraction processing (critical defect extraction processing) are performed by plural processors in parallel. The detected images (41a; 41b) in the same position obtained by the different detecting sections 17; 131, the corresponding reference images (31a; 31b) and the design data (1d) are inputted as a set into each processor, and the defect candidate detecting processing and the defect extraction processing (critical defect extraction processing) are performed.
On the semiconductor wafer 11, the same pattern is regularly formed as described above. Although the detected image 41a and the reference image 31a should therefore be identical, there is a large brightness difference between the images due to the difference of film thickness between the chips in the wafer 11 having a multi-layer film. Further, since the image acquisition position is shifted between the chips due to vibration during stage scanning or the like, in the image processing section 18, e.g. the preprocessing section 18-1 first performs correction for these shifts. First, the brightness shift between the detected image 41a and the reference image 31a obtained by the detecting section 17 is detected and corrected (step 501a). Next, the positional shift between the images is detected and corrected (step 502a). Similarly, the brightness shift between the detected image 41b and the reference image 31b obtained by the detecting section 131 is detected and corrected (step 501b). Next, the positional shift between the images is detected and corrected (step 502b).
Generally, the positional shift amount detection and correction process (step 502a and step 502b) detects the shift amount between the detected image and the reference image and aligns one image with the other before the comparison.
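The description does not prescribe a specific correction algorithm; the following Python sketch shows one common way to realize steps 501a/501b and 502a/502b, assuming a linear gain/offset model for the brightness correction and an integer-pixel sum-of-squared-differences search for the positional shift (both assumptions made for illustration only).

import numpy as np

def correct_brightness(det, ref):
    # fit ref ≈ a*det + b by least squares and return the brightness-corrected
    # detected image (linear gain/offset model assumed for illustration)
    a, b = np.polyfit(det.ravel().astype(float), ref.ravel().astype(float), 1)
    return a * det + b

def detect_shift(det, ref, search=3):
    # exhaustive integer-pixel search minimizing the sum of squared
    # differences; sub-pixel estimation is omitted in this sketch
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(det, (dy, dx), axis=(0, 1))
            err = np.sum((shifted.astype(float) - ref) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best   # (dy, dx) to be applied to det for alignment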
Next, with respect to each object pixel of the detected image 41a subjected to the brightness correction and the positional correction, feature quantities are calculated between it and the corresponding pixel of the reference image 31a (step 503a). Similarly, feature quantities are calculated between the detected image 41b and the reference image 31b (step 503b). Further, when the images obtained by the detecting sections 17 and 131 have been obtained sequentially, the positional shift amount between the detected image 41a and the detected image 41b is calculated in the same manner (step 504). Then, in view of the positional relation between the images obtained by the detecting sections 17 and 131, all or some of the feature quantities of the object pixel are selected, and a feature space is formed (step 505). Any quantity may be used as a feature quantity as long as it indicates a feature of the pixel. Examples include (1) contrast, (2) shade difference, (3) brightness dispersion value of neighboring pixels, (4) correlation coefficient, (5) brightness increase/decrease with respect to neighboring pixels, and (6) second-derivative value. As examples of these feature quantities, assuming that the brightness of each point of a detected image is f(x, y) and the brightness of the corresponding reference image is g(x, y), the feature quantities are calculated from each set of images (41a and 31a, and 41b and 31b) with the following expressions.
contrast: max{f(x,y), f(x+1,y), f(x,y+1), f(x+1,y+1)}−min{f(x,y), f(x+1,y), f(x,y+1), f(x+1,y+1)} (6)
shade difference: f(x,y)−g(x,y) (7)
dispersion: [Σ{f(x+i,y+j)^2}−{Σf(x+i,y+j)}^2/M]/(M−1), i,j=−1,0,1, M=9 (8)
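A minimal Python sketch of expressions (6) to (8) follows; the array indexing convention (row = y, column = x) and the function names are illustrative assumptions.

import numpy as np

def contrast(f, x, y):
    # expression (6): max minus min of the 2x2 block at (x, y)
    block = [f[y, x], f[y, x + 1], f[y + 1, x], f[y + 1, x + 1]]
    return max(block) - min(block)

def shade_difference(f, g, x, y):
    # expression (7): difference between detected and reference brightness
    return float(f[y, x]) - float(g[y, x])

def neighbour_dispersion(f, x, y):
    # expression (8): sample variance of the 3x3 neighbourhood (M = 9)
    w = f[y - 1:y + 2, x - 1:x + 2].astype(float)
    M = w.size
    return (np.sum(w ** 2) - np.sum(w) ** 2 / M) / (M - 1)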
In addition, the brightness itself of each image (the detected image 41a, the reference image 31a, the detected image 41b and the reference image 31b) may be used as a feature quantity. Further, it may be arranged such that integration processing is performed on the images of the respective detecting systems, and the feature quantities (1) to (6) are obtained from, e.g., the average of the detected image 41a and the detected image 41b and the average of the reference image 31a and the reference image 31b. Hereinbelow, an embodiment will be described in which the brightness average Ba calculated from the detected image 41a and the reference image 31a and the brightness average Bb calculated from the detected image 41b and the reference image 31b are selected as the feature quantities. When the positional shift of the detected image 41b with respect to the detected image 41a is (x1, y1), the feature quantity calculated from the output of the detecting section 131 that corresponds to the feature quantity Ba(x, y) of each pixel (x, y), calculated from the output of the detecting section 17, is Bb(x+x1, y+y1). Accordingly, the feature space is generated by plotting all the pixel values in a two-dimensional space with the X value as Ba(x, y) and the Y value as Bb(x+x1, y+y1). Then, in the two-dimensional space, a threshold value plane is calculated (step 506), and a pixel outside the threshold value plane, i.e., a pixel whose features deviate, is detected as a defect candidate (step 507). Note that the feature space at step 505 has been described as a two-dimensional space; however, it may be a multi-dimensional feature space with some or all of the features as axes.
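The following sketch illustrates steps 505 to 507 under two stated assumptions: the feature pair is (Ba, Bb) as above, and the threshold value plane, whose exact form the description leaves open, is replaced for illustration by a simple Mahalanobis-distance cut around the cloud of pixel values.

import numpy as np

def defect_candidates(det_a, ref_a, det_b, ref_b, shift, k=3.0):
    # shift = (x1, y1): positional shift of det_b with respect to det_a
    x1, y1 = shift
    Ba = (det_a.astype(float) + ref_a) / 2.0
    Bb = (det_b.astype(float) + ref_b) / 2.0
    Bb = np.roll(Bb, (-y1, -x1), axis=(0, 1))         # align Bb(x+x1, y+y1) to (x, y)
    pts = np.stack([Ba.ravel(), Bb.ravel()], axis=1)  # two-dimensional feature space
    mean = pts.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pts, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', pts - mean, cov_inv, pts - mean)
    return (d2 > k ** 2).reshape(Ba.shape)            # deviating pixels = candidates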
Further, in the present invention, the design data 1d of the area corresponding to the detected image is also inputted into the same processor. The input design data 1d is first converted into an image feature (image feature quantity) so that it can be handled in the same manner as the feature quantities calculated from the above-described images (step 508).
Next, an embodiment in which the design data 1d is converted into an image feature (image feature quantity) at step 508 will be described.
In one example of the feature conversion 83 (conversion into multi-valued data), the binary design data 30 (1d) is converted into a luminance value according to the density or the line width of the wiring pattern, which varies in accordance with the subject process and is obtained as the inspection information 81. An area where the wiring pattern is sparse is converted into a low luminance value (black), and an area where the wiring pattern is dense is converted into a high luminance value (white). Since the density or line width of the wiring pattern differs in accordance with the subject process for the inspection object wafer, the feature conversion (step 508) reflecting the inspection conditions corresponding to the inspection information 81 is performed. That is, in an area where the wiring pattern is sparse, a short circuit is unlikely even with a comparatively large foreign material or particle, and so a defect candidate is detected there with a lowered sensitivity.
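As a minimal sketch of this conversion (the window size, the 8-bit scaling and the use of scipy's uniform_filter are illustrative assumptions), the local wiring density of the binary design data can be mapped to a multi-valued luminance feature as follows:

import numpy as np
from scipy.ndimage import uniform_filter

def density_feature(design_binary, window=15):
    # local density of the wiring pattern (1 = wiring, 0 = background),
    # mapped to luminance: sparse area -> dark, dense area -> bright
    dens = uniform_filter(design_binary.astype(float), size=window)
    return np.clip(dens * 255.0, 0.0, 255.0).astype(np.uint8)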
In another example of the feature conversion 84 (conversion into multi-valued data), the probability that noise (a luminescent spot) occurs as scattered light from a pattern corner, the edge of a thick wiring pattern or the like in the binary design data 30 (1d) is converted into a luminance value in correspondence with the optical conditions (illumination conditions) included in the inspection information 81. A part where the noise occurrence probability is high is converted into a high luminance value (white). Note that a pattern corner or the edge of a thick wiring pattern can be a point where the probability of noise appearing as a luminescent spot (high luminance) is high, depending on the optical condition (illumination condition), even when it is not a defect.
In this manner, the defect candidate detecting section 18-2 performs the integration processing between the image features 83 and 84, obtained by converting the design data 30 (1d) into multi-valued data in correspondence with the inspection information 81, and the image features 85 obtained from the detecting sections 17 and 131, to perform the defect candidate detection processing (step 505). Numeral 85 denotes an example of a feature quantity calculated through the feature quantity calculation processing (step 503a, step 503b and step 504) from the input images 41a, 31a, 41b and 31b.
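One possible realization of this integration, shown purely as a sketch (the weighting rule and parameter values are assumptions, not taken from the description), is to modulate a per-pixel detection threshold with the design-data features 83 and 84: the threshold is raised where the wiring is sparse and where bright scattering noise is expected.

import numpy as np

def per_pixel_threshold(density_feat, noise_feat, base=10.0,
                        w_density=0.05, w_noise=0.10):
    # higher threshold (lower sensitivity) in sparse-wiring areas (83) and
    # in areas where luminescent-spot noise is likely (84)
    sparseness = 255.0 - density_feat.astype(float)
    return base + w_density * sparseness + w_noise * noise_feat.astype(float)

def detect_with_design(shade_diff, density_feat, noise_feat):
    # a pixel becomes a defect candidate when its shade difference exceeds
    # the locally adapted threshold
    return np.abs(shade_diff) > per_pixel_threshold(density_feat, noise_feat)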
Next, the processing in the defect extracting section 18-3 to extract only the defects necessary for the user from the defect candidates 94 detected by the defect candidate detecting section 18-2 will be described.
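As a purely hypothetical sketch of such extraction (the scoring rule, field names and threshold below are assumptions, not the procedure defined in this description), a critical level could be estimated for each candidate from its size and the local wiring density taken from the design-data feature, and only highly critical candidates retained:

def critical_level(size_nm, local_density):
    # treat the same defect size as more critical where the wiring is dense
    # (higher short-circuit risk) than where it is sparse
    return size_nm * (local_density / 255.0)

def extract_critical(candidates, level=20.0):
    # candidates: list of dicts such as {'size_nm': 60, 'density': 180}
    return [c for c in candidates
            if critical_level(c['size_nm'], c['density']) >= level]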
As described above, in the present invention, the design data is converted into an image feature having a multi-level value, i.e., a binary or higher-level value, and the image feature and the features calculated from the images are integrated at the respective stages of the defect determination processing (the defect candidate detecting section, the defect extracting section and the like). By this processing, it is possible to discriminate noise from defects and, by performing defect critical level estimation, to detect a highly critical defect buried in noise and unnecessary defects.
Further, in the present invention, in the integration of the images obtained on different optical conditions (step 505), the views of the same pattern differ between the images, so that corresponding points between the images cannot easily be determined from the images alone.
Accordingly, in the present invention, e.g. the image processing section 18 determines corresponding points in images with different views using the design data 30.
Accordingly, in the present invention, when the design data 30 of the corresponding area and the inspection information 81 on the semiconductor wafer as the object to be inspected, such as the subject process and the inspection conditions, are inputted, the image processing section 18, for example, estimates the images on the respective inspection conditions (here, two inspection conditions) from the design data and calculates corresponding points, i.e., spots where the scattered light is obtained in common between the conditions (1101).
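A minimal sketch of this corresponding-point calculation (1101) is given below, assuming that the images estimated from the design data for the two inspection conditions are available as arrays and that a simple brightness threshold (an illustrative choice) identifies the spots where scattered light is expected under both conditions:

import numpy as np

def common_scatter_points(est_img_cond1, est_img_cond2, thresh=0.5):
    # positions where scattered light is expected in common under both
    # inspection conditions; these serve as corresponding points for the
    # positional alignment of the actually detected images
    mask = (est_img_cond1 > thresh) & (est_img_cond2 > thresh)
    return np.argwhere(mask)   # array of (y, x) coordinates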
In this embodiment, the corresponding points between the images are registered in the database 1102. When the design data 30 and the inspection condition 81 are inputted, the corresponding points matching that data and condition are retrieved from the database 1102.
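A minimal sketch of this registration and retrieval, assuming an in-memory mapping keyed by a design-data identifier and the inspection condition (the key structure is an illustrative assumption, not the actual database 1102):

corresponding_point_db = {}   # stands in for the database 1102

def register_corresponding_points(design_id, condition, points):
    corresponding_point_db[(design_id, condition)] = points

def retrieve_corresponding_points(design_id, condition):
    return corresponding_point_db.get((design_id, condition))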
In this embodiment, the estimated image 1201 is registered in the database 1102 in advance. When the design data 30 and the inspection condition 81 are inputted, the estimated image corresponding to that data is retrieved from the database 1102.
As described above, according to the defect inspecting apparatus of the present invention, the plural images 41a; 41b and 31a; 31b with different views, obtained with plural detecting systems, plural optical conditions and the like, and the corresponding design data 1d are inputted into the image processing section 18. The image processing section 18 extracts plural features corresponding to the inspection information 81 from the design data 1d and obtains multi-valued image features. The image processing section 18 then enables high-sensitivity detection of the defect candidates 94 using the feature quantities 85 calculated from the images 41a; 41b and 31a; 31b and the multi-valued image features 83 and 84 extracted from the design data 1d. Further, the image processing section 18 performs critical level determination on the detected defect candidates 94 by using the design data 1d (83), and marks out the highly critical defects from the large number of non-critical defects. Further, the image processing section 18 performs positioning of the corresponding points, obtained from the design data, for the positional shift detection among the plural images with different views, and performs the integration processing on the feature quantities to detect the defect candidates 94. The detection of the defect candidates 94 is performed based on an optimum defect determination mode which differs by area. Note that the high-sensitivity inspection is realized without complicated operations and settings by the user, by obtaining the pattern layout information in the chip from the design data and automatically setting the optimum mode in correspondence with the feature.
Even when there is a slight difference of pattern film thickness after a flattening process such as CMP, or a large brightness difference between compared chips due to a shortened wavelength of the illumination light, detection of defects 20 nm to 90 nm in size is realized by the present invention.
Further, in inspection of a low-k film including inorganic insulating films such as an SiO2 film, an SiOF film, a BSG film, an SiOB film and a porous silica film, and organic insulating films such as a methyl SiO2 film, an MSQ film, a polyimide film, a parylene film, a Teflon (registered trademark) film and an amorphous carbon film, even when there is a local brightness difference due to variation of the refractive index distribution within the film, a defect 20 nm to 90 nm in size can be detected according to the embodiments of the present invention.
As described above, the embodiments of the present invention have been explained using examples of comparative inspection of images in a dark field inspecting apparatus handling a semiconductor wafer as the object. However, the present invention is also applicable to comparative images in electron beam pattern inspection. Further, the present invention is also applicable to a bright field illumination pattern inspecting apparatus.
The object to be inspected is not limited to a semiconductor wafer; the present invention is applicable to, e.g., a TFT substrate, a photo mask, a printed circuit board or the like, as long as defects on it are inspected by image comparison.
2 . . . memory, 3a, 3b . . . scattered light, 11 . . . semiconductor wafer, 12 . . . X-Y-Z-θ stage, 13 . . . mechanical controller, 15a, 15b . . . illumination section, 16 . . . detection optical system, 17, 131 . . . detecting section, 18 . . . image processing section, 18-1 . . . preprocessing section, 18-2 . . . defect candidate detecting section, 18-3 . . . defect extracting section, 18-4 . . . defect classification section, 18-5 . . . parameter setting section (teaching data setting section), 19-1 . . . user interface section, 19-2 . . . storage device, 19 . . . overall control section, 30 . . . design data, 1d-6d . . . design data, 41a-46a and 41b-46b . . . detected image, 31a-36a and 31b-36b . . . reference image, 81 . . . inspection information, 83 and 84 . . . design data image feature, 85 . . . defect candidate indicated with a difference between detected image and reference image, 86, 87 and 88 . . . detected image including a defect candidate obtained by cutting the periphery of defect candidate 85, 94 . . . defect candidate, 101 . . . size information of each defect candidate, 102 . . . critical level distribution, and 1102 . . . database.
Number | Date | Country | Kind |
---|---|---|---|
2009-015282 | Jan 2009 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2009/006767 | 12/10/2009 | WO | 00 | 9/7/2011 |