The present invention relates to a pattern matching device and a computer program, and more particularly to a pattern matching device and a computer program which conduct pattern matching on an image including a plurality of feature regions within the image with the use of a template formed on the basis of design data of a semiconductor device and a picked-up image.
In a device that measures and inspects a pattern formed on a semiconductor wafer, the view field of the inspection device is adjusted to a desired measurement position through a template matching technique (Nonpatent Literature 1). Patent Literature 1 discloses an example of the above template matching method. Template matching represents processing of finding, from an image to be searched, a region that best matches a template image registered in advance. As an example of the inspection device using template matching, there is a device that measures the pattern on the semiconductor wafer with the use of a scanning electron microscope.
In this device, the view field of the device travels to the rough vicinity of the measurement position by the movement of a stage. However, with only the positioning precision of the stage, a large deviation is frequently produced on the image picked up by a high-power electron microscope.
Also, the wafer is not always placed on the stage in the same direction every time, and the coordinate system of the wafer placed on the stage (for example, a direction along which chips of the wafer are aligned) is not completely aligned with a driving direction of the stage, which also causes a deviation on the image picked up by the high-power electron microscope. Further, in order to obtain the image of the high-power electron microscope at a desired observation position, the electron beam may be deflected by a fine amount (for example, several tens of μm or less) so as to irradiate a target position on an observation specimen (so-called "beam shift"). However, even with the beam shift, the irradiated position may deviate from the desired observation position with only the precision of the deflection control of the beam. In order to correct the above respective deviations and to conduct the measurement and inspection at an accurate position, template matching is conducted.
Specifically, alignment is first conducted with an optical camera lower in power than the electron microscope image, and alignment is then conducted on the electron microscope image; thus, alignment is conducted in multiple stages. For example, when the alignment of the coordinate system of the wafer placed on the stage is conducted by the optical camera, the alignment is conducted with the use of images of a plurality of chips located distant from each other on the wafer (for example, chips on both the right and left ends of the wafer). First, a unique pattern identical within or adjacent to the respective chips (a pattern located at relatively the same position within the respective chips) is registered as a template (the pattern used for registration is frequently created on the wafer as an optical alignment pattern). Then, the stage travels so that the template-registered pattern of each chip can be imaged, and an image is acquired for each chip. Template matching is conducted on the acquired images. The amount of deviation of the stage movement is calculated on the basis of the resulting matching positions, and the coordinate system of the stage movement and the coordinate system of the wafer are matched with each other by using the amount of deviation as a correction value of the stage movement. Also, for the alignment by the electron microscope to be subsequently conducted, a unique pattern close to the measurement position is registered in the template in advance, and the relative coordinates of the measurement position viewed from the template are stored in advance.
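The calculation of the stage-movement correction from the matched positions of two distant chips can be sketched as follows. This is a minimal sketch, assuming positions are given in micrometers in stage coordinates; the function name and the numeric values are illustrative, not taken from the disclosure.

```python
import math

def wafer_rotation(left_matched, right_matched, left_expected, right_expected):
    """Estimate the rotation of the wafer coordinate system relative to
    the stage driving direction from the template matching positions of
    two chips located distant from each other (e.g. the left and right
    ends of the wafer)."""
    # Measured vector between the two matched alignment patterns.
    mdx = right_matched[0] - left_matched[0]
    mdy = right_matched[1] - left_matched[1]
    # Expected vector if the wafer were perfectly aligned with the stage.
    edx = right_expected[0] - left_expected[0]
    edy = right_expected[1] - left_expected[1]
    # Angle between the measured and expected vectors (radians); usable
    # as a correction value for subsequent stage movement.
    return math.atan2(mdy, mdx) - math.atan2(edy, edx)

# Example: the right chip is matched 10 um higher than expected over a
# 100 mm baseline, i.e. the wafer is rotated by about 0.1 mrad.
theta = wafer_rotation((0.0, 0.0), (100000.0, 10.0),
                       (0.0, 0.0), (100000.0, 0.0))
```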
Then, when the measurement position is obtained from the image picked up by the electron microscope, template matching is conducted on the picked-up image, the matching position is determined, and the view field is moved from the determined matching position by the relative coordinates stored in advance to obtain the measurement position. The view field of the device is thus moved to the desired measurement position with the use of the above template matching.
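The movement by the stored relative coordinates amounts to a simple offset of the matching position, as in the following sketch; the coordinate values are hypothetical pixel positions.

```python
def measurement_position(matching_pos, relative_coords):
    """Offset the matching position determined by template matching by
    the relative coordinates of the measurement position stored in
    advance, yielding the position to move the view field to."""
    return (matching_pos[0] + relative_coords[0],
            matching_pos[1] + relative_coords[1])

# The template matched at pixel (120, 84); the measurement point was
# registered 35 pixels right and 10 pixels down from the template.
pos = measurement_position((120, 84), (35, 10))
```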
Also, Patent Literature 2 discloses a method of creating a template for template matching on the basis of the design data of the semiconductor device. If the template can be created on the basis of the design data, there is an advantage in that the time and effort of purposely acquiring an image with the inspection device for template creation are eliminated.
Patent Literature 3 proposes a method in which the pattern is separated into an upper layer and a lower layer, and the influence of the lower layer is removed to improve the matching performance.
Patent Literature 4 discloses a technique in which, in matching processing between a template formed on the basis of the design data and the image, the design data is subjected to exposure simulation so as to compensate for a shape difference between the template and the image.
[Patent Literature 1] Japanese Patent Publication No. 2001-243906 (corresponding U.S. Pat. No. 6,627,888)
[Patent Literature 2] Japanese Patent Publication No. 2002-328015 (corresponding U.S. Patent Publication No. 2003/0173516)
[Patent Literature 3] WO2010/038859
[Patent Literature 4] Japanese Patent Publication No. 2006-126532 (corresponding U.S. Patent Publication No. 2006/0108524)
[Nonpatent Literature 1] New Edition, Image Analysis Handbook, supervision of TAKAGI, Mikio, University of Tokyo Press (2004)
As compared with the creation of a template based on the picked-up image and the creation of a pseudo-template disclosed in Patent Literature 1, the technique of creating the template with the use of the design data as disclosed in Patent Literatures 2, 3, and 4 has an advantage in that neither the operation of acquiring an image with the electron microscope for template creation nor the condition setting for pseudo-template creation needs to be conducted.
However, the design data represents an ideal pattern configuration and arrangement state of the semiconductor device, which is different in hue from the image to be subjected to the template matching. In particular, with the higher integration of semiconductor devices in recent years, the pattern is becoming multi-layered, and the pattern of one layer may be different in hue from the pattern of another layer depending on the detection efficiency of secondary electrons emitted from the specimen. As described above, the design data represents the ideal configuration and arrangement of the pattern, and it may be difficult to conduct appropriate matching between the design data and a target image in which the hue of the pattern differs between the respective layers. Also, even if the template is created on the basis of the picked-up image, the hue may differ between the respective layers according to the optical conditions of the imaging device (for example, a scanning electron microscope).
Patent Literature 3 discloses a technique in which templates of an upper portion and a lower portion of a hole pattern are created separately, and matching is conducted with the respective templates. This publication discloses a matching method effective for a pattern such as the hole pattern in which edges are present in both a lateral direction (X-direction) and a longitudinal direction (Y-direction). However, for example, if the upper layer pattern is a line pattern extending in one direction, or a pattern in which lines extending in the same direction are arrayed at the same pitch, an accurate position may not be specified by the template of only the upper layer pattern.
Hereinafter, a description will be given of a pattern matching device intended to conduct pattern matching on an image including a plurality of regions having different features with high precision as with the pattern image including a plurality of layers, a computer program causing a computer to execute the processing in question, and a readable storage medium that stores the program in question.
As one configuration for achieving the above object, there is proposed hereinafter a pattern matching device, a computer program, or a readable storage medium storing the program in question, which executes pattern matching on an image with the use of a template formed on the basis of design data or a picked-up image; which executes the pattern matching on a first target image with the use of a first template including a plurality of different patterns; which creates a second target image by excluding from the first target image the information on a region including a specific pattern among a plurality of target patterns, or by reducing the information on the specific pattern; and which determines the degree of similarity between the second target image and either a second template including pattern information other than the specific pattern, or reducing the information on the specific pattern, or the first template.
Also, there is proposed a pattern matching device, a computer program, or a readable storage medium storing the program in question, which extracts position candidates of the pattern matching by pattern matching the first target image, and extracts a specific position from the candidates on the basis of the similarity determination.
According to the above configuration, even if patterns having different features are mixed together within the image searched by pattern matching, the success rate of the pattern matching can be maintained in a high state.
Hereinafter, a description will be mainly given of pattern matching using a template formed on the basis of design data.
In the design data 210 illustrated in
In the matching processing using the template formed on the basis of the design data, when a visual separation in the image at the matching correct position between the SEM image and the design data is large, the matching may fail. For example, when an observation specimen has a multilayered pattern, a pattern in a specific layer may become vague in the SEM image, and the matching may fail. As an example of the multilayered pattern, it is assumed that the design data 210 in
Hence, for example, as illustrated in
As illustrated in
The gradation value varies even within the region in which the edge strength of the upper layer pattern is high (hereinafter called "high strength region"). When this variation is equal to or larger than the edge strength of the region in which the edge strength of the lower layer pattern is low, the difference in the degree of similarity between a matching incorrect position caused by the deviation of the lower layer pattern and the matching correct position is buried in the variation of the gradation value in the high strength region, and hardly appears in the similarity evaluation value.
In this case, if attention is paid to only an upper layer pattern 241 as indicated in a matching result 240 illustrated in
In the embodiment described below, a description will be given of a pattern matching device, a computer program that causes a computer to execute the pattern matching, and a computer readable storage medium storing the program, which achieve a high template matching success rate even when a high strength and a low strength of the gradation value or the edge strength are mixed together within the pattern to be searched, mainly in template matching.
One configuration for improving the success rate of the pattern matching includes a preprocessing unit that preprocesses an image to be searched; a preprocessing unit that preprocesses a template; a template matching processing unit that selects a plurality of matching candidate positions with the use of the preprocessed image to be searched and the preprocessed template; a designation processing unit of a high strength similarity region which designates, from the design data of an ROI region, the high strength similarity region to be removed from the image to be searched; a removal processing unit of the high strength similarity region which removes the similarity region of the high strength from the image to be searched; a similarity determination processing unit that calculates the degree of similarity between the image from which the high strength similarity region has been removed and the template; and a matching position selection processing unit that selects a matching position high in the degree of similarity.
A description will be given of the pattern matching device, the computer program that causes the computer to execute the pattern matching, and the computer readable storage medium storing the program, in which the similarity region of the high strength described above includes an overall region in which the pattern is present in the upper layer of the design data.
The above means evaluates the degree of similarity between the template and each of the plural matching candidate positions, including the matching correct position and the matching incorrect positions obtained by the template matching processing unit, in the remaining region from which the region of the high strength has been removed. Therefore, the above means evaluates the degree of similarity in only the region of the low strength without being influenced by the region of the high strength, with the result that the matching correct position can be selected even in the above problematic case. In the matching processing unit, since the image including both the high strength region and the low strength region is used, matching including the information on the low strength region is conducted, with the result that the matching correct positions in which the positional deviation does not occur in the low strength region are included in the matching candidates.
According to the above-mentioned configuration, even when the high strength and the low strength of the gradation value or the edge strength are mixed together within the pattern to be searched, the accurate matching position can be determined by the template matching. Also, in the image including the multilayered pattern, it is conceivable that the pattern corresponding to the upper layer forms the high strength region, and the pattern corresponding to the lower layer forms the low strength region. When the upper layer and the lower layer are subjected to the matching processing separately, the information on the lower layer pattern is excluded particularly in matching the upper layer. This may make it difficult to realize accurate matching. As exemplified in
In an example described below, a description will be given of a pattern matching method in which matching can be conducted at a high success rate not depending on a difference in the edge strength between the upper layer and the lower layer while conducting matching with the use of the information on the multilayered pattern.
Hereinafter, a description will be given of the pattern matching processing with reference to the drawings. It is assumed that the same reference numerals denote identical members in the drawings unless otherwise specified.
In the SEM, an electron beam is generated from an electron gun 801. A beam deflector 804 and an objective lens 805 are controlled so that the electron beam is emitted and focused at an arbitrary position on a semiconductor wafer 803 which is a specimen placed on a stage 802. Secondary electrons are emitted from the semiconductor wafer 803 irradiated with the electron beam, and detected by a secondary electron detector 806. The detected secondary electrons are converted into a digital signal by an A/D converter 807, stored in an image memory 815 within a processing/control unit 814, and subjected to image processing according to purposes by a CPU 816. The template matching according to this embodiment is processed by the processing/control unit. The setting of the processing described with reference to
In this example, the scanning electron microscope exemplifies the inspection device. However, the present invention is not limited to this configuration, and can be applied to any inspection device that acquires an image and conducts the template matching processing.
The electrons emitted from the specimen are acquired by the detector 1403, and converted into a digital signal by an A/D converter incorporated into a control device 1404. The image processing is conducted according to the purpose by image processing hardware such as a CPU, an ASIC, or an FPGA incorporated into the image processing unit 1407. Also, the image processing unit 1407 has a function of creating a line profile on the basis of a detection signal, and measuring a dimension between peaks of the profile.
Further, the arithmetic processing device 1405 is connected to an input device 1418 having input means, and has a function of a GUI (graphical user interface) that allows an image or an inspection result to be displayed on a display device provided in the input device 1418 for an operator.
A part or all of control and processing in the image processing unit 1407 can be allocated to an electronic computer having a CPU and a memory that can store an image therein, and processed and controlled. Also, the input device 1418 also functions as an imaging recipe creation device that creates an imaging recipe including coordinates of an electronic device required for inspection, a pattern matching template used for positioning, and photographing conditions, manually, or with the help of the design data stored in a design data storage medium 1417 of the electronic device.
The input device 1418 has a template creating unit that clips a part of a line image formed on the basis of the design data to create a template. The created template is registered in the memory 1408 as a template of the template matching in a matching processing unit 1409 incorporated into an image processing unit 507. The template matching represents a technique of specifying a portion where the picked-up image to be positioned matches the template on the basis of the degree of matching using a normalized correlation method, and the matching processing unit 1409 specifies a desired position of the picked-up image on the basis of the matching degree determination. In this embodiment, the degree of matching between the template and the image is expressed by words such as the degree of matching or the degree of similarity, which have the same meaning from the viewpoint of an index indicative of the extent of matching therebetween. Also, the degree of non-matching and the degree of dissimilarity also represent modes of the degree of matching and the degree of similarity.
The embodiment described below relates to the pattern matching between edge information obtained mainly on the basis of the design data and the image picked up by the SEM or the like. The edge information obtained on the basis of the design data includes line image information indicative of an ideal shape of the pattern formed on the basis of the design data, or line image information subjected to deformation processing by a simulator 1419 so as to come close to the real pattern. Also, the design data is expressed by, for example, a GDS format or an OASIS format, and stored in a given format. Any kind of design data is applicable as long as software that displays the design data can display its format and deal with the design data as graphic data.
In the embodiment described below, a description will be given of an example in which the matching processing is executed by the control device mounted on the SEM, or the arithmetic processing device 1405 connected to the SEM through a communication line. However, the present invention is not limited to this configuration, but processing to be described later may be conducted by a computer program with the use of a general-purpose arithmetic device that executes the image processing by a computer program. Further, a technique to be described later is applicable to other charged particle radiation devices such as a focused ion beam (FIB) device.
This embodiment pertains to a device that conducts the pattern matching, a program causing a computer to execute the pattern matching, and a storage medium storing the program therein.
This embodiment is intended to detect the matching correct position even when the high strength (or a high value) and the low strength (or a low value) of the edge strength (or gradation value) are mixed together in the image to be searched as described above. To achieve this, the details will be described in a later half of the description in
As a result, even when the high strength (or the high value) and the low strength (or the low value) of the edge strength (or the gradation value, or the degree of similarity between the image to be searched and the template) of the pattern to be searched are mixed together, a matching result that also takes the pattern of the low strength (or low value) region into consideration can be obtained, whereby the matching correct position is obtained. In the present specification, the edge strength of the pattern will mainly be described below. However, the same matching can be implemented on the pixel value, or on the degree of similarity between the image to be searched and the template, by merely replacing the edge strength therewith.
Hereinafter, the respective processing of matching in
In a preprocessing unit B103, in order to emphasize the shape of the pattern of the design data, edge emphasis processing is conducted; for example, Sobel filter processing (Nonpatent Literature 1, pp. 1215) or the like is conducted. The edge emphasis processing is not limited to this configuration, and any processing that can conduct the edge emphasis is applicable. Also, this processing in the preprocessing unit B is not always implemented, and may be omitted. A template (first template) including information on a plurality of layers is produced on the basis of the above processing. The above image processing can be conducted by a design data image processing unit 1414 disposed in a template production unit 1410. Also, a plural-layer template production unit 1412 produces the template on the basis of plural layers of pattern data included in the selected design data region.
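The Sobel edge emphasis mentioned above can be sketched as follows; this is a numpy-only illustration of computing the gradient magnitude with the standard 3×3 Sobel kernels (boundary pixels are left at zero for simplicity).

```python
import numpy as np

def sobel_edge_emphasis(img):
    """Emphasize pattern edges by convolving with the Sobel kernels and
    returning the gradient magnitude (edge strength) per pixel."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = (patch * kx).sum()   # horizontal gradient
            gy[y, x] = (patch * ky).sum()   # vertical gradient
    return np.hypot(gx, gy)                 # edge strength map
```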
With the use of the first template produced as described above, a matching processing unit 104 or 1409 conducts the template matching on a target image (first target image) (Nonpatent Literature 1, pp. 1670). For example, the matching processing is conducted with a normalized correlation method (Nonpatent Literature 1, pp. 1672). Positions of regions in which the pattern is similar between the template and the image to be searched can be detected through the matching processing. The matching processing unit 104 selects a plurality of positions having a higher degree of similarity (for example, correlation value). The number of selections may be set to a given value in advance, or the regions whose index of the coincidence degree determination called "matching score" is a given value or more may be selected. Also, the number of regions indicative of the degree of coincidence having a given value or more may be set to a given value (or a given number or more) in advance.
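The first matching pass described above can be sketched with a brute-force normalized correlation, returning the n highest-scoring positions as matching position candidates; the function name and parameters are illustrative.

```python
import numpy as np

def top_match_candidates(search_img, template, n=3, min_score=None):
    """Sliding-window normalized correlation between the first template
    and the image to be searched, returning the n positions with the
    highest matching score as matching position candidates."""
    th, tw = template.shape
    t = template.astype(float)
    t -= t.mean()
    tnorm = np.sqrt((t * t).sum())
    scored = []
    for y in range(search_img.shape[0] - th + 1):
        for x in range(search_img.shape[1] - tw + 1):
            w = search_img[y:y + th, x:x + tw].astype(float)
            w -= w.mean()
            denom = np.sqrt((w * w).sum()) * tnorm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            scored.append((score, (x, y)))
    scored.sort(key=lambda s: s[0], reverse=True)
    if min_score is not None:          # threshold on the "matching score"
        scored = [s for s in scored if s[0] >= min_score]
    return scored[:n]
```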
The selected matching positions represent matching position candidates 105, and as described above, the matching position candidates 105 frequently include the matching correct positions and the matching incorrect positions.
A designation processing unit 106 of the high strength similarity region designates regions in which the edge strength is high as described above. The high strength similarity region represents a region in which the degree of similarity between the template and the image to be searched is high and the strength is high, a region in which the degree of similarity is high and the strength is expected to be high, or a region including those regions (a region including those regions represents, for example, a region of a layer in which there is the design data including the region high in the similarity and high in the strength). This processing is conducted by a region selection unit of a removal processing selection unit 1411.
For example, as will be described with reference to
Also, the above-mentioned high strength region or low strength region may be determined automatically on the basis of layer information registered in GDS data. For example, the input device 1418 may set an image acquisition region on the design data, automatically discriminate, on the basis of the selection, which layer each pattern included in the acquisition region belongs to, and automatically discriminate the patterns belonging to the upper layer side and the patterns belonging to the lower layer side. When the above processing is automatically conducted, a sequence is prepared so that the pattern having the upper layer information is classified into the upper layer pattern and the pattern having the lower layer information is classified into the lower layer pattern, and the patterns are automatically classified on the basis of the setting of the image acquisition region. The above processing may be executed by a layer determination unit 1415 on the basis of the selection in the removal processing selection unit 1411.
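The automatic classification by layer information can be sketched as follows, assuming each pattern in the acquisition region carries its GDS layer number; which layer numbers count as the upper layer depends on the process and is an assumption here, as are the pattern names.

```python
def classify_by_layer(patterns, upper_layers):
    """Split the patterns in the image acquisition region into upper
    layer and lower layer groups, using the GDS layer number attached
    to each pattern in the design data. `upper_layers` names the layer
    numbers to treat as the upper layer (process-dependent)."""
    upper, lower = [], []
    for pat in patterns:
        (upper if pat["layer"] in upper_layers else lower).append(pat)
    return upper, lower

# Hypothetical patterns inside the acquisition region, tagged with
# GDS layer numbers.
pats = [{"name": "line_a", "layer": 12},
        {"name": "line_b", "layer": 12},
        {"name": "gate", "layer": 5}]
up, low = classify_by_layer(pats, upper_layers={12})
```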
A similarity determination processing unit 108 for the image from which the high strength similarity region has been removed evaluates the degree of similarity for the image data (image 331 in the example of
A matching position selection processing unit 109 compares the degrees of similarity at the respective matching candidate positions obtained by the similarity determination processing unit 108 for the image from which the above high strength similarity region has been removed, and outputs the candidate highest in the degree of similarity as the matching position 110. With the above configuration, even when the high strength and the low strength of the edge strength of the pattern to be searched are mixed together on the image to be searched, it is possible to determine an accurate matching position by the template matching. Because the above similarity determination may be selectively conducted on the extracted matching candidate positions, the precise matching position can be specified with high efficiency. The above similarity determination can apply the above-mentioned matching algorithm, and can be conducted by the matching processing unit 1409. Also, the matching candidate position information is stored in the memory 1408 in advance, and the template of the lower layer pattern may be superimposed on the image on the basis of the position information.
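The re-evaluation of the candidates on the image from which the high strength similarity region has been removed can be sketched as follows: correlation is computed only over pixels that remain (a boolean mask marks the removed region), against the second template. The function and its mask convention are illustrative assumptions.

```python
import numpy as np

def rescore_candidates(img, candidates, lower_template, keep_mask):
    """Re-evaluate the matching position candidates using only the
    pixels where keep_mask is True (the remaining low strength region),
    against the second template; return the best position and score."""
    th, tw = lower_template.shape
    best_pos, best_score = None, -2.0
    for (x, y) in candidates:
        m = keep_mask[y:y + th, x:x + tw]
        if m.sum() < 2:
            continue  # the whole window was removed; nothing to compare
        wv = img[y:y + th, x:x + tw].astype(float)[m]
        wv -= wv.mean()
        tv = lower_template.astype(float)[m]
        tv -= tv.mean()
        denom = np.sqrt((wv * wv).sum() * (tv * tv).sum())
        score = (wv * tv).sum() / denom if denom > 0 else 0.0
        if score > best_score:
            best_score, best_pos = score, (x, y)
    return best_pos, best_score
```

Because the corrupted high strength pixels are masked out, a candidate whose lower layer pattern agrees with the second template still reaches a correlation of 1.0 even if the removed region differs strongly from the template.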
As described above, the matching candidate positions are narrowed by the first matching, and the degree of similarity of the lower layer pattern (low brightness region) is then selectively determined, thereby making it possible to conduct high precision matching using the low brightness region, whose amount of information is relatively small compared with the high brightness region.
The above similarity determination is conducted with the use of the second template in which the lower layer pattern is selectively displayed. Alternatively, the similarity determination may be conducted with the use of the first template. In this case, because of a comparison between the second target image (image from which the upper layer image has been removed) and the first template (image including the upper layer information and the lower layer information), even if the accurate matching position is provided, the degree of similarity becomes relatively low as compared with the determination using the second template. On the other hand, because the second target image is an image from which the upper layer information has been removed, even if the information of the upper layer remains in the template, this may hardly influence relative merits of the degree of similarity among a plurality of matching position candidates. Hence, when the degree of similarity among the plurality of matching position candidates balances each other, and a precision in the matching is intended to be prioritized, it is conceivable that the similarity determination using the second template is conducted. When a processing efficiency is intended to be enhanced with the elimination of the processing for creating the second template, it is conceivable that the similarity determination using the first template is conducted.
When the image to be searched is the region 300 in
This removal processing is conducted by the removal processing unit 107 of the high strength similarity region. A method of designating the region to be removed will be described with reference to
Then, the similarity evaluation is conducted on the image 331 from which the region of the high strength has been removed, with the use of a pattern which is the design data other than the removed region (in this example, lower layer design data 321) (for example, using the normalized correlation method). The similarity evaluation is conducted by the similarity determination processing unit 108. In this example, the similarity evaluation method is not limited to the normalized correlation method, and any method that can evaluate the degree of similarity is applicable. Also, when it is found that a part of the pattern used for the similarity evaluation (the design data other than the removed pattern) is concealed on the image by the pattern to be removed, it is possible to use the pattern from which the concealed portion has been removed (removal of a portion that overlaps with a dashed region interior 322 in
Also, in this example, the upper layer pattern 301 is high in strength, and the lower layer pattern 302 is low in strength. However, the number of layers is not limited to two, and the region of the high strength is not limited to the upper layer. In the design data of the multilayer structure, when a region of the high strength is present, the layer of that region is removed, and the similarity determination processing is conducted with the remaining region.
The former will be described with reference to
In this example, the provision of the image is not always necessary, and only the layer of the high strength similarity region may be accepted as the input of the user (in this case, for example, the user makes a determination on the basis of past experience or the results of simulation, and specifies the high strength similarity region). Also, when the high similarity region is fixedly set, since a larger number of emitted electrons is frequently detected from the upper layer pattern in, for example, an electron microscope image of a semiconductor pattern, it is conceivable that the upper layer pattern is set as the layer of the high strength. The upper layer pattern does not always become the high strength, depending on the type of specimen (the type of material or structure) or the observation conditions of the device (in the case of the electron microscope, the accelerating voltage, the probe current, the type of electron detector (its location or detection conditions), the state of other magnetic fields of the device, etc.). The region that becomes the high strength may differ depending on the type of specimen or the conditions of the device. In this case, the region that becomes the high strength is set under those conditions.
In the region that becomes the high strength, for example, the inspection device may estimate the acquired image through simulation based on the type of specimen and the observation conditions of the device, and select the region which becomes the high strength from the calculated image. This processing is conducted by the designation processing unit 106 of the high strength region in
For example, as illustrated in
In an SEM image 500 illustrated in
Also, in an SEM image 520 illustrated in
Also, in an SEM image 540 illustrated in
Under the circumstances, as illustrated in
In addition, the formed semiconductor pattern may deviate from the shape of the design data due to a variety of factors (Patent Literature 4). Under the circumstances, a pattern brought closer to the shape of the actual semiconductor pattern by treating the design data may be used instead of the design data described above. As an example, there is a method in which the design data is subjected to Gaussian filter processing, and the processing results are subjected to contour extraction to obtain a shape closer to the actual pattern shape. Also, there is a method in which the design data is subjected to exposure simulation, and the simulation results are subjected to contour extraction to obtain a shape closer to the actual pattern shape (Patent Literature 4).
The methods of creating the region to be removed as the high strength similarity region have been described above. The respective methods may be used independently, or in combination. In this way, the removal region is set according to the status of the high strength similarity region in the image of the inspection device, thereby improving the performance of selection of the correct position in the matching method described with reference to
In this example, a method will be described in which the high strength region is designated (or extracted) on the basis of both the design data of the designated layer and the image acquired by the inspection device, and the designated (or extracted) region is removed from the image data 100 acquired by the inspection device. As a result, the high strength similarity region removed from the image data 100 can be brought closer to the high strength similarity region in the actual image. Also, the removal of the high strength similarity region makes it possible to prevent the low strength region from being removed more than necessary.
An example of a specific implementation method will be described.
In this method, there is no need to acquire and set the information before imaging the specimen as in the method described with reference to
A specific implementation method will be described.
Examples of the strength evaluation region based on the design data are illustrated in
The evaluation index is not limited to the above indexes; any index value that enables a comparison of the strengths can be applied. Also, this example shows a pattern of the double-layered structure. Similarly, for a pattern of three or more layers, the evaluation region is set in each of the layers, the evaluation index value is calculated in each of the layers, and the high strength similarity region can be selected from the evaluation index values.
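Purely as an illustrative sketch of the layer-wise evaluation described above (Python with NumPy; the choice of mean gradient magnitude as the evaluation index, the layer masks, and all names are assumptions, not the claimed implementation):

```python
import numpy as np

def edge_strength(image):
    """Mean gradient magnitude, a simple stand-in for an edge-strength index."""
    gy, gx = np.gradient(image.astype(float))
    return np.sqrt(gx**2 + gy**2)

def select_high_strength_layer(image, layer_masks):
    """Return the index of the layer whose evaluation region has the
    highest mean edge strength (hypothetical evaluation index)."""
    strength = edge_strength(image)
    scores = [strength[mask].mean() for mask in layer_masks]
    return int(np.argmax(scores))

# Toy image: the upper-layer edge is four times as strong as the lower-layer edge.
img = np.zeros((20, 20))
img[5, 2:8] = 200.0    # bright upper-layer edge
img[15, 2:8] = 50.0    # faint lower-layer edge
upper = np.zeros_like(img, dtype=bool); upper[3:8, :] = True
lower = np.zeros_like(img, dtype=bool); lower[13:18, :] = True
print(select_high_strength_layer(img, [upper, lower]))  # prints 0 (the upper layer)
```

The same per-layer comparison extends directly to three or more layers by adding one mask per layer.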
With the above configuration, the high strength similarity region can be extracted from the images picked up by the inspection device through the image processing.
Hereinafter, this method will be described with reference to FIG. (this method is identical with the method of
Neither the noise reduction processing nor the edge emphasis processing in the preprocessing of this preprocessing unit A is indispensable; one or both of those processes may be omitted. In the preprocessing unit B103, the edge emphasis processing is conducted to emphasize the shape of the pattern of the design data. For example, the Sobel filter processing (Nonpatent Literature 1, pp. 1215) is conducted. The edge emphasis processing is not limited to this configuration; any processing that can conduct the edge emphasis can be applied.
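A minimal sketch of Sobel-type edge emphasis as mentioned above (Python with NumPy; the hand-rolled zero-padded convolution is purely illustrative, not the device's implementation):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def filter2d(image, kernel):
    """Minimal 'same'-size 2-D correlation with zero padding."""
    k = kernel.shape[0] // 2
    padded = np.pad(image.astype(float), k)
    out = np.zeros_like(image, dtype=float)
    for dy in range(kernel.shape[0]):
        for dx in range(kernel.shape[1]):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0],
                                           dx:dx + image.shape[1]]
    return out

def sobel_edge_emphasis(image):
    """Gradient magnitude from horizontal and vertical Sobel responses."""
    gx = filter2d(image, SOBEL_X)
    gy = filter2d(image, SOBEL_Y)
    return np.hypot(gx, gy)

# Vertical step edge: the response concentrates around the step (columns 2-3).
step = np.zeros((5, 5)); step[:, 3:] = 1.0
edges = sobel_edge_emphasis(step)
```

Any other edge-emphasis filter could be substituted, as the text notes; only the comparison of resulting edge strengths matters here.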
Also, the above processing of the preprocessing unit B is not indispensable and may be omitted. In the matching processing unit 104, the template matching is conducted (Nonpatent Literature 1, pp. 1670). For example, the matching processing using the normalized correlation method (Nonpatent Literature 1, pp. 1672) is conducted. The position of the region of the pattern similar between the template and the image to be searched can be detected through the matching processing. The designation processing unit 106 selects a plurality of matching positions that are higher in the degree of similarity (for example, correlation value). The selected matching positions are the matching position candidates 105, and as described above, the matching position candidates include both matching correct positions and matching incorrect positions in most situations.
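The normalized-correlation matching and the selection of a plurality of high-similarity candidate positions might be sketched as follows (Python with NumPy; a brute-force illustration under assumed data, not the actual matching processing unit 104):

```python
import numpy as np

def ncc_map(search, template):
    """Normalized cross-correlation score at every valid template position."""
    th, tw = template.shape
    t = template.astype(float) - template.mean()
    tn = np.sqrt((t**2).sum())
    scores = np.full((search.shape[0] - th + 1, search.shape[1] - tw + 1), -1.0)
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            w = search[y:y + th, x:x + tw].astype(float)
            w = w - w.mean()
            wn = np.sqrt((w**2).sum())
            if wn > 0 and tn > 0:
                scores[y, x] = (w * t).sum() / (wn * tn)
    return scores

def top_candidates(scores, k=3):
    """Positions of the k highest scores: the matching position candidates."""
    flat = np.argsort(scores.ravel())[::-1][:k]
    return [tuple(int(v) for v in np.unravel_index(i, scores.shape))
            for i in flat]

# The same small pattern embedded twice: both positions become candidates.
search = np.zeros((10, 10))
pattern = np.array([[10.0, 0.0], [0.0, 10.0]])
search[1:3, 1:3] = pattern
search[6:8, 6:8] = pattern
scores = ncc_map(search, pattern)
```

When several positions score equally well, as here, the candidate set indeed mixes positions that only later processing can tell apart; that is exactly the situation the removal of the high strength similarity region is meant to resolve.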
The designation processing unit 106 of the high strength similarity region designates the region in which the above-mentioned edge strength is high. The high strength similarity region represents a region in which the degree of similarity between the template and the image to be searched is high and the strength is high, a region in which the degree of similarity is high and the strength is expected to be high, or a region including those regions (a region including those regions here represents, for example, a region of a layer in the design data that contains the region high in similarity and high in strength).
A treatment processing unit 900 of the high strength similarity region treats, for the respective matching position candidates, the regions of the image data (image to be searched) that correspond to the region designated by the designation processing unit 106 of the high strength similarity region described above. A specific example of the treatment method will be described with reference to
In the similarity determination processing unit 108 for the image in which the high strength similarity region has been treated, the degree of similarity is evaluated for the image data obtained by the removal processing unit 107 of the high strength similarity region described above, with the use of the portion of the template pattern other than the removed region. This makes it possible to evaluate, at the respective matching position candidates, the degree of similarity with the high strength similarity region treated, and mainly to evaluate the degree of similarity of the low strength pattern.
In the matching position selection processing unit 109, the degrees of similarity at the respective matching candidate positions obtained by the above similarity determination processing unit 108 for the image in which the high strength similarity region has been deleted are compared with each other, and the candidate highest in the degree of similarity is output as the matching position 110. With the above processing, even if high and low edge strengths are mixed together in the pattern to be searched on the image to be searched, an accurate matching position can be determined by the template matching.
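The similarity determination that excludes the removed high strength region can be illustrated by a masked normalized correlation (Python sketch; the mask convention and the toy double-layer data are assumptions):

```python
import numpy as np

def masked_ncc(window, template, keep_mask):
    """Normalized correlation restricted to pixels outside the removed
    high strength region (keep_mask True = pixel participates)."""
    w = window[keep_mask].astype(float)
    t = template[keep_mask].astype(float)
    w = w - w.mean(); t = t - t.mean()
    denom = np.sqrt((w**2).sum() * (t**2).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

# Toy template: rows 0-1 imitate a strong upper-layer pattern, rows 2-3 a
# faint lower-layer pattern. Candidate B matches only the upper layer.
template = np.array([[9., 0., 9., 0.],
                     [0., 9., 0., 9.],
                     [1., 0., 1., 0.],
                     [0., 1., 0., 1.]])
cand_correct = template.copy()
cand_wrong = template.copy()
cand_wrong[2:] = np.array([[0., 1., 0., 1.], [1., 0., 1., 0.]])  # shifted lower layer
keep = np.zeros((4, 4), dtype=bool); keep[2:] = True  # upper layer removed
s_ok = masked_ncc(cand_correct, template, keep)
s_ng = masked_ncc(cand_wrong, template, keep)
```

With the dominant upper-layer pixels masked out, the two candidates, indistinguishable by their upper layers, are separated purely by the low strength lower-layer pattern.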
Also,
Also, in the template provided for determination of the degree of similarity, the amount of signals in a region corresponding to the above treatment region is reduced, thereby making it possible to enhance the degree of similarity at the matching correct position on the image that has been subjected to the above image processing.
In this example, the method of treating the high strength similarity region has been described for the pattern of the double-layered structure. However, this is not limited to the double layer; the same processing can be conducted on patterns of three or more layers, and the high strength similarity region can be treated by the above-mentioned method.
Hereinafter, this method will be described with reference to
For example, the Sobel filter processing (Nonpatent Literature 1, pp. 1215) is conducted. The edge emphasis processing is not limited to this configuration; any processing that can conduct the edge emphasis can be applied. Neither the noise reduction processing nor the edge emphasis processing in the preprocessing of the preprocessing units A and B is indispensable; one or both of those processes may be omitted. In the matching processing unit 104, the template matching is conducted (Nonpatent Literature 1, pp. 1670).
For example, the matching processing using the normalized correlation method (Nonpatent Literature 1, pp. 1672) is conducted. The position of the region of the pattern similar between the template and the image to be searched can be detected through the matching processing. The designation processing unit 106 selects a plurality of matching positions that are higher in the degree of similarity (for example, correlation value). The selected matching positions are the matching position candidates 105, and as described above, the matching position candidates include both matching correct positions and matching incorrect positions in most situations. The designation processing unit 1103 of the high strength similarity region designates the regions in which the edge strength is high, as described above.
A removal/treatment processing unit 1102 of the high strength similarity region removes/treats, for the respective matching position candidates, the regions of the image data (the image to be searched and the template image) that correspond to the region designated by the designation processing unit 106 of the high strength similarity region described above. A specific example of the removal/treatment method will be described later with reference to
In the similarity determination processing unit 108 for the image in which the high strength similarity region has been removed/treated, the degree of similarity is evaluated for the image data to be searched obtained by the removal processing unit 107 of the high strength similarity region described above, with the use of the template likewise obtained by the removal processing unit 107 of the high strength similarity region. This makes it possible to evaluate, at the respective matching position candidates, the degree of similarity with the high strength similarity region removed/treated, and mainly to evaluate the degree of similarity of the low strength pattern.
In the matching position selection processing unit 109, the degrees of similarity at the respective matching candidate positions obtained by the above similarity determination processing unit 108 for the image in which the high strength similarity region has been deleted are compared with each other, and the candidate highest in the degree of similarity is output as the matching position 110. With the above processing, even if high and low edge strengths are mixed together in the pattern to be searched on the image to be searched and in the pattern on the template, an accurate matching position can be determined by the template matching.
In this example, the high strength region is, for example, a region in which the edge strength is high, or a region in which the pixel value is high. For the former, proper binarization processing (Nonpatent Literature 1) is conducted on the edge image of the image 1200, and the regions on the side of the higher values may be extracted. For the latter, the binarization processing is conducted on the image 1200 itself, and the regions on the side of the higher values may be extracted. The method of extracting the regions in which the edge strength is high or the pixel value is high is not limited to the binarization processing; any method that can extract the appropriate regions can be applied.
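As a sketch of such binarization-based extraction (Python with NumPy; Otsu's method is used here as one possible choice of "proper binarization processing" and is only an assumption):

```python
import numpy as np

def otsu_threshold(values):
    """Classic Otsu threshold over integer-valued pixels: maximizes the
    between-class variance of the two sides of the split."""
    hist = np.bincount(values.ravel().astype(int), minlength=256).astype(float)
    total = hist.sum()
    mean_all = (np.arange(256) * hist).sum() / total
    best_t, best_var = 0, -1.0
    cum, cum_mean = 0.0, 0.0
    for t in range(256):
        cum += hist[t]
        cum_mean += t * hist[t]
        if cum == 0 or cum == total:
            continue
        m0 = cum_mean / cum                                   # lower-class mean
        m1 = (mean_all * total - cum_mean) / (total - cum)    # upper-class mean
        var = cum * (total - cum) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def high_strength_mask(image):
    """Pixels on the brighter side of the Otsu threshold."""
    return image > otsu_threshold(image)

# Toy image: a small bright patch on a dark background.
img = np.full((8, 8), 10.0)
img[2:4, 2:4] = 200.0
mask = high_strength_mask(img)
```

The same function can be applied either to the edge image (edge-strength criterion) or to the image itself (pixel-value criterion), matching the two cases in the text.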
In this method, an example of the extracted high strength region is a region 1211 indicated in an image 1210 of
When the matching between the measurement data and the device image is selected, the setting method of the removal/treatment region and the removal method can be accepted from the user. In the setting of the removal/treatment region, if a select box 1303 is selected, an input box 1319 can accept the input of the layer in the design data to be subjected to removal/treatment (the method described with reference to
The designation and editing can be conducted while confirming the region in a display region 1323 of the high strength similarity region. Also, when a check box 1325 is selected, the extracted region can be expanded or reduced by a value (for example, set in pixel units) input to the input box 1321. Also, when a select box 1307 is selected, the region setting by the contour extraction processing can be conducted (the method described in
In the latter case, an input of a width (for example, set in pixel units) of the region can be accepted by an input box 1322. In the select boxes of the removal method, the removal or treatment method can be selected. If a select box 1309 is selected, the removal of the high strength region can be selected (the method described with reference to
Even when the matching between the measurement data and the device image is selected, the setting method of the removal/treatment region and the removal method can be accepted from the user. In the setting method of the removal/treatment region, if a select box 1313 is selected, the layer of the high strength can be automatically selected (the method described with reference to
With the above processing, when the removal/treatment processing of the high strength similarity region is conducted, the setting method of the removal/treatment region and the removal method can be accepted from the user through the GUI. This GUI does not need to provide all of the members described above; all or a part of the members may be provided.
In the above description, mainly, the high strength region of the signal is removed, or its brightness is weakened, to specify a desired matching position from the matching position candidates. Alternatively, instead of selectively removing the high strength region, the low strength region may be selected, which effectively results in the removal of the high strength region.
Also,
First, information necessary for template creation is read from a storage medium (the design data storage medium 1417, or the memory 1408) on the basis of the setting of an arbitrary region on the design data (Step 1601). The creation of multilayered templates provided for the first pattern matching (Step 1604), and the creation of the lower layer template provided for the second template matching (Step 1602) are conducted. Further, the removal regions of the image when conducting the second pattern matching are selected (Step 1603).
Subsequently, the image to be subjected to the pattern matching is acquired (Step 1605), and the pattern matching using the multilayered template created in Step 1604 is executed (Step 1606). In this situation, if the number m of matching positions whose matching score exceeds a preset threshold value (given value) is zero, an error message is generated together with the processing of skipping the measurement based on the matching in question, on the assumption that the target could not be found. Also, if the number m of matching positions is 1, the matching processing is terminated on the assumption that the single correct position, that is, the final matching position, has been specified. It is, however, also conceivable that only one matching position is found because the specimen is charged and the resolution of the image is low; in this case, it is preferable that the error message is generated. The handling may be changed according to the status of the specimen and the measurement environment.
In this example, if the number of matching positions is larger than 1, that is, if a plurality of matching positions can be specified, the flow proceeds to the next step. A threshold value may also be set for the number of matching positions, so that the m matching positions with the highest scores are specified.
Subsequently, the removal regions selected in Step 1603 are removed from the SEM image acquired in Step 1605 to create the removal image (Step 1607). The removal region is, for example, a region set to cover the contour of the upper layer pattern, and a region slightly larger than the contour of the upper layer pattern may be set as the removal region. The pattern matching using the lower layer template created in Step 1602 is executed on the removal image thus formed (Step 1608). That is, in the flowchart exemplified in
If the number n of matching positions in Step 1608 is 0, an appropriate lower layer pattern could not be detected, so the error message is generated. Also, if n is 1, the matching processing is terminated on the assumption that the matching is proper, provided that the matching position in question was also specified in the pattern matching in Step 1606. Also, if the matching position in Step 1608 does not match the matching position in Step 1606, the error message is generated under the determination that the matching has not been properly conducted.
If the number n of matching positions in Step 1608 is plural (n>1), the number o of matching positions specified by both Step 1606 and Step 1608 is determined. If o is 1, the matching processing is terminated on the assumption that there is one proper matching position. If o is plural (o>1), because a plurality of matching position candidates is present, the position at which the matching score in Step 1606 or Step 1608 is maximum, or the position at which the multiplication value or the addition value of the two matching scores becomes maximum, is determined as the matching position (Step 1609).
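The reconciliation of the candidate sets from Step 1606 and Step 1608 might be sketched as follows (Python; the position keys, score dictionaries, and the added-score tie-break are illustrative assumptions, and the error handling is simplified):

```python
def reconcile(first_positions, second_positions, first_scores, second_scores):
    """Sketch of Steps 1606-1609: keep the positions found by both
    matchings; if several remain, pick the one whose combined (here: added)
    score is highest."""
    common = [p for p in first_positions if p in second_positions]
    if not common:
        # corresponds to the error cases in the text (no agreeing position)
        raise RuntimeError("no common matching position: report an error")
    if len(common) == 1:
        return common[0]  # o == 1: the single proper matching position
    # o > 1: combined-score tie-break
    return max(common, key=lambda p: first_scores[p] + second_scores[p])

# Two positions survive both matchings; the added scores decide between them.
first = [(1, 1), (5, 5)]
second = [(1, 1), (5, 5), (9, 9)]
fs = {(1, 1): 0.9, (5, 5): 0.7}
ss = {(1, 1): 0.8, (5, 5): 0.95, (9, 9): 0.6}
print(reconcile(first, second, fs, ss))  # (1, 1): 1.7 beats 1.65
```

A multiplied-score tie-break, also mentioned in the text, would only change the `key` function.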
When a plurality of matching processes is conducted with the use of different templates as described above, the possibility that positioning is conducted at an incorrect position can be reduced.
Like
As has been described above, taking the deviation between the two matching positions into consideration, a deviation of some degree is determined to be an overlay error; if the deviation is larger, an error is generated, so that the possibility of matching at an incorrect position is suppressed and the success rate of the matching can be enhanced without depending on the overlay error. Also, the overlay error can be measured on the basis of the distance between the positions specified by the two matchings. In this embodiment, in particular, the position specified by the first pattern matching is a position more strongly affected by the upper layer pattern, and the position specified by the second pattern matching is a position corresponding to the lower layer pattern. Hence, the deviation (the amount of shift) between those positions can be defined as an overlay error.
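Interpreting the shift between the two matching positions as an overlay error, with a tolerance check for the larger-deviation error case, can be sketched as follows (Python; the pixel size and tolerance parameters are hypothetical):

```python
import math

def overlay_error(pos_first, pos_second, pixel_size_nm=1.0, max_shift_nm=None):
    """Shift between the first (upper-layer dominated) matching position and
    the second (lower-layer) matching position, interpreted as an overlay
    error; raises when it exceeds an optional tolerance."""
    dy = (pos_second[0] - pos_first[0]) * pixel_size_nm
    dx = (pos_second[1] - pos_first[1]) * pixel_size_nm
    shift = math.hypot(dx, dy)
    if max_shift_nm is not None and shift > max_shift_nm:
        raise RuntimeError("shift exceeds tolerance: likely a matching error")
    return shift

# Positions in pixels; a 3-down, 4-right shift at 2 nm/pixel gives 10 nm.
print(overlay_error((10, 10), (13, 14), pixel_size_nm=2.0))  # prints 10.0
```

Treating the shift as a signed vector (dy, dx) rather than a scalar would additionally give the direction of the overlay error.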
Further, if the number p of matching positions is larger than 1 (p>1), the shortest distance between the two matching positions is selected, or a distance between the two matching positions which fulfills a given condition (for example, equal to or lower than a threshold value) is selected (Step 1708). Then, the same processing as that in Steps 1705 and 1706 is executed.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2011-039187 | Feb 2011 | JP | national |
| Filing Document | Filing Date | Country | Kind | 371c Date |
|---|---|---|---|---|
| PCT/JP2011/006906 | 12/12/2011 | WO | 00 | 10/4/2013 |