The present invention relates to a method, an image processing apparatus, and a program for creating a template used for detecting a specific position, and particularly, to a method and the like for creating a template based on design data of a semiconductor device and the like.
In a semiconductor measuring apparatus, conventional image recognition compares images of the same type, such as one SEM (Scanning Electron Microscope) image with another SEM image, or one optical microscope (OM) image with another OM image. In recent years, an image recognition technique using design data has emerged, in which the design data is compared with an SEM image or with an OM image.
In the comparison between images of the same type, contrast information can be used as effective information in the image recognition. However, the contrast information cannot be used as effective information in the comparison between images of different types, as in the comparison of the design data and the OM image. This is because the design data does not include contrast information as expressed in the OM image; the presence/absence of a pattern is merely binary information. Therefore, when the design data is compared with the OM image, which is multi-valued information, regions with differing contrast arise, and the image recognition may fail.
A pattern that does not exist in the design data may also be included in the OM image, and that situation cannot be handled, either. Consequently, a system that uses only the edges of design data to execute a matching process has been proposed (see Patent Literature 1). In this method, matching is performed only with the edges obtained from the design data, and correlation computation is ignored in other regions. This can handle inversion of the contrast and inclusion of a pattern that does not exist in the design data. As a method for using information other than the edges, namely the contrast of an image, a method has been proposed that uses material information of patterns from design data to reflect reflectance and the like on each pattern, thereby creating multi-valued image information (see Patent Literature 2). According to this method, the contrast information can be used.
According to the method for selectively extracting an edge part to perform matching as described in Patent Literature 1, lower-layer pattern information that may become noise in the matching process can be selectively excluded to perform pattern matching. However, the edges obtained from a multi-valued OM image with contrast information have variations in the brightness, and the edges of a binary template image created from the design data do not have variations in the brightness. Therefore, the degree of coincidence between the two may be reduced.
Even if the image information is created from the design data with multiple values close to the OM image as described in Patent Literature 2, there is little effective information for comparison if the contrast of the image is low. Therefore, the image recognition may not succeed.
Furthermore, there are various processes for manufacturing a semiconductor, such as exposure, development, etching, photoresist removal, and planarization, and the appearance of the OM image varies in each process. Therefore, even if the information of the contrast is used to create the image information from the design data by multiple values as described in Patent Literature 2, the appearance is different from the OM image depending on the process, and the image recognition may not succeed.
Hereinafter, a method and an image processing apparatus for creating a template for pattern matching for the purpose of performing pattern matching based on a template image having high contrast will be described. A method for creating a template for pattern matching and an image processing apparatus using the same for the purpose of performing pattern matching based on a step state of a pattern in a process will also be described.
Proposed below as an aspect for attaining the purposes are: a method for creating a template for template matching, in which part of design data is extracted and the template is created based on the extracted partial region; and an apparatus that realizes the method for creating a template, wherein a density of edges belonging to a predetermined region (for example, a search region or a region specified by the template) in the design data, equivalent to a region to be searched for in the template matching, is obtained.
Proposed as a more specific aspect are a method and an apparatus that determine an edge density for the predetermined region and that select the predetermined region as a template region or a template candidate if the edge density satisfies a predetermined condition. For example, a region including a region with a high edge density and a region with a low edge density at a predetermined ratio is selected as a template region or a template candidate region.
Proposed below as another aspect for attaining the purposes are a method for creating a template for template matching from design data and an apparatus that realizes the method for creating a template, wherein process information related to a process of semiconductor manufacturing is used to obtain grayscale information of a multi-layer pattern of a region specified by the template.
According to the aspect, a template or a template candidate having high contrast can be extracted from design data. According to the other aspect, a template suitable for each process can be created.
An image processing apparatus illustrated in the embodiments described below relates to a method and an apparatus that detect a specific region based on information obtained from design data to create a multi-valued template. In a specific example, the image processing apparatus detects a specific region based on information on the density of pattern edges obtained from design data to create one or both of binary and multi-valued templates from the design data. An example of creating one or both of binary and multi-valued templates from design data based on such edge density information will also be described.
The present embodiments also describe setting one or both of a coordinate position and a region size of a region used for a template based on information of pattern edges obtained from the design data. An example of setting one or both of a coordinate position and a region size of such a region based on information on the density of pattern edges obtained from the design data will also be described, as will an example of changing one or both of an already set coordinate position and region size based on that information.
Also described is an example of an image processing apparatus including: storage means for storing design data; pattern edge density calculating means for obtaining information based on a density of pattern segments from the design data; template position adjusting means for obtaining a template region based on the information of the density of pattern edges obtained by the pattern edge density calculating means; and template creating means for creating a template based on the information obtained by the template position adjusting means.
An example will also be described of detecting a region that includes both a region with a sparse density of pattern segments and a region with a dense density of pattern segments, based on information on the density of pattern segments obtained from design data, and of setting one or both of a coordinate position and a region size of the region used for a template. An example of displaying information based on a density of pattern edges obtained from design data will also be described, as will an example in which such information is displayed and a user sets a template region.
An example of forming one or both of binary and multi-valued templates will also be described. It is a feature that information based on a density of pattern edges obtained from design data is used to create one or both of binary and multi-valued templates. An example of obtaining pattern edges of each layer from design data and using information based on a density of pattern edges obtained by placing the pattern edges of the layers on top of each other to create one or both of binary and multi-valued templates will also be described.
An example of detecting a region including both a region with a sparse density of pattern segments and a region with a dense density of pattern segments, based on information on the density of pattern segments obtained from design data, to create one or both of binary and multi-valued templates will also be described. An example will also be described in which information based on a density of pattern edges from design data is used and smoothing means performs smoothing of a pattern edge image in creating a multi-valued template.
Further described is an image creating method for using information based on a density of pattern edges obtained from design data to create one or both of binary and multi-valued templates and an image processing program for using information based on a density of pattern edges obtained from design data to create one or both of binary and multi-valued templates.
According to the embodiments, pattern matching can be performed robustly with a high success rate.
In the following embodiments, a method and an apparatus that create a template based on information obtained from process information and design data will be described. In a specific example, design data and process information related to a manufacturing process are used to estimate a step state of a pattern of a region specified by the template to obtain grayscale information of each position in the template. An example in which the user sets process information related to a manufacturing process in creating a template for template matching will also be illustrated. Also described is an example, in which multi-layer patterns corresponding to a manufacturing process are used to divide a region specified by the template into a plurality of regions based on relative positions of patterns between layers of the multi-layer patterns, and process information related to the manufacturing process is used for each region to estimate a step state of the pattern to generate grayscale information of each position of each region.
Also described is an example, in which a plurality of image processing methods for obtaining grayscale information of each position in the template from design data are included, and process information related to a manufacturing process is used to switch output of the plurality of image processing methods. Also described is an example, in which the user can use a plurality of layers of a region specified by the template to change parameters for adjusting grayscale information of patterns of a plurality of regions classified based on relative positions of patterns between the layers.
An example of an image processing program for using information based on design data and process information related to a manufacturing process to create a template will also be described.
According to the embodiments, pattern matching can be performed robustly for each process with a high success rate.
Hereinafter, an example of acquiring density information of a pattern from design data to verify a template or create a template based on the density information will be described with reference to the drawings.
Hereinafter, an apparatus and a measurement inspection system including a pattern matching function for specifying a measuring or inspecting position based on template matching will be described with reference to the drawings. More specifically, an apparatus and a system including a critical dimension-scanning electron microscope (CD-SEM) that is a type of a measuring apparatus, as well as a computer program realized by the apparatus and the system will be described.
In the following description, a charged particle radiation apparatus will be illustrated as an apparatus that forms an image, and an example using an SEM will be described as a mode of the charged particle radiation apparatus. However, the arrangement is not limited to this; for example, a focused ion beam (FIB) apparatus that scans a sample with an ion beam to form an image may be adopted as the charged particle radiation apparatus. However, an extremely high magnification is required for highly accurate measurement of an increasingly miniaturized pattern. Therefore, it is generally desirable to use the SEM, which is superior to the FIB apparatus in terms of resolving power.
The design data is expressed in, for example, a GDS format or an OASIS format and is stored in a predetermined style. The type of the design data can be any type as long as software for displaying the design data can display the format style and as long as the design data can be handled as figure data. The storage medium 2405 may include a control apparatus of a measuring apparatus or an inspecting apparatus, the condition setting apparatus 2403, or the simulator 2404.
Each of the CD-SEM 2401 and the defect inspecting apparatus 2402 includes a control apparatus, and control necessary for the apparatuses is performed. The control apparatuses may include a function of the simulator or a setting function of a measurement condition and the like.
In the SEM, a plurality of stages of lenses focus an electron beam emitted from an electron source, and a scanning deflector scans the sample one-dimensionally or two-dimensionally with the focused electron beam.
A detector detects a secondary electron (SE) or a backscattered electron (BSE) released by the sample as a result of the scan by the electron beam, and the electron is stored in a storage medium, such as a frame memory, in synchronization with the scan by the scanning deflector. A computing apparatus included in the control apparatus integrates image signals stored in the frame memory. The scan by the scanning deflector is possible in arbitrary size, position, and direction.
The control apparatus of each SEM performs the control and the like, and images and signals obtained as a result of the scan by the electron beam are transmitted to the condition setting apparatus 2403 through a communication line network. Although the control apparatus that controls the SEM and the condition setting apparatus 2403 are separate in the description of the present example, the arrangement is not limited to this. The condition setting apparatus 2403 may collectively perform the control of the apparatus and the measuring process, or each control apparatus may perform the control of the SEM and the measuring process.
A program for executing the measuring process is stored in the condition setting apparatus 2403 or the control apparatus, and the measurement or computation is performed according to the program.
The condition setting apparatus 2403 has a function of creating, based on the design data of the semiconductor, a program (recipe) for controlling operation of the SEM, and thus functions as a recipe setting section. Specifically, the condition setting apparatus 2403 creates a program that sets, on design data, contour line data of patterns, and simulated design data, positions and the like for executing processes necessary for the SEM, such as desired measurement points, auto focus, auto stigma, and addressing points, and that automatically controls a sample stage, a deflector, and the like of the SEM based on those settings. Also embedded or stored is a program that causes a dedicated processor, or a general-purpose processor, to extract information of a region serving as a template from the design data and to create a template, described later, based on the extracted information.
When the electron beam 2503 is directed to the sample 2509, an electron 2510, such as a secondary electron and a backscattered electron, is released from the directed part. The released electron 2510 is accelerated in an electron source direction by acceleration effect based on the negative voltage applied to the sample. The electron 2510 collides against a conversion electrode 2512, and a secondary electron 2511 is generated. A detector 2513 captures the secondary electron 2511 released from the conversion electrode 2512, and output I of the detector 2513 changes according to the amount of the captured secondary electron. The brightness of a display apparatus not shown changes according to the output I. For example, to form a two-dimensional image, a deflection signal to the scanning deflectors 2505 and the output I of the detector 2513 are synchronized to form an image of the scanning region. The scanning electron microscope illustrated in
Although the conversion electrode converts the electron released from the sample once and detects it in the example described in
An optical microscope is further mounted on the scanning electron microscope illustrated in
A mode of an image processing apparatus for performing image recognition will be described. The image processing apparatus can be included in the control apparatus 2514, or an embedded computing apparatus can execute image processing. An external computing apparatus (for example, condition setting apparatus 2403) can also execute image processing through a network.
Design data (layout data) corresponding to a pattern of an OM image as a target of image recognition (matching) is stored in a design data storage section 1. A template creating section 2 of an image processing apparatus 5 creates a template image based on the design data corresponding to the pattern of the OM image of the design data storage section 1. In doing so, region information selected by a region selecting section 4 is also used. The design data may be read from the external storage medium 2405.
As illustrated in
A sample is placed on a movable stage, and an optical microscope takes an OM image used in semiconductor inspection or the like. The OM image can be taken at a position corresponding to the design data. However, there is an error in the position movement of the stage, and the corresponding position is displaced to some extent. Therefore, an accurate corresponding position needs to be obtained in a matching process.
An edge density calculating section 3 obtains a density of pattern edges of a region corresponding to the pattern of the OM image based on the information obtained by drawing the design data by the template creating section 2. The region selecting section 4 selects a region that allows acquiring high-contrast sharp edges based on the information of the density of the pattern edges. The template creating section 2 creates a template image from a region suitable for the template selected by the region selecting section 4.
In the OM image, there is a known phenomenon in which a signal value of brightness (brightness value) decreases at a region of steps of a pattern as shown in
An embodiment of the template creating section 2 will be described with reference to
There is a case of a pattern with a plurality of layers (multi-layer) in the drawing. In this case, each layer is similarly drawn based on the design data, and the data is stored in the storage section 212. The drawing section 21 can perform the drawing outside of the template creating section 2 or outside of the image processing apparatus 5. In this case, image data including the drawn design data can be stored in the design data storage section 1. The image creating section 22 of
An embodiment of the edge density calculating section 3 will be described with reference to
A specific example of the operation of the density detecting section 32 will be illustrated. For example, in a region of the edge detection result as shown in
In the case of the multiple layers, a plurality of image data drawn by the drawing section 21 of the template creating section 2 are used for each layer. The edge density calculating section 3 of the present invention will be described with reference to
If the storage section 342 is provided, it can be initialized, for example, with 0 for all entries. It is then possible to use only the edge detecting section 31 to sequentially perform the edge detection of the 2A layer and the 2B layer, with the maximum value selecting section 34 reading the currently stored maximum value of the edges for the comparison. In the detection of the pattern edges, edges of a part where the lower pattern is hidden by the pattern of the upper layer can be removed. Whether the pattern is hidden can be determined from the design data. In this case, the drawing section 21 can remove in advance the pattern section (white) that is covered by the upper layer and cannot be seen, setting it as the outside of the pattern (black).
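The sequential per-layer combination described above can be sketched as follows in NumPy (a minimal illustration; the function name and the assumption that each layer's edges are already detected as equally sized arrays are hypothetical, not part of the embodiment):

```python
import numpy as np

def combine_layer_edges(layer_edge_images):
    """Pixel-wise running maximum over edge images detected layer by layer.

    The accumulator plays the role of the storage section 342, initialized to 0;
    each loop iteration corresponds to one pass of the edge detecting section 31."""
    acc = np.zeros_like(layer_edge_images[0])
    for edges in layer_edge_images:
        acc = np.maximum(acc, edges)  # compare against the currently stored maximum
    return acc
```

Processing one layer at a time this way avoids holding all layers' edge images in memory simultaneously, which matches the single-edge-detector arrangement described above.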
The density detecting section 35 of
Although there are two layers for the density detecting section 32, the same can be applied even if there are more than two layers.
An embodiment of a region selecting section will be described with reference to
A sparse region detecting section 42 detects a region inside the pattern in which the density of pattern edges is low. For example, a region without any pattern edges throughout a specific range can be detected. A dense region detecting section 43 detects a region inside the pattern in which the density of pattern edges is high. For example, all regions other than the regions without any pattern edges throughout a specific range can be treated as regions in which the density of pattern edges is high. The determining section 44 determines whether the data is suitable for the template based on the amount of low-density regions detected by the sparse region detecting section 42 and the amount of high-density regions detected by the dense region detecting section 43. For example, if regions in which the density of pattern edges is low and regions in which it is high are included at a specific ratio relative to the image region, the determining section 44 can determine that the data is suitable for the template; otherwise, it can determine that the data is not suitable. The signal information 4a indicating whether the data is suitable for the template is transmitted to the template creating section 2 along with the coordinate position and the region size of the template region at that time.
The storage section 41 of
For example, there are white (1) and black (0) pixels in a pixel region of 5 pixels by 5 pixels as shown in
Comparing sections 445 and 446 compare the counted values with thresholds 441 and 442, respectively. If both comparison results are greater than the thresholds (the output of each comparing section is “1”), i.e., if the numbers of both the sparse regions and the dense regions are greater than certain specified numbers, the result of the AND 447 is “1”, and it is determined that the data is suitable for the template. If either counted value is smaller than its threshold (the output of the comparing section is “0”), the result of the AND 447 is “0”, and it is determined that the data is not suitable for the template.
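The sparse/dense counting and threshold comparison described above can be sketched as follows (a minimal NumPy illustration; the tile size, threshold values, and function names are assumptions for the sketch, not values fixed by the embodiment):

```python
import numpy as np

def block_densities(edge_map, block=5):
    """Edge density of each non-overlapping block x block tile of a binary edge map,
    analogous to examining 5-by-5 pixel regions of white (1) and black (0)."""
    h, w = edge_map.shape
    h, w = h - h % block, w - w % block          # drop any ragged border
    tiles = edge_map[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))

def is_template_candidate(densities, sparse_thr=0.0, dense_thr=0.0,
                          min_sparse_ratio=0.2, min_dense_ratio=0.2):
    """AND of two threshold comparisons, mirroring comparing sections 445/446
    and the AND 447: the region qualifies only if it contains both sparse and
    dense tiles at the required ratios."""
    sparse = (densities <= sparse_thr).mean()    # fraction of edge-free tiles
    dense = (densities > dense_thr).mean()       # fraction of tiles with edges
    return sparse >= min_sparse_ratio and dense >= min_dense_ratio
```

A region consisting only of edge-free tiles (or only of dense tiles) fails one of the two comparisons, so the AND yields “not suitable”, just as described above.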
As for the region information section 45 in
For the initial value, the template size is set to the smallest value here. Whether the template size is the maximum is then determined. If the template size is not the maximum, the size is increased in S108, and S101 to S103 are carried out again. If the template size is the maximum in S107, for example, information indicating that no effective region exists, shown in S109, can be transmitted to the template creating section 2 or presented to the user in a visible format.
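The size-increasing search of S101 to S109 can be sketched as follows (a hedged illustration; the scan stride, the candidate test passed in as a callable, and the function name are assumptions):

```python
import numpy as np

def search_template_region(edge_map, sizes, stride, is_candidate):
    """Try template sizes from the smallest upward (S101-S108); return the first
    (y, x, size) whose window passes the suitability test, or None (S109)."""
    h, w = edge_map.shape
    for size in sorted(sizes):                       # start from the smallest size
        for y in range(0, h - size + 1, stride):
            for x in range(0, w - size + 1, stride):
                if is_candidate(edge_map[y:y + size, x:x + size]):
                    return y, x, size                # suitable region found
    return None                                      # no effective region (S109)
```

Returning `None` corresponds to reporting that no effective region exists, after which the caller can notify the template creating section 2 or the user.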
The image creating section 22 of the template creating section 2 will be described with reference to
As for the 2A layer and the 2B layer, a maximum value selecting section 236 selects the maximum value in each pixel. Since the inside of the pattern is filled in white, a pixel remains white if it is inside the pattern of either layer. The brightness of the region with the steps of the pattern is reduced, and the region becomes blacker as the steps increase. Therefore, a density calculating section 237 can calculate the density of the pattern, and the brightness value can be estimated based on the density information.
Here, the region is considered to become blacker in proportion to the pattern density. Instead of a simple proportion, a formula derived from empirically obtained information may be used, or values based on the empirically obtained information may be stored in a table. The density calculating section 237 can be realized in the same way as the density detecting section 32 described in
A smoothing section 230 can apply a smoothing process to the image in which the pattern edges remain black after the minimum value selecting section 235, as shown in
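The minimum value selection and smoothing just described can be sketched as follows (a minimal NumPy illustration; the box filter, the function names, and the 0-255 image convention are assumptions rather than the embodiment's actual smoothing section 230):

```python
import numpy as np

def box_smooth(img, k=3):
    """Simple k x k box filter, a stand-in for the smoothing section."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def multivalued_template(pattern_img, edge_img):
    """Keep pattern edges black via a pixel-wise minimum, then smooth.

    pattern_img: drawn pattern image (white inside the pattern).
    edge_img: binary edge image with edges at 255; inverting it makes edges 0,
    so the minimum leaves the edges black, as the minimum value selecting
    section does."""
    merged = np.minimum(pattern_img, 255 - edge_img)
    return box_smooth(merged)
```

The smoothing spreads the dark edge values into neighboring pixels, giving the multi-valued template gradual brightness transitions closer to an actual OM image than a hard binary edge would be.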
Other than the method of using the density information of pattern edges to automatically set the data, the user can set the data while viewing the density information of pattern edges. In this case, a display section 6 can be arranged as shown in
When the user sets a template, if it is recognized in advance that a region including both the sparse region (white) and the non-sparse region (black) of edges is suitable for the template, the user can determine that a region A is not suitable for the template when the user selects the region A, because there is no sparse region (white) of edges. When the user selects B, the user can determine that B is suitable for the template, because B is a region including both the sparse region (white) and the non-sparse region (black) of edges. The coordinate position of the template and the template size set by the user can be set to the region information section 45 of the region selecting section 4. The information can be transmitted to the template region to create a template with the region selected by the user.
The display section 6 can display an instruction for switching a mode of automatically setting the template, a mode of manually setting the template, and the like as described above and can display information indicating whether the current mode is automatic or manual.
Although the image creating section 22 of the template creating section 2 creates a multi-valued template, the created multi-valued template can be compared with a specific threshold to binarize it, thereby creating a binary template.
The image creating section 22 of the template creating section 2 can use a binary signal as output of the sparse region detecting section 42 shown in
All of the multi-valued template, the template obtained by binarizing the multi-valued template, the binary template as output of the sparse region detecting section 42, and the binary template as output of the dense region detecting section 43 can also be created.
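Deriving a binary template from the multi-valued template, assuming the comparison is against a fixed threshold (the threshold value and function name here are illustrative assumptions), can be sketched as:

```python
import numpy as np

def binarize(template, threshold=128):
    """Compare each pixel of a multi-valued template with a threshold to derive
    a binary template (255 at and above the threshold, 0 below it).

    The threshold of 128 is an assumed midpoint for 0-255 images, not a value
    specified by the embodiment."""
    return (template >= threshold).astype(np.uint8) * 255
```

Both forms can then be kept side by side, so that matching can use whichever of the binary and multi-valued templates suits the target image.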
In a pattern synthetic image creating process S400, the drawing images (pattern images) of the layers are synthesized to create a pattern synthetic image. In a multi-valued template creating process S500, the pattern edge multi-valued image and the pattern synthetic image are synthesized.
To obtain the synthetic image of pattern edges in the case of the multiple layers, edges of a part where the lower pattern is hidden by the pattern on the upper layer can be removed. In this case, whether the pattern is hidden can be obtained from the design data.
The process of the image processing apparatus of the present invention may be executed as software, in which case a personal computer may execute it, or the process may be incorporated into an LSI and executed as hardware.
The pattern density determination method (edge density determination method) can be applied to a recipe verifying method for verifying a created recipe, an assist function for assisting the operator in creating the template, and an automatic template creating method. Hereinafter, a specific application method of the pattern density determination method will be described.
According to the configuration, the suitability of the recipe can be verified without the actual operation of the apparatus.
An assist function for assisting the operator in the creation of the template and an automatic template creating method will be described with reference to a flow chart of
In step 2704, for the region selected as the region with predetermined contrast, the degree of coincidence with other regions is determined for each region of the same size as the template (hereinafter, “first region”) (step 2705). Here, the degree of coincidence of the first region with another region (for example, a region of the same size at a position displaced by one or more pixels from the first region) is obtained within the selected region based on an autocorrelation method or the like. Determining the degree of coincidence over the entire selected region can establish whether the first region is a region to be selected as a template.
If the degree of coincidence between the first region and other regions (a plurality of other regions displaced by one or more pixels from the first region) is low, the first region is a region including a pattern or the like having a unique shape relative to the other regions, and it can be stated that the first region is a region to be selected as a template. On the other hand, if the degree of coincidence is high, another region that should not be identified in the matching may be identified as a matching position in the actual template matching, and it can be stated that the first region is a region prone to a matching error. Therefore, for example, a first region in which the degree of coincidence with the other regions is equal to or smaller than a predetermined value or a first region in which the degree of coincidence with the other regions is the lowest is extracted, and the extracted region is output as a template candidate (steps 2706 and 2707). In step 2706, the comparison is performed based on a result in which the degree of coincidence with the other regions is the highest among the results of the determination of the degree of coincidence between the first region and the plurality of other regions. A predetermined number of regions with the lowest degrees of coincidence may be extracted and output as template candidates.
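The autocorrelation-based uniqueness check of steps 2705 to 2707 can be sketched as follows (a hedged illustration; the normalized-correlation score, the shift range, and the function name are assumptions — a low score means the first region is unique and thus a good template candidate):

```python
import numpy as np

def uniqueness_score(image, y, x, size, max_shift=8):
    """Highest normalized correlation between the first region at (y, x) and
    regions displaced by one or more pixels; high scores flag regions prone
    to matching errors, low scores flag good template candidates."""
    ref = image[y:y + size, x:x + size].astype(float)
    ref = (ref - ref.mean()) / (ref.std() + 1e-9)    # zero-mean, unit-variance
    best = -1.0
    h, w = image.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            if dy == 0 and dx == 0:
                continue                              # skip the region itself
            yy, xx = y + dy, x + dx
            if 0 <= yy and yy + size <= h and 0 <= xx and xx + size <= w:
                other = image[yy:yy + size, xx:xx + size].astype(float)
                other = (other - other.mean()) / (other.std() + 1e-9)
                best = max(best, (ref * other).mean())
    return best
```

A periodic pattern scores near 1 (some displaced region coincides with it almost exactly), so it would be rejected as a template candidate, matching the reasoning above.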
According to the configuration, the region to be selected as a template can be narrowed down based on the contrast information not expressed in the design data and based on the AND condition of the information obtained from the design data. In the example, since the region subject to autocorrelation is narrowed down based on the density information of pattern edges, the configuration is also effective in reducing the processes.
The present example focuses primarily on the extraction of the region to be selected as a template based on the AND condition of the contrast information (pattern density information) not directly expressed in the design data and the information that can be directly extracted from the design data. Therefore, for example, a part having a predetermined pattern shape may be selectively extracted to obtain AND with the contrast information to narrow down the template candidates. The shape information (layout data) of the pattern is information that can be directly extracted from the design data. Therefore, if the template pattern suitable for the matching is empirically recognized, the template candidates may be narrowed down based on input of the information.
In the OM image, there is a phenomenon in which the signal value of brightness (brightness value) decreases as illustrated in
Since semiconductor devices nowadays have multi-layer structures, there may be several tens of manufacturing processes. Along with miniaturization, the manufacturing processes include processes for planarizing the wafer, such as reflow and CMP (polishing).
As illustrated in
When the appearance of the OM image varies depending on the manufacturing process, a template creating section 2802 that forms part of the image processing apparatus illustrated in
An embodiment of the region dividing section 2901 will be described with reference to
Based on the edge detection result of the lower-layer edge detecting section 3003 and the pattern drawing image of the upper layer (Nth layer) stored in the upper-layer drawing image storage section 3004, an overlap edge detecting section 3005 detects pattern edges of the lower layer that overlap the upper-layer pattern and stores the pattern edges in the region dividing result storage section 3006.
Examples of the pattern drawing, the edge detection result, and the overlap edge detection result will be illustrated with reference to
The region dividing result storage section 3006 stores the edge region (c) of the upper layer, the edge region (f) of the lower layer overlapping the upper layer, and a region (g) obtained by deleting the edge region (f) overlapping the upper layer from the edge region (d) of the lower layer. The pattern drawing section 3001 included in the region dividing section 2901 illustrated in
The upper-layer edge detecting section 3002 and the lower-layer edge detecting section 3003 can be realized by a differential filter, such as a Sobel filter or a Laplacian filter. For example, binarization (the Otsu method or the like) may be performed after applying the differential filter, with the edge part set to white “255” and the non-edge part set to “0”. The upper-layer drawing image storage section 3004 can be realized by a memory. The overlap edge detecting section 3005 can be realized by an AND circuit that outputs the AND of the output of the lower-layer edge detecting section 3003 and the output of the upper-layer drawing image storage section 3004. For example, AND of (d) and (a) of
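The differential filtering, binarization, and AND-circuit overlap detection can be sketched as follows. This is an illustrative Python sketch, not the circuit realization: a fixed threshold stands in for the Otsu method for brevity, and the function names are hypothetical.

```python
import numpy as np

def sobel_edges(img, thresh=128):
    """Sobel differential filter followed by binarization:
    edge pixels -> white 255, non-edge pixels -> 0.
    (A fixed threshold is used here in place of the Otsu method.)"""
    f = img.astype(float)
    p = np.pad(f, 1, mode="edge")
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = f.shape
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    for i in range(3):          # 3x3 convolution by shifted windows
        for j in range(3):
            win = p[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    mag = np.hypot(gx, gy)
    return np.where(mag > thresh, 255, 0).astype(np.uint8)

def overlap_edges(lower_edges, upper_drawing):
    """AND circuit: keep lower-layer edges that fall inside the
    upper-layer drawing (white 255 = pattern present)."""
    both = (lower_edges == 255) & (upper_drawing == 255)
    return np.where(both, 255, 0).astype(np.uint8)
```

Applying `sobel_edges` to each layer's drawing image and then `overlap_edges` reproduces the role of sections 3002/3003 and 3005 described above.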
The output values are stored in the grayscale correction storage section 3304, which can be realized by a memory. Correction values of the upper-layer step estimating section 3301, the lower-layer step estimating section 3302, and the overlap part step estimating section 3303 are stored as correction values corresponding to the pixel positions of the upper-layer edge part, the lower-layer edge part, and the overlap part obtained in the region dividing section 2901. Although the planarizing process is applied to the pattern of the upper layer in this case, for example, an interlayer insulating film can be placed after the formation of the lower-layer pattern, and the upper-layer pattern can be placed after planarization of the interlayer insulating film. In that case, the inside of the upper-layer pattern is also flat, and the pattern of the lower layer overlapping the upper-layer pattern cannot be seen. Accordingly, the process information to be used needs to include not only information of the current process or the previous process in the semiconductor manufacturing targeted by the matching, but also information of prior processes. For example, process information of prior planarizing processes can be traced back, and a series of process information can be used.
After the planarizing process, a pattern that overlaps the lower layer when placed can be ignored, and it can be considered that the pattern on the lower layer of an overlapping layer has an influence when a pattern is overlapped subsequently. When the pattern edges overlap at a position with the same pattern in the design data, the steps of the patterns of the two layers are piled up; the steps become larger, and the brightness can decrease further. In this case, the correction value can be doubled, for example. Information of the film thickness of the layers can be used to accurately obtain the size of the steps of the pattern. Apart from the standpoint of the steps of the pattern, if the film thickness of the interlayer insulating film placed over the upper layer is large, the contrast of the upper-layer and lower-layer patterns decreases. Therefore, the information of the film thickness of the layers can be used to correct the contrast. In this case, not only the step state but also the appearance of the pattern can be estimated.
The interlayer insulating film is smooth depending on the material, and the flatness increases. Therefore, material information can be used to obtain the steps of the pattern. In this case, information of the film thickness of the layer can be included in the process information. A series of process names and details of the corresponding processes can be included in the process information. The step state of the pattern based on various processes, such as creation and etching of a resist pattern and photoresist removal, can be obtained, and the grayscale of each pixel can be obtained based on the step state to create the template. The user may be able to use a GUI or the like to simply set the process information related to the manufacturing process.
For the correction values of the upper-layer step estimating section 3301, the lower-layer step estimating section 3302, and the overlap part step estimating section 3303 corresponding to the process information, changes in the steps of the patterns of the regions (upper layer, lower layer, and overlap part) for each process name and process detail can be checked in advance, and the correction values corresponding to the changes can be included in a table. The ratio of the synthesis of the grayscales of the pattern edges is determined based on the correction values. The correction values may also be determined from the appearances of the OM images of the regions (upper layer, lower layer, and overlap part) for each process name and process detail. A commercially available simulator can be used in advance to calculate the changes in the steps of the patterns of the regions (upper layer, lower layer, and overlap part) based on the process names and process details, and the changes can be included in the conversion tables of the upper-layer step estimating section 3301, the lower-layer step estimating section 3302, and the overlap part step estimating section 3303. Formulas used in the simulator and the like can also be included so that the correction values are obtained by computation.
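The table lookup described above can be sketched as a simple keyed table. All process names, region names, and numeric values below are placeholders for illustration only, not values from the specification.

```python
# Hypothetical table: (process name, region) -> grayscale correction value [%].
# Names and numbers are illustrative placeholders.
CORRECTION_TABLE = {
    ("CMP",  "upper"):   0,    # planarized: little or no step remains
    ("CMP",  "lower"):   0,
    ("CMP",  "overlap"): 0,
    ("etch", "upper"):   100,  # full step expected after etching
    ("etch", "lower"):   60,
    ("etch", "overlap"): 100,
}

def correction_value(process, region, default=100):
    """Look up the correction value for a process/region pair;
    fall back to `default` when the pair is not tabulated."""
    return CORRECTION_TABLE.get((process, region), default)
```

A simulator-derived formula could replace the static table by computing the value inside `correction_value` instead of reading it from `CORRECTION_TABLE`.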
The grayscale information creating section 2804 will be described with reference to
The minimum value selecting section 3407 can be realized by the comparing section 341 and the storage section 342 as illustrated in
If the storage section 342 is included, for example, “0” can initially be stored everywhere in the storage section 342. Edge detection can be performed sequentially not only for the upper layer and the lower layer, but also for the layers below. The already stored current value of the edges can be read for the comparison by the minimum value selecting section 3407. In the detection of the pattern edges, the edges of a part where the lower pattern is hidden by the pattern of the upper layer can be removed. Whether the pattern is hidden can be obtained from the design data. In this case, the drawing sections 3401 and 3402 can exclude the pattern section (white) that overlaps the upper layer and cannot be seen, setting that section to the outside of the pattern (black). Meanwhile, a maximum value selecting section 3408 selects the maximum value of the patterns of the upper layer and the lower layer for each pixel. Since the inside of a pattern is painted out in white “255”, the pixel remains white “255” if the pixel is inside the pattern of either layer.
The maximum value selecting section 3408 can be realized in the same way as the minimum value selecting section 3407; the difference is that the values after the edge detection of the layers are compared and the larger one is selected.
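The per-pixel minimum and maximum selection over the layers can be sketched as a fold with `np.minimum` or `np.maximum`. This is an illustrative Python sketch of the comparing section plus storage section; the function name is hypothetical.

```python
import numpy as np

def select_per_pixel(layers, op):
    """Fold a per-pixel selection (np.minimum or np.maximum) over a
    list of layer images, mimicking a comparing section (op) that
    updates a storage section (acc)."""
    acc = np.asarray(layers[0], dtype=np.uint8).copy()
    for img in layers[1:]:
        acc = op(acc, np.asarray(img, dtype=np.uint8))
    return acc
```

For pattern drawings painted out in white “255”, `select_per_pixel(drawings, np.maximum)` keeps a pixel white if it lies inside the pattern of any layer, as described above for section 3408.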
The brightness of a region with pattern steps decreases, becoming closer to black as the steps increase. Therefore, a density calculating section 3409 can calculate the pattern density to estimate the brightness value based on the density information. It is assumed here that the brightness becomes darker in proportion to the pattern density. Instead of a simple proportion, a formula calculated from empirically obtained information may be used, or values based on the empirically obtained information may be included in a table.
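The density calculation and the darkening in proportion to density can be sketched as a local window average followed by an inversion. This is an illustrative sketch under the simple-proportion assumption stated above; the function name and window size are hypothetical.

```python
import numpy as np

def density_gray(drawing, win=5):
    """Local pattern density (fraction of white 255 pixels in a
    win x win neighbourhood), inverted so that denser regions
    become darker (closer to black)."""
    binary = (drawing == 255).astype(float)
    pad = win // 2
    p = np.pad(binary, pad, mode="edge")
    h, w = binary.shape
    dens = np.zeros_like(binary)
    for i in range(win):        # box sum via shifted windows
        for j in range(win):
            dens += p[i:i + h, j:j + w]
    dens /= win * win           # density in [0, 1]
    # inversion: density 1 -> gray 0 (black), density 0 -> gray 255 (white)
    return (255 * (1.0 - dens)).astype(np.uint8)
```

An empirically derived formula or lookup table could replace the linear inversion in the last line.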
In the region of the result of the minimum value selecting section as shown in
A signal inverting section 3410 inverts the density information obtained by the density calculating section 3409 so that pixels become darker in proportion to the pattern density. A synthesizing section 3411 synthesizes the values. In the synthesis, the values can be combined at a specific ratio, and the ratio of the synthesis is adjusted by the grayscale correction value from the step estimating section 2803. For example, when the grayscale correction value for a region estimated to have steps is “100” (100%) and the base ratio for synthesizing the gray values obtained from the pattern density is 60%, the gray values are synthesized at 60% (60% multiplied by 1.0). When the grayscale correction value for a region estimated not to have steps is “0” (0%), the gray values are synthesized at 0% (60% multiplied by 0.0).
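The worked example above (a 60% base ratio scaled by the correction value) can be sketched as a blend. This is an illustrative sketch; the function name and parameters are hypothetical.

```python
import numpy as np

def synthesize(edge_gray, density_gray, base_ratio=0.6, correction_pct=100):
    """Blend the edge grayscale with the inverted-density grayscale.
    The effective ratio is base_ratio scaled by the step-estimation
    correction value: 100 % -> full base ratio, 0 % -> edges only."""
    r = base_ratio * (correction_pct / 100.0)
    return (1.0 - r) * edge_gray + r * density_gray
```

With `correction_pct=100` the density grayscale contributes at the full 60%; with `correction_pct=0` the edge image passes through unchanged, matching the behaviour described for a region estimated not to have steps.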
More specifically, the image selected and created by the maximum value selecting section 3408 is output without change. The information of the film thickness of a layer, such as an interlayer insulating film, can be used to change the ratio. For example, if the overall contrast decreases when the film thickness of the layer, such as the interlayer insulating film, is large, the synthesis ratio can be reduced for the upper-layer edges, the lower-layer edges, the overlap part edges, and so on. There is also a method for creating a template in which the result of the step estimating section 2803 is used to switch between the output of the image processing system that synthesizes the gray values obtained from the pattern density and the output of another image processing system that does not use those gray values (i.e., the synthesis ratio is 0%). The step state of the upper layer, the step state of the lower layer, and the step state of the overlap part can also be obtained externally, and the obtained result can be set.
For example, when the user sets the state, the user uses a display section 3601 to set the step state of the upper layer, the step state of the lower layer, and the step state of the overlap part in the process as shown in
Although there are two layers here, the same can be applied even if there are more than two layers.
In a region dividing process S200, each pattern is divided into a plurality of regions based on the design data. The patterns of an upper layer and a lower layer are used to obtain upper-layer pattern edges, lower-layer pattern edges, and lower-layer pattern edges overlapping the upper-layer pattern. Here, the content described in
In a grayscale information creating process S400, the grayscale correction value of each region obtained in the grayscale correction calculating process S300 is used to create grayscale information of each pixel of the template. Specifically, the content described in
Although the template creating system has been described, the system can be used to create an image processing apparatus. A semiconductor inspecting apparatus including the image processing apparatus may also be formed.
The software process may be executed on a personal computer, or it may be incorporated into an LSI and executed as a hardware process.
An example will be described in which a table of relationships between pattern conditions (classification of patterns), manufacturing process information, and image processing conditions is formed in advance, and an image processing condition corresponding to the region or manufacturing process used in the OM image matching is read and used to create a template.
If the OM template includes a plurality of regions with different pattern conditions, different image processing may be applied to each of the different pattern condition regions as illustrated in
In the present embodiment, the pattern density is also an index of the reduction in brightness or contrast. Therefore, parameters that significantly affect the change in brightness or contrast (for example, statistics of intervals between patterns, distances between segments included in the selected region, distances between adjacent closed figures, and distances between a plurality of closed figures) may be defined as the pattern density.
Information related to the manufacturing process for performing OM matching is input (step 4303), and the image processing conditions are searched based on the input information and the pattern classification information (step 4304). In the table illustrated in
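The two-table search described above can be sketched as a pair of dictionary lookups. All table keys and values below are hypothetical placeholders for illustration; the specification's actual tables are the ones referenced in the figures.

```python
# Hypothetical two-table scheme:
# table 1 maps a pattern condition to a pattern classification,
# table 2 maps (classification, process) to an image processing condition.
PATTERN_CLASS = {
    "dense_line": "class_A",
    "isolated_hole": "class_B",
}
IMAGE_PROC_CONDITIONS = {
    ("class_A", "CMP"): "smoothing_strong",
    ("class_B", "etch"): "edge_enhance",
}

def find_condition(pattern_condition, process):
    """Search the image processing condition from the pattern
    classification and the manufacturing process information.
    Returns None when no condition is registered."""
    cls = PATTERN_CLASS.get(pattern_condition)
    return IMAGE_PROC_CONDITIONS.get((cls, process))
```

The returned condition would then be registered in the storage section as part of the measurement and inspection recipe, as in step 4305.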
Although the example of storing two types of image processing conditions is described in the present embodiment, the arrangement is not limited to this. Any type of image processing method that approximates the design data to the OM image can be used, and one or more types of processing conditions can be stored. The example of using two tables, including the table that associates and stores the pattern classification and the pattern conditions (
In this way, the searched image processing conditions are registered in the storage section 4103 as a recipe for measurement and inspection (step 4305).
According to the configuration, the template region can be set on the pattern shape information obtained from the design data, simulation, or the like to appropriately form the template for OM matching.
Although the example of acquiring the formation conditions of the template based on the design data or the simulation image has been mainly described, if the relationship between the OM image obtained after one manufacturing process and the OM image obtained after another manufacturing process is clear, the OM image obtained after the one process can be used to create the template for OM matching used for measurement and inspection after the other process.
According to the configuration, images acquired after different manufacturing processes can be used to create a template, and an effort for creating the template can be reduced.
Number | Date | Country | Kind |
---|---|---|---
2010-147051 | Jun 2010 | JP | national |
2011-008374 | Jan 2011 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---
PCT/JP2011/002660 | 5/13/2011 | WO | 00 | 3/25/2013 |