The present invention relates to a method and an apparatus for combining a plurality of pictures to form a composite picture, and more particularly to a composite picture forming method and a picture forming apparatus for realizing highly accurate coupling between the plurality of pictures.
Recently, semiconductor devices have become increasingly fine in order to improve device performance and reduce manufacturing cost. For the purpose of evaluating the finish of a fine pattern of a semiconductor device, semiconductor measurement and inspection devices equipped with a critical-dimension scanning electron microscope (CD-SEM) or the like have been used. These devices image a pattern at a magnification as high as several tens of thousands to hundreds of thousands of times and process the acquired picture, so that a fine pattern shape can be evaluated with high accuracy.
However, owing to restrictions of the SEM configuration, the visual field becomes small when the imaging magnification of the SEM is set high. When the pattern domain to be evaluated extends over a wide range, an effective method is therefore to divide the pattern domain into a plurality of sub-domains, image each of them, and thereafter combine the pictures (make them panoramic), as disclosed in JP-A-61-22549 and JP-A-07-130319. However, since the pattern of an electronic device consists mostly of monotonous straight lines, a plurality of matching points may appear when determining the composite position between pictures, and the composite position therefore sometimes cannot be specified.
JP-A-2009-157543 discloses a technique in which pattern design data is analyzed prior to imaging, and the pattern is imaged so that a pattern which makes it easy to specify the composite position of the picture falls within the overlapping domain between the pictures to be combined, thereby enhancing the ease of matching and enabling panoramic composition of semiconductor circuit patterns. There is also a method in which picture positions are optimized while adjacent pictures are fitted together to perform the panoramic composition, as disclosed in JP-A-2001-512252.
However, even in the case disclosed in JP-A-2009-157543, the composition sometimes cannot be realized, depending on the pattern shape, since the pattern shape in the design data differs from the actually manufactured pattern shape. Further, as disclosed in JP-A-2001-512252, even when the picture position is optimized while adjacent pictures are fitted together, the picture position cannot be specified, because a monotonous straight-line pattern can be fitted anywhere along the direction of the line. Furthermore, even if the picture position is specified while all of the pictures are fitted together as a whole, the composition must be repeated many times, which is impractical because processing such a number of compositions takes a great deal of time.
An object of the invention is to provide a composite picture forming method and a picture forming apparatus capable of appropriately forming a composite picture in accordance with the pattern condition of the overlapping domain.
In order to achieve the above-mentioned object, according to an aspect of the invention, there are proposed a composite picture forming method and a picture forming apparatus for combining a plurality of pictures to form a composite picture, in which pattern attribute information contained in an overlapping domain between the plurality of pictures is formed, the pictures to be targeted for composition are selected on the basis of the pattern attribute information, and picture composition is performed on the basis of the selected pictures.
According to another aspect of the invention, there are proposed the composite picture forming method and the picture forming apparatus in which the picture composition is performed in plural stages by using the pattern attribute information of the overlapping domain.
According to still another aspect of the invention, there are proposed the composite picture forming method and the picture forming apparatus in which the composition is selectively performed between pictures whose overlapping domain contains pattern attribute information satisfying a predetermined condition, either a plurality of composite pictures or one composite picture is thereby formed, and the composition is then performed between the plurality of composite pictures, or between a composite picture and a picture which has not yet been combined.
According to the above-mentioned configuration, the picture composition can be performed appropriately, since each overlapping domain is evaluated for its suitability for composition before the picture composition is performed.
The other objects, features and advantages of the invention will become apparent from the following description of the embodiments of the invention taken in conjunction with the accompanying drawings.
Hereinafter, a composite picture forming method for pictures having an electronic device pattern acquired from an imaging device such as an SEM will be described. The method uses design data and picture data for the overlapping domains of the targeted picture data, includes a step of forming attribute information of the pattern contained in each overlapping domain, and uses the formed pattern attribute information to select the pictures to be combined from the targeted picture data and perform the picture composition.
Also described, in the composite picture forming method for pictures having the electronic device pattern acquired from the imaging device, is a method which uses the design data and the picture data for all of the overlapping domains of the targeted picture data, includes the step of forming the attribute information of the pattern contained in each overlapping domain, and uses the formed pattern attribute information to perform a step of selecting the pictures to be combined from the targeted picture data and performing the picture composition, and a step of detecting and selecting a combination of specified pattern attribute information so as to perform the picture composition for a plurality of overlapping domains of the targeted composite picture.
A composite picture forming apparatus for pictures having the electronic device pattern acquired from the imaging device will be described as including: a picture storing unit that stores plural pieces of divided picture data acquired by imaging the electronic device pattern while varying the imaged position; a pattern attribute forming unit that forms pattern information contained between the pictures by using the design data of the electronic device pattern and the picture data stored in the picture storing unit; a composite selection unit that selects the pictures to be combined in accordance with the information from the pattern attribute forming unit; and a composite unit that selectively performs a composite process for the pictures selected by the composite selection unit.
The composite picture forming apparatus for pictures having the electronic device pattern acquired from the imaging device will further be described as including: the picture storing unit that stores the plural pieces of divided picture data acquired by imaging the electronic device pattern while varying the imaged position; the pattern attribute forming unit that forms the pattern information contained between the pictures by using the design data of the electronic device pattern and the picture data stored in the picture storing unit; the composite selection unit that selects the pictures to be combined in accordance with the information from the pattern attribute forming unit; the composite unit that selectively performs the composite process for the pictures selected by the composite selection unit; and a position specifying composite unit that selectively specifies picture positions for the pictures selected by the composite selection unit, by using the pictures combined in the composite unit, and combines them.
An example will be described in which the pictures processed for composition by the composite unit differ from those processed by the position specifying composite unit.
An example that is clearly understandable for a user will also be described, in which a display device is provided for displaying the pictures selected by the composite selection unit, the composite picture at each composition step, and the pictures that were combined or not combined as a result of the composition.
In the above-mentioned composite picture forming method, before the composition, the pattern attribute contained in the overlapping domains among all of the pictures targeted for the composition is detected by using the design data and the picture data stored in a picture memory, to form pattern attribute data for each overlapping domain. The pattern attribute is divided into: a pattern for which the composite position is determined in both the x-direction and the y-direction; a pattern for which the composite position is determined in the x-direction alone; a pattern for which the composite position is determined in the y-direction alone; and a pattern for which the composite position is determined in neither the x- nor the y-direction.
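The formation of per-domain attribute data can be sketched as follows, assuming a simple grid arrangement of pictures; the names `pictures` and `classify`, and the grid data model, are hypothetical illustrations rather than part of the embodiment.

```python
def build_attribute_table(pictures, classify):
    """Build a pattern-attribute entry for every overlapping domain before any
    composition is attempted. `pictures` maps a picture id to its grid position
    (row, col); `classify` returns the attribute of the overlap between two
    adjacent pictures. Both names are hypothetical."""
    table = {}
    for a, (ra, ca) in pictures.items():
        for b, (rb, cb) in pictures.items():
            # Pictures overlapping in the grid are the 4-neighbours.
            if a < b and abs(ra - rb) + abs(ca - cb) == 1:
                table[(a, b)] = classify(a, b)
    return table
```

Every adjacent pair thus receives one of the four attributes before any matching is attempted, so the later stages can consult the table instead of re-examining pixels.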
First, by using the pattern attribute of each overlapping domain, only the pictures whose overlapping domains determine the composite position in both the x- and y-directions are selected and combined. The composition of the pictures in the overlapping domains for which the composite position is determined in both directions is performed, and the initial composition is thereby finished. After the initial composition, all of the pictures may have been combined into one, depending on the pattern attributes; however, there may also be a plurality of composite pictures each combining one or more pictures, and there may further be single pictures that could not be combined with any adjacent picture.
When all of the pictures cannot be combined into one by the initial composition, whether further pictures can be combined is detected by using the pattern attributes of the composite pictures, including single pictures, remaining after the initial composition.
In the initial composition, the pattern attribute is detected with attention focused on each individual overlapping domain. In the detection after the initial composition, the pattern attribute is detected by using two overlapping domains together: one is required to have a pattern attribute for which the position is determined in the x-direction, and the other a pattern attribute for which the position is determined in the y-direction. Viewed between composite pictures (including adjacent single pictures), a composite picture has a plurality of overlapping domains. If pattern attributes determining the position in the x-direction and in the y-direction are both contained among the plurality of overlapping domains, it is judged that the composition between those composite pictures (including single pictures) can be performed. The composition is then performed only between the composite pictures (including single pictures) that can be combined. Since a composite picture obtained after composition again has a plurality of overlapping domains, the pattern attributes determining the positions in both the x- and y-directions may again be contained among them. To this end, the detection of pattern attributes using the plurality of overlapping domains is performed repeatedly after the initial composition, and the composition is likewise performed repeatedly only between the composite pictures (including single pictures) thus detected. The composite process is finished when no combination of overlapping domains containing pattern attributes determining the positions in both the x- and y-directions remains between any of the composite pictures, including single pictures.
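The repeated detection and merging described above can be sketched as follows. The data model is an illustrative assumption: components are sets of picture ids (composite pictures or single pictures), and each overlapping domain is reduced to the set of axes its pattern attribute can determine.

```python
def merge_rounds(components, overlaps):
    """components: iterable of sets of picture ids (composite/single pictures).
    overlaps: dict mapping a pair (id_a, id_b) to the set of axes {'x','y'} that
    the pattern attribute of that overlapping domain can determine.
    Repeatedly merge two components when, taken over all overlapping domains
    between them, both the x- and the y-position can be determined."""
    components = [set(c) for c in components]
    merged = True
    while merged:
        merged = False
        for i in range(len(components)):
            for j in range(i + 1, len(components)):
                axes = set()
                for (a, b), det in overlaps.items():
                    if (a in components[i] and b in components[j]) or \
                       (a in components[j] and b in components[i]):
                        axes |= det
                if {'x', 'y'} <= axes:  # positions fixed in both directions
                    components[i] |= components[j]
                    del components[j]
                    merged = True
                    break
            if merged:
                break
    return components
```

The loop terminates exactly as the text describes: when no pair of remaining components shares overlapping domains that jointly fix both directions.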
The above-mentioned process can be performed at high speed, since the composition is performed only for the detected pattern attributes, unlike a technique in which overall consistency must be considered. Further, pictures whose composite positions can be clearly specified are discriminated by using the pattern attributes, and the pictures are processed in order of how easily their composite positions can be specified, so that more pictures can be combined by using the information of the plurality of overlapping domains of the composite pictures which have already been combined.
Hereinafter, the following description will be concerned with an apparatus for forming the composite picture, a system, and a computer program (or a storage medium storing the computer program) executed in the apparatus or system, with reference to the drawings. More specifically, the description will be concerned with an apparatus including a critical-dimension scanning electron microscope (CD-SEM), which is one type of measuring device, with the system, and with the computer program executed thereby.
The composite picture forming method described below is applicable not only to a device for measuring a pattern size, but also to a device for inspecting pattern defects. In the following description, a charged particle radiation device is shown as the device for forming pictures, and the SEM will be described as an example of such a device; however, the invention is not limited to these devices. A focused ion beam (FIB) device, which forms a picture of a sample by scanning an ion beam, may also be employed as the charged particle radiation device. In this regard, since an extremely high magnification is demanded for measuring fine patterns with high accuracy, it is generally desirable to use the SEM rather than the FIB device in consideration of resolution.
Control units 2004, 2005 are connected to the SEMs 2002, 2003, respectively, to control the SEMs as required. In each SEM, an electron beam emitted from an electron source is focused by multiple lenses, and the focused electron beam is scanned one- or two-dimensionally over the sample by a scan deflection device.
Secondary electrons (SE) and backscattered electrons (BSE) emitted from the sample by the scanning of the electron beam are detected by a detector and, in synchronization with the scanning of the scan deflection device, stored in a storage medium such as a frame memory. The picture signals stored in the frame memory are accumulated by a computation device incorporated in the controllers 2004, 2005. The scanning by the scan deflection device can be performed with any size, position and direction.
The above-mentioned controls and the like are performed in the controllers 2004, 2005 of the SEMs. The picture and signal acquired as a result of scanning the electron beam are sent to the data management unit 2001 via communication lines 2006, 2007. In this embodiment, the controllers 2004, 2005 for controlling the SEMs are provided separately from the data management unit 2001, which performs measurement on the basis of the signals acquired from the SEMs; however, the invention is not limited to this configuration. The control and the measuring process for the SEMs may be performed collectively in the data management unit 2001, or the controllers 2004, 2005 may perform both the control and the measuring process for the SEMs.
The above-mentioned data management unit 2001 or controllers 2004, 2005 store a computer program for executing the measuring process, and perform the measurement and computation in accordance with the computer program. Further, the data management unit 2001 stores design data for the photomask (sometimes simply referred to as a mask) and the wafer used in the semiconductor manufacturing steps. The design data is represented in, for example, the GDS format or the OASIS format, and is stored in a predetermined format. Any type of design data is acceptable as long as the software that handles the design data can interpret its format and treat it as graphic data. Further, the design data may be stored in advance in a storage medium provided separately from the data management unit 2001.
The data management unit 2001 provides a function for forming a computer program (recipe) that controls the operation of the SEM on the basis of the design data of the semiconductor device; this function acts as a recipe setting unit. Specifically, positions for performing processes necessary for the SEM, such as desired measurement points, autofocus, autostigma and addressing points, are set on the design data, the contour data of the pattern, or simulation-applied design data, and on the basis of these settings a computer program is written for automatically controlling the sample stage of the SEM, the scan deflection device, and so on. In addition, the template matching method using a reference picture referred to as a template is a technique in which the template is moved within a search area to search for a desired place, and a place having the highest degree of coincidence with the template, or a degree of coincidence equal to or greater than a predetermined value, is specified in the search area. The controllers 2004, 2005 perform pattern matching on the basis of a template registered as part of the recipe information, and store a computer program for executing the pattern matching by a normalized correlation method or the like.
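The normalized correlation matching mentioned above can be sketched as follows. This is a brute-force illustration under assumed names; practical implementations typically use an FFT-based or library routine (e.g. OpenCV's matchTemplate with a normalized correlation mode) rather than explicit loops.

```python
import numpy as np

def normalized_correlation(search: np.ndarray, template: np.ndarray):
    """Slide the template over the search area and return the position of the
    highest normalized correlation coefficient, i.e. the place of highest
    coincidence with the template."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    best, best_pos = -2.0, (0, 0)
    for y in range(search.shape[0] - th + 1):
        for x in range(search.shape[1] - tw + 1):
            w = search[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = tn * np.sqrt((wz * wz).sum())
            score = (t * wz).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

The score is invariant to brightness and contrast offsets, which is why normalized correlation is a common choice for SEM pictures whose intensity varies between acquisitions.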
A computation device 2009 is provided in the data management unit 2001 and the computation device 2009 can form composite pictures.
In addition, a focused ion beam unit for irradiating the sample with helium ions, liquid metal ions or the like may be connected to the data management unit 2001. A simulator 2008 may also be connected to the data management unit 2001 for simulating the pattern finish on the basis of the design data; a simulation picture acquired from the simulator 2008 may be converted into the GDS format and used instead of the design data.
The electron beam 2103 is irradiated onto the sample 2109, whereby electrons 2110, such as secondary electrons and backscattered electrons, are emitted from the irradiated place. The emitted electrons 2110 are accelerated toward the electron source by the accelerating action based on the negative voltage applied to the sample, collide with a transform electrode 2112, and generate secondary electrons 2111. The secondary electrons 2111 emitted from the transform electrode 2112 are captured by a detector 2113, and the output I of the detector 2113 varies in response to the amount of captured secondary electrons. The luminance of a display unit (not shown) varies in response to the output I. For example, when a two-dimensional image is formed, a picture of the scanning domain is formed by synchronizing the deflection signal supplied to the scan deflection device 2105 with the output I of the detector 2113. The scanning electron microscope shown in
In addition,
The controller 2004 provides a function for controlling the parts of the scanning electron microscope and forming pictures on the basis of the detected electrons, and also provides a function for measuring the pattern width of a pattern formed on the sample on the basis of the detected electron intensity distribution, referred to as a line profile. The composite picture forming described later may be performed in the data management unit 2001 or in the controllers 2004, 2005. Further, a part of the processing necessary for the composite picture forming may be performed in one computation device and the rest in another computation device. Furthermore, the partial pictures to be processed into the composite picture may be registered in advance in a frame memory incorporated in the controllers 2004, 2005, or stored in a memory incorporated in the data management unit 2001 or in an externally installed storage medium.
Hereinafter, a composite picture forming method, and a picture forming apparatus therefor, will be described on the basis of pictures imaged by an imaging device such as the charged particle radiation device.
First, the design data of the electronic device pattern stored in the design data storing unit 1 is entered into a pattern attribute data forming unit 30 in the picture forming unit 3, and the picture data of the electronic device pattern acquired from the SEMs 2002 and 2003 and stored in the picture memory unit 2 is also entered into the same unit.
The pattern attribute data forming unit 30 uses the design data and the pieces of picture data of the electronic device pattern to form pattern attribute data of the patterns present in the overlapping domains of the picture data. When the pictures are combined, the pictures to be combined are selected in the composite selection unit 31 on the basis of the pattern attribute data formed by the pattern attribute data forming unit 30. Of the picture data stored in the picture memory unit 2, only the pictures selected by the composite selection unit 31 are combined by a composite unit 32.
The following description will be concerned with the picture composition of the electronic device pattern with reference to
Here,
Further, a matching unit 302 causes a picture memory 301 to store the picture data adjacent to each picture, and performs matching with the adjacent pictures by using the overlapping domains.
Here,
However, when the two line segments are oriented in different directions as shown in
Here, the matching unit 302 determines the shape of the correlation distribution around the correlation peak on the basis of the determined correlation information, to discriminate the pattern shape contained in the overlapping domain of the imaged picture. A pattern attribute determining unit 303 determines the pattern attribute of a targeted overlapping domain by using the results of the matching units 300, 302. When the vertical and horizontal line segment information contained in the overlapping domain based on the design data (the result of the matching unit 300) matches the pattern shape determined from the shape around the peak of the correlation picture in the matching unit 302, the pattern attribute is determined on the basis of the line segment information resulting from the matching unit 300. When they do not match, the attribute can be handled as having neither vertical nor horizontal line segments.
The types of pattern attribute can be divided into: the vertical-and-horizontal pattern, by which the composite position is determined in both the x- and y-directions; the vertical pattern, by which the composite position is determined in the x-direction alone; the horizontal pattern, by which the composite position is determined in the y-direction alone; the oblique pattern, by which the composite position is determined in both the x- and y-directions once the position is determined in either of them; and the case in which no pattern is present. In the case of reviewing the line segments described in
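The five-way division above can be sketched from the orientations of the line segments in an overlapping domain; the angle convention (x-direction = 0 degrees, as used later in this embodiment) and the return labels are illustrative.

```python
def attribute_from_segments(angles_deg):
    """Classify an overlapping domain from the orientations of its line segments
    (angles in degrees, with the x-direction taken as 0). A vertical segment
    fixes the x-position; a horizontal segment fixes the y-position; anything
    else is treated as oblique."""
    if not angles_deg:
        return "no pattern"
    kinds = set()
    for a in angles_deg:
        a = a % 180
        if a == 0:
            kinds.add("horizontal")   # fixes the y-position
        elif a == 90:
            kinds.add("vertical")     # fixes the x-position
        else:
            kinds.add("oblique")
    if {"vertical", "horizontal"} <= kinds:
        return "vertical and horizontal"
    if kinds == {"vertical"}:
        return "vertical"
    if kinds == {"horizontal"}:
        return "horizontal"
    return "oblique"
```

A real implementation would also apply an angular tolerance and, as noted later, accept two oblique segments of different directions (e.g. +45 and -45 degrees) as fixing both coordinates.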
The pattern attribute determined by the pattern attribute determining unit 303 is stored in a pattern attribute data table 304 for every overlapping domain. The pattern attributes of all the overlapping domains of the pictures targeted for the process are determined and then stored in the pattern attribute data table 304. For example, as shown in
In this embodiment, the attribute of the vertical-and-horizontal pattern, which determines the position in both the x- and y-directions, is set in the selected pattern attribute storing unit 310 in advance. The signal permitting the composition is output only when this attribute is contained in the overlapping domain targeted for the process. In this embodiment, a picture in which the longitudinal directions of two line segments are the x- and y-directions is taken as the example of a picture for which the coordinate position can be determined. However, since the coordinate position can be specified as long as the two line segments are directed in different directions, a picture containing two line segments at +45 degrees and -45 degrees (with the x-direction taken as 0 degrees) is also acceptable, for example.
As mentioned above, the necessity of composition is judged in accordance with the pattern attribute; it is therefore possible to selectively combine the pictures that are easy to combine, so that the composition can be realized with high accuracy. Further, as described later, the composition is performed selectively between pictures satisfying a predetermined condition, and the composition between the composite pictures obtained thereby is also performed separately, so that the picture composition can be performed while the coupling error is suppressed in the combined picture as a whole.
Next,
When grouping the 25 pictures, first, the pictures whose overlapping domains permit the composition in the horizontal direction (first direction) are grouped, that is, the pictures enclosed by the thick-lined rectangles in
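The grouping of mutually combinable pictures can be sketched with a union-find structure; here `permitted_pairs` stands in for the pairs of pictures whose overlapping domains permit composition, as determined by the pattern attribute check described above.

```python
def group_pictures(n, permitted_pairs):
    """Union-find grouping: pictures (numbered 0..n-1) whose overlapping domain
    permits composition are placed in the same group. Pairs permitted in the
    horizontal and in the vertical direction can simply be fed in together."""
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for a, b in permitted_pairs:
        parent[find(a)] = find(b)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

Feeding in first the horizontally permitted pairs and then the vertically permitted ones reproduces the two-direction grouping of this embodiment, since union-find is insensitive to the order of the pairs.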
As mentioned above, the picture composition is performed, in plural directions, on the basis of determining whether the overlapping domains between the pictures targeted for composition satisfy a predetermined condition, so that the pictures targeted for composition can be arranged at appropriate positions in the composite picture as a whole.
In this embodiment, the pictures are grouped into one group, but they may also be grouped into a plurality of groups. Subsequently, the composite positions of the pictures in each group formed by the grouping unit 3210 are determined, group by group, by a composite position calculation unit 3211. The composite position between a given picture in the group and its adjacent picture is the corresponding position determined by the matching. When the adjacent picture is adjacent to a picture whose composite position has already been determined, its composite position is obtained by adding the corresponding position determined by the matching to the previously determined composite position. The composite positions are determined continuously in this manner until all of the picture positions in the group are determined. When there are a plurality of groups, the composite positions of the pictures in each group are determined in the same way for all of the groups.
The group information to which each picture belongs, determined by the composite position calculation unit 3211, and the composite position of each picture in every group, are stored in a composite position storing unit 3212. The composite position storing unit 3212 is realized by a memory. For example, when 25 pictures represented by “A” to “Y” are arranged in
In this case, the picture “A” corresponds to an address “1” in a group number ID2, and the names of the pictures adjacent to it in the eight directions become 0, 0, 0, 0, B, 0, F, G. The picture “M” corresponds to an address “2” in the group number ID2, and the names of the pictures adjacent to it in the eight directions become H, I, L, N, Q, R, S in series. When there are a plurality of groups, a reference picture is present in every group. For that reason, it is necessary to determine the relative positions of the reference pictures of the respective groups. Therefore, the design data formed in the pattern attribute data forming unit 30 is stored, and the corresponding position of each piece of picture data determined by the matching is also stored in advance. It is conceivable to read out the corresponding position, on the design data, of the reference picture of each group in order to determine the relative position. However, the corresponding position obtained in this way is rough, since the design data does not match the respective pieces of picture data completely.
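A row of the composite position table described above can be sketched as a simple record; the field names are illustrative, a “0” entry means that no adjacent picture exists in that direction, and the eighth neighbour entry for picture “M” is padded with “0” here as an assumption.

```python
from dataclasses import dataclass

@dataclass
class CompositePositionEntry:
    """One row of a composite position table like the one described above."""
    address: int        # address within the table
    group_id: int       # group number (ID2 in the example)
    neighbours: list    # names of the adjacent pictures in the eight directions
    position: tuple = (0, 0)  # composite position within the group

# Illustrative rows for the example pictures "A" and "M".
table = [
    CompositePositionEntry(1, 2, ["0", "0", "0", "0", "B", "0", "F", "G"]),
    CompositePositionEntry(2, 2, ["H", "I", "L", "N", "Q", "R", "S", "0"]),
]
```

Storing the eight-direction adjacency alongside the group number lets the later position correcting step look up both a picture's group and its neighbours in a single access.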
A picture composite unit 322 combines the pictures at the composite positions of the respective pictures output from the in-group composite position calculation unit 321. The pictures may simply be overwritten at the composite positions in the composite unit 32, or a blend process of the pixel values may be performed around the composite positions of the pictures to make the boundaries obscure.
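The boundary-obscuring blend mentioned above can be sketched for two horizontally adjacent pictures; the linear weight ramp is an assumption, since the embodiment does not specify the blend function.

```python
import numpy as np

def blend_overlap(left, right, overlap):
    """Stitch two horizontally adjacent pictures whose rightmost/leftmost
    `overlap` columns coincide. In the overlap, pixel values are mixed with a
    weight ramping linearly from the left picture to the right one."""
    h, wl = left.shape
    out_w = wl + right.shape[1] - overlap
    out = np.zeros((h, out_w))
    out[:, :wl - overlap] = left[:, :wl - overlap]    # left-only region
    out[:, wl:] = right[:, overlap:]                  # right-only region
    w = np.linspace(0.0, 1.0, overlap)                # 0 -> left, 1 -> right
    out[:, wl - overlap:wl] = (1 - w) * left[:, wl - overlap:] + w * right[:, :overlap]
    return out
```

Simple overwriting corresponds to replacing the ramp with a constant weight of 0 or 1; the ramp merely spreads the transition over the overlap so that no seam is visible.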
However, when the pictures are grouped into a plurality of groups in the in-group composite position calculation unit 321 of the composite unit 32, the composite pictures are also divided into a plurality, and a displacement may consequently occur. For this reason, it is preferable to reduce the number of groups. Consequently, after the pictures that can be combined are grouped by using the pattern attribute data and this technique is applied to the determination of the composite positions, the corresponding positions between the groups are determined again by using the pattern attribute data. This technique will be described below.
The pattern attribute data forming unit 34 forms the pattern attribute data of the patterns present in the overlapping domains of the picture data by using the design data and the pieces of picture data of the electronic device pattern. At the time of the composition, the pictures to be combined are selected in the composite selection unit 31 on the basis of the pattern attribute data formed by the pattern attribute data forming unit 34. The group information indicating which pictures can be combined in a composite unit 36, and the composite position of each picture in its group, are determined only for the picture data in the picture memory unit 2 corresponding to the pictures selected by the composite selection unit 31.
The composite picture forming apparatus in
For example, as shown in
When there are a plurality of overlapping domains between the groups, the pattern attribute data corresponding to the plurality of overlapping domains is determined. The pattern attribute data between the pictures adjacent across the determined groups is entered into the composite selection unit 35.
The comparators 352, 353 are set so as to output “1” when the pieces of pattern attribute data to be compared match. The output of an AND circuit 354 becomes “1” when the pattern attribute data match, and “1” is output to the specified position composite unit 37.
When the outputs of the comparators 352 and 353 become “1” and the pattern attribute data matches the stored attribute information, information regarding the pattern attribute at this time is stored in a memory and maintained for the period during which the overlapping domains between the pictures of the same pair of groups are detected. For this reason, the order may be neglected, and it is detected whether the pattern attributes set in both of the selected pattern attribute storing units 350 and 351 are present among the pictures between the groups. The memory may be cleared when the pattern attributes of a different pair of groups are detected.
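The detection performed by the comparators and the memory can be sketched in software as follows; the attribute labels are illustrative. The point, as stated above, is that the order in which the two attributes appear among the overlapping domains is irrelevant.

```python
def groups_can_be_joined(overlap_attributes):
    """The relative position of two groups can be fixed only when, among all the
    overlapping domains between them, one attribute determines the x-position
    (vertical line pattern) and another determines the y-position (horizontal
    line pattern). `overlap_attributes` lists the attributes of the overlapping
    domains between one pair of groups, in any order."""
    seen = set()  # plays the role of the memory holding detected attributes
    for attr in overlap_attributes:
        if attr in ("vertical", "horizontal"):
            seen.add(attr)
        if {"vertical", "horizontal"} <= seen:
            return True
    return False
```

Clearing `seen` between pairs of groups corresponds to clearing the memory when a different pair of groups is examined.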
As shown in
For the purpose of specifying the relative position between the above-mentioned picture groups, or between a picture group and a picture, it is desirable that the pattern attributes set in the selected pattern attribute storing units 350, 351 of the composite selection unit 35 be the horizontal line pattern, capable of specifying the relative position in the y-direction, and the vertical line pattern, capable of specifying the relative position in the x-direction. However, this presupposes that the composite position has been determined in the group containing the pictures A and K in
After the position information is determined by the composite unit 36, the information regarding the group number of each picture stored in the composite position storing unit 3212, the names of the adjacent pictures in the eight directions, and the composite position within the group, are stored in the position correcting unit 372 in advance.
The targeted groups are merged into one on the basis of the specified relative position between the picture groups. When the groups are merged into one, the determined corresponding position between the groups is added to the composite positions of the pictures whose group has changed, so that they correspond to the composite positions of the changed group.
When the output of the composite selection unit 35 does not become “1” even after all pairs of groups have been examined, a picture composite unit 373 combines the pictures at the composite positions of the respective pictures at that point. The picture composite unit 373 is the same as the picture composite unit 322 shown in
Further, a display unit 4 may be provided to display the pictures combined at the composite positions of the respective pictures at the time the composition in the composite unit 36 is finished, and also to display the pictures being grouped successively in the position specifying composite unit 37. Further, the composite result may be displayed so that a user can see, during the composition, the pictures to which the composition was applied and those to which it was not.
The above-mentioned picture forming unit 3 may be incorporated into the controllers 2004, 2005 or the data management unit 2001. Further, the process performed in the picture forming unit 3 may be executed by software. In this case, the software process may be executed by a personal computer, or the software may be incorporated into an LSI to perform the process in hardware.
Pictures whose composite positions can be clearly specified are discriminated by using the pattern attributes of the overlapping domains between the pictures. The relative positions are determined in order of how easily the composite position can be specified, and the picture composition is further performed by using the plurality of overlapping domains of the already combined composite pictures, so that a larger number of pictures can be combined.
The above-mentioned embodiment has described appending the attribute data to the overlapping domains between the pictures by using the design data, but the invention is not limited thereto. A plurality of templates individually indicating pattern shapes, formed separately from the design data, may be provided; pattern matching in the overlapping domain may then be performed by using these templates, and the pattern information of the template having the highest degree of coincidence may be stored as the attribute data of the overlapping domain.
It should be further understood by those skilled in the art that although the foregoing description has been made on embodiments of the invention, the invention is not limited thereto and various changes and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.
Priority application: No. 2009-225933, Sep 2009, JP (national).