The present invention relates to a technique of inspecting a pattern by means of, for example, an electron microscope, and, more particularly, to a technique of panoramic synthesis for generating one image by synthesizing a plurality of images.
Conventionally, a critical-dimension scanning electron microscope (CD-SEM) has been widely used to inspect precise wiring patterns formed on semiconductor wafers. Recently, with the miniaturization of semiconductor device processes, products of the 45 nm process node have been mass-produced. As wiring patterns become smaller, the defects which need to be detected also become smaller. Hence, the image capturing magnification of the CD-SEM must be increased.
Recently, with the miniaturization of wiring patterns, there is a problem that a pattern deforms due to the optical proximity effect. Therefore, optical proximity correction (OPC) is performed, and OPC simulation is performed to optimize the OPC. In this process, an image of a wiring pattern of a mask or wafer formed with OPC is captured, and its image data is fed back to the OPC simulation. Thus, higher precision of the OPC simulation is realized.
The captured image to be fed back to the OPC simulation requires an area of about 2 μm × 2 μm to 8 μm × 8 μm at a high magnification. When one pixel has a resolution of 1 nm, the captured image has several thousand pixels on a side.
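The pixel counts stated above follow directly from the field size and the per-pixel resolution. The following is a minimal illustrative sketch (the function name is an assumption introduced for this example, not part of the invention):

```python
# At a resolution of 1 nm per pixel, a 2 um x 2 um to 8 um x 8 um field
# corresponds to several thousand pixels on a side.
NM_PER_UM = 1000

def pixels_per_side(field_um: float, nm_per_pixel: float = 1.0) -> int:
    """Number of pixels along one side of a square captured field."""
    return int(field_um * NM_PER_UM / nm_per_pixel)

# 2 um field -> 2000 pixels per side; 8 um field -> 8000 pixels per side.
```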
To acquire a high-magnification image over a wide range, it is necessary either to increase the number of pixels of the imaging system so that it captures the wide range at once, or to capture images a plurality of times and then panoramically synthesize them. Japanese Patent Application Laid-Open No. 61-22549 discloses a panoramic synthesizing method.
In an electron microscope, an image capturing target is irradiated with an electron beam, and secondary electrons from the image capturing target are detected to generate an electronic image. When the image capturing target is a wafer, it is known that irradiation of the electron beam shrinks a resist and deforms a pattern. To reduce deformation of a pattern such as shrinkage, it is necessary to adjust, for example, the amount of the electron beam. Japanese Patent Application Laid-Open No. 2008-66312 discloses a method of adjusting, for example, the amount of the electron beam.
In panoramic synthesis, a plurality of images is joined such that the rim of each image overlaps the rim of the adjacent image. Hence, the area on the image capturing target corresponding to an image joining area is irradiated with the electron beam a plurality of times. When the image capturing target is a wafer, multiple irradiation of the electron beam shrinks the resist and deforms the pattern. Even if this image data is fed back to the OPC simulation, it is not possible to improve the precision of the OPC simulation.
Such deformation of a pattern varies depending on, for example, the electron beam amount, the material of the resist and the pattern shape, and is hard to predict.
It is therefore an object of the present invention to provide a composite image creating method and device which, when images separately captured a plurality of times are panoramically synthesized, can prevent deformation of a pattern due to multiple irradiation of the electron beam.
The present invention generates one image by overlapping the joining areas at the rims of two adjacent images when a plurality of images is connected to generate one image. Of the two adjacent images, the joining area of the image of the earlier image capturing order is left, and the joining area of the image of the later image capturing order is removed. The joining area of the image of the earlier image capturing order is obtained with fewer irradiations of the electron beam than the joining area of the image of the later image capturing order, and therefore deformation of the pattern due to irradiation of the electron beam is small.
The present invention corrects deformation of a pattern due to irradiation of the electron beam in the joining area of the image of an earlier image capturing order. The relationship between the number of times of irradiation of the electron beam and a deformation amount of the pattern is calculated in advance. The pattern in the joining area is corrected on the basis of this pattern deformation information.
The present invention provides an imaging device and imaging method which, when images separately captured a plurality of times are panoramically synthesized, can prevent deformation of a pattern due to multiple irradiation of the electron beam. It is possible to prevent deformation of a pattern due to image capturing and acquire precisely connected images.
A configuration of a first example of a composite image creating device according to the present invention will be described with reference to
The imaging device 11 may be a scanning electron microscope (SEM) or a critical-dimension scanning electron microscope (CD-SEM). In the scanning electron microscope, a mask or wafer serving as the image capturing target is irradiated with an electron beam, and the secondary electrons emitted therefrom are detected to acquire image data. The composite image creating device according to the present invention separately captures images of a pattern of an image capturing target a plurality of times, and synthesizes the images to generate one image. Consequently, a plurality of divided images is acquired as image data. When one pattern is divided into nine, nine items of divided image data are acquired. The image capturing control device 12 sets the image capturing positions and the image capturing order of the plurality of divided images.
The image memory 1 stores image data acquired by the imaging device 11. When, for example, one pattern is divided into nine and captured, nine divided images are stored in the image memory 1.
The image capturing position information storing unit 5 stores the image capturing positions of a plurality of divided images provided from the image capturing control device 12. The image capturing order information storing unit 3 stores the image capturing order of a plurality of divided images provided from the image capturing control device 12. An example of information stored in the image capturing position information storing unit 5 and image capturing order information storing unit 3 will be described with reference to
On the basis of information stored in the image capturing position information storing unit 5, image capturing order information storing unit 3, deformation information storing unit 4 and design data storing unit 6, the image synthesizing unit 2 synthesizes a plurality of items of image data stored in the image memory 1 and generates one image. The image synthesizing unit 2 further corrects the wiring patterns in joining areas of the divided images when the images are synthesized. Information of the wiring patterns corrected in this way is stored in the deformation information storing unit 4. The details of the image synthesizing unit 2 will be described with reference to
The image processing unit 13 according to the present invention may be configured with a computer or computing device. Further, processing in the image processing unit 13 may be executed using software. That is, the software may be executed by a computer, or may be installed in an LSI and processed by hardware.
A configuration example of a third example of a composite image creating device according to the present invention will be described with reference to
The deformation information generating unit 7 receives image capturing order information as input from the image capturing order information storing unit 3, receives image data as input from the image memory 1 and corrects wiring patterns in the joining areas of the divided images. Information of the wiring patterns corrected in this way is stored in the deformation information storing unit 4. Processing in the deformation information generating unit 7 is the same as processing of correcting the wiring patterns in the joining areas of the divided images in the image synthesizing unit 2 illustrated in
An example of data stored in the image capturing order information storing unit 3 and image capturing position information storing unit 5 illustrated in
The image capturing order information and image capturing position information table 303 includes the image capturing order, image file name and image capturing position. The image capturing order and image file name are associated one to one, so that it is possible to store image capturing order information and image capturing position information in one table 303. The image capturing order information storing unit 3 of the second example of the composite image creating device according to the present invention in
The deformation information table illustrated in
To create a deformation information table, an image of a test pattern is captured a plurality of times, and the length of each site of the test pattern image is measured per image capturing. To measure the length of each site, the center line of each pattern is used as the reference point. On the basis of the previous and current image capturing results, the difference of the measured length values is acquired. This difference is the deformation amount. Deformation of the pattern is basically shrinkage.
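The table creation described above can be sketched as follows. This is an illustrative example only; the function name, the dictionary keys and the sample measured lengths are assumptions introduced for this sketch:

```python
# Building a deformation information table from repeated captures of a
# test pattern: each entry maps the irradiation count to the change in
# measured length (the deformation amount) relative to the previous capture.
def build_deformation_table(measured_lengths):
    """measured_lengths[k] is the measured length (nm) of a site after
    the (k+1)-th image capture of the test pattern."""
    table = []
    for k in range(1, len(measured_lengths)):
        table.append({
            "irradiation_count": k + 1,
            # Deformation amount = current minus previous measurement;
            # shrinkage appears as a negative value.
            "deformation_nm": measured_lengths[k] - measured_lengths[k - 1],
        })
    return table

# Example: a site that shrinks with each irradiation (values illustrative).
table = build_deformation_table([100.0, 97.5, 96.0, 95.5])
```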
An example of pattern deformation in a joining area will be described with reference to
A method of calculating a deformation amount will be described with reference to
A case will be described with reference to
A reference numeral 703 indicates captured images 711 and 712 of the areas 701 and 702. The image 711 includes a non-joining area 711a and a joining area 711b, and the image 712 includes a non-joining area 712a and a joining area 712b. The joining area 712b of the image 712 is an image portion corresponding to the area b, and therefore is an image obtained upon the second irradiation of the electron beam. Hence, there is a concern that the joining area 712b of the image 712 yields an image of the deformed pattern.
The two images 711 and 712 are joined to generate panoramic images 704 and 705. The panoramic image 704 is synthesized by joining the subsequently captured image 712 overlapping on the previously captured image 711. In this case, in a pasting area of the two images, the joining area 711b is removed and the joining area 712b is left. Hence, the panoramic image 704 includes the joining area 712b of the subsequently captured image 712. Therefore, it is necessary to correct the pattern in the joining area 712b of the image 712 upon joining.
The panoramic image 705 is synthesized by joining the previously captured image 711 overlapping on the subsequently captured image 712. In this case, in an overlapping area of the two images, the joining area 711b is left and the joining area 712b is removed. Hence, the panoramic image 705 includes the joining area 711b of the previously captured image 711. The panoramic image 705 includes images all of which are obtained by irradiation of the electron beam once. Therefore, it is not necessary to correct the pattern in the joining area of the two images upon joining.
With the example illustrated in
The two images 731 and 732 are joined to generate panoramic images 724 and 725. The panoramic image 724 is synthesized by joining the previously captured image 731 overlapping on the subsequently captured image 732. In this case, in an overlapping area of the two images, the joining area 731b is left and the joining area 732b is removed. Hence, the panoramic image 724 includes the joining area 731b of the previously captured image 731. The panoramic image 724 includes images all of which are obtained by irradiation of electron beam once. Therefore, it is not necessary to correct the pattern in the joining area of the two images upon joining. The panoramic image 725 is synthesized by joining the subsequently captured image 732 overlapping on the previously captured image 731. In this case, in an overlapping area of the two images, the joining area 731b is removed and the joining area 732b is left. Hence, the panoramic image 725 includes the joining area 732b of the subsequently captured image 732. Therefore, it is necessary to correct the pattern in the joining area 732b of the image 732 upon joining.
As described above, when two images are joined to generate a panoramic image, correction of a pattern in a joining area can be avoided by overlapping an image of an earlier image capturing order on an image of a later image capturing order and leaving the joining area of an image of an earlier image capturing order.
The relationship of the image capturing order and the number of times of irradiation of an electron beam in the joining area will be described with reference to
One image is generated by radiating the electron beam once. The images of the nine areas “a” to “i” are captured in alphabetical order to obtain nine images A to I. When the nine images A to I are generated in this way, irradiation of the electron beam is performed once in the non-overlapping areas 11, 12, 13, 21, 22, 23, 31, 32 and 33 in each of the areas a to i. However, irradiation of the electron beam is performed twice on the overlapping areas 23, 25, 43, 45, 63, 65, 32, 34, 36, 52, 54 and 56, and four times on the overlapping areas 66, 70, 106 and 110.
In addition, the horizontal dimension of the non-overlapping area 11 of the upper left area a is Mx=Lx−Δx, and the vertical dimension is My=Ly−Δy. The horizontal dimension of the non-overlapping area 12 of the upper center area b is Nx=Lx−2Δx, and the vertical dimension is My=Ly−Δy. The horizontal dimension of the non-overlapping area 22 of the center area e is Nx=Lx−2Δx, and the vertical dimension is Ny=Ly−2Δy.
In the upper left area a, the length of the overlapping area 32 extending in the horizontal direction is Mx, and the length of the overlapping area 23 extending in the vertical direction is My. In the center area e, the lengths of the overlapping areas 34 and 54 extending in the horizontal direction are Nx, and the lengths of the overlapping areas 43 and 45 extending in the vertical direction are Ny. The horizontal dimensions of the four overlapping areas 66, 70, 106 and 110 are Δx, and the vertical dimensions are Δy.
A table 801 in
Processing of calculating the number of times of irradiation of the electron beam in each overlapping area in the composite image creating device according to the present invention will be described with reference to
In step S11, the index representing the number of times of irradiation in all joining areas is set to k=0, and the index representing the image capturing order is set to n=1. With the example in
In step S12, the image capturing position corresponding to the image capturing order n of the image capturing order information table is read. At the current point of time, n=1 and therefore the image capturing position of an image of the first image capturing order is read. By referring to the image capturing order information table 303 illustrated in
In step S13, 1 is added as the number of times of irradiation of the electron beam, to memory areas corresponding to all overlapping areas included in areas corresponding to the nth image capturing order in the image capturing order information table. With this example, 1 is added as the number of times of irradiation of the electron beam, to memory areas corresponding to the overlapping areas 23, 32 and 66 included in the area a.
In step S14, 1 is stored in the column of the number of times of irradiation of the electron beam corresponding to the overlapping areas 23, 32 and 66 included in the area a of the table in
In step S15, the index representing the image capturing order is increased by 1, that is, n=n+1. In step S16, whether or not the image capturing order n is greater than the image capturing order final value is decided. With this example, nine images are generated, and therefore the image capturing order final value is 9. When the image capturing order n is greater than the image capturing order final value, the processing is finished; when the image capturing order n is equal to or less than the image capturing order final value, the flow returns to step S12, and the processing of steps S12 to S15 is repeated. Thus, when the processing of step S15 is finished, the table illustrated in
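The loop of steps S11 to S16 can be sketched as follows. This is an illustrative sketch under assumed data structures: the mapping from each captured area to the overlapping-area labels it contains, and the 2×2 example labels, are assumptions of this example (the document's own example uses a 3×3 layout of areas a to i):

```python
# Steps S11 to S16: for each image in capture order, 1 is added to the
# irradiation count of every overlapping area covered by that image.
def count_irradiations(capture_order, overlaps_by_area):
    counts = {}                                        # S11: counts start at 0
    for n, area in enumerate(capture_order, start=1):  # S12/S15/S16 loop
        for overlap in overlaps_by_area[area]:         # S13/S14: add 1 per
            counts[overlap] = counts.get(overlap, 0) + 1  # covered overlap
    return counts

# Hypothetical 2x2 layout: "ab" is the strip shared by areas a and b,
# "abcd" is the center square shared by all four areas, and so on.
overlaps = {
    "a": ["ab", "ac", "abcd"],
    "b": ["ab", "bd", "abcd"],
    "c": ["ac", "cd", "abcd"],
    "d": ["bd", "cd", "abcd"],
}
counts = count_irradiations(["a", "b", "c", "d"], overlaps)
# The center square shared by all four images is irradiated four times,
# and each strip shared by two images is irradiated twice.
```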
An example of the image synthesizing unit 2 according to the present invention will be described with reference to
The corrected image selecting unit 21 may have selectors which are switched on the basis of the image capturing order. That is, two selectors are provided, and one of the selectors selects the image of the later image capturing order to output to the deformation correcting unit 22, and the other selector selects the image of the earlier image capturing order to output to the image pasting unit 23.
The deformation correcting unit 22 receives the image capturing order as input from the image capturing order information storing unit 3, receives information about deformation of a resist due to irradiation of the electron beam as input from the deformation information storing unit 4 and receives design data as input from the design data storing unit 6. Using these pieces of information, the deformation correcting unit 22 corrects the wiring pattern in the joining area of the image of the later image capturing order from the corrected image selecting unit 21. As described referring to
The image pasting unit 23 receives the image of the earlier image capturing order as input from the corrected image selecting unit 21, receives the corrected image of the later image capturing order from the deformation correcting unit 22, and further receives the image capturing order from the image capturing order information storing unit 3. The image pasting unit 23 performs matching processing of the images of the joining areas of the two images, and detects the joining positions to synthesize the images. The details of processing in the image pasting unit 23 will be described below with reference to
The deformation correcting unit 22 according to this example may correct the image of the later image capturing order such that the number of times of irradiation of the electron beam is the same in the joining areas of the two images. This correction is performed by calculating the difference in the number of times of irradiation of an electron beam in the joining areas between the image of the earlier image capturing order and the image of the later image capturing order. On the basis of the deformation amount corresponding to this difference, the image of the joining area may be corrected.
Although the deformation correcting unit 22 according to this example corrects the image of the later image capturing order, the deformation correcting unit 22 may correct an image 21a of the earlier image capturing order, too. That is, the pattern is corrected such that the number of times of irradiation of the electron beam is the same in the joining areas for both of the image of the earlier image capturing order and the image of the later image capturing order. For example, the pattern may be deformed such that the number of times of irradiation of the electron beam is one in the joining areas of the two images.
An example of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
The joining area detecting unit 24 receives an image as input from the image memory 1, receives an image capturing order s0 from the image capturing order information storing unit 3 and detects joining areas in an image. The joining areas are image portions of areas which are likely to be irradiated with the electron beam a plurality of times. The joining area detecting unit 24 outputs image data s1 of the joining area to the pattern site detecting unit 25, correcting unit 26 and electron beam irradiation count calculating unit 27.
The pattern site detecting unit 25 receives image data s1 of the joining area as input from the joining area detecting unit 24, and receives design data from the design data storing unit 6. The pattern site detecting unit 25 detects a pattern site s2 and deformation amount measurement position s3 from image data s1 of the joining area, and outputs these to the correcting unit 26. The pattern site s2 and deformation amount measurement position s3 have been described with reference to
The correcting unit 26 receives as input the image data s1 of the joining area from the joining area detecting unit 24, the pattern site s2 and deformation amount measurement position s3 from the pattern site detecting unit 25, the deformation amount from the deformation information storing unit 4 and design data from the design data storing unit 6, and corrects the image data of the joining area to store in the image storing unit 28. The details of correction processing in the correcting unit 26 will be described with reference to
When there are a plurality of patterns in a joining area, this correction processing only needs to be repeated per pattern. With the second or subsequent correction processing, image data after correction processing may be read from the image storing unit 28 to overwrite only the corrected pattern site or paste the corrected pattern site on existing image data.
The electron beam irradiation count calculating unit 27 receives the image data s1 of a joining area as input from the joining area detecting unit 24, receives the image capturing order as input from the image capturing order information storing unit 3 and calculates the number of times of irradiation of the electron beam on each joining area. The electron beam irradiation count calculating unit 27 stores the number of times of irradiation of the electron beam in each joining area, in the table of the deformation information storing unit 4.
An example of the pattern site detecting unit 25 of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
By contrast with this, the template pattern generating unit 255 reads design data corresponding to the joining area from the design data storing unit 6, and creates a template image from the pattern in the joining area. The template pattern generating unit 255 outputs the template image to the deformation information storing unit 4 and expansion processing unit 256. The expansion processing unit 256 expands the template image by expansion processing.
The matching processing unit 254 matches the binarized and expanded data obtained from the image data s1 of the joining area, and expanded data of the template image to detect the pattern site. The matching processing unit 254 outputs the pattern site s2 and deformation amount measurement position s3 to the correcting unit 26.
The matching processing unit 254 may use matching based on normalized correlation processing. However, the matching processing unit 254 according to this example matches binarized images. Hence, by simply counting the matching black pixels and white pixels and comparing the count with a predetermined threshold, whether or not a pattern is the same as the pattern of the template image may be decided.
The smoothing processing unit 251 according to this example may smooth input data using a Gaussian filter. The binarization processing unit 252 may binarize input data by common binarization processing. That is, a pixel value greater than the threshold becomes 1, and a pixel value equal to or smaller than the threshold becomes 0. The expansion processing units 253 and 256 may expand input data by common expansion processing. For example, for each black pixel, all eight pixels adjacent to this black pixel are made black. By repeating this processing, the pattern is expanded.
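The binarization and expansion processing described above can be sketched as follows. This is a minimal illustrative example using plain Python lists; the function names and the convention that a pattern ("black") pixel is represented by 1 are assumptions of this sketch:

```python
# Common binarization: pixel values greater than the threshold become 1.
def binarize(image, threshold):
    return [[1 if v > threshold else 0 for v in row] for row in image]

# Common expansion (dilation): for each pattern pixel (1), all eight
# adjacent pixels are also made pattern pixels. Repeating this processing
# expands the pattern further.
def expand(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if image[y][x] == 1:
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            out[ny][nx] = 1
    return out
```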
An example of a method of generating a template pattern in the template pattern generating unit 255 will be described with reference to
An example of the correcting unit 26 of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
The area copy deforming unit 264 receives as input image data of the pattern area from the area extracting unit 263, information about deformation of the resist due to irradiation of the electron from the deformation information storing unit 4 and image data s1 of the joining area from the joining area detecting unit 24, and copies and corrects a pattern image. The details of the area copy deforming unit 264 will be described with reference to
An example of the area extracting unit 263 of the correcting unit 26 of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
The closed figure filling unit 267 receives as input the connection component of the black pixels from the connection component extracting unit 266, design data from the design data storing unit 6, and the pattern site s2 and deformation amount measurement position s3 from the pattern site detecting unit 25, creates a closed figure and fills the inside of the closed figure. The expansion processing unit 268 expands the filled closed figure. Expansion processing in the expansion processing unit 268 may be the same as the expansion processing in the expansion processing units 253 and 256 of the pattern site detecting unit 25 which has been described with reference to
An example of the closed figure filling unit 267 of the area extracting unit 263 of the correcting unit 26 of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
The connection component selecting unit 2671 receives as input the connection components of the black pixels of the binarized data of the image data s1 of the joining area from the connection component extracting unit 266, and the pattern site s2 and deformation amount measurement position s3 from the pattern site detecting unit 25. The connection component selecting unit 2671 selects a connection component including a correction target pattern 1601 among the connection components received as input from the connection component extracting unit 266, for example, as follows. The connection component selecting unit 2671 first finds the distance between each pixel of the connection components 1603 and the pixel position at which the correction target pattern 1601 exists. On the basis of this distance, the connection component 1604 including the pixel closest to the pixel position at which the correction target pattern 1601 exists is selected from the connection components 1603.
The closed figure generating unit 2672 generates a closed figure formed with a connection component including the correction target pattern in the connection component selected by the connection component selecting unit 2671.
The filling unit 2673 fills the closed figure with black. Thus, as illustrated in
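The distance-based selection performed by the connection component selecting unit 2671 can be sketched as follows. This is an illustrative example only; the function name and the representation of a connection component as a list of pixel coordinates are assumptions of this sketch:

```python
# Select, among the extracted connection components, the component that
# contains the pixel closest to the position of the correction target pattern.
def select_component(components, target_pos):
    """components: list of connection components, each a list of (y, x)
    pixel coordinates. target_pos: (y, x) position of the target pattern."""
    ty, tx = target_pos

    def min_sq_dist(component):
        # Squared distance from the target position to the nearest pixel
        # of this component (squared distance preserves the ordering).
        return min((y - ty) ** 2 + (x - tx) ** 2 for (y, x) in component)

    return min(components, key=min_sq_dist)
```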
An example of the area copy deforming unit 264 of the correcting unit 26 of the deformation correcting unit 22 of the image synthesizing unit 2 according to the present invention will be described with reference to
Image data of the pattern area received as input from the area extracting unit 263 is a closed figure filled with black as illustrated in
The bilinear interpolating unit 2642 receives image data s1 of the joining area corresponding to the pattern area as input from the storing unit 2643, and receives resist deformation information as input from the deformation information storing unit 4. The bilinear interpolating unit 2642 corrects, that is, expands image data s1 of the joining area by bilinear interpolation using deformation information.
Bilinear interpolation processing in the bilinear interpolating unit 2642 will be described with reference to
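The correction by bilinear interpolation can be sketched as follows. This is a minimal illustrative example: the function name and the use of a single uniform scale factor derived from the deformation information are assumptions of this sketch, not the invention's exact procedure:

```python
# Expanding a shrunken grayscale pattern image by bilinear interpolation.
def bilinear_resize(image, scale):
    """image: 2D list of pixel values; scale > 1 expands the image."""
    h, w = len(image), len(image[0])
    nh, nw = int(h * scale), int(w * scale)
    out = [[0.0] * nw for _ in range(nh)]
    for y in range(nh):
        for x in range(nw):
            # Corresponding coordinates in the original (shrunken) image.
            sy, sx = y / scale, x / scale
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Interpolate horizontally on the two rows, then vertically.
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bot = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out
```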
The synthesizing unit 232 receives as input position information from the matching processing unit 231, the image of the earlier image capturing order from the corrected image selecting unit 21, the corrected image of the later image capturing order from the deformation correcting unit 22, and the image capturing order from the image capturing order information storing unit 3. The synthesizing unit 232 joins and synthesizes the two images on the basis of the position information detected in the matching processing unit 231.
A method of joining processing in the image pasting unit 23 of the image synthesizing unit 2 according to the present invention will be described with reference to
Here, images of the area a, area d, area b and area c are captured in this order. The numbers appended to the letters represent the image capturing order. After images of all areas are captured, the number of times of irradiation of the electron beam is two in the long and thin overlapping areas, and four in the center square overlapping area.
Four images 2202 are obtained by sequentially capturing images of the area a, area d, area b and area c. The numbers appended to the letters represent the image capturing order. As illustrated in
Next, the joining order will be described. As described above, generally, an image of an earlier image capturing order is obtained with less irradiation of the electron beam. Also as described above, when two images are joined, in the overlapping joining areas, the lower joining area is removed and the upper joining area is left. A panoramic image can therefore consist of areas obtained with the least irradiation of the electron beam by arranging the image of the later image capturing order on the lower side, arranging the image of the earlier image capturing order on the upper side, and leaving the joining area of the image of the earlier image capturing order.
The panoramic image 2203 is obtained by overlapping and synthesizing the joining areas of the image A, image B, image D and image C in this order. The numbers appended to the letters represent the overlapping order. The panoramic image 2204 represents the number of times of irradiation of the electron beam in each joining area of the panoramic image 2203. Images with two irradiations of the electron beam are obtained in the long and thin joining areas, an image with four irradiations is obtained in the square joining area, and images with one irradiation are obtained in the other areas.
The panoramic image 2205 is obtained by overlapping and synthesizing the joining areas of the image C, image D, image B and image A in this order. The numbers appended to the letters represent the overlapping order. The panoramic image 2206 represents the number of times of irradiation of the electron beam in each joining area of the panoramic image 2205. In the long and thin joining area between the image B and image D, an image with two irradiations of the electron beam is obtained, and, in the other areas, images with one irradiation are obtained.
The panoramic image 2207 is obtained by overlapping and synthesizing the joining areas of the image C, image B, image D and image A in this order. The numbers appended to the letters represent the overlapping order. The panoramic image 2208 represents the number of times of irradiation of the electron beam in each joining area of the panoramic image 2207. Images with one irradiation of the electron beam are obtained in all areas. Upon comparison, the overlapping order of the four images in the panoramic image 2207 is exactly opposite to the image capturing order of the four areas a to d of the image capturing target 2201. That is, the images only need to be joined in the order opposite to the image capturing order.
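The joining rule described above can be sketched as follows: pasting the images in the reverse of the image capturing order leaves, in every overlap, pixels from the image captured earliest, that is, the pixels with the fewest electron beam irradiations. The function name and the representation of images as positioned tiles are assumptions of this illustrative sketch:

```python
# Paste the tiles in reverse capture order onto a blank canvas; each later
# paste overwrites what is under it, so the earliest-captured image ends up
# on top in every overlapping area.
def paste_reverse_order(canvas_shape, images):
    """images: list of (top, left, tile) in image capturing order,
    where tile is a 2D list of pixel values."""
    h, w = canvas_shape
    canvas = [[None] * w for _ in range(h)]
    for top, left, tile in reversed(images):   # reverse of capture order
        for dy, row in enumerate(tile):
            for dx, v in enumerate(row):
                canvas[top + dy][left + dx] = v
    return canvas

# Two tiles overlapping in one column: the earlier capture "A" wins there.
canvas = paste_reverse_order((1, 4), [(0, 0, [["A", "A", "A"]]),
                                      (0, 2, [["B", "B"]])])
```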
Joining processing according to the present invention will be described with reference to
In the joining process of the present embodiment, the coordinates of all joining areas are first calculated on the basis of the position coordinates of each area. Next, all images are pasted according to the image capturing order.
In step S21, the joining position coordinates of each image are initialized. In step S22, the first image (image 1) of the images 1 to 16 corresponding to the areas 1 to 16 is read, and in step S23 the second image (image 2) is read. In step S24, the positions of the joining areas of the image 1 and image 2 are calculated and the coordinates are stored.
In step S25, whether or not the current image is the final image is decided, and, if the image is not the final one, the image of the next order is read in step S26. Step S25 and step S26 are repeated in this way to find the positions of the joining areas up to the final image.
In step S27, the joining coordinates of the images 1 to 16 are mapped on a composite image area. Although the position of a joining area is calculated between only two images at a time, by repeating this calculation it is possible to obtain the coordinate value of each joining area when all images are arranged in the composite image area.
For example, description will be made using only the x direction. The dimension of each image in the x direction is 100 pixels. The upper left of the image 1 is aligned to the origin (x=0) of the composite image area. The joining area between the image 1 and image 2 lies between the 80th and 100th pixels of the image 1, so the image 2 occupies the 80th to 180th pixels. The joining area between the image 2 and image 3 lies between the 70th and 100th pixels of the image 2, so the image 3 occupies the 150th pixel (80 pixels+70 pixels) to the 250th pixel. The joining position of each of the images 1 to 16 in the composite image area is represented by the coordinate of the left end of the image measured from the origin in the x direction: for example, the joining position of the image 1 is 0, that of the image 2 is 80 and that of the image 3 is 150. The joining positions of the images 1 to 16 are calculated as mapped coordinates. Next, the images are joined using the image capturing order.
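The coordinate mapping of step S27 in this x-direction example can be sketched as follows (a minimal illustration, not part of the original disclosure; the overlap values 80 and 70 are taken from the example above, and the variable names are invented):

```python
# Each image is 100 pixels wide in the x direction.
width = 100

# overlap_start[i] is where the joining area with the next image begins
# inside image i+1: pixel 80 of image 1, pixel 70 of image 2.
overlap_start = [80, 70]

positions = [0]  # image 1 is aligned to the origin of the composite area
for start in overlap_start:
    # Each image starts at the previous image's position plus the offset
    # of the joining area inside that previous image.
    positions.append(positions[-1] + start)

print(positions)  # [0, 80, 150]
```

Repeating the same accumulation for all 16 images yields the full set of mapped joining positions.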
In step S31, the determination flags of all pixels of the composite image area are cleared to 0, and the image capturing order is set to 1. In step S32, the image corresponding to the value set as the image capturing order is read; here, the image of the image capturing order 1 is read. In step S33, the joining position in the composite image area corresponding to the read image is read. In step S34, the image is written only in pixels whose determination flags are 0, starting from the joining position. When the image corresponding to the image capturing order 1 is written, all determination flags are still 0, so all pixels of the image area corresponding to the image capturing order 1 are written. In step S35, 1 is written in the determination flags corresponding to all pixel positions written in step S34. This processing prevents overwriting: since the image is written in the entire image area corresponding to the image capturing order 1, 1 is written as the determination flag throughout that area.
With the present embodiment, once an image is written in a pixel, its determination flag becomes 1 and the pixel is not overwritten thereafter. Thus, the images are written according to the image capturing order, written images are not overwritten, and the first written image, that is, the image of the earliest image capturing order, is left.
In step S36, the image capturing order is incremented by 1, and, in step S37, whether or not the image capturing order is greater than the final value is decided. When the image capturing order is greater than the final value, the processing is finished; when it is equal to or less than the final value, the processing of step S32 to step S37 is repeated. When the image capturing order has passed the final value, the synthesized image is complete, and every pixel holds data obtained with a single electron beam irradiation.
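Steps S31 to S37 can be sketched as follows (a simplified, hypothetical illustration using a one-dimensional pixel array and three images; the joining positions reuse the earlier x-direction example, and all names are invented):

```python
composite = [0] * 250
flags = [0] * 250  # S31: clear all determination flags to 0

# (pixel value, joining position, length) per image, in capture order.
images = [(1, 0, 100), (2, 80, 100), (3, 150, 100)]

for value, pos, length in images:       # S32/S33: read image and position
    for x in range(pos, pos + length):
        if flags[x] == 0:               # S34: write only where the flag is 0
            composite[x] = value
            flags[x] = 1                # S35: set flag to prevent overwriting
# S36/S37: the loop advances through the capture order until the final image.

# Overlap pixels 80-99 keep image 1's data; pixels 150-179 keep image 2's.
print(composite[90], composite[160], composite[200])  # 1 2 3
```

Because each flagged pixel is never rewritten, every pixel of the result carries data from the earliest-captured image covering it, i.e. data irradiated by the electron beam only once.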
In the above embodiment, the pattern for forming a panoramic image is extracted from the image capturing areas in which the beam irradiation amount is least, that is, from the images of the earlier image capturing order. Hereinafter, a method of extracting a panoramic image forming pattern on the basis of other criteria will be described.
Another example of panoramic image synthesis will be described with reference to
With the present embodiment, from the viewpoint of pattern deformation, a pattern edge included in the first image capturing area 2501 is preferably left for, for example, the pattern 2505 or pattern 2506. However, this is not necessarily the case for, for example, the pattern 2507. Most of the pattern 2507 is included in the second image capturing area 2502, and only a small part of the pattern 2507 is included in the overlapping area 2511 in which the first image capturing area 2501 and second image capturing area 2502 overlap. In this case, the part of the pattern 2507 included in the overlapping area 2511 is extracted from the second image capturing area 2502. Consequently, it is possible to acquire a pattern image with fewer connection parts for the entire pattern 2507.
If the influence of repeated beam scans, such as pattern deformation, is small, there are cases where it is desirable to extract a pattern within one field of view (image capturing area). For example, assume a case where the dimension of the pattern 2507 from the left end to the right end is measured. Preferably, there is no pattern connection part between the two ends which serve as the measurement references. Hence, a flag is set for the pattern 2507, and the algorithm determining the image capturing areas is preferably set such that the number of connections of the pattern 2507 is as small as possible. By contrast, when the gap dimension between the pattern 2506 and pattern 2510 is measured, the gap portion is preferably contained in one image capturing area; in this case, the two patterns are preferably extracted from the first image capturing area 2501. Further, when determining the image capturing areas from which patterns need to be extracted, exposure simulation may be performed on the design data of the pattern, since exposure changes the pattern. An image capturing area is then selected such that, for example, a portion in which a dimension value of a pattern is greater than a predetermined value, or a portion in which an inter-pattern distance is smaller than a predetermined value, fits in one field of view (image capturing area). In this case, an algorithm is required which determines the field of view (image capturing area) from which a pattern needs to be extracted on the basis of a decision criterion different from the image capturing order.
Further, when the area occupied by a certain pattern in one field of view (image capturing area), or the ratio of the pattern area, is equal to or greater than a predetermined value, the field of view (image capturing area) from which the pattern needs to be extracted may be determined on the basis of a decision criterion different from the image capturing order. For example, when the areas occupied by the pattern 2507 in the image capturing area 2501 and the image capturing area 2502 are compared, most of the pattern 2507 is included in the image capturing area 2502. In this case, the pattern only needs to be extracted from the image capturing area 2502. Consequently, it is possible to form, for the pattern 2507, an image with very small connection parts.
In the design data of a semiconductor device, information on the size and shape of each pattern is recorded. It is therefore possible to set image capturing areas on the layout data of the design data, and to calculate, for example, the area of a pattern included in an image capturing area or overlapping area. This calculation result can be used as a decision criterion different from the above image capturing order.
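As a hypothetical sketch of such an area-based criterion (the rectangle coordinates below are invented for illustration and are not taken from the disclosure), the image capturing area containing the largest share of a pattern can be selected by comparing rectangle intersections computed on the layout data:

```python
# Rectangles are (x0, y0, x1, y1) in layout coordinates.
def overlap_area(a, b):
    """Intersection area of two axis-aligned rectangles (0 if disjoint)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

# Two hypothetical image capturing areas and a pattern bounding box,
# named after the reference signs in the text.
areas = {"2501": (0, 0, 100, 100), "2502": (90, 0, 190, 100)}
pattern_2507 = (95, 20, 170, 40)  # lies mostly inside area 2502

# Select the capture area holding the largest share of the pattern.
best = max(areas, key=lambda name: overlap_area(areas[name], pattern_2507))
print(best)  # 2502
```

Under this criterion the pattern 2507 would be extracted from the image capturing area 2502, matching the decision described above.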
Further, the positions which need to be measured may be set in advance on the basis of the design data, which makes it possible to make the above decision automatically. For the patterns 2508 and 2509, as long as there is no other condition, the field of view (image capturing area) from which to extract a pattern is preferably selected according to the image capturing order.
By contrast, in the case of the pattern 2510, part of the pattern 2510 lies in the overlapping area 2515 shared by four image capturing areas. In this case, if deformation of the pattern needs to be avoided as much as possible, the pattern is extracted from the image capturing area 2501 for the portion belonging to the overlapping area 2511 and from the image capturing area 2502 for the other portion, and the portions are synthesized. Further, if the pattern needs to be extracted from only one image capturing area, it only needs to be extracted from the image capturing area 2502. The condition to be set changes depending on the type of pattern or the measurement purpose of the user of the scanning electron microscope, and is therefore preferably settable arbitrarily.
The pattern is extracted from the image capturing area selected in this way to form a joining pattern (S2607). Next, whether or not there is a pattern for which a joining pattern has not been formed is decided (S2608). If there is such a pattern, the processing in S2604 to S2607 is performed again; when there is no longer such a pattern, the panoramic image is completed (S2609). According to the above configuration, it is possible to automatically determine the image capturing area from which a pattern needs to be extracted, on the basis of various conditions.
Although the embodiment of the present invention has been described, one of ordinary skill in the art would easily understand that the present invention is by no means limited to the above example, and can be variously changed within the range disclosed in the claims.
Number | Date | Country | Kind |
---|---|---|---|
2009-091390 | Apr 2009 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2010/056062 | 4/2/2010 | WO | 00 | 12/23/2011 |