The present disclosure generally relates to the field of charged particle beam systems and, more particularly, to a method and an apparatus for monitoring beam profile and beam power of a laser beam used in a charged particle beam system.
In manufacturing processes of integrated circuits (ICs), unfinished or finished circuit components are inspected to ensure that they are manufactured according to design and are free of defects. An inspection system utilizing an optical microscope typically has a resolution down to a few hundred nanometers, limited by the wavelength of light. As the physical sizes of IC components continue to shrink to sub-100 or even sub-10 nanometers, inspection systems capable of higher resolution than those utilizing optical microscopes are needed.
A charged particle (e.g., electron) beam microscope, such as a scanning electron microscope (SEM) or a transmission electron microscope (TEM), capable of resolution down to less than a nanometer, is a practical tool for inspecting IC components having feature sizes below 100 nanometers. With a SEM, electrons of a single primary electron beam, or electrons of a plurality of primary electron beams, can be focused at locations of interest on a wafer under inspection. The primary electrons interact with the wafer and may be backscattered or may cause the wafer to emit secondary electrons. The intensity of the electron beams comprising the backscattered electrons and the secondary electrons may vary based on the properties of the internal and external structures of the wafer, and thereby may indicate whether the wafer has defects.
Embodiments consistent with the present disclosure include systems, methods, and non-transitory computer-readable mediums for monitoring a beam in an inspection system. The system includes an image sensor configured to collect a sequence of images of a beam spot of a beam formed on a surface, each image of the sequence being collected at a different exposure time of the image sensor. The system also includes a controller configured to combine the sequence of images to obtain a beam profile of the beam.
Embodiments consistent with the present disclosure include systems, methods, and non-transitory computer-readable mediums for monitoring a beam in an inspection system. The system includes an image sensor configured to collect an image of a beam spot of a beam formed on a surface, and a controller configured to transform coordinates of the image based on positions of the beam and the image sensor with respect to the beam spot and a magnification factor of an optical system arranged between the image sensor and the surface.
Embodiments consistent with the present disclosure include systems, methods, and non-transitory computer-readable mediums for monitoring a beam in an inspection system. The system includes an image sensor configured to collect a plurality of images of a beam spot of a beam formed at different locations on a surface, and a controller configured to generate an averaged image based on the plurality of images.
Embodiments consistent with the present disclosure include systems, methods, and non-transitory computer-readable mediums for monitoring a beam in an inspection system. The system includes an image sensor configured to collect an image of a beam spot of a beam formed on a surface, and a controller configured to obtain a beam profile of the beam based on the image, obtain a total grey level of the beam spot based on the beam profile, and determine a power of the beam based on a predetermined relationship between the total grey level and the power.
Additional objects and advantages of the disclosed embodiments will be set forth in part in the following description, and in part will be apparent from the description, or may be learned by practice of the embodiments. The objects and advantages of the disclosed embodiments may be realized and attained by the elements and combinations set forth in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosed embodiments, as claimed.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the invention. Instead, they are merely examples of apparatuses and methods consistent with aspects related to the invention as recited in the appended claims.
The enhanced computing power of electronic devices, while reducing the physical size of the devices, can be accomplished by significantly increasing the packing density of circuit components such as transistors, capacitors, and diodes on an IC chip. For example, in a smart phone, an IC chip (which is the size of a thumbnail) may include over 2 billion transistors, the size of each transistor being less than 1/1000th the width of a human hair. Not surprisingly, semiconductor IC manufacturing is a complex process, with hundreds of individual steps. Errors in even one step have the potential to dramatically affect the functioning of the final product. Even one “killer defect” can cause device failure. The goal of the manufacturing process is to improve the overall yield of the process. For example, for a 50-step process to achieve a 75% yield, each individual step must have a yield greater than 99.4%, and if the individual step yield is 95%, the overall process yield drops to about 7%.
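These figures follow from the fact that step yields multiply across the process: 0.994 raised to the 50th power is approximately 0.74, or about 75% overall yield, whereas 0.95 raised to the 50th power is approximately 0.077, i.e., below 8%.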
Defects may be generated during various stages of semiconductor processing. For the reason stated above, it is important to find defects accurately and efficiently as early as possible. A charged particle (e.g., electron) beam microscope, such as a scanning electron microscope (SEM), is a useful tool for inspecting semiconductor wafer surfaces to detect defects. During operation, the charged particle beam microscope scans a primary charged-particle beam, such as an electron beam (e-beam), over a semiconductor wafer held on a stage, and generates an image of the wafer surface by detecting a secondary charged-particle beam reflected from the wafer surface. When the charged-particle beam scans the wafer, charges may accumulate on the wafer due to a large beam current, which may negatively affect the quality of the image. To regulate the accumulated charges on the wafer, an Advanced Charge Controller (ACC) module is employed to illuminate a light beam, such as a laser beam, on the wafer, so as to control the accumulated charges through photoconductivity and/or the photoelectric effect. It is thus important to monitor the power and quality of the light beam, so as to effectively control the accumulated charges.
Conventionally, the power and quality of a light beam are monitored by a power meter and a beam profiler. However, during operation of a charged particle beam microscope, the microscope, the ACC module that illuminates the light beam, and the stage that holds the semiconductor wafer are disposed in a vacuum chamber. Due to the limited space inside the vacuum chamber, the power meter and the beam profiler cannot be disposed in the vacuum chamber. As a result, the power meter and the beam profiler cannot be used to monitor power and quality of the light beam during the operation of a charged particle beam microscope.
To monitor the light beam in a limited space in a vacuum chamber, the disclosed system uses an image sensor, which is already included in the charged particle beam microscope for other purposes (e.g., observing wafer surface, etc.), to monitor the power and profile of the light beam emitted from the ACC module.
Due to the high intensity of the light beam (e.g., laser beam), an image sensor may not have a sufficiently large dynamic range to capture the complete beam profile of the light beam. To solve this problem, the disclosed systems can configure the image sensor to collect a sequence of images of a beam spot formed on the semiconductor wafer, each image collected at a different exposure time, so that the sequence collectively captures the complete beam profile. As a result, even if the image sensor has a dynamic range that is not sufficiently large to capture the complete beam profile of the light beam, the controller can still obtain the complete beam profile based on the partial beam profiles obtained from the sequence of images.
Due to limited space inside the vacuum chamber of the microscope, the image sensor may not be able to collect images from a top side of a semiconductor wafer. Thus, the shape of the beam spot in the image taken by the image sensor may not be the real beam projection on the wafer. To solve this problem, the disclosed systems can transform coordinates of an image based on the positions of the light beam and the image sensor with respect to the beam spot and a magnification factor of an optical system arranged between the image sensor and the wafer surface. As a result, a projection of the light beam on the surface may be obtained, and an interaction between the light beam and the surface can be accurately evaluated.
In some instances, the ACC module, the beam spot, and the image sensor may not be in the same plane, so the light beam cannot be directly reflected into the image sensor. To address this, the disclosed system can emit the light beam onto a relatively rough surface, such as a corner of a wafer stage, and the image sensor may be configured to collect light scattered or diffracted from this rough surface. In this case, because of the rough surface, a beam profile obtained from a single captured image is also rough. To smooth out this roughness, the disclosed systems can configure the image sensor to capture a plurality of images of the beam spot formed at different locations on the surface. The disclosed systems can then generate an averaged image based on the plurality of images and use the averaged image as the collected image for a given exposure time. As a result, the beam profile obtained based on the averaged image may be smooth, even if the surface on which the beam spot is formed is relatively rough.
In some instances, the disclosed system can emit the light beam onto a relatively smooth and mirror-like surface, and the light beam can be directly reflected into the image sensor. In this case, a single beam spot image captured by the image sensor can be used to obtain a beam profile, and it is not necessary to generate an averaged image based on a plurality of images of the beam spot formed at different locations on the surface.
Moreover, the disclosed systems can obtain a total grey level of the beam spot, and determine a power of the light beam based on a predetermined relationship between the total grey level and the power. The disclosed systems can control a power of the ACC module based on the determined power of the light beam, in order to obtain a desired power of the light beam.
One or more robotic arms (not shown) in EFEM 106 may transport the wafers to load/lock chamber 102. Load/lock chamber 102 is connected to a load/lock vacuum pump system (not shown) which removes gas molecules in load/lock chamber 102 to reach a first pressure below the atmospheric pressure. After reaching the first pressure, one or more robotic arms (not shown) may transport the wafer from load/lock chamber 102 to main chamber 101. Main chamber 101 is connected to a main chamber vacuum pump system (not shown) which removes gas molecules in main chamber 101 to reach a second pressure below the first pressure. After reaching the second pressure, the wafer is subject to inspection by electron beam tool 104. Electron beam tool 104 may be a single-beam system or a multi-beam system. A controller 109 is electronically connected to electron beam tool 104. Controller 109 may be a computer configured to execute various controls of EBI system 100. While controller 109 is shown in
As shown in
There may also be provided an image processing system 199 that includes an image acquirer 200, a storage 130, and controller 109. Image acquirer 200 may comprise one or more processors. For example, image acquirer 200 may comprise a computer, server, mainframe host, terminals, personal computer, any kind of mobile computing devices, and the like, or a combination thereof. Image acquirer 200 may connect with detector 144 of electron beam tool 104 through a medium such as an electrical conductor, optical fiber cable, portable storage media, IR, Bluetooth, internet, wireless network, wireless radio, or a combination thereof. Image acquirer 200 may receive a signal from detector 144 and may construct an image. Image acquirer 200 may thus acquire images of wafer 150. Image acquirer 200 may also perform various post-processing functions, such as generating contours, superimposing indicators on an acquired image, and the like. Image acquirer 200 may be configured to perform adjustments of brightness and contrast, etc. of acquired images. Storage 130 may be a storage medium such as a hard disk, random access memory (RAM), cloud storage, other types of computer readable memory, and the like. Storage 130 may be coupled with image acquirer 200 and may be used for saving scanned raw image data as original images, and post-processed images. Image acquirer 200 and storage 130 may be connected to controller 109. In some embodiments, image acquirer 200, storage 130, and controller 109 may be integrated together as one control unit.
In some embodiments, image acquirer 200 may acquire one or more images of a sample based on an imaging signal received from detector 144. An imaging signal may correspond to a scanning operation for conducting charged particle imaging. An acquired image may be a single image comprising a plurality of imaging areas that may contain various features of wafer 150. The single image may be stored in storage 130. Imaging may be performed on the basis of imaging frames.
The condenser and illumination optics of the electron beam tool may comprise or be supplemented by electromagnetic quadrupole electron lenses. For example, as shown in
Although
For example, reference is now made to
Electron source 202, gun aperture 204, condenser lens 206, source conversion unit 212, beam separator 222, deflection scanning unit 226, and objective lens 228 may be aligned with a primary optical axis 260 of apparatus 104. Secondary optical system 242 and electron detection device 244 may be aligned with a secondary optical axis 252 of apparatus 104.
Electron source 202 may comprise a cathode, an extractor or an anode, wherein primary electrons can be emitted from the cathode and extracted or accelerated to form a primary electron beam 210 with a crossover (virtual or real) 208. Primary electron beam 210 can be visualized as being emitted from crossover 208. Gun aperture 204 may block off peripheral electrons of primary electron beam 210 to reduce size of probe spots 270, 272, and 274.
Source conversion unit 212 may comprise an array of image-forming elements (not shown in
Condenser lens 206 may focus primary electron beam 210. The electric currents of beamlets 214, 216, and 218 downstream of source conversion unit 212 may be varied by adjusting the focusing power of condenser lens 206 or by changing the radial sizes of the corresponding beam-limit apertures within the array of beam-limit apertures. Condenser lens 206 may be a moveable condenser lens that may be configured so that the position of its first principal plane is movable. The movable condenser lens may be configured to be magnetic, which may result in off-axis beamlets 216 and 218 landing on the beam-limit apertures with rotation angles. The rotation angles change with the focusing power and the position of the first principal plane of the movable condenser lens. In some embodiments, the moveable condenser lens may be a moveable anti-rotation condenser lens, which involves an anti-rotation lens with a movable first principal plane. The moveable condenser lens is further described in U.S. Publication No. 2017/0025241, which is incorporated by reference in its entirety.
Objective lens 228 may focus beamlets 214, 216, and 218 onto a wafer 230 for inspection and may form a plurality of probe spots 270, 272, and 274 on the surface of wafer 230.
Beam separator 222 may be a beam separator of the Wien filter type, generating an electrostatic dipole field and a magnetic dipole field. In some embodiments, when both fields are applied, the force exerted by the electrostatic dipole field on an electron of beamlets 214, 216, and 218 may be equal in magnitude and opposite in direction to the force exerted on the electron by the magnetic dipole field when the electron is traveling at a particular velocity. Beamlets 214, 216, and 218 can therefore pass straight through beam separator 222 with a zero deflection angle. However, the total dispersion of beamlets 214, 216, and 218 generated by beam separator 222 may be non-zero, such as when the electrons of the beam are traveling at a velocity other than the particular velocity. Beam separator 222 may separate secondary electron beams 236, 238, and 240 from beamlets 214, 216, and 218 and direct secondary electron beams 236, 238, and 240 towards secondary optical system 242.
Deflection scanning unit 226 may deflect beamlets 214, 216, and 218 to scan probe spots 270, 272, and 274 over a surface area of wafer 230. In response to incidence of beamlets 214, 216, and 218 at probe spots 270, 272, and 274, secondary electron beams 236, 238, and 240 may be emitted from wafer 230. Secondary electron beams 236, 238, and 240 may comprise electrons with a distribution of energies including secondary electrons and backscattered electrons. Secondary optical system 242 may focus secondary electron beams 236, 238, and 240 onto detection sub-regions 246, 248, and 250 of electron detection device 244. Detection sub-regions 246, 248, and 250 may be configured to detect corresponding secondary electron beams 236, 238, and 240 and generate corresponding signals used to reconstruct an image of surface area of wafer 230.
Conventionally, the power and quality of a light beam are monitored by a power meter and a beam profiler. However, an inspection system 300 may not be provided with a power meter or a beam profiler. As a result, power and quality of the light beam may not be monitored in situ, i.e., during the operation of inspection system 300.
According to embodiments of the present disclosure, beam monitoring system 350 may be included in inspection system 300 to monitor the power and profile of light beam 322 emitted from ACC module 320. Beam monitoring system 350 may include an image sensor 360, an optical system 370 disposed between wafer 340 and image sensor 360, and a controller 380. Optical system 370 may include one or more optical lenses that are configured to focus a secondary beam 324 onto image sensor 360. Secondary beam 324 may include light beam scattered from the wafer surface, light beam diffracted from the wafer surface, or a combination of the two. Image sensor 360 may be a Charge-Coupled Device (CCD) camera or a Complementary Metal-Oxide-Semiconductor (CMOS) sensor that detects secondary beam 324 to form an image of secondary beam 324. Controller 380 may be a computer configured to receive the image of secondary beam 324 from image sensor 360 and, based on the image of secondary beam 324, obtain a beam profile and beam power of light beam 322 emitted from ACC module 320. In addition, controller 380 may be configured to control ACC module 320 based on the beam profile and beam power of light beam 322. For example, based on the beam power of light beam 322, controller 380 may automatically adjust a working current of the ACC emitter included in ACC module 320 to keep an output power of the ACC emitter at a target power or to keep the output power stable. Meanwhile, based on the beam profile of light beam 322, controller 380 may adjust a beam shaper included in ACC module 320 to obtain a desired shape of the beam profile or a desired power distribution. Moreover, based on the beam power of light beam 322, controller 380 may be configured to monitor a location of the beam spot formed by light beam 322 on the wafer surface. In case of misalignment, controller 380 may be configured to re-align the beam spot to a field of view of electron beam tool 310. That is, controller 380 may control the ACC emitter to emit light beam 322 to a position on wafer 340 that is irradiated by primary electron beam 312.
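By way of illustration only, such power regulation could take the form of a simple proportional correction of the emitter's working current. The sketch below (in Python) is an assumption: the control law, gain, and current limits are illustrative and are not specified by this disclosure.

```python
# Hypothetical sketch of the power-regulation step described above.  The
# proportional control law, gain, and current limits are illustrative
# assumptions, not details taken from this disclosure.
def adjust_emitter_current(current_mA, measured_power_mW, target_power_mW,
                           gain_mA_per_mW=0.5, min_mA=0.0, max_mA=100.0):
    """Return an updated ACC emitter working current that nudges the
    measured beam power toward the target power."""
    error = target_power_mW - measured_power_mW
    new_current = current_mA + gain_mA_per_mW * error
    # Keep the drive current within the emitter's assumed safe operating range.
    return min(max(new_current, min_mA), max_mA)
```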
In some embodiments, such as the embodiment illustrated in
In some embodiments, such as the embodiments illustrated in
As shown in
At step 420, the controller may transform coordinates of the beam spot image to obtain a top view of the beam spot formed on the sample surface. The coordinate transformation may be performed based on the positions and layouts of optical system 370, ACC module 320, and image sensor 360. For example, controller 380 may transform the coordinates of at least one pixel in the beam spot image based on the following equation:
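One plausible form of this transform, assuming the tilt of image sensor 360 foreshortens one in-image axis by a factor of sin θ (the equation itself is not reproduced in this text), is:

x′ = x/m, y′ = y/(m·sin θ)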
where x′ and y′ are the Cartesian coordinates on the wafer, x and y are the coordinates in the beam spot image captured by image sensor 360, m is the magnification of optical system 370, and θ is the angle of image sensor 360 with respect to the sample surface, as illustrated in
As described previously, in some embodiments, the images are taken on a relatively rough surface, such as the surface of stage 330 illustrated in
To minimize or eliminate the influences of the rough surface on the beam profile, according to embodiments of the present disclosure, image sensor 360 may be configured to capture a plurality of beam spot images at different locations on a sample surface, and controller 380 may obtain an averaged image based on the plurality of images. This method is referred to as a locations average method.
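A minimal sketch of this locations average method is given below (Python with NumPy). The centroid-based registration of the individual images before averaging, and the function name average_beam_spot_images, are illustrative assumptions; the disclosure does not specify how the images are aligned before being averaged.

```python
import numpy as np

def average_beam_spot_images(images):
    """Average several beam spot images taken at different locations on a
    rough surface so that surface-induced intensity variations average out."""
    aligned = []
    for img in images:
        img = np.asarray(img, dtype=np.float64)
        total = img.sum()
        ys, xs = np.indices(img.shape)
        # Locate the intensity centroid of the beam spot in this image.
        cy = (ys * img).sum() / total
        cx = (xs * img).sum() / total
        # Shift the image so its centroid sits at the array center.
        shift_y = img.shape[0] // 2 - int(round(cy))
        shift_x = img.shape[1] // 2 - int(round(cx))
        aligned.append(np.roll(img, (shift_y, shift_x), axis=(0, 1)))
    return np.mean(aligned, axis=0)
```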
As shown in
Referring back to
Typically, an image sensor has a dynamic range of 8 bits, 10 bits, or 16 bits. Because a light beam (e.g., laser beam) may have a high intensity, or the beam intensity may vary dramatically from the center to the edge of the light beam, it may be hard to obtain the entire beam profile (i.e., the entire intensity distribution) of the light beam with an image sensor due to the limited dynamic range of the image sensor. According to some embodiments of the present disclosure, to overcome the limited dynamic range of image sensors and to avoid using over-exposed pixel information, the complete beam profile may be restored by using images of a beam spot taken at different exposure times. As used herein, the exposure time represents the length of time during which the film or digital sensor in an image sensor is exposed to light. This method may be referred to as a dynamic range extension method.
As shown in
At step 920, the controller may adjust at least one beam spot image by using a grey level magnification factor for the at least one beam spot image. Here, to overcome the limited dynamic range of the image sensor, the controller may first sort out useful pixel information by selecting, from all of the pixels in the image, pixels whose grey levels fall within the dynamic range (also referred to as “pixels containing useful information”). The pixels without useful information (also referred to as “un-useful pixels”) may be discarded or ignored by the controller. In other words, in some embodiments, only pixels with useful information are used in processing. The controller may adjust the beam spot image by, for example, multiplying the grey level of each pixel containing useful information in the beam spot image by a grey level magnification factor for the image.
In some embodiments, the controller may determine a grey level magnification factor for a beam spot image based on an exposure time at which the beam spot image is captured. The grey level magnification factor may be inversely related to the exposure time: the longer the exposure time, the lower the grey level magnification factor, and vice versa. For example, a grey level magnification factor for a first selected beam spot image may be calculated by dividing a first exposure time at which the first selected beam spot image is captured by a second exposure time at which a second selected beam spot image is captured.
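Under one consistent reading of this rule, each image is scaled to a common reference exposure, so the factor is inversely related to the image's own exposure time; the choice of the longest exposure as the reference in the sketch below is an assumption.

```python
# Hedged sketch: grey level magnification factors derived from exposure
# times, assuming every image is scaled up to the longest exposure, which
# serves as the reference.
def exposure_based_factors(exposure_times_ms):
    reference = max(exposure_times_ms)
    return [reference / t for t in exposure_times_ms]

# Example: exposures of 1 ms, 4 ms, and 16 ms give factors 16, 4, and 1.
print(exposure_based_factors([1.0, 4.0, 16.0]))
```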
In some other embodiments, the controller may determine a grey level magnification factor based on the grey levels of the same pixel at different exposure times. For example, the controller may determine a grey level magnification factor associated with a beam spot image by comparing grey levels of the pixels in the beam spot image with grey levels of the pixels in other beam spot images.
Specifically,
In the first exemplary method, the controller may determine an individual grey level magnification factor for each overlapping pixel pair based on a ratio of the grey levels of the pair of overlapping pixels. For example, for an overlapping pixel pair consisting of a first pixel from the first beam spot image and a second pixel from the second beam spot image, the controller may determine the individual grey level magnification factor by, for example, dividing the grey level of the second pixel by the grey level of the first pixel. The controller may then determine the grey level magnification factor associated with the first beam spot image by, for example, averaging the individual grey level magnification factors of all of the plurality of overlapping pixel pairs.
In the second exemplary method, assuming that the plurality of overlapping pixel pairs consist of a first set of pixels from the first beam spot image and a second set of pixels from the second beam spot image, the controller may obtain a first total grey level by summing the grey levels of the first set of pixels, and obtain a second total grey level by summing the grey levels of the second set of pixels. The controller may then determine the grey level magnification factor associated with the first beam spot image by dividing the first total grey level by the second total grey level.
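The two estimates can be sketched as follows. Here, “overlapping pixels” are assumed to be pixels holding useful (above the noise floor, below saturation) grey levels in both images, and both estimates are written, as an assumption, so that the resulting factor maps grey levels of the first image onto the scale of the second image.

```python
import numpy as np

def overlap_mask(img_a, img_b, noise_floor=5, saturation=250):
    """Pixels with useful grey levels in both images (8-bit thresholds assumed)."""
    useful_a = (img_a > noise_floor) & (img_a < saturation)
    useful_b = (img_b > noise_floor) & (img_b < saturation)
    return useful_a & useful_b

def factor_from_pixel_ratios(first_img, second_img, mask):
    # First method: average the per-pixel grey level ratios over the overlap.
    ratios = second_img[mask].astype(np.float64) / first_img[mask]
    return ratios.mean()

def factor_from_total_grey_levels(first_img, second_img, mask):
    # Second method: ratio of the summed grey levels over the overlap.
    return second_img[mask].sum() / first_img[mask].sum()
```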
After determining the grey level magnification factor associated with the first beam spot image, the controller may adjust the first beam spot image by, for example, multiplying a grey level of each pixel containing useful information in the first beam spot image by the grey level magnification factor. The controller may then combine the grey levels in the adjusted first beam spot image and the grey levels in the second beam spot image to obtain a beam profile.
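A minimal end-to-end sketch of this dynamic range extension step follows. The 8-bit noise-floor and saturation thresholds, the use of the longest exposure as the reference scale, and the averaging of pixels where several scaled images overlap are illustrative assumptions.

```python
import numpy as np

def extend_dynamic_range(images, exposure_times, noise_floor=5, saturation=250):
    """Merge beam spot images taken at different exposure times into one
    beam profile whose grey levels are on a common reference scale."""
    reference = max(exposure_times)
    shape = np.asarray(images[0]).shape
    profile = np.zeros(shape, dtype=np.float64)
    weight = np.zeros(shape, dtype=np.float64)
    for img, t in zip(images, exposure_times):
        img = np.asarray(img, dtype=np.float64)
        # Keep only pixels containing useful information in this image.
        useful = (img > noise_floor) & (img < saturation)
        factor = reference / t  # grey level magnification factor for this image
        profile[useful] += img[useful] * factor
        weight[useful] += 1.0
    # Average wherever more than one image contributed a useful grey level.
    return profile / np.maximum(weight, 1.0)
```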
Referring back to
In the example illustrated in
According to some embodiments of the present disclosure, the power of a light beam (“beam power”) can also be determined based on a beam spot image captured by an image sensor. This method is referred to as a power calibration method.
As shown in
The total grey level GL can be written as:
GL = Power × β × η

where Power is the power of the beam spot formed on a diffraction surface, which could be measured by a power meter, β is an energy diffraction efficiency onto the image sensor, and η is a power-to-grey-level conversion ratio. The product β × η represents the relationship between the Power and the total grey level GL. The β × η may be calibrated before operation of an inspection system (e.g., inspection system 100) and may be stored in a memory.
For example, to calibrate β × η, the controller may first obtain an image of a sample beam spot (hereinafter referred to as a “sample beam spot image”) of a sample light beam captured by an image sensor, such as image sensor 360 of
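A hedged sketch of this calibration and the subsequent in-situ power determination follows. It assumes that calibration simply records the ratio of the sample beam spot image's total grey level to the power measured by a power meter, and that the same sensor and exposure settings are used during calibration and operation.

```python
import numpy as np

def calibrate_beta_eta(sample_image, sample_power_mW):
    """Return the combined factor beta*eta (grey level per mW) from a sample
    beam spot image and the sample beam power measured with a power meter."""
    total_grey_level = np.asarray(sample_image, dtype=np.float64).sum()
    return total_grey_level / sample_power_mW

def beam_power_from_image(image, beta_eta):
    """Estimate beam power from a beam spot image using GL = Power * beta * eta."""
    total_grey_level = np.asarray(image, dtype=np.float64).sum()
    return total_grey_level / beta_eta
```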
According to the above disclosed embodiments, a beam monitoring system for monitoring a light beam emitted from an ACC module in an inspection system may include an image sensor configured to collect a sequence of images of a beam spot formed by the light beam at different exposure times of the image sensor, and a controller configured to combine the sequence of images to obtain a beam profile of the light beam. As a result, even if the image sensor has a dynamic range that is not sufficiently large to capture the complete beam profile of the light beam, the controller can still obtain the complete beam profile based on the partial beam profiles obtained from the sequence of images.
In addition, the controller may be configured to, for at least one of the sequence of images, transform coordinates of the image based on positions of the light beam and the image sensor with respect to the beam spot and a magnification factor of an optical system arranged between the image sensor and the surface. As a result, a projection of the light beam on the surface may be obtained, and an interaction between the light beam and the surface can be accurately evaluated.
Moreover, the image sensor may be configured to, for at least one exposure time, collect a plurality of images of the beam spot formed at different locations on the surface, and the controller may be configured to generate an averaged image based on the plurality of images and use the averaged image as the collected image for the at least one exposure time. As a result, the beam profile obtained based on the averaged image may be smooth, even if the surface on which the beam spot is formed is relatively rough.
Furthermore, the controller may be configured to obtain a total grey level of the beam spot by summing the grey levels of all pixels in the beam profile and determine a power of the light beam based on a predetermined relationship between the total grey level and the power. As a result, the beam power may be obtained in situ while the inspection system is operating, without using a power meter.
The embodiments may further be described using the following clauses:
1. A system for monitoring a beam in an inspection system, comprising:
2. The system of clause 1, wherein the controller is further configured to adjust an image of the sequence of images by using a grey level magnification factor associated with the image.
3. The system of clause 2, wherein the controller is further configured to adjust the image by multiplying a grey level of each pixel in the image that contains useful information by the grey level magnification factor associated with the image.
4. The system of either one of clauses 2 and 3, wherein the controller is configured to determine the grey level magnification factor associated with the image based on an exposure time at which the image is collected.
5. The system of either one of clauses 2 and 3, wherein the controller is configured to determine the grey level magnification factor associated with the image based on a comparison of grey levels of pixels in the sequence of images.
6. The system of clause 5, wherein the image is a first selected image of the sequence of images, and the controller is configured to determine the grey level magnification factor associated with the first selected image based on:
7. The system of any one of clauses 1 to 6, wherein the controller is further configured to, for a particular image of the sequence of images, transform coordinates of the particular image based on positions of the beam and the image sensor with respect to the beam spot and a magnification factor of an optical system arranged between the image sensor and the surface.
8. The system of any one of clauses 1 to 7, wherein
9. The system of clause 8, wherein the controller is configured to generate the averaged image by:
10. The system of any one of clauses 1 to 9, wherein the controller is further configured to:
11. The system of clause 10, wherein the controller is configured to determine the relationship between the total grey level and the power by:
12. The system of either one of clauses 10 and 11, wherein
13. The system of any one of clauses 1 to 12, wherein
14. The system of any one of clauses 1 to 13, wherein the inspection system further includes a charged particle beam tool that includes circuitry to emit a charged particle beam onto the surface.
15. The system of clause 14, wherein the controller is further configured to align the beam spot of the beam to a field of view of the charged particle beam tool.
16. The system of clause 14, wherein the inspection system further includes an Advanced Charge Control module that includes circuitry to emit the beam.
17. The system of clause 16, wherein the charged particle beam tool, the Advanced Charge Control module, and the image detector are disposed outside of a vacuum chamber.
18. A method of monitoring a beam in an inspection system, comprising:
19. The method of clause 18, further comprising adjusting an image of the sequence of images by using a grey level magnification factor associated with the image.
20. The method of clause 19, wherein adjusting the image of the sequence of images comprises:
21. The method of either one of clauses 19 and 20, further comprising determining the grey level magnification factor associated with the image based on an exposure time at which the image is collected.
22. The method of either one of clauses 19 and 20, further comprising determining the grey level magnification factor associated with the image based on a comparison of grey levels of pixels in the sequence of images.
23. The method of clause 22, wherein the image is a first selected image of the sequence of images, and the method further comprises:
24. The method of any one of clauses 18 to 23, further comprising, for a particular image of the sequence of images, transforming coordinates of the particular image based on positions of the beam and the image sensor with respect to the beam spot and a magnification factor of an optical system arranged between the image sensor and the surface.
25. The method of any one of clauses 18 to 24, further comprising:
26. The method of clause 25, wherein generating the averaged image comprises:
27. The method of any one of clauses 18 to 26, further comprising:
28. The method of clause 27, further comprising determining the relationship between the total grey level and the power by:
29. The method of either one of clauses 27 and 28, wherein
30. The method of any one of clauses 18 to 29, wherein
31. The method of any one of clauses 18 to 30, wherein
32. A non-transitory computer-readable medium that stores a set of instructions that is executable by a processor of a system for monitoring a beam in an inspection system to cause the system to perform a method, the method comprising:
33. The medium of clause 32, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
34. The medium of clause 33, wherein adjusting the image of the sequence of images comprises:
35. The medium of either one of clauses 33 and 34, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
36. The medium of either one of clauses 33 and 34, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
37. The medium of clause 36, wherein the image is a first selected image of the sequence of images, and the set of instructions that is executable by the processor of the system further causes the system to perform:
38. The medium of any one of clauses 32 to 37, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
39. The medium of any one of clauses 32 to 38, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
40. The medium of clause 39, wherein generating the averaged image comprises:
41. The medium of any one of clauses 32 to 40, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
42. The medium of clause 41, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
43. The medium of either one of clauses 41 and 42, wherein the inspection system further includes an emitter that emits the beam, and the set of instructions that is executable by the processor of the system further causes the system to perform:
44. The medium of any one of clauses 32 to 43, wherein the inspection system further includes an emitter that emits the beam, and the emitter includes a beam shaper, and the set of instructions that is executable by the processor of the system further causes the system to perform:
45. The medium of any one of clauses 32 to 44, wherein the inspection system further includes an emitter that emits the beam and a charged particle beam tool that emits a charged particle beam onto the surface, and the set of instructions that is executable by the processor of the system further causes the system to perform:
46. A system for monitoring a beam in an inspection system, comprising:
47. A method of monitoring a beam in an inspection system, comprising:
48. A non-transitory computer-readable medium that stores a set of instructions that is executable by a processor of a system for monitoring a beam in an inspection system to cause the system to perform a method, the method comprising:
49. A system for monitoring a beam in an inspection system, comprising:
50. The system of clause 49, wherein the controller is configured to generate the averaged image by:
51. A method of monitoring a beam in an inspection system, comprising:
52. The method of clause 51, wherein the generating the averaged image comprises:
53. A non-transitory computer-readable medium that stores a set of instructions that is executable by a processor of a system for monitoring a beam in an inspection system to cause the system to perform a method, the method comprising:
54. The medium of clause 53, wherein the generating the averaged image comprises:
55. A system for monitoring a beam in an inspection system, comprising:
56. The system of clause 55, wherein the controller is configured to determine the predetermined relationship between the total grey level and the power by:
57. The system of either one of clauses 55 and 56, wherein
58. The system of any one of clauses 55 to 57, wherein the inspection system further includes a charged particle beam tool that includes circuitry to emit a charged particle beam onto the surface.
59. The system of clause 58, wherein the inspection system further includes an Advanced Charge Control module that includes circuitry to emit the beam.
60. The system of clause 59, wherein the charged particle beam tool, the Advanced Charge Control module, and the image detector are disposed outside of a vacuum chamber.
61. A method of monitoring a beam in an inspection system, comprising:
62. The method of clause 61, further comprising determining the relationship between the total grey level and the power by:
63. A non-transitory computer-readable medium that stores a set of instructions that is executable by a processor of a system for monitoring a beam in an inspection system to cause the system to perform a method, the method comprising:
64. The non-transitory computer-readable medium of clause 63, wherein the set of instructions that is executable by the processor of the system further causes the system to perform:
It will be appreciated that the present invention is not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from the scope thereof. It is intended that the scope of the invention should only be limited by the appended claims.
This application claims priority to International Application No. PCT/EP2019/072678, filed Aug. 26, 2019, and published as WO 2020/052943 A1, which claims priority of U.S. application 62/730,972 which was filed on Sep. 13, 2018. The contents of these applications are incorporated herein by reference in their entireties.