WAFER NOTCH DETECTION

Abstract
Notch detection methods and modules are provided for efficiently estimating a position of a wafer notch. An image of one or more specified regions of the wafer is captured, and a principal angle is identified in a transformation of the captured image which is converted into polar coordinates. The wafer axes are then recovered from the identified principal angle as the dominant orientations of geometric primitives in the captured region. The captured region may be selected to include the center of the wafer and/or certain patterns that enhance the identification and recovery of the axes. Multiple images and/or regions may be used to optimize image quality and detection efficiency.
Description
FIELD OF THE INVENTION

The present invention relates to the field of semiconductor technology, and more particularly, to identification of the position of the wafer notch.


BACKGROUND OF THE INVENTION

Typically, the wafer orientation is conveyed by the position of the notch, which indicates the crystallographic orientation of the wafer. In the prior art, wafer orientation computation based on notch orientation is time consuming, as it requires an exhaustive search over the full angular range of 360 degrees. Typically, the result of wafer orientation based on the notch alone is limited in accuracy and requires an additional, time-consuming step of “Fine Alignment.” In order to avoid long mechanical movements, some of the existing solutions require additional hardware such as extra sensors, or one or more additional cameras that cover a large part of the wafer within their field of view. All of these add complexity and cost.


BRIEF SUMMARY OF THE INVENTION

One aspect of the present invention provides a method of estimating a position of a wafer notch, the method comprising capturing an image of one or more specified region(s) of the wafer, identifying a principal angle in a transformation of the captured image which is converted into polar coordinates, and recovering one or more wafer axes from the identified principal angle.


These, additional, and/or other aspects and/or advantages of the present invention are set forth in the detailed description which follows, are possibly inferable from the detailed description, and/or are learnable by practice of the present invention.





BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of embodiments of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections throughout.


In the accompanying drawings:



FIG. 1 is a high level schematic illustration of a wafer with its notch and a notch detection module associated with an optical system within a metrology system, according to some embodiments of the invention;



FIG. 2 is a high level schematic illustration of intermediate steps in a street orientation algorithm, operable by the notch detection module, according to some embodiments of the invention;



FIG. 3 is an illustration of exemplary inputs to which the street orientation algorithm may be applied, displaying increasing levels of noise, according to some embodiments of the invention;



FIG. 4 is an illustration of exemplary input images from different wafer regions for selection according to their characteristics, according to some embodiments of the invention;



FIG. 5 is a high level schematic flowchart of a method, according to some embodiments of the invention.





DETAILED DESCRIPTION OF THE INVENTION

Prior to the detailed description being set forth, it may be helpful to set forth definitions of certain terms that will be used hereinafter.


The term “geometric primitives” in an image, as used in this application, refers to basic forms, objects, and patterns in the image, such as lines, simple shapes, or recurring elements.


The term “street orientation” as used in this application refers to orientations of geometric primitives in the image, e.g., with respect to a given grid. The term “street orientation algorithm” as used in this application refers to ways of deriving the street orientation.


It is further noted that there is a strong geometrical correlation between one wafer axis and the other wafer axes, and respectively between the principal angle with respect to one wafer axis and the principal angles with respect to the other wafer axes. Geometrically, the wafer axes are separated by multiples of 90°, as are the principal angles with respect to the wafer axes. Hence, in the following description, any aspect relating to one wafer axis and/or one principal angle is to be understood as potentially relating to any number of wafer axes and/or principal angles. For example, any orientation measurement may be carried out with respect to one or more wafer axes.


With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.


Before at least one embodiment of the invention is explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.


Notch detection methods and modules are provided for efficiently estimating a position of a wafer notch. An image of one or more specified regions of the wafer is captured, and a principal angle is identified in a transformation of the captured image which is converted into polar coordinates. The wafer axes are then recovered from the identified principal angle as the dominant orientations of geometric primitives in the captured region. The captured region may be selected to include the center of the wafer and/or certain patterns that enhance the identification and recovery of the axes. Multiple images and/or regions may be used to optimize image quality and detection efficiency. The angular orientation of the wafer is measured with high accuracy in minimal time, by simultaneously minimizing the algorithmic complexity and the mechanical movements of the vision system with respect to the wafer. The notch position may be detected implicitly, without directly sensing it. Certain embodiments overcome challenges produced by the tool architecture, such as the limited field of view of the vision system and the time-consuming mechanical movements, both lateral and rotational, for positioning the camera with respect to the wafer, as well as image corruption by various types of noise.


It is assumed that the orientation of the wafer at the beginning of the procedure is arbitrary and that the wafer deviation from the chuck center is limited to a 2 mm error. The computation of wafer orientation may be triggered in the system as part of either “Train” or “Run” sequences. The time optimization of the “Run” sequence is the most critical because it directly affects the throughput of the tool. Disclosed modules and methods use a Street Orientation Algorithm to reduce the search space of orientation and notch position to four points; for example, a robust algorithm for Street Orientation may be used to enable skipping the fine alignment step. In addition, anchor points may be used to avoid long mechanical movements from the center of the wafer, as described below.



FIG. 1 is a high level schematic illustration of a wafer 60 with its notch 70 and a notch detection module 107 associated with an optical system 105 within a metrology system 100, according to some embodiments of the invention. It is noted that disclosed methods and modules may be applied to systems in the semiconductor industry other than metrology systems and that the latter merely serve in the present disclosure as a non-limiting example.


The orientation of wafer 60 is uniquely defined by the position of its notch 70. The periodic layout of devices printed on wafer 60 and the borders of dies are aligned along the Cartesian axes 61, 62, with notch 70 at the end-point of y axis 62.


Instead of detecting notch 70 visually at the periphery of wafer 60, e.g., by imaging the whole wafer 60 with respective axes 61, 62 and center 65, or by imaging the wafer periphery, notch detection module 107 merely captures an image 110 of a central region 115 of wafer 60, which may or may not include wafer center 65, and derives from the imaged region the orientations 111, 112 of wafer axes 61, 62. For example, central region 115 may comprise unique pattern(s) with a known position with respect to the center 65 of the wafer. Notch detection module 107 uses the derived orientations 111, 112 to suggest possible locations 71A-71D for notch 70, and in certain embodiments also proceeds to determine which of locations 71A-71D is the actual notch location. Image(s) 110 of central region(s) 115 may be selected from multiple images 110 and/or multiple regions 115, e.g., according to image quality or derivation quality; alternatively, derivations from multiple images 110 and/or multiple regions 115 may be statistically analyzed to derive more accurate location estimations.
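
By way of a non-limiting illustration only, the following Python sketch enumerates the four candidate notch locations 71A-71D on the wafer edge once the street orientation θ has been derived; the function name, the chuck-centered coordinate frame, and the 300 mm wafer diameter are assumptions made for the example and are not part of the present disclosure.

import math

def candidate_notch_positions(theta_deg, wafer_radius_mm=150.0):
    """Enumerate the four candidate notch positions on the wafer edge.

    Given a recovered street orientation theta (in degrees), the notch lies
    at the wafer edge along one of the two recovered axes, i.e., at one of
    the four angles theta + k*90 degrees. Returns (angle_deg, (x_mm, y_mm))
    tuples in a chuck-centered frame; which candidate is the actual notch is
    resolved later by notch or anchor pattern search.
    """
    candidates = []
    for k in range(4):
        angle = (theta_deg + 90.0 * k) % 360.0
        x = wafer_radius_mm * math.cos(math.radians(angle))
        y = wafer_radius_mm * math.sin(math.radians(angle))
        candidates.append((angle, (x, y)))
    return candidates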


Notch detection module 107 may implement any of the following components of the orientation computation task:


i. A “street orientation algorithm” may be used to find the dominant orientation (designated by the angle θ) of geometric primitives in the field of view (FOV) of imaging device 100, for example with respect to image 110 acquired at an estimated center of wafer 60.


ii. A “notch detection algorithm” may comprise searching specified notch pattern(s) in up to three (out of four) possible locations 71A-71D at the edge of wafer 60 according to detected orientations 111, 112.


iii. An “anchor point training algorithm” may comprise searching and defining the unique pattern(s) for selection of region 115, e.g., in close proximity to center 65 of wafer 60.


iv. An “anchor point detection algorithm” may comprise pattern searching of anchor point template(s) near center 65 of wafer 60, as a basis for the anchor point training algorithm.


It is noted that the estimation of street orientation, and hence of the orientation of the wafer axes, may be carried out by itself. Additionally, the notch location may be derived from the street orientation, using anchor points or patterns as proxies for the notch location, without direct imaging of the notch region.



FIG. 2 is a high level schematic illustration of intermediate steps in a street orientation algorithm 101, operable by notch detection module 107, according to some embodiments of the invention. In certain embodiments, street orientation algorithm 101 relies on image analysis in the Fourier domain. Unlike other, edge-based approaches, the inventors found this method to be extremely fast and robust to noisy, out-of-focus, and low-contrast inputs (images 110). The steps of the proposed algorithm 101 are illustrated in FIG. 2.


Upon captured image 122, shown in schematic wafer coordinates and designated I(x,y), the 2D (two dimensional) Discrete Fourier Transform is applied, and the absolute values of the Fourier coefficients are computed and presented as an image 124, denoted J(w,u). Then, J(w,u) is converted to polar coordinates to yield Jp(r,θ) (illustrated in image 126), and the orthogonal projection thereof onto the θ axis 128 is used for the orientation recovery, using the resulting peaks separated by 90°, giving θ and θ+90°. Once θ is derived, rotated image 130 may be generated from captured image 122, with rotated image 130 characterized by orientations 111, 112 that are congruent with wafer axes 61, 62. Notch 70 is located at one of the ends of one of orientations 111, 112. In certain embodiments, an ambiguity in the relative angle between orientations 111, 112 and wafer axes 61, 62 (among θ 71A, θ+90° 71C, θ+180° 71D and θ+270° 71B) may be resolved via notch pattern search (in “Train” sequences) or anchor pattern search (in “Run” sequences), both possibly implemented as multi-scale rigid template matching with known scale and orientation.
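
A minimal Python/NumPy sketch of the Fourier-domain sequence outlined above (2D DFT magnitude, polar resampling, orthogonal projection onto the θ axis, peak picking) is given below; the function name, the angular and radial sampling resolutions, and the normalization details are illustrative assumptions rather than the disclosed implementation.

import numpy as np

def estimate_street_orientation(image, n_angles=360, n_radii=None):
    """Estimate the dominant ('street') orientation theta of an image.

    Sketch of the Fourier-domain approach: compute the magnitude of the 2D
    DFT (J), resample it on a polar grid (Jp), project it orthogonally onto
    the theta axis, and take the strongest peak. Returns theta in degrees in
    [-90, 90); the recovered axis pair is then {theta, theta + 90 deg}.
    """
    img = np.asarray(image, dtype=float)
    img = (img - img.mean()) / (img.std() + 1e-12)        # normalize intensities

    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))  # |DFT|, i.e. J(w,u)
    spectrum = np.log1p(spectrum)                         # compress dynamic range

    h, w = spectrum.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    if n_radii is None:
        n_radii = int(min(h, w) // 2)

    # Resample the magnitude spectrum onto a polar grid, Jp(r, theta).
    thetas = np.deg2rad(np.linspace(-90.0, 90.0, n_angles, endpoint=False))
    radii = np.linspace(1.0, min(h, w) / 2.0 - 1.0, n_radii)
    rr, tt = np.meshgrid(radii, thetas)                   # shape (n_angles, n_radii)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    polar = spectrum[ys, xs]

    # Orthogonal projection onto the theta axis; the strongest peak gives theta.
    profile = polar.sum(axis=1)
    return float(np.rad2deg(thetas[np.argmax(profile)]))

Note that in such a sketch the spectral energy of periodic streets appears perpendicular to their direction in the image; since the recovered axes are separated by 90°, this only permutes θ and θ+90° and does not affect the recovered axis pair.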


The inventors have performed an accuracy and time requirements analysis, exemplified in the following by a few experimental results which demonstrate that the robustness of the proposed street orientation approach 101 eliminates the need for any additional fine alignment step.



FIG. 3 is an illustration of exemplary inputs 150 to which street orientation algorithm 101 may be applied, displaying increasing levels of noise, according to some embodiments of the invention. Street orientation algorithm 101 is applied to input image 122 with synthetically added noise of varying level (single run) to yield images 122A-122F with increasing levels of noise (the black and white illustrations miss some of the color coded information, especially in images 122C, 122D). The mean and the standard deviation of the intensities of input images 122 are standardized to be between 0 and 1, with the noise standard deviation ranging from 0 (122A, resulting in an accurate θ=−40.481° for the specifically illustrated example) through 1 (122B, θ=−40.486°), 2 (122C, θ=−40.471°) and 4 to 6 (122D-F, with θ=−40.486°, θ=−40.504° and θ=−45.000°, respectively). The inventors thus observed that even under very heavy added white noise with standard deviation 5 (122E), the error in θ resulting from street orientation algorithm 101 and/or notch detection module 107 is much smaller than 0.25°, which practically eliminates the need for fine alignment in system 100 under many circumstances.
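
Purely as an illustrative check, and assuming the estimate_street_orientation() sketch given above (itself an assumption, not part of the original disclosure), the following Python snippet generates a synthetic striped image at a known angle, adds white Gaussian noise of increasing standard deviation, and reports the resulting axis error:

import numpy as np

def noise_robustness_check(true_theta_deg=-40.5, size=512,
                           sigmas=(0.0, 1.0, 2.0, 4.0, 5.0, 6.0)):
    """Report the orientation error under increasing additive white noise."""
    yy, xx = np.mgrid[0:size, 0:size]
    t = np.deg2rad(true_theta_deg)
    # Synthetic 'streets': a striped pattern whose wave vector points at theta.
    stripes = np.sin(2.0 * np.pi * (xx * np.cos(t) + yy * np.sin(t)) / 16.0)
    stripes = (stripes - stripes.mean()) / stripes.std()
    rng = np.random.default_rng(0)
    for sigma in sigmas:
        noisy = stripes + sigma * rng.standard_normal(stripes.shape)
        est = estimate_street_orientation(noisy)   # from the sketch above
        # Axes repeat every 90 deg, so measure the error modulo 90.
        err = abs(((est - true_theta_deg + 45.0) % 90.0) - 45.0)
        print(f"noise sigma {sigma}: estimated {est:.3f} deg, axis error {err:.3f} deg")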



FIG. 4 is an illustration of exemplary input images 155 from different wafer regions 115, for selection of image 122 according to their characteristics, according to some embodiments of the invention. It is noted that different wafer regions 115 may comprise different regions on one wafer 60 or on different wafers 60.


In certain embodiments, multiple images 122G-J may be captured from different regions 115 on wafer 60, and one or more images 122 may be selected therefrom by notch detection module 107 for applying the notch detection analysis, such as street orientation algorithm 101. In the illustrated non-limiting example 155, input images 122G-J may be acquired on wafer 60 with arbitrary orientations at a specified number of different locations. Since no additional rotation of wafer 60 is applied between acquisitions, the orientation result (θ and/or notch position) is expected to be the same, so that the results from multiple images 122 may be compared and analyzed, e.g., statistically, by measuring scattering metrics of the results. Experiments may be repeated for several wafers 60, at various orientations and various imaging conditions (contrast and focus), to optimize the selection of region(s) 115. In addition, comparing edge-based (EB) algorithms to Fourier Transform-based (FTB) algorithms 101 illustrates the superiority of the latter.


In 29 images 122 taken from different wafers 60 and/or different regions 115, and with respect to accuracy thresholds of 0.3° on the range of the measured angles θ and 0.2° on their standard deviation, all FTB measurements remained within both thresholds, while the range threshold was exceeded in 11 EB measurements and the standard deviation threshold in 9 EB measurements. FIG. 4 illustrates as examples image 122G suffering from insufficient illumination and low contrast (in FTB θ_range=0.050°, θ_STD=0.018°, while in EB θ_range=0.200°, θ_STD=0.057°), image 122H suffering from saturation and low contrast (in FTB θ_range=0.112°, θ_STD=0.032°, while in EB θ_range=0.400°, θ_STD=0.119°), image 122I being out of focus (in FTB θ_range=0.031°, θ_STD=0.032°, while in EB θ_range=0.780°, θ_STD=0.266°), and image 122J exhibiting an atypical wafer design (in FTB θ_range=0.105°, θ_STD=0.032°, while in EB θ_range=0.390°, θ_STD=0.168°), all exhibiting the better performance of the disclosed invention. It is noted that the results of street orientation algorithm 101 were stable even for out-of-focus and low-contrast inputs 122, and well below the limits which would require additional refinement steps. It is further noted that, while from a theoretical point of view the most time-consuming operator is the Fourier Transform (O(N log N), where N is the number of pixels in input image 122), in practice the computation time is negligible in comparison to the time required for the mechanical movement of the camera in optical system 105 with respect to wafer 60.
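
As a simple illustration of such scattering metrics (the function name and threshold parameter names below are assumptions made for the example), the range and standard deviation of repeated orientation measurements can be checked against the thresholds quoted above:

import numpy as np

def orientation_scatter_metrics(thetas_deg, range_limit=0.3, std_limit=0.2):
    """Scatter metrics for repeated orientation measurements of one wafer.

    Since the wafer is not rotated between acquisitions, all measurements of
    theta should agree; the range and the standard deviation of the measured
    angles serve as accuracy metrics, compared here against the 0.3 deg and
    0.2 deg thresholds mentioned in the text.
    """
    t = np.asarray(thetas_deg, dtype=float)
    theta_range = float(t.max() - t.min())
    theta_std = float(t.std(ddof=1)) if t.size > 1 else 0.0
    return {
        "theta_range": theta_range,
        "theta_std": theta_std,
        "within_limits": theta_range <= range_limit and theta_std <= std_limit,
    }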



FIG. 5 is a high level schematic flowchart of a method 200, according to some embodiments of the invention. Method 200 comprises estimating a position of a wafer notch (stage 210) and may be at least partially carried out by at least one computer processor (stage 280).


Method 200 may comprise capturing an image of a specified region of the wafer (stage 220), e.g., capturing an image of a central region of the wafer (stage 222), and possibly capturing multiple images and selecting images for further processing according to image characteristics (stage 225). Method 200 may employ any algorithm for finding the dominant orientation (designated by the angle θ) of geometric primitives in the field of view (FOV) of imaging device 100, for example with respect to image 110 acquired at an estimated center of wafer 60 (stage 228).
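
One simple possibility for the selection of stage 225 is sketched below, under the assumption that focus and contrast are the relevant image characteristics; the metric choices and the function name are illustrative only and are not part of the disclosed method.

import numpy as np

def select_best_image(images):
    """Select the captured image best suited for orientation analysis.

    Illustrative criterion only: combine a contrast proxy (intensity standard
    deviation) with a sharpness proxy (mean squared gradient) and penalize
    heavily saturated images; other image characteristics could be used.
    """
    def score(img):
        img = np.asarray(img, dtype=float)
        contrast = img.std()
        gy, gx = np.gradient(img)
        sharpness = np.mean(gx ** 2 + gy ** 2)
        saturated_fraction = np.mean(img >= img.max() - 1e-9)
        return contrast * sharpness * (1.0 - min(saturated_fraction, 0.5))
    return max(images, key=score)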


Method 200 may further comprise transforming the captured image (stage 230), calculating Fourier transform coefficients for the captured image (stage 235), converting the transformed image into polar coordinates (stage 240), and projecting the converted transformed image orthogonally (stage 245), and may further comprise identifying principal angle(s) in the transformation of the captured image which is converted into polar coordinates (stage 250).


In certain embodiments, method 200 comprises recovering wafer axis(es) from the identified principal angle(s) (stage 260) and identifying the wafer notch from the recovered wafer axis(es) (stage 270), for example by searching specified notch pattern(s) along the recovered wafer axis(es) (stage 272); by searching and defining unique pattern(s) to be captured in the image, which allow notch identification (stage 274) and respective selection of the captured region, e.g., in close proximity to the center of the wafer; and/or by pattern searching of anchor point template(s) that indicate the axis along which the notch is located (stage 276).
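
The four-fold ambiguity left after stage 260 may, for example, be resolved by template matching as in stages 272 and 276. The following Python sketch uses a brute-force normalized cross-correlation over the four candidate rotations; it is a simplified stand-in, with assumed names, for the multi-scale rigid template matching mentioned above.

import numpy as np

def resolve_quadrant(image, template):
    """Resolve the four-fold orientation ambiguity via template matching.

    The de-rotated image is compared against a trained notch or anchor
    template at the four candidate rotations (0, 90, 180, 270 degrees) using
    normalized cross-correlation; the best-matching rotation indicates the
    axis end at which the notch lies. Brute-force sketch only.
    """
    def best_ncc(img, tmpl):
        th, tw = tmpl.shape
        t = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-12)
        best = -np.inf
        for y in range(img.shape[0] - th + 1):   # dense search, slow but simple
            for x in range(img.shape[1] - tw + 1):
                patch = img[y:y + th, x:x + tw]
                p = (patch - patch.mean()) / (patch.std() + 1e-12)
                best = max(best, float(np.mean(p * t)))
        return best

    img = np.asarray(image, dtype=float)
    tmpl = np.asarray(template, dtype=float)
    scores = [best_ncc(np.rot90(img, k), tmpl) for k in range(4)]
    return int(np.argmax(scores)) * 90   # additional rotation (deg) matching the template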


Advantageously, using street orientation 101 to calculate the wafer orientation narrows the search space of orientations down to four possibilities, and the quality of street orientation 101 exemplified above eliminates the need for the extra time required by a fine alignment step. Subsequently, using anchor points allows performing a short mechanical movement to identify the quadrant, i.e., the axis along which the notch is located. For example, an anchor point close to the center of the wafer may be selected to allow little or no travelling of the optical head, which is advantageous with respect to prior art movement to the location of the notch at the edge of the wafer. Method 200 thus requires short-stroke movements and hence a shorter operation time.


In the above description, an embodiment is an example or implementation of the invention. The various appearances of “one embodiment”, “an embodiment”, “certain embodiments” or “some embodiments” do not necessarily all refer to the same embodiments.


Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.


Certain embodiments of the invention may include features from different embodiments disclosed above, and certain embodiments may incorporate elements from other embodiments disclosed above. The disclosure of elements of the invention in the context of a specific embodiment is not to be taken as limiting their use to the specific embodiment alone.


Furthermore, it is to be understood that the invention can be carried out or practiced in various ways and that the invention can be implemented in certain embodiments other than the ones outlined in the description above.


The invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.


Meanings of technical and scientific terms used herein are to be commonly understood as by one of ordinary skill in the art to which the invention belongs, unless otherwise defined.


While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention. Accordingly, the scope of the invention should not be limited by what has thus far been described, but by the appended claims and their legal equivalents.

Claims
  • 1. A method of estimating a position of a wafer notch comprising: capturing an image of at least one specified region of the wafer; identifying at least one principal angle in a transformation of the image which is converted into polar coordinates; and recovering at least one wafer axis from the at least one principal angle.
  • 2. The method of claim 1, wherein at least one of the identifying step and the recovering step is carried out at least partially by a computer processor.
  • 3. The method of claim 1, wherein the specified region of the wafer includes a center of the wafer.
  • 4. The method of claim 1, wherein the transformation is a two dimensional Discrete Fourier Transform.
  • 5. The method of claim 1, further comprising: identifying the wafer notch from the at least one wafer axis.
  • 6. The method of claim 5, wherein the identifying of the wafer notch is carried out by: searching at least one specified notch pattern along the at least one wafer axis.
  • 7. The method of claim 1, further comprising: searching and defining at least one unique pattern to be captured in the image, to facilitate notch identification.
  • 8. The method of claim 1, further comprising: capturing multiple images; and selecting therefrom images for further processing according to image characteristics.
  • 9. A notch detection module associated with an optical system, comprising: a notch detection module configured to identify at least one principal angle in a transformation, converted into polar coordinates, of an image captured by the optical system of at least one specified region of a wafer, and further configured to recover at least one wafer axis from the at least one identified principal angle.
  • 10. The notch detection module of claim 9, wherein the transformation is a two dimensional Discrete Fourier Transform.
  • 11. The notch detection module of claim 9, wherein the notch detection module is further configured to identify a position of a wafer notch of the wafer from the at least one wafer axis.
  • 12. The notch detection module of claim 11, wherein the notch detection module is configured to identify the wafer notch by searching at least one specified notch pattern along the at least one wafer axis.
  • 13. The notch detection module of claim 9, wherein the notch detection module is further configured to search and define at least one unique pattern to be captured in the image, to facilitate notch identification.
  • 14. The notch detection module of claim 9, wherein the optical system is configured to capture multiple images and the notch detection module is further configured to select therefrom images for further processing according to image characteristics.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is filed under 35 U.S.C. §120 and §365(c) as a continuation of International Patent Application Serial No. PCT/US15/15270, filed on Feb. 10, 2015, which application claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Patent Application No. 61/939,131 filed on Feb. 12, 2014, which applications are incorporated herein by reference in their entirety.

Provisional Applications (1)
Number Date Country
61939131 Feb 2014 US
Continuations (1)
Number Date Country
Parent PCT/US2015/015270 Feb 2015 US
Child 14958535 US