Apparatus and Method for Wafer Edge Defects Detection

Information

  • Patent Application
  • Publication Number
    20090116727
  • Date Filed
    August 08, 2008
  • Date Published
    May 07, 2009
Abstract
A substrate illumination and inspection system provides for illuminating and inspecting a substrate, particularly the substrate edge. The system includes an image processor to automatically detect and characterize defects on the wafer's edge.
Description
FIELD

The present disclosure relates to illumination and inspection of a substrate, particularly illumination and inspection of specular surfaces of a silicon wafer edge with diffuse light from a plurality of light sources for enhanced viewing of the wafer edge.


BACKGROUND

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.


Substrate processing, particularly silicon wafer processing, involves deposition and etching of films and other processes at various stages in the eventual manufacture of integrated circuits. Because of this processing, contaminants, particles, and other defects develop in the edge area of the wafer. These include particles, contaminants, and other defects such as chips, cracks, or delamination that develop in the edge exclusion zones (near edge top surface and near edge back surface) and on the edge (including top bevel, crown, and bottom bevel) of the wafer. It has been shown that a significant percentage of yield loss, in terms of final integrated circuits, results from particulate contamination originating from the edge area of the wafer and causing killer defects inside the FQA (fixed quality area) portion of the wafer. See, for example, Braun, The Wafer's Edge, Semiconductor International (Mar. 1, 2006), for a discussion of defects and wafer edge inspection methodologies.


Attempts at high magnification inspection of this region of the wafer have been confounded by poor illumination of these surfaces. It is difficult to properly illuminate and inspect the edge area of an in-process wafer. An in-process wafer typically has a reflective specular ("mirror") surface. Attempts at illuminating this surface from a surface normal position frequently result in viewing reflections of the surrounding environment of the wafer edge, making it difficult to visualize defects or to distinguish defects from reflective artifacts. Further, the wafer edge area has a plurality of specular surfaces extending from the near edge top surface across the top bevel, the crown, and the bottom bevel to the near edge bottom surface. These too cause non-uniform reflection of the light needed for viewing the wafer edge area and for defect inspection. In addition, color fidelity of observed films and contrast of lighting are important considerations for any wafer edge inspection system.


Therefore, there is a need for a system that adequately illuminates the edge area of a wafer for inspection. It is important that the system provide for illumination and viewing suitable for a highly reflective surface extending over a plurality of surfaces and for a variety of defects to be observed. The system must provide for efficient and effective inspection of the edge area for a variety of defects.


SUMMARY

Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.


The object of the present invention is to provide a color image-based edge defect inspection and review system. It comprises: an illuminator to provide uniform diffused illumination across the five wafer edge regions (top near edge surface, top bevel, apex, bottom bevel, and bottom near edge surface); an optical imaging subsystem to image a portion of the wafer edge supported by a wafer chuck; a positioning assembly to orientate the optical imaging subsystem to the user-defined inspection angle; an eccentricity sensor to actively measure the center offset of a wafer relative to the rotation center of the wafer chuck; a wafer chuck to hold the backside of a wafer on the supporting pins; a linear stage to move a wafer from its load position to the inspection position; a rotary stage to rotate the wafer in a step-and-stop fashion; and a control console to provide tool control functions as well as at least the following capabilities: 1) automatic capture of defects of interest with sufficient sensitivity and speed, 2) automatic defect detection and classification, 3) automatic measurement of wafer edge exclusion width, and 4) automatic reporting of inspection results to the yield management system of a semiconductor fabrication plant.


In accordance with the present disclosure, a substrate illumination system has a light diffuser with an opening extending along at least a portion of its length for receiving an edge of a wafer. The system also comprises a plurality of light sources in proximity to the light diffuser. The system further comprises an optic for viewing the wafer, wherein the optic is exterior to the light diffuser and is angled off of the wafer edge surface normal position. A processor is provided to automatically characterize defects.


In an additional aspect, the system comprises an illumination control system for independently controlling the plurality of light sources. Individually or by groups or sections, the plurality of lights can be dimmed or brightened. In addition, the plurality of lights can change color, individually or by groups or sections. Yet another aspect of the system comprises a rotation mechanism for rotating the optic from a position facing the top of the wafer to a position facing the bottom of the wafer. In an additional aspect of the system, the plurality of light sources is an LED matrix or, alternatively, a flexible OLED or LCD. In this aspect the flexible OLED or LCD can act in place of the plurality of lights or in place of both the light diffuser and the plurality of lights. The light sources can also be one or more halogen lamps. The one or more halogen lamps can be coupled to an array of fiber optics.


In yet an additional aspect, the system comprises a method for imaging the specular surface of a substrate. This method comprises isolating a portion of the substrate in a light diffuser, emitting light onto the specular surface to be imaged, and imaging the specular surface with an optic positioned at an angle off the specular surface normal from a position exterior to the light emitter.





DRAWINGS

The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.



FIG. 1 shows a schematic top view of the substrate illumination system of the present disclosure;



FIG. 2 shows a schematic side view of the system as shown in FIG. 1;



FIG. 3 shows a detailed view of a portion of the view shown in FIG. 2;



FIG. 4 shows a schematic side view of an alternative embodiment of the substrate illumination system;



FIG. 5 shows a detailed view of a portion of the view shown in FIG. 4;



FIG. 6 shows a schematic side view of another alternative embodiment of the substrate illumination system;



FIG. 7 shows a perspective view of yet another embodiment of the substrate illumination system;



FIG. 8 shows a top plan view of the alternative embodiment of the substrate illumination system as shown in FIG. 7;



FIG. 9 shows a perspective view of a wafer edge inspection and review system of the present disclosure;



FIG. 10 shows a cross section view of the illuminator shown in FIG. 9;



FIG. 11 shows an enlarged cross section view of the wafer edge regions;



FIG. 12 shows a schematic view of the optical imaging subsystem shown in FIG. 9;



FIG. 13 shows the inspection angles of the optical imaging subsystem shown in FIG. 9;



FIG. 14 shows the angle between the principal axis of the optical imaging subsystem and the normal of the edge portion;



FIG. 15 illustrates the step-and-stop angular motion of a wafer;



FIG. 16 shows a user interface for semi-automated defect review;



FIG. 17 shows the process to review a specific defect of interest;



FIGS. 18 and 19 show an example of edge exclusion measurement;



FIG. 20 shows a perspective view of the wafer edge inspection and review system of the present disclosure;



FIG. 21 represents a flow chart describing the system;



FIG. 22 represents a diagram of the system shown in FIG. 20;



FIG. 23a represents the three camera imaging systems shown in FIG. 20;



FIG. 23b represents the rotary inspection camera shown in FIG. 20;



FIGS. 24a-24b represent configurations of the camera imaging system shown in FIG. 20;



FIG. 25 represents an imaging map of a wafer;



FIG. 26 represents a defect map plotted on the image map shown in FIG. 25;



FIG. 27 represents bright and dark defects;



FIGS. 28 and 29 represent a graphical user interface showing defect images and statistical information;



FIGS. 30 and 31 represent statistical information related to defects on a wafer;



FIG. 32 represents a graphical user interface showing the categorization of defects;



FIG. 33 is a flowchart illustrating one embodiment of the procedure for automatic wafer edge defect detection and classification of the present disclosure;



FIG. 34 represents the defect detection engine;



FIG. 35 represents an algorithm to classify pixels;



FIG. 36 represents a boundary detection algorithm to determine defect size and characteristics; and



FIG. 37 represents the global thresholding algorithm for classifying defects.





DETAILED DESCRIPTION

The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.


Referring to FIGS. 1, 2, and 3 a substrate illumination system 10 (the “system”) of the disclosure has a diffuser 12 with a slot 14 along its length and a plurality of lights 16 surrounding its exterior radial periphery. Exterior of the diffuser 12 is an optic 18 that is connected to an imaging system 20 for viewing a substrate 22 as the substrate is held within the slot 14. The plurality of lights 16 are connected to a light controller 34.


The system 10 can be used to uniformly illuminate for brightfield inspection all surfaces of an edge area of the substrate 22, including a near edge top surface 24, a near edge bottom surface 26, a top bevel 28, a bottom bevel 30, and a crown 32.


The optic 18 is a lens or a combination of lenses, prisms, and related optical hardware. The optic 18 is aimed at the substrate 22 at an angle off a surface normal to the crown 32 of the substrate 22. The angle of the optic 18 advantageously prevents a specular surface of the substrate 22 from reflecting an image of the optic 18 back into the optic 18, whereby the optic 18 would "see itself." The viewing angle is typically 3 to 6 degrees off normal. Some optimization outside of this range is possible depending on illuminator alignment relative to the substrate 22 and the specific optic 18 configuration.


The imaging system 20 is for example a charge-coupled device (CCD) camera suitable for microscopic imaging. The imaging system 20 may be connected to a display monitor and/or computer (not shown) for viewing, analyzing, and storing images of the substrate 22.


Diffuser 12 is formed of a translucent material suitable for providing uniform diffuse illumination. The diffuser 12 may be formed of a frosted glass, a sand blasted quartz or a plastic or the like, where light passing through it is uniformly diffused. In a preferred embodiment, the diffuser 12 is a circular cylinder as illustrated. Diffuser 12 may be an elliptic cylinder, generalized cylinder, or other shape that allows for surrounding and isolating a portion of a substrate 22 including the substrate 22 edge. The slot 14 in the diffuser 12 extends for a suitable length to allow introduction of the substrate 22 into the diffuser 12 far enough to provide uniform illumination of the edge area and to isolate the edge area from the outside of the diffuser 12.


Importantly, the interior of the diffuser 12 serves as a uniform neutral background for any reflection from the specular surface of the substrate 22 that is captured by the optic 18. Thus, the optic 18 while looking towards focal point F on the specular surface of the crown 32 images (sees) the interior of the diffuser 12 at location I. Similarly, the optic 18 looking towards focal points F′ and F″ on the specular surfaces of the top bevel 28 and bottom bevel 30 respectively, images the interior of the diffuser 12 at locations I′ and I″.


The angle of the optic 18 in cooperation with the diffuser 12 prevents reflective artifacts from interfering with viewing the plurality of specular surfaces of the edge area of the substrate 22. Instead, and advantageously, a uniform background of the diffuser 12 interior is seen in the reflection of the specular surfaces of the substrate 22.


The plurality of lights 16 is a highly incoherent light source, such as an incandescent light. In a preferred embodiment, the plurality of lights 16 is an array of LEDs. Alternatively, a quartz halogen bulb can be the light source, with fiber optics (not shown) used to distribute light from this single light source radially around the diffuser 12. In another preferred embodiment the plurality of lights 16 is an array of fiber optics each coupled to an independent, remotely located quartz tungsten halogen (QTH) lamp.


The plurality of lights 16 is preferably a white light source to provide the best color fidelity. In substrate 22 observation, color fidelity is important because of film thickness information conveyed by thin film interference colors. If the substrate 22 surface is illuminated with light having some spectral bias, the thin film interference information can be distorted. Slight amounts of spectral bias in the light source can be accommodated by using filters and/or electronic adjustment (i.e., camera white balance).


In operation, a substrate 22, for example, a wafer is placed on a rotatable chuck (not shown) that moves the edge of the wafer into the slot 14 of the diffuser 12. The light controller 34 activates in suitable brightness the plurality of lights 16 for providing uniform illumination of the edge area of the wafer. The wafer is viewed through the imaging system 20 via the optic 18 and inspected for defects. The wafer may be automatically rotated or manually rotated to allow for selective viewing of the wafer edge. Thus, observation of the wafer edge for defects is facilitated and is unhindered by a specular surface of the wafer.


With added reference to FIGS. 4 and 5, in an embodiment of the system 10 the plurality of lights 16 are individually controlled by the light controller 34. In this embodiment light controller 34 is a dimmer/switch suitable for dimming individually or in groups a plurality of lights. Alternatively, light controller 34 can be the type as disclosed in U.S. Pat. No. 6,369,524 or 5,629,607, incorporated herein by reference. Light controller 34 provides for dimming and brightening or alternatively turning on/off individually or in groups each of the lights in the plurality of lights 16.


The intensity of a portion of the plurality of lights 16 is dimmed or brightened to anticipate the reflective effect of specular surfaces that are inherent to the substrate 22, particularly at micro locations along the edge profile that have very small radii of curvature. These micro locations are the transition zones 33 where the top surface 24 meets the top bevel 28 and the top bevel meets the crown 32 and the crown meets the bottom bevel 30 and the bottom bevel 30 meets the bottom surface 26.


An example of addressable illumination is illustrated in FIGS. 4 and 5, where higher intensity illumination 36 is directed to the top bevel 28, crown 32, and bottom bevel 30 while lower intensity illumination 38 is directed to the transition zones 33 in between. With this illumination configuration, the images of these transition zones 33 are seen illuminated with intensity similar to that of the top bevel 28, crown 32, and bottom bevel 30.


Further, addressable illumination is useful to accommodate intensity variation seen by the optic 18 due to the view factor of the substrate 22 edge area. Some portions of the substrate 22 edge area have a high view factor with respect to the illumination from the diffuser 12 and consequently appear relatively bright. Other portions with a low view factor appear relatively dark. Addressable illumination allows an intensity profile to be mapped onto the wafer surface that compensates for the view factor variation and provides a uniformly illuminated image. The required intensity profile can change as the viewing angle of the optic 18 changes.
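By way of illustration, the view-factor compensation described above can be thought of as scaling each light group's drive level inversely to the brightness its zone produces in a reference image. The following Python sketch assumes hypothetical per-zone brightness readings and an 8-bit drive range; it is not the disclosed controller, only one way such a mapping could be computed.

```python
import numpy as np

def compensate_drive_levels(zone_brightness, target_level, current_drive, max_drive=255):
    """Scale each LED group's drive level inversely to the brightness observed in
    its zone of a reference image, so that high view factor zones are dimmed and
    low view factor zones are brightened toward a common target level."""
    brightness = np.asarray(zone_brightness, dtype=float)
    drive = np.asarray(current_drive, dtype=float)
    scale = target_level / np.maximum(brightness, 1.0)   # avoid division by zero
    return np.clip(drive * scale, 0, max_drive).astype(int)

# Hypothetical mean gray levels for five zones imaged with equal drive levels.
observed = [180, 240, 250, 235, 175]
print(compensate_drive_levels(observed, target_level=200, current_drive=[128] * 5))
```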


Addressability of the illumination or its intensity can be accomplished in a number of ways. One embodiment is to locate independently controllable light-emitting diodes (LEDs) around the outside of the diffuser 12 consistent with the plurality of lights 16. Another alternative is to employ a small flexible organic light-emitting diode (OLED), liquid crystal display (LCD) or other micro-display module. Such modules are addressable to a much greater degree than an LED matrix. In this embodiment the flexible OLED, LCD or other micro-display module can replace both the plurality of lights 16 and the diffuser 12. For example, a flexible OLED can both illuminate and have a surface layer with a matte finish suitable for acting as a diffuser and neutral background for imaging. Further, the flexible OLED can be formed into a suitable shape such as a cylinder. Examples of a suitable OLED are disclosed in U.S. Pat. Nos. 7,019,717 and 7,005,671, incorporated herein by reference.


Further, those modules can also provide programmable illumination across a broad range of colors including white light. Color selection can be used to highlight different thin films and can be used in combination with part of an OLED, for example, emitting one color while another part of the OLED emits another color of light. In some cases it can be beneficial to use only part of the light spectrum, for example, to gain sensitivity to a film residue in a given thickness range. This is one mode of analysis particularly applicable to automatic defect classification. One analysis technique to detect backside etch polymer residue preferentially looks at light reflected in the green portion of the spectrum. Thus, this embodiment of the system 10 provides for a suitable color differential based inspection of the substrate 22.


Now referring to FIG. 6, in another embodiment of the system 10, the optic 18 is rotatable in a radial direction 40 around the substrate 22 at a maintained distance from a center point of the substrate 22 edge. The optic 18 is rotatable while maintaining the angle of the optic 18 relative to the surface normal of the substrate 22 edge. This allows for focused imaging of all regions of the substrate 22 surface, including the top surface 24, bottom surface 26, top bevel 28, bottom bevel 30, and crown 32. The rotating optic 18 can also include the imaging system 20, consist of a lens and CCD camera combination, or be a subset of this consisting of moving mirrors and prisms. This embodiment provides the additional advantage of using one set of camera hardware to view the substrate 22 rather than an array of cameras.


Now referring to FIGS. 7 and 8, in another embodiment of the system 10, the optic 18 includes a fold mirror 50 and a zoom lens assembly 52. The optic 18 is connected to a rotatable armature 54 for rotating the optic 18 radially around the edge of the substrate 22 (as similarly discussed in relation to FIG. 6). The substrate 22 is retained on a rotatable chuck 56. The diffuser 12 is housed in an illumination cylinder 58 that is retained on a support member 60 connected to a support stand 62.


The operation of this embodiment of the system 10 is substantially the same as described above with the additional functionality of radially moving the optic 18 to further aid in inspecting all surfaces of the edge of the substrate 22. Further, the substrate 22 can be rotated either manually or automatically by the rotatable chuck 56 to facilitate the inspection process.


Referring to FIG. 9, an automatic wafer edge inspection and review system 10 consists of an illuminator 11, an optical imaging subsystem 64, a wafer supporting chuck 66 (not shown), a positioning assembly 68, an eccentricity sensor 70, a linear stage 72, a rotary stage 74, and a control console 76. The eccentricity sensor 70 provides eccentricity data to the controller to allow the controller to positionally adjust the substrate 22 with respect to the optical imaging subsystem 64. Optionally, data from the eccentricity sensor 70 can be used to adjust the optics to ensure uniformity of the image and focus, either instead of or in conjunction with the supporting chuck 66.


Referring to FIG. 10 and as described above, the illuminator 11 provides uniform illumination across the five wafer edge regions: top near edge surface 78, top bevel 80, apex 82, bottom bevel 84, and bottom near edge surface 86, as shown in FIG. 11. It is also envisioned that the illuminator 11 can vary the intensity or color of the illumination depending upon the expected defect or substrate region. Additionally, the illuminator 11 can individually illuminate different regions of the wafer. The light controller receives input from the system controller 76.


Referring to FIG. 12, the optical imaging subsystem 64 has a filter 121, a mirror 122, an attachment objective lens 123, a motorized focus lens 124, a motorized zoom lens 125, a magnifier lens 126, and a high resolution area scan color camera 127. The motorized focus lens 124 automatically or manually sets the best focus position before starting automatic inspection and during the review process. The filter 121 can be a polarizer or an optical filter which allows the passage of predetermined frequencies.


The motorized zoom lens 125 can be configured in the low magnification range for inspection purposes and in the high magnification range for review purposes. As shown in FIG. 14, the positioning assembly 68 orientates the optical imaging subsystem 64 to the predefined inspection angle 51. To improve the image, the optical imaging subsystem 64 is orientated in such a way that its principal axis 128 preferably is kept off the normal direction 191 of the wafer edge portion under inspection. The linear stage 72 moves the wafer from its load position to the inspection position, and also performs the eccentricity compensation to keep the wafer at the best focus position during the image acquisition period. While the rotary stage 74 rotates the substrate 22 along the circumference direction in a step-and-stop manner, as shown in FIG. 15, it is envisioned that continuous rotation of the wafer is also possible.


The control console 76 controls the system 10 via the tool control software. In this regard, the console 76 controls the motion of the linear stage 72 and the rotary stage 74, positioning the assembly 68 to the user-defined inspection angle. The controller further presets the magnification of the motorized zoom lens 125 and the focus position of the motorized focus lens 124, initializing the image acquisition timing and other essential functions to complete the automatic inspection of a wafer using user-predefined routines. The control console 76 also displays the acquired images and runs the defect inspection and classification software, reporting the results files to a factory automation system.


Referring generally to FIG. 9, which shows the operation of one embodiment, a substrate 22 is picked up from a FOUP (not shown) or an open cassette (not shown) in the equipment front end module (not shown) by the transportation robot arm 27 and placed onto the rotational table of the aligner (not shown). The aligner detects the center of the substrate 22 as well as its notch and aligns the wafer to the center axis of the rotational table. After alignment is completed, the transport robot arm 27 picks up the substrate 22 from the aligner and places it onto the wafer chuck (not shown) of the inspection and review system 10.


Then, the wafer is rotated and the eccentricity sensor 70 starts to measure the eccentricity of the wafer relative to the spin center of the rotary stage 74. The eccentricity information is fed back to the control console 76. At the same time, the positioning assembly 68 moves the optical imaging subsystem 64 to the routine inspection angle. Then the linear stage 72 moves the substrate 22 from the load position to the inspection position. The rotary stage 74 moves forward one step (a routine-defined angle) and stops completely. The illuminator 11 is turned on, and the camera 127 takes an image of the portion of the wafer edge within the field of view of the optical imaging subsystem 64. After completion, the rotary stage 74 rotates one more step and settles down completely. The linear stage 72 moves the substrate 22 to the best focus position based on the eccentricity data stored in the control console 76. During the movement of the stage 72, the control console 76 downloads the previous image from the camera to the onboard memory and the hard disk media. Then, the camera 127 takes the second picture of the wafer edge. The above steps are repeated until the region of interest or the whole circumference of the substrate 22 is imaged.
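The step-and-stop sequence above can be summarized in pseudocode. The sketch below uses hypothetical stage, camera, and illuminator objects standing in for the tool-control interfaces; it only illustrates the ordering of rotation, eccentricity-based focus compensation, and image capture.

```python
def acquire_edge_images(rotary_stage, linear_stage, camera, illuminator,
                        best_focus_position, step_deg, total_deg=360.0):
    """Step-and-stop acquisition: rotate, settle, refocus from eccentricity data,
    grab an image, and repeat until the circumference (or region of interest)
    has been covered. All hardware objects here are hypothetical placeholders."""
    images = []
    illuminator.on()
    angle = 0.0
    while angle < total_deg:
        rotary_stage.step(step_deg)                        # one routine-defined step
        rotary_stage.wait_until_settled()                  # stop completely before imaging
        linear_stage.move_to(best_focus_position(angle))   # eccentricity compensation
        images.append(camera.grab())                       # edge portion within the field of view
        angle += step_deg
    illuminator.off()
    return images
```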


If the system is set to inspect the edge regions of the substrate 22 at more than one inspection angle, the control console 76 moves the positioning assembly 68 to another inspection angle and repeats the steps described above. The images of the edge of the substrate 22 at the new inspection angle are recorded until all inspection angles of interest are covered.


After the completion of imaging all the predefined edge regions of the substrate 22, the transport robot arm 27 picks the substrate 22 from the inspection chamber and places it back into a FOUP or a cassette in the equipment front end module.


While the system 10 takes pictures of the edge of the substrate 22, the inspection and classification software installed in the control console 76 processes the raw images, detects the defects of interest, classifies them into different classes or categories, and outputs the results to the results files. To review a specific defect found by the system 10, the location and the inspection angle of the specific defect can be retrieved from the results files. As shown in FIG. 16, an operator inputs this information into the review system setup area of the tool control software in the control console 76. The control console 76 automatically moves the substrate 22 and the positioning assembly 68 to the predetermined positions and locates the specific defect of interest. Then, the user adjusts the magnification of the motorized zoom lens 125 to the desired value and focuses on the defect by adjusting the position of the motorized focus lens 124. The operator can now review the details of the defect on the display and record its image to the storage devices of the control console 76.


Referring to FIGS. 9 and 18, the system is used to measure the cut line 141 of the edge bead removal of a film layer 140. The positioning assembly 68 moves the optical imaging subsystem 64 and the area scan camera 127 to a position in which the top near edge surface of the substrate 22 with the cut line 141 is visible within the field of view. The motorized focus lens 124 is set to the position where the image is in best focus. The rotary stage 74 moves forward one step (a predefined angle) and stops completely. The illuminator 11 is turned on, and the camera 127 takes an image of a portion of the near top edge surface including the cut line 141. Then, the rotary stage 74 moves one more step and settles down completely. While the stage is in motion, the control console 76 downloads the image from the camera 127 to the onboard memory and the hard disk media. Upon completion, the camera 127 takes the second picture. The above steps are repeated until the whole cut line along the circumference of the substrate 22 is completely imaged and recorded onto the onboard memory and the hard disk media.


During operation, the control console 76 processes the recorded images to calculate the profile of the cut line 141 as well as the following parameters: the center offset from the wafer center, the mean edge exclusion distance, the standard deviation, and the peak-to-peak variation. The results are output to the results file in a predefined format.
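As one illustration, if the per-angle edge exclusion distances have already been extracted from the recorded images, the reported parameters could be computed as below. The first-harmonic estimate of the center offset is an assumption of this sketch, not a method stated in the disclosure.

```python
import numpy as np

def cut_line_statistics(angles_deg, exclusion_mm):
    """Summarize an edge-bead-removal cut line measured at discrete angles:
    center offset (first-harmonic estimate), mean edge exclusion distance,
    standard deviation, and peak-to-peak variation."""
    theta = np.radians(np.asarray(angles_deg, dtype=float))
    r = np.asarray(exclusion_mm, dtype=float)
    offset_x = 2.0 * np.mean(r * np.cos(theta))   # first harmonic, x component
    offset_y = 2.0 * np.mean(r * np.sin(theta))   # first harmonic, y component
    return {
        "center_offset_mm": float(np.hypot(offset_x, offset_y)),
        "mean_exclusion_mm": float(np.mean(r)),
        "std_dev_mm": float(np.std(r)),
        "peak_to_peak_mm": float(np.ptp(r)),
    }
```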


As shown in FIGS. 9 and 19, the wafer edge inspection and review system 10 can be used to measure multiple cut lines, for example 151, 152, and 153, of multiple film layers 154, 155, and 156. The positioning assembly 68 moves the optical imaging subsystem 64 and the area scan camera 127 to a position so that the top near edge surface of the substrate 22 with the cut lines 151, 152, and 153 is within the field of view. The motorized focus lens 124 is set to the position where the image is in best focus. The rotary stage 74 moves forward one step and stops completely. The illuminator 11 is turned on, and the camera 127 takes an image of a portion of the near top edge surface including the cut lines 151, 152, and 153. Then, the rotary stage 74 moves a second step and settles down completely. While the rotary stage is in motion, the control console 76 downloads the picture from the camera 127 to the onboard memory and the hard disk media. Upon completion, the camera 127 takes the second picture. The above steps are repeated until all the cut lines along the circumference of the substrate 22 are completely imaged and recorded onto the onboard memory and the hard disk media.


Referring generally to FIG. 20, which shows the operation of another embodiment, a substrate 22 is picked up from a FOUP (not shown) or an open cassette (not shown) in the equipment front end module (not shown) by the transportation robot arm (not shown) and placed onto the rotational table of the aligner. The aligner detects the center of the substrate 22 as well as its notch and aligns the wafer to the center axis of the rotational table. After alignment is completed, the transport robot arm picks up the substrate 22 from the aligner and places it onto the wafer chuck (not shown) of the inspection and review system 10.


Then, the wafer is rotated and the eccentricity sensor 70 starts to measure the eccentricity of the wafer relative to the spin center of the rotary stage 74. The eccentricity information is fed back to the control console 76. At the same time, the positioning assembly 68 moves the optical imaging subsystem 64 to the routine inspection angle. Then the linear stage 72 moves the substrate 22 from the load position to the inspection position. The rotary stage 74 moves forward one step (a routine-defined angle) and stops completely.


First and second illuminators 11a, 11b are turned on, and the cameras 127a, 127b, 127c, and 127d take images of the portions of the wafer edge within the fields of view of the optical imaging subsystems 64a-d. After completion, the rotary stage 74 rotates one more step and settles down completely. The linear stage 72 moves the substrate 22 to the best focus position based on the eccentricity data stored in the control console. During the movement of the stage 72, the control console downloads the previous images from the cameras to the onboard memory and the hard disk media. Then, the cameras 127a-c take the second set of pictures of the wafer edge. The above steps are repeated until the region of interest or the whole circumference of the substrate 22 is imaged.


By using three cameras 127a-c to inspect the edge regions of the substrate 22 at more than one inspection angle, multiple sides can be inspected simultaneously. The images of the edge of the substrate 22 at each rotational inspection angle are recorded until all inspection angles of interest are covered.


After the completion of imaging all the predefined edge regions of the substrate 22, the transport robot arm 27 picks the substrate 22 from the inspection chamber and places it back into a FOUP or a cassette in the equipment front end module.


While the system 10 takes pictures of the edge of the substrate 22, the inspection and classification software installed in the control console 76 processes the raw images, detects the defects of interest, classifies them into different classes or categories, and outputs the results to the results files. To review a specific defect found by the system 10, the location and the inspection angle of the specific defect can be retrieved from the results files. This information can be used to view a specific defect region using the rotatable camera 127d.



FIG. 21 represents a flow chart describing the analysis system for the inspection module shown in FIG. 20. In this regard, data from the edge image acquisition system 200 is transferred to the edge image database 202. This data is transferred from the edge image database to either the defect inspector 204 or the wafer edge exclusion zone detection module 206. Results from either of these modules 204 or 206 can then be transferred to a defect classifier module 208, which evaluates and classifies detected defects. These defects are then viewable in the defect reviewer 210 using a graphical user interface. Data from each of the modules is storable within the edge image database 202 for further review.
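A minimal sketch of this data flow, assuming hypothetical interfaces for each numbered module, is shown below; it only illustrates how images and results move between the stages of FIG. 21.

```python
def analyze_wafer(acquisition, database, defect_inspector,
                  exclusion_detector, classifier, reviewer):
    """Move data through the stages of FIG. 21: acquisition -> image database ->
    defect inspector / edge exclusion detection -> classifier -> reviewer.
    All module objects are hypothetical interfaces used only for illustration."""
    for frame in acquisition.frames():
        database.store_image(frame)                    # edge image acquisition 200 -> database 202
    for frame in database.images():
        candidates = defect_inspector.detect(frame)    # defect inspector 204
        exclusion = exclusion_detector.measure(frame)  # edge exclusion zone detection 206
        classified = classifier.classify(candidates)   # defect classifier 208
        database.store_results(frame, classified, exclusion)
    reviewer.show(database)                            # defect reviewer 210 (GUI)
```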



FIG. 22 represents a diagram of the system shown in FIG. 20. Optionally, each of the cameras 64a-d can be individually coupled to a separate image processor or computer 220. Each of these computers 220, which process the images in parallel at very high speeds, can be coupled to the image database 222 for storage. As described above, images from the database 222 can be processed to detect point defects, edge defects, and edge location defects. Additionally, as shown, the database 222 can be coupled to a fabrication network 224 and a host analysis computer 226 for flexibility.



FIG. 23a represents the three fixed wafer edge imaging cameras. As shown in FIGS. 24a and 24b, the three fixed cameras can be moved to review differing portions of the wafer's edge. These images can be stitched together using optical methods. FIG. 23b represents a side view of the rotary camera as described above.



FIGS. 25 and 26 represent image maps of a wafer. The image map allows for the indexing of the location and types of defects on a single image. Shown are the location of the defect with respect to the edge and the indexing notch, and a radial sizing grid. FIG. 27 represents typical bright type and dark type defects which can be imaged and detected by the system. Varying the intensity and spectrum of the light can influence the defects' visibility.



FIGS. 28 and 29 represent a graphical user interface which shows an image map and an image of radial location about the wafer's edge. Also shown is an enhanced image of a portion of the wafer. The enhanced image specifically displays detected defects within a single edge image.


As shown in FIGS. 29 through 31, the defect statistics are viewable through a graphical user interface. These defect statistics may include, for example, the number of defects at a given location or the distribution of defects of varying types or sizes. Briefly returning to FIG. 20, in operation an operator inputs this information into the review system setup area of the tool control software in the control console 76. The control console 76 automatically moves the substrate 22 and the positioning assembly 68 to the predetermined positions and locates the specific defect of interest. Then, the user adjusts the magnification of the motorized zoom lens 125 to the desired value and focuses on the defect by adjusting the position of the motorized focus lens 124. The operator can now review the details of a specific defect on the display and record its image to the storage devices of the control console 76.


Referring to FIG. 33, the procedure and methods for automatic defect detection and classification of the present disclosure 100 include the following steps. First, the color image of a wafer edge portion is acquired by the wafer edge defect detection system described above.


In step 104, the acquired images are stored in the image database. These images can be stored with indexing information showing the rotational location of the image with respect to the wafer.


In step 106, the image portions corresponding to the following edge regions of a wafer are identified: top near edge surface, top bevel, apex, bottom bevel, bottom near edge surface. During this step, the edge of the wafer is determined. Optionally, focusing can occur at this time.


In step 108, the images of the above five regions are pre-processed for contrast enhancement and global noise reduction.
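One possible pre-processing step, assuming percentile-based contrast stretching and Gaussian smoothing (neither of which is specified by the disclosure), is sketched here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(image):
    """Contrast enhancement by percentile stretching followed by a mild Gaussian
    blur for global noise reduction (one of many possible implementations)."""
    img = image.astype(float)
    lo, hi = np.percentile(img, (1, 99))
    stretched = np.clip((img - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return gaussian_filter(stretched, sigma=1.0)
```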


In step 110, the five-region images are transformed into grayscale images using the following formula:


T = a1*R + a2*G + a3*B, where 0 <= a1, a2, a3 <= 1.0
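A direct numpy implementation of this transform is shown below. The default weights are the common luminance coefficients and are only an example, since the disclosure leaves a1, a2, and a3 as tunable parameters.

```python
import numpy as np

def to_gray(rgb, a1=0.299, a2=0.587, a3=0.114):
    """T = a1*R + a2*G + a3*B with 0 <= a1, a2, a3 <= 1.0. The default weights
    are the standard luminance coefficients, used here only as an example."""
    rgb = rgb.astype(float)
    return a1 * rgb[..., 0] + a2 * rgb[..., 1] + a3 * rgb[..., 2]
```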


In step 112, defects are identified using the algorithms illustrated in FIG. 35. After step 112, the five-region images are converted into binary images. In step 116, morphological operations such as hole filling, smoothing, erosion, and dilation are applied to the above binary images. In step 118, blob analysis is conducted; the features of a defect, such as its dimensions and center location, are calculated, and the defects in the frame are counted.
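The cleanup and blob measurement of steps 116 and 118 could be realized with scipy.ndimage as sketched below; the particular operations and feature set are a minimal illustration, not the exact processing of the disclosure.

```python
import numpy as np
from scipy import ndimage

def blob_features(binary_image):
    """Apply hole filling and smoothing to a binary defect image, then label the
    blobs and measure each blob's dimensions and center location."""
    cleaned = ndimage.binary_fill_holes(binary_image)   # hole filling
    cleaned = ndimage.binary_opening(cleaned)           # smoothing (erosion then dilation)
    labels, count = ndimage.label(cleaned)              # blob (connected component) labeling
    features = []
    for idx, sl in enumerate(ndimage.find_objects(labels), start=1):
        height = sl[0].stop - sl[0].start
        width = sl[1].stop - sl[1].start
        cy, cx = ndimage.center_of_mass(labels == idx)
        features.append({"width": width, "height": height, "center": (cx, cy)})
    return count, features
```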


In step 120, defects are categorized into different classes based on pre-defined criteria such as shape, size, and color. In step 122, statistical analysis such as density, histogram, and angular distribution is performed on the classified defects. In the final step 124, the classified defects are displayed in a defect map, a histogram, and other charts. They are also output to the standard results files. The defect review engine 126 retrieves the original images, binary images, defect map, and charts as well as the reporting file for users to review.
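As an illustration of steps 120 and 122, a simple rule-based classifier and an angular distribution computation might look like the following; the class names and thresholds are hypothetical and stand in for the pre-defined criteria of the disclosure.

```python
import numpy as np

def classify_defect(feature, small_area=25, elongation=4.0):
    """Toy rule-based classification using shape and size criteria; color-based
    rules could be added the same way."""
    area = feature["width"] * feature["height"]
    aspect = max(feature["width"], feature["height"]) / max(min(feature["width"], feature["height"]), 1)
    if area <= small_area:
        return "particle"
    if aspect >= elongation:
        return "scratch/crack"
    return "area defect"

def angular_distribution(defect_angles_deg, bins=36):
    """Histogram of classified defects versus angular position around the wafer."""
    counts, edges = np.histogram(np.asarray(defect_angles_deg) % 360.0, bins=bins, range=(0.0, 360.0))
    return counts, edges
```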


Referring to FIG. 35, a local Gaussian thresholding algorithm classifies each pixel as a non-defect or defect candidate pixel, represented by 0 and 1, based on the local pixel distribution statistics in a local window centered on the pixel of interest. The window size and shape for the local pixel statistics computation are set up in such a way that the algorithm is sensitive only to the defects of interest, while insensitive to lighting and other noise effects. The system then uses binary morphological operations to filter out false defects and in the process converts the defects into binary blobs that are representative of their two-dimensional shapes.
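A compact sketch of such a local statistics test is given below. It uses a box window and a deviation threshold k, which are simplifying assumptions standing in for the Gaussian window and tuning described above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_threshold(gray, window=31, k=3.0):
    """Mark a pixel as a defect candidate (1) when it deviates from the local mean
    by more than k local standard deviations; otherwise non-defect (0)."""
    img = gray.astype(float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img ** 2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    return (np.abs(img - mean) > k * std).astype(np.uint8)
```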


Because the computation cost of local statistics such as the mean and deviation for every pixel is very high, the Gaussian background defect detection algorithm utilizes a constant time filtering technique to significantly improve the computation speed. The basic idea of the constant time filtering algorithm is to compute the local statistics of the current step by updating the result from the previous step with a recursive filter for fast computation.
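One common way to obtain window sums at a cost independent of the window size is a summed-area (integral) image. The sketch below uses that approach as a stand-in for the recursive filter mentioned above, so it should be read as an illustration of constant time filtering rather than the specific filter of the disclosure.

```python
import numpy as np

def local_mean_constant_time(gray, window):
    """Local mean whose per-pixel cost does not depend on the window size: a
    summed-area (integral) image gives every window sum from four lookups."""
    img = gray.astype(float)
    pad = window // 2
    padded = np.pad(img, pad + 1, mode="edge")
    ii = padded.cumsum(axis=0).cumsum(axis=1)            # integral image
    h, w = img.shape
    total = (ii[window:window + h, window:window + w]
             - ii[:h, window:window + w]
             - ii[window:window + h, :w]
             + ii[:h, :w])
    return total / float(window * window)
```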


Referring to FIG. 36, the boundary detection algorithm creates a boundary map image of the defect image. It then uses binary morphological operations to group the boundary pixels into binary blobs that are representative of their two-dimensional shapes. The size, location, aspect ratio, and color can be used to classify the defect.
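One way such a boundary map could be formed, assumed here to be a thresholded gradient magnitude rather than any particular detector named in the disclosure, is sketched below along with the morphological grouping into blobs.

```python
import numpy as np
from scipy import ndimage

def boundary_blobs(gray, grad_threshold=30.0):
    """Build a boundary map from the gradient magnitude of a defect image, then
    group boundary pixels into labeled binary blobs for shape measurement."""
    img = gray.astype(float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    boundary = np.hypot(gx, gy) > grad_threshold     # boundary map image
    grouped = ndimage.binary_closing(boundary)        # bridge gaps between boundary pixels
    labels, count = ndimage.label(grouped)
    return labels, count
```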


Referring to FIG. 37, the global thresholding algorithm classifies each pixel as a non-defect or defect candidate pixel, represented by 0 and 1, by comparing it to pre-determined upper and lower threshold values. The system then uses binary morphological operations to filter out false defects and in the process converts the defects into binary blobs that are representative of their two-dimensional shapes.
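A minimal sketch of global thresholding with morphological false-defect suppression follows; the band test, opening operation, and minimum blob size are illustrative choices.

```python
import numpy as np
from scipy import ndimage

def global_threshold(gray, lower, upper, min_pixels=5):
    """Flag pixels outside the [lower, upper] band as defect candidates (1), then
    remove isolated false detections and keep only blob-shaped candidates."""
    candidates = (gray < lower) | (gray > upper)
    filtered = ndimage.binary_opening(candidates)           # drop single-pixel noise
    labels, count = ndimage.label(filtered)
    sizes = ndimage.sum(filtered, labels, index=np.arange(1, count + 1))
    keep_labels = np.flatnonzero(sizes >= min_pixels) + 1   # labels of large-enough blobs
    return np.isin(labels, keep_labels).astype(np.uint8)
```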


The object of the defect classification engine is to further separate the defect candidates into real defects and nuisance defects. Because there are many types of suspicious patterns in a wafer edge image, the detection phase outputs all anomalies of the wafer edge images so as to include all possible defects and avoid missed alarms. Therefore, the occurrence of false alarms may be unavoidable using the defect detection algorithms described above.


Thus, a cost-effective, efficient, and effective system is provided for illuminating and inspecting the plurality of surfaces of the edge area of a substrate 22 and providing high quality imaging of the inspected surfaces while avoiding the interference associated with specular surfaces. The system improves quality control of wafer processing through edge inspection, with the intended benefit of identifying and addressing defects and their causes in the IC manufacturing process and a resulting improvement in yield and throughput.


It should be appreciated that while the embodiments of the system 10 are described in relation to an automated system, a manual system would also be suitable.

Claims
  • 1. A method for measuring the location of a feature on a wafer's edge comprising: acquiring an image of a region of a wafer; converting the image to a grey scale image; preprocessing one of the image or grey scale image; conducting a blob analysis on the image; and calculating the size and center of a defect found in the image based on results of the blob analysis.
  • 2. The method according to claim 1 further comprising converting the grey scale image into a binary file.
  • 3. The method according to claim 2 further comprising categorizing a plurality of defects into various categories based on at least one of shape, size or color.
  • 4. The method according to claim 3 further comprising conducting a statistical analysis of the categorization of the defects.
  • 5. The method according to claim 3 further comprising implementing a local Gaussian threshold algorithm to classify each pixel into non-defect or defect categories.
  • 6. The method according to claim 5 further comprising varying a window size to select only defects of interest.
  • 7. The method according to claim 5 further comprising implementing a Gaussian background defect algorithm using a constant time filtering technique.
  • 8. The method according to claim 7 wherein a constant time filtering technique is computing local statistics in a current process step by updating the results from a previous process step.
  • 9. The method according to claim 3 further comprising implementing a binary morphological operation to group blob boundary pixels into binary blobs.
  • 10. A method for measuring the location of a feature on a wafer's edge comprising: providing a grey scale image of a portion of the edge of a wafer; conducting a blob analysis on the grey scale image to locate defects on the wafer; and calculating the size of a defect found in the grey scale image based on results of the blob analysis.
  • 11. The method according to claim 10 further comprising categorizing a plurality of defects into various categories based on at least one of shape, size or color.
  • 12. The method according to claim 11 further comprising conducting a statistical analysis of the categorization of the defects.
  • 13. The method according to claim 12 further comprising implementing a local Gaussian threshold algorithm to classify each pixel into non-defect or defect categories.
  • 14. The method according to claim 13 further comprising varying a window size to select only defects having a size greater than a predetermined size.
  • 15. The method according to claim 13 further comprising implementing a Gaussian background defect algorithm on the grey scale data using a constant time filtering technique.
  • 16. The method according to claim 15 wherein a constant time filtering technique is computing local statistics in a current process step by updating the results from a previous process step.
  • 17. The method according to claim 10 further comprising implementing a binary morphological operation to group blob boundary pixels into binary blobs.
  • 18. An automatic wafer edge inspection and review system comprising: an illuminator configured to provide illumination across a wafer edge; an optical imaging subsystem to image a portion of the wafer edge; a positioning assembly to orientate the optical imaging subsystem to an inspection angle; an eccentricity sensor to actively measure the center offset of a wafer edge relative to the rotation center of the wafer chuck; a wafer chuck to hold the backside of a wafer;
  • 19. The system of claim 18 wherein the optical imaging subsystem further comprises an optical filter to cut off certain wavelength spectrum; a mirror; an objective lens; a motorized focus lens to provide routine-defined focus adjustment; a motorized zoom lens; a magnifier lens; and a high resolution area scan color camera to image a portion of the wafer edge.
  • 20. The system of claim 19 wherein the illuminator comprises: a cylindrical light diffuser having a slit extending along at least a portion of its length for receiving an edge portion of a wafer; a plurality of light sources exterior or interior to the cylindrical light diffuser; and an intensity controller for independently controlling the plurality of light sources.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 60/964,163, filed on Aug. 9, 2007. This application is a continuation-in-part application of U.S. patent application Ser. No. 11/891,657, filed on Aug. 9, 2007, which is a continuation-in-part application of U.S. patent application Ser. No. 11/417,297, filed May 2, 2006. The entire disclosure of each of the above applications is incorporated herein by reference.

Provisional Applications (1)
Number Date Country
60964163 Aug 2007 US
Continuation in Parts (2)
Number Date Country
Parent 11891657 Aug 2007 US
Child 12188849 US
Parent 11417297 May 2006 US
Child 11891657 US