The present technology is directed generally to descriptor guided fast marching methods for analyzing images and to associated systems.
Manufacturers of microelectronic devices are continually reducing the size and increasing the density of components in integrated circuits to increase the speed and capacity of devices and to reduce power consumption. Complex series of manufacturing, inspection, and testing protocols are often performed to produce microelectronic devices, such as light-emitting diodes (LEDs), integrated circuits, and microelectromechanical systems (MEMS). It may be difficult to rapidly and accurately detect defects in microelectronic devices. For example, microelectronic devices often include circuitry with vias that are formed by depositing conductive material into openings (e.g., through-holes) in wafers. The conductive material may deposit faster at the edges of the openings than within the holes, and the build-up of conductive material at the openings may inhibit deposition of conductive material at central regions of the holes, resulting in voids or other defects. These voids are commonly referred to as keyholes. A scanning electron microscope (SEM) image can be visually inspected to locate and identify keyholes. However, it is difficult to locate and accurately identify keyholes because SEM images often have significant noise and low contrast, and keyholes often vary in size and shape.
Conventional automated systems can compare patterns to identify defects. For example, a conventional automated system can compare a pattern in a captured image to a reference pattern and identify defects based on the comparison. Conventional automated systems can also measure critical dimensions that are necessary to the function of the component. Unfortunately, the reference patterns or critical dimensions have to be known before performing pattern comparison or critical dimension measurements. Additionally, detection rates of potential defects using pattern comparison or critical dimension measurements can decrease significantly when analyzing images with irregularly shaped features (e.g., keyholes), significant noise, and low contrast.
Specific details of several descriptor guided image processing methods and systems using the same are described below. In particular embodiments, the image processing methods are performed on images of a microelectronic device. The images can be from an electron microscope (e.g., a scanning electron microscope (SEM), a transmission electron microscope (TEM), or the like), an optical imager (e.g., an optical microscope, a camera, etc.), or other type of imaging equipment capable of capturing images for defect recognition, pattern recognition, or the like. The term “image” generally refers to raw images, preprocessed images, processed images, labeled images, and the like. The term “SEM image” generally refers to an image produced by a SEM and may be in grayscale or color. A person skilled in the relevant art will understand that the technology may have additional embodiments, and that the technology may be practiced without several of the details of the embodiments described below with reference to
The vias 120 can be formed using a multi-step process, including forming an insulated opening (e.g., an insulated through-hole) in a substrate 134. After forming the opening, a filling process (e.g., physical vapor deposition, chemical vapor deposition, sputtering, etc.) may be performed to fill the opening with a conductive material. If the conductive material accumulates at the periphery of the opening in the substrate 134 faster than within the hole, a keyhole may be formed. The potential defect 132 in
At block 160, an image is preprocessed to produce a preprocessed image. Image preprocessing can include correction processing such as, without limitation, intensity correction, color correction, saturation correction (e.g., color saturation correction), contrast correction, and/or tilt correction. A wide range of different types of filters, algorithms, and routines (e.g., data fitting routines) can be used for correction processing to increase the accuracy of descriptor based analysis, decrease image processing times, or achieve other results. Other types of image preprocessing can also be performed.
At block 162, the preprocessed image can be further processed using a fast marching method to smooth data and generate arrival time information that can include, without limitation, an arrival time function, an arrival time image, an arrival time matrix, or the like. The arrival time function can be a smooth, continuous function (e.g., a differentiable function, a monotonic function, etc.) suitable for contour analysis (e.g., intensity contour analysis, color contour analysis, etc.), segmentation (e.g., segmentation based on one or more descriptors), or the like. The arrival time image can be inspected to identify targeted features, such as potential defects, structural features, electrical features, etc. The arrival time image can be visually inspected by, for example, a technician, a wafer inspector, or an engineer. Additionally or alternatively, an automated system can inspect the arrival time image. The arrival time matrix can be analyzed using statistical techniques, numerical techniques, or the like. In some embodiments, the fast marching method uses different fast marching algorithms to process different targeted features. The targeted features can include, without limitation, patterns, potential defects (e.g., stiction, keyholes, cracks, improperly formed bond pads, etc.), edges (e.g., edges of cleaved or singulated LEDs), boundaries, or the like.
The fast marching method can process images without prior information about the image content. For example, the fast marching method can start at high intensity areas (e.g., light areas) of an image that may correspond to areas surrounding a potential defect and move towards low intensity areas (e.g., dark areas) that may correspond to the potential defect. As the fast marching method moves towards the low intensity areas, it can detect a boundary of the potential defect. In other embodiments, the starting areas can be in low intensity areas of an image, and the method can move towards high intensity areas to detect a boundary of the potential defect. In some embodiments, starting areas are in a portion of an image having a uniform intensity such that the fast marching method identifies intensity variations as boundaries of potential defects (e.g., regularly shaped defects, irregularly shaped defects, etc.).
In some embodiments, the fast marching method can identify and/or analyze the interior regions of contacts or vias. For example, to identify interior areas of vias, a raw image can be processed to produce an arrival time image representative of the vias. The fast marching method can process such raw images to identify the interior areas of vias without having information about the image.
At block 163, a descriptor guided analysis can be used to identify targeted features based on a targeted feature descriptor. The targeted feature descriptor can include a segmentation algorithm that performs segmentation using one or more criteria (e.g., a threshold value, mean, mode, median, rate of change, standard deviation, etc.) based on, for example, intensity, color coordinates, or the like. Additionally or alternatively, the targeted feature descriptor can include a defect descriptor algorithm used to identify potential defects, whereas a pattern descriptor can be used to identify patterns (e.g., a regular pattern, an irregular pattern, etc.). Defects can include, without limitation, holes (e.g., voids, keyholes, etc.), stictions, improperly positioned features, misaligned features, substrate defects (e.g., defects in unprocessed wafers, processed wafers, etc.), or the like.
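As one hedged illustration of a segmentation criterion built from the statistics mentioned above (mean, standard deviation), a descriptor might threshold an arrival time matrix at the mean plus k standard deviations. The function name `statistical_descriptor`, the list-of-lists matrix representation, and the particular criterion are assumptions for illustration, not details from the disclosure:

```python
def statistical_descriptor(arrival, k=2.0):
    """Segment an arrival time matrix with a mean + k*sigma criterion.

    arrival : 2-D list of arrival time values.
    k       : number of standard deviations above the mean to segment at.
    Returns a 2-D list of booleans marking cells that exceed the cutoff.
    """
    values = [t for row in arrival for t in row]
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    cutoff = mean + k * variance ** 0.5
    # Cells with unusually large arrival times are flagged as targeted features.
    return [[t >= cutoff for t in row] for row in arrival]
```

A threshold derived from the data itself, rather than a fixed constant, is one way such a descriptor could tolerate intensity differences between images.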
The fast marching method can take the image intensity as a speed function and convert it into arrival time information to, for example, reduce, limit, or substantially eliminate noise, identify/analyze selected features (e.g., edges of vias, edges of contacts, edges of runners, or the like), or the like. The targeted feature descriptor can be used to determine the arrival time mark of reaching a predetermined area corresponding to the area of the targeted feature. The arrival time mark can define the domain of each feature to enable identification of internal areas of the features that exhibit significant acceleration. For example, if more than a predetermined number of neighboring pixels or cells within a predetermined area show significant acceleration (e.g., acceleration at or above a threshold level), the pixels or cells can be identified as a potential defect. Additionally or alternatively, a set of edge vectors (with or without offset tolerance) can be used to analyze patterns, such as irregular patterns and regular patterns. In some embodiments, a set of edge vectors can be used to separate an irregular pattern from a regular pattern at the same general intensity level.
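The neighboring-pixel acceleration test described above might be sketched as follows, using a discrete second difference of the arrival time as a stand-in for "acceleration." The proxy, the function name `flag_acceleration_defects`, and all thresholds are illustrative assumptions rather than the disclosure's implementation:

```python
def flag_acceleration_defects(T, accel_thresh, min_marked, window=1):
    """Flag cells of an arrival time matrix T that sit near significant
    'acceleration', approximated by the discrete second difference of T.

    accel_thresh : magnitude of second difference treated as significant.
    min_marked   : minimum count of significant neighbors needed to flag a cell.
    window       : half-width of the square neighborhood inspected.
    Returns the set of (row, col) cells flagged as a potential defect.
    """
    rows, cols = len(T), len(T[0])
    marked = [[False] * cols for _ in range(rows)]
    # Mark interior cells whose arrival time accelerates along either axis.
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            d2r = T[r - 1][c] - 2 * T[r][c] + T[r + 1][c]
            d2c = T[r][c - 1] - 2 * T[r][c] + T[r][c + 1]
            if abs(d2r) >= accel_thresh or abs(d2c) >= accel_thresh:
                marked[r][c] = True
    flagged = set()
    # Flag cells whose neighborhood holds enough marked (accelerating) cells.
    for r in range(rows):
        for c in range(cols):
            count = sum(marked[nr][nc]
                        for nr in range(max(0, r - window), min(rows, r + window + 1))
                        for nc in range(max(0, c - window), min(cols, c + window + 1)))
            if count >= min_marked:
                flagged.add((r, c))
    return flagged
```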
At block 164, the results of the descriptor guided analysis are outputted and can include, without limitation, labeled images, statistics, correction routines, optimization routines, reports, or the like. The labeled images can be labeled raw images, labeled preprocessed images, labeled processed images, or the like. The labels can be boxes, highlighting, annotation, or other types of indicators for identifying features or areas for visual inspection, automated inspection, or the like. The statistics can include, without limitation, across-wafer statistics, batch statistics, wafer-to-wafer statistics, or the like. The correction routines can be used to adjust processing parameters (e.g., processing parameters for deposition equipment, processing parameters for CMP equipment, etc.) to reduce the frequency of defects, increase throughput, or otherwise enhance processing. The reports can include, without limitation, characteristics of identified features, statistics, alignment information, or the like.
Referring to
R|∇T|=1 (Equation I)
where R is a rate (speed function) at (x, y) of an image and T is the arrival time at (x, y) of the image. The rate (speed function) can be the intensity of a cell or a pixel at a location (x, y) of the image.
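By way of a non-limiting illustration, the eikonal relationship of Equation I can be solved on a discrete grid with a first-order fast marching sweep. The sketch below is one possible implementation under stated assumptions: the function name `fast_march`, the list-of-lists grids, and the four-connected neighborhood are illustrative choices, not details from the disclosure.

```python
import heapq

def fast_march(speed, seeds):
    """Solve R*|grad T| = 1 on a 2-D grid with first-order fast marching.

    speed : 2-D list of positive rates R (e.g., pixel intensities).
    seeds : iterable of (row, col) starting cells with arrival time 0.
    Returns a 2-D list of arrival times T.
    """
    rows, cols = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * cols for _ in range(rows)]
    frozen = [[False] * cols for _ in range(rows)]
    heap = []
    for r, c in seeds:
        T[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))

    def solve(r, c):
        # Upwind update: use only accepted (frozen) neighbor values per axis.
        def val(rr, cc):
            if 0 <= rr < rows and 0 <= cc < cols and frozen[rr][cc]:
                return T[rr][cc]
            return INF
        ta = min(val(r - 1, c), val(r + 1, c))
        tb = min(val(r, c - 1), val(r, c + 1))
        if ta > tb:
            ta, tb = tb, ta
        h = 1.0 / speed[r][c]  # local slowness 1/R
        if tb - ta >= h:
            # One-sided update when only one axis contributes.
            return ta + h
        # Two-sided first-order update: (T-ta)^2 + (T-tb)^2 = h^2
        return 0.5 * (ta + tb + (2 * h * h - (ta - tb) ** 2) ** 0.5)

    while heap:
        t, r, c = heapq.heappop(heap)
        if frozen[r][c]:
            continue  # stale heap entry
        frozen[r][c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not frozen[nr][nc]:
                t_new = solve(nr, nc)
                if t_new < T[nr][nc]:
                    T[nr][nc] = t_new
                    heapq.heappush(heap, (t_new, nr, nc))
    return T
```

Seeding the march in high intensity (fast) regions and letting it propagate into low intensity (slow) regions yields large arrival times inside dark features such as keyholes, which is the behavior the text relies on for boundary detection.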
Referring to
Because boundaries of the targeted features may be identified by sudden changes in intensity, starting areas can be spaced apart from sudden changes in intensity. In some images, keyholes may appear as black dots whereas the upper surfaces of vias may be light gray or some other higher intensity. The fast marching method can move away from starting areas in high intensity regions surrounding the keyhole to areas of low intensity in the keyholes. This helps ensure that boundaries of the keyholes are accurately detected. By way of example, the dots 200 of
Referring to
Referring to
Referring to
Referring to
The descriptor in the form of a targeted feature descriptor 341 can have a segmented threshold of about 410.
Descriptors can have a number of thresholds that are selected based on, for example, the number of targeted features to be identified. For example, a first threshold can be used to detect boundaries of a contact or a via. A second threshold can be used to detect potential defects within the contact or via. Different descriptors can identify different targeted features. A stiction descriptor can detect potential stiction, an alignment descriptor can detect alignment of features, and a pattern descriptor can detect patterns. A single image can be analyzed using different descriptors. Additionally, different descriptors can be used to process different types of images. For example, a first descriptor can be used to analyze an SEM image from a manufacturer and a second descriptor can be used to analyze an SEM image from another manufacturer. The descriptors can compensate for differences in intensities between different imaging equipment. A descriptor can also be modified based on prior inspections (e.g., inspection of microelectronic devices, inspection of a wafer, inspection of a batch of wafers, etc.). This can increase the accuracy of the analysis and also may reduce inspection times.
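The first-threshold/second-threshold scheme described above could be sketched as a two-stage labeling of an arrival time matrix. The function name `two_stage_labels` and the threshold semantics (larger arrival times deeper inside a via, largest inside a potential defect) are assumptions for illustration only:

```python
def two_stage_labels(T, via_thresh, defect_thresh):
    """Apply two thresholds to an arrival time matrix T.

    Labels: 0 = background, 1 = via/contact interior,
            2 = potential defect within the via/contact.
    Assumes via_thresh < defect_thresh and that arrival times grow
    toward the dark interior of a feature.
    """
    labels = []
    for row in T:
        out = []
        for t in row:
            if t >= defect_thresh:
                out.append(2)   # second threshold: potential defect
            elif t >= via_thresh:
                out.append(1)   # first threshold: via boundary/interior
            else:
                out.append(0)   # background
        labels.append(out)
    return labels
```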
In contrast to
Other types of speed functions and descriptors can be used. In some embodiments, the speed function R at (x, y) can be determined according to the equation:
where ∇I(x, y) is the gradient of the image intensity at (x, y) and α is a constant that can be selected based on the desired detection. For example, α can be set to 1 and increased by 1 until a desired detection is achieved. In some embodiments, the maximum integer for α can be 15 or another appropriate maximum. The rate function R according to Equation II makes the intensity itself less significant in the fast marching method than a rate function equal to the intensity. Because changes in intensity, rather than the intensity itself, drive the rate function R of Equation II, it provides enhanced boundary detection in relatively noisy images or patterns. The rate can be lowest where the gradient of intensity is highest, which facilitates detection of features. For example, the gradient of intensity can be highest at boundaries, significantly extending the calculated fast marching arrival times within boundary regions and facilitating detection of noisy boundary contours. Other types of speed functions can also be used. Elliptical functions or other types of descriptors can be used to find characteristics (e.g., dimensions, shapes, etc.) of detected items. In some embodiments, a descriptor in the form of an ellipse function can be used to fit the boundary locations detected using the speed function R of Equation II. Thus, boundary locations can be detected without counting higher intensity pixels. Additionally or alternatively, lines, curves (e.g., 1st order curves, 2nd order curves, etc.), polygons (e.g., rectangles, squares, triangles, etc.), or other fitting features can be fitted (e.g., to boundary values, pixels, cells, etc.) to enhance detection.
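Because the body of Equation II is not reproduced in this text, the sketch below substitutes one common gradient-driven speed function, R = 1/(1 + |∇I|)^α, that matches the stated behavior (rate lowest where the gradient is highest). The exact form, the central-difference gradient, and the function name `gradient_speed` are assumptions, not the disclosure's equation:

```python
def gradient_speed(image, alpha=1):
    """Hypothetical gradient-driven rate: R = 1 / (1 + |grad I|)**alpha.

    image : 2-D list of intensities.
    alpha : integer exponent controlling sensitivity to gradients.
    Returns a 2-D list of rates; rates are smallest at sharp edges.
    """
    rows, cols = len(image), len(image[0])
    R = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Central differences, clamped at the image border.
            gr = (image[min(r + 1, rows - 1)][c] - image[max(r - 1, 0)][c]) / 2.0
            gc = (image[r][min(c + 1, cols - 1)] - image[r][max(c - 1, 0)]) / 2.0
            grad_mag = (gr * gr + gc * gc) ** 0.5
            R[r][c] = 1.0 / (1.0 + grad_mag) ** alpha
    return R
```

Increasing α, as the text suggests, further depresses the rate at edges, stretching arrival times across boundary regions.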
The computing system 554 includes memory 560, processing unit 562, and a display 568. Memory 560 can include, without limitation, a computer readable medium, volatile memory, non-volatile memory, read-only memory (ROM), random access memory (RAM), or the like. The memory 560 can store information, including instructions (e.g., executable code, computer programs, etc.), reference images, correction algorithms, fast marching algorithms, descriptor functions, optimization programs (e.g., fast marching optimization algorithms, descriptor function optimization algorithms, etc.), calibration programs, labeling protocols, databases, look up tables, or the like. For example, the memory 560 can store a sequence of instructions executable by the processing unit 562.
The processing unit 562 can be in communication with the memory 560 and display 568 and can include, without limitation, one or more programmable processors, computers, central processing units, processing devices, microprocessors, digital signal processors (DSPs), and/or application-specific integrated circuits (ASICs). The processing unit 562 can execute instructions to perform the descriptor guided analysis, fast marching method, etc.
The display 568 can be a liquid crystal display, LED display, or organic light emitting diode (OLED) display. Other types of displays can also be used. A user can view results on the display 568. If the display 568 is a touchscreen, the user can manually select displayed features. In some embodiments, the computing system 554 is communicatively coupled to a network. A user can view images transferred via the network to a remote display.
In operation, items to be inspected are delivered to the imager 552, which captures an image of the items. The processing unit 562 is configured to execute instructions from memory 560 to process the images. In one embodiment, the processing unit 562 executes instructions to automatically identify targeted features. Different instructions can be executed to analyze different targeted features. Additionally or alternatively, results can be stored in memory 560 or transmitted via a communication link (e.g., a computer communication link, the Internet, etc.) to a remote computer or network for viewing, storage, and/or analysis. Results can be used to monitor performance of processing equipment (e.g., deposition equipment, sputtering equipment, CMP equipment, cluster tools, etc.).
The computing system 554 can perform self calibration. A raw image can be sent to the processing unit 562. Information (e.g., dimensions, shapes, position, etc.) about the target feature can be stored in memory or inputted by a user. The processing unit 562 can determine an appropriate scale to perform the fast marching method. The computing system 554 can match the boundaries to expected boundary sizes (e.g., diameters of vias) and can perform self calibration for new images based on variations of scale. After the computing system 554 identifies vias, the computing system 554 can rapidly identify additional vias.
Certain aspects of the technology described in the context of particular embodiments may be combined or eliminated in other embodiments. For example, embodiments disclosed herein can be used to inspect integrated circuitry, optoelectronics, packaging, or the like. Additionally, the fast marching method can be used to process raw images. Different types of fast marching methods (e.g., generalized fast marching methods, classical fast marching methods, etc.) can be combined with other analysis techniques (e.g., level set method). For example, a portion of an image can be processed using the fast marching method and another portion of the image can be processed using a level set method. Results from the fast marching method and results from the level set method can be compared or combined. Descriptors (e.g., descriptor algorithms, descriptor functions, etc.) can be used to modify the fast marching methods, identify regions of an image for detailed analysis, and the like. Further, while advantages associated with certain embodiments have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the present technology. Accordingly, the present disclosure and associated technology can encompass other embodiments not expressly described or shown herein.
This application is a continuation of U.S. patent application Ser. No. 15/609,797, filed May 31, 2017, now U.S. Pat. No. 10,614,338; which is a continuation of U.S. patent application Ser. No. 13/597,890, filed Aug. 29, 2012; each of which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
20190347505 A1 | Nov 2019 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 15609797 | May 2017 | US
Child | 16519508 | | US
Parent | 13597890 | Aug 2012 | US
Child | 15609797 | | US