Semiconductor wafers may be inspected with inspection systems that measure features of the wafers for quality control. It is advantageous to increase the throughput, accuracy, dynamic range, and reliability of such inspection systems while reducing their cost.
In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific examples in which the disclosure may be practiced. It is to be understood that other examples may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. The following detailed description, therefore, is not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims. It is to be understood that features of the various examples described herein may be combined, in part or whole, with each other, unless specifically noted otherwise.
Disclosed herein is a system and method of laser triangulation for wafer inspection. A laser line generator may project a line onto the surface of a wafer. The laser line may be imaged onto a three-dimensional (3D) camera by microscope optics. The 3D camera may acquire two-dimensional (2D) images of the laser line, convert the 2D images into 3D lines in a field programmable gate array (FPGA) processing board, and output the 3D lines to a frontside computer via a universal serial bus (USB) (e.g., USB3.0) interface.
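For illustration only, the core of that conversion can be sketched as a per-column centroid extraction followed by a geometric scaling from pixel displacement to height. The sketch below is a minimal example, not the FPGA implementation; it assumes the camera views along the wafer normal, a 45 degree laser incidence angle, and hypothetical parameter names (pixel_size_um, magnification, threshold).

```python
import numpy as np

def laser_line_to_profile(image, pixel_size_um=5.5, magnification=10.0,
                          incidence_deg=45.0, threshold=10.0):
    """Convert one 2D image of the projected laser line into a height profile.

    Illustrative assumptions (not from the disclosure): the camera views along
    the wafer normal while the laser arrives at incidence_deg from the normal,
    the line runs along the sensor's Y axis so each column holds one
    cross-section of the line, and pixel_size_um / magnification / threshold
    are hypothetical parameters.
    """
    img = np.asarray(image, dtype=np.float64)
    rows = np.arange(img.shape[0])[:, None]         # X (displacement) direction
    masked = np.where(img >= threshold, img, 0.0)   # suppress background light
    weight = masked.sum(axis=0)

    # Sub-pixel centroid of the line in each column (NaN where no signal).
    with np.errstate(invalid="ignore", divide="ignore"):
        centroid_px = (masked * rows).sum(axis=0) / weight
    centroid_px[weight == 0] = np.nan

    # Sensor displacement -> displacement on the wafer -> height: a height
    # change dz shifts the line by dz * tan(incidence_deg) on the wafer.
    shift_on_wafer_um = centroid_px * pixel_size_um / magnification
    height_um = shift_on_wafer_um / np.tan(np.radians(incidence_deg))
    return height_um                                # one 3D line: Z vs Y position
```

In the disclosed system this conversion runs in the FPGA processing board; the NumPy version above only illustrates the arithmetic applied to each acquired 2D image before the resulting 3D lines are output to the frontside computer over the USB interface.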
The sensor head 102 may include a first 3D camera 116, an isolator 118, a specular filter/blocker 120, a magnification changer (e.g., turret/slide) 122, a laser mount 124, and an interface/control board 126. The first 3D camera 116 may include a first camera enclosure 128, a mount 130, a tube 132, and a lens 134. The magnification changer 122 may include a plurality of objective lenses 136-1 to 136-3 of different magnification (e.g., 2×, 10×, and 5×). The magnification changer 122 may provide the same nominal focal plane position of the first camera 116 with respect to the wafer 106 for each of the objective lenses 136-1 to 136-3 by shimming the objective lenses.
The laser mount 124 may include a laser 138, an attenuator 140, a mirror 142, and a quarter wave plate 144. The laser 138 may generate a laser line 146, which may be attenuated by the attenuator 140, reflected by the mirror 142 and passed through the quarter wave plate 144 for projection onto the wafer 106. The attenuator 140, mirror 142, and quarter wave plate 144 between the laser 138 and the wafer 106 may provide a circularly polarized laser line on the surface of the wafer 106 at an angle of 45 degrees to the wafer normal. The laser mount 124 may allow for the following adjustments: translation in Z to get the focal point on the objective lens center axis, rotation about Z to get the line parallel to the tool Y axis, rotation about X to get the line center on the objective lens center axis or to flatten the field of the laser, and rotation about Y to get the plane on the objective lens nominal working distance.
The quarter wave plate 144 may convert the naturally linearly polarized light out of the laser 138 to circularly polarized light. The attenuator 140 may be a neutral density type fixed attenuator to achieve a slight reduction in laser power. The mirror 142 may be a turning mirror to redirect the laser line 146 to the wafer 106. The mirror 142 may include a rotation about Y adjustment to get the plane passing through the nominal tool point. Rotation about Z may also be used to get the line center on the objective lens center axis or to flatten the field of the laser 138. The laser line 146 projected onto the surface of the wafer 106 is reflected back toward the first camera 116 through the magnification changer 122, the specular filter/blocker 120, the isolator 118, and the camera lens 134.
The camera 200 may be mounted with the long dimension of the image sensor 206 parallel to the Y axis. The camera 200 may also be mounted in such a way as to accommodate a splitter and a second camera 148.
The receiver section of the camera mount 210 may enable reconfiguration for camera lenses with focal lengths in the range of 148 mm to 295 mm.
An automated specular filter/blocker 120 may be used when inspecting diffuse reflective surfaces such as pre-reflow bumps at low magnification. The specular filter/blocker 120 may be located close to the objective lens aperture. The specular filter/blocker 120 may include a plurality of selectable filters and/or blockers. The specular filter/blocker 120 may include a wheel with flag/phase positioning. Low moment of inertia (MOI) and friction may allow a low-power stepper motor to position blockers or filters quickly and accurately. The multiple blockers may have different sizes and may have different shapes depending on what spatial blocking would provide the cleanest signal (similar to micro inspection or scattering tools that block all light not related to the signals that are of interest). A filter may include a Fourier filter. A liquid crystal display (LCD) as well as a solid blocking material (e.g., Vantablack) may be used. The specular filter/blocker 120 may be controlled automatically using recipes and may include a means to detect which position the filter/blocker is in and/or to detect if the filter/blocker is not fully in one of the positions.
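As a rough sketch of the recipe control described above, a recipe might simply name the filter/blocker position to use for a given surface type and magnification, and the wheel is then driven to that position. All names below (wheel positions, recipe fields) are hypothetical and only illustrate the idea.

```python
from dataclasses import dataclass

# Hypothetical wheel positions; the actual set of blockers/filters is tool specific.
WHEEL_POSITIONS = {
    "open": 0,                     # no blocking (e.g., specular surfaces)
    "specular_blocker_small": 1,
    "specular_blocker_large": 2,
    "fourier_filter": 3,
}

@dataclass
class InspectionRecipe:
    magnification: float
    surface_type: str              # e.g., "pre_reflow_bump"
    filter_position: str           # key into WHEEL_POSITIONS

def wheel_index_for(recipe: InspectionRecipe) -> int:
    """Return the stepper index for the recipe's filter/blocker position."""
    if recipe.filter_position not in WHEEL_POSITIONS:
        raise ValueError(f"unknown filter position: {recipe.filter_position}")
    return WHEEL_POSITIONS[recipe.filter_position]

# Diffuse pre-reflow bumps at low magnification use a specular blocker.
recipe = InspectionRecipe(2.0, "pre_reflow_bump", "specular_blocker_small")
print(wheel_index_for(recipe))     # -> 1
```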
A cylindrical lens 150 or receiver defocus may be used. In other examples, the cylindrical lens 150 is not used. As the radius of the top of a mirror-like spherical bump becomes small compared to the laser line width, the data can exhibit a stair step effect. This effect is at its worst when the spherical surface spreads the light far beyond the objective lens numerical aperture (NA), causing a diffraction limited line to be formed on the image sensor. When the line width is less than one pixel, the stair step effect is easily visible. The centroid error as a function of laser image size for a Gaussian shape should be kept a factor of two below the 1/16th subpixel resolution, which comes to about 3% of a pixel. Accordingly, the image width should be at least 1.5 pixels to keep the centroid error below 3%. A weak cylindrical lens that increases the camera lens focal length (FL) in the X direction may be used to accomplish this. The camera may also be moved closer to the camera lens to accomplish this; however, this also defocuses Y, which may be desirable or undesirable depending on the amount of defocusing and the feature being inspected. The camera focus adjustment may have additional travel in the negative Z direction to accommodate defocusing to increase the objective lens NA diffraction limited spot size to at least three pixels.
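The sizing argument can be checked numerically: 1/16 of a pixel is 0.0625 pixel, and a factor of two below that is roughly 0.03 pixel, i.e., about 3% of a pixel. The sketch below, which assumes the "image width" is the 1/e² full width (four sigma) of a Gaussian line and ignores noise and thresholding, estimates the worst-case centroid bias over all sub-pixel line positions; it is an illustration, not part of the disclosure.

```python
import numpy as np
from math import erf, sqrt

def pixel_signal(center, sigma, n_pixels):
    """Integral of a unit-area Gaussian over each 1-pixel-wide bin."""
    edges = np.arange(n_pixels + 1) - 0.5            # pixel i spans [i-0.5, i+0.5]
    cdf = np.array([0.5 * (1.0 + erf((e - center) / (sigma * sqrt(2.0))))
                    for e in edges])
    return np.diff(cdf)

def max_centroid_error(width_px, n_pixels=15, n_offsets=101):
    """Worst-case centroid bias (in pixels) over all sub-pixel line positions,
    taking width_px as the 1/e^2 full width (= 4 sigma) of the laser image."""
    sigma = width_px / 4.0
    idx = np.arange(n_pixels)
    worst = 0.0
    for frac in np.linspace(0.0, 1.0, n_offsets):
        center = n_pixels // 2 + frac                # true sub-pixel line center
        s = pixel_signal(center, sigma, n_pixels)
        centroid = (s * idx).sum() / s.sum()         # measured centroid
        worst = max(worst, abs(centroid - center))
    return worst

for w in (0.5, 1.0, 1.5, 2.0, 3.0):
    print(f"image width {w:.1f} px -> worst-case centroid error "
          f"{max_centroid_error(w):.4f} px")
```

Under these assumptions the worst-case bias falls steeply with line width, from roughly ten percent of a pixel at a 1 pixel width to about two percent at 1.5 pixels and essentially nothing by 2 to 3 pixels, consistent with the guidance above.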
The magnification changer 122 of the sensor head 102 may automatically switch between two or three objective lenses 136-1 to 136-3 via recipe control. The magnification changer 122 does not cause the distance between the 2D and laser triangulation optical center lines to increase. The magnification changer 122 may include a means to detect which lens position the magnification changer is in. If the magnification changer 122 is not fully in a lens position, then the magnification changer does not report as being in any position.
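The position-reporting rule above can be sketched as follows; the sensor inputs and position names are hypothetical and only illustrate the "report no position unless fully seated" behavior (the same kind of check described for the specular filter/blocker 120).

```python
from enum import Enum
from typing import Dict, Optional

class LensPosition(Enum):
    POSITION_1 = 1     # e.g., 2x objective
    POSITION_2 = 2     # e.g., 10x objective
    POSITION_3 = 3     # e.g., 5x objective

def report_lens_position(flag_sensors: Dict[LensPosition, bool]) -> Optional[LensPosition]:
    """Report the current lens position only if the changer is fully seated.

    flag_sensors maps each position to True when that position's flag/detent
    sensor shows the changer fully engaged there. If no sensor (or more than
    one sensor) is asserted, the changer is between positions and no position
    is reported at all.
    """
    seated = [pos for pos, engaged in flag_sensors.items() if engaged]
    return seated[0] if len(seated) == 1 else None

# Mid-travel between lenses: nothing asserted, so no position is reported.
print(report_lens_position({p: False for p in LensPosition}))   # -> None
```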
The magnification changer 122 may support multiple interchangeable objective lenses of the same family. For example, the objective lenses may include any suitable combination of the following: 2×, 3×, 5×, 7.5×, 10×, or 20× lenses. The objective lenses may be manually swapped in the field with only configuration file changes and recalibration. Optical adjustments should not be required. If the parfocal distance varies too much between objectives, then custom spacers may be used to adjust the distance. In this case, a master 10× objective with nominal spacer may be used in manufacturing so that all production 2×, 3×, 5×, and 10× objectives may be spaced to the ideal parfocal distance.
The interface/control board 126 may include a microcontroller (MCU) with serial peripheral interface (SPI), general purpose input/output (GPIO), analog to digital converters (ADCs), and digital to analog converters (DACs). The interface/control board 126 may support at least one laser (e.g., laser 138) and two 3D cameras (e.g., 3D cameras 116 and 148). The interface/control board 126 may receive power over power path 162 and distribute it over power path 164 to the 3D camera(s) 116 and/or 148, the laser(s) 138, and other devices as necessary. The interface/control board 126 may control the magnification changer 122 (e.g., turret/slide) through a communication path 166, the specular filter/blocker 120 through a communication path 170, and the on/off and output power of the laser(s) 138 through a communication path 168.
The interface/control board 126 may receive the RS422 trigger and encoder signals through signal path 112 and convert them to single-ended TTL signals for the 3D camera(s). The interface/control board 126 may output the trigger signals to the 3D camera(s) through a signal path 172 and output the encoder signals to the 3D camera(s) through a signal path 174. In one example where two 3D cameras are used, one camera may receive odd-numbered triggers and the other camera may receive even-numbered triggers.
The interface/control board 126 may support trigger buffering logic, which queues up triggers when the triggers are coming in too fast during acceleration overshoot and velocity ripple peaks and then catches up during velocity ripple valleys. The maximum queue depth may be configured. With the XY stage, trigger buffering may allow velocity safety margins as small as 0.3% to be used while falling behind by no more than one trigger. The trigger buffering logic may use the trigger output signal from the camera through signal path 172 to decide when the next trigger can be sent. In other examples, the trigger buffering logic may be implemented by the trigger board 108 or by the 3D camera 116.
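A minimal sketch of the trigger buffering idea is shown below, assuming a queue of pending triggers that is drained only when the camera's trigger output signal indicates the previous exposure is finished; the class and method names are illustrative, not the board's actual logic.

```python
class TriggerBuffer:
    """Queue stage triggers and release them only when the camera is ready.

    Triggers that arrive too quickly (acceleration overshoot, velocity ripple
    peaks) are queued up to max_depth and sent out later, during velocity
    ripple valleys, so the system falls behind by at most max_depth triggers.
    """

    def __init__(self, max_depth=8):
        self.max_depth = max_depth
        self.pending = 0            # number of queued, not-yet-sent triggers
        self.camera_ready = True
        self.dropped = 0

    def on_stage_trigger(self):
        """Call for each incoming trigger from the trigger board."""
        if self.camera_ready and self.pending == 0:
            self._send()
        elif self.pending < self.max_depth:
            self.pending += 1
        else:
            self.dropped += 1       # queue overflow: this trigger is lost

    def on_camera_trigger_output(self):
        """Call when the camera's trigger output (signal path 172) indicates
        the previous exposure is done and the next trigger may be sent."""
        self.camera_ready = True
        if self.pending:
            self.pending -= 1
            self._send()

    def _send(self):
        self.camera_ready = False
        # In hardware this would assert the TTL trigger line to the camera;
        # with two cameras, alternate (odd/even) triggers could be routed here.
        print("trigger sent")
```

In the scenario described above the queue would rarely hold more than one trigger; the configurable maximum depth simply bounds the worst case.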
A star pattern may be used for Y calibration and XY origin calibration. A star pattern may also be used for profiling the laser throughout the entire Y field of view (FOV) and Z FOV and for checking and/or adjusting the calibration block slope and rotation. In one example, the rectangle and star features accommodate a 2 mm Z FOV and an 8 mm Y FOV.
Although specific examples have been illustrated and described herein, a variety of alternate and/or equivalent implementations may be substituted for the specific examples shown and described without departing from the scope of the present disclosure. This application is intended to cover any adaptations or variations of the specific examples discussed herein. Therefore, it is intended that this disclosure be limited only by the claims and the equivalents thereof.
This application is a PCT application that claims priority to U.S. Provisional Patent Application No. 62/516,701, filed Jun. 8, 2017, entitled “WAFER INSPECTION SYSTEM INCLUDING A LASER TRIANGULATION SENSOR,” which is incorporated herein by reference.