The present invention relates to precise positioning of stages used in manufacturing and pertains particularly to using target images to determine a location of a stage.
Many manufacturing processes require precise positioning of stages used in manufacturing. As used herein, a stage is any platform or device used to support or hold an article of manufacture, or, more generally, any object that can be attached to another object.
One part of the positioning used during manufacturing is determining precisely where a stage is located in relation to a reference position. For example, several types of systems can be used to locate, relative to a reference position, a movable stage used in semiconductor processing. A self-mixing feedback laser can be used to determine a location relative to a reference position. See, for example, U.S. Pat. No. 6,233,045. However, the accuracy of measurements using self-mixing feedback lasers is currently limited to 1 millimeter, which is insufficient for some applications.
For applications that require high resolution, other types of systems can be used to determine a location of a stage relative to a reference position. For example, a two-wavelength, synthetic-wavelength interferometer can be used. See, for example, U.S. Pat. No. 4,907,886. Alternatively, a grating sensor can be used. See, for example, U.S. Pat. No. 4,176,276. A disadvantage of these solutions is their relatively high expense.
Other types of systems can be used to precisely determine a location relative to a reference position. For example, reflective sensors, such as the Keyence PS 47 photoelectric sensor available from Keyence Corporation, can be used. However, such a system requires one sensor per degree of freedom, which complicates system geometry.
A fiber optic bundle sensor, such as the MTI-2000 Fotonic vibration sensor, available from MTI Instruments, Inc., can also be used. However, such a fiber optic bundle sensor typically requires a stage clearance of approximately 1 millimeter, which is insufficient for many applications.
In accordance with an embodiment of the present invention, the position of a stage is determined. Images of a plurality of targets located on the stage are captured. The captured images of the plurality of targets are compared with stored images to determine displacement coordinates for each target. The displacement coordinates for the targets are translated into position coordinates for the stage.
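The embodiment does not tie the image comparison to one specific algorithm; one common way to recover the displacement of a captured target image relative to a stored image is FFT-based phase correlation. The following is a minimal illustrative sketch (not the patent's specified method), assuming grayscale images as NumPy arrays and integer-pixel shifts; a real sensor would add sub-pixel interpolation.

```python
import numpy as np

def estimate_shift(stored, captured):
    """Estimate the (row, col) displacement of `captured` relative to
    `stored` using FFT-based phase correlation."""
    F = np.fft.fft2(stored)
    G = np.fft.fft2(captured)
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross = G * np.conj(F)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    # Unwrap shifts larger than half the image size to negative values.
    for axis, size in enumerate(corr.shape):
        if shift[axis] > size // 2:
            shift[axis] -= size
    return shift
```

For example, an image circularly shifted by (3, −5) pixels yields an estimated shift of (3.0, −5.0). Two such in-plane displacement components per target, from three targets, supply the six readings needed to solve for the six degrees of freedom of the stage.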
Sensor 11 illuminates and images a target area 17. Light between sensor 11 and target area 17 travels along a light path 14. Sensor 12 illuminates and images a target area 18. Light between sensor 12 and target area 18 travels along a light path 15. Sensor 13 illuminates and images a target area 19. Light between sensor 13 and target area 19 travels along a light path 16. Processing software 22 is used to process images captured from the targets and compare the images with stored images to produce displacement coordinates for each target. Processing software 22 then translates displacement coordinates for the targets into absolute position coordinates for stage 10, measured from a reference location. Portions of processing software 22 can reside within sensors 11, 12 and 13. Alternatively, processing software 22 used for image processing can be located completely outside sensors 11, 12 and 13 and in a separate processing system.
Imaging chip 21 is, for example, a complementary metal-oxide semiconductor (CMOS) imager, a charge coupled device (CCD) array or another type of imaging hardware or camera. Processing software 22 can be partially located within imaging chip 21. Alternatively, processing software 22 used for image processing can be located completely outside imaging chip 21 and in a separate processing system.
Optics 23 include, for example, one or more optical lenses. Optics 23 are used to magnify the image of a target within target area 17 and project the image towards a sensor of imaging chip 21 or a sensor package connected to imaging chip 21.
In a block 73, image processing software/firmware, located either in imaging chips within sensors 11, 12 and 13 or in a separate processing system, compares the captured images with stored images to determine displacement coordinates for each target.
In a block 74, the displacement coordinates reported by all of sensors 11, 12 and 13 are translated to calculate position coordinates for stage 50 in the six degrees of freedom.
Target plane 57 is defined in two dimensions by a first coordinate W0 and a second coordinate V0. Target plane 58 is defined in two dimensions by a first coordinate W1 and a second coordinate V1. Target plane 59 is defined in two dimensions by a first coordinate W2 and a second coordinate V2.
The six degrees of freedom of motion for stage 50 are defined as translational movement (dx) along the x-axis, translational movement (dy) along the y-axis, translational movement (dz) along the z-axis, rotational movement (dRx) about the x-axis, rotational movement (dRy) about the y-axis and rotational movement (dRz) about the z-axis.
Dimensions of stage 50 are 2X along the x-axis, 2Y along the y-axis and 2Z along the z-axis. That is, the distance between target 57 and target 58 along the x-axis is 2X. The distance between target 57 and target 59 along the y-axis is 2Y. The distance between the plane defined by target 57, target 58 and target 59 and the xy plane along the z-axis is Z.
Target planes 57, 58 and 59 are all at arctan(√2), or 54.73561 degrees, to the three orthogonal planes (xy plane, xz plane and yz plane) of stage 50.
A sensor 60 captures images of target plane 57 and thus is used to monitor coordinates (W0, V0). A sensor 61 captures images of target plane 58 and thus is used to monitor coordinates (W1, V1). A sensor 62 captures images of target plane 59 and thus is used to monitor coordinates (W2, V2). The optical axes of sensors 60, 61, 62 are nominally perpendicular to respective target planes 57, 58, 59 for the purpose of minimizing optical distortion of the target images.
Three dimensional translational movement (dx, dy, dz) and three dimensional rotational movement (dRx, dRy, dRz) of stage 50 cause target plane 58 to move a total of Δx1, Δy1, Δz1 respectively along the x, y and z axes. The movement manifests in a change of target co-ordinates readings of ΔW1 and ΔV1 as follows:
ΔW1 = −αΔx1 − αΔy1
ΔV1 = βΔx1 − βΔy1 − 2βΔz1,
where α = √2/2 and β = √6/6.
The three dimensional translational movement (dx, dy, dz) and three dimensional rotational movement (dRx, dRy, dRz) of stage 50 cause target plane 57 to move a total of Δx0, Δy0, Δz0 respectively along the x, y and z axes. The movement manifests in a change of target co-ordinates readings of ΔW0 and ΔV0 as follows:
ΔW0 = −αΔx0 + αΔy0
ΔV0 = −βΔx0 − βΔy0 − 2βΔz0.
The three dimensional translational movement (dx, dy, dz) and three dimensional rotational movement (dRx, dRy, dRz) of stage 50 cause target plane 59 to move a total of Δx2, Δy2, Δz2 respectively along the x, y and z axes. The movement manifests in a change of target co-ordinates readings of ΔW2 and ΔV2 as follows:
ΔW2 = αΔx2 + αΔy2
ΔV2 = −βΔx2 + βΔy2 − 2βΔz2.
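For a pure stage translation (no rotation), the same (Δx, Δy, Δz) applies at every target, and the three equation pairs above can be evaluated directly. The following is an illustrative sketch, with α and β as defined above:

```python
import math

ALPHA = math.sqrt(2) / 2   # α = √2/2
BETA = math.sqrt(6) / 6    # β = √6/6

def target_deltas(dx, dy, dz):
    """Change in (W, V) readings at targets 57, 58 and 59 for a pure
    stage translation (dx, dy, dz), per the equations above."""
    # Target 57: ΔW0 = −αΔx + αΔy;  ΔV0 = −βΔx − βΔy − 2βΔz
    w0 = -ALPHA * dx + ALPHA * dy
    v0 = -BETA * dx - BETA * dy - 2 * BETA * dz
    # Target 58: ΔW1 = −αΔx − αΔy;  ΔV1 = βΔx − βΔy − 2βΔz
    w1 = -ALPHA * dx - ALPHA * dy
    v1 = BETA * dx - BETA * dy - 2 * BETA * dz
    # Target 59: ΔW2 = αΔx + αΔy;   ΔV2 = −βΔx + βΔy − 2βΔz
    w2 = ALPHA * dx + ALPHA * dy
    v2 = -BETA * dx + BETA * dy - 2 * BETA * dz
    return (w0, v0), (w1, v1), (w2, v2)
```

A useful consistency check follows from the equations: a pure z translation leaves every ΔW reading unchanged and moves every ΔV reading by −2βΔz, because the z-axis projects identically onto all three tilted target planes.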
Total movements Δx, Δy, Δz at each target location due to both stage translation (dx, dy, dz) and stage rotation (dRx, dRy, dRz) are related by simple geometry, as set out in Table 1 below:
Cascading by matrix multiplication, changes in the six target co-ordinates (ΔW1, ΔV1, ΔW0, ΔV0, ΔW2, ΔV2) can be obtained from the six stage movements (dx, dy, dz, dRx, dRy, dRz) as set out in Table 2 below:
Conversely, the six stage movements (dx, dy, dz, dRx, dRy, dRz) can be computed from the changes of the six target co-ordinates (ΔW1, ΔV1, ΔW0, ΔV0, ΔW2, ΔV2) by a scaled inverse of the above 6×6 matrix. The six target co-ordinates (ΔW1, ΔV1, ΔW0, ΔV0, ΔW2, ΔV2) are monitored by sensor 61, sensor 60 and sensor 62. This is illustrated in Table 3 below:
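As an illustrative sketch of this forward-and-inverse transformation (not a reproduction of Tables 2 and 3, which are not shown here): assuming target locations at (±X, ±Y, Z) consistent with the stated 2X and 2Y separations, the small-motion relation Δ = d + dR × r at each target, cascaded with the in-plane (W, V) projections above, yields the 6×6 matrix, which can then be inverted numerically. The target positions below are an assumption; the patent's Table 1 encodes the actual geometry.

```python
import numpy as np

ALPHA = np.sqrt(2) / 2   # α = √2/2
BETA = np.sqrt(6) / 6    # β = √6/6

# In-plane (W, V) projection vectors for targets 58, 57 and 59,
# taken directly from the ΔW/ΔV equations above.
PROJ = {
    58: (np.array([-ALPHA, -ALPHA, 0.0]), np.array([BETA, -BETA, -2 * BETA])),
    57: (np.array([-ALPHA,  ALPHA, 0.0]), np.array([-BETA, -BETA, -2 * BETA])),
    59: (np.array([ ALPHA,  ALPHA, 0.0]), np.array([-BETA,  BETA, -2 * BETA])),
}

def stage_matrix(X, Y, Z):
    """Assemble the 6x6 matrix mapping stage movement
    (dx, dy, dz, dRx, dRy, dRz) to the six target readings
    (ΔW1, ΔV1, ΔW0, ΔV0, ΔW2, ΔV2)."""
    # Assumed target locations consistent with the stated separations.
    pos = {57: np.array([-X, -Y, Z]),
           58: np.array([ X, -Y, Z]),
           59: np.array([-X,  Y, Z])}
    rows = []
    for tgt in (58, 57, 59):   # reading order (ΔW1, ΔV1, ΔW0, ΔV0, ΔW2, ΔV2)
        r = pos[tgt]
        for axis_vec in PROJ[tgt]:
            # Small-motion kinematics: Δ = d + dR x r, so the reading is
            # axis_vec . d + axis_vec . (dR x r) = axis_vec . d + (r x axis_vec) . dR
            rows.append(np.concatenate([axis_vec, np.cross(r, axis_vec)]))
    return np.array(rows)

M = stage_matrix(X=1.0, Y=1.0, Z=0.5)
motion = np.array([0.1, -0.2, 0.05, 0.001, -0.002, 0.003])
readings = M @ motion                       # forward transformation (Table 2)
recovered = np.linalg.solve(M, readings)    # inverse transformation (Table 3)
```

Solving the linear system recovers the six stage movements from the six monitored target co-ordinate changes, provided the geometry keeps the matrix nonsingular.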
It is convenient to define the x-axis and the y-axis to be on the plane defined by the three targets 57, 58 and 59. That is, target 57, target 58 and target 59 lie on the xy plane. In effect, Z equals 0. For this design, the transformation set out in Table 3 simplifies to the transformation set out in Table 4 below:
The foregoing discussion discloses and describes merely exemplary methods and embodiments of the present invention. As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.