Component sensor for pick and place machine using improved shadow imaging

Information

  • Patent Grant
  • Patent Number
    8,068,664
  • Date Filed
    Thursday, June 5, 2008
  • Date Issued
    Tuesday, November 29, 2011
Abstract
A method of sensing a component held by a nozzle of a pick and place machine is provided. The method includes engaging a source of illumination and recording a reference background image when no component is held by the nozzle. Then, a component is adhered to the nozzle. A shadow image of the component is detected while the component is held by the nozzle. The detected shadow image of the component is adjusted based upon the recorded reference background image. Positional information relative to the component held on the nozzle is computed using the adjusted shadow image. The component is then mounted upon a workpiece using the positional information.
Description
BACKGROUND

Pick and place machines are generally used to manufacture electronic circuit boards. A blank printed circuit board is usually supplied to the pick and place machine, which then picks individual electronic components from component feeders, and places such components upon the board. The components are held upon the board temporarily by solder paste, or adhesive, until a subsequent step in which the solder paste is melted or the adhesive is fully cured. The individual electronic components must be placed precisely on the circuit board in order to ensure proper electrical contact, thus requiring correct angular orientation and lateral positioning of the component upon the board.


Pick and place machine operation is challenging. In order to drive the cost of the manufactured circuit board down, the machine must operate quickly to maximize the number of components placed per hour. However, as the state-of-the-art of the electronics industry has advanced, the sizes of the components have decreased and the density of interconnections has increased. Accordingly, the acceptable tolerance on component placement has decreased markedly. Actual pick and place machine operation often requires a compromise in speed to achieve an acceptable level of placement accuracy.


One effective way to speed up pick and place machine operation is to use a sensor that can accurately evaluate both the position and angular orientation of a picked component upon a vacuum nozzle or quill while the component is in transit to the placement site. Such sensors essentially allow the task of determining the component position and orientation upon the vacuum quill to be performed without any impact on placement machine speed, unlike systems that require separate motion to a fixed alignment sensor. Such sensors are known, and are commercially available from CyberOptics Corporation, of Golden Valley, Minn., under the trade designation Model LNC-60. Several aspects of these sensors are described in U.S. Pat. Nos. 5,278,634; 6,490,048; and 6,583,884.


These laser-based alignment sensors are used in pick and place machines to measure the offset (x, y and θ) and size (Sx, Sy) of picked components. Laser-based alignment sensors generally transmit the measured offset values to the pick and place machine controller, so that the controller can correct for the offset and accurately place the component upon the circuit board at the placement site. The part size (Sx, Sy) is also measured and transmitted, allowing the pick and place machine to detect an incorrect part size or other problems.


In a focused imaging system, the relatively large numerical apertures used to acquire images tend to cause dust on intervening surfaces to be appreciably out of focus, which in turn minimizes the effect of dust on the image. However, for reasons of compactness and low cost, it is often desirable to use shadow imaging. Shadow imaging is a technique in which illumination is cast upon a component to be detected, and a detector placed behind the component detects the shadow that the component casts as it blocks some of the illumination. Unfortunately, with shadow imaging, Fresnel diffraction can cause serious disturbances to the image, even from a small dust mote.


Providing a shadow-image sensing component sensor for pick and place machines with an improved resistance to disturbances caused by Fresnel diffraction would benefit the art of automated electronics assembly.


SUMMARY

A method of sensing a component held by a nozzle of a pick and place machine is provided. The method includes engaging a source of illumination and recording a reference background image when no component is held by the nozzle. Then, a component is adhered to the nozzle. A shadow image of the component is detected while the component is held by the nozzle. The detected shadow image of the component is adjusted based upon the recorded reference background image. Positional information relative to the component held on the nozzle is computed using the adjusted shadow image. The component is then mounted upon a workpiece using the positional information.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an elevation view illustrating a diagrammatic laser-based alignment sensor with which embodiments of the present invention are particularly useful.



FIG. 2 is a plan view illustrating a diagrammatic laser-based alignment sensor with which embodiments of the present invention are particularly useful.



FIG. 3 is a flow diagram of a method of operating a component sensor of a pick and place machine in accordance with an embodiment of the present invention.





DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS


FIGS. 1 and 2 are elevation and plan views illustrating a diagrammatic laser-based alignment sensor with which embodiments of the present invention are particularly useful. One or more vacuum quills or nozzles 24 are used to pick up components, such as component 30, from a feeder, such as a tape feeder, and move the component 30 to a placement location upon a circuit board. While component 30 is undergoing the relative motion to its placement site, sensor 43 is able to determine both the location of component 30 upon vacuum quill 24 and the rotational orientation of component 30 upon vacuum quill 24. In the example illustrated in FIGS. 1 and 2, a source of monochromatic illumination, such as laser diode 60, directs illumination away from part 30. Two reflecting mirrors 70, 72 direct the laser light beam through collimating lens 61 and slit aperture 75, past electronic component 30, with the portion of the laser beam or stripe that passes the edge of component 30 being filtered by optical filter 26 to strike linear CCD array 65, which then provides the data to be processed for angular orientation and x, y location. While the illumination is energized, vacuum quill 24, and accordingly component 30, are rotated, and the corresponding movement of the shadows upon linear CCD array 65 is used to calculate the angular orientation and x, y location.


Since the sensor uses shadow imaging with collimated light, if a dust mote is lodged on either of mirrors 70, 72 or on lens 61, Fresnel diffraction can occur and cause undesirable effects. If the shadows from dust motes were completely black, the light being completely obstructed by an opaque dust mote, information could be completely lost in the shadow region. Fortunately, in a shadow-imaging system, the dust shadows are not usually very deep; that is, even in the center of the shadow there is usually an appreciable amount of light. Though the successive attenuation of light by a dust mote and a component involves complicated multiple Fresnel diffraction, a good approximation is to assume that the two effects are purely multiplicative: the detected image is approximately the dust-free shadow image scaled, pixel by pixel, by the dust-laden background. Irregularities in the shadow image can, therefore, be substantially corrected by simple and practical normalization schemes. In accordance with one embodiment of the present invention, irregularities in the shadow image are recorded by obtaining and storing a reference background image when illumination is engaged, but no component is present on the nozzle or quill. Then, the stored reference background image is used to compensate subsequent images of components on the nozzle in order to remove the irregularities. This compensation is preferably done using suitable arithmetic operations on a pixel-by-pixel basis.



FIG. 3 is a flow diagram of a method of operating a component sensor of a pick and place machine in accordance with an embodiment of the present invention. Method 100 begins at block 102 where the illumination source, such as laser diode 60, is engaged to generate illumination. As used herein, “illumination” is intended to mean any suitable electromagnetic radiation that can be obstructed by a component to generate a detectable shadow. Thus, illumination can be visible or invisible to the human eye; can be structured or unstructured; and can be monochromatic or polychromatic.


At block 104, the image detected by detector 65 is stored while the illumination source is engaged, and while no component is present on the nozzle. This stored image is referred to hereinafter as ImgBACKGROUND. At block 106, the illumination is disengaged, and an image is detected from detector 65 while no illumination is present. This stored image is referred to hereinafter as ImgDARK. Note that while block 106 is illustrated as occurring after blocks 102 and 104, the sequence is arbitrary, and embodiments of the present invention can be practiced where block 106 precedes blocks 102 and 104.
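The capture sequence of blocks 102-106 amounts to grabbing two calibration frames. A minimal sketch in Python follows; the laser and detector control objects are hypothetical, since the patent does not specify any hardware interface:

```python
def record_reference_frames(laser, detector):
    """Blocks 102-106: grab the two calibration frames.

    laser and detector are hypothetical control objects; the patent
    does not define a software API for the illumination or detector 65.
    """
    laser.on()                        # engage illumination (block 102)
    img_background = detector.grab()  # ImgBACKGROUND: light on, no component (block 104)
    laser.off()                       # disengage illumination (block 106)
    img_dark = detector.grab()        # ImgDARK: no illumination present
    return img_background, img_dark
```

As the patent notes, the two captures could equally be taken in the opposite order.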


At block 108, component 30 is picked up by nozzle 24. At block 110, detector 65 takes an image of component 30, which image is referred to hereinafter as ImgFG. At block 112, ImgFG is adjusted, or otherwise compensated using ImgBACKGROUND and preferably ImgDARK. Phantom blocks 114 and 116 illustrate different ways in which block 112 can be performed. Block 114 shows a pixel-by-pixel operation where each adjusted pixel is computed as follows:

ImgADJUSTED = (ImgFG − ImgDARK) / (ImgBACKGROUND − ImgDARK)


Thus, block 114 illustrates an operation where the component image is divided by the reference background image (ImgBACKGROUND), and both images ImgFG and ImgBACKGROUND are adjusted by subtracting ImgDARK. This division removes the irregularities, as long as they are stable over time, but the shadow of component 30 is not canceled by this technique because it appears only in ImgFG. Additionally, in situations where ImgDARK is sufficiently dark, or darker than a selected threshold, all pixels of ImgDARK can be set to zero (ImgDARK=0). The selected threshold can be determined for each application and/or through experimentation. Further, each pixel can be multiplied by a selected constant (r) that is chosen to maximize the video level in the adjusted image without significant risk of clipping on overshoots.
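To make the block 114 arithmetic concrete, the following is a minimal NumPy sketch, not the patent's implementation. The function name, the epsilon guard against division by zero, and the floating-point result are assumptions for illustration; the patent itself specifies only the pixel-by-pixel formula, the optional ImgDARK=0 simplification, and the gain constant r:

```python
import numpy as np

def adjust_shadow_image(img_fg, img_background, img_dark=None, gain=1.0, eps=1e-6):
    """Block 114: ImgADJUSTED = r * (ImgFG - ImgDARK) / (ImgBACKGROUND - ImgDARK)."""
    fg = np.asarray(img_fg, dtype=np.float64)
    bg = np.asarray(img_background, dtype=np.float64)
    # When ImgDARK is darker than the selected threshold, it may simply
    # be treated as zero, per the text above.
    dark = 0.0 if img_dark is None else np.asarray(img_dark, dtype=np.float64)
    denom = np.maximum(bg - dark, eps)  # eps guards against division by zero
    # gain is the constant r, chosen to maximize the video level in the
    # adjusted image without significant risk of clipping on overshoots.
    return gain * (fg - dark) / denom
```

Under a stable dust mote, the numerator and denominator are attenuated by approximately the same factor, so the irregularity divides out, while the component shadow, present only in ImgFG, survives.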


Block 116 performs a similar operation, but since division is relatively difficult in digital hardware, the reciprocal of the reference background image (1/ImgBACKGROUND) is stored so that the correction becomes a multiplication. Moreover, if the variations are small enough, they can be approximately removed using subtraction, which is easier still.
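A hedged sketch of the block 116 variant, under the same assumptions as the previous example: the reciprocal is computed once per calibration, so each component image then needs only a multiply.

```python
import numpy as np

def precompute_inverse_background(img_background, img_dark, eps=1e-6):
    # Computed once after calibration; storing the dark-corrected
    # 1/ImgBACKGROUND lets the per-component correction avoid division.
    bg = np.asarray(img_background, dtype=np.float64) - np.asarray(img_dark, dtype=np.float64)
    return 1.0 / np.maximum(bg, eps)

def adjust_with_reciprocal(img_fg, img_dark, inv_background, gain=1.0):
    # Block 116: multiply by the stored reciprocal instead of dividing.
    fg = np.asarray(img_fg, dtype=np.float64) - np.asarray(img_dark, dtype=np.float64)
    return gain * fg * inv_background
```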


At block 118, the adjusted image ImgADJUSTED is used to compute component positional information, such as offset (x, y and θ) and size (Sx, Sy), in accordance with known techniques. This positional information relative to the component is then used to mount the component upon a workpiece, such as a printed circuit board.
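The patent defers the (x, y, θ) and (Sx, Sy) computation to known techniques (see, e.g., U.S. Pat. No. 5,278,634), so the following is only a hypothetical illustration of one ingredient: locating the component's shadow edges in a single adjusted line scan from linear CCD array 65. In sensors of this type, such edge positions tracked over the rotation of quill 24 typically feed the orientation and offset calculations.

```python
import numpy as np

def shadow_edges(adjusted_scan, threshold=0.5):
    """Find the component shadow in one adjusted line scan.

    adjusted_scan holds normalized values: near 1.0 where light is
    unobstructed, near 0.0 in full shadow. threshold is an assumed
    cut level, not a value taken from the patent.
    """
    in_shadow = np.flatnonzero(np.asarray(adjusted_scan) < threshold)
    if in_shadow.size == 0:
        return None  # no component shadow detected in this scan
    left, right = int(in_shadow[0]), int(in_shadow[-1])
    center = 0.5 * (left + right)  # proxy for the lateral offset
    width = right - left + 1       # apparent width at this rotation angle
    return left, right, center, width
```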


Since the only condition necessary to record a reference background image is the absence of a component in the sensor's field of view, it is practical, and preferable, to perform this calibration automatically on a periodic basis during operation. With such periodic calibration, it is believed that embodiments of the present invention will allow compensation for slow accumulation of dust on exposed optical surfaces, which means that cleaning is needed less frequently.
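As a sketch of how such periodic, automatic recalibration might be scheduled (the sensor interface and the 10-minute period are assumptions, not taken from the patent):

```python
import time

class BackgroundCalibrator:
    """Recapture the reference frames whenever the nozzle is empty and
    the recalibration interval has elapsed."""

    def __init__(self, sensor, interval_s=600.0):
        self.sensor = sensor          # hypothetical sensor wrapper
        self.interval_s = interval_s  # assumed period; tune per machine
        self.last_calibrated = float("-inf")

    def maybe_recalibrate(self, nozzle_is_empty):
        # The only precondition is an empty field of view, so this check
        # can run between every pick without slowing placement.
        now = time.monotonic()
        if nozzle_is_empty and now - self.last_calibrated >= self.interval_s:
            self.sensor.record_reference_frames()  # hypothetical API
            self.last_calibrated = now
```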


Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.

Claims
  • 1. A method of sensing a component held by a nozzle of a pick and place machine, the method comprising: engaging a source of illumination and recording a reference background image when no component is held by the nozzle; adhering a component to the nozzle; detecting a shadow image of the component while the component is held by the nozzle; adjusting the detected shadow image of the component based upon the recorded reference background image; computing positional information relative to the component held on the nozzle using the adjusted shadow image, and mounting the component upon a workpiece; and disengaging the source of illumination and recording a dark image (ImgDARK), and subtracting the dark image from the reference background image and from the detected shadow image of the component.
  • 2. The method of claim 1, wherein adjusting the detected shadow image includes dividing the detected shadow image by the reference background image (ImgBACKGROUND).
  • 3. The method of claim 1, wherein adjusting the detected shadow image includes multiplying the detected shadow image by a reciprocal of the reference background image (ImgBACKGROUND).
  • 4. The method of claim 1, wherein adjusting the detected shadow image includes dividing the detected shadow image by the reference background image (ImgBACKGROUND).
  • 5. The method of claim 1, wherein adjusting the detected shadow image includes multiplying the detected shadow image by a reciprocal of the reference background image (ImgBACKGROUND).
  • 6. The method of claim 1, wherein the method is performed periodically.
  • 7. The method of claim 6, wherein the method is performed automatically.
  • 8. The method of claim 1, wherein the method is performed automatically.
  • 9. The method of claim 1, wherein Fresnel diffraction from a dust mote is reduced.
  • 10. The method of claim 1, wherein the illumination is monochromatic illumination.
  • 11. The method of claim 10, wherein the illumination is structured illumination.
  • 12. The method of claim 11, wherein the illumination is visible.
  • 13. The method of claim 1, wherein each pixel is multiplied by a constant that is chosen to increase video level without clipping.
CROSS-REFERENCE TO RELATED APPLICATION

The present application is based on and claims the benefit of U.S. provisional patent application Ser. No. 60/933,287, filed Jun. 5, 2007, the content of which is hereby incorporated by reference in its entirety.

US Referenced Citations (86)
Number Name Date Kind
3337941 Drop et al. Aug 1967 A
3487226 Yetter et al. Dec 1969 A
3622396 Fernandez et al. Nov 1971 A
3624401 Stoller Nov 1971 A
3636635 Lemelson Jan 1972 A
3764813 Clement et al. Oct 1973 A
3781115 Rader et al. Dec 1973 A
3854052 Asar et al. Dec 1974 A
3876877 Meulensteen et al. Apr 1975 A
3888362 Fletcher et al. Jun 1975 A
3905705 Petrohilos Sep 1975 A
4074938 Taylor Feb 1978 A
4144449 Funk et al. Mar 1979 A
4151945 Ragard et al. May 1979 A
4247767 O'Brien et al. Jan 1981 A
4312109 Kawana Jan 1982 A
4346293 Fetzer Aug 1982 A
4383359 Suzuki et al. May 1983 A
4405233 Grau Sep 1983 A
4424588 Satoh et al. Jan 1984 A
4456378 Goldowsky et al. Jun 1984 A
4553843 Langley et al. Nov 1985 A
4559452 Igaki et al. Dec 1985 A
4585350 Pryor Apr 1986 A
4598456 McConnell Jul 1986 A
4615093 Tews et al. Oct 1986 A
4628464 McConnell Dec 1986 A
4706379 Seno et al. Nov 1987 A
4733969 Case et al. Mar 1988 A
4741621 Taft et al. May 1988 A
4747198 Asai et al. May 1988 A
4776088 Biggs et al. Oct 1988 A
4794689 Seno et al. Jan 1989 A
4805110 Takahashi et al. Feb 1989 A
4812666 Wistrand Mar 1989 A
4881319 Yagi et al. Nov 1989 A
4891772 Case et al. Jan 1990 A
4905370 Hineno et al. Mar 1990 A
4973216 Domm Nov 1990 A
5005978 Skunes et al. Apr 1991 A
5012115 Asai et al. Apr 1991 A
5030839 van de Stadt Jul 1991 A
5035047 Harigane et al. Jul 1991 A
5039210 Welstead et al. Aug 1991 A
5040291 Janisiewicz et al. Aug 1991 A
5046851 Morgan Sep 1991 A
5060366 Asai Oct 1991 A
5088187 Takata et al. Feb 1992 A
5114229 Hideshima May 1992 A
5114230 Pryor May 1992 A
5131139 Oyama et al. Jul 1992 A
5162866 Tomiya et al. Nov 1992 A
5260791 Lubin Nov 1993 A
5278634 Skunes et al. Jan 1994 A
5293048 Skunes et al. Mar 1994 A
5309223 Konicek et al. May 1994 A
5331406 Fishbaine et al. Jul 1994 A
5377405 Sakurai et al. Jan 1995 A
5384956 Sakurai et al. Jan 1995 A
5455870 Sepai et al. Oct 1995 A
5467186 Indo et al. Nov 1995 A
5471310 Spigarelli et al. Nov 1995 A
5493391 Neal et al. Feb 1996 A
5493403 Nishi Feb 1996 A
5559727 Deley et al. Sep 1996 A
5566447 Sakurai Oct 1996 A
5570993 Onodera et al. Nov 1996 A
5608642 Onodera Mar 1997 A
5619328 Sakurai Apr 1997 A
5619528 Rebec et al. Apr 1997 A
5660519 Ohta et al. Aug 1997 A
5739525 Greve Apr 1998 A
5745241 Hashimoto Apr 1998 A
5749142 Hanamura May 1998 A
5897611 Case et al. Apr 1999 A
5900940 Aoshima May 1999 A
5901241 Koljonen et al. May 1999 A
6031242 Hudson Feb 2000 A
6100922 Honda et al. Aug 2000 A
6118538 Haugan et al. Sep 2000 A
6195165 Sayegh Feb 2001 B1
6400459 Haugan et al. Jun 2002 B1
6490048 Rudd et al. Dec 2002 B1
RE38025 Skunes et al. Mar 2003 E
6583884 Rudd et al. Jun 2003 B2
7545514 Manickam et al. Jun 2009 B2
Foreign Referenced Citations (38)
Number Date Country
28 34 836 Jun 1979 DE
30 22 803 Apr 1981 DE
062335 Oct 1982 EP
144717 Jun 1985 EP
0293175 May 1988 EP
0374848 Dec 1989 EP
0582086 Jul 1993 EP
0582171 Feb 1994 EP
2183820 Jun 1987 GB
57-017804 Jan 1982 JP
60-183507 Sep 1985 JP
60-189951 Sep 1985 JP
61-225604 Oct 1986 JP
62-008006 Jan 1987 JP
62-263405 Nov 1987 JP
62-288504 Dec 1987 JP
63-202096 Aug 1988 JP
63249018 Oct 1988 JP
63-283100 Nov 1988 JP
63-299400 Dec 1988 JP
2059231 Feb 1990 JP
2062099 Mar 1990 JP
2303751 Dec 1990 JP
3045919 Feb 1991 JP
3115538 May 1991 JP
4322924 Nov 1992 JP
6104596 Apr 1994 JP
6249629 Sep 1994 JP
11040991 Feb 1999 JP
11040992 Feb 1999 JP
11068397 Mar 1999 JP
11068398 Mar 1999 JP
2001-230597 Aug 2001 JP
1370456 Jan 1988 SU
WO 9949713 Sep 1999 WO
WO 0174127 Oct 2001 WO
WO 2007033349 Mar 2007 WO
WO 2007033349 Mar 2007 WO
Related Publications (1)
Number Date Country
20090003683 A1 Jan 2009 US
Provisional Applications (1)
Number Date Country
60933287 Jun 2007 US