A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
Automated electronics assembly machines are often used in the manufacture of printed circuit boards, which are used in various electronic devices. The manufacturing process is generally required to operate quite swiftly, since rapid or high speed manufacturing minimizes the cost of the completed printed circuit board. However, the speed with which printed circuit boards are manufactured must be balanced against the acceptable level of scrap or defects caused by the process. Printed circuit boards can be extremely complicated and small, and any one board may have a vast number of components and consequently a vast number of electrical connections. Printed circuit boards are now produced in large quantities. Since such printed circuit boards can be quite expensive and/or be used in expensive equipment, it is important that they be produced accurately and with high quality, high reliability, and minimum scrap. Unfortunately, because of the manufacturing methods available, some level of scrap and rejects still occurs.
Typical faults on printed circuit boards include inaccurate placement of components on the board, which may mean that the components are not correctly electrically connected to the board. Another typical fault occurs when an incorrect component is placed at a given location on a circuit board. Additionally, the component might simply be absent, or it may be placed with incorrect electrical polarity. Further still, insufficient solder paste deposits can lead to poor connections, while too much solder paste can lead to short circuits, and so on.
Further still, other errors may prohibit, or otherwise inhibit, electrical connections between one or more components and the board. An example of this condition is when a small, “stray” electrical component is accidentally released onto a section of the circuit board where another component is to be subsequently placed by another placement operation. This stray component may prevent electrical connectivity of the “correct” component that is placed onto the printed circuit board after the stray component. The condition is further exacerbated when the correct component has a package style, such as a ball grid array (BGA) or flip chip, where the electrical connections are hidden from view after placement. In this condition, the stray component and the integrity of the solder joints cannot be visually inspected for errors or defects, either manually or by automated optical inspection (AOI) systems, since the defects are hidden by the component package. X-ray systems may detect these errors, but such systems remain too slow and expensive for widespread adoption in most printed circuit board assembly lines.
Conventional automated optical inspection systems receive a substrate, such as a printed circuit board, either immediately after placement of the components upon the printed circuit board and before wave soldering, or post reflow. Typically, the systems include a conveyor that is adapted to move the substrate under test through an optical field of view; one or more images are acquired and analyzed to automatically derive conclusions about components on the substrate and/or the substrate itself. For these systems, the amount of time required to initially program the inspection inputs, and then to fine tune the inspection parameters or models, is often high. Another drawback to these automated optical inspection systems is that, although they can identify manufacturing errors, they often provide little help in identifying the particular processes that caused the manufacturing error. As such, a need has arisen for an improved inspection system that simplifies the initial inspection programming while also providing additional insight into the root cause of manufacturing errors.
An electronics assembly line includes a first electronics assembly machine and a second electronics assembly machine. The first electronics assembly machine has a first electronics assembly machine outlet. The second electronics assembly machine has a second electronics assembly machine inlet and outlet. The inlet of the second electronics assembly machine is coupled to the outlet of the first electronics assembly machine by a conveyor. A first optical inspection sensor is disposed over the conveyor before the inlet of the second electronics assembly machine and is configured to provide first sensor inspection image data relative to a substrate that passes beneath the first optical inspection sensor in a non-stop fashion. A second optical inspection sensor is disposed over the conveyor after the outlet of the second electronics assembly machine and is configured to provide second sensor inspection image data relative to a substrate that passes beneath the second optical inspection sensor in a non-stop fashion. A computer is operably coupled to the first and second optical inspection sensors and is configured to provide an inspection result based upon at least one of the first and second sensor inspection image data.
Embodiments of the present invention will generally be described with respect to the figures, in which a number of reference numerals are used to refer to the various features.
Embodiments of the present invention generally provide an inspection system and method with high speed acquisition of multiple illumination images without the need for expensive and sophisticated motion control hardware. Processing of the images acquired with different illumination types may appreciably enhance the inspection results.
Workpiece transport conveyor 26 translates printed circuit board 10 in the X direction in a nonstop mode to provide high speed imaging of printed circuit board 10 by camera array 4. Conveyor 26 includes belts 14 which are driven by motor 18. Optional encoder 20 measures the position of the shaft of motor 18; hence, the approximate distance traveled by printed circuit board 10 can be calculated. Other methods of measuring and encoding the distance traveled by printed circuit board 10 include time-based, acoustic or vision-based encoding methods. By using strobed illumination and not bringing printed circuit board 10 to a stop, the time-consuming transport steps of accelerating, decelerating, and settling prior to imaging by camera array 4 are eliminated. It is believed that the time required to entirely image a printed circuit board 10 of dimensions 210 mm×310 mm can be reduced from 11 seconds to 4 seconds using embodiments of the present invention, compared to a system that comes to a complete stop before imaging.
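To illustrate the source of this speedup, the following back-of-envelope sketch compares a stop-and-go acquisition cycle with nonstop strobed transport. The field-of-view extent and overlap are taken from the embodiment described below; the per-step settling time, exposure overhead, and conveyor speed are assumed values chosen only for illustration.

```python
# Illustrative throughput comparison: stop-and-go imaging versus nonstop strobed
# acquisition. Settling time, exposure overhead, and conveyor speed are assumed
# values for the sake of the example, not figures recited in the embodiment.
import math

BOARD_LENGTH_MM = 310.0   # board extent in the transport (X) direction
FOV_X_MM = 33.0           # field-of-view extent in X (see embodiment below)
OVERLAP_X_MM = 5.0        # overlap between same-illumination fields in X
STEP_MM = FOV_X_MM - OVERLAP_X_MM

# Number of field-of-view positions needed to tile the board along X.
positions = math.ceil((BOARD_LENGTH_MM - FOV_X_MM) / STEP_MM) + 1

# Stop-and-go: every position pays an accelerate/decelerate/settle penalty.
MOVE_SETTLE_S = 0.7       # assumed per-step move + settle time
EXPOSE_READ_S = 0.2       # assumed per-step exposure + readout overhead
stop_and_go_s = positions * (MOVE_SETTLE_S + EXPOSE_READ_S)

# Nonstop: the board travels at constant velocity and strobes freeze the motion,
# so the scan time is simply travel distance divided by conveyor speed.
CONVEYOR_MM_PER_S = 85.0  # assumed nonstop conveyor speed
nonstop_s = (BOARD_LENGTH_MM + FOV_X_MM) / CONVEYOR_MM_PER_S

print(f"{positions} positions; stop-and-go ~{stop_and_go_s:.0f} s, "
      f"nonstop ~{nonstop_s:.0f} s")
```

Under these assumed values the stop-and-go cycle totals roughly 10 seconds while the nonstop scan takes roughly 4 seconds, consistent in magnitude with the improvement described above.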
Panel sensor 24 senses the edge of printed circuit board 10 as it is loaded into inspection system 92, and this signal is sent to main board 80 to begin an image acquisition sequence. Main board 80 generates the appropriate signals to begin each image exposure by camera array 4 and commands strobe board 84 to energize the appropriate flash lamps 87 and 88 at the proper time. Strobe monitor 86 senses a portion of the light emitted by flash lamps 87 and 88, and this data may be used by main electronics board 80 to compensate image data for slight flash lamp output variations. Image memory 82 is provided and preferably contains enough capacity to store all images generated for at least one printed circuit board 10. For example, in one embodiment, each camera in the array of cameras has a resolution of about 5 megapixels and memory 82 has a capacity of about 2.0 gigabytes. Image data from cameras 2A-2H may be transferred at high speed into image memory buffer 82 to allow each camera to be quickly prepared for subsequent exposures. This allows printed circuit board 10 to be transported through inspection system 92 in a nonstop manner while images of each location on printed circuit board 10 are generated with at least two different illumination field types. The image data may begin to be read out of image memory 82 into PC memory over a high speed electrical interface such as PCI Express (PCIe) as soon as the first images are transferred to memory 82. Similarly, inspection program 71 may begin to compute inspection results as soon as image data is available in PC memory.
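As a rough check on the stated buffer size, the sketch below estimates the total image data generated for one board. The eight-camera count and approximately 5 megapixel resolution follow the embodiment; the 8-bit pixel depth and the number of acquisitions along X are assumptions made only for illustration.

```python
# Back-of-envelope estimate of image memory needed per board. Pixel depth and the
# number of X positions are assumptions; camera count and resolution are as above.
NUM_CAMERAS = 8                 # cameras 2A-2H
PIXELS_PER_IMAGE = 5_000_000    # ~5 megapixels per camera
BYTES_PER_PIXEL = 1             # assumed 8-bit monochrome pixels
ILLUMINATION_TYPES = 2          # each location imaged with at least two field types
X_POSITIONS_PER_TYPE = 11       # assumed acquisitions along X per illumination type

total_images = NUM_CAMERAS * ILLUMINATION_TYPES * X_POSITIONS_PER_TYPE
total_gb = total_images * PIXELS_PER_IMAGE * BYTES_PER_PIXEL / 1e9

print(f"{total_images} images, ~{total_gb:.2f} GB")   # 176 images, ~0.88 GB
```

Under these assumptions the full image set for one board fits comfortably within the 2.0 gigabyte buffer; higher bit depths or additional illumination types scale the requirement proportionally.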
The image acquisition process will now be described in further detail with respect to
In one preferred embodiment, each field of view 30A-30H has approximately 5 million pixels with a pixel resolution of 17 microns and an extent of 33 mm in the X direction and 44 mm in the Y direction. Each field of view 30A-30H overlaps neighboring fields of view by approximately 4 mm in the Y direction, so that the center-to-center spacing for each camera 2A-2H is 40 mm in the Y direction. In this embodiment, camera array field of view 32 has a large aspect ratio of approximately 10:1 in the Y direction relative to the X direction.
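The stated numbers are mutually consistent, as the short check below shows; only the rounding is added here.

```python
# Consistency check of the field-of-view geometry recited above.
PIXEL_PITCH_UM = 17.0
FOV_X_MM, FOV_Y_MM = 33.0, 44.0
NUM_CAMERAS = 8
CENTER_SPACING_Y_MM = 40.0       # 44 mm extent minus 4 mm overlap

pixels_x = FOV_X_MM * 1000 / PIXEL_PITCH_UM            # ~1941 pixels
pixels_y = FOV_Y_MM * 1000 / PIXEL_PITCH_UM            # ~2588 pixels
print(f"pixels per camera: {pixels_x * pixels_y / 1e6:.2f} M")   # ~5.02 M

# Camera array field of view 32 spans all eight cameras in Y but only a single
# field of view in X, which produces the roughly 10:1 aspect ratio.
array_y_mm = (NUM_CAMERAS - 1) * CENTER_SPACING_Y_MM + FOV_Y_MM  # 324 mm
print(f"array FOV: {array_y_mm:.0f} mm (Y) x {FOV_X_MM:.0f} mm (X), "
      f"ratio {array_y_mm / FOV_X_MM:.1f}:1")                    # ~9.8:1
```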
There is a small overlap in the X dimension between fields of view 32 and 34 in order to have enough overlapping image information to register and digitally merge, or stitch together, the images that were acquired with the first illumination field type. There is also a small overlap in the X dimension between fields of view 33 and 35 in order to have enough overlapping image information to register and digitally merge the images that were acquired with the second illumination field type. In the embodiment with fields of view 30A-30H having extents of 33 mm in the X direction, it has been found that an approximate 5 mm overlap in the X direction between fields of view acquired with the same illumination field type is effective. Further, an approximate 14 mm displacement in the X direction between fields of view acquired with different illumination types is preferred.
Images of each feature on printed circuit board 10 may be acquired with more than two illumination field types by increasing the number of fields of view collected and ensuring sufficient image overlap to register and digitally merge, or stitch together, images generated with like illumination field types. Finally, the stitched images generated for each illumination type may be registered with respect to each other. In a preferred embodiment, workpiece transport conveyor 26 has lower positional accuracy than the inspection requires in order to reduce system cost. For example, encoder 20 may have a resolution of 100 microns and conveyor 26 may have a positional accuracy of 0.5 mm or more. Image stitching of fields of view in the X direction compensates for these positional errors of circuit board 10.
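One common way to perform this registration and merging is a normalized cross-correlation search over the nominal overlap region. The sketch below is illustrative only, since the embodiment does not mandate a particular stitching algorithm, and the helper functions are hypothetical.

```python
# Minimal sketch of registering and stitching two adjacent fields of view acquired
# with the same illumination type. The nominal 5 mm overlap at 17 micron pixels is
# roughly 294 pixels; a +/-30 pixel search covers the ~0.5 mm conveyor uncertainty.
import numpy as np

def register_offset(left_img, right_img, nominal_overlap_px=294, search_px=30):
    """Return the overlap (in pixels) that best aligns right_img onto left_img."""
    best_overlap, best_score = nominal_overlap_px, -np.inf
    for overlap in range(nominal_overlap_px - search_px,
                         nominal_overlap_px + search_px + 1):
        a = left_img[:, -overlap:].astype(float).ravel()
        b = right_img[:, :overlap].astype(float).ravel()
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        score = float(np.dot(a, b)) / a.size        # normalized cross-correlation
        if score > best_score:
            best_score, best_overlap = score, overlap
    return best_overlap

def stitch(left_img, right_img, overlap_px):
    """Digitally merge the two images, blending the overlap region."""
    blend = (left_img[:, -overlap_px:].astype(float) +
             right_img[:, :overlap_px].astype(float)) / 2
    return np.hstack([left_img[:, :-overlap_px],
                      blend.astype(left_img.dtype),
                      right_img[:, overlap_px:]])
```

Because the measured overlap, rather than the encoder value alone, determines where the images are joined, positional errors on the order of 0.5 mm from conveyor 26 need not degrade the stitched result.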
It is desirable that each illumination field is spatially uniform and illuminates from consistent angles. It is also desirable for the illumination system to be compact and have high efficiency. Limitations of two prior art illumination systems, linear light sources and ring lights, will be discussed with reference to
Although a ring light could be used to provide acceptable uniformity in azimuth, the ring light would need to be very large to provide acceptable spatial uniformity for camera field of view 32 of approximately 300 mm in the Y direction. For typical inspection applications, it is believed that the ring light would need to be over 1 meter in diameter to provide sufficient spatial uniformity. This enormous ring fails to meet market needs in several respects: the large size consumes valuable space on the assembly line, the large light source is expensive to build, the illumination angles are not consistent across the working field, and it is very inefficient, since the light output is scattered over a significant fraction of the 1 meter circle while only a slim rectangle of the board is actually imaged.
An optical device, referred to as a light pipe, can be used to produce a very uniform light field for illumination. For example, U.S. Pat. No. 1,577,388 describes a light pipe used to back illuminate a film gate. Conventional light pipes, however, need to be physically long to provide uniform illumination.
A brief description of light pipe principles is provided with respect to
As the elevation angles of light exiting illuminator 65 are the same as those present in source 60, it is relatively easy to tune those angles to specific applications. If a lower elevation illumination angle is desired, then the source may be aimed closer to the horizon. The lower limit to the illumination angle is set by the standoff of the light pipe bottom edge, since light cannot reach the target from angles below the bottom edge of the light pipe. The upper limit to the illumination elevation angle is set by the length of light pipe 66, since several reflections are required to randomize, or homogenize, the illumination azimuth angle. As the elevation angle is increased, there will be fewer bounces within a given length of light pipe 64 before the light reaches workpiece 11.
The polygonal light pipe homogenizer only forms new azimuth angles at its corners; therefore, many reflections are needed to obtain a uniform output. If all portions of the light pipe side walls could spread or randomize the light pattern in the azimuth direction, then fewer reflections would be required and the length of the light pipe in the Z direction could be reduced, making the illuminator shorter and/or wider in the Y direction.
In one aspect, reflective surface 70 is curved in segments of a cylinder. This spreads incoming light evenly in one axis, approximating a one-dimensional Lambertian surface, but does not spread light in the other axis. This shape is also easy to form in sheet metal. In another aspect, reflective surface 70 has a sine wave shape. However, since a sine wave shape has more curvature at the peaks and valleys and less curvature on the sides, the angular spread of light bundle 62 is stronger at the peaks and valleys than on the sides.
Light pipe illuminator 42 can be shortened in the Z direction compared to illuminator 41 if multiple light sources are used. Multiple sources, for example a row of collimated LEDs, reduce the total number of reflections required to achieve a spatially uniform source and hence reduce the required light pipe length. Illuminator 42 is illustrated with light sources 87A-87E which may also be strobed arc lamp sources.
In another aspect of the present invention shown in
In another embodiment of the present invention,
Light projected by source 88 is reflected by mirrors 54 and 55 and aperture plate 58. As the light reflects within mixing chamber 57, diffuser plate 52 also reflects a portion of this light back into mixing chamber 57. After multiple light reflections within mixing chamber 57, diffuser plate 52 is uniformly illuminated. The light transmitted through diffuser plate 52 is emitted into the lower section of illuminator 44, which is constructed of reflective surfaces 70, such as those discussed with reference to
It is understood by those skilled in the art that the image contrast of various object features varies depending on several factors, including the feature geometry, color, reflectance properties, and the angular spectrum of illumination incident on each feature. Since each camera array field of view may contain a wide variety of features with different illumination requirements, embodiments of the present invention address this challenge by imaging each feature and location on workpiece 10 two or more times, with each of these images captured under different illumination conditions and then stored into a digital memory. In general, the inspection performance may be improved by using object feature data from two or more images acquired with different illumination field types.
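As a simple illustration of how two registered images of the same location, taken under different illumination field types, might be used, the sketch below evaluates a candidate feature region under both lightings and selects whichever image gives the stronger local contrast; the actual inspection program may of course combine the images in other ways.

```python
# Illustrative use of two registered images of the same location acquired under
# different illumination field types: pick the lighting with the stronger local
# contrast for each feature region. This is a sketch of the general idea only.
import numpy as np

def local_contrast(img, region):
    """RMS contrast of a rectangular region given as (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = region
    return float(img[r0:r1, c0:c1].astype(float).std())

def best_view(dark_field_img, cloudy_day_img, region):
    """Return the name of the illumination field type with the higher contrast."""
    views = {"dark_field": dark_field_img, "cloudy_day": cloudy_day_img}
    return max(views, key=lambda name: local_contrast(views[name], region))
```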
It should be understood that embodiments of the present invention are not limited to two lighting types such as dark field and cloudy day illumination field nor are they limited to the specific illuminator configurations. The light sources may project directly onto workpiece 10. The light sources may also have different wavelengths, or colors, and be located at different angles with respect to workpiece 10. The light sources may be positioned at various azimuthal angles around workpiece 10 to provide illumination from different quadrants. The light sources may be a multitude of high power LEDs that emit light pulses with enough energy to “freeze” the motion of workpiece 10 and suppress motion blurring in the images. Numerous other lighting configurations are within the scope of the invention including light sources that generate bright field illumination fields or transmit through the substrate of workpiece 10 to backlight features to be inspected.
Inspection performance may be further enhanced by the acquisition of three dimensional image data. For example, electrical component polarity marks such as notches, chamfers, and dimples are three dimensional in nature. Acquisition of three dimensional solder paste deposit image data enables measurement of critical height and volume parameters. Further, three dimensional image data can improve segmentation and identification of small features with height relative to the nearly flat substrate.
Three dimensional information such as the profile of a solder paste deposit may be measured using well known laser triangulation, phase profilometry, or moiré methods, for example. U.S. Pat. No. 6,577,405 (Kranz, et al) assigned to the assignee of the present invention describes a representative three dimensional imaging system. Stereo vision based systems are also capable of generating high speed three dimensional image data.
Stereo vision systems are well known. Commercial stereo systems date to the stereoscopes of the 19th century. More recently a great deal of work has been done on the use of computers to evaluate two camera stereo image pairs (“A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms” by Scharstein and Szeliski) or multiple cameras (“A Space-Sweep Approach to True Multi-Image Matching” by Robert T. Collins). This last reference includes mention of a single camera moved relative to the target for aerial reconnaissance.
To acquire high speed two and three dimensional image data to meet printed circuit board inspection requirements, multiple camera arrays may be arranged in a stereo configuration with overlapping camera array fields of view. The circuit board can then be moved in a nonstop fashion with respect to the camera arrays. Multiple, strobed illumination fields effectively “freeze” the image of the circuit board to suppress motion blurring.
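A minimal sketch of recovering height from such a stereo configuration is given below, assuming the two overlapping images have been rectified; the block-matching search and the camera constants are placeholders rather than values from the embodiment.

```python
# Sketch of depth from a rectified stereo pair, as could be produced by two camera
# arrays with overlapping fields of view. Focal length and baseline are assumed
# placeholder values; the block-matching search is deliberately simple.
import numpy as np

FOCAL_LENGTH_PX = 3000.0    # assumed focal length expressed in pixels
BASELINE_MM = 50.0          # assumed separation between the two camera arrays

def disparity(left, right, row, col, block=7, max_disp=64):
    """Brute-force block matching along a single rectified scanline."""
    h = block // 2
    template = left[row - h:row + h + 1, col - h:col + h + 1].astype(float)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        if col - d - h < 0:
            break
        candidate = right[row - h:row + h + 1,
                          col - d - h:col - d + h + 1].astype(float)
        cost = float(np.abs(template - candidate).sum())     # SAD matching cost
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def depth_mm(disparity_px):
    """Triangulated range for a disparity: Z = f * B / d."""
    return FOCAL_LENGTH_PX * BASELINE_MM / max(disparity_px, 1)
```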
In a preferred embodiment, optical inspection sensors 130, 132, and 134 are configured similarly to optical inspection sensor 94 shown in
Inspection application program 73 computes inspection results for solder paste printing such as print registration, area, percent coverage, and unintended bridging between adjacent solder pads. Height and volume may also be computed if three dimensional image data is available. After components are mounted on circuit board 10 by component placement machines 114 and 116, inspection program 73 computes inspection results to verify absence or presence of a component at a particular location on circuit board 10, whether the correct component was placed, the spatial offset of a component from its nominal design location, the spatial offset with respect to the solder paste print, and whether a component was mounted with the correct polarity. Inspection program 73 also computes whether a stray component was inadvertently released onto circuit board 10 at an improper location such as where another component is to be mounted during a subsequent placement operation.
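By way of illustration, two of the solder paste metrics mentioned above, percent coverage and print registration offset, can be derived from a thresholded image patch of a single pad as sketched below; the threshold value and helper function are hypothetical and are not part of inspection application program 73.

```python
# Illustrative computation of percent coverage and print registration offset for
# one solder pad from a grayscale image patch. Threshold and geometry are assumed.
import numpy as np

def paste_metrics(pad_patch, pad_center_px, paste_threshold=128):
    """Return (percent_coverage, (dx, dy) offset of paste centroid from pad center)."""
    paste = pad_patch >= paste_threshold            # binary paste mask
    percent_coverage = 100.0 * paste.mean()

    ys, xs = np.nonzero(paste)
    if xs.size == 0:
        return 0.0, (None, None)                    # no paste detected on this pad
    dx = xs.mean() - pad_center_px[0]               # registration offset in pixels
    dy = ys.mean() - pad_center_px[1]
    return percent_coverage, (dx, dy)
```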
During the assembly process and after solder paste screen printing, conveyor 122 transports printed circuit board 10 into component placement machine 114 in a non-stop fashion while inspection sensor 130 acquires images of circuit board 10 with one or more illumination field types. These images are transmitted to computer 77 and are made available to inspection application program 73 where the solder paste deposits are identified and the solder paste inspection results are generated.
Component placement machine 114 then places a portion of electrical components onto circuit board 10. When the assembly operation by component placement machine 114 is complete, conveyor 124 facilitates transport of circuit board 10 in a non-stop fashion while optical inspection sensor 132 acquires images of circuit board 10 with one or more illumination types. These images are transmitted to computer 77 and are made available to inspection program 73. Inspection program 73 computes inspection results for component presence/absence, correct component, spatial offset, and component polarity for components placed by placement machine 114. The component offset with respect to the solder paste deposits is also computed by inspection program 73 by using images captured before and after the component placement operation as explained with respect to
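A sketch of this before/after comparison follows: the paste centroid is located in the pre-placement image and the component centroid in the post-placement image, with both images assumed to be registered to a common board coordinate frame. The simple threshold segmentations are placeholders used only for illustration.

```python
# Sketch of deriving a component's offset with respect to the solder paste deposit
# from the pre-placement image (paste visible) and the post-placement image
# (component visible), assuming both are registered to one board coordinate frame.
# The threshold-based segmentations below are illustrative placeholders.
import numpy as np

PIXEL_PITCH_MM = 0.017    # 17 micron pixels

def centroid(mask):
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def offset_to_paste_mm(pre_img, post_img, paste_thresh=128, body_thresh=60):
    """Return (dx, dy) of the component body relative to the paste deposit, in mm."""
    paste_mask = pre_img >= paste_thresh            # bright solder paste deposit
    component_mask = post_img <= body_thresh        # dark component body
    delta_px = centroid(component_mask) - centroid(paste_mask)
    return delta_px * PIXEL_PITCH_MM
```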
With the industry trend of electrical component sizes shrinking ever smaller, there is a risk of component placement machine 114 inadvertently releasing a component at an improper location on circuit board 10. For example, if this so-called stray component was released onto the location where a subsequent ball grid array (BGA) component was to be mounted by component placement machine 114, then this error would go undetected by AOI machine 120 since the stray component would not be visible. Circuit board 10 would not function as intended which may result in it being scrapped, or at least, the faulty BGA site would have to be diagnosed by other methods and reworked at significant cost. Inspection program 73 identifies stray components as explained with respect to
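One plausible way such a stray component could be flagged is sketched below: the registered images taken before and after a placement operation are differenced, and any changed region that does not coincide with a programmed placement site is reported. The thresholds and the site-matching rule are assumptions for illustration, not the specific method of inspection program 73.

```python
# Sketch of stray-component detection by differencing registered before/after
# images of a placement operation. Thresholds and the site-matching radius are
# illustrative assumptions.
import numpy as np
from scipy import ndimage

def stray_component_regions(before, after, expected_sites_px,
                            diff_thresh=40, min_area_px=50, site_radius_px=100):
    """Return centroids (x, y) of changed blobs that match no expected placement site."""
    changed = np.abs(after.astype(int) - before.astype(int)) > diff_thresh
    labels, num_blobs = ndimage.label(changed)      # connected-component labeling

    strays = []
    for blob_id in range(1, num_blobs + 1):
        ys, xs = np.nonzero(labels == blob_id)
        if xs.size < min_area_px:
            continue                                # ignore small noise specks
        cx, cy = xs.mean(), ys.mean()
        near_site = any(np.hypot(cx - sx, cy - sy) < site_radius_px
                        for sx, sy in expected_sites_px)
        if not near_site:
            strays.append((cx, cy))                 # unexplained change: possible stray
    return strays
```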
When the assembly operation by component placement machine 116 is complete, conveyor 126 facilitates transport of circuit board 10 in a non-stop fashion while optical inspection sensor 134 acquires images of circuit board 10 with one or more illumination types. These images are transmitted to computer 77 and are made available to inspection program 73. Inspection program 73 then computes inspection results for presence/absence, correct component, spatial offset, polarity, and offset with respect to the solder paste deposits for the remaining portion of components placed onto circuit board 10 by placement machine 116.
AOI machine 120 computes results such as verifying component presence/absence, location, polarity, and proper solder joint fillets after the solder has been reflowed by oven 118. However, AOI machine 120 cannot identify stray components at BGA or other larger component sites since they are no longer visible. When AOI machine 120 does detect an error, it is often difficult to determine the root cause of an assembly error at that stage in the assembly process. To facilitate improved root cause failure analysis, inspection program 73 can provide images of circuit board 10 to the defect review subsystem of AOI machine 120 that were captured by optical inspection sensors 130, 132, and 134 at the various stages of the assembly process and in the region of the defect identified by AOI machine 120. These images help narrow the list of potential assembly error sources and speed up root cause failure analysis.
Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.
The present application is a Continuation-In-Part application of U.S. patent application Ser. No. 12/886,803, filed Sep. 21, 2010, which application is based on and claims the benefit of U.S. Provisional Application Ser. No. 61/244,616, filed Sep. 22, 2009 and U.S. Provisional Application Ser. No. 61/244,671, filed on Sep. 22, 2009; and is a Continuation-In-Part application of U.S. patent application Ser. No. 12/864,110 filed Jul. 22, 2010; and is a Continuation-In-Part application of U.S. patent application Ser. No. 12/564,131, filed Sep. 22, 2009. All applications listed above are herein incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
1577388 | Twyman | May 1923 | A |
4677473 | Okamoto | Jun 1987 | A |
4750798 | Whitehead | Jun 1988 | A |
4795913 | Blessing et al. | Jan 1989 | A |
4799175 | Sano | Jan 1989 | A |
4896211 | Hung | Jan 1990 | A |
4978224 | Kishimoto et al. | Dec 1990 | A |
4993826 | Yoder | Feb 1991 | A |
5039868 | Kobayashi | Aug 1991 | A |
5058178 | Ray | Oct 1991 | A |
5058982 | Katzir | Oct 1991 | A |
5060065 | Wasserman | Oct 1991 | A |
5086397 | Schuster | Feb 1992 | A |
5153668 | Katzir | Oct 1992 | A |
5245421 | Robertson | Sep 1993 | A |
5260779 | Wasserman | Nov 1993 | A |
5291239 | Jackson | Mar 1994 | A |
5347363 | Yamanaka | Sep 1994 | A |
5455870 | Sepai | Oct 1995 | A |
5517234 | Straayer et al. | May 1996 | A |
5550583 | Amir | Aug 1996 | A |
5684530 | White | Nov 1997 | A |
5696591 | Bilhorn | Dec 1997 | A |
5822055 | Tsai | Oct 1998 | A |
5825495 | Huber | Oct 1998 | A |
5880772 | Kalnajs | Mar 1999 | A |
6020957 | Rosengaus et al. | Feb 2000 | A |
6023663 | Kim | Feb 2000 | A |
6175107 | Juvinall | Jan 2001 | B1 |
6222624 | Yonezawa | Apr 2001 | B1 |
6362877 | Kobayashi | Mar 2002 | B1 |
6577405 | Kranz et al. | Jun 2003 | B2 |
6603103 | Ulrich et al. | Aug 2003 | B1 |
6633375 | Veith | Oct 2003 | B1 |
6750899 | Fishbaine et al. | Jun 2004 | B1 |
6757966 | Inoue | Jul 2004 | B2 |
6850855 | Kawai | Feb 2005 | B2 |
7019826 | Vook | Mar 2006 | B2 |
7027639 | Fishbaine | Apr 2006 | B2 |
7075565 | Raymond | Jul 2006 | B1 |
7310438 | Prince | Dec 2007 | B2 |
7372632 | Lizotte | May 2008 | B2 |
7394084 | Kuriyama et al. | Jul 2008 | B2 |
7460219 | Jung | Dec 2008 | B2 |
7590279 | Akiyama | Sep 2009 | B2 |
7828472 | Liu et al. | Nov 2010 | B2 |
8098372 | Eitan et al. | Jan 2012 | B2 |
20020089664 | Shibata | Jul 2002 | A1 |
20030039388 | Ulrich et al. | Feb 2003 | A1 |
20030110610 | Duquette et al. | Jun 2003 | A1 |
20030179369 | Feldman | Sep 2003 | A1 |
20030227618 | Some | Dec 2003 | A1 |
20040037677 | Koyama et al. | Feb 2004 | A1 |
20040156539 | Jansson | Aug 2004 | A1 |
20050219518 | Korngut | Oct 2005 | A1 |
20050259245 | Cemic | Nov 2005 | A1 |
20060062013 | Imade | Mar 2006 | A1 |
20070160283 | Saphier et al. | Jul 2007 | A1 |
20080156207 | Ellenbogen | Jul 2008 | A1 |
20090148033 | Alumot et al. | Jun 2009 | A1 |
20110069154 | Case | Mar 2011 | A1 |
20110069507 | Haugan | Mar 2011 | A1 |
20110069878 | Case | Mar 2011 | A1 |
20110075156 | Patel et al. | Mar 2011 | A1 |
20110090333 | Haugan | Apr 2011 | A1 |
20110175997 | Case | Jul 2011 | A1 |
20120133920 | Skunes et al. | May 2012 | A1 |
20120327215 | Case et al. | Dec 2012 | A1 |
Number | Date | Country |
---|---|---|
202008004430 | Jan 2009 | DE |
0301255 | Feb 1989 | EP |
0994646 | Apr 2000 | EP |
1578186 | Sep 2005 | EP |
1694109 | Aug 2006 | EP |
2271683 | Apr 1994 | GB |
2417072 | Feb 2006 | GB |
2444409 | Jun 2008 | GB |
61134718 | Jun 1986 | JP |
6229875 | Dec 1987 | JP |
63011842 | Jan 1988 | JP |
02268260 | Nov 1990 | JP |
08327561 | Dec 1996 | JP |
2002271099 | Sep 2002 | JP |
2006324599 | Nov 2006 | JP |
WO 9819200 | May 1998 | WO |
WO 0026640 | May 2000 | WO |
WO 0038494 | Jun 2000 | WO |
WO 0196839 | Dec 2001 | WO |
WO 2009014940 | Jan 2009 | WO |
Entry |
---|
Martin, “A Practical Guide to Machine Vision Lighting”, Advanced Illumination, Rochester, VT, United States, Oct. 2007. |
Scharstein and Szeliski, “A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms”, Microsoft Research, Microsoft Corporation, Redmond, WA. |
Smith, “Modern Optical Engineering: The Design of Optical Systems”, 4th ed. New York: McGraw-Hill, 2008. |
Kang, Webb, Zitnick, and Kanade, “A Multibaseline Stereo System with Active Illumination and Real-time Image Acquisition.” |
Collins, “A Space-Sweep Approach to True Multi-Image Matching” University of Massachusetts, Amherst, MA. |
CyberOptics, “Flex Ultra TM HR, Automated Optical Inspection”, CyberOptics Corporation 2007. |
Notification of transmittal of the International Search Report and the Written Opinion for International application No. PCT/US2009/031744 dated May 18, 2009. |
Notification of transmittal of the International Search Report and the Written Opinion for International application No. PCT/US2010/049619 dated Dec. 8, 2010. |
Notification of transmittal of the International Search Report and the Written Opinion for International application No. PCT/US2010/049617 dated Dec. 8, 2010. |
Notification of transmittal of the International Search Report and the Written Opinion for International application No. PCT/US2010/055452 dated Jan. 17, 2011. |
U.S. Appl. No. 12/864,110, filed Jul. 22, 2010. |
U.S. Appl. No. 12/564,131, filed Sep. 22, 2009. |
U.S. Appl. No. 12/886,784, filed Sep. 21, 2010. |
U.S. Appl. No. 12/886,803, filed Sep. 21, 2010. |
U.S. Appl. No. 12/939,267, filed Nov. 4, 2010. |
Written Opinion for International application No. PCT/US2010/049617 dated Feb. 14, 2012. |
Notification of Transmittal of International Preliminary Report on Patentability for the International application No. PCT/US10/49617 dated Aug. 24, 2012. |
Notification of transmittal of the International Search Report and the Written Opinion for International application No. PCT/US2011/059040 dated Jul. 20, 2012. |
Invitation to Pay Additional Fees from International patent application No. PCT/US2011/059040 dated Mar. 15, 2012. |
Related U.S. Appl. No. 13/480,079, filed May 24, 2012. |
Number | Date | Country | |
---|---|---|---|
20110102575 A1 | May 2011 | US |
Number | Date | Country | |
---|---|---|---|
61244616 | Sep 2009 | US | |
61244671 | Sep 2009 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12886803 | Sep 2010 | US |
Child | 12940214 | US | |
Parent | 12864110 | Jan 2011 | US |
Child | 12886803 | US | |
Parent | 12564131 | Sep 2009 | US |
Child | 12864110 | US |