Coplanar camera scanning system

Information

  • Patent Grant
  • 9088683
  • Patent Number
    9,088,683
  • Date Filed
    Friday, July 22, 2011
  • Date Issued
    Tuesday, July 21, 2015
Abstract
A system for scanning objects having at least two linear array sensors adapted to detect light input signals is provided. A lens is optically connected to each of the linear array sensors and is adapted to receive and transmit an optical image located in a respective lens field of view, along a respective lens axis, to the respective one of the at least two linear array sensors. A light source is provided which generates an illumination stripe in general linear alignment with the lens axis across a depth of the field of view. A cylindrical lens, positioned between the light source and an object to be scanned, is adapted to collect, transmit, and focus light from the light source to form the illumination stripe. This arrangement provides a wider system field of view with generally more uniform resolution.
Description
BACKGROUND

The present invention relates generally to optical scanning systems. More particularly, this invention relates to a scanning system containing a camera using a coplanar light source.


Various optical scanning systems have been developed for reading and decoding coded symbologies, identification of objects, comparison of objects, and measurement of objects. Each of these scanning systems utilizes either a non-coherent or coherent light source. Lighting is one of the key elements in obtaining good image quality. The intensity of light needed for scanning is directly proportional to the transport speed of the scanned object and the speed of the sensor. Generally, the faster an image is to be acquired, the more light is needed. Until now, only high intensity sodium or halogen lighting was adequate to obtain crisp images in cameras that focus over a significant depth of field at high speeds. The light source is usually located off axis from the camera and the sensor that detects the light reflected from the object being scanned.


In applications using sodium lamps as a light source, the lamps are used to provide the illumination required by the camera detection means. These lamps provide an abundance of optical power because they are very bright and have a wide spectral range. There are, however, several disadvantages to sodium lamp light sources. First, due to their extreme brightness, sodium lamps can create an annoyance and possible hazard to workers working in the vicinity of the scanning systems. Second, sodium lights require a large amount of AC power, thus increasing production costs. Third, these light sources create a large amount of heat. Additionally, these lamps can create radio frequency interference, which can present operational problems for equipment in the vicinity of the scanning system.


The use of light sources such as LEDs presents several advantages over sodium and halogen lighting. LED illumination is a more cost effective and ergonomic method of illumination. The problem presented by LED illumination is how to get enough light to the object that is being imaged when focusing over a large depth of field. By eliminating the mounting angle between the light source and the line of sight of the camera lens, the reflected light is managed and a lower intensity light source may be used. Because LEDs can be energized almost instantaneously, they can be de-energized when objects are not being transported within the field of view. This extends the life of the LEDs and also conserves power. Additionally, the power input to individual LEDs may be modulated and pinpointed to a desired area, such that different LEDs within an LED array may be energized at different levels according to the desired application.
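The selective energizing and per-device modulation described above can be illustrated with a short control sketch. The sketch below is purely illustrative; the LedChannel interface, the duty-cycle values, and the presence-sensor hook are assumptions for illustration and are not part of the patent disclosure.

```python
from dataclasses import dataclass

@dataclass
class LedChannel:
    """One independently drivable LED (or group of LEDs) in the array."""
    index: int
    duty_cycle: float = 0.0  # 0.0 = off, 1.0 = full drive

class IlluminationController:
    """Hypothetical controller that modulates individual LED drive levels and
    de-energizes the array when no object is in the field of view."""

    def __init__(self, num_leds: int):
        self.channels = [LedChannel(i) for i in range(num_leds)]

    def set_profile(self, duty_cycles):
        """Drive each LED at its own level, e.g. brighter where more light is
        needed on the target area and dimmer elsewhere."""
        for channel, duty in zip(self.channels, duty_cycles):
            channel.duty_cycle = max(0.0, min(1.0, duty))

    def object_present(self, present: bool):
        """Energize the array only while an object is being transported through
        the field of view, conserving power and extending LED life."""
        if not present:
            for channel in self.channels:
                channel.duty_cycle = 0.0
```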


What is needed is a coherent or non-coherent light source that provides sufficient optical illumination to an object to be scanned while using less energy and alleviating the potential problems of radio frequency interference and heat emission.


SUMMARY OF THE INVENTION

Briefly stated, the present invention provides an optical scanning system which uses a light source to provide an illumination stripe that is optically coplanar to a camera lens and light sensor for barcode reading applications. The light source may be coplanar to the lens axis and light sensor, and preferably is formed from LEDs or other low power consumption illumination sources. The coplanar design provides adequate illumination for a large depth of field at high speeds.


In another aspect, the invention provides a scanning system in which the light source is shifted relative to the line of sight of the camera such that the illumination stripe remains coplanar with the camera line of sight at the required depth of field. The light stripe profile coming from the array can therefore be narrow. The intensity of light required to illuminate an object over the depth of field is significantly reduced, thus allowing for the use of an LED array or other low power light source.


In another aspect, the invention provides a plurality of off-axis light sources to provide an illumination stripe on the object generally coplanar with the camera line of sight at the required depth of field. Different arrays of light sources are energized according to the depth of field of the target object, allowing adequate lighting over a range of distances.


In another aspect, the present invention provides an optical scanning system which uses a light source to provide an illumination stripe that is coplanar to at least two lenses and light sensors for imaging applications. The light source is preferably optically coplanar to the axes of the lenses and light sensors, and preferably is formed from LEDs or other low power consumption illumination sources. The design provides broader imaging capability for wide width conveyors or higher density imaging, along with more uniform resolution of the scanned symbologies or images.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a side view of the coplanar camera in accordance with the preferred embodiment of the present invention.



FIG. 2 is a top view of the coplanar camera in accordance with the preferred embodiment of the present invention.



FIG. 3 is a front isometric view of the coplanar camera in accordance with the preferred embodiment of the invention.



FIG. 4 is a side isometric view of a second embodiment of the invention with a movable array of light sources used in an off-camera lens axis orientation in accordance with the present invention.



FIG. 5 is a side isometric view of a multiple row large depth of field illuminator in accordance with the present invention.



FIG. 6 is an end view of a movable light source in accordance with the present invention.



FIG. 7 is an elevation view of another embodiment of the invention including two optically coplanar cameras.



FIG. 8 is a perspective view showing the system field of view of the coplanar cameras of FIG. 7 and the focusing of the illumination beam across a depth of the system field of view.



FIG. 9 is an enlarged isometric view of the scanning system of FIG. 7.



FIG. 10 is a bottom view taken along lines 10-10 in FIG. 7, showing the two camera lenses located in a generally optically coplanar position with the light source.



FIG. 11 is a top view taken along lines 11-11 in FIG. 7.



FIG. 12 is a side view of the coplanar camera system of FIG. 7.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention will be described with reference to the drawing figures wherein like numerals represent like elements throughout.


Referring to FIG. 1, a coplanar camera scanning system 10 in accordance with the present invention is shown. The coplanar camera scanning system 10 preferably includes a light source 11, a camera lens 12, a focusing ring 13 for the lens 12, a linear array sensor 14, a window 22, a cylindrical lens 18, and a voice coil actuator 16. In the preferred embodiment, the light source 11 is comprised of one or more very high intensity LED arrays, although those skilled in the art will recognize other suitable lighting could be utilized, such as lasers or a laser line generator.


The light source 11 is used to illuminate a surface of a target object, indicated by broken line 17. The emitted light illuminates the target object and is reflected back to the coplanar aligned sensor 14. The coplanar camera scanning system 10 is preferably used to read barcode information from the scanned object. The coplanar camera scanning system 10 preferably utilizes a CMOS linear array sensor 14 to detect the light reflected from the object being scanned. In the first preferred embodiment, a CMOS-based image sensor is referenced, but as those skilled in the art should know, any image sensor can be used, e.g., a CCD-based image sensor. The light reflected onto the CMOS linear array sensor 14 is generated in the preferred embodiment by very high intensity LEDs 11. The preferred embodiment of the present invention utilizes red LEDs within the array. As the technology regarding light sources advances, brighter, more intense LEDs can be used, including LEDs having different wavelengths. Also, low power semiconductor lasers can be utilized.


The LED array 11 acts as the light source for the coplanar camera scanning system 10. As shown in FIG. 2, in the first preferred embodiment of the present invention, the light source 11 is positioned parallel to, and in the same plane as, the CMOS linear array sensor 14. Those skilled in the art should realize that the light source 11 positioned in this manner is on-axis with the CMOS linear array sensor 14. The light source 11 preferably comprises a plurality of LEDs in series with each other, located on one or more circuit boards 31. In this embodiment, the coplanar camera utilizes two LED arrays to generate the required amount of light, with a light source 11 positioned on each side of the camera lens 12. As should be clear to those skilled in the art, the number of LEDs required for each light source 11 differs based on the size of the conveyor belt and the required depth of field. The present invention preferably utilizes 50 LEDs in each of up to four arrays, totaling 200 LEDs. Alternatively, a desired number of low power semiconductor laser arrays may be mounted on the circuit board 31.


The light emitted from the light source 11 is focused to a narrow “stripe” on the object using a cylindrical lens 18. This cylindrical lens 18 is positioned parallel to the light source 11, between the light source 11 and the target object. In the present preferred embodiment, a Fresnel lens is used, but as those skilled in the art should realize, any optical lens can be used in this application. As shown in FIGS. 1 and 2, the positioning of the cylindrical lens in relation to the light source 11 provides an illumination plane that can define a narrow “stripe” of light anywhere within the depth of field. When the target object enters this scanning field, the illumination from the light source 11 illuminates the object. Due to the positioning of the sensor 14 relative to the light source 11, the CMOS linear array sensor 14 detects the most intense light provided by the light source 11.


As shown in FIGS. 1 and 3, the cylindrical lens 18 includes a center slit 20. This center slit 20 permits the light reflected from the target object to return through the cylindrical lens 18 to the camera lens 12, where it is then projected onto the CMOS linear array sensor 14.


In order to maximize the depth of field of the coplanar camera scanning system 10, the voice coil actuator 16 is coupled to the focusing ring 13 of the imaging lens 12 to dynamically focus the image onto the CMOS linear array sensor 14, based on a signal from a range finder 24. Those skilled in the art should recognize that there are many methods and apparatuses that can be used as range finders and for focusing. The signal received from the range finder 24 causes the voice coil actuator 16 to move the camera lens 12 and focus the light reflected from the object onto the linear array sensor 14.
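As a rough illustration of this dynamic-focus step, the sketch below converts a range-finder distance into a lens displacement using the thin-lens approximation and commands a voice coil driver. The function names, the calibration constant, the thin-lens model, and the actuator interface are assumptions made for illustration only, not the specific mechanism disclosed here.

```python
def lens_displacement_mm(object_distance_mm: float, focal_length_mm: float) -> float:
    """Thin-lens approximation: 1/f = 1/d_o + 1/d_i.
    Returns how far the lens must sit beyond its infinity-focus position."""
    image_distance = 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)
    return image_distance - focal_length_mm

def refocus(range_finder_mm: float, voice_coil, focal_length_mm: float = 50.0):
    """Move the camera lens so the object plane reported by the range finder
    is imaged sharply onto the linear array sensor."""
    displacement = lens_displacement_mm(range_finder_mm, focal_length_mm)
    voice_coil.move_to(displacement)  # hypothetical voice coil actuator interface
```

For example, with a 50 mm lens and an object reported at 1000 mm, the sketch commands a displacement of roughly 2.6 mm beyond the infinity-focus position.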


Optionally, the invention may include a focusing mechanism 26 for the light source to more accurately focus the emitted light onto a scanned object. This enhances the image which is received by the camera lens 12 and projected onto the CMOS linear array sensor 14. The focusing mechanism 26 is coupled to the light source 11 and dynamically moves the position of the lens 18 with respect to the position of the light source 11. It should be noted that either the focusing mechanism 26 or the light source 11, or both, may be moved to focus the light. Such movement, of course, depends on the distance of the object from the coplanar camera 10. This alternative embodiment keeps the intensity of the illumination stripe maximized at any distance, providing a cleaner image for detection by the CMOS linear array sensor 14.


Referring to FIG. 4, a second embodiment of the present invention uses an off axis light source 40 which is located off the camera lens axis and the linear array sensor, as represented by lines 43. The off axis light source 40 illuminates a target object by directing a beam of light onto its surface. However, the focused illumination stripe 44 is coplanar with the camera lens axis 43 and the linear sensor array at the required depth of field. The off axis light source 40 is preferably a movable array of LED sources 45 adapted to provide light to the target object. The invention, however, is not limited to this particular configuration or light source, as those skilled in the art will recognize that alternative light sources, such as semiconductor lasers, may be used.


The light source 40 may be focused by using an optional lens 41. The lens 41 may be any optical type lens, although a Fresnel lens is preferred. A light source positioner 42, preferably in the form of a controllable motor, is connected to the light source 40 to allow movement of the light source 40. The positioner 42 is adapted to move the light source 40 based on a height of an object to be scanned, such that the focused illumination stripe 44, 44′ is located on the surface of the object. The object height may be determined by a range finder or other means.
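One straightforward way to realize this positioner control is a calibration table that maps object height (as reported by a range finder) to a motor setpoint placing the focused illumination stripe 44 on the object's top surface. The calibration points, units, and interpolation scheme below are hypothetical and serve only to illustrate the idea.

```python
import bisect

# Hypothetical calibration: (object height in mm, positioner setpoint in motor steps),
# established empirically so the illumination stripe falls on the camera lens axis
# at the corresponding object height.
CALIBRATION = [(0, 0), (200, 1500), (400, 3100), (600, 4800), (800, 6600)]

def positioner_setpoint(object_height_mm: float) -> float:
    """Linearly interpolate the motor setpoint for a given object height."""
    heights = [h for h, _ in CALIBRATION]
    if object_height_mm <= heights[0]:
        return CALIBRATION[0][1]
    if object_height_mm >= heights[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_right(heights, object_height_mm)
    (h0, s0), (h1, s1) = CALIBRATION[i - 1], CALIBRATION[i]
    return s0 + (s1 - s0) * (object_height_mm - h0) / (h1 - h0)
```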


As shown schematically in FIG. 5, the position of the off axis light source 40 is infinitely variable. Accordingly, the illumination stripe 44, 44′, 44″ can be shifted to multiple positions depending on the required depth of field along the axis 43.


Referring to FIG. 6, a third embodiment of the invention is shown which includes multiple arrays of light sources 51 located on one or more circuit boards 52 placed off-axis to the lens 53 and the linear array sensor. A range finder 50 is connected to the arrays of light sources 51. The range finder 50 determines the distance between the camera and the target object. The distance data is sent to a controller, which then powers on or off selected arrays of light sources 51, each focused to a corresponding depth of field 55, 55′, 55″, 55″′, providing an illumination stripe 56, 56′, 56″, 56″′ coplanar to the camera lens axis 57. The camera 53 and lens 54 detect the reflected light from the illumination stripe to read the required data from the object. Alternatively, all of the light sources 51 may be activated to provide the desired illumination stripe at any depth of field, eliminating the need to determine the distance to the target object.
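The array-selection logic can be sketched as follows: each fixed array of light sources is associated with the depth-of-field band it is focused on, and the controller energizes the array whose band contains the distance reported by the range finder, or every array when no range data is available. The band limits and the interface are hypothetical, for illustration only.

```python
from typing import Optional

# Hypothetical depth-of-field bands (mm from the camera) for each fixed array of
# light sources 51; each band corresponds to one illumination stripe 56, 56', ...
BANDS = [
    (300, 600),    # array 0
    (600, 900),    # array 1
    (900, 1200),   # array 2
    (1200, 1500),  # array 3
]

def select_arrays(distance_mm: Optional[float]) -> list:
    """Return one enable flag per array.  With no range data, enable every array
    so an illumination stripe is present at every depth of field."""
    if distance_mm is None:
        return [True] * len(BANDS)
    return [near <= distance_mm < far for near, far in BANDS]
```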


Referring now to FIGS. 7-12, a fourth embodiment of a system 110 for scanning an object 105 in an object scanning area on a support surface 107 is shown. The support surface 107 is preferably in the form of a conveyor or other moving surface upon which objects are carried. The system 110 includes at least two linear array sensors 114, 115 to detect light input signals. A sensor lens 112, 113 is optically connected to each of the at least two linear array sensors 114, 115, with each of the lenses 112, 113 being adapted to receive and transmit an optical image located in a respective optical field of view 124, 125 to the respective one of the at least two linear array sensors 114, 115. A light source 111, similar to the light source 11 described above, is also provided, and is preferably in the form of an array of LEDs or an array of semiconductor lasers, as shown in FIGS. 9 and 10. The arrays are preferably linear and are directed toward a lens 118, which is preferably in the form of a cylindrical lens or Fresnel lens, such as described above in connection with lens 18. The light source 111 in connection with the lens 118 produces an illumination plane 134 that has a height (h) that extends over a depth of field 132 and a width (w) that extends across the support surface 107 so that an illumination stripe is formed on a surface of the object 105 in the system field of view 130. The illumination plane 134, indicated by the two lines shown, and the system field of view 130 are generally coplanar over the depth of field 132 in the object scanning area.


As shown most clearly in FIG. 8, the illumination plane 134 has a tapering thickness that extends from a greatest thickness, adjacent to the lens 118, to a narrowest thickness, adjacent to the support surface 107. This taper will depend upon the focal length of the lens 118, but generally produces a high enough intensity illumination plane across the entire depth of field (h) so that the reflected optical image can be transmitted back to a respective one of the sensor lenses 112, 113.


Based upon an offset distance from the support surface 107 to the linear array sensors 114, 115, the system field of view 130 has a generally uniform resolution across the depth of field 132. This is in contrast to the previously described embodiments of the invention, in which there is a more pronounced change in resolution between the shortest throw distance, from the linear array sensors 114, 115 to a surface of an object 105 to be scanned having a height of about h, and the longest throw distance, to a short object. This is a function of the angle between the support surface 107 and the lines defining the respective fields of view 124 and 125 of the linear array sensors 114, 115. The closer the lines defining the fields of view 124, 125 come to vertical, the more uniform the resolution across the depth of field, generally following a sine function of the angle. This has a practical limit based upon the height of the system 110 above the support surface 107 and the number of linear array sensors 114, 115 which can be utilized.


A benefit of the system 110 is that the system field of view 130 has an effective width factor (ew) that is greater than that for a single sensor system. With reference to FIG. 7, for the system according to the invention, ew > (w−s)h, where s is the offset distance at the height h for a single field of view system, as represented schematically in FIG. 7. Utilizing the present embodiment of the invention with at least two linear array sensors provides an offset distance s2, as shown in FIG. 7, which results in an effective width factor ew = (w−s2)h. In a preferred embodiment, s2 < 0.8 s, and more preferably is less than 0.7 s, resulting in a greater effective width for scanning objects carried along the support surface 107.
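As a worked illustration of this comparison (the numeric values below are hypothetical and chosen only to make the relation concrete):

```latex
ew_{1} = (w - s)\,h \qquad ew_{2} = (w - s_{2})\,h \qquad s_{2} < 0.8\,s \;\Rightarrow\; ew_{2} > ew_{1}
```

For example, with w = 40 in, h = 36 in, s = 10 in, and s2 = 0.7 s = 7 in, the single-sensor factor is ew1 = 30 × 36 = 1080, while the two-sensor factor is ew2 = 33 × 36 = 1188, roughly a 10% larger effective width factor.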


A further benefit of the system 110 is the ability to independently focus each lens 112, 113 and sensor 114, 115 on a different throw distance. Independent focus provides optimum focus on each surface where a single item 105 has two or more surfaces that are at different heights from the support surface 107 or where two or more items are present that have surfaces at different heights from the support surface 107.


In the preferred embodiment, the linear array sensors are CMOS image sensors and the light source lens 118 has a plurality of openings 120, 121, as best shown in FIG. 9, to allow reflected light from a surface of the object 105 to return to the at least two linear array sensors 114, 115 without being affected by the light source lens 118. The two linear array sensors may also comprise CCD image sensors, as noted above.


Preferably, the system 110 has the linear array of LEDs or semiconductor lasers, the axis of the light source lens 118, and the illumination plane 134 located coplanar with one another. Additionally, the at least two linear array sensors 114, 115 are preferably coplanar with the linear array of LEDs or semiconductor lasers, the axis of the light source lens 118, and the illumination plane 134. While this is preferred, those skilled in the art will recognize that the critical aspect of the invention is providing the system optical field of view 130 in a generally coplanar location with the illumination plane 134 over the entire depth of field 132.


The use of at least two linear array sensors requires some overlap x between the two fields of view 124, 125 so that barcodes or other labels of known size can be read entirely by one of the linear array sensors 114, 115, without the need for advanced logic for combining partial codes read by different sensors. In a preferred embodiment, x equals approximately three inches, and the controller for the linear sensor arrays 114, 115 is preferably set to discriminate so that only a single reading of one label is taken in the event that the entire label falls within both fields of view 124, 125 of the individual linear sensor arrays 114, 115. However, in some applications, multiple readings are permitted and are passed on to a system controller for further evaluation in connection with the dimensioning and/or other data relating to the object 105 on the support surface 107.
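The single-reading discrimination can be illustrated with a short deduplication sketch: reads from the two sensors are matched by decoded value and approximate cross-belt position, and only one reading per label is passed on. The tolerance value, the data structure, and the matching rule below are illustrative assumptions, not the disclosed control logic.

```python
from typing import List, NamedTuple

class LabelRead(NamedTuple):
    sensor_id: int        # which linear array sensor (0 or 1) decoded the label
    decoded_value: str    # barcode / label contents
    cross_belt_in: float  # estimated position across the support surface, in inches

def discriminate(reads: List[LabelRead], overlap_in: float = 3.0) -> List[LabelRead]:
    """Keep a single reading per label when the same label was decoded by both
    sensors inside the overlap region of the two fields of view."""
    kept: List[LabelRead] = []
    for read in reads:
        duplicate = any(
            r.decoded_value == read.decoded_value
            and r.sensor_id != read.sensor_id
            and abs(r.cross_belt_in - read.cross_belt_in) <= overlap_in
            for r in kept
        )
        if not duplicate:
            kept.append(read)
    return kept
```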


The system 110 can also be used in connection with mass flow conveyors where objects are side by side. In this case, the cameras are independently focused and there is significant overlap of the two fields of view 124, 125 of the sensor arrays 114, 115 so that the two fields of view each cover substantially the entire belt, less a width of the narrowest object.


The system 110 can also be used in connection with scanning irregular shaped objects having varying heights. In this case, the cameras are again focused independently and there is again a significant overlap of the two fields of view 124, 125 of the sensor arrays 114, 115 so that the two fields of view each cover substantially the entire belt. This provides a higher performance system with a greater read rate.


The invention thus allows coverage over a wider support surface and/or a higher density read by the linear array sensors 114, 115. Additionally, the use of at least two linear array sensors 114, 115 results in more uniform resolution and less image distortion over a height h of the depth of field 132.


Preferably all of the system components described above are packaged inside a read head assembly 150 which includes camera modules that house the linear array sensors 114, 115, an illumination module that includes the light source 111 in the form of LEDs or semiconductor lasers with the focusing lens 118, and a controller for operating the sensors 114, 115 and the light source 111. These are preferably mounted in a housing 152 which can be constructed using any conventional means, but is preferably made of sheet metal, polymeric materials, or other suitable materials.


In the preferred embodiment, the lenses 112, 113 have a fixed focal length; however, it is also possible to provide an adjustable focal length lens for the linear sensor arrays 114, 115, in the same manner as described above in connection with the prior embodiments of the invention.


While the preferred embodiments of the invention have been described in detail, the invention is not limited to the specific embodiments described above, which should be considered exemplary. Further, modifications and extensions of the present invention may be developed based upon the foregoing, and all such modifications are deemed to be within the scope of the present invention as defined by the appended claims.

Claims
  • 1. A fixed-position scanning system for scanning objects in an object scanning area on a moving conveyor, the scanning system comprising: a linear array sensor adapted to detect light input signals and that is disposed above the moving conveyor that carries objects; a first lens optically connected to the linear array sensor and adapted to receive and transmit an optical image located in a field of view of the first lens to the linear array sensor; a light source comprising light emitting diodes; a second lens positioned between the light source and the moving conveyor, wherein the second lens is adapted to provide an illumination plane from light produced by the light source; a range finder; and a focusing device and an actuator in operative communication with the focusing device to adjust a focus of an image of a said object from the first lens onto the linear array sensor in response to signals from the range finder, wherein the illumination plane is coplanar with the field of view over at least a portion of a depth of field of the object scanning area.
  • 2. The system as in claim 1, wherein the light source comprises a respective at least one linear array of said light emitting diodes on opposite sides of the first lens, wherein the linear arrays collectively comprise at least fifty said light emitting diodes.
  • 3. The system as in claim 1, wherein the focusing device is attached to the first lens to move the first lens in response to the actuator.
  • 4. The system as in claim 1, wherein the range finder is configured to determine a distance between the scanning system and a target object on the conveyor.
  • 5. The system as in claim 1, wherein the focusing device comprises a focusing ring attached to the first lens, wherein the actuator moves the focusing ring to thereby move the first lens and adjust a focus of an image from the first lens onto the linear array sensor.
  • 6. The system as in claim 1, wherein the light emitting diodes are high intensity light emitting diodes.
  • 7. The system as in claim 6, wherein the light emitting diodes are red.
  • 8. The system as in claim 6, wherein the light source comprises at least fifty of said high intensity light emitting diodes.
  • 9. The system as in claim 6, wherein the light source comprises at least one hundred of said high intensity light emitting diodes.
  • 10. The system as in claim 6, wherein the light emitting diodes are red.
  • 11. The system as in claim 6, wherein the second lens is a cylindrical lens.
  • 12. The system as in claim 6, wherein the second lens includes at least one slit to allow light reflected from a surface of an object to return to the linear array sensor.
  • 13. The system as in claim 6, wherein the illumination plane is coplanar with the field of view over the depth of field.
  • 14. The system as in claim 1, wherein the light source has a number of said light emitting diodes based upon the depth of field.
  • 15. The system as in claim 1, wherein the light source has a number of said light emitting diodes based upon a size of the conveyor.
  • 16. The system as in claim 1, wherein the light source has a number of said light emitting diodes based upon the depth of field and a size of the conveyor.
  • 17. The system as in claim 1, wherein the second lens is a cylindrical lens.
  • 18. The system as in claim 17, wherein the cylindrical lens is a Fresnel lens.
  • 19. The system as in claim 1, wherein the second lens includes at least one slit to allow light reflected from a surface of an object to return to the linear array sensor.
  • 20. The system as in claim 1, wherein the linear array sensor is a CCD image sensor.
  • 21. The system as in claim 1, wherein the linear array sensor is a CMOS image sensor.
  • 22. The system as in claim 1, wherein the illumination plane is coplanar with the field of view over the depth of field.
  • 23. The system as in claim 1, further comprising a focusing mechanism in operative communication with at least one of the second lens and the light source to adjust the relative position of the second lens and the light source.
  • 24. A fixed-position scanning system for scanning objects in an object scanning area on a conveyor belt, the scanning system comprising: a linear array sensor adapted to detect light input signals; a first lens optically connected to the linear array sensor and adapted to receive and transmit an optical image located in a field of view of the first lens to the linear array sensor; a light source comprising a respective at least one linear array of light emitting diodes on opposite sides of the first lens, wherein the linear arrays collectively comprise at least fifty said light emitting diodes; and a second lens positioned between the light source and the conveyor belt, wherein the second lens is adapted to provide an illumination plane from light produced by the light source, wherein the illumination plane is coplanar with the field of view over at least a portion of a depth of field of the object scanning area.
  • 25. The system as in claim 24, wherein the linear arrays collectively comprise at least one hundred said light emitting diodes.
  • 26. The system as in claim 25, wherein the illumination plane is coplanar with the field of view over the depth of field.
  • 27. The system as in claim 24, wherein the light emitting diodes are high intensity light emitting diodes.
  • 28. The system as in claim 27, comprising a focusing device and an actuator in operative communication with the focusing device to adjust a focus of an image from the first lens onto the linear array sensor.
  • 29. The system as in claim 28, wherein the focusing device is attached to the first lens to move the first lens in response to the actuator.
  • 30. The system as in claim 27, wherein the light emitting diodes are red.
  • 31. The system as in claim 24, wherein the light source has a number of said light emitting diodes based upon the depth of field.
  • 32. The system as in claim 24, wherein the light source has a number of said light emitting diodes based upon a size of the conveyor belt.
  • 33. The system as in claim 24, wherein the second lens is a cylindrical lens.
  • 34. The system as in claim 24, wherein the linear array sensor is a CCD image sensor.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 12/468,050 (filed on May 18, 2009, and issued as U.S. Pat. No. 8,004,604 on Aug. 23, 2011), which is a continuation of U.S. patent application Ser. No. 11/165,602 (filed on Jun. 23, 2005, and issued as U.S. Pat. No. 7,548,274 on Jun. 16, 2009), which is a continuation of U.S. patent application Ser. No. 10/982,820 (filed on Nov. 5, 2004, and issued as U.S. Pat. No. 6,912,076 on Jun. 28, 2005), which is a continuation-in-part of U.S. patent application Ser. No. 10/676,834 (filed on Sep. 30, 2003, and issued as U.S. Pat. No. 6,856,440 on Feb. 15, 2005), which is a continuation of U.S. patent application Ser. No. 09/810,204 (filed on Mar. 16, 2001, and issued as U.S. Pat. No. 6,628,445 on Sep. 30, 2003), which claims the benefit of U.S. patent application No. 60/190,273 (filed on Mar. 17, 2000). The entire disclosure of each of the foregoing applications is hereby incorporated by reference as if set forth verbatim herein and relied upon for all purposes.

US Referenced Citations (114)
Number Name Date Kind
3541310 Stites Nov 1970 A
3812459 MacNeill et al. May 1974 A
4011435 Phelps et al. Mar 1977 A
4095095 Muraoka et al. Jun 1978 A
4335302 Robillard Jun 1982 A
4427286 Bosse Jan 1984 A
4652730 Marshall Mar 1987 A
4758716 Mayer et al. Jul 1988 A
4900907 Matusima et al. Feb 1990 A
5063460 Mutze et al. Nov 1991 A
5115121 Bianco et al. May 1992 A
5151581 Kricheuer et al. Sep 1992 A
5155344 Fardeau et al. Oct 1992 A
5280161 Niwa Jan 1994 A
5298727 Spratte et al. Mar 1994 A
5331176 Sant'Anselmo et al. Jul 1994 A
5354977 Roustaei Oct 1994 A
5359185 Hanson Oct 1994 A
5365049 Peng Nov 1994 A
5397855 Ferlier Mar 1995 A
5430286 Hammond, Jr. et al. Jul 1995 A
5450291 Kumagai Sep 1995 A
5468950 Hanson Nov 1995 A
5486688 Iima et al. Jan 1996 A
5532467 Roustaei Jul 1996 A
5578813 Allen et al. Nov 1996 A
5585616 Roxby et al. Dec 1996 A
5600121 Kahn et al. Feb 1997 A
5608202 Bridgelall et al. Mar 1997 A
5623137 Powers et al. Apr 1997 A
5646390 Wang et al. Jul 1997 A
5684290 Arackellian et al. Nov 1997 A
5686720 Tullis Nov 1997 A
5693929 Dvorkis et al. Dec 1997 A
5693930 Katoh et al. Dec 1997 A
5710417 Joseph et al. Jan 1998 A
5744815 Gurevich et al. Apr 1998 A
5747796 Heard et al. May 1998 A
5750974 Sasaki et al. May 1998 A
5754670 Shin et al. May 1998 A
5780831 Seo et al. Jul 1998 A
5786582 Roustaei et al. Jul 1998 A
5793033 Feng et al. Aug 1998 A
5798516 Shreesha Aug 1998 A
5814803 Olmstead et al. Sep 1998 A
5818028 Meyerson et al. Oct 1998 A
5834752 Kumakura Nov 1998 A
5847859 Murata Dec 1998 A
5852287 Taniguchi et al. Dec 1998 A
5852288 Nakazawa et al. Dec 1998 A
5859418 Li et al. Jan 1999 A
5859419 Wynn Jan 1999 A
5867522 Green et al. Feb 1999 A
5886338 Arackellian et al. Mar 1999 A
5894348 Bacchi et al. Apr 1999 A
5900619 Honda et al. May 1999 A
5912452 Wiklof et al. Jun 1999 A
5914477 Wang Jun 1999 A
5925871 Knowles et al. Jul 1999 A
5971276 Sano et al. Oct 1999 A
5984188 Dvorkis et al. Nov 1999 A
5986745 Hermary et al. Nov 1999 A
5988506 Schaham et al. Nov 1999 A
5992753 Xu Nov 1999 A
5992757 Smith et al. Nov 1999 A
RE36528 Roustaei Jan 2000 E
6039254 Froese-Peeck et al. Mar 2000 A
6039255 Seo Mar 2000 A
6053409 Brobst et al. Apr 2000 A
6057952 Kubo et al. May 2000 A
6075883 Stern et al. Jun 2000 A
6105869 Scharf et al. Aug 2000 A
6119939 Schwartz et al. Sep 2000 A
6129280 DeRenzis et al. Oct 2000 A
6135252 Knotts Oct 2000 A
6149062 Danielson et al. Nov 2000 A
6154260 Matsuda et al. Nov 2000 A
6164544 Schwartz et al. Dec 2000 A
6177999 Wurz et al. Jan 2001 B1
6179208 Feng Jan 2001 B1
6185030 Overbeck Feb 2001 B1
6194697 Gardner, Jr. Feb 2001 B1
6195202 Kusunose Feb 2001 B1
6211989 Wulf et al. Apr 2001 B1
6223988 Batterman et al. May 2001 B1
6239904 Serfling et al. May 2001 B1
6246446 Heimbuch et al. Jun 2001 B1
6260763 Svetal Jul 2001 B1
6290132 Dickson et al. Sep 2001 B1
6310710 Shahar et al. Oct 2001 B1
6338433 Drexler Jan 2002 B1
6360947 Knowles et al. Mar 2002 B1
6371374 Schwartz et al. Apr 2002 B1
6388788 Harris et al. May 2002 B1
6433907 Lippert et al. Aug 2002 B1
6452710 Hiraga et al. Sep 2002 B1
6462880 Ohkawa et al. Oct 2002 B1
6480323 Messner et al. Nov 2002 B1
RE38195 Sakai et al. Jul 2003 E
6603874 Stern et al. Aug 2003 B1
6607128 Schwartz et al. Aug 2003 B1
6628445 Chaleff et al. Sep 2003 B2
6629641 Tsikos et al. Oct 2003 B2
6633423 Ishibe Oct 2003 B2
6732929 Good et al. May 2004 B2
6739511 Tsikos et al. May 2004 B2
6795221 Urey Sep 2004 B1
6830189 Tsikos et al. Dec 2004 B2
8004604 Chaleff et al. Aug 2011 B2
20010038037 Bridgelall et al. Nov 2001 A1
20010055422 Roustaei Dec 2001 A1
20030156303 Schnee et al. Aug 2003 A1
20090095047 Patel et al. Apr 2009 A1
20100085567 Dottery et al. Apr 2010 A1
Foreign Referenced Citations (2)
Number Date Country
0751669 Jan 1997 EP
0887676 Dec 1998 EP
Non-Patent Literature Citations (9)
Entry
Patent Abstracts of Japan, vol. 007, No. 065 (E-165), for JP57-211869, Mar. 18, 1983.
“Press Release for Accu-Sort Model AV3700” Jul. 12, 1999.
International Search Report and Written Opinion dated Jan. 29, 2002, for corresponding PCT Application No. PCT/US01/08475.
International Preliminary Examination Report dated May 21, 2002, for corresponding PCT Application No. PCT/US01/08475.
Applicant request for Accelerated Prosecution and Specification Amendments dated Mar. 30, 2005, for corresponding European Application No. 01920442.9.
Official Communication from European Patent Office for corresponding European Application No. 01920442.9, dated Oct. 4, 2005.
Response to Official Communication from European Patent Office for corresponding European Application No. 01920442.9, dated Oct. 4, 2005.
Official Communication from European Patent Office for corresponding European Application No. 01920442.9, dated Apr. 24, 2012.
Response to Official Communication from European Patent Office for corresponding European Application No. 01920442.9, dated Apr. 24, 2012.
Related Publications (1)
Number Date Country
20110279672 A1 Nov 2011 US
Provisional Applications (1)
Number Date Country
60190273 Mar 2000 US
Continuations (4)
Number Date Country
Parent 12468050 May 2009 US
Child 13188858 US
Parent 11165602 Jun 2005 US
Child 12468050 US
Parent 10982820 Nov 2004 US
Child 11165602 US
Parent 09810204 Mar 2001 US
Child 10676834 US
Continuation in Parts (1)
Number Date Country
Parent 10676834 Sep 2003 US
Child 10982820 US