The present invention relates generally to optical scanning systems. More particularly, this invention relates to a scanning system containing a camera using a coplanar light source.
Various optical scanning systems have been developed for reading and decoding coded symbologies, identification of objects, comparison of objects, and measurement of objects. Each of these scanning systems utilizes either a non-coherent or coherent light source. Lighting is one of the key elements in obtaining good image quality. The intensity of light needed for scanning is directly proportional to the transport speed of the scanned object and the speed of the sensor. Generally, the faster an image is to be acquired, the more light is needed. Until now, only high intensity sodium or halogen lighting was adequate to obtain crisp images in cameras that focus over a significant depth of field at high speeds. The light source is usually located off axis from the camera and sensor detecting the light reflected from the object being scanned.
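To make this proportionality concrete, the following sketch works through a simplified exposure calculation in Python; the belt speed, pixel size, and irradiance figures are assumed example values rather than parameters of any particular scanning system.

```python
# Illustrative only: simplified exposure arithmetic showing why the required
# light intensity grows with transport speed and sensor line rate. All numbers
# are assumed example values.

def line_rate_hz(belt_speed_mm_s: float, object_pixel_mm: float) -> float:
    """Scan lines per second needed so each line covers one pixel of object surface."""
    return belt_speed_mm_s / object_pixel_mm


def required_irradiance(belt_speed_mm_s: float, object_pixel_mm: float,
                        reference_irradiance: float,
                        reference_speed_mm_s: float) -> float:
    """Irradiance needed to keep the per-line exposure equal to a reference case.

    Exposure is irradiance multiplied by integration time; the integration time
    shrinks as the line rate rises, so irradiance must grow in proportion to speed.
    """
    speedup = (line_rate_hz(belt_speed_mm_s, object_pixel_mm)
               / line_rate_hz(reference_speed_mm_s, object_pixel_mm))
    return reference_irradiance * speedup


if __name__ == "__main__":
    # Doubling the belt speed from 1 m/s to 2 m/s doubles the required light.
    print(required_irradiance(2000.0, 0.25,
                              reference_irradiance=1.0,
                              reference_speed_mm_s=1000.0))  # -> 2.0
```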
In applications using sodium lamps as a light source, the lamps are used to provide the illumination required by the camera detection means. These lamps provide an abundance of optical power because they are very bright and have a wide spectral range. There are, however, several disadvantages to sodium lamp light sources. First, due to their extreme brightness, sodium lamps can create an annoyance and a possible hazard to workers in the vicinity of the scanning systems. Second, sodium lights require a large amount of AC power, thus increasing production costs. Third, these light sources create a large amount of heat. Additionally, radio frequency interference can be created, which can present operational problems for equipment in the vicinity of the scanning system.
The use of light sources such as LEDs presents several advantages over sodium and halogen lighting. LED illumination is a more cost-effective and ergonomic method of illumination. The problem presented by LED illumination is how to get enough light to the object that is being imaged when focusing over a large depth of field. By eliminating the mounting angle between the light source and the line of sight of the camera lens, the reflected light is managed and a lower-intensity light source may be used. Because LEDs can be energized almost instantaneously, they can be de-energized when objects are not being transported within the field of view. This extends the life of the LEDs and also conserves power. Additionally, the power input to individual LEDs may be modulated and pinpointed to a desired area, such that different LEDs within an LED array may be energized at different levels according to the desired application.
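As a rough illustration of this gating and per-LED modulation, the sketch below models a hypothetical LED array driver; the class, its interface, and the drive levels are assumptions made for illustration and do not describe any particular hardware.

```python
# Illustrative sketch of per-LED drive control for an LED array light source.
# The LedArrayDriver class and its interface are hypothetical; a real driver
# would set PWM duty cycles or drive currents through hardware registers.

class LedArrayDriver:
    def __init__(self, num_leds: int):
        self.levels = [0.0] * num_leds  # 0.0 = off, 1.0 = full power

    def energize_region(self, start: int, end: int, level: float) -> None:
        """Drive only the LEDs that illuminate the region of interest."""
        for i in range(start, end):
            self.levels[i] = max(0.0, min(1.0, level))

    def all_off(self) -> None:
        """De-energize the array when no object is in the field of view,
        extending LED life and conserving power."""
        self.levels = [0.0] * len(self.levels)


if __name__ == "__main__":
    driver = LedArrayDriver(num_leds=64)
    driver.energize_region(20, 44, level=0.8)  # brighter over the target area
    driver.energize_region(0, 20, level=0.2)   # dimmer elsewhere
    driver.all_off()                           # object has left the view
```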
There is a need for a coherent or non-coherent light source that provides sufficient optical illumination of an object to be scanned while using less energy and alleviating the potential problems of radio frequency interference and heat emission.
Briefly stated, the present invention provides an optical scanning system which uses a light source to provide an illumination stripe that is optically coplanar to a camera lens and light sensor for barcode reading applications. The light source may be coplanar to the lens axis and light sensor, and preferably is formed from LEDs or other low power consumption illumination sources. The coplanar design provides adequate illumination for a large depth of field at high speeds.
In another aspect, the invention provides a scanning system in which the light source is shifted relative to the line of sight of the camera such that the illumination stripe remains coplanar with the camera line of sight at the required depth of field. The light stripe profile coming from the array can therefore be narrow. The intensity of light required to illuminate an object over the depth of field is significantly reduced, thus allowing for the use of an LED array or other low power light source.
In another aspect, the invention provides a plurality of off-axis light sources to provide an illumination stripe on the object generally coplanar with the camera line of sight at the required depth of field. Different arrays of light sources are energized according to the depth of field of the target object, allowing adequate lighting over a range of distances.
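One way to picture this aspect is as a lookup from a measured throw distance to the off-axis array focused on that band of the depth of field; the band limits and array names in the sketch below are assumed example values, not a description of the actual system.

```python
# Illustrative only: choosing which off-axis LED array to energize so that the
# illumination stripe remains generally coplanar with the camera line of sight
# at the measured object distance. Band limits and names are assumed values.

ARRAY_BANDS = [
    (0.0, 300.0, "near_array"),     # throw distance band in mm -> array
    (300.0, 600.0, "mid_array"),
    (600.0, 1000.0, "far_array"),
]


def select_array(throw_distance_mm: float) -> str:
    """Return the array whose focused band covers the measured distance."""
    for low, high, name in ARRAY_BANDS:
        if low <= throw_distance_mm < high:
            return name
    raise ValueError("distance outside the supported depth of field")


if __name__ == "__main__":
    print(select_array(450.0))  # -> "mid_array"
```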
In another aspect, the present invention provides an optical scanning system which uses a light source to provide an illumination stripe that is coplanar to at least two lenses and light sensors for imaging applications. The light source is preferably optically coplanar to the axes of the lenses and light sensors, and preferably is formed from LEDs or other low power consumption illumination sources. The design provides broader imaging capability for wide width conveyors or higher density imaging, along with more uniform resolution of the scanned symbologies or images.
The present invention will be described with reference to the drawing figures wherein like numerals represent like elements throughout.
Referring to
The light source 11 is used to illuminate a surface of a target object, indicated by broken line 17. The emitted light illuminates the target object and is reflected back to the coplanar aligned sensor 14. The coplanar camera scanning system 10 is preferably used to read barcode information from the scanned object. The coplanar camera scanning system 10 preferably utilizes a CMOS linear array sensor 14 to detect the light reflected from the object being scanned. The first preferred embodiment is described with reference to a CMOS-based image sensor but, as those skilled in the art should know, any image sensor can be used, such as a CCD-based image sensor. The light reflected onto the CMOS linear array sensor 14 is generated in the preferred embodiment by very high intensity LEDs 11. The preferred embodiment of the present invention utilizes red LEDs within the array. As the technology regarding light sources advances, brighter, more intense LEDs can be used, including LEDs having different wavelengths. Low power semiconductor lasers can also be utilized.
The LED array 11 acts as the light source for the coplanar camera scanning system 10. As shown in
The light emitted from the light source 11 is focused to a narrow “stripe” on the object using a cylindrical lens 18. This cylindrical lens 18 is positioned parallel to and in between the light source 11 and the target object. In the present preferred embodiment a Fresnel lens is used, but as those skilled in the art should realize, any optical lens can be used in this application. As shown in
As shown in
In order to maximize the depth of field of the coplanar camera scanning system 10, the voice coil actuator 16 is coupled to the focusing ring 13 of the imaging lens 12 to dynamically focus the image onto the CMOS linear array sensor 14, based on a signal from a range finder 24. Those skilled in the art should recognize that there are many methods and apparatuses that can be used as range finders and for focusing. The signal received from the range finder 24 causes the voice coil actuator 16 to move the camera lens 12 and focus the light reflected from the object onto the linear array sensor 14.
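The interplay between the range finder 24 and the voice coil actuator 16 can be pictured with the short sketch below, which uses the thin-lens relation to estimate the lens displacement for a measured range; the focal length and nominal focus position are illustrative assumptions, since no particular focusing algorithm is specified above.

```python
# Illustrative sketch of range-finder-driven focusing. The thin-lens mapping
# and the example values are assumptions for illustration only; they do not
# represent the actual control scheme of the voice coil actuator 16.

def image_distance_mm(focal_length_mm: float, object_distance_mm: float) -> float:
    """Thin-lens relation 1/f = 1/do + 1/di, solved for the image distance di."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)


def focus_step_mm(range_mm: float, focal_length_mm: float,
                  nominal_image_distance_mm: float) -> float:
    """Lens displacement the actuator must apply to focus at the measured range."""
    return image_distance_mm(focal_length_mm, range_mm) - nominal_image_distance_mm


if __name__ == "__main__":
    # Example: a 50 mm lens nominally focused at infinity (image plane at 50 mm).
    # An object measured at 800 mm calls for pushing the lens about 3.3 mm outward.
    print(round(focus_step_mm(800.0, 50.0, 50.0), 2))  # -> 3.33
```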
Optionally, the invention may include a focusing mechanism 26 for the light source to more accurately focus the emitted light onto a scanned object. This enhances the image which is received by the camera lens 12 and projected onto the CMOS linear array sensor 14. The focusing mechanism 26 is coupled to the light source 11, and dynamically moves the position of the lens 18 with respect to the position of the light source 11. It should be noted that either the focusing mechanism 26 or the light source 11, or both, may be moved to focus the light. Such movement, of course, depends on the distance of the object from the coplanar camera scanning system 10. This alternative embodiment keeps the intensity of the illumination stripe maximized at any distance, providing a cleaner image for detection by the CMOS linear array sensor 14.
Referring to
The light source 40 may be focused by using an optional lens 41. The lens 41 may be any type of optical lens, although a Fresnel lens is preferred. A light source positioner 42, preferably in the form of a controllable motor, is connected to the light source 40 to allow movement of the light source 40. The positioner 42 is adapted to move the light source 40 based on a height of an object to be scanned, such that the focused illumination stripe 44, 44′ is located on the surface of the object. The object height may be determined by a range finder or other means.
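The amount of translation the positioner 42 might apply can be illustrated with a small geometric sketch; the mounting height, aim angle, and example heights below are assumed values for illustration rather than dimensions of the described system.

```python
# Illustrative geometry only: how far an off-axis light source must be shifted
# so that its focused stripe falls in the camera viewing plane at the top
# surface of an object. Mounting height and aim angle are assumed examples.

import math


def source_offset_mm(mount_height_mm: float, object_height_mm: float,
                     aim_angle_deg: float) -> float:
    """Lateral offset from the camera viewing plane at which a source, aimed
    aim_angle_deg from vertical, places its stripe on the object's top surface."""
    drop = mount_height_mm - object_height_mm  # vertical run down to the surface
    return drop * math.tan(math.radians(aim_angle_deg))


def reposition_mm(mount_height_mm: float, old_height_mm: float,
                  new_height_mm: float, aim_angle_deg: float) -> float:
    """Translation the positioner must apply when the object height changes."""
    return (source_offset_mm(mount_height_mm, new_height_mm, aim_angle_deg)
            - source_offset_mm(mount_height_mm, old_height_mm, aim_angle_deg))


if __name__ == "__main__":
    # A taller object (300 mm vs 100 mm) under a source mounted 1 m above the
    # surface, aimed 15 degrees off vertical, calls for shifting the source
    # roughly 54 mm toward the viewing plane (negative value).
    print(round(reposition_mm(1000.0, 100.0, 300.0, 15.0), 1))  # -> -53.6
```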
As shown schematically in
Referring to
Referring now to
As shown most clearly in
Based upon an offset distance from the support surface 107 to the linear array sensors 114, 115, the system field of view 130 has a generally uniform resolution across the depth of field 132. This is in contrast to the previously described embodiments of the invention, where there is a more pronounced change in resolution between the shortest throw distance, from the linear array sensors 114, 115 to the surface of an object 105 to be scanned having a height of about h, and the longest throw distance, to a short object. This is a function of the angle between the support surface 107 and the lines defining the respective fields of view 124 and 125 of the linear array sensors 114, 115. The closer that the lines defining the fields of view 124, 125 come to vertical, the more uniform the resolution across the depth of field, generally following a sine function of the angle. This has a practical limit based upon a height for the system 110 above the support surface 107 and the number of linear array sensors 114, 115 which can be utilized.
A benefit of the system 110 is that the system field of view 130 has an effective width factor (ew) that is greater than that for a single sensor system. Still with reference to
A further benefit of the system 110 is the ability to independently focus each lens 112, 113 and sensor 114, 115 on a different throw distance. Independent focus provides optimum focus on each surface where a single item 105 has two or more surfaces that are at different heights from the support surface 107 or where two or more items are present that have surfaces at different heights from the support surface 107.
In the preferred embodiment, the linear array sensors are CMOS image sensors and the light source lens 118 has a plurality of openings 120, 121, as best shown in
Preferably, the system 110 has the linear array of LEDs or semiconductor lasers, as well as an axis of the light source lens 118 and the illumination plane 134 located coplanar with one another. Additionally, preferably the at least two linear array sensors 114, 115 are coplanar with the linear array of LEDs or semiconductor lasers as well as the axis of the light source lens 118 and the illumination plane 134. While this is preferred, those skilled in the art will recognize that the critical aspect of the invention is providing the system optical field of view 130 in a generally coplanar location with the illumination plane 134 over the entire depth of field 132.
The use of at least two linear array sensors requires some overlap x between the two fields of view 124, 125 so that barcodes or other labels of known size can be read entirely by one of the linear array sensors 114, 115, without the need for advanced logic for combining partial codes read by different sensors. In a preferred embodiment, x equals approximately three inches, and the controller for the linear sensor arrays 114, 115 is preferably set to discriminate so that only a single reading of one label is taken in the event that the entire label falls within both fields of view 124, 125 of the individual linear sensor arrays 114, 115. However, in some applications, multiple readings are permitted and are passed on to a system controller for further evaluation in connection with the dimensioning and/or other data relating to the object 105 on the support surface 107.
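The single-reading discrimination can be sketched as simple logic that suppresses the second sensor's copy of a label lying entirely within the overlap region; the data structures, matching rule, and coordinates below are illustrative assumptions, not the actual controller implementation.

```python
# Illustrative sketch of discriminating duplicate label reads in the overlap
# region shared by two linear array sensors. The Read structure and matching
# rule are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class Read:
    sensor: int      # 0 or 1
    code: str        # decoded barcode value
    x_left: float    # label extent across the belt, in system coordinates (inches)
    x_right: float


def in_overlap(read: Read, overlap_left: float, overlap_right: float) -> bool:
    """True when the entire label lies inside the shared overlap region."""
    return read.x_left >= overlap_left and read.x_right <= overlap_right


def discriminate(reads: list[Read], overlap_left: float,
                 overlap_right: float) -> list[Read]:
    """Keep a single reading per label when both sensors report it from the overlap."""
    kept: list[Read] = []
    seen: set[str] = set()
    for r in reads:
        if in_overlap(r, overlap_left, overlap_right):
            if r.code in seen:
                continue  # drop the second sensor's copy of the same label
            seen.add(r.code)
        kept.append(r)
    return kept


if __name__ == "__main__":
    reads = [Read(0, "012345678905", 10.2, 12.8),
             Read(1, "012345678905", 10.2, 12.8)]
    print(len(discriminate(reads, overlap_left=10.0, overlap_right=13.0)))  # -> 1
```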
The system 110 can also be used in connection with mass flow conveyors where objects are side by side. In this case, the cameras are independently focused and there is significant overlap of the two fields of view 124, 125 of the sensor arrays 114, 115 so that the two fields of view each cover substantially the entire belt, less a width of the narrowest object.
The system 110 can also be used in connection with scanning irregular shaped objects having varying heights. In this case, the cameras are again focused independently and there is again a significant overlap of the two fields of view 124, 125 of the sensor arrays 114, 115 so that the two fields of view each cover substantially the entire belt. This provides a higher performance system with a greater read rate.
The invention thus allows coverage over a wider support surface and/or a higher density read by the linear array sensors 114, 115. Additionally, the use of at least two linear array sensors 114, 115 results in more uniform resolution and less image distortion over a height h of the depth of field 132.
Preferably all of the system components described above are packaged inside a read head assembly 150 which includes camera modules that house the linear array sensors 114, 115, an illumination module that includes the light source 111 in the form of LEDs or semiconductor lasers with the focusing lens 118, and a controller for operating the sensors 114, 115 and the light source 111. These are preferably mounted in a housing 152 which can be constructed using any conventional means, but is preferably made of sheet metal or other metallic materials, or of polymeric materials.
In the preferred embodiment, the lenses 112, 113 have a fixed focal length; however, it is also possible to provide an adjustable focal length lens for the linear sensor arrays 114, 115, in the same manner as described above in connection with the prior embodiments of the invention.
While the preferred embodiments of the invention have been described in detail, the invention is not limited to the specific embodiments described above, which should be considered exemplary. Further, modifications and extensions of the present invention may be developed based upon the foregoing, and all such modifications are deemed to be within the scope of the present invention as defined by the appended claims.
This application is a continuation of U.S. application Ser. No. 10/982,820, filed Nov. 5, 2004, which is a continuation-in-part of U.S. application Ser. No. 10/676,834, filed Sep. 30, 2003, now U.S. Pat. No. 6,856,440, which is a continuation of U.S. application Ser. No. 09/810,204, filed Mar. 16, 2001, now U.S. Pat. No. 6,628,445, which claims the benefit of U.S. Provisional Application No. 60/190,273, filed Mar. 17, 2000.
Number | Name | Date | Kind |
---|---|---|---|
3541310 | F.H. Stites | Nov 1970 | A |
3812459 | MacNeill et al. | May 1974 | A |
4011435 | Phelps et al. | Mar 1977 | A |
4095095 | Muraoka et al. | Jun 1978 | A |
4335302 | Robillard | Jun 1982 | A |
4427286 | Bosse | Jan 1984 | A |
4652730 | Marshall | Mar 1987 | A |
4758716 | Mayer et al. | Jul 1988 | A |
4900907 | Matusima et al. | Feb 1990 | A |
5063460 | Mutze et al. | Nov 1991 | A |
5115121 | Bianco et al. | May 1992 | A |
5151581 | Kricheuer et al. | Sep 1992 | A |
5155344 | Fardeau et al. | Oct 1992 | A |
5280161 | Niwa | Jan 1994 | A |
5298727 | Spratte et al. | Mar 1994 | A |
5331176 | Sant'Anselmo et al. | Jul 1994 | A |
5397855 | Ferlier | Mar 1995 | A |
5430286 | Hammond, Jr. et al. | Jul 1995 | A |
5450291 | Kumagai | Sep 1995 | A |
5468950 | Hanson | Nov 1995 | A |
5486688 | Iima et al. | Jan 1996 | A |
5532467 | Roustaei | Jul 1996 | A |
5578813 | Allen et al. | Nov 1996 | A |
5585616 | Roxby et al. | Dec 1996 | A |
5623137 | Powers et al. | Apr 1997 | A |
5646390 | Wang et al. | Jul 1997 | A |
5684290 | Arackellian et al. | Nov 1997 | A |
5686720 | Tullis | Nov 1997 | A |
5693929 | Dvorkis et al. | Dec 1997 | A |
5693930 | Katoh et al. | Dec 1997 | A |
5710417 | Joseph et al. | Jan 1998 | A |
5744815 | Gurevich et al. | Apr 1998 | A |
5747796 | Heard et al. | May 1998 | A |
5750974 | Sasaki et al. | May 1998 | A |
5754670 | Shin et al. | May 1998 | A |
5780831 | Seo et al. | Jul 1998 | A |
5786582 | Roustaei et al. | Jul 1998 | A |
5793033 | Feng et al. | Aug 1998 | A |
5798516 | Shreesha | Aug 1998 | A |
5814803 | Olmstead et al. | Sep 1998 | A |
5818028 | Meyerson et al. | Oct 1998 | A |
5834752 | Kumakura | Nov 1998 | A |
5847859 | Murata | Dec 1998 | A |
5852287 | Taniguchi et al. | Dec 1998 | A |
5852288 | Nakazawa et al. | Dec 1998 | A |
5859418 | Li et al. | Jan 1999 | A |
5859419 | Li et al. | Jan 1999 | A |
5867522 | Green et al. | Feb 1999 | A |
5886338 | Arackellian et al. | Mar 1999 | A |
5894348 | Bacchi et al. | Apr 1999 | A |
5900619 | Honda et al. | May 1999 | A |
5912452 | Wiklof et al. | Jun 1999 | A |
5914477 | Wang | Jun 1999 | A |
5925871 | Knowles et al. | Jul 1999 | A |
5971276 | Sano et al. | Oct 1999 | A |
5984188 | Dvorkis et al. | Nov 1999 | A |
5986745 | Hermary et al. | Nov 1999 | A |
5988506 | Schaham et al. | Nov 1999 | A |
5992753 | Xu | Nov 1999 | A |
5992757 | Smith et al. | Nov 1999 | A |
RE36528 | Roustaei | Jan 2000 | E |
6039254 | Froese-Peeck et al. | Mar 2000 | A |
6039255 | Seo | Mar 2000 | A |
6053409 | Brobst et al. | Apr 2000 | A |
6057952 | Kubo et al. | May 2000 | A |
6075883 | Stern et al. | Jun 2000 | A |
6105869 | Scharf et al. | Aug 2000 | A |
6119939 | Schwartz et al. | Sep 2000 | A |
6129280 | DeRenzis et al. | Oct 2000 | A |
6135252 | Knotts | Oct 2000 | A |
6154260 | Matsuda et al. | Nov 2000 | A |
6164544 | Schwartz et al. | Dec 2000 | A |
6177999 | Wurz et al. | Jan 2001 | B1 |
6179208 | Feng | Jan 2001 | B1 |
6185030 | Overbeck | Feb 2001 | B1 |
6194697 | Gardner, Jr. | Feb 2001 | B1 |
6195202 | Kusunose | Feb 2001 | B1 |
6211989 | Wulf et al. | Apr 2001 | B1 |
6223988 | Batterman et al. | May 2001 | B1 |
6239904 | Serfling et al. | May 2001 | B1 |
6246446 | Heimbuch et al. | Jun 2001 | B1 |
6260763 | Svetal | Jul 2001 | B1 |
6290132 | Dickson et al. | Sep 2001 | B1 |
6310710 | Shahar et al. | Oct 2001 | B1 |
6338433 | Drexler | Jan 2002 | B1 |
6360947 | Knowles et al. | Mar 2002 | B1 |
6371374 | Schwartz et al. | Apr 2002 | B1 |
6388788 | Harris et al. | May 2002 | B1 |
6433907 | Lippert et al. | Aug 2002 | B1 |
6452710 | Hiraga et al. | Sep 2002 | B1 |
6462880 | Ohkawa et al. | Oct 2002 | B1 |
6480323 | Messner et al. | Nov 2002 | B1 |
RE38195 | Sakai et al. | Jul 2003 | E |
6603874 | Stern et al. | Aug 2003 | B1 |
6607128 | Schwartz et al. | Aug 2003 | B1 |
6628445 | Chaleff et al. | Sep 2003 | B2 |
6629641 | Tsikos et al. | Oct 2003 | B2 |
6633423 | Ishibe | Oct 2003 | B2 |
6732929 | Good et al. | May 2004 | B2 |
6739511 | Tsikos et al. | May 2004 | B2 |
6795221 | Urey | Sep 2004 | B1 |
6830189 | Tsikos et al. | Dec 2004 | B2 |
20030156303 | Schnee et al. | Aug 2003 | A1 |
Number | Date | Country |
---|---|---|
0 751 669 | Jan 1997 | EP |
0 887 676 | Dec 1998 | EP |
Number | Date | Country
---|---|---
20060098433 A1 | May 2006 | US

Number | Date | Country
---|---|---
60190273 | Mar 2000 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 10982820 | Nov 2004 | US
Child | 11165602 | | US
Parent | 09810204 | Mar 2001 | US
Child | 10676834 | | US

Relation | Number | Date | Country
---|---|---|---
Parent | 10676834 | Sep 2003 | US
Child | 10982820 | | US