The field of this disclosure relates generally to systems and methods for data reading, and more particularly but not exclusively to reading of optical codes (e.g., barcodes) using imaging readers.

Optical codes encode useful, optically readable information about the items to which they are attached or otherwise associated. Perhaps the best example of an optical code is the barcode. Barcodes are ubiquitously found on or associated with objects of various types, such as the packaging of retail, wholesale, and inventory goods; retail product presentation fixtures (e.g., shelves); goods undergoing manufacturing; personal or company assets; and documents. By encoding information, a barcode typically serves as an identifier of an object, whether the identification be to a class of objects (e.g., containers of milk) or a unique item (see, e.g., U.S. Pat. No. 7,201,322).

Linear (one-dimensional) barcodes consist of alternating bars (i.e., relatively dark areas) and spaces (i.e., relatively light areas). The pattern of alternating bars and spaces and the widths of those bars and spaces represent a string of binary ones and zeros, wherein the width of any particular bar or space is an integer multiple of a specified minimum width, which is called a “module” or “unit.” Thus, to decode the information, a barcode reader must be able to reliably discern the pattern of bars and spaces, such as by determining the locations of edges demarcating adjacent bars and spaces from one another, across the entire length of the barcode.
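The module-width scheme described above can be sketched in a few lines. The following is an illustrative sketch only (the function name and the pixel values are hypothetical, not drawn from any particular decoder): given detected edge positions along a scan of the code, each element width is divided by the narrowest observed element, which approximates one module.

```python
def widths_to_modules(edge_positions):
    """Convert edge positions (in pixels) along a scan of the code into
    per-element module counts for the bars and spaces between them."""
    widths = [b - a for a, b in zip(edge_positions, edge_positions[1:])]
    module = min(widths)  # the narrowest element approximates one module/unit
    return [round(w / module) for w in widths]

# Elements 2, 1, 1, and 3 modules wide, with a 4-pixel module:
print(widths_to_modules([0, 8, 12, 16, 28]))  # -> [2, 1, 1, 3]
```

A real decoder would additionally validate the recovered pattern against the symbology's allowed element sequences, but the division-by-module step is the core of discerning the bar/space widths.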
Barcodes are just examples of the many types of optical codes in use today. Linear barcodes, such as the UPC code, are typically considered examples of one-dimensional or linear optical codes, as the information is encoded in one direction, namely the direction perpendicular to the bars and spaces. Higher-dimensional optical codes, such as two-dimensional matrix codes (e.g., MaxiCode) or stacked codes (e.g., PDF417), which are also sometimes referred to as barcodes, are also used for various purposes.
Two of the more common types of devices that read optical codes are (1) flying-spot scanners and (2) imager-based readers. The flying-spot scanner type of reader is typically a laser-based barcode reader (also called a scanner), which generates a spot of laser light and sweeps or scans the spot out into a read area and across a barcode label. A laser-based scanner detects reflected and/or refracted laser light from the bars and spaces in a barcode as the laser spot moves across the barcode. An optical detector measures the intensity of the return light as a function of time or position and generates an electrical signal having an amplitude determined by the intensity of the detected light. As the barcode is scanned, positive-going transitions and negative-going transitions in the electrical signal occur, signifying transitions between bars and spaces in the barcode. The electrical signal can be analyzed to determine the arrangement of bars and spaces of the scanned barcode. The bar and space information can be provided to a decoding unit to determine whether the barcode is recognized and, if so, to decode the information contained in the barcode. Examples of laser-based scanners are disclosed in U.S. Pat. No. 7,198,195.
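The transition detection just described can be illustrated with a minimal sketch. Here the sampled electrical signal is modeled simply as a list of intensity values, and the hypothetical `find_transitions` helper (illustrative, not any scanner's actual algorithm) reports the sample indices where the signal crosses a threshold, i.e., the positive-going and negative-going transitions between bars and spaces.

```python
def find_transitions(signal, threshold):
    """Return sample indices where the signal crosses the threshold,
    marking transitions between bars (low return light) and spaces (high)."""
    above = [s > threshold for s in signal]
    return [i for i in range(1, len(above)) if above[i] != above[i - 1]]

# A space, then a bar three samples wide, then another space:
print(find_transitions([9, 9, 1, 1, 1, 9, 9], threshold=5))  # -> [2, 5]
```

Practical scanners use more robust edge detection (e.g., adaptive thresholds or derivative peaks) because the return signal is noisy and blurred, but the fixed-threshold version conveys the principle.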
Imager-based readers operate according to a different principle compared to laser-based scanners. An imager-based reader utilizes a camera or imager to generate electronic image data (typically in digital form) of an entire area. The image data is then processed to find and decode the optical code. For example, virtual scan line techniques digitally process an image containing a barcode by examining the image along a plurality of lines, typically spaced apart and at various angles, somewhat like a laser beam's scan pattern in a flying-spot scanner.
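A virtual scan line can be sketched as sampling image intensities along a straight line between two points. The sketch below is a minimal illustration (the function name and the toy image are hypothetical; real implementations interpolate sub-pixel positions and run many lines at several angles), assuming at least two samples are requested.

```python
def virtual_scan_line(image, start, end, samples):
    """Sample intensities along a straight line from start (row, col) to
    end (row, col), approximating one scan line of a flying-spot scanner."""
    (r0, c0), (r1, c1) = start, end
    result = []
    for i in range(samples):
        t = i / (samples - 1)  # interpolation parameter, 0.0 .. 1.0
        result.append(image[round(r0 + t * (r1 - r0))][round(c0 + t * (c1 - c0))])
    return result

# A diagonal virtual scan line across a 3x3 image:
img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
print(virtual_scan_line(img, (0, 0), (2, 2), 3))  # -> [1, 5, 9]
```

The resulting one-dimensional intensity profile can then be handed to the same bar/space analysis used for a flying-spot scanner's electrical signal.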
One advantage of imager-based readers is their ability to produce a high-density image of a scan area, enabling them to read 2-D and high-density optical codes. Laser scanners are better suited for reading one-dimensional barcodes and typically have a longer depth of field than imager-based readers. Laser scanners are also well suited for multi-plane (e.g., bioptic) reading, using complex mirror arrays to generate scan lines from different directions and through windows oriented in different orthogonal planes. The present inventor has recognized that it would be advantageous to have an imaging data reader that possesses the advantages of the different types of readers.
With reference to the above-listed drawings, this section describes particular embodiments and their detailed construction and operation. The embodiments described herein are set forth by way of illustration only and not limitation. Those skilled in the art will recognize in light of the teachings herein that there is a range of equivalents to the example embodiments described herein. Most notably, other embodiments are possible, variations can be made to the embodiments described herein, and there may be equivalents to the components, parts, or steps that make up the described embodiments.
The data reading system 10 includes a single image sensor 30 shown disposed in the lower housing section 24. Alternately, the image sensor 30 may be positioned in any suitable location within the housing 20. Mirrors 34, 36, and 38 are arranged to redirect fields of view from respective windows 26 and 28 onto the image sensor 30. Mirror 34 is a splitting mirror selectively allowing the field of view to be directed out both windows. Generally speaking, the first view of the reader 10 is a two-dimensional (2D) view through window 26, reflecting off of the mirror 34 and focused by the focusing lens system 32 onto the image sensor 30. Simultaneously or alternately/consecutively (as will be described further below), the second two-dimensional view of the reader 10 is a view through vertical window 28, reflecting off of upper mirror 38 then lower mirror 36, passing through the splitter mirror 34, and being focused by lens system 32 onto the image sensor 30. Thus multiple 2D perspectives/images of objects in the scan volume 50 are formed on the image sensor 30.
According to a first embodiment, the mirror 34 comprises a chromatic splitting (or combining) element such as a dichroic mirror or dichroic reflector. Dichroic mirrors (of which cold mirrors are one example) are highly accurate color-selective filters used to pass light of one specific range of wavelengths while reflecting other wavelengths. In operation, the scan volume 50 is selectively (in an alternating fashion) illuminated by the first light source 40 or the second light source 42. The first light source 40 generates and passes light of a first wavelength λ1 out through the lower window 26, and the second light source 42 passes light of a second wavelength λ2 out through the upper window 28 and into the scan volume 50.
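The alternating illumination can be sketched as a simple capture schedule. The sketch below is a simulation only; the source labels are hypothetical stand-ins for the light sources 40 and 42, and a real reader would synchronize source strobing with the sensor's exposure timing. Each captured frame is paired with the source active during its exposure, so the decoder knows which window's view each frame contains.

```python
from itertools import cycle


def illumination_schedule(n_frames,
                          sources=("lambda1_lower_40", "lambda2_upper_42")):
    """Pair each frame index with the light source active during its
    exposure, alternating between the two sources frame by frame."""
    return list(zip(range(n_frames), cycle(sources)))


print(illumination_schedule(4))
# -> [(0, 'lambda1_lower_40'), (1, 'lambda2_upper_42'),
#     (2, 'lambda1_lower_40'), (3, 'lambda2_upper_42')]
```

Because the dichroic mirror routes λ1 and λ2 along a common path to the one sensor, this per-frame bookkeeping is what distinguishes the lower-window image from the upper-window image.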
The dichroic mirror 34 is designed and selected to reflect incoming light of a first wavelength λ1 and pass incoming light of a second wavelength λ2, thus chromatically combining light of both wavelengths λ1, λ2 along a common path as shown in the diagram of
As previously described, by its properties, the dichroic mirror 34 efficiently reflects light in a first given wavelength range and passes/transmits light of a second given wavelength range. For example,
The lens system 32 may comprise one or more lenses, filters, and/or mirrors forming a focusing system as needed for focusing light onto the image sensor 30. Alternately, the fold mirrors 36, 38 or other focusing elements may be employed solely for focusing light coming from the upper window 28, since that light travels a greater distance than the light through the lower window 26.
In one embodiment, the image sensor 30 is a monochrome sensor with a fairly broad response range fully encompassing the λ1 and λ2 wavelengths of the light sources 40, 42. Alternately, the sensitivity of the image sensor 30 may be limited to narrower wavelength ranges, such as by including narrow band filters in the light path, so that wavelengths outside those generated by the light sources 40, 42, or light from other sources such as ambient illumination, does not reach the sensor with sufficient power to cause image confusion. Alternately, a color image sensor may be employed, with filters included in the incoming light path or provided by the lens system 32 to prevent undesirable wavelengths of light from reaching the image sensor 30.
According to one alternate system, the light sources 40, 42 may be illuminated simultaneously, with a filtering system interposed between the dichroic mirror and the image sensor 30 that alternately passes light of each specified wavelength. Such a filtering mechanism may comprise, for example, a rotating filter with two different filter regions, an electronic filter, or other suitable mechanism.
In one example method employing a color imager, image data for an optical code on an object in a read volume may be gathered by the steps of:
In the system 110, the image sensor 130 is a color image sensor which incorporates sensors for different color ranges (e.g., red, green, and blue wavelength ranges). The light sources 140 and 142 may be white light sources (i.e., non-monochromatic) or a mixture of multiple sources, or the light may be the prevailing light in the vicinity of the scanner (i.e., ambient light) illuminating the item 60 within the read volume 50. In one configuration, the dichroic mirror 134 may be designed/selected to split between two wavelengths, for example reflecting blue wavelength light and transmitting red wavelength light. The signal at the color image sensor 130 (both the blue light signal and the red light signal) would then be processed, the red light signal being processed knowing that the red light corresponds to the 2D image coming from the upper window 128 passing through the dichroic mirror 134, and the blue light signal being processed knowing that the blue light corresponds to the 2D image coming through the window 126 reflecting off the dichroic mirror 134. The light being directed onto the color image sensor includes both color wavelengths focused onto common active areas of the sensor. The color imager may accept both wavelengths at the same time (meaning both wavelengths are acquired in the same image capture) onto common active regions because the color imager includes active pixel areas for the different colors/wavelengths. The different wavelengths may then be processed separately, either sequentially or in parallel.
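The per-channel processing just described can be sketched as follows. This is a minimal illustration, assuming a mirror that reflects blue and transmits red as in the example above, and representing the capture as nested lists of (R, G, B) tuples rather than a real sensor readout; which channel maps to which window depends on the actual mirror design.

```python
def split_views(rgb_image):
    """Separate a single color capture into the two window views by channel:
    the red channel carries the view transmitted through the dichroic mirror,
    and the blue channel carries the view reflected off it (assuming a mirror
    that reflects blue and transmits red)."""
    red_view = [[px[0] for px in row] for row in rgb_image]
    blue_view = [[px[2] for px in row] for row in rgb_image]
    return red_view, blue_view

# One 2x2 capture containing both views superimposed:
capture = [[(10, 0, 200), (20, 0, 210)],
           [(30, 0, 220), (40, 0, 230)]]
red, blue = split_views(capture)
print(red)   # -> [[10, 20], [30, 40]]
print(blue)  # -> [[200, 210], [220, 230]]
```

Each extracted single-channel image can then be handed to the decoder independently, sequentially or in parallel, exactly because both views were acquired in the same capture.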
Thus far, the described embodiments employ an image sensor array or camera providing a 2D view of the read volume. Other types of image view systems may be employed. For example, Olmstead U.S. Published Application No. US-2009-0206161 (“Olmstead '161”), hereby incorporated by reference, discloses a system/method whereby the image plane is divided into a plurality of strip-shaped sections on the sensor, thereby, in certain embodiments, mimicking a laser scanner scan pattern. Olmstead '161 refers to these strip-shaped pattern view systems as kaleidoscopic. Thus, in one alternate configuration, the dichroic splitter of the present application could switch reading between (a) a 2D image view of the scan volume and (b) a strip-shaped sectional imager (kaleidoscopic) system.
One advantage that may be realized by certain embodiments disclosed herein is that the incoming signal from each of the views may engage the entire sensor array. Other designs such as disclosed in Olmstead '161 typically split the views from the respective windows onto separate sections of the sensor or onto multiple sensors. For example, in FIG. 36A of the Olmstead '161 application, a system 3600 includes a bi-optic splitting mirror 3620 and some representative pattern mirrors and/or redirection mirrors 3630-3680, whereby the bi-optic splitting mirror 3620 is disposed to redirect half of the imager's field of view to the horizontal window 804A portion of the reader 3600, while the other half of the imager's field of view services the vertical window 804B. As for multiple sensors, FIG. 34 of Olmstead '161 illustrates a two camera/sensor system 3400 having a first camera 806 employing a kaleidoscopic (strip-shaped) portion and a second 2D camera 3410 that captures a 2D view of the read volume via optional fold mirror 3420.
In another alternate embodiment, the field of view splitting design of the Olmstead '161 application may be incorporated herein. For example, in a single window configuration, a light source generating a first wavelength λ1 is directed out of the window, and a full image of the read volume (or primary portion thereof) is reflected off of the dichroic mirror and then focused onto the sensor array. Light of a second wavelength λ2 from a second light source (or plurality of light sources) is directed according to the kaleidoscopic mirror configuration as in the Olmstead '161 application, creating the “image scan pattern,” whereby the incoming images of the fields of view (scan line images) of wavelength λ2 pass through the dichroic mirror and are focused onto the various sections of the sensor array.
In the above-described embodiments, the light sources are optional, as the system may be operable using ambient lighting. If light sources are employed, both may comprise white light (i.e., broad band light spectrum), or the two light sources may emit light of different wavelength ranges as described above. Each light source may comprise one or more LEDs or other suitable lighting elements. Alternately, the read volume may be illuminated with a single illumination source, such as a non-coherent light source generating white light.
Though the present invention has been set forth in the form of certain example embodiments, it is nevertheless intended that modifications to the disclosed systems and methods may be made without departing from inventive concepts set forth herein.
This application claims priority to U.S. provisional application No. 61/172,594 filed Apr. 24, 2009, hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5161051 | Whitney | Nov 1992 | A |
5418357 | Inoue et al. | May 1995 | A |
5912451 | Gurevich et al. | Jun 1999 | A |
5942762 | Hecht | Aug 1999 | A |
6147358 | Hecht | Nov 2000 | A |
6237851 | Detwiler | May 2001 | B1 |
6323503 | Hecht | Nov 2001 | B1 |
6568598 | Bobba et al. | May 2003 | B1 |
6621063 | McQueen | Sep 2003 | B2 |
6726094 | Rantze et al. | Apr 2004 | B1 |
6793138 | Saito | Sep 2004 | B2 |
6899272 | Krichever et al. | May 2005 | B2 |
6963074 | McQueen | Nov 2005 | B2 |
7215493 | Olmstead et al. | May 2007 | B2 |
7224540 | Olmstead et al. | May 2007 | B2 |
7243850 | Tamburrini et al. | Jul 2007 | B2 |
7387246 | Palestini et al. | Jun 2008 | B2 |
7398927 | Olmstead et al. | Jul 2008 | B2 |
7626769 | Olmstead | Dec 2009 | B2 |
7721966 | Rudeen et al. | May 2010 | B2 |
7748631 | Patel et al. | Jul 2010 | B2 |
20040183004 | Niggemann et al. | Sep 2004 | A1 |
20090001166 | Barkan et al. | Jan 2009 | A1 |
20090206161 | Olmstead | Aug 2009 | A1 |
Number | Date | Country | |
---|---|---|---|
20100270376 A1 | Oct 2010 | US |
Number | Date | Country | |
---|---|---|---|
61172594 | Apr 2009 | US |