This disclosure relates generally to reading optical indicia and, more specifically, to combining fragments of data from multiple sources to decode an optical indicia.
The use of optical indicia, or bar code symbols, for product and article identification is well known in the art. Presently, various types of bar code symbol scanners have been developed. One common type of bar code symbol reader is the laser-based scanner, which uses a focused laser beam to sequentially scan the bars and spaces of a bar code symbol to be read. The majority of laser scanners in use today, particularly in retail environments, employ lenses and moving (e.g., rotating or oscillating) mirrors and/or other optical elements in order to focus and scan laser beams across bar code symbols during code symbol reading operations.
In demanding retail scanning environments, it is common for such systems to have both bottom and side-scanning windows to enable highly aggressive scanner performance, so the cashier need only drag a bar-coded product past these scanning windows for the bar code to be automatically read with minimal assistance from the cashier or other checkout personnel. Such dual scanning window systems are typically referred to as “bioptic” laser scanning systems as such systems employ two sets of optics—a first set disposed behind the bottom or horizontal scanning window, and a second set disposed behind the side-scanning or vertical window.
In general, prior art bioptic laser scanning systems are more aggressive than conventional single scanning window systems. For this reason, bioptic scanning systems are often deployed in demanding retail environments, such as supermarkets and high-volume department stores, where high check-out throughput is critical to achieving store profitability and customer satisfaction. While prior art bioptic scanning systems represent a technological advance over most single scanning window systems, they nevertheless suffer from various shortcomings and drawbacks.
In particular, the laser scanning patterns of such prior art bioptic laser scanning systems are not optimized in terms of scanning coverage and performance, and the scanning systems are generally expensive to manufacture by virtue of the large number of optical components presently required to construct such laser scanning systems.
Additionally, in scanning a bar code symbol and accurately producing digital scan data signals representative of a scanned bar code symbol, the performance of such aggressive laser scanning systems is susceptible to noise, including ambient noise, thermal noise and paper noise. During operation of a laser scanning system, a focused light beam is produced from a light source such as a visible laser diode (VLD), and repeatedly scanned across the elements of the code symbol. In the case of bar code scanning applications, the elements of the code symbol consist of a series of bar and space elements of varying width. For discrimination purposes, the bars and spaces have different light reflectivity (e.g., the spaces are highly light-reflective while the bars are highly light-absorptive). As the laser beam is scanned across the bar code elements, the bar elements absorb a substantial portion of the laser beam power, whereas the space elements reflect a substantial portion of the laser beam power. As a result of this scanning process, the intensity of the laser beam is modulated in accordance with the information structure encoded within the scanned bar code symbol.
As the laser beam is scanned across the bar code symbol, a portion of the reflected light beam is collected by optics within the scanner. The collected light signal is subsequently focused upon a photodetector within the scanner which, in one example, generates an analog electrical output signal which can be decomposed into a number of signal components, namely: a digital scan data signal having first and second signal levels, corresponding to the bars and spaces within the scanned code symbol; ambient-light noise produced as a result of ambient light collected by the light collection optics of the system; thermal noise produced as a result of thermal activity within the signal detecting and processing circuitry; and “paper” or substrate noise, which may be produced as a result of the microstructure of the substrate in relation to the cross-sectional dimensions of the focused laser scanning beam, or noise related to the bar code printing quality (e.g., bar code edge roughness, unwanted spots, void defects, and/or printing contrast).
The analog scan data signal has positive-going transitions and negative-going transitions which signify transitions between bars and spaces in the scanned bar code symbol. However, as a result of such noise components, or of operating the scanner near the operational limits of its focal zones, the transitions from the first signal level to the second signal level and vice versa are not perfectly sharp or instantaneous. Consequently, it is sometimes difficult to determine the exact instant that each binary signal level transition occurs in the detected analog scan data signal.
The ability of a scanner to accurately scan an encoded symbol character and accurately produce digital scan data signals representative of a scanned bar code symbol in noisy environments depends on the depth of modulation of the laser scanning beam. The depth of modulation of the laser scanning beam, in turn, depends on several important factors. Among the factors are (i) the ratio of the laser beam cross-sectional dimensions at the scanning plane to the width of the minimal bar code element in the bar code symbol being scanned; (ii) the signal-to-noise ratio (SNR) in the scan data signal processor at the stage where binary level (1-bit) analog to digital (A/D) signal conversion occurs; (iii) the object distance; and (iv) the field of view (FOV) angle.
As a practical matter, it is not possible in most instances to produce analog scan data signals with precisely-defined signal level transitions. Therefore, the analog scan data signal must be further processed to precisely determine the point at which the signal level transitions occur. Various circuits have been developed for carrying out such scan data signal processing operations. Typically, signal processing circuits capable of performing such operations include filters for removing unwanted noise components, and signal thresholding devices for rejecting signal components which do not exceed a predetermined signal level. One drawback to these approaches is that thermal and “paper” (or substrate) noise imparted to the analog scan data input signal tends to generate “false” positive-going and negative-going transitions in the first derivative signal, and may also generate zero-crossings in the second-derivative signal. Consequently, the circuit logic allows “false” first derivative peak signals and second-derivative zero-crossing signals to be passed on, thereby producing erroneous binary signal levels at the output stage of the signal processor. In turn, error-ridden digital scan data signals are transmitted to the digital scan data signal processor of the bar code scanner for conversion into digital words representative of the length of the binary signal levels in the digital scan data signal. This can result in significant errors during bar code symbol decoding operations, causing objects to be incorrectly identified and/or erroneous data to be entered into a host system.
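By way of illustration only, the following Python sketch shows the kind of derivative-based edge detection with noise thresholding described above, assuming the analog scan data signal has been sampled into a NumPy array; the function name, threshold value, and example waveform are assumptions for this example rather than part of the disclosure.

```python
import numpy as np

def detect_transitions(signal, noise_threshold):
    """Locate candidate bar/space transitions in a sampled analog scan data signal.

    Candidate edges are taken at zero-crossings of the second derivative, and
    "false" transitions caused by thermal or "paper" noise are rejected by
    requiring the first-derivative magnitude at the crossing to exceed a threshold.
    """
    d1 = np.gradient(signal)                       # first derivative: peaks at edges
    d2 = np.gradient(d1)                           # second derivative: zero-crossing near edge center
    crossings = np.where(d2[:-1] * d2[1:] < 0)[0]  # indices where the second derivative changes sign
    return [i for i in crossings if abs(d1[i]) > noise_threshold]

# Example: a noisy two-level (bar/space) waveform sampled at 40 samples per element
rng = np.random.default_rng(0)
clean = np.repeat([1.0, 0.2, 1.0, 0.2, 1.0], 40)   # spaces reflect strongly, bars absorb
noisy = clean + 0.02 * rng.standard_normal(clean.size)
print(detect_transitions(noisy, noise_threshold=0.05))
```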
Another drawback to retail laser scanning systems is that the bar code label may be damaged or printed improperly. Because such scanning systems typically provide no redundancy, the bar code reader fails to decode the damaged symbol and the cashier must key in the bar code numbers manually, wasting valuable time at the checkout counter and frustrating customers.
Yet another drawback to retail laser scanning systems is the opportunity for theft by way of scanning the bar code of a significantly less expensive item instead of the item actually passing through the check-out line. Some retailers print their own bar codes to discount certain items. The in-house bar codes are typically printed on stickers and set aside in a bin near the register. Cashiers or customers may peel off these stickers and place them over an existing bar code for an expensive item. As the expensive item is passed over the scan zone, the laser scanner will recognize and decode the less expensive bar code as a valid item, and will complete the transaction at a loss for the retailer. In other fraudulent schemes, cashiers may place the in-house bar code sticker on their hand, and quickly scan their hand instead of the expensive item. Policing such fraudulent actions can be time-consuming and expensive. One method of policing that is presently practiced is to manually review the security camera video at the cashier counter and cross-reference it with sales receipts to assure expensive items (as seen in the video) have been properly transacted. One drawback to this approach is that the theft is identified long after the sale is completed and the customer has left the store.
Accordingly, there is a need in the art for a retail laser scanner that can verify the authenticity of an item with its purported bar code label at the time of checkout.
Moreover, there is a need in the art for an aggressive bioptic scanner that overcomes the deficiencies with respect to laser scanner noise and lack of redundancy.
Even though bioptic laser scanners employ two sets of optics to aggressively scan and decode bar code symbols, the problems noted above apply equally to each of the horizontal and vertical laser systems. Thus, although the added complexity and cost of a bioptic laser scanning system may be beneficial in terms of aggressive scanning, the extra laser optics do not necessarily remedy the problems associated with reading the bar code (for example, noise).
In one aspect of the invention, a system for decoding an encoded symbol character associated with a product is provided. The system includes a bioptic scanning apparatus comprising a first scan source disposed within a housing, and a second scan source disposed within the housing. The second scan source comprises an operating technology distinct from an operating technology of the first scan source. The first scan source is adapted to output a first scan data set, and the second scan source is adapted to output a second scan data set. At least one of the first scan data set and the second scan data set comprises product bar code scan data. The bioptic scanning apparatus further includes a central processing unit adapted to execute a bar code decoding process by cross-referencing the first scan data set and the second scan data set. The system further includes a memory coupled to the central processing unit.
In another aspect of the invention, a method for decoding optical indicia is provided. The method includes the step of providing a bioptic scanning apparatus having a first scan source and a second scan source. The second scan source includes an operating technology distinct from an operating technology of the first scan source. The method further includes the steps of scanning with the first scan source an optical indicia affixed to a product, and generating a first scan data set from the first scan source. The method further includes the steps of scanning the product with the second scan source, and generating a second scan data set from the second scan source. The method further includes the steps of combining the first scan data set with the second scan data set, and decoding the optical indicia from the combined first scan data set and second scan data set.
The features described herein can be better understood with reference to the drawings described below. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.
In the illustrative embodiments, the apparatus of the present invention is realized in the form of an automatic bar code symbol scanning system having a plurality of scan sources as well as a scan data processor for decode processing scan data signals produced thereby. However, for the sake of convenience of expression, the term “bioptic scanner” shall be used hereinafter to denote the bar code symbol scanning system which employs the plurality of scan sources of the present invention.
The countertop 12 includes an optically transparent (e.g., glass) horizontal-scanning window 20 mounted flush with the checkout counter, covered by an imaging window protection plate 22 which is provided with a pattern of apertures 24a. These apertures 24a permit the projection of a plurality of vertical illumination planes from a first scan source located beneath the horizontal-scanning window 20, to be described more fully below.
The bioptic scanner 14 includes a vertical-scanning window 26 formed in the second housing portion 18. The vertical-scanning window 26 further includes a pattern of apertures 24b to permit the projection of a plurality of horizontal illumination planes. The illumination may be provided by the first scan source utilizing a series of splitting mirrors to direct some of the laser light from the source in the horizontal portion through the vertical-scanning window 26 in the second housing portion 18. Alternatively, a second scan source could provide the illumination, such as a separate laser scanner assembly.
A product 28 having an encoded symbol character 30 may be scanned by the bioptic scanner 14. If the encoded symbol character 30 is located on the bottom of the product 28, one of the scan lines projected through the horizontal-scanning window 20 will traverse the symbol. If the character 30 is located on the side of the product, then one of the scan lines projected through the vertical-scanning window 26 will traverse the symbol.
As used herein, “encoded symbol character” is intended to denote a representation of a unit of information in a message, such as the representation in a bar code symbology of a single alphanumeric character. One or more encoded symbol characters can be used to convey information, such as the identification of the source and the model of a product, for example in a UPC bar code that comprises twelve encoded symbol characters representing numerical digits. Also, an encoded symbol character may be a non-alphanumeric character that has an agreed-upon conventional meaning, such as the elements comprising bars and spaces that are used to denote the start, the end, and the center of a UPC bar code. The bars and spaces used to encode a character as an encoded symbol are referred to generally as “elements.” For example, an encoded character in a UPC symbol consists of four elements: two bars and two spaces. Similarly, encoded symbol characters can be defined for other bar code symbologies, such as other one-dimensional (“1-D”) bar code systems including Code 39 and Code 128, or for stacked two-dimensional (“2-D”) bar code systems including PDF417.
As used herein, bioptic scanner is not limited to a construction having horizontal and vertical scan windows. A bioptic scanner can include a single scan window, such as the horizontal-scanning window illustrated in
In some constructions, the workstation 10 may further include a radio frequency identification (RFID) reader 32; a credit card reader 34; a wide-area wireless (WIFI) interface 36 including RF transceiver and antenna 38 for connecting to the TCP/IP layer of the Internet as well as one or more storing and processing relational database management system (RDBMS) server 40; a Bluetooth 2-way communication interface 42 including RF transceivers and antenna 44 for connecting to Bluetooth-enabled hand-held scanners, imagers, PDAs, portable computers and the like 46, for control, management, application and diagnostic purposes. The workstation 10 may further include an electronic weight scale module 48 employing one or more load cells positioned centrally below the system's structurally rigid platform for bearing and measuring substantially all of the weight of objects positioned on the horizontal-scanning window 20 or window protection plate 22, and generating electronic data representative of measured weight of such objects.
Referring to
The laser beam reflects off the product 28 and travels along axis 58 in a receiving direction 74 back to a detector assembly 76. In the example wherein the product 28 includes a bar code, the incident laser light strikes areas of dark and white bands and is reflected. The reflected beam will thusly have variable intensity representative of the bar code pattern. Detector assembly 76 including detector 78 and analog to digital converter 80 can receive the reflected beam of variable intensity, generate an analog signal corresponding to the reflected beam, and convert it to a digital data set for storage into first memory 82 where it can be processed by CPU 84 in accordance with a program stored in non-volatile memory 86, provided in a particular example by an EPROM.
In attempting to decode a bar code symbol, CPU 84 can process a digitized image signal corresponding to a scanned, reflected, and detected laser beam to determine a spatial pattern of dark cells and light cells and can convert each light and dark cell pattern determined into a character or character string via table lookup. Laser scanner 52 can include various interface circuits allowing CPU 84 to communicate with various circuits of scanner 52 including first interface circuit 88 coupled to laser source control circuit 60 and system bus 90, second interface circuit 92 coupled to motor control circuit 72, and third interface circuit 94 coupled to electrical power input unit 96.
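For purposes of illustration, the sketch below shows one way such a table lookup might be organized for the left half of a UPC-A symbol, where each character is encoded as four elements (two bars and two spaces) spanning seven modules. The element-width table is the standard UPC-A left-hand (odd parity) encoding; the function and variable names, and the sample measurement, are assumptions for this example.

```python
# Standard UPC-A left-half (odd parity) element widths, in modules, in the order
# space-bar-space-bar as the beam crosses the left half of the symbol.
UPC_A_LEFT = {
    (3, 2, 1, 1): "0", (2, 2, 2, 1): "1", (2, 1, 2, 2): "2", (1, 4, 1, 1): "3",
    (1, 1, 3, 2): "4", (1, 2, 3, 1): "5", (1, 1, 1, 4): "6", (1, 3, 1, 2): "7",
    (1, 2, 1, 3): "8", (3, 1, 1, 2): "9",
}

def decode_left_character(element_widths, module_width):
    """Convert four measured element widths (e.g., in samples) into a UPC-A
    digit via table lookup, or return None if no table entry matches."""
    modules = tuple(round(w / module_width) for w in element_widths)
    return UPC_A_LEFT.get(modules)

# Example: measured widths of one character at roughly four samples per module
print(decode_left_character([12.1, 8.2, 3.9, 4.0], module_width=4.0))  # -> "0"
```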
Referring to
The image sensor assembly 100 can include an image sensor 102 comprising a multiple pixel image sensor array 104 having pixels arranged in rows and columns of pixels, column circuitry 106, and row circuitry 108. Associated with the image sensor 102 can be amplifier circuitry 110, and an analog-to-digital (A/D) converter 112 which converts image information in the form of analog signals read out of multiple pixel image sensor array 104 into image information in the form of digital signals. Image sensor 102 can also have an associated timing and control circuit 114 for use in controlling, e.g., the exposure period of image sensor 102, and/or gain applied to the amplifier 110. The noted circuit components 102, 110, 112, and 114 can be packaged into a common image sensor integrated circuit 116. In one example, image sensor integrated circuit 116 can be provided by an MT10V022 image sensor integrated circuit available from Micron Technology, Inc. In another example, image sensor integrated circuit 116 can incorporate a Bayer pattern filter. In such an embodiment, CPU 118 prior to subjecting a frame to further processing can interpolate pixel values intermediate of green pixel values for development of a monochrome frame of image data. In other embodiments, red, and/or blue pixel values can be utilized for the image data.
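A minimal sketch of the green-channel interpolation mentioned above is given below; it assumes a GRBG mosaic layout and a simple four-neighbor average, neither of which is prescribed by the disclosure.

```python
import numpy as np

def green_plane_from_bayer(raw, pattern="GRBG"):
    """Build a monochrome frame from a Bayer-mosaic image by keeping the measured
    green pixels and interpolating green values at red/blue sites from the average
    of the four horizontal/vertical green neighbors."""
    raw = raw.astype(np.float32)
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # In GRBG/GBRG mosaics the green sites fall where (row + column) is even.
    green_mask = ((yy + xx) % 2 == 0) if pattern in ("GRBG", "GBRG") else ((yy + xx) % 2 == 1)
    padded = np.pad(raw, 1, mode="edge")
    neighbor_avg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                    padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    mono = raw.copy()
    mono[~green_mask] = neighbor_avg[~green_mask]  # fill the non-green sites
    return mono
```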
In the course of operation of the image sensor assembly 100, image signals can be read out of image sensor 102, converted, and stored into a system memory such as RAM 120. A memory 122 of image sensor assembly 100 can include RAM 120, a nonvolatile memory such as EPROM 124, and a storage memory device 126 such as may be provided by a flash memory or a hard drive memory. In one embodiment, image sensor assembly 100 can include CPU 118 which can be adapted to read out image data stored in memory 122 and subject such image data to various image processing algorithms. Image sensor assembly 100 can include a direct memory access unit (DMA) 128 for routing image information read out from image sensor 102 that has been subject to conversion to RAM 120. In another embodiment, image sensor assembly 100 can employ a system bus providing for bus arbitration mechanism (e.g., a PCI bus) thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between the image sensor 102 and RAM 120 are within the scope of the invention.
Referring to further aspects of image sensor assembly 100, the sensor assembly can include an imaging lens assembly 130 for focusing an image of the encoded symbol character 30 onto image sensor 102. Imaging light rays can be transmitted about an optical axis 132. Image sensor assembly 100 can also include an illumination assembly 134 or excitation illumination module that comprises one or more of an illumination pattern light source bank 136 for generating an illumination pattern substantially corresponding to the field of view of image sensor assembly 100, and an aiming pattern light source bank 138 for generating an aiming pattern. In use, the product 28 can be presented by an operator to the image sensor assembly 100 in such manner that the aiming pattern is projected on the encoded symbol character 30. In the example of
The image sensor assembly 100 can further include a filter module 140 that comprises one or more optical filters, as well as in some embodiments an actuator assembly 142 that is coupled generally to the filter module, such as to the optical filters. The filter module 140 can be located on either side of the imaging lens assembly 130. Likewise, one or more of the optical filters within the filter module 140 can be disposed on one or more surfaces of the imaging lens assembly 130 and/or the image sensor 102.
Each of illumination pattern light source bank 136 and aiming pattern light source bank 138 can include one or more light sources. Lens assembly 130 can be controlled with use of lens assembly control circuit 144 and the illumination assembly 134 comprising illumination pattern light source bank 136 and aiming pattern light source bank 138 can be controlled with use of illumination assembly control circuit 146. Filter module 140 can be controlled with use of a filter module control circuit 148, which can be coupled to the actuator assembly 142. Lens assembly control circuit 144 can send signals to lens assembly 130, e.g., for changing a focal length and/or a best focus distance of lens assembly 130. Illumination assembly control circuit 146 can send signals to illumination pattern light source bank 136, e.g., for changing a level of illumination output.
Although not incorporated in the illustrated embodiments, image sensor assembly 100 can also include a number of peripheral devices such as display 150 for displaying such information as image frames captured with use of image sensor assembly 100, keyboard 152, pointing device 154, and trigger 156 which may be used to make active signals for activating frame readout and/or certain decoding processes.
Image sensor assembly 100 can include various interface circuits for coupling several of the peripheral devices to system address/data bus (system bus) 158, for communication with second CPU 118 also coupled to system bus 158. Image sensor assembly 100 can include interface circuit 160 for coupling image sensor timing and control circuit 114 to system bus 158, interface circuit 162 for coupling the lens assembly control circuit 144 to system bus 158, interface circuit 164 for coupling the illumination assembly control circuit 146 to system bus 158, interface circuit 166 for coupling the display 150 to system bus 158, interface circuit 168 for coupling keyboard 152, pointing device 154, and trigger 156 to system bus 158, and interface circuit 170 for coupling the filter module control circuit 148 to system bus 158.
In a further aspect, image sensor assembly 100 can include one or more I/O interfaces 172, 174 for providing communication with external devices (e.g., a cash register server, a store server, an inventory facility server, an image sensor assembly 100, a local area network base station, a cellular base station). I/O interfaces 172, 174 can be interfaces of any combination of known computer interfaces, e.g., Ethernet (IEEE 802.3), USB, IEEE 802.11, Bluetooth, CDMA, and GSM.
In one embodiment, resources between the first scan source 50 and the second scan source 98 may be combined or shared to form a hybrid bar code symbol scanning system that improves the first-pass read rate and also decreases misreads. For example, the system bus 90 of the first scan source 50 may include or be the same as the bus 158 from the second scan source 98. Further, the CPU 84 from the first scan source 50 may include or be the same as the CPU 118 from the second scan source 98. In this manner, data sets obtained from processing the signals received by each of the scan sources may be cross-referenced in order to successfully decode a bar code. In one embodiment, one or both of the CPUs 84, 118 may execute a bar code decoding process by cross-referencing (or stitching) a first scan data set obtained from the first scan source 50 and a second scan data set obtained from the second scan source 98.
In one example, the two optical sources 50, 98 are configured as a host system and a slave system, with the host performing as the primary optical reader and the slave acting as a back-up. Both the host (e.g., the laser scanner 52) and slave (e.g., the image sensor assembly 100) can capture the bar code passing through the scan zone. As described above, the host system can process the received signals and store a first scan data set in first memory 82. Likewise, the slave system can process the received signals and store a second scan data set in second memory 122. Both the host laser scanner 52 and the slave image sensor assembly 100 can attempt to decode the encoded symbol character 30. If the host system only obtains a partial read from the laser-based reflection off the encoded symbol character 30, it may look for additional data sets from other scan lines (e.g., multiple scan lines) and stitch them in a conventional manner. If conventional stitching proves unsuccessful, the host system 52 may be configured to retrieve and combine the data sets obtained from the slave system 100 and stitch them to the host data set according to a custom algorithm. For example, in a UPC bar code that comprises twelve encoded symbol characters representing numerical digits, the first scan source 50 may successfully decode only seven of the characters. The second scan source 98 can provide a data set comprising the remaining five characters, provided sufficient overlap exists to perform stitching.
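The disclosure does not specify the custom stitching algorithm, but a minimal sketch of the idea is shown below: each partial read is expressed as a twelve-character string with '?' in the positions that source could not decode, and the reads are merged position by position, with overlapping characters required to agree. The '?' convention and the example digit strings are assumptions for this illustration.

```python
def stitch(partial_reads, length=12):
    """Merge partial reads of a bar code symbol. Each read is a string of decoded
    characters with '?' where that source failed; returns the stitched code, or
    None if the reads conflict or the symbol is still incomplete."""
    result = ["?"] * length
    for read in partial_reads:
        for i, ch in enumerate(read[:length]):
            if ch == "?":
                continue
            if result[i] == "?":
                result[i] = ch
            elif result[i] != ch:
                return None                 # overlapping characters disagree
    stitched = "".join(result)
    return stitched if "?" not in stitched else None

# The laser scanner decodes seven characters; the imager supplies the rest, with overlap.
print(stitch(["0123456?????", "?????5678901"]))  # -> "012345678901"
```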
This hybrid stitching methodology is especially advantageous when noise (e.g., ambient noise, thermal noise and paper noise) is prevalent in the laser scanning system. The other operating technology will likely not be susceptible to the same sources of noise, and therefore can increase the likelihood of obtaining data sets that are decodable.
In another example, the scan source that decodes the encoded symbol character 30 quickest will pass the result to the host system. In this manner, the fastest possible decoding result is achieved, regardless of the host or slave system capabilities. For example, the slave optical imager 98 may capture an image and post-process the bar code faster than the host laser scanner 50. The decoded result can be passed to the host system, e.g., the laser scanner, as if the host had decoded the bar code. If, however, only a partial read is obtained by either system, the first and second scan data sets obtained may be stitched together by the custom algorithm.
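One way such a first-decode-wins arrangement could be sketched in software is shown below, using Python's concurrent.futures to run both decode attempts and accept whichever completes first with a result; the decoder callables are placeholders for the host and slave decode processes, not functions defined by the disclosure.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def first_successful_decode(decoders):
    """Run each decode attempt concurrently and return the result of whichever
    scan source decodes the symbol first (attempts returning None are skipped)."""
    pool = ThreadPoolExecutor(max_workers=len(decoders))
    futures = [pool.submit(decode) for decode in decoders]
    try:
        for future in as_completed(futures):
            result = future.result()
            if result is not None:
                return result
        return None  # no source obtained a complete read; fall back to stitching
    finally:
        pool.shutdown(wait=False)  # do not hold up the result for the slower source

# Usage with placeholder decode callables for the host and slave sources:
# barcode = first_successful_decode([laser_scanner_decode, image_sensor_decode])
```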
In some embodiments of the present invention, redundancy can be built into the retail transaction by programming one or more of the scan sources (or the point of sale system) to store a small database of patterns, color schemes, dimensions or other identifying marks for a specific number of bar codes. The bar codes may correspond to high-value SKUs, for example, or any and all of the products offered for sale in the retail establishment. In this manner, the added security of matching color or image can be utilized to prevent theft. Alternatively, the bar codes can be known codes that are difficult to scan, in which case reducing manual key-ins would improve productivity.
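The sketch below illustrates one way such a small database of identifying features for selected bar codes might be organized and consulted; the feature fields, tolerance values, and bar code strings are hypothetical.

```python
# Hypothetical feature records for a handful of high-value bar codes.
FEATURE_DB = {
    "012345678905": {"dimensions_mm": (240, 160, 90), "dominant_rgb": (200, 30, 40)},
    "036000291452": {"dimensions_mm": (110, 60, 60),  "dominant_rgb": (250, 245, 240)},
}

def validate_scan(decoded_barcode, measured_dims_mm, measured_rgb, dim_tol=0.15, color_tol=40):
    """Cross-reference a decoded bar code against stored product features; return True
    only if the measured dimensions and color agree with the record within tolerance."""
    record = FEATURE_DB.get(decoded_barcode)
    if record is None:
        return True  # no record stored for this code; nothing to cross-check
    dims_ok = all(abs(m - e) <= dim_tol * e
                  for m, e in zip(sorted(measured_dims_mm), sorted(record["dimensions_mm"])))
    color_ok = all(abs(m - e) <= color_tol
                   for m, e in zip(measured_rgb, record["dominant_rgb"]))
    return dims_ok and color_ok
```

In such a scheme, a mismatch could flag the transaction for attendant review rather than completing the sale.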
Thus, the second scan source 98 need not comprise a bar code reader. In another embodiment of the invention, the second scan source 98 is an image sensor array such as a digital camera. The digital camera captures an image of the item 28 being scanned and stores it as a second scan data set, for example in RAM 120. The CPU 118 can be adapted to compare the image of the product 28 with pre-stored patterns or images corresponding to the bar code being scanned. If the pre-stored images match the image of the product 28, the sale of the item is completed. The image or pattern comparison can assure the expected pattern/color is located at the correct distance/orientation from the bar code, for example. In another example, dimensions of the product being scanned could be cross-referenced with a database storing the actual dimensions associated with the particular bar code. Referring briefly to
In another embodiment of the invention, the second scan source 98 comprises an imaging scan module with a color sensor adapted to capture color patterns within the bar code. In one example, the color of the bar code can be captured as the second scan data set and cross-referenced to a stored value to provide redundancy in the retail environment. In much the same way, the color surrounding the perimeter of the bar code can be captured as the second scan data set and cross-referenced to a stored value. In the color matching example, one possible implementation requires that a significant portion of the bar code be identified by the stitching methodologies disclosed herein; that data set can then be compared to a database, and the scanner can validate that, given the complete data string (or a large enough portion of it), the expected color is located at the correct distance or orientation from the bar code.
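As a sketch of that color-matching implementation, assuming the imager also reports the bar code's bounding box within the captured frame, the mean color of a patch at a stored offset from the bar code can be compared to a stored reference; the region size, offset, and tolerance below are illustrative only.

```python
import numpy as np

def color_matches(frame, barcode_box, expected_rgb, offset=(0, 40), size=20, tol=35):
    """Check that the mean color of a patch at a stored offset from the decoded
    bar code's bounding box matches the expected color for that bar code.

    frame       -- H x W x 3 RGB image of the scanned product
    barcode_box -- (x, y, w, h) of the bar code within the frame
    """
    x, y, w, h = barcode_box
    cx = x + w // 2 + offset[0]
    cy = y + h // 2 + offset[1]
    patch = frame[max(cy - size, 0):cy + size, max(cx - size, 0):cx + size]
    if patch.size == 0:
        return False
    mean_rgb = patch.reshape(-1, 3).mean(axis=0)
    return bool(np.all(np.abs(mean_rgb - np.asarray(expected_rgb)) <= tol))
```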
In another embodiment of the invention, the second scan source 98 comprises an imaging scan module adapted to capture alphanumeric characters. The image (second scan data set) obtained by the second scan source 98 can be post-processed and, utilizing optical character recognition (OCR) software stored, for example, in EPROM 124, the alphanumeric characters of the bar code can be identified. In this manner, any characters indiscernible by the first scan source 50 may be cross-referenced or stitched from the second scan source 98.
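A brief sketch of this OCR cross-referencing is given below: the human-readable digits recovered by the OCR software fill in the positions the laser scanner could not discern, and a disagreement between the two sources is reported as a failed read. The '?' placeholder convention and the example strings are assumptions.

```python
import re

def fill_from_ocr(laser_read, ocr_text):
    """Fill characters the first scan source could not discern ('?') using the OCR
    of the bar code's human-readable digits; return None if the sources conflict."""
    ocr_digits = re.sub(r"\D", "", ocr_text)   # keep only the digits from the OCR output
    if len(ocr_digits) != len(laser_read):
        return None
    merged = []
    for laser_ch, ocr_ch in zip(laser_read, ocr_digits):
        if laser_ch == "?":
            merged.append(ocr_ch)              # indiscernible: take the OCR character
        elif laser_ch != ocr_ch:
            return None                        # the two sources disagree
        else:
            merged.append(laser_ch)
    return "".join(merged)

# e.g. fill_from_ocr("01234?678?01", "0 123456 78901") -> "012345678901"
```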
In yet another embodiment of the invention, the second scan source 98 comprises the RFID reader 32 for redundancy. Thus, an imager is not required to generate a second scan data set for cross-referencing. In one example, the RFID reader 32 can be integrated into the laser scanner 52. An RFID tag can be read on a product 28, and the RFID reader can obtain the electronic product code (EPC) as the second scan data set and convert that to an actual bar code for lookup. If the product 28 having an RFID tag also has a bar code on it, the bar code scanner (first scan source 50) will read that bar code 30 and also have that data set available to be cross-referenced with the bar code generated by the RFID reader 32. In this manner, a measure of redundancy is added in the event the first scan source 50 cannot obtain a good read.
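A minimal sketch of the RFID cross-check follows. The EPC-to-bar-code conversion is passed in as a helper (for example, GS1 SGTIN decoding), since the disclosure does not detail it; the function names are assumptions.

```python
def crosscheck_rfid(laser_barcode, epc, epc_to_barcode):
    """Cross-reference the bar code read by the first scan source with the bar code
    derived from the RFID tag's electronic product code (EPC)."""
    rfid_barcode = epc_to_barcode(epc)       # assumed helper performing the EPC conversion
    if laser_barcode is None:
        return rfid_barcode                  # laser read failed: use the RFID-derived code
    if laser_barcode == rfid_barcode:
        return laser_barcode                 # both sources agree
    return None                              # mismatch: flag the item for review
```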
In yet another embodiment of the invention, the first scan source 50 is the weight scale module 48 and the second scan source 98 is a multiple pixel image sensor assembly 100. The image sensor assembly 100 can perform a multiplicity of operations, depending upon the software loaded therein, such as decoding the bar code 30, pattern matching, or color matching. In one example, the image sensor assembly 100 can perform decoding operations on the bar code 30 and, once the product 28 is identified, the measured weight of the product can be compared to the weight of the item stored in a memory location, such as storage memory device 126 or RDBMS server 40.
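The weight cross-check could be sketched as below, with the stored weights and tolerance being hypothetical values standing in for the records held in storage memory device 126 or the RDBMS server 40.

```python
# Hypothetical stored product weights, in grams, keyed by decoded bar code.
WEIGHT_DB = {"012345678905": 453.0, "036000291452": 310.0}

def weight_matches(decoded_barcode, measured_weight_g, tolerance_g=25.0):
    """Compare the weight reported by the scale module against the stored weight
    for the product identified by the decoded bar code."""
    expected = WEIGHT_DB.get(decoded_barcode)
    if expected is None:
        return True  # no stored weight to cross-check against
    return abs(measured_weight_g - expected) <= tolerance_g
```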
The above-described invention is not limited to two scan sources. As many differing-technology scan sources as practical may be utilized to achieve the desired redundancy or aggressive processing speed. In another embodiment, three scan sources are utilized comprising a laser scanner, a multiple pixel image sensor, and OCR. In one example, a first scan data set comprises four characters of a UPC bar code obtained from the laser scanner. A second scan data set obtained from the imager comprises seven characters of the bar code, and a third scan data set comprises five characters of the bar code obtained from OCR software. Using stitching techniques, the three data sets can be compared and overlaps identified in order to acquire the twelve-character UPC code. In another example, three complete data sets are cross-referenced for redundancy. In the event all three data sets do not match exactly, algorithms may be used to choose which characters belong to the code (for example, two out of three matching).
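A simple sketch of such a character-by-character vote across three redundant data sets (for example, two out of three matching) is shown below; the '?' placeholder convention and the example strings are assumptions.

```python
from collections import Counter

def vote(reads, length=12):
    """Choose each character by majority vote across redundant data sets (e.g., laser,
    imager, and OCR reads), ignoring '?' placeholders for undecoded positions."""
    result = []
    for i in range(length):
        votes = Counter(r[i] for r in reads if i < len(r) and r[i] != "?")
        if not votes:
            return None                      # no source decoded this position
        result.append(votes.most_common(1)[0][0])
    return "".join(result)

# Two of the three sources agree on the disputed character:
print(vote(["012345678901", "012349678901", "01234567890?"]))  # -> "012345678901"
```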
One of the improvements of the present disclosure is that the first-pass read rate is increased. By utilizing custom stitching techniques from multiple data sets obtained from scan sources of differing technologies, partial bits of information can be combined in hybrid fashion to obtain a good read on the first pass, rather than having a decode error and resorting to manual input.
Further, retail theft can be reduced because a scanned bar code can be cross-referenced to other identifying information to assure the bar code matches the correct product.
A system for decoding an encoded symbol character associated with a product is provided. The system includes a bioptic scanning apparatus comprising a first scan source disposed within a housing, and a second scan source disposed within the housing. The second scan source comprises an operating technology distinct from an operating technology of the first scan source. The first scan source is adapted to output a first scan data set, and the second scan source is adapted to output a second scan data set. At least one of the first scan data set and the second scan data set comprises product bar code scan data. The bioptic scanning apparatus further includes a central processing unit adapted to execute a bar code decoding process by cross-referencing the first scan data set and the second scan data set. The system further includes a memory coupled to the central processing unit. In one embodiment, the first scan source is a laser scanner, and the second scan source is a multiple pixel image sensor assembly. In one example, the multiple pixel image sensor assembly is adapted to capture an image of the encoded symbol character, the second scan data set comprises bar code data, and the central processing unit is adapted to execute a bar code decoding process by stitching the first scan data set and the second scan data set.
While the present invention has been described with reference to a number of specific embodiments, it will be understood that the true spirit and scope of the invention should be determined only with respect to claims that can be supported by the present specification. Further, while in numerous cases herein wherein systems and apparatuses and methods are described as having a certain number of elements it will be understood that such systems, apparatuses and methods can be practiced with fewer than the mentioned certain number of elements. Also, while a number of particular embodiments have been described, it will be understood that features and aspects that have been described with reference to each particular embodiment can be used with each remaining particularly described embodiment.
A sample of systems and methods that are described herein follows:
A1. A system for decoding an encoded symbol character associated with a product, the system comprising:
a bioptic scanning apparatus comprising a first scan source disposed within a housing; a second scan source disposed within the housing, the second scan source comprising an operating technology distinct from an operating technology of the first scan source; the first scan source adapted to output a first scan data set; the second scan source adapted to output a second scan data set; and further comprising a central processing unit adapted to execute a bar code decoding process by cross-referencing the first scan data set and the second scan data set;
a memory coupled to the central processing unit;
wherein at least one of the first scan data set and the second scan data set comprises product bar code scan data.
A2. The system of A1 wherein the first scan source is a laser scanner, the second scan source is a multiple pixel image sensor assembly, and the first scan data set comprises bar code data.
A3. The system of A2 wherein the multiple pixel image sensor assembly is adapted to capture an image of the encoded symbol character, the second scan data set comprises bar code data, and the central processing unit is adapted to execute a bar code decoding process by stitching the first scan data set and the second scan data set.
A4. The system of A1 wherein the first scan data set comprises product bar code scan data, the second scan source comprises a multiple pixel image sensor assembly adapted to capture an image of alphanumeric characters associated with the product bar code, the second scan data set comprises alphanumeric characters associated with the product bar code, and the central processing unit is adapted to access optical character recognition software to compare the product bar code scan data with the alphanumeric characters.
A5. The system of A4, wherein the first scan source is a laser scanner.
A6. The system of A1, wherein the first scan data set comprises product bar code scan data and the second scan source comprises a radio frequency identification reader.
A7. The system of A6, wherein the second scan data set comprises bar code data associated with the first scan data set stored in the memory of the system, the second scan data set obtained from associating an electronic product code output by the radio frequency identification reader.
A8. The system of A1 wherein the first scan data set comprises product bar code scan data and the second scan source comprises a multiple pixel image sensor assembly adapted to capture and output as the second scan data set color images associated with the product, the central processing unit adapted to compare the color images output from the multiple pixel image sensor assembly with product color images associated with the first scan data set stored in the memory of the system.
A9. The system of A1 wherein the first scan data set comprises product bar code scan data and the second scan source comprises a multiple pixel image sensor assembly adapted to capture and output as the second scan data set pattern images associated with the product, the central processing unit adapted to compare the pattern images output from the multiple pixel image sensor assembly with product pattern images associated with the first scan data set stored in the memory of the system.
A10. The system of A1 wherein the first scan data set comprises product bar code scan data and the second scan source comprises a weight scale module adapted to output as the second scan data set a weight associated with the product, the central processing unit adapted to compare the weight of the product with product weights associated with the first scan data set stored in the memory of the system.
A11. The system of A1 wherein the housing comprises a horizontal section integrally connected to a vertical section, the horizontal section comprising a horizontal-scanning window formed therein, the first scan source aligned with the horizontal-scanning window, the vertical section comprising a vertical-scanning window substantially orthogonal to the horizontal-scanning window, the second scan source aligned with the vertical-scanning window.
B1. A method for decoding optical indicia, comprising the steps of:
providing a bioptic scanning apparatus having a first scan source and a second scan source, the second scan source comprising an operating technology distinct from an operating technology of the first scan source;
scanning with the first scan source an optical indicia affixed to a product;
generating a first scan data set from the first scan source;
scanning the product with the second scan source;
generating a second scan data set from the second scan source;
combining the first scan data set with the second scan data set; and
decoding the optical indicia from the combined first scan data set and second scan data set.
B2. The method of B1, wherein the step of scanning the product with the second scan source comprises scanning the optical indicia on the product.
B3. The method of B2, wherein the first scan source is a laser scanner and the second scan source is a multiple pixel image sensor assembly.
B4. The method of B2, wherein the step of combining the first scan data set with the second scan data set comprises combining a portion of the first scan data set with a portion of the second scan data set.
B5. The method of B4, wherein the combining step is stitching.
B6. The method of B1, wherein the step of combining the first scan data set with the second scan data set comprises cross-referencing the second scan data set with the first scan data set for redundancy.
B7. The method of B1, wherein the step of scanning the product with the second scan source comprises generating an image of the product.
B8. The method of B7, wherein the step of combining the first scan data set with the second scan data set comprises comparing the second scan data set with pre-stored product features associated with the first scan data set.
B9. The method of B8, wherein the pre-stored product features are dimensions.
B10. The method of B8, wherein the pre-stored product features are colors.
B11. The method of B8, wherein the pre-stored product features are patterns.
B12. The method of B1, wherein the step of scanning the product with the second scan source comprises scanning the product with a radio frequency identification reader.
B13. The method of B12, wherein the radio frequency identification reader obtains an electronic product code of the product, converts the electronic product code to a bar code as the second scan data set, and compares the bar code to the first scan data set.
B14. The method of B1, wherein the step of decoding the optical indicia from the combined first scan data set and second scan data set comprises determining which scan data set decoded the optical indicia quickest.
The present application claims the benefit of U.S. patent application Ser. No. 14/222,994 for a Method and Apparatus for Reading Optical Indicia Using a Plurality of Data Sources filed Mar. 24, 2014 (and published Jul. 24, 2014 as U.S. Patent Application Publication No. 2014/0203087), now U.S. Pat. No. 9,033,240, which claims the benefit of U.S. patent application Ser. No. 13/359,087 for a Method and Apparatus for Reading Optical Indicia Using a Plurality of Data Sources filed Jan. 26, 2012 (and published Aug. 2, 2012 as U.S. Patent Application Publication No. 2012/0193416), now U.S. Pat. No. 8,678,286, which claims the benefit of U.S. Patent Application No. 61/438,075 for a Method and Apparatus for Reading Optical Indicia Using a Plurality of Data Sources filed Jan. 31, 2011. Each of the foregoing patent applications, patent publication, and patents is hereby incorporated by reference in its entirety.