System and method for improved reading of data from reflective surfaces of electronic devices

Information

  • Patent Grant
  • Patent Number
    9,594,936
  • Date Filed
    Wednesday, November 4, 2015
  • Date Issued
    Tuesday, March 14, 2017
Abstract
Disclosed are systems and methods for a data reader operable to capture one or more images from items having a highly, or relatively highly, reflective surface. The data reader includes a controller/processor in operative communication with an imager and an illumination system, where the controller/processor is programmed to selectively operate the imager and the illumination system to interleave the data reader between a first reading period for reading items having a surface with little or no reflectivity and a second reading period for reading items having a surface with high reflectivity. In some embodiments, the items with highly reflective surfaces may include electronic devices, such as mobile phones.
Description
BACKGROUND

The field of the disclosure relates generally to data reading devices, and particularly, to improved data reading devices for reading data from a reflective surface of electronic devices.


Optical codes, such as barcodes and other machine-readable indicia, appear in a variety of applications. There are a variety of forms, such as: linear barcodes (e.g., UPC code), 2D codes including stacked barcodes (e.g., PDF-417 code), and matrix codes (e.g., Datamatrix code, QR code, or Maxicode). There are several types of data readers used for reading these optical codes. The most common types of optical code readers are laser scanners and imaging readers. A laser scanner typically moves, i.e., scans, a laser light beam across the barcode. Imaging readers are typically used to capture a 2D image of an area, including the optical code or other scene, focused onto a detector array such as a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) imager. With some such imaging readers, it may be advantageous to provide a source of illumination that illuminates the optical code or other scene being imaged, to provide the required signal response in the imaging device. Such a source of illumination can reduce exposure time, thereby improving imager performance, especially in low ambient light conditions and when imaging moving items.


Typically, in a grocery or retail establishment, optical codes are often printed directly on items or printed on a sticker that is thereafter affixed to the item. These optical codes are usually printed or located on surfaces with little or no reflectivity so that illumination from a data reading device is not reflected back toward the data reading device, which may render the image obtained by the data reader difficult to process.


Businesses have begun sending optical codes to customers who display such optical codes on a portable electronic device, such as a mobile telephone or cell phone, personal digital assistant, palm, tablet, or laptop computer, or other suitable device having an electronic display, such as a liquid crystal display (LCD). For example, an airline passenger may display an optical code on a portable electronic device for an airline employee to read using a data reader as verification of the passenger's ticket. Or, a customer in a store may display an optical code on a portable electronic device for a cashier to read using a data reader to redeem a coupon. Optical codes are also included on other items having highly, or relatively highly, reflective surfaces, for example, but not limited to, identification (ID) cards, aluminum cans, and objects in plastic packaging.


The present inventors have recognized that optical codes presented on, or under, a highly, or relatively highly, reflective surface are typically difficult to decode using general-purpose data readers. For example, the present inventors have recognized that general-purpose data readers commonly use artificial illumination to illuminate an object bearing an optical code to create an image of the optical code having sufficient contrast for decoding the optical code. The present inventors have also recognized that highly, or relatively highly, reflective surfaces bearing optical codes commonly reflect a large amount of such artificial illumination resulting in a saturated, or partially saturated, image that does not have sufficient contrast for decoding the optical code because all, or portions, of the image appear light, or white. However, simply eliminating the artificial illumination is not a practicable solution since the data reader may not otherwise have sufficient illumination to read optical labels from non-reflective items, which would likely be the most common use of the data reader.


Other general-purpose data readers may be capable of detecting the presence of an electronic device or other reflective surface by detecting the amount of light reflected toward the data reader. In some systems, the data reader may attempt to switch from a primary data reading mode (e.g., reading data from items having non-reflective surfaces or surfaces with relatively low reflectivity) to a secondary reading mode (e.g., reading data from items having highly reflective surfaces) in response to detecting the electronic device. However, many existing data readers have difficulty reliably detecting the presence of an electronic device. In such cases, the data reader may improperly switch to the secondary reading mode, thereby being unable to read normal optical codes on non-reflective surfaces, or may fail to properly switch over when presented with an electronic device, thereby being unable to read data from highly-reflective surfaces of the electronic device.


Still other data readers attempt to divide the imager between the primary and secondary reading modes, dedicating a specific percentage of the imager exclusively to each reading mode. Accordingly, the imager includes an area specifically dedicated to detecting an electronic device and reading data therefrom, and an area specifically dedicated to reading data from non-reflective surfaces. However, the present inventors have recognized that a disadvantage of this configuration is that the data reader dedicates significant resources for reading data from electronic devices regardless of whether such a device is present or not, which detrimentally affects overall performance of the data reader by reducing resources that may be used for reading data from non-reflective surfaces, which as noted previously, is likely the most common use of the data reader.


Thus, the present inventors have identified a need for a general-purpose data reader that has improved versatility in handling reading of optical codes appearing on (or behind) highly, or relatively highly, reflective surfaces, as well as reading optical codes appearing on surfaces having little or no reflectivity.


SUMMARY

Methods and systems are disclosed for improved reading of optical codes on highly reflective surfaces, such as a display on a mobile phone or other electronic device.


In one example system/method, the data reader includes an imager having a first portion for reading optical codes from electronic devices or from other highly reflective surfaces, and a second portion dedicated to reading optical codes from non-reflective surfaces or surfaces having little reflectivity. The data reader includes an illumination module and a processor designed to control the illumination output during the reading process, where the processor controls the output depending on whether the data reader is obtaining data from a highly reflective surface or from a non-reflective surface (or surface with low reflectivity). In some embodiments, the processor interleaves the first portion of the imager to alternate between reading periods for highly reflective and non-reflective surfaces. By interleaving the first portion of the imager, the data reader may be able to quickly and efficiently read data from a variety of surfaces while also minimizing reflectivity issues.


Additional aspects and advantages will be apparent from the following detailed description of example embodiments, which proceeds with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically illustrates a data reader reading an optical code displayed on a display screen of an electronic device.



FIG. 2 is a diagrammatic view of a data reader according to an example embodiment.



FIG. 3 is a diagrammatic top view of the data reader of FIG. 2.



FIG. 4 is a diagrammatic view illustrating an example of imager view allocation for the data reader of FIG. 2.



FIG. 5 is a diagrammatic view of the data reader of FIG. 2, illustrating an example reading process from a reflective surface of an electronic display.



FIG. 6 is a diagrammatic view of the data reader of FIG. 2, illustrating an example reading process from a non-reflective surface or other surface with low reflectivity.



FIGS. 7-8 are diagrams illustrating relative timing of imager frame exposure and illumination pulses of the data reader of FIG. 2.





DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

With reference to the drawings, this section describes particular embodiments and their detailed construction and operation. The embodiments described herein are set forth by way of illustration only and not limitation. The described features, structures, characteristics, and methods of operation may be combined in any suitable manner in one or more embodiments. In view of the disclosure herein, those skilled in the art will recognize that the various embodiments can be practiced without one or more of the specific details or with other methods, components, materials, or the like. For the sake of clarity and conciseness, certain aspects of components or steps of certain embodiments are presented without undue detail where such detail would be apparent to those skilled in the art in light of the teachings herein and/or where such detail would obfuscate an understanding of more pertinent aspects of the embodiments.


In the following description of the figures and any example embodiments, the system may be referred to in conjunction with use at a retail establishment. It should be understood that such use is merely one example use for such a system. Other uses for a system with the characteristics and features described herein may be implemented, for example, in an industrial location such as a parcel distribution (e.g., postal) station or for processing inventory, as well as other suitable uses that may involve reading optical codes from electronic devices or other reflective surfaces. In addition, for convenience, certain embodiments may refer to the data reader operable for capturing optical codes from a mobile or cellular phone. It should be understood that this is merely one example embodiment and use of the system with the features and functionalities described herein. The system may be used to capture optical codes from any suitable device or product having a reflective surface.


For convenience, the following description may at times refer to the data reader having a normal label reading mode or period and an electronic device reading mode or period. References to the normal label reading mode or period may refer to instances where the data reader is used to obtain images from items having surfaces with little or no reflectivity such that specular reflection issues do not meaningfully interfere with an ability of the data reader to capture a decodable image (i.e., reflected illumination does not saturate the imager). References to the electronic device reading mode or period may refer to instances where the data reader is used to obtain images from items having surfaces with high reflectivity, and in some cases self-illuminating displays, such as electronic devices with LCD display screens, where reflected illumination may saturate the imager and interfere with an ability of the data reader to capture a decodable image from an electronic device. These references are meant to establish a frame of reference for convenience purposes and are not intended to otherwise limit the disclosure.


Collectively, FIGS. 1-8 illustrate embodiments of a data reader 10 that may be used in a typical checkout process, such as in a retail store or supermarket, to read optical codes or other data from typical grocery items and from electronic devices regardless of the reflectivity characteristics of the surface on which the optical label or data is affixed or otherwise presented. Since many customers nowadays use their mobile phones or other electronic devices to carry coupons, loyalty cards, identification cards, credit cards, or other items that may be part of a typical checkout transaction, it is advantageous for a single data reader 10 to have the capability of reading data from a variety of sources, including from screens and electronic devices that may have highly reflective surfaces.


For example, with reference to FIG. 1, in one embodiment, the data reader 10 is capable of easily and efficiently reading the optical label or other data 12 from a display screen 14 of an electronic device 16, while minimizing or entirely avoiding specular reflection issues that may arise due to the high reflectivity of an exterior surface 18 of the display screen 14. The data reader 10 is also easily able to read an optical label or barcode (not shown) affixed to a typical grocery item, such as a cereal box or the like, where specular reflection may not be an issue because the optical label is typically printed on or otherwise affixed on a surface with little or no reflectivity. As is explained in more detail below, the data reader 10 may be used during a typical checkout process to read optical codes or other data from normal items 50 (see FIG. 6) and also from an electronic device 16 without requiring the operator to cycle between different readers or manually toggle between different reading modes of a data reader. Additional details of these and other embodiments of the data reader 10 are described herein with reference to the figures.



FIG. 1 is a diagrammatic view of a data reader 10 in accordance with one embodiment shown reading a barcode 12 displayed on the display screen 14 of a PDA (personal digital assistant) or cell phone 16 (a smartphone being illustrated). The data reader 10 is illustrated as a Magellan 3300HSi model bar code reader available from Datalogic of Eugene, Oreg. (U.S.A.), but any suitable imaging reader may be employed. The data reader 10 is schematically depicted as a horizontal single plane scanner suitable for reading optical codes, symbols, or other items. The data reader 10 is illustrated, by way of example, as a single window reader, but in other embodiments may be a presentation scanner, a multi-window reader, or may be arranged in any one of a variety of suitable configurations. The reader 10 may be configured as a fixed unit (mountable to a support surface or free standing on a horizontal surface) or a handheld unit. The reader 10 may alternately be configured as a combined handheld/fixed unit, e.g., one that may rest/be self-supporting upon a horizontal surface but be grasped by the user and moved to aim toward an item to be read.



FIGS. 2 and 3 are diagrams illustrating elements of the data reader 10 inside a housing 20 according to one embodiment. The data reader 10 includes an elongated window 22 and an illumination system comprising one or more illumination sources 24, 26 (illustrated in FIG. 3 as a row of grouped LEDs) for illuminating the read region 28 in front of the window 22. The data reader 10 includes an imager 32, which is shown mounted on a printed circuit board 42 disposed adjacent a bottom of the reader housing 20. The imager 32 may be a complementary metal oxide semiconductor (CMOS) imager, a semiconductor charge-coupled device (CCD) imager, or comprise other suitable imaging technology. In one embodiment, the CMOS imager may comprise an active-pixel imaging sensor with a global shutter, such as a model EV76C570 sensor sold by e2v technologies plc of Chelmsford, England, or may operate with a rolling shutter.


The data reader 10 may acquire an image of the read region 28 using any one of a variety of mirror configurations. In one embodiment, an image of the read region 28 may be divided into a first portion 28a and a second portion 28b, each of which may be reflected off a series of mirrors (or other optical components) toward focusing optics 30, which in turn focuses the portions 28a, 28b onto the imager 32. For example, the first portion 28a may be reflected sidewardly by a first mirror 34 toward a second mirror 36, which directs the first portion 28a toward the focusing optics 30. Similarly, the second portion 28b may be reflected sidewardly by a third mirror 38 toward a fourth mirror 40, which directs the second portion 28b toward the focusing optics 30. In other embodiments, the mirrors may be arranged in a different configuration suitable to provide an image of the read region 28 onto the imager 32.


Preferably, the mirrors are arranged so that at least one portion (e.g., 28c in FIG. 2) of one of the fields-of-view (e.g., view 28b in FIG. 2) is directed at an angle α that is perpendicular or substantially close to perpendicular (for example, ranging from 70° to 90°) with reference to the window 22, and another portion (e.g., 28d in FIG. 2) of the view 28b is directed at an angle β ranging from 60° to 90° to provide optimal viewing conditions to allow the data reader 10 to acquire an image when an electronic device 16 having a highly reflective screen surface is presented.


The imager 32, alone or together with logic components such as a complex programmable logic device (CPLD) or a field-programmable gate array (FPGA), is coupled to a controller or processor 44, which, among other functions, is preferably programmed to control operating parameters of the data reader 10 as discussed in further detail below. Processor 44 is also preferably programmed to read and decode optical codes or other symbols or imaged items. The processor 44 may comprise any suitable digital processor, such as a low-power DSP core or ARM core processor. In some embodiments, processor 44 comprises an ARM9 processor AT91SAM9G20 sold by Atmel of San Jose, Calif., USA, an OMAP processor sold by Texas Instruments of Dallas, Tex., USA, or an i.MX1 series processor (such as the MC9328MX1 processor) sold by Freescale Semiconductor, Inc., of Austin, Tex., USA. Alternately, multiple processors or sub-processors or other types of processor electronics such as comparators or other specific function circuits may be used alone or in combination. For the purposes of this description, the term processor is meant to include any of these combinations.


In one embodiment, the processor 44 and on-board memory 46 are mounted on PCB 42 adjacent the imager 32, and are operable for controlling operation of the imager 32 and other reader components. The memory 46 may be flash memory, random access memory, or other suitable memory in communication with the processor 44. In some embodiments, memory 46 may be integrated with processor 44.


As mentioned previously, the data reader 10 includes illumination sources 24, 26 to illuminate the optical code on the item presented to the data reader 10. In one embodiment, the illumination sources 24, 26 comprise a collection of LEDs, for example, infrared or visible spectrum LEDs, but may alternatively comprise another suitable light source, such as a lamp or laser diode. The illumination sources 24, 26 may be coupled to and controlled by the processor 44 or may be remotely mounted and powered.


With reference to FIG. 4, in one embodiment, the imager 32 may be a two megapixel imager array that is divided into three views, labeled Views 1, 2, and 3. In this configuration, Views 2 and 3 may be dedicated for normal label reading of items 50 (see FIG. 3) having little or no specular reflection issues (e.g., items with little or no surface reflectivity) for every frame. As noted previously, in a typical checkout process, most of the items (e.g., grocery items) will likely have no specular reflection issues. Accordingly, in one embodiment, a larger share of the imager array 32 (e.g., approximately ⅔ of the imager) may be dedicated specifically for capturing images of these items 50. To accommodate data reading from an electronic device 16 or other surfaces having high reflectivity, View 1 may be interleaved to alternate between normal label data reading (i.e., reading of items 50 with little or no specular reflection issues) and an electronic device reading mode (i.e., reading of items 16 and surfaces with high reflectivity). As is described in further detail below with particular reference to FIGS. 5-8, the processor 44 may control the activation/pulsing frequency of the illumination sources 24, 26 and imager exposure timing to allow the data reader 10 to automatically transition between the normal label reading mode and the electronic device reading mode to obtain data from a variety of items having different surface reflectivity.
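The view allocation described above can be sketched in code. This is an illustrative model only, not the patent's implementation: the resolution split, region-of-interest layout, and function names are assumptions for illustration.

```python
# Illustrative sketch of dividing a ~2 MP imager into three views,
# with Views 2 and 3 dedicated to normal label reading every frame
# and View 1 interleaved between the two reading modes.
# All dimensions and ROI splits are assumed, not from the patent.

IMAGER_WIDTH, IMAGER_HEIGHT = 1600, 1200  # assumed ~2 MP array
BAND = IMAGER_HEIGHT // 3                 # one horizontal band per view

VIEWS = {
    1: {"roi": (0, 0 * BAND, IMAGER_WIDTH, BAND), "interleaved": True},
    2: {"roi": (0, 1 * BAND, IMAGER_WIDTH, BAND), "interleaved": False},
    3: {"roi": (0, 2 * BAND, IMAGER_WIDTH, BAND), "interleaved": False},
}

def mode_for_view(view, frame_index, ratio=2):
    """Return the reading mode of a view on a given frame.

    Dedicated views always read normal labels; with ratio=2 the
    interleaved view spends every other frame in electronic-device
    mode (a 1:1 interleave).
    """
    if VIEWS[view]["interleaved"] and frame_index % ratio == ratio - 1:
        return "electronic_device"
    return "normal_label"
```

Under this sketch, roughly two-thirds of the array is always available for normal labels, matching the patent's point that ordinary grocery items are the dominant case.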



FIG. 5 is a diagrammatic view of the data reader 10 illustrating an example electronic device reading scenario for reading data from the reflective surface 18 of the self-illuminating display screen 14 of the electronic device 16. FIG. 6 is a diagrammatic view of the data reader 10 illustrating an example scenario where the data reader 10 is capable of reading optical codes from an item 50 having a non-reflective surface or surface with low reflectivity when the data reader 10 is in the electronic device reading mode. While the data reader 10 has been described previously as capable of operating in both the electronic device reading mode and a normal label reading mode, the processor 44 is capable of controlling illumination sources 24, 26 and exposure times of the imager 32 so that the data reader 10 is not only able to capture data from electronic devices during the electronic device reading mode, but is also operable to capture data from normal items 50 that do not have highly reflective surfaces during the same electronic device reading mode. For example, when illumination sources 24, 26 are activated, Views 1, 2, and 3 are optimized to read normal labels. When only one of the illumination sources (e.g., illumination source 24 in FIG. 5) is activated, Views 2 and 3 are optimized for normal label reading, and View 1 is optimized for cell phone reading as shown in FIG. 5. FIGS. 5 and 6 illustrate an embodiment of how this process operates. Additional details regarding specific control protocols by the processor 44 for controlling the illumination sources 24, 26 and exposure time of the imager 32 are further described with reference to FIGS. 7-8 below.



With particular reference to FIG. 5, the illumination sources 24 of the data reader 10 are preferably arranged so that when the illumination sources 24 are activated, a first travel path 48 of the illumination generated from the illumination sources 24 is generally directed at an acute angle θ relative to the window 22 so that at least a substantial portion, preferably the entirety, of the illumination generated from the illumination sources 24 is reflected off the self-illuminating display screen 14 of the electronic device 16 and travels along a second travel path 50 away from the read region 28 so that the illumination is not detected by the imager 32. Without the additional illumination from the illumination source 24 (which may otherwise saturate the imager 32), the imager 32 is able to obtain an image of the optical code 12 from the display screen 14 using the backlight illumination generated by the electronic device 16. In this embodiment, the illumination source 26 is not activated during the exposure time of the imager 32 because most of the illumination from illumination source 26 would be reflected back into the imager 32 and impair the obtained image.


It should be understood that the acute angle θ may depend on various factors, such as the number and arrangement of illumination sources 24, and a distance of the illumination sources 24 to the window 22. In addition, different light sources may have different light dispersion fields depending on various factors of the light source. For example, in one embodiment, the dispersion field may be conically shaped. Assuming the dispersion field of the illumination sources 24 is conically shaped, the acute angle θ may be measured based on a center line of the conical dispersion field.



FIG. 6 is a diagrammatic view of the data reader 10 illustrating an example of the data reader 10 obtaining an image from an item 50 having a non-reflective surface or other surface with low reflectivity when the data reader 10 is in the electronic device reading mode. In the same arrangement as described in FIG. 5, the illumination from illumination source 24 travels along the same travel path 48, but because of different specular reflection characteristics from the item 50 with a non-reflective surface 52, at least a portion of the illumination travels along a third travel path 54 and is ultimately detected by the imager 32. Depending on the reflectivity of the surface 52, portions of the illumination traveling along travel paths 56, 58, 60 may or may not also be detected by the imager 32. The reflected light from the non-reflective surface 52 provides sufficient light for the imager 32 to obtain a decodable image of the optical label on the item 50. In some embodiments, to avoid direct reflection of illumination from the window 22 into the imager 32, the illumination source 26 and/or the window 22 may be positioned further away from the imager 32.


The reading scenarios illustrated in FIGS. 5 and 6 represent one embodiment for controlling the illumination sources 24, 26 and exposure time of the imager 32 during a data reading operation, which includes reading data from an LCD screen. In other embodiments, the processor 44 may control the specific activation of the illumination sources 24, 26 and the exposure time to allow the imager 32 to obtain images of the optical codes or other data present in the read region 28 without regard to the surface reflectivity of the item and without having the operator manually toggle between reading modes or periods of the data reader 10. For example, as is described in further detail in FIGS. 7 and 8, during a normal label reading mode, both the illumination sources 24, 26 may be activated during the exposure time of the imager 32 to provide sufficient illumination for the read region 28 so that the imager 32 obtains an adequately illuminated image of the optical code on the item 50.


With collective reference to FIGS. 7-8, the following sections provide additional details relating to control parameters of the processor 44 for controlling illumination and exposure characteristics of the data reader 10 to allow the data reader 10 to seamlessly transition between a normal reading mode (such as reading optical codes from stickers or other surfaces with no reflectivity) and an electronic device reading mode for reading optical labels from electronic devices. As mentioned previously, the processor 44 is operable to control the illumination and exposure times of the data reader 10 to ensure that the illumination and exposure is proper to allow the data reader 10 to capture images from a highly reflective surface, such as electronic device 16, without having the imager 32 become saturated with illumination reflected from the mobile device 16, while also allowing the data reader 10 to properly illuminate optical labels or other data on surfaces with little or no reflectivity to ensure that sufficient illumination is provided to obtain a decodable image. In addition, as mentioned previously with relation to FIG. 4, certain views or sections of the imager array 32 may be specifically dedicated for a normal reading mode, and other views or sections of the imager 32 may be interleaved between the normal reading mode and an electronic device reading mode.



FIGS. 7 and 8 are diagrams illustrating relative timing of imager frame exposure and illumination control for the data reader 10 of FIG. 2 in accordance with one embodiment. With particular reference to FIG. 7, at time T1, the processor 44 activates both the illumination sources 24, 26 to direct light outwardly from the window 22 and illuminate the read region 28. The processor 44 activates the illumination sources 24, 26 at a desired pulse rate and sets an exposure time for the imager 32 substantially equal to the pulse rate so that both illumination sources 24, 26 are active during the full exposure time. Turning back to FIG. 2, when illumination source 24 is activated, the illumination travels generally in a direction of the travel path 48 and at an angle θ relative to the window 22. When illumination source 26 is activated, the illumination travels generally in a direction of a travel path 62 to illuminate the read region 28. Since at least a portion of the illumination generated from the illumination source 26 travels outwardly and generally perpendicularly to the window 22, a substantial portion of the illumination from illumination source 26 may be reflected back toward the imager 32 if an item with a highly reflective surface is present at the read region 28. Accordingly, activating both illumination sources 24, 26 may be useful for items having little or no reflectivity so that the imager 32 is able to capture a proper image without being saturated from reflected illumination. However, having both illumination sources 24, 26 activated for electronic devices would likely saturate the imager 32 due to specular reflection issues as discussed previously.


Turning back to FIG. 7, at time T1, when both illumination sources 24, 26 are activated during the exposure time of the imager 32, the data reader 10 is in a normal label reading mode or period, where Views 2 and 3 (see FIG. 4) of the imager 32 capture an image from items 50 having little or no reflectivity.


At time T2, for View 1, the processor 44 activates both illumination sources 24, 26 and sets an exposure time equal to the pulse rate of the illumination sources 24, 26 in a similar fashion as during T1. As mentioned previously, View 1 is interleaved between a normal data reading mode or period (when both illumination sources 24, 26 are activated) and an electronic device reading mode or period (when only illumination source 24 is activated). At time T2, the processor 44 is operating the illumination sources 24, 26 and setting the exposure time to operate the data reader 10 in a normal reading mode or period for View 1.


At time T3, the processor 44 again activates both illumination sources 24, 26 and sets an exposure time equal to the pulse rate of the illumination sources 24, 26 in a similar fashion as during T1. During time T3, the data reader 10 continues operating in a normal label reading mode or period.


At time T4, the data reader 10 switches to an electronic device reading mode or period. During time T4, the processor 44 controls illumination sources 24, 26 so that illumination source 24 is activated during the exposure time while illumination source 26 is not activated. In this configuration, View 1 captures an image from the electronic device 16 when it is present in the read region 28. Because the illumination source 26 is not active during the exposure time, the illumination from illumination source 26 does not saturate the image obtained by the imager 32 in View 1. As was described previously with respect to FIGS. 5 and 6, when the data reader 10 operates in an electronic device reading mode or period, the illumination from illumination source 24 is mostly (if not entirely) reflected away from the imager 32 due to the angle θ at which the illumination source 24 is directed toward the reflective surface 18 of the electronic device 16. Accordingly, the imager 32 is able to capture a properly illuminated image (using ambient light and illumination from the backlit display screen of the electronic device 16) while avoiding light saturation issues since little or no illumination is reflected back into the imager 32 from illumination source 24 and illumination source 26 is inactive during time T4. Moreover, as described with reference to FIG. 6, when a normal item 50 is presented to the data reader 10 during time T4, a sufficient amount of illumination from illumination source 24 is reflected back toward the imager 32 to allow the data reader 10 to capture an image of the normal item 50 when the data reader 10 is optimized for reading data from an electronic device. Accordingly, the control of the illumination source 24, 26 allows the data reader 10 to operate to read normal items 50 even when specifically configured to read an optical label 12 from an electronic device 16.


In some embodiments, as illustrated in FIG. 7, the exposure time during time T4 may be lengthened (as compared to the exposure time during the normal item reading mode) to provide sufficient time for the imager 32 to properly capture an image of the optical code 12 from the electronic device 16. In some embodiments, the illumination source 24 may be activated by the processor 44 for only a portion of the exposure time, where the ambient illumination from the electronic device 16 may also serve to sufficiently illuminate the read region 28 without the need for additional illumination from the data reader 10.
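The per-frame control described for times T1 through T4 can be sketched in code. The mode names, timing values, and the duty-cycle parameter below are illustrative assumptions for the sketch, not values taken from the disclosure.

```python
from dataclasses import dataclass

# Sketch of the per-frame settings described for FIG. 7: in the normal item
# reading mode both sources fire for the full exposure; in the electronic
# device reading mode only the angled source 24 fires, the exposure is
# lengthened, and source 24 may be lit for only part of the exposure.

@dataclass
class FrameConfig:
    exposure_ms: float           # imager exposure time for this frame
    source_24_on: bool           # angled source (reflected away by a display)
    source_26_on: bool           # direct source (would saturate a display)
    source_24_duty: float = 1.0  # fraction of the exposure source 24 is lit

# Illustrative timing values only.
NORMAL_MODE = FrameConfig(exposure_ms=0.5, source_24_on=True, source_26_on=True)
DEVICE_MODE = FrameConfig(exposure_ms=2.0, source_24_on=True,
                          source_26_on=False, source_24_duty=0.5)

def configure_frame(mode: str) -> FrameConfig:
    """Return the illumination/exposure settings for the requested mode."""
    return DEVICE_MODE if mode == "device" else NORMAL_MODE
```

A scheduler would apply `configure_frame("device")` on interleaved frames such as T4, leaving source 26 dark so no reflected illumination reaches the imager.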


The same process described during times T1 through T4 may be repeated at times T5 through T8 (and so on) in a similar fashion to interleave the data reader 10 between a normal item reading mode and an electronic device reading mode.


It should be understood that the scenario described with reference to FIG. 7 is only one embodiment. In other embodiments, the processor 44 may be programmed to operate at any desired interleaving ratio for View 1. For example, as described, the processor 44 interleaves View 1 between normal item reading mode and an electronic device reading mode in a 1:1 ratio, that is, View 1 alternates between the normal item reading mode and the electronic device reading mode. For typical checkout processes, this interleaving ratio may be suitable since most, if not all, of the items 50 likely will have no specular reflection issues. However, in other embodiments where the data reader 10 may regularly or primarily read data from electronic devices 16, the interleaving ratio may be greater such that View 1 is more often dedicated to electronic device reading. Conversely, where the data reader 10 does not often read from electronic devices, then the interleaving ratio may be smaller such that View 1 operates in an electronic device reading mode once every three or four cycles, for example.
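The interleaving ratios discussed above can be sketched as a simple frame scheduler. The function name and its parameters are hypothetical illustrations, not part of the disclosure.

```python
def mode_sequence(device_every: int, frames: int) -> list:
    """Return the reading mode for each of `frames` successive frames.

    device_every=2 gives the 1:1 alternation described for FIG. 7;
    device_every=4 dedicates one frame in four to electronic device
    reading, as in the "once every three or four cycles" example.
    """
    return ["device" if (i % device_every) == device_every - 1 else "normal"
            for i in range(frames)]
```

For instance, `mode_sequence(2, 4)` alternates normal and device frames, while a larger `device_every` value suits installations that rarely read from electronic devices.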


In other embodiments, the processor 44 may operate and control the illumination sources 24, 26 and the exposure times in a different manner to effectively alternate between the normal reading and electronic device reading modes. In one embodiment, the processor 44 may operate in a multiple integration mode with a long exposure time, cycling the illumination sources 24, 26 independently of one another during the exposure time.


For example, with reference to FIG. 8, at time T1, the processor 44 may activate illumination source 26 during a first portion of the exposure time (while illumination source 24 is deactivated), and thereafter activate illumination source 24 (while illumination source 26 is deactivated) during a second portion of the exposure time. In this arrangement, the captured frame of the read region 28 corresponds to Views 2 and 3 of the imager 32. The same cycling of illumination sources 24, 26 may be run at times T2 and T3, with the captured frame at time T2 corresponding to View 1 of the imager 32 and the captured frame at time T3 corresponding to Views 2 and 3 of the imager 32. At all three times T1, T2, and T3, the data reader 10 is operating in a normal item reading mode to read optical labels or other data from items 50 with little or no surface reflectivity.


At time T4, the processor 44 alternates to the electronic device reading mode. In this arrangement, the processor 44 activates illumination source 24 during a first portion of the exposure time, and thereafter deactivates illumination source 24 during a second portion of the exposure time. During time T4, illumination source 26 is not activated at any point during the exposure time of the imager 32 to minimize or avoid reflected illumination into the imager 32 from illumination source 26. As described previously and illustrated in FIG. 5, the illumination from illumination source 24 is substantially (if not entirely) reflected away from the imager 32 when an electronic device 16 is present in the read region 28. The same process described during times T1 through T4 may be repeated at times T5 through T9 in a similar fashion to cycle the data reader 10 between a normal item reading mode and an electronic device reading mode.
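The multiple integration mode of FIG. 8 can be sketched as a schedule of which sources fire during each portion of one long exposure. The source labels and two-portion split are illustrative assumptions for the sketch.

```python
def integration_schedule(mode: str) -> list:
    """Return the set of active illumination sources for each portion of
    one long exposure in the multiple integration mode sketched above.

    A "normal" frame fires source 26 during the first portion and source
    24 during the second (FIG. 8, times T1 and T3); a "device" frame
    fires source 24 first and nothing afterward, with source 26 never
    active at any point during the exposure (time T4).
    """
    if mode == "normal":
        return [{"26"}, {"24"}]
    return [{"24"}, set()]
```

Because source 26 never appears in the device-mode schedule, no portion of the long exposure risks saturating the imager with illumination reflected off a display screen.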


It should be understood that the scenario described with reference to FIG. 8 is only one embodiment. In other embodiments, the processor 44 may be programmed to operate at any desired interleaving ratio for View 1. For example, as described, the processor 44 interleaves View 1 between normal item reading mode and an electronic device reading mode in a 1:1 ratio, that is, View 1 alternates between the normal item reading mode and the electronic device reading mode. For typical checkout processes, this interleaving ratio may be suitable where many of the items 50 may not have specular reflection issues. However, in other embodiments where the data reader 10 may regularly or primarily read data from electronic devices, such as electronic device 16, the ratio may be greater such that View 1 is more often dedicated to electronic device reading. Conversely, where the data reader 10 does not often read from electronic devices, then the interleaving ratio may be smaller such that View 1 operates in an electronic device reading mode once every three or four cycles, for example.


In other embodiments, the processor may be programmed to pulse one or more of the illumination sources to avoid or minimize the perception of illumination flicker by a user or bystander while implementing methods for reading optical codes presented on electronic display screens or other highly reflective surfaces. Additional details relating to example embodiments are described in U.S. Pat. No. 9,122,939, the disclosure of which is incorporated by reference herein in its entirety.


While certain preferred systems and methods have been shown and described, it will be apparent to one skilled in the art that modifications, alternatives and variations are possible without departing from the inventive concepts set forth herein. Therefore, the invention is intended to embrace all such modifications, alternatives and variations.

Claims
  • 1. A system for data reading comprising: a window; an imager for capturing an image of an item bearing an optical code, the imager including an imager array divided into a first portion and a non-overlapping second portion; an illumination system for illuminating the item, the illumination system including a first illumination source and a second illumination source, each of which direct light outwardly through the window and into a read region; and a controller in operative communication with the imager and the illumination system, wherein the controller is programmed to operate the imager and the illumination system during a first reading frame at a first time period and during a second reading frame at a subsequent second time period, wherein during the first time period for the first reading frame, the controller is programmed to activate both the first and second illumination sources during a first exposure period of the imager, the controller operating the imager to capture a first image of a first optical code when a first item is present in the read region, wherein the first portion of the imager is exposed during the first exposure period to capture the first image on the first portion of the imager; and wherein during the second time period for the second reading frame, the controller is programmed to activate the first illumination source during a second exposure period of the imager, and deactivate the second illumination source during the second exposure period, the controller operating the imager to capture a second image of a second optical code when a second item is present in the read region, wherein the second portion of the imager is exposed during the second exposure period to capture the second image on the second portion of the imager.
  • 2. The system of claim 1, wherein the first illumination source is oriented such that illumination generated from the first illumination source is directed toward the window at an acute angle relative to the window, and wherein when the second item is a reflective display of an electronic device, the illumination generated from the first illumination source is reflected off the reflective display such that the imager does not detect the illumination from the first illumination source.
  • 3. The system of claim 2, wherein when the item does not have a reflective display, illumination from the first illumination source is reflected back toward the imager.
  • 4. The system of claim 1, wherein the processor is further programmed to interleave between the first reading period and the second reading period to capture images of items on the first portion of the imager.
  • 5. The system of claim 4, wherein the processor is further configured to operate the data reader in the first reading period to capture images of items on the second portion of the imager.
  • 6. The system of claim 4, wherein the processor is configured to interleave between the first and second reading periods at an interleaving ratio of 1:1.
  • 7. The system of claim 1, wherein during the first reading period, the processor is programmed to pulse the first and second illumination sources substantially simultaneously at a pulse rate at least equal to the first exposure period of the imager.
  • 8. The system of claim 1, wherein during the second reading period, the processor is programmed to pulse the first illumination source at a pulse rate at least equal to the first exposure period of the imager.
  • 9. The system of claim 1, wherein during the first reading period, the processor is programmed to activate the first illumination source and then the second illumination source in sequence during the first exposure period of the imager.
  • 10. A method for data reading using a data reader having a window, an imager for capturing an image of an item, the imager including an imager array divided into a first portion and a non-overlapping second portion, and an illumination system including a first illumination source and a second illumination source, the method comprising the steps of: in a first reading frame at a first time period, activating, via a processor, the first and second illumination sources during a first exposure period of the imager to direct light outwardly from the window and into a read region; exposing, via the processor, the first portion of the imager during the first exposure period; if a first item is present in the read region, capturing, via the imager, a first image of the first item on the first portion of the imager during the first exposure period; and in a second reading frame at a second time period subsequent to the first time period, activating, via the processor, the first illumination source during a second exposure period of the imager to direct light outwardly from the window and into the read region; deactivating, via the processor, the second illumination source during the second exposure period; exposing, via the processor, the second portion during the second exposure period; and if a second item is present in the read region, capturing, via the imager, a second image of the second item on the second portion of the imager during the second exposure period.
  • 11. The method of claim 10, further comprising interleaving, via the processor, between the first reading period and the second reading period to capture images of items on the first portion of the imager.
  • 12. The method of claim 11, further comprising operating, via the processor, the data reader in the first reading period to capture images of items on the second portion of the imager.
  • 13. The method of claim 12, further comprising interleaving, via the processor, between the first and second reading period at an interleaving ratio of 1:1.
  • 14. The method of claim 10, further comprising pulsing, via the processor, the first and second illumination sources substantially simultaneously at a pulse rate at least equal to the first exposure period of the imager during the first reading period.
  • 15. The method of claim 10, further comprising pulsing, via the processor, the first illumination source at a pulse rate at least equal to the first exposure period of the imager during the second reading period.
  • 16. The method of claim 10, further comprising activating, via the processor, the first illumination source and then the second illumination source in sequence during the first exposure period of the imager during the first reading period.
  • 17. A system for data reading comprising: a window; an imager for capturing an image of an item bearing an optical code; an illumination system for illuminating the item, the illumination system including a first illumination source and a second illumination source, each of which direct light outwardly through the window and into a read region, where the first illumination source is oriented such that illumination generated from the first illumination source is directed toward the window at an acute angle relative to the window, and wherein when the second item is a reflective display of an electronic device, the illumination generated from the first illumination source is reflected off the reflective display such that the imager does not detect the illumination from the first illumination source; and a controller in operative communication with the imager and the illumination system, wherein the controller is programmed to operate the imager and the illumination system during a first reading period and a different second reading period, wherein during the first reading period, the controller is programmed to activate both the first and second illumination sources during a first exposure period of the imager, the controller operating the imager to capture a first image of a first optical code when a first item is present in the read region; and wherein during the second reading period, the controller is programmed to activate the first illumination source during a second exposure period of the imager, and deactivate the second illumination source during the second exposure period, the controller operating the imager to capture a second image of a second optical code when a second item is present in the read region.
  • 18. The system of claim 17, wherein the imager is an imager array divided into a first portion and a non-overlapping second portion, and wherein the processor is further programmed to interleave between the first reading period and the second reading period to capture images of items on the first portion of the imager.
  • 19. The system of claim 18, wherein the processor is further configured to operate the data reader in the first reading period to capture images of items on the second portion of the imager.
  • 20. The system of claim 18, wherein the processor is configured to interleave between the first and second reading periods at an interleaving ratio of 1:1.
US Referenced Citations (136)
Number Name Date Kind
4315245 Nakahara et al. Feb 1982 A
4578571 Williams Mar 1986 A
4930884 Tichenor Jun 1990 A
4949391 Faulkerson et al. Aug 1990 A
4969034 Salvati Nov 1990 A
5010241 Butterworth Apr 1991 A
5130856 Tichenor Jul 1992 A
5245168 Shigeta Sep 1993 A
5250791 Heiman et al. Oct 1993 A
5430558 Sohaei et al. Jul 1995 A
5475207 Bobba Dec 1995 A
5572006 Wang et al. Nov 1996 A
5585616 Roxby et al. Dec 1996 A
5648650 Sugifune et al. Jul 1997 A
5701001 Sugifune et al. Dec 1997 A
5705802 Bobba Jan 1998 A
5723868 Hammond Mar 1998 A
5756981 Roustaei et al. May 1998 A
5793031 Tani et al. Aug 1998 A
5837988 Bobba Nov 1998 A
5936218 Ohkawa Aug 1999 A
5949054 Karpen et al. Sep 1999 A
5969325 Hecht et al. Oct 1999 A
6061091 Van de Poel et al. May 2000 A
6065678 Li et al. May 2000 A
6189795 Ohkawa et al. Feb 2001 B1
6230975 Colley et al. May 2001 B1
6254003 Pettinelli et al. Jul 2001 B1
6290135 Acosta et al. Sep 2001 B1
6318635 Stoner Nov 2001 B1
6347163 Roustaei Feb 2002 B2
6462880 Ohkawa et al. Oct 2002 B1
6505778 Reddersen et al. Jan 2003 B1
6561427 Davis et al. May 2003 B2
6568598 Bobba May 2003 B1
6609660 Stoner Aug 2003 B1
6631844 Ohkawa Oct 2003 B1
6708883 Krichever Mar 2004 B2
6749120 Hung et al. Jun 2004 B2
6783068 Hecht Aug 2004 B2
6825486 Cole et al. Nov 2004 B1
6906699 Fahraeus et al. Jun 2005 B1
6974084 Bobba Dec 2005 B2
6980692 Chamberlain Dec 2005 B2
6991169 Bobba Jan 2006 B2
7097102 Patel et al. Aug 2006 B2
7119932 Sato et al. Oct 2006 B2
7131587 He et al. Nov 2006 B2
7147159 Longacre, Jr. et al. Dec 2006 B2
7148923 Harper et al. Dec 2006 B2
7172125 Wang et al. Feb 2007 B2
7178734 Hammer Feb 2007 B1
7198195 Bobba Apr 2007 B2
7204420 Barkan et al. Apr 2007 B2
7212279 Feng May 2007 B1
7227117 Lackemann et al. Jun 2007 B1
7234641 Olmstead Jun 2007 B2
7290711 Barkan Nov 2007 B2
7296744 He Nov 2007 B2
7308375 Jensen et al. Dec 2007 B2
7398927 Olmstead et al. Jul 2008 B2
7450559 Schotten et al. Nov 2008 B2
7461790 McQueen et al. Dec 2008 B2
7475821 Barkan et al. Jan 2009 B2
7490770 Shearin Feb 2009 B2
7490776 Thuries Feb 2009 B2
7494065 Barkan et al. Feb 2009 B2
7527207 Acosta et al. May 2009 B2
7546951 Kotlarsky et al. Jun 2009 B2
7568623 Retter et al. Aug 2009 B2
7762464 Goren et al. Jul 2010 B2
7984854 Nadabar Jul 2011 B2
8089665 Omori Jan 2012 B2
8387881 Van Volkinburg et al. Mar 2013 B2
8573497 Gao et al. Nov 2013 B2
8622299 Crooks et al. Jan 2014 B2
20020039137 Harper et al. Apr 2002 A1
20020067850 Williams et al. Jun 2002 A1
20020070278 Hung et al. Jun 2002 A1
20020084330 Chiu Jul 2002 A1
20030002024 Motegi Jan 2003 A1
20030098352 Schnee et al. May 2003 A1
20030230630 Whipple et al. Dec 2003 A1
20040118928 Patel et al. Jun 2004 A1
20040179254 Lewis et al. Sep 2004 A1
20040246367 Yoshida Dec 2004 A1
20050116044 Zhu Jun 2005 A1
20050135798 Szajewski Jun 2005 A1
20050156047 Chiba et al. Jul 2005 A1
20050253937 Moholt et al. Nov 2005 A1
20060011725 Schnee Jan 2006 A1
20060043194 Barkan et al. Mar 2006 A1
20060118629 Shiramizu et al. Jun 2006 A1
20060163355 Olmstead et al. Jul 2006 A1
20060255150 Longacre, Jr. Nov 2006 A1
20070001015 Suzuki et al. Jan 2007 A1
20070034696 Barkan et al. Feb 2007 A1
20070035718 Haddad Feb 2007 A1
20070040034 Hennick et al. Feb 2007 A1
20070158428 Havens et al. Jul 2007 A1
20070181692 Barkan et al. Aug 2007 A1
20070295812 Mazowiesky Dec 2007 A1
20070295814 Tanaka et al. Dec 2007 A1
20080011855 Nadabar Jan 2008 A1
20080035732 Vinogradov et al. Feb 2008 A1
20080035733 Vinogradov et al. Feb 2008 A1
20080080785 Ford Apr 2008 A1
20080099561 Douma May 2008 A1
20080105745 Lei May 2008 A1
20080191022 Barkan et al. Aug 2008 A1
20080266425 Shurboff Oct 2008 A1
20090001166 Barkan et al. Jan 2009 A1
20090001168 Hammer Jan 2009 A1
20090057412 Bhella et al. Mar 2009 A1
20090108076 Barkan et al. Apr 2009 A1
20090167490 Hayaashi et al. Jul 2009 A1
20100147948 Powell et al. Jun 2010 A1
20100213259 Gao Aug 2010 A1
20100219249 Barkan Sep 2010 A1
20100245270 Nako Sep 2010 A1
20110050650 McGibney Mar 2011 A1
20110157089 Rainisto Jun 2011 A1
20120000982 Gao Jan 2012 A1
20120018516 Gao Jan 2012 A1
20120067956 Gao et al. Mar 2012 A1
20120111944 Gao et al. May 2012 A1
20120138684 Van Volkinburg et al. Jun 2012 A1
20120181338 Gao Jul 2012 A1
20120193429 Van Volkinburg et al. Aug 2012 A1
20120248184 Naito Oct 2012 A1
20130062412 Tan et al. Mar 2013 A1
20130075464 Van Horn et al. Mar 2013 A1
20130082109 Meier et al. Apr 2013 A1
20130134217 Crooks et al. May 2013 A1
20130181055 Liu et al. Jul 2013 A1
20150001294 Zocca et al. Jan 2015 A1
Foreign Referenced Citations (8)
Number Date Country
1178957 Apr 1998 CN
101031930 Sep 2007 CN
101710407 May 2010 CN
0574024 Dec 1993 EP
08-129597 May 1996 JP
2009-544105 Dec 2009 JP
10-2007-0063899 Jun 2007 KR
10-2009-0110245 Oct 2009 KR
Non-Patent Literature Citations (9)
Entry
Hensel Jr., “Cell phone boarding passes going into use here first” http://www.chron.com/disp.story.mpl/front/5349969.html, visited Feb. 19, 2010.
Reardon, “Cell phone as boarding pass,” http://news.cnet.com/8301-10784_3-9896859-7.html, visited Feb. 19, 2010.
TSA Blog, “TSA Paperless Boarding Pass Pilot Expanding,” http://www.tsa.gov/blog/2009/06/tsa-paperless-boarding-pass-pilot.html, visited Feb. 19, 2010.
Mobile Commerce Daily, “Hexaware launches 2D bar code boarding pass service,” http://www.mobilecommercedaily.com/2009/11/30/hexaware-launches-2d-bar-code-boarding-pass-service , visited Feb. 19, 2010.
American Airlines, “Introducing Mobile Boarding Passes,” http://www.aa.com/i18n/urls/mobileBoarding.jsp, visited Feb. 19, 2010.
International Patent Application No. PCT/US2011/049556, International Search Report and Written Opinion, Feb. 9, 2012, 2 pages.
International Search Report, PCT/US2011/060232, Apr. 27, 2012.
International Search Report, PCT/US2011/036295, mailed Nov. 23, 2011.
Datalogic™, Magellan™ 3200Vsl On-Counter Vertical Presentation Scanner, Product Reference Guide, Rev. A. Jul. 2010.