The field of the disclosure relates generally to data reading devices, and particularly, to improved data reading devices for reading data from a reflective surface of electronic devices.
Optical codes, such as barcodes and other machine-readable indicia, appear in a variety of applications. Optical codes take a variety of forms, such as linear barcodes (e.g., UPC codes), 2D codes including stacked barcodes (e.g., PDF-417 codes), and matrix codes (e.g., Datamatrix codes, QR codes, or Maxicode). Several types of data readers are used for reading these optical codes, the most common being laser scanners and imaging readers. A laser scanner typically moves, i.e., scans, a laser light beam across the barcode. Imaging readers typically capture a 2D image of an area, including the optical code or other scene, focused onto a detector array such as a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) imager. With some such imaging readers, it may be advantageous to provide a source of illumination that illuminates the optical code or other scene being imaged, to provide the required signal response in the imaging device. Such a source of illumination can reduce exposure time, thereby improving imager performance, especially in low ambient light conditions and when imaging moving items.
In a grocery or retail establishment, optical codes are typically printed directly on items or on a sticker that is thereafter affixed to the item. These optical codes are usually printed or located on surfaces with little or no reflectivity, so that illumination from a data reading device is not reflected back toward the data reading device, since such reflection may render the image obtained by the data reader difficult to process.
Businesses have begun sending optical codes to customers who display such optical codes on a portable electronic device, such as a mobile telephone or cell phone, personal digital assistant, palm, tablet, or laptop computer, or other suitable device having an electronic display, such as a liquid crystal display (LCD). For example, an airline passenger may display an optical code on a portable electronic device for an airline employee to read using a data reader as verification of the passenger's ticket. Or, a customer in a store may display an optical code on a portable electronic device for a cashier to read using a data reader to redeem a coupon. Optical codes are also included on other items having highly, or relatively highly, reflective surfaces, for example, but not limited to, identification (ID) cards, aluminum cans, and objects in plastic packaging.
The present inventors have recognized that optical codes presented on, or under, a highly, or relatively highly, reflective surface are typically difficult to decode using general-purpose data readers. For example, the present inventors have recognized that general-purpose data readers commonly use artificial illumination to illuminate an object bearing an optical code to create an image of the optical code having sufficient contrast for decoding the optical code. The present inventors have also recognized that highly, or relatively highly, reflective surfaces bearing optical codes commonly reflect a large amount of such artificial illumination resulting in a saturated, or partially saturated, image that does not have sufficient contrast for decoding the optical code because all, or portions, of the image appear light, or white. However, simply eliminating the artificial illumination is not a practicable solution since the data reader may not otherwise have sufficient illumination to read optical labels from non-reflective items, which would likely be the most common use of the data reader.
Other general-purpose data readers may be capable of detecting the presence of an electronic device or other reflective surface by detecting the amount of light reflected toward the data reader. In some systems, the data reader may attempt to switch from a primary data reading mode (e.g., reading data from items having non-reflective surfaces or surfaces with relatively low reflectivity) to a secondary reading mode (e.g., reading data from items having highly reflective surfaces) in response to detecting the electronic device. However, many existing data readers have difficulty reliably detecting the presence of an electronic device. In such cases, the data reader may improperly switch to the secondary reading mode, thereby being unable to read normal optical codes on non-reflective surfaces, or may fail to properly switch over when presented with an electronic device, thereby being unable to read data from highly-reflective surfaces of the electronic device.
Still other data readers attempt to divide the imager between the primary and secondary reading modes, dedicating a specific percentage of the imager exclusively to each reading mode. Accordingly, the imager includes an area specifically dedicated to detecting an electronic device and reading data therefrom, and an area specifically dedicated to reading data from non-reflective surfaces. However, the present inventors have recognized that a disadvantage of this configuration is that the data reader dedicates significant resources to reading data from electronic devices regardless of whether such a device is present. This detrimentally affects overall performance of the data reader by reducing the resources available for reading data from non-reflective surfaces, which, as noted previously, is likely the most common use of the data reader.
Thus, the present inventors have identified a need for a general-purpose data reader that has improved versatility in handling reading of optical codes appearing on (or behind) highly, or relatively highly, reflective surfaces, as well as reading optical codes appearing on surfaces having little or no reflectivity.
Methods and systems are disclosed for improved reading of optical codes on highly reflective surfaces, such as a display on a mobile phone or other electronic device.
In one example system/method, the data reader includes an imager having a first portion for reading optical codes from electronic devices or from other highly reflective surfaces, and a second portion dedicated to reading optical codes from non-reflective surfaces or surfaces having little reflectivity. The data reader includes an illumination module and a processor designed to control the illumination output during the reading process depending on whether the data reader is obtaining data from a highly reflective surface or from a non-reflective surface (or a surface with low reflectivity). In some embodiments, the processor interleaves the first portion of the imager to alternate between reading periods for highly reflective and non-reflective surfaces. By interleaving the first portion of the imager, the data reader may be able to quickly and efficiently read data from a variety of surfaces while also minimizing reflectivity issues.
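As a rough illustration of this interleaving concept, the following sketch (in Python, with hypothetical names and a 3:1 frame ratio assumed for illustration; the disclosure does not specify an implementation) shows how the two imager portions might be assigned reading periods frame by frame, with only the first portion alternating between modes:

```python
# Minimal sketch of interleaving only the first imager portion between reading
# periods; names and the 3:1 ratio are illustrative assumptions, not requirements.

NORMAL = "normal label reading"           # non-reflective or low-reflectivity surfaces
ELECTRONIC = "electronic device reading"  # highly reflective surfaces / displays

def portion_modes(frame_index, normal_frames_per_cycle=3):
    """Return (first_portion_mode, second_portion_mode) for a given frame."""
    second_portion = NORMAL  # the second portion always handles normal labels
    cycle_length = normal_frames_per_cycle + 1
    if frame_index % cycle_length == normal_frames_per_cycle:
        first_portion = ELECTRONIC  # one electronic-device frame per cycle
    else:
        first_portion = NORMAL
    return first_portion, second_portion

# Example: frames 0-2 read normal labels on both portions; on frame 3 the first
# portion switches to the electronic device reading period.
```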
Additional aspects and advantages will be apparent from the following detailed description of example embodiments, which proceeds with reference to the accompanying drawings.
With reference to the drawings, this section describes particular embodiments and their detailed construction and operation. The embodiments described herein are set forth by way of illustration only and not limitation. The described features, structures, characteristics, and methods of operation may be combined in any suitable manner in one or more embodiments. In view of the disclosure herein, those skilled in the art will recognize that the various embodiments can be practiced without one or more of the specific details or with other methods, components, materials, or the like. For the sake of clarity and conciseness, certain aspects of components or steps of certain embodiments are presented without undue detail where such detail would be apparent to those skilled in the art in light of the teachings herein and/or where such detail would obfuscate an understanding of more pertinent aspects of the embodiments.
In the following description of the figures and any example embodiments, the system may be referred to in conjunction with use at a retail establishment. It should be understood that such use is merely one example use for such a system. Other uses for a system with the characteristics and features described herein may be implemented, for example, in an industrial location such as a parcel distribution (e.g., postal) station or for processing inventory, as well as other suitable uses that may involve reading optical codes from electronic devices or other reflective surfaces. In addition, for convenience, certain embodiments may refer to the data reader operable for capturing optical codes from a mobile or cellular phone. It should be understood that this is merely one example embodiment and use of the system with the features and functionalities described herein. The system may be used to capture optical codes from any suitable device or product having a reflective surface.
For convenience, the following description may at times refer to the data reader having a normal label reading mode or period and an electronic device reading mode or period. References to the normal label reading mode or period may refer to instances where the data reader is used to obtain images from items having surfaces with little or no reflectivity such that specular reflection issues do not meaningfully interfere with an ability of the data reader to capture a decodable image (i.e., reflected illumination does not saturate the imager). References to the electronic device reading mode or period may refer to instances where the data reader is used to obtain images from items having surfaces with high reflectivity, and in some cases self-illuminating displays, such as electronic devices with LCD display screens, where reflected illumination may saturate the imager and interfere with an ability of the data reader to capture a decodable image from an electronic device. These references are meant to establish a frame of reference for convenience purposes and are not intended to otherwise limit the disclosure.
Collectively,
For example, with reference to
The data reader 10 may acquire an image of the read region 28 using any one of a variety of mirror configurations. In one embodiment, an image of the read region 28 may be divided into a first portion 28a and a second portion 28b, each of which may be reflected off a series of mirrors (or other optical components) toward focusing optics 30, which in turn focuses the portions 28a, 28b onto the imager 32. For example, the first portion 28a may be reflected sidewardly by a first mirror 34 toward a second mirror 36, which directs the first portion 28a toward the focusing optics 30. Similarly, the second portion 28b may be reflected sidewardly by a third mirror 38 toward a fourth mirror 40, which directs the second portion 28b toward the focusing optics 30. In other embodiments, the mirrors may be arranged in a different configuration suitable to provide an image of the read region 28 onto the imager 32.
Preferably, the mirrors are arranged so that at least one portion (e.g., 28c in
The imager 32, alone or together with logic components such as a complex programmable logic device (CPLD) or a field-programmable gate array (FPGA), is coupled to a controller or processor 44, which, among other functions, is preferably programmed to control operating parameters of the data reader 10 as discussed in further detail below. Processor 44 is also preferably programmed to read and decode optical codes or other symbols or imaged items. The processor 44 may comprise any suitable digital processor, such as a low-power DSP core or ARM core processor. In some embodiments, processor 44 comprises an ARM9 AT91SAM9G20 processor sold by Atmel of San Jose, Calif., USA, an OMAP processor sold by Texas Instruments of Dallas, Tex., USA, or an i.MX1 series processor (such as the MC9328MX1 processor) sold by Freescale Semiconductor, Inc., of Austin, Tex., USA. Alternatively, multiple processors or sub-processors or other types of processor electronics, such as comparators or other specific-function circuits, may be used alone or in combination. For the purposes of this description, the term processor is meant to include any of these combinations.
In one embodiment, the processor 44 and on-board memory 46 are mounted on PCB 42 adjacent the imager 32, and are operable for controlling operation of the imager 32 and other reader components. The memory 46 may be flash memory, random access memory, or other suitable memory in communication with the processor 44. In some embodiments, memory 46 may be integrated with processor 44.
As mentioned previously, the data reader 10 includes illumination sources 24, 26 to illuminate the optical code on the item presented to the data reader 10. In one embodiment, the illumination sources 24, 26 comprise a collection of LEDs, for example, infrared or visible spectrum LEDs, but may alternatively comprise another suitable light source, such as a lamp or laser diode. The illumination sources 24, 26 may be coupled to and controlled by the processor 44 or may be remotely mounted and powered.
With reference to
It should be understood that the acute angle θ may depend on various factors, such as the number and arrangement of the illumination sources 24 and their distance from the window 22. In addition, different light sources may have different light dispersion fields; for example, in one embodiment, the dispersion field may be conically shaped. Assuming the dispersion field of the illumination sources 24 is conically shaped, the acute angle θ may be measured from the center line of the conical dispersion field.
The reading scenario illustrated in
With collective reference to
Turning back to
At time T2, for View 1, the processor 44 activates both illumination sources 24, 26 and sets an exposure time equal to the pulse rate of the illumination sources 24, 26 in a similar fashion as during T1. As mentioned previously, View 1 is interleaved between a normal data reading mode or period (when both illumination sources 24, 26 are activated) and an electronic device reading mode or period (when only illumination source 24 is activated). At time T2, the processor 44 is operating the illumination sources 24, 26 and setting the exposure time to operate the data reader 10 in a normal reading mode or period for View 1.
At time T3, the processor 44 again activates both illumination sources 24, 26 and sets an exposure time equal to the pulse rate of the illumination sources 24, 26 in a similar fashion as during T1. During time T3, the data reader 10 continues operating in a normal label reading mode or period.
At time T4, the data reader 10 switches to an electronic device reading mode or period. During time T4, the processor 44 controls illumination sources 24, 26 so that illumination source 24 is activated during the exposure time while illumination source 26 is not activated. In this configuration, View 1 captures an image from the electronic device 16 when it is present in the read region 28. Because the illumination source 26 is not active during the exposure time, the illumination from illumination source 26 does not saturate the image obtained by the imager 32 in View 1. As was described previously with respect to
In some embodiments, as illustrated in
The same process described during times T1 through T4 may be repeated at times T5 through T8 (and so on) in a similar fashion to interleave the data reader 10 between a normal item reading mode and an electronic device reading mode.
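A hedged sketch of the T1 through T4 cycle described above follows; the driver calls (set_exposure, pulse, capture) and object names are hypothetical stand-ins for the actual imager and LED control interfaces, which the disclosure does not specify:

```python
# Illustrative approximation of the interleaved cycle: three normal label reading
# frames (both illumination sources active, exposure matched to the illumination
# pulse) followed by one electronic device reading frame (only source 24 active,
# so illumination from source 26 cannot be reflected into the imager and saturate
# the image). All object and method names are assumptions for illustration.

def run_interleaved_cycle(imager, source_24, source_26, pulse_width_s):
    # Times T1-T3: normal label reading mode.
    for _ in range(3):
        imager.set_exposure(pulse_width_s)
        source_24.pulse(pulse_width_s)
        source_26.pulse(pulse_width_s)
        imager.capture()

    # Time T4: electronic device reading mode for View 1; source 26 stays off
    # during the exposure so its reflection does not saturate the image.
    imager.set_exposure(pulse_width_s)
    source_24.pulse(pulse_width_s)
    imager.capture()
```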
It should be understood that the scenario described with reference to
In other embodiments, the processor 44 may operate and control the illumination sources 24, 26 and the exposure times in a different manner to effectively alternate between the normal reading and electronic device reading modes. In one embodiment, the processor 44 may operate in a multiple integration mode, using a long exposure time and cycling the illumination sources 24, 26 independently of one another during the exposure time.
For example, with reference to
At time T4, the processor 44 switches to the electronic device reading mode. In this arrangement, the processor 44 activates illumination source 24 during a first portion of the exposure time, and thereafter deactivates illumination source 24 during a second portion of the exposure time. During time T4, illumination source 26 is not activated at any point during the exposure time of the imager 32, to minimize or avoid reflected illumination into the imager 32 from illumination source 26. As described previously and illustrated in
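The multiple integration variant could be sketched as follows; this is only an illustration under assumed, hypothetical driver calls, since the disclosure does not provide an implementation:

```python
# Sketch of one electronic device reading frame in the multiple integration mode:
# a single long exposure during which source 24 is on for only the first portion
# of the exposure and source 26 is never activated. The remainder of the exposure
# can integrate ambient light or light from a self-illuminating display.
# Method names (start_exposure, on, off, end_exposure) are hypothetical placeholders.
import time

def electronic_device_frame(imager, source_24, exposure_s=0.02):
    imager.start_exposure()
    source_24.on()
    time.sleep(exposure_s / 2)   # first portion of the exposure: source 24 active
    source_24.off()
    time.sleep(exposure_s / 2)   # second portion: no artificial illumination
    imager.end_exposure()
```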
It should be understood that the scenario described with reference to
In other embodiments, the processor may be programmed to pulse one or more of the illumination sources to avoid or minimize the perception of illumination flicker by a user or bystander while implementing methods for reading optical codes presented on electronic display screens or other highly reflective surfaces. Additional details relating to example embodiments are described in U.S. Pat. No. 9,122,939, the disclosure of which is incorporated by reference herein in its entirety.
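As a general illustration of flicker avoidance (not taken from the incorporated patent), an LED can be driven with short pulses at a repetition rate well above the rate at which humans perceive flicker, so that the illumination appears steady to a user or bystander; the rate, pulse width, and driver call below are hypothetical:

```python
# Drive an LED with brief pulses at a repetition rate high enough (here 200 Hz,
# comfortably above typical flicker perception) that the light appears steady.
# led.pulse() is a hypothetical call standing in for a hardware-specific driver.
import time

PULSE_RATE_HZ = 200
PULSE_WIDTH_S = 0.0005   # 0.5 ms flash each period

def pulse_without_visible_flicker(led, duration_s):
    period_s = 1.0 / PULSE_RATE_HZ
    stop = time.monotonic() + duration_s
    while time.monotonic() < stop:
        led.pulse(PULSE_WIDTH_S)
        time.sleep(max(0.0, period_s - PULSE_WIDTH_S))
```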
While certain preferred systems and methods have been shown and described, it will be apparent to one skilled in the art that modifications, alternatives and variations are possible without departing from the inventive concepts set forth herein. Therefore, the invention is intended to embrace all such modifications, alternatives and variations.