The present invention relates to barcode (i.e., code) readers (i.e., scanners) and, more specifically, to a code reader that, when triggered, reads multiple codes on a target (e.g., shipping label) automatically, and projects visual feedback regarding the read status for each code onto the target.
Some code readers are capable of reading (i.e., scanning) multiple codes (e.g., barcodes) in a single scan. These code readers, known as “multi-code readers,” are used to enhance productivity, but may be hindered when a code-read error occurs. Errors may occur in barcode scanning (e.g., due to a damaged barcode), and when scanning multiple codes, a user may not easily know which codes have been read and which have been missed and/or skipped. When one code in a multi-code scan is missed, the user must resort to scanning each code individually. In this scenario, productivity is lost and the user may become aggravated.
Therefore, a need exists for a multi-code scanner with visual feedback to indicate the read status for each code. This visual feedback will help a user understand which codes have been read and which codes (if any) have been missed and/or skipped. This feedback could allow a user to zoom in on the missed/skipped codes and re-scan, thereby eliminating the need to re-scan each code individually.
Accordingly, in one aspect, the present invention embraces a multiple code reading system with visual feedback. The system includes an imager for capturing a digital image of a code set in a field of view. The code set includes a plurality of codes. The system also includes a projector for projecting a feedback image selected from a set of feedback images. The system further includes a memory for storing the digital image, the set of feedback images, and a code-reading program. The code-reading program configures a processor that is communicatively coupled to the imager, the projector, and the memory to (i) retrieve the digital image from the memory, (ii) detect the codes within the digital image, (iii) read each detected code, (iv) select a feedback image from the set of feedback images for each code, the selection based on the code's reading results, and (v) project onto each code the feedback image selected for that code.
In an exemplary embodiment of the multiple code reading system with visual feedback, the code set includes multiple codes of the same symbology.
In another exemplary embodiment of the multiple code reading system with visual feedback, the code set includes multiple codes of mixed symbologies.
In another exemplary embodiment of the multiple code reading system with visual feedback, the set of feedback images includes a box with edges corresponding to a code.
In another exemplary embodiment of the multiple code reading system with visual feedback, the set of feedback images includes an “X” to cover a code.
In another exemplary embodiment of the multiple code reading system with visual feedback, the set of feedback images includes a box with edges corresponding to a code and an “X” to cover a code, wherein (i) the box is projected onto codes that were read correctly, (ii) the “X” is projected onto codes that were read incorrectly, and (iii) nothing is projected onto codes that were not detected.
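By way of illustration only, this selection rule may be expressed as a short routine. The following Python sketch assumes a hypothetical ReadStatus enumeration and feedback-image identifiers; it is one possible expression of the rule and not a required implementation.

```python
from enum import Enum, auto

class ReadStatus(Enum):
    READ_OK = auto()        # code detected and read correctly
    READ_ERROR = auto()     # code detected but the read attempt failed
    NOT_DETECTED = auto()   # code not found in the digital image

def select_feedback_image(status: ReadStatus):
    """Map a code's read status to the feedback image to project."""
    if status is ReadStatus.READ_OK:
        return "box"      # box with edges corresponding to the code
    if status is ReadStatus.READ_ERROR:
        return "x_mark"   # an "X" covering the code
    return None           # nothing is projected for undetected codes
```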
In another exemplary embodiment of the multiple code reading system with visual feedback, the set of feedback images includes a box with edges corresponding to the field of view.
In another exemplary embodiment of the multiple code reading system with visual feedback, the projector includes a light emitting diode (LED).
In another exemplary embodiment of the multiple code reading system with visual feedback, the projector includes a laser.
In another exemplary embodiment of the multiple code reading system with visual feedback, the code set includes barcodes.
In another aspect, the present invention embraces a method for providing visual feedback regarding a multi-code scan using an imaging barcode scanner. The method includes the step of capturing a digital image of a field of view with the imaging barcode scanner. The field of view includes a set of codes for reading, and the method includes the step of detecting the codes within the digital image. The method further includes the steps of reading each code, and determining a code status for each code. The code status has a positive status if the code was read and a negative status if an attempt to read the code failed. Finally, the method includes the step of projecting visual feedback into the field of view. The visual feedback includes positive feedback messages projected onto codes having a positive status and negative feedback messages projected onto codes having a negative status.
In an exemplary embodiment of the method, the set of codes includes a two-dimensional barcode.
In another exemplary embodiment of the method, the set of codes includes a linear barcode.
In another exemplary embodiment of the method, the positive feedback messages include a box with edges corresponding to a code.
In another exemplary embodiment of the method, the negative feedback messages include an “X” to cover a code.
In another exemplary embodiment of the method, the visual feedback includes a box with edges corresponding to the field of view.
In another exemplary embodiment of the method, the visual feedback comprises only the box with edges corresponding to the field of view and the positive or negative feedback messages for the codes detected within the digital image.
In another exemplary embodiment of the method, the positive feedback messages or the negative feedback messages are partially contained within the box corresponding to the field of view.
In another exemplary embodiment of the method, the imaging barcode scanner includes a light emitting diode (LED) for projecting the feedback.
In another exemplary embodiment of the method, the imaging barcode scanner includes a laser for projecting the feedback.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
The present invention embraces a system and method for providing visual feedback to facilitate multi-code (e.g., barcode) reading (i.e., scanning).
Codes, such as barcodes (e.g., linear barcodes, two-dimensional barcodes), have found use in a wide variety of applications (e.g., shipping). Increasingly, multiple codes are affixed to the same item to convey information pertinent to different users and/or pertinent at different times (e.g., different stages of the shipping process). Often, multiple barcodes are arranged in a cluster (i.e., set) so that scanning each barcode becomes easier. The code set may include multiple codes of the same symbology (e.g., all codes are linear barcodes) or may include multiple codes of mixed symbologies (e.g., some codes are linear barcodes and some codes are two-dimensional barcodes).
Imaging barcode scanners (e.g., handheld scanners) may be configured to read multiple codes in a single scan (i.e., a single pull of a trigger on a handheld scanner). The configuration of these multi-code readers may include a setup file stored in memory that provides information (e.g., the number of codes, each code's symbology, etc.) regarding the scan to facilitate the reading of each code. The imaging scanner may capture an image of a field of view with an imaging subsystem. The image may be stored in the scanner's memory and processed using algorithms running on a processor in order to locate, identify, and read each barcode in the image. In this way, the multi-code reader may enhance productivity. Trouble arises, however, when one or more of the barcodes in the set is not scanned properly.
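As an illustration of such a configuration, a setup file might enumerate the expected codes and their symbologies. The disclosure does not specify a file format, so the field names and values in the following Python sketch are assumptions made purely for illustration.

```python
# Hypothetical multi-code setup; the actual file format and field names
# are not specified by the disclosure and are assumed for illustration.
multi_code_setup = {
    "number_of_codes": 4,
    "codes": [
        {"symbology": "CODE128"},      # linear barcode
        {"symbology": "CODE128"},      # linear barcode
        {"symbology": "DATAMATRIX"},   # two-dimensional barcode
        {"symbology": "QRCODE"},       # two-dimensional barcode
    ],
}

def scan_is_complete(decoded_symbologies, setup=multi_code_setup):
    """Return True if the single scan produced one read per expected code."""
    expected = sorted(code["symbology"] for code in setup["codes"])
    return sorted(decoded_symbologies) == expected
```

A scan could then be judged complete by comparing the symbologies actually decoded against this setup, as in the scan_is_complete helper above.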
Successful code reading requires both good print quality and good image quality of the captured code image. Image quality is typically the same for all codes in a set but may be affected by inhomogeneous illumination or shadows on a portion of the code set. Various factors affect a code's print quality (e.g., ink supply, ink smear, unwanted marks, etc.). As a result, scan errors may be difficult to understand and may not be immediately obvious.
In some cases a multi-code scan may result in one or more barcodes of a set (i.e., cluster) having an unsuccessful scan. Without feedback, a failed multi-code scan may require either rescanning all the codes in the set or, in a worst case, scanning each code of the set individually.
Imaging barcode scanners may have a visual display to show the field of view. In some cases, feedback regarding the scan results may also be displayed using a graphical user interface (i.e., GUI). The present invention, however, embraces a system/method for generating feedback regarding the results of a multi-code scan in a more intuitive and ergonomic way. Specifically, the feedback regarding a multi-code scan may be projected using a projector integrated with the imaging scanner. The projected feedback is adaptable and is adjusted so that each code in a code set may have a corresponding feedback message projected onto it. This feedback message may provide information regarding each barcode scan that is spatially aligned with the code set, on the item that a user is already looking at. The information in this feedback may allow a user to identify a damaged code (i.e., indicium) and rescan or take another corrective action (e.g., manually entering some data).
An exemplary shipping label having a set of codes (i.e., code set) for reading with an imaging barcode scanner is shown in
After a scan has been triggered, a digital image of the shipping label is captured by the imaging barcode scanner. The codes in the image are detected and read using a processor integrated with the imaging scanner or communicatively coupled to the imaging scanner via a data communications link. After reading the codes, algorithms running on the processor assign each code a status based on the results of the reading. A positive status is assigned to a code if the code is successfully read, while a negative status is assigned to a code if the code is read unsuccessfully (i.e., a read error). The processor configures a projection subsystem in the imaging scanner to project visual feedback into the scanner's field of view. Exemplary codes 2, 3, 4, 5 on the shipping label 1, overlaid with projected visual feedback 6, 7, 8, 9, 10 regarding the read status for each code, are shown in
The feedback shown in
An exemplary block diagram of a multi-code reading system with visual feedback is shown in
A projection subsystem projects feedback images onto codes that are printed on (or affixed to) the target 11. The projection subsystem includes a projector 17. The projector 17 includes a light source to generate and radiate light. The light source may be a laser diode (LD) or a light emitting diode (LED). The light source generates light radiation in a portion of the visible (VIS) spectrum. The projector 17 may also include a light modulator to create feedback images. The light modulator may be reflective (e.g., a digital light processing MEMS device) or transmissive (e.g., a liquid crystal spatial light modulator). A projection lens (or lens group) 16 is placed in front of the projector 17 to direct the light into a projection field of view 13 and focus the projected light onto the target. The projection lens 16 may have a fixed or variable (e.g., auto) focus.
An imaging subsystem captures images of items located in the imaging field of view 14. To accomplish this, the imaging subsystem may use an imaging lens 20 to render a real image of the imaging field of view 14 onto an image sensor 21. This imaging field of view 14 overlaps (at least partially) with the projection field of view 13. The image sensor 21 may be a charge coupled device (i.e., CCD) or a sensor using complementary metal oxide semiconductor (i.e., CMOS) technology. The image sensor 21 includes a plurality of pixels that sample the real image and convert the real-image intensity into an electronic signal.
A digital signal processor (i.e., DSP) 23 may be included to convert the electronic signals from the image sensor 21 into a digital image and/or control the projector to create the feedback projections.
A processor 18 is communicatively coupled to the projection subsystem and imaging subsystem (e.g., coupled to the DSP controlling the projection and imaging subsystems). The processor 18 may be embodied in a variety of ways, such as (but not limited to) one or more controllers, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable gate array (PGA), and/or a programmable logic controller (PLC).
The processor 18 is typically configured by software (e.g., a code-reading program) stored in memory 24 (e.g., read only memory (ROM), flash memory, random access memory (RAM), and/or a hard drive). The software, when executed by the processor 18, configures the multi-code reading system to (i) retrieve a digital image from the memory 24, (ii) detect codes within the digital image, (iii) read each detected code, (iv) select (for each code) a feedback image from a set of feedback images (stored in memory) based on each code's reading results, and (v) project the selected feedback images onto the appropriate codes.
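A minimal sketch of such a code-reading program is shown below in Python. The helper callables (detect_codes, project) and the Detection and Memory structures are placeholders standing in for the imaging, decoding, and projection subsystems described above; their names and signatures are assumptions made for illustration only, not the disclosed implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

Rect = Tuple[int, int, int, int]        # (x, y, width, height) in image pixels

@dataclass
class Detection:
    bounds: Rect                        # where the code was found in the image
    data: Optional[str]                 # decoded payload, or None if the read failed

@dataclass
class Memory:
    digital_image: object               # the captured digital image (opaque here)
    feedback_images: dict = field(default_factory=lambda: {"box": "box", "x_mark": "x_mark"})

def run_code_reading_program(memory: Memory,
                             detect_codes: Callable[[object], List[Detection]],
                             project: Callable[[str, Rect], None]) -> None:
    """Steps (i)-(v): retrieve the image, detect and read the codes,
    select a feedback image per code, and project it onto that code."""
    image = memory.digital_image                                   # (i) retrieve the digital image
    for detection in detect_codes(image):                          # (ii) detect, (iii) read each code
        key = "box" if detection.data is not None else "x_mark"    # (iv) select a feedback image
        project(memory.feedback_images[key], detection.bounds)     # (v) project onto the code
```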
The multi-code reading system 15 may also include a user interface 25 to transmit information to a user and receive input from a user (e.g., trigger a scan). The user interface 25 may include interface elements (e.g., touch buttons, touch screen, trigger switch, etc.).
The multi-code reading system 15 may also include a communication subsystem 19 for transmitting and receiving information to/from a separate computing device and/or storage device. This communication subsystem may be wired or wireless and may enable communication with a variety of protocols (e.g., IEEE 802.11, including WI-FI®, BLUETOOTH®, CDMA, TDMA, or GSM).
The subsystems/components in the multi-code reading system 15 are electrically connected via couplers (e.g., wires or fibers), buses, and control lines to form an interconnection system 26. The interconnection system 26 may include power buses or lines, data buses, instruction buses, address buses, etc., which allow operation of the subsystems/components and interaction therebetween.
A flow chart of a method for providing visual feedback regarding a multi-code scan using an imaging barcode scanner is shown in
The method begins with capturing a digital image of a set of codes (e.g., linear barcodes, two-dimensional barcodes, etc.) in a field of view with an imaging barcode scanner 30. The imaging barcode scanner may be a single-purpose device (e.g., a handheld scanner), a mode of operation in a multipurpose device (e.g., a mobile computing device), or a part of a larger system (e.g., a point-of-sale system).
The digital image is processed to detect codes within the image 31. Next, a code is read 32 using code-reading algorithms commonly used in the art. If an error is detected (e.g., using error correction data encoded into the code), then the code is assigned a negative status 35. If the code is read successfully, however, it is assigned a positive status 34. The steps of reading a code and assigning the code a status are repeated for each code until there are no unread codes and all codes have been assigned a status 36.
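As one concrete possibility (not required by the method), the reading step could be performed with an off-the-shelf decoder such as the open-source pyzbar library, as sketched below in Python. Such decoders typically return only symbols that decode cleanly, so codes located by a separate detection step but absent from the decoder's output would be the ones assigned a negative status.

```python
# Sketch only: assumes the third-party pyzbar and Pillow packages are installed,
# and that a captured digital image has been saved as "shipping_label.png".
from pyzbar.pyzbar import decode
from PIL import Image

image = Image.open("shipping_label.png")
positive_status = []             # codes that were read successfully
for symbol in decode(image):     # returns only symbols that decoded cleanly
    positive_status.append({
        "data": symbol.data.decode("utf-8"),
        "symbology": symbol.type,        # e.g., 'CODE128', 'QRCODE'
        "bounds": symbol.rect,           # left, top, width, height in image pixels
    })
# Codes located by a separate detection step but missing from positive_status
# would be assigned a negative status (block 35) and receive negative feedback.
```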
The method concludes by projecting positive feedback messages (e.g., a box surrounding a barcode) onto codes having a positive status 37, and projecting negative feedback messages onto codes having a negative status 38.
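Projecting a feedback message onto a code requires mapping the code's location in the captured image to the projector's coordinate space. The Python sketch below assumes a simple, pre-calibrated affine relationship between the imaging and projection fields of view; a real device might instead use a full homography, and the calibration values shown are hypothetical.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AffineCalibration:
    """Hypothetical image-to-projector calibration: projector = scale * image + offset."""
    scale_x: float
    scale_y: float
    offset_x: float
    offset_y: float

def image_rect_to_projector(rect: Tuple[int, int, int, int],
                            cal: AffineCalibration) -> Tuple[float, float, float, float]:
    """Map a code's bounding rectangle (x, y, w, h) from image pixels to
    projector pixels so the feedback message lands on the code itself."""
    x, y, w, h = rect
    return (cal.scale_x * x + cal.offset_x,
            cal.scale_y * y + cal.offset_y,
            cal.scale_x * w,
            cal.scale_y * h)

# Example with made-up calibration values:
cal = AffineCalibration(scale_x=0.75, scale_y=0.75, offset_x=40.0, offset_y=25.0)
print(image_rect_to_projector((320, 180, 200, 80), cal))
```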
Thus, the present invention embraces a system/method to provide visual feedback (e.g., indicia reading status information) to a user. The feedback may be created using the indicia reader's highlight beam projector to highlight indicia and convey information. This information is displayed on the target (i.e., object) instead of on a display and is, therefore, intuitive and convenient.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
The present application claims the benefit of U.S. Patent Application No. 62/098,201 for Visual Feedback for Code Readers filed Dec. 30, 2014, which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5783811 | Feng | Jul 1998 | A |
6832725 | Gardiner et al. | Dec 2004 | B2 |
7128266 | Zhu et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7331524 | Vinogradov | Feb 2008 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7611060 | Wang | Nov 2009 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
7780089 | Wang | Aug 2010 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Liu | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8763909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9224022 | Ackley et al. | Dec 2015 | B2 |
9224027 | Van Horn et al. | Dec 2015 | B2 |
D747321 | London et al. | Jan 2016 | S |
9230140 | Ackley | Jan 2016 | B1 |
9443123 | Hejl | Jan 2016 | B2 |
9250712 | Todeschini | Feb 2016 | B1 |
9258033 | Showering | Feb 2016 | B2 |
9262633 | Todeschini et al. | Feb 2016 | B1 |
9310609 | Rueblinger et al. | Apr 2016 | B2 |
D757009 | Oberpriller et al. | May 2016 | S |
9342724 | McCloskey | May 2016 | B2 |
9375945 | Bowles | Jun 2016 | B1 |
D760719 | Zhou et al. | Jul 2016 | S |
9390596 | Todeschini | Jul 2016 | B1 |
D762604 | Fitch et al. | Aug 2016 | S |
D762647 | Fitch et al. | Aug 2016 | S |
9412242 | Van Horn et al. | Aug 2016 | B2 |
D766244 | Zhou et al. | Sep 2016 | S |
9443222 | Singel et al. | Sep 2016 | B2 |
9478113 | Xie et al. | Oct 2016 | B2 |
20040232238 | Palestini et al. | Nov 2004 | A1 |
20050279832 | Kobayashi | Dec 2005 | A1 |
20060261167 | Ray et al. | Nov 2006 | A1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090090782 | May | Apr 2009 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedraro | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140100813 | Showering | Jan 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131438 | Kearney | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131445 | Ding et al. | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140166761 | Todeschini et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Lui et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van Horn et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071819 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chang et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150220753 | Zhu et al. | Aug 2015 | A1 |
20150254485 | Feng et al. | Sep 2015 | A1 |
20150327012 | Bian et al. | Nov 2015 | A1 |
20160014251 | Hejl | Jan 2016 | A1 |
20160040982 | Li et al. | Feb 2016 | A1 |
20160042241 | Todeschini | Feb 2016 | A1 |
20160057230 | Todeschini et al. | Feb 2016 | A1 |
20160109219 | Ackley et al. | Apr 2016 | A1 |
20160109220 | Laffargue | Apr 2016 | A1 |
20160109224 | Thuries et al. | Apr 2016 | A1 |
20160112631 | Ackley et al. | Apr 2016 | A1 |
20160112643 | Laffargue et al. | Apr 2016 | A1 |
20160124516 | Schoon et al. | May 2016 | A1 |
20160125217 | Todeschini | May 2016 | A1 |
20160125342 | Miller et al. | May 2016 | A1 |
20160133253 | Braho et al. | May 2016 | A1 |
20160171720 | Todeschini | Jun 2016 | A1 |
20160178479 | Goldsmith | Jun 2016 | A1 |
20160180678 | Ackley et al. | Jun 2016 | A1 |
20160189087 | Morton et al. | Jun 2016 | A1 |
20160125873 | Braho et al. | Jul 2016 | A1 |
20160227912 | Oberpriller et al. | Aug 2016 | A1 |
20160232891 | Pecorari | Aug 2016 | A1 |
20160292477 | Bidwell | Oct 2016 | A1 |
20160294779 | Yeakley et al. | Oct 2016 | A1 |
20160306769 | Kohtz et al. | Oct 2016 | A1 |
20160314276 | Sewell et al. | Oct 2016 | A1 |
20160314294 | Kubler et al. | Oct 2016 | A1 |
Number | Date | Country |
---|---|---|
201363789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
Entry |
---|
Extended European Search Report in related EP Application No. 15202153.1, dated May 19, 2016, 7 pages (Commonly Owned references submitted in separate IDS filed concurrently). |
U.S. Appl. No. 14/715,916 for Evaluating Image Values, filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device, filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 29/530,600 for Cyclone, filed Jun. 18, 2015 (Vargo et al.); 16 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface, filed May 8, 2015 (Pape); 47 pages. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control, filed May 21, 2014 (Liu et al.); 31 pages; now abandoned. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat, filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning, filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle, filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display, filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System, filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device, filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector, filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions, filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device, filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner, filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device, filed Jun. 16, 2015 (Bandringa); 38 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner, filed Jun. 16, 2015 (Ackley et al.); 63 pages. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture, filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned. |
U.S. Appl. No. 29/516,892 for Tablet Computer, filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer, filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing, filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 29/526,918 for Charging Base, filed May 14, 2015 (Fitch et al.); 10 pages. |
Exam Report for related EP Application 15202153.1 dated Jul. 17, 2018; pp. 1-4. |
Number | Date | Country | |
---|---|---|---|
20160188939 A1 | Jun 2016 | US |
Number | Date | Country | |
---|---|---|---|
62098201 | Dec 2014 | US |