Code symbol reading system having adaptive autofocus

Information

  • Patent Grant
  • Patent Number
    9,582,698
  • Date Filed
    Wednesday, August 5, 2015
  • Date Issued
    Tuesday, February 28, 2017
Abstract
A system for reading code symbols includes an imaging subsystem that includes a focusing module and an image processor. The image processor selects an initial, predicted focal distance for the imaging subsystem's focusing module with respect to a code symbol. The focal distance for each successfully decoded code symbol is stored in memory, and a weighted average of a pre-selected number of memorized focal distances is used to calculate the next initial, predicted focal distance.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus filed Jun. 26, 2013 (and published Jan. 1, 2015 as U.S. Patent Application Publication No. 2015/0001304), now U.S. Pat. No. 9,104,929. Each of the foregoing patent application, patent publication, and patent is hereby incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates to the field of code symbol readers and, more specifically, to a code symbol reading system having adaptive autofocus.


BACKGROUND

Mobile computer devices such as smartphones have become an increasingly popular way to scan code symbols (e.g., barcodes) because of their highly functional camera technology as well as their ubiquity. Generally speaking, the greatest challenge when scanning a barcode with a smartphone is first getting the barcode in focus. Presently, most smartphones utilize an autofocus routine that attempts to get an image into focus. This procedure is quite time-consuming, principally because the autofocus routine initiates the process without any real sense of the proper focal distance. As a result, the smartphone often spends a great deal of time readjusting the focal distance as it searches for the focal setting that will bring the barcode into focus well enough to be read. Autofocus routines are often hampered in their efforts by excessive motion and poor lighting conditions. This often leads to a significant time delay when scanning a barcode and accounts for the vast majority of the overall scan time.


Therefore, a need exists for a system for reading code symbols that is capable of reading code symbols more quickly by first making an educated estimate of where the focus should be set before a barcode scan is attempted.


SUMMARY

Accordingly, in one aspect, the present invention embraces a system for reading code symbols having adaptive autofocus. The system for reading code symbols includes an imaging subsystem for capturing images within the imaging subsystem's field of view. The imaging subsystem includes a focusing module. The system for reading code symbols also includes an image processor. The image processor is configured for selecting an initial, predicted focal distance for the imaging subsystem's focusing module. The image processor is also configured for processing an image of a code symbol captured by the imaging subsystem in order to decode the code symbol. The image processor is also configured for storing in memory the focal distance associated with the decoded code symbol. The image processor is configured to select the initial, predicted focal distance as a function of memorized focal distance(s) associated with previously decoded code symbol(s).


In an exemplary embodiment, the image processor is configured to select the initial, predicted focal distance as a function of a plurality of memorized focal distances associated with previously decoded code symbols.


In another exemplary embodiment, the image processor is configured to perform an autofocus routine if processing the captured image of the code symbol at the initial, predicted focal distance fails to decode the code symbol.


In yet another exemplary embodiment, the image processor is configured to perform an autofocus routine if the number of memorized focal distances associated with previously decoded code symbols is less than a predetermined minimum.


In yet another exemplary embodiment, the imaging subsystem detects the presence of a code symbol within the imaging subsystem's field of view.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention also includes an object detection subsystem for detecting the presence of an object (e.g., an object bearing a code symbol) within the imaging subsystem's field of view.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention also includes a hand-supportable housing. The imaging subsystem and image processor are disposed within the hand-supportable housing.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention also includes an input/output subsystem. The input/output subsystem outputs signals from the system for reading code symbols.


In another aspect, the present invention embraces a system for reading code symbols that includes an imaging subsystem and an image processor. The imaging subsystem captures images within the imaging subsystem's field of view. The imaging subsystem includes a focusing module. The image processor is configured for selecting an initial, predicted focal distance for the imaging subsystem's focusing module with respect to a code symbol, and to process one or more images of a code symbol captured by the imaging subsystem at the initial, predicted focal distance. The image processor is configured to perform an autofocus routine if processing a predetermined number of captured images of a code symbol fails to decode the code symbol. The image processor is also configured to store in memory the focal distance associated with the decoded code symbol if processing a captured image of the code symbol decodes the code symbol. The image processor is configured to select the initial, predicted focal distance as a function of a plurality of memorized focal distances associated with previously decoded code symbols.


In an exemplary embodiment, the image processor is configured to store in memory the initial, predicted focal distance associated with the decoded code symbol if processing a captured image of a code symbol at the initial, predicted focal distance decodes the code symbol.


In another exemplary embodiment, the image processor is configured to perform an autofocus routine if the number of memorized focal distances associated with previously decoded code symbols is less than a predetermined minimum.


In yet another exemplary embodiment, the image processor is configured to process one or more additional captured images of the code symbol after the image processor performs an autofocus routine.


In yet another exemplary embodiment, the imaging subsystem detects the presence of a code symbol within the imaging subsystem's field of view.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention includes an object detection subsystem for detecting the presence of a code symbol within the imaging subsystem's field of view.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention includes a hand-supportable housing. The imaging subsystem and image processor are disposed within the hand-supportable housing.


In yet another exemplary embodiment, the system for reading code symbols according to the present invention includes an input/output subsystem for outputting system signals.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary system for reading code symbols according to the present invention.



FIG. 2 is a block diagram of a first alternative embodiment of the system for reading code symbols according to the present invention.



FIG. 3 is a block diagram of a second alternative embodiment of the system for reading code symbols according to the present invention.





DETAILED DESCRIPTION

The present invention embraces a system for reading code symbols. The term “code symbol” is intended broadly to refer to any indicia or device used to store information about an object, including barcodes, linear barcodes, 1D barcodes, matrix barcodes, 2D barcodes, QR codes, RFID tags, and optical character recognition (OCR) symbols. When a code symbol is read, the information encoded in the code symbol is decoded. Referring now to FIG. 1, the system 100 for reading code symbols according to the present invention includes an imaging subsystem 110. The imaging subsystem 110 captures images within the imaging subsystem's 110 field of view (i.e., angle of view). Objects that can be viewed by the imaging subsystem when it is in a given position and orientation in space are within the imaging subsystem's 110 field of view. When a user wishes to capture an image of a code symbol, the user either positions the code symbol within the imaging subsystem's 110 field of view, or re-orients the imaging subsystem 110 to change its field of view to include the code symbol. For example, if using a camera-equipped smartphone as a code symbol reader, the user could reposition the smartphone until the code symbol comes into the field of view of the smartphone's camera.


The imaging subsystem 110 of the system 100 according to the present invention includes a focusing module 115. The focusing module 115 functions to bring into focus the code symbol of interest such that the code symbol may be read. Typically, the focusing module includes (i) a lens assembly having one or more lenses and (ii) a small autofocus motor that manipulates the lens assembly to adjust its field of focus. The autofocus motor manipulates the lens assembly by changing the distance between the lens assembly and the image plane, or by otherwise adjusting the lens configuration, to bring the code symbol into focus. Alternatively, the focusing module could include a liquid lens module. The liquid lens module includes a liquid lens having a chamber containing at least one type of liquid. A liquid lens driver applies an electric current to the liquid and can vary the focal characteristics of the liquid lens by varying the level of electric current. These types of liquid lens modules are particularly useful in smaller, mobile devices because they do not require moving parts.


The system 100 according to the present invention also includes an image processor 120. Rather than immediately initiating a potentially time-consuming autofocus routine prior to capturing an image of a code symbol, the image processor 120 first selects an initial, predicted focal distance for the imaging subsystem's 110 focusing module 115. In other words, the image processor 120 estimates the focal distance at which the system 100 should attempt to capture the image of the code symbol. The method used to estimate the focal distance is described below in detail.


After an image of a code symbol is captured by the system 100, the image processor 120 processes the image of the code symbol to decode (e.g., read) the code symbol. If the captured image of the code symbol is of sufficient quality (e.g., focal quality) to allow the image processor 120 to successfully decode the code symbol, then the image processor 120 stores in memory the focal distance associated with the decoded code symbol. In other words, when the system 100 captures an image of a code symbol, the image processor 120 (i) determines the focal distance of the focusing module at the time the image was captured, (ii) determines whether the code symbol can be decoded from the captured image, and (iii) if the code symbol is successfully decoded, stores in memory the focal distance at which the image was captured. If the image processor 120 determines that the image quality is not sufficient to permit decoding of the code symbol, the focal distance associated with the unsuccessful image capture is not stored in memory. The process is repeated each time the system 100 captures an image of a code symbol, with the focal distance associated with each image capture resulting in a successful decoding being stored in memory. In this way, the system 100 creates a continuously updated record of the focal distances employed to decode code symbols. It will be appreciated by a person of ordinary skill in the art that the number of focal distances stored in memory can vary depending on how large a sample is desired. Prior to capturing an image of a code symbol, the image processor 120 selects the initial, predicted focal distance expected to yield an image from which the code symbol can be successfully decoded. The initial, predicted focal distance is a function of the memorized focal distance(s) associated with the previously decoded code symbol(s).
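
By way of illustration only, the following sketch (in Python) shows one way such a bounded record of successful focal distances could be kept. The class name, the default capacity, and the millimeter unit are assumptions made for illustration; the present description does not prescribe any particular data structure or sample size.

```python
from collections import deque


class FocalDistanceMemory:
    """Bounded record of focal distances from successfully decoded code symbols."""

    def __init__(self, capacity=8):
        # How many distances to retain is an assumption; the description only
        # notes that the sample size can vary as desired.
        self._distances = deque(maxlen=capacity)

    def record_success(self, focal_distance_mm):
        """Store the focal distance used when a code symbol was successfully decoded."""
        self._distances.append(focal_distance_mm)

    def history(self):
        """Return the memorized distances, oldest first (temporal order preserved)."""
        return list(self._distances)

    def __len__(self):
        return len(self._distances)
```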


In an exemplary embodiment of the system 100 for reading code symbols according to the present invention, the image processor 120 selects the initial, predicted focal distance as a function of a plurality of memorized focal distances associated with previously decoded code symbols. Typically, the image processor 120 selects the initial, predicted focal distance that will be used for the current scan operation (e.g., code symbol image capture) by calculating a weighted moving average of the memorized focal distances. Typically, the image processor 120 keeps a record of the temporal order of the memorized focal distances. In other words, the image processor 120 records which focal distance is associated with the most recently captured image, which focal distance is associated with the next most recently captured image, and so on. Using this temporal information, the image processor 120 typically gives greater weight to the focal distances associated with the more recently captured images. This approach is particularly advantageous when reading code symbols from varying distances, because the weighted average assumes that the distance of the next code symbol to be scanned will be roughly the same as the distances of the latest successful scans. Because, in many instances, a user scanning multiple code symbols in succession will encounter code symbols at similar distances from the system 100, weighting the more recent scans more heavily typically yields improved results. For example, a worker holding the system 100 in hand while scanning code symbols on boxes resting on a table of substantially uniform height would likely achieve improved results using this method, because most of the code symbols will be at roughly the same distance from the system 100. Any weighting scheme could be applied to strike a balance between heavily valuing the most recent focal distance data and taking into account focal distance data over a longer trend. By way of example, a linear weighting scheme could be employed. By way of further example, the image processor 120 could employ an exponentially weighted moving average, which places the greatest value on the focal distance associated with the most recent scan while still taking into consideration all of the memorized focal distances.
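
A minimal sketch of both weighting schemes follows, operating on the oldest-first history kept by the memory sketch above. The smoothing factor and the specific weight formulas are illustrative assumptions rather than values prescribed by the present description.

```python
def predict_focal_distance(history, alpha=0.5):
    """Exponentially weighted moving average of memorized focal distances.

    `history` lists distances oldest-first; more recent successful scans receive
    exponentially greater weight. The smoothing factor `alpha` is illustrative.
    """
    if not history:
        raise ValueError("no memorized focal distances to average")
    weights = [(1.0 - alpha) ** (len(history) - 1 - i) for i in range(len(history))]
    return sum(w * d for w, d in zip(weights, history)) / sum(weights)


def predict_focal_distance_linear(history):
    """Linearly weighted alternative: the i-th oldest sample receives weight i + 1."""
    if not history:
        raise ValueError("no memorized focal distances to average")
    weights = range(1, len(history) + 1)
    return sum(w * d for w, d in zip(weights, history)) / sum(weights)
```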


In an exemplary embodiment, the image processor 120 first attempts to decode the code symbol by initiating a scan using the initial, predicted focal distance. If the image processor 120 fails to decode the code symbol using the initial, predicted focal distance (for example, because the image is out of focus), the image processor 120 retries the scan using a traditional autofocus routine. In this way, the system 100 seeks to expedite the scanning process by first attempting to decode the code symbol using the faster, educated-estimate approach described herein; if that approach is not successful, the system 100 falls back on a traditional autofocus routine.


In another exemplary embodiment, the image processor 120 first checks whether a sufficiently large sample of focal distances is stored in memory. If the number of memorized focal distances associated with previously decoded code symbols is less than a predetermined minimum, then the image processor 120 does not attempt to generate an initial, predicted focal distance. Instead, the image processor 120 proceeds directly to the traditional autofocus routine to attempt to decode the code symbol. In this way, if the available data set of focal distances is below a predetermined minimum deemed sufficient to generate a reliable (e.g., useful) initial, predicted focal distance, the image processor 120 will not waste time or system resources on an estimation step that is unlikely to yield an image of suitable focal quality.
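
The sketch below ties this embodiment and the preceding one together: the predicted focal distance is used only when enough history exists, and a traditional autofocus routine serves as the fallback. The helper functions `capture_and_decode` and `run_autofocus`, the `set_focal_distance` interface, and the minimum sample count are hypothetical placeholders introduced for illustration; the sketch reuses the `FocalDistanceMemory` and `predict_focal_distance` sketches above.

```python
MIN_SAMPLES = 3  # predetermined minimum number of memorized focal distances (illustrative)


def attempt_scan(memory, focusing_module, capture_and_decode, run_autofocus):
    """Use the predicted focal distance only when enough history exists; otherwise,
    or when the prediction fails to produce a decodable image, fall back to a
    traditional autofocus routine."""
    history = memory.history()
    if len(history) >= MIN_SAMPLES:
        predicted = predict_focal_distance(history)
        focusing_module.set_focal_distance(predicted)
        decoded = capture_and_decode(predicted)
        if decoded is not None:
            memory.record_success(predicted)   # the prediction worked; remember it
            return decoded

    # Too little history, or the predicted distance failed: traditional autofocus.
    focal_distance = run_autofocus()
    decoded = capture_and_decode(focal_distance)
    if decoded is not None:
        memory.record_success(focal_distance)  # successful scans keep feeding the model
    return decoded
```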


In an exemplary embodiment, the imaging subsystem 110 detects the presence of a code symbol within the imaging subsystem's 110 field of view. The presence of a code symbol may be detected by an analysis of the image pixels by the imaging subsystem 110 to determine if they are consistent with the presence of a code symbol. This analysis may be capable of detecting the presence of a code symbol even when the image quality is insufficient to support reading the code symbol. If the imaging subsystem 110 detects a code symbol within the field of view of the imaging subsystem 110, it initiates an image capture using the focal distance estimating techniques described herein.


Referring now to FIG. 2, in an alternative embodiment, the system 100 for reading code symbols includes an object detection subsystem 130 for detecting the presence of an object (e.g., an object bearing a code symbol) within the system's 100 field of view. The object detection subsystem 130 can project an IR-based light beam into the field of view and detect a return signal from an object present in the field of view, thereby detecting the presence of that object. Upon detection of an object, the system 100 may initiate the attempted reading of a code symbol(s) within the field of view using the adaptive focusing techniques described herein.


Whether by successfully relying on the initial, predicted focal distance, or by falling back on a traditional autofocus routine, the system 100 ultimately obtains a successful scan of the code symbol. When the code symbol is decoded, the system 100 will need to output the resulting data (e.g., the data decoded from the code symbol) to another system (e.g., a data processing system). The data processing system may be housed either within the same device that houses the system 100 for reading code symbols, or it may be housed in a separate device (e.g., a host device). Referring now to FIG. 3, to output the system signals (e.g., data, bits, electrical signals) representing the data generated by decoding the code symbol, the system 100 includes an input/output subsystem 140. The input/output subsystem 140 manages the sending of system signals to other systems and/or devices.


The system 100 for reading code symbols according to the present invention may take a variety of forms. For instance, the system 100 may be a stationary unit at a checkout register (e.g., point of sale (POS)), similar to a bioptic scanner found in most grocery stores. More typically, the system 100 for reading code symbols according to the present invention will take the form of a hand-held device such as a smartphone, a tablet computer, or a hand-held scanner. For hand-held devices, the system 100 according to the present invention includes a hand-supportable housing in which the imaging subsystem 110 and the image processor 120 are disposed.


In another aspect, the present disclosure embraces a system 100 for reading code symbols that includes an imaging subsystem 110 and an image processor 120. The image processor 120 selects an initial, predicted focal distance for the imaging subsystem's 110 focusing module with respect to a code symbol. The image processor 120 processes one or more images of the code symbol captured by the imaging subsystem 110 at the initial, predicted focal distance. If, after processing a predetermined number of captured images of the code symbol, the image processor 120 fails to decode the code symbol, then the image processor 120 performs an autofocus routine. On the other hand, if the image processor 120 is able to decode a code symbol from a captured image, then the image processor 120 stores in memory the focal distance associated with the decoded code symbol. The image processor 120 selects the initial, predicted focal distance as a function of a plurality of memorized focal distances associated with previously decoded code symbols.
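
A hedged sketch of this aspect follows: a predetermined number of images is processed at the predicted focal distance before the autofocus routine is invoked. As before, the attempt limit and the helper functions are illustrative assumptions, and the sketch reuses the earlier `FocalDistanceMemory` and `predict_focal_distance` sketches.

```python
MAX_PREDICTED_ATTEMPTS = 3  # predetermined number of captured images (illustrative)


def read_symbol_with_retries(memory, focusing_module, capture_and_decode, run_autofocus):
    """Process up to MAX_PREDICTED_ATTEMPTS images at the predicted focal distance
    before resorting to an autofocus routine; helpers are hypothetical placeholders."""
    predicted = predict_focal_distance(memory.history())
    focusing_module.set_focal_distance(predicted)
    for _ in range(MAX_PREDICTED_ATTEMPTS):
        decoded = capture_and_decode(predicted)
        if decoded is not None:
            memory.record_success(predicted)   # store the focal distance that worked
            return decoded

    # The predicted distance did not yield a decodable image within the allotted
    # attempts; run autofocus and process one or more additional images.
    focal_distance = run_autofocus()
    decoded = capture_and_decode(focal_distance)
    if decoded is not None:
        memory.record_success(focal_distance)
    return decoded
```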


To supplement the present disclosure, this application incorporates entirely by reference the following patents, patent application publications, and patent applications: U.S. Pat. Nos. 6,832,725; 7,159,783; 7,413,127; 7,726,575; 8,390,909; 8,294,969; 8,408,469; 8,408,468; 8,381,979; 8,408,464; 8,317,105; 8,366,005; 8,424,768; 8,322,622; 8,371,507; 8,376,233; 8,457,013; 8,448,863; 8,459,557; 8,469,272; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0193407; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2012/0318869; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0068840; U.S. Patent Application Publication No. 2013/0070322; U.S. Patent Application Publication No. 2013/0075168; U.S. Patent Application Publication No. 2013/0056285; U.S. Patent Application Publication No. 2013/0075464; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2010/0225757; U.S. patent application Ser. No. 13/347,219 for an OMNIDIRECTIONAL LASER SCANNING BAR CODE SYMBOL READER GENERATING A LASER SCANNING PATTERN WITH A HIGHLY NON-UNIFORM SCAN DENSITY WITH RESPECT TO LINE ORIENTATION, filed Jan. 10, 2012 (Good); U.S. patent application Ser. No. 13/347,193 for a HYBRID-TYPE BIOPTICAL LASER SCANNING AND DIGITAL IMAGING SYSTEM EMPLOYING DIGITAL IMAGER WITH FIELD OF VIEW OVERLAPPING FIELD OF FIELD OF LASER SCANNING SUBSYSTEM, filed Jan. 10, 2012 (Kearney et al.); U.S. patent application Ser. No. 13/367,047 for LASER SCANNING MODULES EMBODYING SILICONE SCAN ELEMENT WITH TORSIONAL HINGES, filed Feb. 6, 2012 (Feng et al.); U.S. patent application Ser. No. 13/400,748 for a LASER SCANNING BAR CODE SYMBOL READING SYSTEM HAVING INTELLIGENT SCAN SWEEP ANGLE ADJUSTMENT CAPABILITIES OVER THE WORKING RANGE OF THE SYSTEM FOR OPTIMIZED BAR CODE SYMBOL READING PERFORMANCE, filed Feb. 21, 2012 (Wilz); U.S. patent application Ser. No. 13/432,197 for a LASER SCANNING SYSTEM USING LASER BEAM SOURCES FOR PRODUCING LONG AND SHORT WAVELENGTHS IN COMBINATION WITH BEAM-WAIST EXTENDING OPTICS TO EXTEND THE DEPTH OF FIELD THEREOF WHILE RESOLVING HIGH RESOLUTION BAR CODE SYMBOLS HAVING MINIMUM CODE ELEMENT WIDTHS, filed Mar. 28, 2012 (Havens et al.); U.S. patent application Ser. No. 13/492,883 for a LASER SCANNING MODULE WITH ROTATABLY ADJUSTABLE LASER SCANNING ASSEMBLY, filed Jun. 10, 2012 (Hennick et al.); U.S. patent application Ser. No. 13/367,978 for a LASER SCANNING MODULE EMPLOYING AN ELASTOMERIC U-HINGE BASED LASER SCANNING ASSEMBLY, filed Feb. 7, 2012 (Feng et al.); U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.); U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.); U.S. 
patent application Ser. No. 13/780,158 for a Distraction Avoidance System, filed Feb. 28, 2013 (Sauerwein); U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/785,177 for a Dimensioning System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/780,196 for Android Bound Service Camera Initialization, filed Feb. 28, 2013 (Todeschini et al.); U.S. patent application Ser. No. 13/792,322 for a Replaceable Connector, filed Mar. 11, 2013 (Skvoretz); U.S. patent application Ser. No. 13/780,271 for a Vehicle Computer System with Transparent Display, filed Feb. 28, 2013 (Fitch et al.); U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney); U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson); U.S. patent application Ser. No. 13/750,304 for Measuring Object Dimensions Using Mobile Computer, filed Jan. 25, 2013; U.S. patent application Ser. No. 13/471,973 for Terminals and Methods for Dimensioning Objects, filed May 15, 2012; U.S. patent application Ser. No. 13/895,846 for a Method of Programming a Symbol Reading System, filed Apr. 10, 2013 (Corcoran); U.S. patent application Ser. No. 13/867,386 for a Point of Sale (POS) Based Checkout System Supporting a Customer-Transparent Two-Factor Authentication Process During Product Checkout Operations, filed Apr. 22, 2013 (Cunningham et al.); U.S. patent application Ser. No. 13/888,884 for an Indicia Reading System Employing Digital Gain Control, filed May 7, 2013 (Xian et al.); U.S. patent application Ser. No. 13/895,616 for a Laser Scanning Code Symbol Reading System Employing Multi-Channel Scan Data Signal Processing with Synchronized Digital Gain Control (SDGC) for Full Range Scanning, filed May 16, 2013 (Xian et al.); U.S. patent application Ser. No. 13/897,512 for a Laser Scanning Code Symbol Reading System Providing Improved Control over the Length and Intensity Characteristics of a Laser Scan Line Projected Therefrom Using Laser Source Blanking Control, filed May 20, 2013 (Brady et al.); U.S. patent application Ser. No. 13/897,634 for a Laser Scanning Code Symbol Reading System Employing Programmable Decode Time-Window Filtering, filed May 20, 2013 (Wilz, Sr. et al.); U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.); U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin); U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield); U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.) and U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A system for reading code symbols, comprising: an imaging subsystem for capturing images within the imaging subsystem's field of view, the imaging subsystem including a focusing module; and an image processor configured for: selecting an initial, predicted focal distance for the imaging subsystem's focusing module; processing an image of a code symbol captured by the imaging subsystem in order to decode the code symbol; and storing in memory the focal distance associated with the decoded code symbol; calculating a weighted moving average of the stored focal distances; and selecting the initial, predicted focal distance based on the calculated, weighted moving average of the stored focal distances.
  • 2. The system for reading code symbols according to claim 1, wherein the image processor is configured to perform an autofocus routine if processing the captured image of the code symbol at the initial, predicted focal distance fails to decode the code symbol.
  • 3. The system for reading code symbols according to claim 1, wherein the image processor is configured to perform an autofocus routine if the number of memorized focal distances associated with previously decoded code symbols is less than a predetermined minimum.
  • 4. The system for reading code symbols according to claim 1, wherein the imaging subsystem detects the presence of a code symbol within the imaging subsystem's field of view.
  • 5. The system for reading code symbols according to claim 1, comprising an object detection subsystem for detecting the presence of an object within the imaging subsystem's field of view.
  • 6. The system for reading code symbols according to claim 1, comprising a hand-supportable housing, wherein the imaging subsystem and image processor are disposed within the hand-supportable housing.
  • 7. The system for reading code symbols according to claim 1, comprising an input/output subsystem for outputting system signals.
  • 8. A system for reading code symbols, comprising: an imaging subsystem for capturing images, the imaging subsystem comprising a focusing module; and an image processor configured for: selecting an initial, predicted focal distance for the imaging subsystem's focusing module; processing an image of a code symbol captured by the imaging subsystem in order to decode the code symbol; and storing in memory the focal distance associated with the decoded code symbol; wherein the image processor is configured to select the initial, predicted focal distance as a function of memorized focal distances associated with previously decoded code symbols if the number of memorized focal distances associated with previously decoded code symbols is equal to or greater than a predetermined minimum.
  • 9. The system for reading code symbols according to claim 8, wherein the image processor is configured to perform an autofocus routine if processing the captured image of the code symbol at the initial, predicted focal distance fails to decode the code symbol.
  • 10. The system for reading code symbols according to claim 8, wherein the imaging subsystem detects the presence of a code symbol within the imaging subsystem's field of view.
  • 11. The system for reading code symbols according to claim 8, comprising an object detection subsystem for detecting the presence of an object within the imaging subsystem's field of view.
  • 12. The system for reading code symbols according to claim 8, comprising a hand-supportable housing, wherein the imaging subsystem and image processor are disposed within the hand-supportable housing.
  • 13. The system for reading code symbols according to claim 8, comprising an input/output subsystem for outputting system signals.
  • 14. A system for reading code symbols, comprising: an imaging subsystem for capturing images within the imaging subsystem's field of view, the imaging subsystem including a focusing module; and an image processor configured for: selecting an initial, predicted focal distance for the imaging subsystem's focusing module with respect to a code symbol; processing one or more images of a code symbol captured by the imaging subsystem at the initial, predicted focal distance; if processing a predetermined number of captured images of a code symbol fails to decode the code symbol, then performing an autofocus routine; and if processing a captured image of the code symbol decodes the code symbol, then storing in memory the focal distance associated with the decoded code symbol; calculating a weighted moving average of the focal distances associated with decoded code symbols stored in memory; selecting the initial, predicted focal distance based on the calculated, weighted moving average of the stored focal distances.
  • 15. The system for reading code symbols according to claim 14, wherein the image processor is configured to store in memory the initial, predicted focal distance associated with the decoded code symbol if processing a captured image of a code symbol at the initial, predicted focal distance decodes the code symbol.
  • 16. The system for reading code symbols according to claim 14, wherein the image processor is configured to perform an autofocus routine if the number of memorized focal distances associated with previously decoded code symbols is less than a predetermined minimum.
  • 17. The system for reading code symbols according to claim 14, wherein the image processor is configured to process one or more additional captured images of the code symbol after the image processor performs an autofocus routine.
  • 18. The system for reading code symbols according to claim 14, wherein the imaging subsystem detects the presence of a code symbol within the imaging subsystem's field of view.
  • 19. The system for reading code symbols according to claim 14, comprising an object detection subsystem for detecting the presence of a code symbol within the imaging subsystem's field of view.
  • 20. The system for reading code symbols according to claim 14, comprising a hand-supportable housing, wherein the imaging subsystem and image processor are disposed within the hand-supportable housing.
US Referenced Citations (410)
Number Name Date Kind
6336587 He et al. Jan 2002 B1
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366001 Craen et al. Feb 2013 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8608075 Tanimoto et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8651384 Ogawa et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9104929 Todeschini Aug 2015 B2
20040118919 Breytman et al. Jun 2004 A1
20070063048 Havens et al. Mar 2007 A1
20070131770 Nunnink Jun 2007 A1
20090072037 Good et al. Mar 2009 A1
20090134221 Zhu et al. May 2009 A1
20090166426 Giebel et al. Jul 2009 A1
20090206158 Thuries et al. Aug 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120237193 Kawarada Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166758 Goren Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Liu et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
Foreign Referenced Citations (4)
Number Date Country
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (77)
Entry
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/462,801 for Mobile Computing Device With Data Cognition Software, filed Aug. 19, 2014 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/596,757 for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages.
U.S. Appl. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages.
U.S. Appl. No. 14/662,922 for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages.
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 14/614,796 for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages.
U.S. Appl. No. 29/516,892 for Table Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 14/578,627 for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages.
U.S. Appl. No. 14/573,022 for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages.
U.S. Appl. No. 14/529,857 for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages.
U.S. Appl. No. 14/519,195 for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages.
U.S. Appl. No. 14/519,211 for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages.
U.S. Appl. No. 14/519,233 for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages.
U.S. Appl. No. 14/533,319 for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages.
U.S. Appl. No. 14/748,446 for Cordless Indicia Reader With a Multifunction Coil for Wireless Charging and Eas Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages.
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages.
U.S. Appl. No. 14/519,249 for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages.
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages.
U.S. Appl. No. 14/398,542 for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Bian et al.); 22 pages.
U.S. Appl. No. 14/405,278 for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages.
U.S. Appl. No. 14/590,024 for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages.
U.S. Appl. No. 14/568,305 for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/580,262 for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages.
Examination Report in counterpart United Kingdom Application No. GB1410734.6 dated Oct. 6, 2015, pp. 1-2.
Search and Exam Report in Application No. GB1410734.6, related to Current Application, Dated Dec. 18, 2014, 6 pages.
Examination Report in counterpart United Kingdom Application No. GB1410734.6 dated Jul. 17, 2015, pp. 1-3.
U.S. Appl. No. 14/519,179 for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages.
U.S. Appl. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014, (Ackley et al.); 39 pages.
U.S. Appl. No. 14/453,019 for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages.
U.S. Appl. No. 14/452,697 for Interactive Indicia Reader, filed Aug. 6, 2014 (Todeschini); 32 pages.
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 14/513,808 for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages.
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages.
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages.
U.S. Appl. No. 14/483,056 for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages.
U.S. Appl. No. 14/531,154 for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 14/340,627 for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Reublinger et al.); 41 pages.
U.S. Appl. No. 14/676,327 for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages.
U.S. Appl. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages.
U.S. Appl. No. 14/327,827 for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages.
U.S. Appl. No. 14/334,934 for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages.
U.S. Appl. No. 14/619,093 for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages.
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/614,706 for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages.
U.S. Appl. No. 14/628,708 for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/529,563 for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augumented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/695,364 for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 44 pages.
U.S. Appl. No. 14/664,063 for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/527,191 for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/535,764 for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages.
U.S. Appl. No. 14/687,289 for System for Communication Via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/674,329 for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/695,923 for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
Related Publications (1)
Number Date Country
20150339504 A1 Nov 2015 US
Continuations (1)
Number Date Country
Parent 13927398 Jun 2013 US
Child 14818528 US