Systems and methods for determining a location of a user when using an imaging device in an indoor facility

Information

  • Patent Grant
  • Patent Number
    10,592,536
  • Date Filed
    Tuesday, May 30, 2017
  • Date Issued
    Tuesday, March 17, 2020
Abstract
The present disclosure is generally directed to obtaining location information of a person or a machine when the person or machine uses an imaging device to capture an image of one or more objects located in an indoor facility. The location information can be obtained by processing the captured image in conjunction with a reference map and/or a database. The reference map can be generated by using one or more reference images of the various objects in the indoor facility. The database can contain information such as a location, a dimension, and an orientation of each of the various objects in the indoor facility. The location information can also be obtained by processing the captured image to examine encoded data content in a label and/or an imaging distortion present in the label. The labels, which can be barcode labels, are affixed to various objects in the indoor facility.
Description
FIELD OF THE INVENTION

The present invention generally relates to location systems and more particularly relates to systems and methods for determining a location of a user when using an imaging device in an indoor facility.


BACKGROUND

Standalone cameras have been supplanted to a large extent by various other devices, such as smartphones and tablet computers, that are now capable of not only capturing images but also providing certain types of information pertaining to the captured images. For example, a smartphone can be used not only to capture an image but also to use global positioning system (GPS) technology to identify an object in the captured image. The object can be an iconic structure such as the White House or the Eiffel Tower, for example, and the smartphone can automatically identify these structures based on GPS location information of the smartphone when the image is being captured. However, GPS signals often fail to penetrate buildings and therefore cannot be used as a reliable means to obtain positioning coordinates and information pertaining to some indoor objects. Consequently, a need exists for location systems that can operate reliably in various environments.


SUMMARY

Accordingly, in one aspect, the present disclosure embraces a method that includes generating at least one of a reference map or a database, the reference map comprising one or more reference images of a plurality of objects located in an indoor facility, the at least one of the reference map or the database providing information about one or more of a location, a dimension, and an orientation of one or more of the plurality of objects located in the indoor facility. The method further includes using an imaging device to capture a first image, the first image comprising a first object among the plurality of objects located in the indoor facility, and also includes processing the first image in cooperation with the at least one of the reference map or the database to determine a first location of a user of the imaging device or a machine incorporating the imaging device when the imaging device is used to capture the first image in the indoor facility.


In another aspect, the present disclosure pertains to a method that includes using an imaging device to capture an image of at least a first object among a plurality of objects located in an indoor facility. The method further includes processing the image in cooperation with at least one of a reference map or a database, the reference map comprising one or more reference images of the plurality of objects located in the indoor facility, the database containing information about one or more of a location, a dimension, and an orientation of each of the plurality of objects located in the indoor facility. The method also includes determining a location of a user of the imaging device or a machine using the imaging device, based on the processing.


In yet another aspect, the present disclosure pertains to a method that includes generating information indicative of at least one of a location, a dimension, or a placement attribute of each of a plurality of objects located in an indoor facility; using an imaging device to capture an image of at least a first object among the plurality of objects located in the indoor facility; processing the image to identify at least one of a distance, an orientation, or an angular offset of the imaging device with respect to the at least the first object; and using the at least one of the distance, the orientation, or the angular offset to determine a location of one of a user of the imaging device or a machine incorporating the imaging device, when the imaging device is used to capture the image of the at least the first object in the indoor facility.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages described in this disclosure, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts an exemplary imaging device being used in accordance with an embodiment of the disclosure to generate a reference map and/or a database pertaining to a number of objects located inside an indoor facility.



FIG. 2 schematically depicts a user of an exemplary imaging device when using the imaging device in accordance with an embodiment of the disclosure.



FIG. 3 schematically depicts an exemplary imaging device being used in accordance with the disclosure to capture one or more images of labels placed upon various objects arranged inside the indoor facility in a first exemplary arrangement.



FIG. 4 schematically depicts a user using an imaging device in accordance with the disclosure to capture one or more images of labels placed upon various objects arranged inside the indoor facility in a second exemplary arrangement.



FIGS. 5A-5E schematically depict some exemplary distortions in captured images of a label in accordance with the disclosure.



FIG. 6 schematically depicts an exemplary user location system that can be located in an imaging device and/or in a cloud device communicatively coupled to the imaging device.





DETAILED DESCRIPTION

Throughout this description, embodiments and variations are described for the purpose of illustrating uses and implementations of inventive concepts. The illustrative description should be understood as presenting examples of inventive concepts, rather than as limiting the scope of the concepts as disclosed herein. Towards this end, certain words and terms are used herein solely for convenience and such words and terms should be broadly understood as encompassing various objects and actions that are generally understood in various forms and equivalencies by persons of ordinary skill in the art. For example, words such as “location,” “placement,” “label,” “imaging device,” “user,” “person,” “machine,” and “database” can have various interpretations, and certain operations associated with such words can be implemented in different ways without detracting from the spirit of the disclosure. It should also be understood that the words “person” and “user” as used herein can apply equally well to a machine in many instances. For example, when the description indicates certain actions executed by a person, it should be understood that these actions may be executed in some instances by a machine (a robot, for example). It should also be understood that the words “example” and “exemplary” as used herein are intended to be non-exclusionary and non-limiting in nature. More particularly, the word “exemplary” as used herein indicates one among several examples, and it should be understood that no special emphasis, exclusivity, or preference is associated with or implied by the use of this word.


The present disclosure is generally directed to a user location system that can be used to obtain location information of a user of an imaging device (or a machine that uses an imaging device) when the imaging device is used to capture one or more images of objects located in an indoor facility. The location information can be obtained by processing the captured image in conjunction with a reference map and/or a database. The reference map can be generated from one or more reference images that provide location information of the various objects in the indoor facility. The database can contain information such as a location, a dimension, and an orientation of the various objects in the indoor facility. The location information can also be obtained by processing one or more labels in the captured image. The labels, such as barcode labels, are affixed to the various objects. In some implementations, a label can contain embedded data that provides the location information. In some other implementations, a distortion in the image of a label can be used to determine the location of an object on which the label is located. The user location system can be installed in various devices, such as a smartphone, a tablet computer, a machine incorporating an imaging system, and/or a server computer. Some examples of a machine that can use an imaging system in accordance with the disclosure include a drone, a robot, or a vehicle (automated or driven by a person).


Attention is now drawn to FIG. 1, which schematically depicts an exemplary imaging device 105 being used in accordance with an embodiment of the disclosure to generate a reference map (and/or a database) pertaining to a number of objects located inside an indoor facility 100. The imaging device 105 can be any device that incorporates a camera, such as for example, a digital single-lens reflex (DSLR) camera, a video camera, a smartphone, or a tablet computer. The indoor facility 100 can be any building such as for example, a warehouse, a factory floor, a residence, a commercial establishment, or an office. The various objects can be items such as boxes, crates, parcels, books, household objects, or machinery parts, and can be moved and arranged in various ways such as on shelves, racks, pallets, or tabletops. The various objects can also include various fixtures and fixed objects located in the indoor facility 100. Such fixed objects can include for example, a pillar 155, a room 160, a window 170, and a door 165.


The imaging device 105 can be used by a person (not shown) to capture a number of reference images of the various objects, from various locations in the indoor facility 100. The reference images can then be used to generate a reference map of the various objects located in the indoor facility 100. For example, a first reference image can be captured by the person when standing at a first position next to the room 160 and pointing the imaging device 105 towards a first set of objects. The various line-of-sight visibility paths for the imaging device 105 from this first position to some of the objects located in the indoor facility 100 are indicated by dashed lines extending from the imaging device 105 to object 130, object 150, and object 145. Though not indicated by dashed lines, the imaging device 105 also has line-of-sight visibility paths from this first position to object 110, object 125, and object 140. Consequently, the first reference image includes images of object 110, object 125, object 140, object 130, object 150, and object 145. However, the first reference image does not include images of some other objects such as object 115, object 120, and object 135 due to the presence of intervening objects that block line-of-sight visibility to the imaging device 105.


A second reference image can be captured by the person when standing at a second position next to the pillar 155 and pointing the imaging device 105 towards a second set of objects. The various line-of-sight visibility paths from this second position are indicated by dashed lines extending from the imaging device 105 to object 115 and object 145. Though not indicated by dashed lines, the imaging device 105 also has line-of-sight visibility paths from this second position to object 110, object 125, and object 140. Consequently, the second reference image includes images of object 110, object 125, object 140, object 115, and object 145. However, the second reference image does not include images of some other objects such as object 130, object 120, object 135, and object 150 due to the presence of intervening objects.


A third reference image can be captured by the person when standing at a third position further away from the pillar 155 and pointing the imaging device 105 towards a third set of objects. The various line-of-sight visibility paths from this third position are indicated by dashed lines extending from the imaging device 105 to object 135 and object 130. Though not indicated by dashed lines, the imaging device 105 also has line-of-sight visibility paths from the third position to object 110, object 125, and object 140. Consequently, the third reference image includes images of object 110, object 125, object 140, object 130, and object 135. However, the third reference image does not include images of some other objects such as object 115, object 120, object 145, and object 150 due to the presence of intervening objects.


A fourth reference image can be captured by the person when standing at a fourth position near the window 170 and pointing the imaging device 105 towards a fourth set of objects. The various line-of-sight visibility paths from this fourth position are indicated by dashed lines extending from the imaging device 105 to object 120 and object 115. Though not indicated by dashed lines, the imaging device 105 also has line-of-sight visibility paths from the fourth position to object 110, object 125, and object 140. Consequently, the fourth reference image includes images of object 110, object 125, object 140, object 120, and object 115. However, the fourth reference image does not include images of some other objects such as object 130, object 135, object 145, and object 150 due to the presence of intervening objects.


A fifth reference image can be captured by the person when standing at a fifth position near the door 165 and pointing the imaging device 105 towards a fifth set of objects. The various line-of-sight visibility paths from this fifth position are indicated by dashed lines extending from the imaging device 105 to object 115 and object 130. Though not indicated by dashed lines, the imaging device 105 also has line-of-sight visibility paths from the fifth position to object 120, object 135, and object 150. Consequently, the fifth reference image includes images of object 120, object 135, object 150, object 115, and object 130. However, the fifth reference image does not include images of some other objects such as object 110, object 125, object 140, and object 145 due to the presence of intervening objects.


Additional images can be captured by the person when standing at various other locations in the indoor facility 100 and at various line-of-sight angles. It must be understood that in various implementations, the imaging device 105 can be used to not only capture reference images at various azimuth angles but can also be used to capture reference images at various elevation angles (looking upwards at rack shelves for example, or looking downwards from an upper floor of the indoor facility 100 for example). In some exemplary implementations, an airborne machine such as an aerial drone can be used to capture the reference images, thereby providing aerial views of the various objects located in the indoor facility 100.


The reference images can be consolidated to generate a reference map of the locations of the various objects in the indoor facility 100. Accordingly, the reference map can provide information such as for example, line-of-sight visibility of one or more objects from the first location and a lack of line-of-sight visibility of some other objects from the first location, line-of-sight visibility of one or more objects from the second location and a lack of line-of-sight visibility of some other objects from the second location, and so on.
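
By way of illustration only, the reference map described above can be thought of as a table that records which objects are visible from each reference position, against which a newly captured image can later be matched. The following Python sketch assumes this simplified representation; the position names and object identifiers merely mirror the FIG. 1 narrative and are not part of the disclosure.

    # Illustrative visibility table: each reference position maps to the
    # set of objects visible from it (per the FIG. 1 narrative above).
    REFERENCE_MAP = {
        "near_room_160":   {110, 125, 140, 130, 150, 145},
        "near_pillar_155": {110, 125, 140, 115, 145},
        "far_pillar_155":  {110, 125, 140, 130, 135},
        "near_window_170": {110, 125, 140, 120, 115},
        "near_door_165":   {120, 135, 150, 115, 130},
    }

    def candidate_positions(visible_objects):
        """Return the reference positions whose visibility sets best match
        the objects detected in a newly captured image."""
        scored = [(len(visible_objects & seen), position)
                  for position, seen in REFERENCE_MAP.items()]
        best = max(score for score, _ in scored)
        return [position for score, position in scored if score == best]

    # An image showing objects 110, 125, 140, 115, and 145 is most
    # consistent with the position next to the pillar 155.
    print(candidate_positions({110, 125, 140, 115, 145}))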


A database of the locations of the various objects in the indoor facility 100 can be generated and used in lieu of, or in addition to, the reference map. In one example implementation, the database can be generated by using an x-y grid mapping system to provide a floor plan that indicates the placement of the various objects on the floor of the indoor facility 100. For example, in such an x-y grid mapping system, a first corner 153 of the indoor facility 100 can be designated as having (0,0) coordinates, a second corner 151 of the indoor facility 100 can be designated as having (0,100) coordinates, a third corner 152 of the indoor facility 100 can be designated as having (100,100) coordinates, and a fourth corner 154 of the indoor facility 100 can be designated as having (100,0) coordinates. Consequently, it can be determined that object 140 is located at a coordinate location (35,10), for example, and this location information can be incorporated into the database.
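
As a concrete illustration of the x-y grid mapping described above, the database could map each object identifier to its floor-plan coordinates and other attributes. The Python sketch below uses the corner coordinates given for FIG. 1; the object records and field names are illustrative assumptions, and a z coordinate could be added for the x-y-z variant described next.

    # Facility extent per the FIG. 1 example: corner 153 at (0, 0) and
    # corner 152 at (100, 100).
    FACILITY_EXTENT = ((0, 0), (100, 100))

    # Illustrative database: object id -> location and other attributes.
    object_db = {
        140: {"xy": (35, 10), "orientation_deg": 0},
    }

    def register_object(object_id, x, y, **attributes):
        """Record an object's floor-plan coordinates (and optional
        attributes such as dimensions or orientation) in the database."""
        (x0, y0), (x1, y1) = FACILITY_EXTENT
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            raise ValueError("coordinates fall outside the facility grid")
        object_db[object_id] = {"xy": (x, y), **attributes}

    register_object(130, 70, 60, orientation_deg=45)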


In another example implementation, the database can be generated by using an x-y-z grid mapping system that not only provides a floor plan indicating placement of the various objects on the floor of the indoor facility 100 but further provides elevation information of one or more objects, such as a box placed upon an upper shelf of a rack.


In yet another example implementation, the database can be generated by using an existing item location system such as a library book locating system that helps in identifying the location of a particular book on a shelf.


Irrespective of the manner in which a reference map or a database is generated, an image that is subsequently captured by a user (using the imaging device 105, which can be a digital camera for example, or a different imaging device, which can be a smartphone for example) from any location in the indoor facility 100 can be interpreted using the reference map and/or the database to derive information about where the user or machine is located in the indoor facility 100 with respect to various objects. In one example implementation, the user of the imaging device 105 is a worker moving around the indoor facility 100 (a warehouse, for example) looking to retrieve object 130. The imaging device 105, independently or in cooperation with one or more other devices (such as a communication device used by a supervisor), can be used to identify a current location of the worker based on one or more images captured by the worker. The current location information can then be used to guide the worker to the object 130, as illustrated in the sketch below. In another example implementation, the imaging device 105 is mounted on an aerial drone and the reference map/database can be used to identify a current location of the drone based on images captured by the aerial drone while in flight. The current location information can then be used to pilot the aerial drone around obstacles and towards a desired destination in the indoor facility 100.
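
The guidance step can be sketched briefly: once a current location has been derived from a captured image, the database yields the target object's coordinates, from which a distance and bearing can be computed. The coordinates below are assumed values for illustration, with north taken as the +y direction of the grid.

    import math

    def heading_to_target(current_xy, target_xy):
        """Return (distance, compass bearing in degrees) from the user's
        current position to the target object's position."""
        dx = target_xy[0] - current_xy[0]
        dy = target_xy[1] - current_xy[1]
        distance = math.hypot(dx, dy)
        bearing = math.degrees(math.atan2(dx, dy)) % 360  # 0 = north, 90 = east
        return distance, bearing

    # A worker at (20, 20) being guided to object 130 at (70, 60) would be
    # directed roughly north-east.
    distance, bearing = heading_to_target((20, 20), (70, 60))
    print(f"move {distance:.1f} grid units on bearing {bearing:.0f} degrees")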


The imaging device 105 used for such operations can be a digital camera that captures one image at a time, or a video camera that captures images in a video streaming format. The user location system, which can be a part of the imaging device 105 or a part of a different device, can operate in real time to process images sent from the still camera or the video camera.



FIG. 2 schematically depicts a user 205 of an imaging device 210 (which can be the imaging device 105) when using the imaging device 210 in accordance with an exemplary embodiment of the disclosure to capture one or more images of the objects located inside the indoor facility 100. A first image captured by the user 205 can include images of object 110, object 125, object 140, object 130, object 150, and object 145. The user location system, which can be a part of the imaging device 210 or can be a part of a different device, can use the reference map and/or the database to interpret the first image captured by the user 205 and inform the user 205 that he can move from his current location (next to the room 160) in a north-easterly direction between object 125 and object 140 in order to reach and retrieve object 130.


A second image captured by the user 205 can include images of object 110, object 125, object 140, object 115, and object 145. The user location system can use the reference map to determine that the user 205 is located next to the pillar 155 and can further use the various reference images of the reference map to provide guidance to the user for moving towards object 150, even though object 150 is not visible in the second image.


A third image captured by the user 205 can be similarly used to guide the user 205 from a location next to the window 170 to the object 145 for example. A fourth image captured by the user 205 can be used to guide the user 205 from a location next to the door 165 to the object 115 or the object 140 for example.



FIG. 3 schematically depicts the imaging device 310 (which can be the imaging device 105) being used in accordance with the disclosure to capture one or more images of labels placed upon various objects arranged inside the indoor facility 100 in a first exemplary arrangement. The imaging device 310 can be first used by the user 205 to generate a reference map and/or a database (as described above with respect to FIG. 1) that includes data pertaining to the various labels. The imaging device 310 can be subsequently used by the user 205 to capture one or more images of the objects located inside the indoor facility 100 (as described above with respect to FIG. 2) in order to allow operation of the user location system. However, in contrast to the procedures described above with respect to FIGS. 1 and 2, the labels placed upon the various objects are used here to determine a current location of the user 205. The labels can be any one of various types of labels, such as a Universal Product Code (UPC) label.


In accordance with the disclosure, a label can have various kinds of information embedded in a barcode, for example, and can be used by the user location system for determining a location of the imaging device 310 in the indoor facility 100 when the imaging device 310 is used to capture an image. For example, a label 305 that is attached to object 125 can provide information that indicates a characteristic of the object 125 (size, weight, etc.) and/or information pertaining to a shelf or rack where the object 125 is placed. In some implementations, the label 305 can include a unique tag value that can be used by the user location system to look up information pertaining to the object 125, as sketched below. Label 315 and label 320 can be used in the same manner as label 305.
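
The tag-lookup use of a label can be illustrated as follows: the barcode payload decoded from a captured image serves as a key into a database of object records. This sketch assumes the third-party pyzbar and Pillow libraries for decoding; the tag format and database fields are illustrative only.

    from PIL import Image
    from pyzbar.pyzbar import decode

    # Illustrative registry: tag value embedded in a label -> object record.
    label_registry = {
        "TAG-00125": {"object_id": 125, "shelf": "A3", "xy": (40, 25)},
    }

    def locate_by_labels(image_path):
        """Decode every barcode label visible in the captured image and
        return the database records of the objects they are affixed to."""
        records = []
        for symbol in decode(Image.open(image_path)):
            tag = symbol.data.decode("utf-8")
            if tag in label_registry:
                records.append(label_registry[tag])
        return records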



FIG. 4 schematically depicts the imaging device 310 (which can be the imaging device 105) being used in accordance with the disclosure to capture one or more images of labels placed upon various objects arranged inside the indoor facility 100 in a second exemplary arrangement. In this second exemplary implementation, one or more objects are oriented at an angle with respect to the imaging device 310 when the imaging device 310 is being used by the user 205 standing near the pillar 155, for example. Consequently, label 305, which is directly in front of the user 205, may appear undistorted when reproduced in a captured image, whereas each of the label 405 and the label 410 can appear distorted due to the angular orientation of the object 110 and the object 140 with respect to the user 205. The nature of the distortion in each of the label 405 and the label 410 can be used to obtain information such as dimension information and/or orientation information (in addition to location information that may be embedded in the label).


In some implementations, the nature of the distortion in each of the label 405 and the label 410 can be assessed by using a polygon that represents an orientation and placement of each of these labels on the object 110 and the object 140 respectively. A distance between the imaging device 310 and the object 110, for example, can be calculated by using one or more of a pixel size of the polygon, a radians-per-pixel parameter of the imaging device 310, and/or a physical size of the label 405 placed upon the object 110. In some implementations, a direction vector from the label 405 to the imaging device 310 can be used for obtaining two scalar values. The two scalar values can be used to determine an orientation of the label 405 relative to the imaging device 310, where an angle from normal can be indicative of a distortion in a reproduction of the label 405 in a captured image and/or indicative of a direction in which the imaging device 310 is pointed. The direction in which the imaging device 310 is pointed can be calculated based on length dimensions of a parallelogram with respect to a known shape of the label 405. Label 410 can provide similar information (angular orientation information, distance information, etc.) pertaining to the object 140.
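
A minimal sketch of the distance and orientation estimates described above follows, using the small-angle approximation: the label's angular extent is its pixel extent multiplied by the radians-per-pixel parameter, and the tilt from the camera's viewing axis follows from the foreshortening of the label's known width. All parameter values are illustrative assumptions.

    import math

    def label_distance(pixel_extent, radians_per_pixel, physical_size_m):
        """Estimate camera-to-label distance from the label's apparent size."""
        angular_extent = pixel_extent * radians_per_pixel
        return physical_size_m / math.tan(angular_extent)

    def angle_from_normal(apparent_width_px, expected_width_px):
        """Estimate the label's tilt from the viewing axis: foreshortening
        shrinks the apparent width by roughly cos(angle)."""
        ratio = min(apparent_width_px / expected_width_px, 1.0)
        return math.degrees(math.acos(ratio))

    # A 0.10 m label spanning 50 pixels at 0.001 radians per pixel sits
    # roughly 2 m from the camera.
    print(round(label_distance(50, 0.001, 0.10), 2))
    # A label appearing 40 px wide where 50 px is expected is tilted ~37 deg.
    print(round(angle_from_normal(40, 50), 1))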



FIGS. 5A-5E schematically depict some distortions in captured images of the label 305 in accordance with the disclosure. FIG. 5A illustrates an undistorted label 305, which is a rectangular UPC label in this example. The label 305 can be other than a UPC label and can have other shapes (square, oval, circular, etc.) in other implementations. The imaging device 310 can be first used by the user 205 to generate a reference map and/or database as described above with respect to FIG. 1. The reference map and/or database can include, in this embodiment, information about one or more of a predefined shape, a predefined size, a predefined orientation, and/or predefined location information provided in the label 305. The user 205 can use the imaging device 310 to capture one or more images of the labels that are affixed to the objects located inside the indoor facility 100 (as described above with respect to FIG. 2) in order to allow operation of the user location system. The user 205 can ensure that the label 305 is visible to the imaging device 310 when capturing the one or more images. The user location system can detect a distortion of the rectangular shape of the label 305 in a captured image and can determine the location of the imaging device 310 in the indoor facility 100 when that image was captured, based at least in part on examining a nature and an extent of the distortion of the rectangular shape.
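
One way to examine the nature of the distortion, sketched below under assumed conventions, is to compare opposite edge lengths of the label polygon detected in the captured image: a trapezoid whose top edge is longer than its bottom edge suggests the camera sits above the label, and so on for the other cases shown in FIGS. 5B-5E. The corner ordering and tolerance are illustrative assumptions, not the disclosure's method.

    def classify_distortion(corners, tol=0.05):
        """Roughly infer where the camera sits relative to a rectangular
        label from its four corners, ordered top-left, top-right,
        bottom-right, bottom-left in image coordinates."""
        (tl_x, tl_y), (tr_x, tr_y), (br_x, br_y), (bl_x, bl_y) = corners
        top, bottom = abs(tr_x - tl_x), abs(br_x - bl_x)
        left, right = abs(bl_y - tl_y), abs(br_y - tr_y)
        if top > bottom * (1 + tol):
            return "camera above the label (cf. FIG. 5B)"
        if bottom > top * (1 + tol):
            return "camera below the label (cf. FIG. 5C)"
        if left > right * (1 + tol):
            return "camera to the left of the label"
        if right > left * (1 + tol):
            return "camera to the right of the label"
        return "label viewed head-on (cf. FIG. 5A)"

    # A trapezoid with a longer top edge suggests a downward-pointing camera.
    print(classify_distortion([(0, 0), (100, 0), (90, 60), (10, 60)]))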



FIG. 5B shows a first type of distortion that can be present in the label 405 in one or more images captured by the imaging device 310. Such a distortion may occur when the imaging device 310 is being pointed downwards (from an upper floor of the indoor facility 100, for example) when capturing the image having the label 405. Accordingly, based on the polygonal shape of the label 405 shown in FIG. 5B, the user location system can determine that the user 205 is standing at a specific spot and facing a specific direction, on the upper floor with respect to the object 110. Location information of the object 110 on which label 405 is affixed, and/or distance information from the imaging device 310, can be determined using one or more of the procedures described above with reference to FIG. 4.



FIG. 5C shows a second type of distortion that can be present in the label 405 in one or more images captured by the imaging device 310. Such a distortion may occur when the imaging device 310 is being pointed upwards (towards an upper shelf of a rack, for example) when capturing the image having the label 405. Accordingly, based on the polygonal shape of the label 405 shown in FIG. 5C, the user location system can determine that the user 205 is standing at a specific spot and facing a specific direction, next to a shelf on which the object 110 is placed. Location information of the object 110 on which label 405 is affixed, and/or distance information from the imaging device 310, can be determined using one or more of the procedures described above with reference to FIG. 4.



FIG. 5D shows a third type of distortion that can be present in the label 405, for example, in one or more images captured by the imaging device 310. Such a distortion may occur when the object 110 is oriented at a lateral angle with respect to the imaging device 310 when capturing the image having the label 405. Accordingly, based on the polygonal shape of the label 405 shown in FIG. 5D, the user location system can determine that the user 205 is standing at a specific spot and facing a specific direction, on one side (the right side, for example) of the object 110. Location information of the object 110 on which label 405 is affixed, and/or distance information from the imaging device 310, can be determined using one or more of the procedures described above with reference to FIG. 4.



FIG. 5E shows a fourth type of distortion that can be present in the label 410 in one or more images captured by the imaging device 310. Such a distortion may occur when the object 140 is oriented at a lateral angle with respect to the imaging device 310 and is located at an elevation with respect to the imaging device 310 when capturing the image having the label 410. Accordingly, based on the polygonal shape of the label 410 shown in FIG. 5E, the user location system can determine that the user 205 is standing at a specific spot and facing a specific direction, on one side (the left side, for example) and below the object 140. Location information of the object 140 on which label 410 is affixed, and/or distance information from the imaging device 310, can be determined using one or more of the procedures described above with reference to FIG. 4.



FIG. 6 schematically depicts a user location system 600 that can be located in an imaging device 605 and/or in a cloud device 635 that is communicatively coupled to the imaging device 605. The imaging device 605 can be one of the imaging device 105, the imaging device 210, or the imaging device 310 described above. Generally, in terms of hardware architecture, the imaging device 605 can include a processor 610, memory 620, one or more input/output (I/O) interfaces 615 (or peripherals), and a camera 625. These components are communicatively coupled to each other via a local interface 630, which can include address, control, and/or data connections to enable appropriate communications.


The processor 610 is a hardware device for executing software, particularly that stored in memory 620. The processor 610 can be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the imaging device 605, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.


The memory 620 can include any one or a combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and nonvolatile memory elements (a ROM for example). The memory 620 can have a distributed architecture, where various components are situated remote from one another, but can be accessed by the processor 610.


The software in memory 620 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. In the example of FIG. 6, the software in the memory 620 includes the user location system 600 in accordance with the disclosure, and a suitable operating system (O/S) 621. The operating system 621 essentially controls the execution of computer programs, such as the user location system 600, and provides input-output control, file and data management, memory management, and communication control and related services.


User location system 600 may be implemented as a source program, an executable program (object code), a script, or any other entity comprising a set of instructions to be performed. When implemented as a source program, the program may be translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory 620, so as to operate properly in connection with the O/S 621.


When the imaging device 605 is in operation, the processor 610 is configured to execute software stored within the memory 620, to communicate data to and from the memory 620, and to generally control operations of the imaging device 605 pursuant to the software. User location system 600 and the O/S 621, in whole or in part, but typically the latter, are read by the processor 610, perhaps buffered within the processor 610, and then executed.


When user location system 600 is implemented in software, it should be noted that the user location system 600 can be stored on any computer readable storage medium for use by or in connection with any computer related system or method. In the context of this disclosure, a computer readable storage medium is an electronic, magnetic, optical, or other physical device or means that can contain or store data and/or a computer program for use by or in connection with a computer related system or method.


The user location system 600 may be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this disclosure, a “non-transient computer-readable storage medium” or “non-transient computer-readable medium” can be any means that can store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.


In an alternative embodiment, where the user location system 600 is implemented in hardware, the user location system 600 can be implemented with any or a combination of the following technologies, which are each well known in the art: discrete logic circuits having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinatorial logic gates, a programmable gate array (PGA), or a field programmable gate array (FPGA).


The cloud device 635 can be, for example, a server computer that includes the user location system 600 and a database 636. The cloud device 635 can be communicatively coupled to the imaging device 605 via a network such as the Internet, as sketched below.
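
The split between the imaging device 605 and the cloud device 635 can be illustrated with a short client-side sketch in which a captured frame is uploaded for processing. The endpoint URL, request format, and response fields below are hypothetical, and the sketch assumes the third-party requests library; the disclosure does not specify this interface.

    import requests

    CLOUD_ENDPOINT = "https://example.com/user-location/locate"  # hypothetical

    def locate_via_cloud(image_path):
        """Upload one captured image to the cloud-hosted user location
        system and return its location estimate."""
        with open(image_path, "rb") as image_file:
            response = requests.post(CLOUD_ENDPOINT,
                                     files={"image": image_file},
                                     timeout=10)
        response.raise_for_status()
        return response.json()  # e.g. {"xy": [35, 10], "bearing_deg": 48}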


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266;
  • U.S. Pat. Nos. 7,159,783; 7,413,127;
  • U.S. Pat. Nos. 7,726,575; 8,294,969;
  • U.S. Pat. Nos. 8,317,105; 8,322,622;
  • U.S. Pat. Nos. 8,366,005; 8,371,507;
  • U.S. Pat. Nos. 8,376,233; 8,381,979;
  • U.S. Pat. Nos. 8,390,909; 8,408,464;
  • U.S. Pat. Nos. 8,408,468; 8,408,469;
  • U.S. Pat. Nos. 8,424,768; 8,448,863;
  • U.S. Pat. Nos. 8,457,013; 8,459,557;
  • U.S. Pat. Nos. 8,469,272; 8,474,712;
  • U.S. Pat. Nos. 8,479,992; 8,490,877;
  • U.S. Pat. Nos. 8,517,271; 8,523,076;
  • U.S. Pat. Nos. 8,528,818; 8,544,737;
  • U.S. Pat. Nos. 8,548,242; 8,548,420;
  • U.S. Pat. Nos. 8,550,335; 8,550,354;
  • U.S. Pat. Nos. 8,550,357; 8,556,174;
  • U.S. Pat. Nos. 8,556,176; 8,556,177;
  • U.S. Pat. Nos. 8,559,767; 8,599,957;
  • U.S. Pat. Nos. 8,561,895; 8,561,903;
  • U.S. Pat. Nos. 8,561,905; 8,565,107;
  • U.S. Pat. Nos. 8,571,307; 8,579,200;
  • U.S. Pat. Nos. 8,583,924; 8,584,945;
  • U.S. Pat. Nos. 8,587,595; 8,587,697;
  • U.S. Pat. Nos. 8,588,869; 8,590,789;
  • U.S. Pat. Nos. 8,596,539; 8,596,542;
  • U.S. Pat. Nos. 8,596,543; 8,599,271;
  • U.S. Pat. Nos. 8,599,957; 8,600,158;
  • U.S. Pat. Nos. 8,600,167; 8,602,309;
  • U.S. Pat. Nos. 8,608,053; 8,608,071;
  • U.S. Pat. Nos. 8,611,309; 8,615,487;
  • U.S. Pat. Nos. 8,616,454; 8,621,123;
  • U.S. Pat. Nos. 8,622,303; 8,628,013;
  • U.S. Pat. Nos. 8,628,015; 8,628,016;
  • U.S. Pat. Nos. 8,629,926; 8,630,491;
  • U.S. Pat. Nos. 8,635,309; 8,636,200;
  • U.S. Pat. Nos. 8,636,212; 8,636,215;
  • U.S. Pat. Nos. 8,636,224; 8,638,806;
  • U.S. Pat. Nos. 8,640,958; 8,640,960;
  • U.S. Pat. Nos. 8,643,717; 8,646,692;
  • U.S. Pat. Nos. 8,646,694; 8,657,200;
  • U.S. Pat. Nos. 8,659,397; 8,668,149;
  • U.S. Pat. Nos. 8,678,285; 8,678,286;
  • U.S. Pat. Nos. 8,682,077; 8,687,282;
  • U.S. Pat. Nos. 8,692,927; 8,695,880;
  • U.S. Pat. Nos. 8,698,949; 8,717,494;
  • U.S. Pat. Nos. 8,717,494; 8,720,783;
  • U.S. Pat. Nos. 8,723,804; 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Design Patent No. D702,237;
  • U.S. Pat. Nos. 8,740,082; 8,740,085;
  • U.S. Pat. Nos. 8,746,563; 8,750,445;
  • U.S. Pat. Nos. 8,752,766; 8,756,059;
  • U.S. Pat. Nos. 8,757,495; 8,760,563;
  • U.S. Pat. Nos. 8,763,909; 8,777,108;
  • U.S. Pat. Nos. 8,777,109; 8,779,898;
  • U.S. Pat. Nos. 8,781,520; 8,783,573;
  • U.S. Pat. Nos. 8,789,757; 8,789,758;
  • U.S. Pat. Nos. 8,789,759; 8,794,520;
  • U.S. Pat. Nos. 8,794,522; 8,794,525;
  • U.S. Pat. Nos. 8,794,526; 8,798,367;
  • U.S. Pat. Nos. 8,807,431; 8,807,432;
  • U.S. Pat. Nos. 8,820,630; 8,822,848;
  • U.S. Pat. Nos. 8,824,692; 8,824,696;
  • U.S. Pat. Nos. 8,842,849; 8,844,822;
  • U.S. Pat. Nos. 8,844,823; 8,849,019;
  • U.S. Pat. Nos. 8,851,383; 8,854,633;
  • U.S. Pat. Nos. 8,866,963; 8,868,421;
  • U.S. Pat. Nos. 8,868,519; 8,868,802;
  • U.S. Pat. Nos. 8,868,803; 8,870,074;
  • U.S. Pat. Nos. 8,879,639; 8,880,426;
  • U.S. Pat. Nos. 8,881,983; 8,881,987;
  • U.S. Pat. Nos. 8,903,172; 8,908,995;
  • U.S. Pat. Nos. 8,910,870; 8,910,875;
  • U.S. Pat. Nos. 8,914,290; 8,914,788;
  • U.S. Pat. Nos. 8,915,439; 8,915,444;
  • U.S. Pat. Nos. 8,916,789; 8,918,250;
  • U.S. Pat. Nos. 8,918,564; 8,925,818;
  • U.S. Pat. Nos. 8,939,374; 8,942,480;
  • U.S. Pat. Nos. 8,944,313; 8,944,327;
  • U.S. Pat. Nos. 8,944,332; 8,950,678;
  • U.S. Pat. Nos. 8,967,468; 8,971,346;
  • U.S. Pat. Nos. 8,976,030; 8,976,368;
  • U.S. Pat. Nos. 8,978,981; 8,978,983;
  • U.S. Pat. Nos. 8,978,984; 8,985,456;
  • U.S. Pat. Nos. 8,985,457; 8,985,459;
  • U.S. Pat. Nos. 8,985,461; 8,988,578;
  • U.S. Pat. Nos. 8,988,590; 8,991,704;
  • U.S. Pat. Nos. 8,996,194; 8,996,384;
  • U.S. Pat. Nos. 9,002,641; 9,007,368;
  • U.S. Pat. Nos. 9,010,641; 9,015,513;
  • U.S. Pat. Nos. 9,016,576; 9,022,288;
  • U.S. Pat. Nos. 9,030,964; 9,033,240;
  • U.S. Pat. Nos. 9,033,242; 9,036,054;
  • U.S. Pat. Nos. 9,037,344; 9,038,911;
  • U.S. Pat. Nos. 9,038,915; 9,047,098;
  • U.S. Pat. Nos. 9,047,359; 9,047,420;
  • U.S. Pat. Nos. 9,047,525; 9,047,531;
  • U.S. Pat. Nos. 9,053,055; 9,053,378;
  • U.S. Pat. Nos. 9,053,380; 9,058,526;
  • U.S. Pat. Nos. 9,064,165; 9,064,167;
  • U.S. Pat. Nos. 9,064,168; 9,064,254;
  • U.S. Pat. Nos. 9,066,032; 9,070,032;
  • U.S. Design Patent No. D716,285;
  • U.S. Design Patent No. D723,560;
  • U.S. Design Patent No. D730,357;
  • U.S. Design Patent No. D730,901;
  • U.S. Design Patent No. D730,902;
  • U.S. Design Patent No. D733,112;
  • U.S. Design Patent No. D734,339;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0265880;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. Patent Application Publication No. 2014/0231500;
  • U.S. Patent Application Publication No. 2014/0232930;
  • U.S. Patent Application Publication No. 2014/0247315;
  • U.S. Patent Application Publication No. 2014/0263493;
  • U.S. Patent Application Publication No. 2014/0263645;
  • U.S. Patent Application Publication No. 2014/0267609;
  • U.S. Patent Application Publication No. 2014/0270196;
  • U.S. Patent Application Publication No. 2014/0270229;
  • U.S. Patent Application Publication No. 2014/0278387;
  • U.S. Patent Application Publication No. 2014/0278391;
  • U.S. Patent Application Publication No. 2014/0282210;
  • U.S. Patent Application Publication No. 2014/0284384;
  • U.S. Patent Application Publication No. 2014/0288933;
  • U.S. Patent Application Publication No. 2014/0297058;
  • U.S. Patent Application Publication No. 2014/0299665;
  • U.S. Patent Application Publication No. 2014/0312121;
  • U.S. Patent Application Publication No. 2014/0319220;
  • U.S. Patent Application Publication No. 2014/0319221;
  • U.S. Patent Application Publication No. 2014/0326787;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0344943;
  • U.S. Patent Application Publication No. 2014/0346233;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0353373;
  • U.S. Patent Application Publication No. 2014/0361073;
  • U.S. Patent Application Publication No. 2014/0361082;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0003673;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0009610;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028102;
  • U.S. Patent Application Publication No. 2015/0028103;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0048168;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053766;
  • U.S. Patent Application Publication No. 2015/0053768;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0060544;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0063676;
  • U.S. Patent Application Publication No. 2015/0069130;
  • U.S. Patent Application Publication No. 2015/0071819;
  • U.S. Patent Application Publication No. 2015/0083800;
  • U.S. Patent Application Publication No. 2015/0086114;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0099557;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0129659;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0136854;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0144701;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0169925;
  • U.S. Patent Application Publication No. 2015/0169929;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178534;
  • U.S. Patent Application Publication No. 2015/0178535;
  • U.S. Patent Application Publication No. 2015/0178536;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0181093;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


In the specification and/or figures, exemplary embodiments of the invention have been disclosed. The present disclosure is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method comprising: generating a reference map comprising a reference image of a plurality of objects located in an indoor facility; processing the reference image to determine a first set of objects of the plurality of objects having a line-of-sight visibility from a location and a second set of objects of the plurality of objects having a lack of line-of-sight visibility from the location, for providing information about one or more of a location, a dimension, and an orientation of one or more of the plurality of objects located in the indoor facility; capturing, by an imaging device, a first image, the first image comprising a first object among the plurality of objects located in the indoor facility; and comparing the first image to the reference image comprising the first object, wherein the first object is an object of the first set of objects, to determine one or more of a location, a dimension, and an orientation of the first object and thereby determine a first location of the imaging device when the imaging device is used to capture the first image in the indoor facility.
  • 2. The method of claim 1, wherein the reference map is stored in at least one of the imaging device or a computer that is communicatively coupled to the imaging device, and wherein processing the first image is carried out in at least one of the imaging device or the computer.
  • 3. The method of claim 1, further comprising: based on determining the first location of the imaging device, providing an instruction to at least one of a user or a machine to travel from the first location to a second location corresponding to a second object, wherein the second object is an object from the second set of objects.
  • 4. The method of claim 1, wherein generating the reference map comprises using the imaging device to capture a set of reference images of the plurality of objects located in the indoor facility.
  • 5. The method of claim 1, wherein processing the first image comprises obtaining from a database, one or more of a location information, a dimension information, and an orientation information of the first object.
  • 6. The method of claim 1, wherein the reference map further comprises information about one or more of a predefined shape, a predefined size, a predefined orientation, and a predefined location of a label located on the first object, and wherein processing the first image to determine the first location of the imaging device used to capture the first image in the indoor facility comprises: locating the label in the first image captured by the imaging device; identifying one or more characteristics of the label in the first image; and determining, based at least in part on the one or more characteristics of the label in the first image, a location of the imaging device in the indoor facility when used to capture the first image.
  • 7. The method of claim 6, wherein the predefined shape of the label is one of a square shape or a rectangle shape; wherein identifying the one or more characteristics of the label in the first image comprises detecting a distortion of the one of the square shape or the rectangle shape; and wherein determining the location of the imaging device in the indoor facility when used to capture the first image is based at least in part on examining a nature and an extent of the distortion of the one of the square shape or the rectangle shape.
  • 8. The method of claim 6, wherein the label comprises a barcode, and wherein processing the first image to determine the location of the imaging device in the indoor facility when used to capture the first image further comprises: using information embedded in the barcode for determining, in real-time, the location of the imaging device in the indoor facility when used to capture the first image.
  • 9. A method comprising: using an imaging device to capture an image of at least a first object among a plurality of objects located in an indoor facility; processing the image in cooperation with a reference map to determine a location, a dimension, and an orientation of the at least first object, the reference map comprising a reference image of the plurality of objects located in the indoor facility, wherein the reference image provides information about a first set of objects of the plurality of objects having a line-of-sight visibility from a location and a second set of objects of the plurality of objects having a lack of line-of-sight visibility from the location, for providing information about one or more of a location, a dimension, and an orientation of each of the plurality of objects located in the indoor facility; and determining a location of the imaging device, based on the processing.
  • 10. The method of claim 9, wherein the reference map comprises one or more reference images of the plurality of objects located in the indoor facility, and wherein the reference map is stored in at least one of the imaging device or a computer that is communicatively coupled to the imaging device.
  • 11. The method of claim 9, wherein the reference map contains information about one or more of a predefined shape, a predefined size, a predefined orientation, and a predefined location of a label located on the first object, and wherein processing the image in cooperation with the reference map comprises: locating the label in the image captured by the imaging device; identifying one or more characteristics of the label in the image; and determining, based at least in part on the one or more characteristics of the label in the image, a location of the imaging device in the indoor facility.
  • 12. The method of claim 11, wherein the predefined shape of the label is one of a square shape or a rectangle shape; wherein identifying the one or more characteristics of the label in the image comprises detecting a distortion of the one of the square shape or the rectangle shape; and wherein determining the location of the imaging device in the indoor facility is based at least in part on examining a nature and an extent of the distortion of the one of the square shape or the rectangle shape.
  • 13. The method of claim 11, wherein the label comprises a barcode, and wherein processing the image to determine the location of the imaging device further comprises: using information embedded in the barcode for determining, in real-time, the location of the imaging device in the indoor facility when capturing the image.
  • 14. The method of claim 9, wherein the first object is one of a fixed object or a relocatable object located in the indoor facility.
  • 15. A system comprising: a processor configured to: receive, from an imaging device, an image of at least a first object among a plurality of objects located in an indoor facility; process the image in cooperation with a reference map to determine a location, a dimension, and an orientation of the at least first object, the reference map comprising a reference image of the plurality of objects located in the indoor facility, wherein the reference image provides information about a first set of objects of the plurality of objects having a line-of-sight visibility from a location and a second set of objects of the plurality of objects having a lack of line-of-sight visibility from the location, for providing information about one or more of a location, a dimension, and an orientation of each of the plurality of objects located in the indoor facility; and determine a location of the imaging device, based on the processing.
  • 16. The system of claim 15, wherein the reference map is stored in at least one of the imaging device or a computer that is communicatively coupled to the imaging device.
  • 17. The system of claim 15, wherein the reference map contains information about one or more of a predefined shape, a predefined size, a predefined orientation, and a predefined location of a label located on the first object, and wherein processing the image in cooperation with the reference map comprises: locating the label in the image captured by the imaging device; identifying one or more characteristics of the label in the image; and determining, based at least in part on the one or more characteristics of the label in the image, a location of the imaging device in the indoor facility.
  • 18. The system of claim 17, wherein the processor is further configured to: detect a distortion of the predefined shape of the label; and determine the location of the imaging device in the indoor facility based at least in part on examining a nature and an extent of the distortion.
  • 19. The system of claim 17, wherein the label comprises a barcode, and wherein the processor is further configured to: use information embedded in the barcode for determining, in real-time, the location of the imaging device in the indoor facility when capturing the image.
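
The specification contains no source code; as illustration only, the hedged Python sketches that follow show one way the principal claimed steps might be realized. Every function name, data layout, payload format, and threshold in these sketches is an assumption introduced for illustration, not material from the disclosure.

A minimal sketch of the line-of-sight partitioning recited in claim 1, assuming the indoor facility is modeled as a 2-D occupancy grid whose opaque cells represent racks and walls:

    from typing import Dict, List, Set, Tuple

    Point = Tuple[float, float]

    def has_line_of_sight(grid: List[List[bool]], start: Point, end: Point,
                          steps: int = 200) -> bool:
        # Sample interior points of the segment; any opaque cell blocks the view.
        (x0, y0), (x1, y1) = start, end
        for i in range(1, steps):
            t = i / steps
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            if grid[int(y)][int(x)]:
                return False
        return True

    def partition_by_visibility(grid: List[List[bool]],
                                objects: Dict[str, Point],
                                location: Point) -> Tuple[Set[str], Set[str]]:
        # First set: objects visible from `location`; second set: occluded objects.
        visible, occluded = set(), set()
        for name, pos in objects.items():
            (visible if has_line_of_sight(grid, location, pos) else occluded).add(name)
        return visible, occluded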
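
The comparison of a captured image against the reference map's reference images (claims 1, 9, and 15) could be approximated with local-feature matching. The sketch below assumes OpenCV's ORB detector, grayscale images, and a dictionary of reference images keyed by object identifier; the match-distance threshold of 40 is arbitrary:

    import cv2

    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def best_reference_match(captured, reference_images):
        # reference_images: dict mapping object id -> grayscale reference image.
        _, cap_desc = orb.detectAndCompute(captured, None)
        best_id, best_score = None, 0
        for obj_id, ref in reference_images.items():
            _, ref_desc = orb.detectAndCompute(ref, None)
            if cap_desc is None or ref_desc is None:
                continue
            matches = matcher.match(cap_desc, ref_desc)
            score = sum(1 for m in matches if m.distance < 40)  # assumed threshold
            if score > best_score:
                best_id, best_score = obj_id, score
        return best_id, best_score

Once the best-matching reference object is identified, its stored location, dimension, and orientation can be retrieved to resolve the imaging device's location.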
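
The per-object database of claim 5 could take many forms; a minimal sketch using Python's built-in sqlite3 module is shown below, with an assumed table layout, column names, and units:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE objects (
        object_id TEXT PRIMARY KEY,
        x REAL, y REAL,            -- floor coordinates, meters from facility origin
        width REAL, height REAL,   -- physical dimensions, meters
        yaw_deg REAL               -- orientation about the vertical axis, degrees
    )""")
    conn.execute("INSERT INTO objects VALUES ('rack-17', 12.5, 4.0, 2.4, 3.0, 90.0)")

    def lookup(object_id):
        # Returns (x, y, width, height, yaw_deg), or None if the object is unknown.
        return conn.execute(
            "SELECT x, y, width, height, yaw_deg FROM objects WHERE object_id = ?",
            (object_id,)).fetchone()

    print(lookup("rack-17"))  # (12.5, 4.0, 2.4, 3.0, 90.0)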
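
Claims 6-7 (and their counterparts 11-12 and 17-18) locate the imaging device from the nature and extent of the perspective distortion of a square or rectangular label. A hedged sketch using OpenCV's solvePnP, assuming a calibrated camera with intrinsic matrix K, a square label of known side length, and label corners already detected in the captured image:

    import numpy as np
    import cv2

    SIDE = 0.10  # assumed label side length, meters

    # Corners of the square label in its own plane (z = 0), in a fixed order.
    OBJECT_PTS = np.array([[0, 0, 0], [SIDE, 0, 0],
                           [SIDE, SIDE, 0], [0, SIDE, 0]], dtype=np.float32)

    def camera_position(image_corners: np.ndarray, K: np.ndarray) -> np.ndarray:
        # image_corners: 4x2 pixel coordinates of the label's corners, ordered
        # to correspond with OBJECT_PTS; K: 3x3 camera intrinsic matrix.
        ok, rvec, tvec = cv2.solvePnP(OBJECT_PTS,
                                      image_corners.astype(np.float32),
                                      K, distCoeffs=None)
        if not ok:
            raise RuntimeError("pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)
        return (-R.T @ tvec).ravel()  # camera center in the label's coordinate frame

Combining this label-relative position with the label's predefined location and orientation in the facility would yield the device's absolute indoor location.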
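
Claims 8, 13, and 19 use information embedded in the barcode itself. The payload format below ("LOC:<aisle>:<x>:<y>") and its parser are purely hypothetical; the patent does not prescribe an encoding, and decoding the symbol (for example, with a barcode-reading library) is assumed to have already produced the payload string:

    from typing import NamedTuple

    class LabelLocation(NamedTuple):
        aisle: str
        x: float  # meters from the facility origin
        y: float

    def parse_location_payload(payload: str) -> LabelLocation:
        # Expects the assumed format "LOC:<aisle>:<x>:<y>".
        tag, aisle, x, y = payload.split(":")
        if tag != "LOC":
            raise ValueError("not a location payload")
        return LabelLocation(aisle, float(x), float(y))

    print(parse_location_payload("LOC:A7:12.5:4.0"))  # LabelLocation(aisle='A7', x=12.5, y=4.0)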
US Referenced Citations (457)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9443123 Hejl Sep 2016 B2
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
20050092833 Glynn May 2005 A1
20070063048 Havens et al. Mar 2007 A1
20090134221 Zhu et al. May 2009 A1
20090262974 Lithopoulos Oct 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100277504 Song Nov 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Apr 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267703 Taylor et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140347492 Fales Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chang et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150312774 Lau Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160071294 Park Mar 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160223340 Shin Aug 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20180061126 Huang Mar 2018 A1
20180157946 Landry Jun 2018 A1
Foreign Referenced Citations (5)
Number Date Country
2014006179 Jan 2014 JP
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (26)
Entry
Paul Trujillo, “Barcode Scanners: How Do They Work?”, 2014, Retrieved from the Internet: <URL: http://www.waspbarcode.com/buzz/how-barcode-scanners-work/>.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Bandringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012 (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
Related Publications (1)
Number Date Country
20180350093 A1 Dec 2018 US