System and method for display of information using a vehicle-mount computer

Information

  • Patent Grant
  • Patent Number
    9,616,749
  • Date Filed
    Friday, May 8, 2015
  • Date Issued
    Tuesday, April 11, 2017
Abstract
A system and method display information using a vehicle-mount computer. The system includes: (i) a computer touch screen for inputting and displaying information; (ii) a motion detector for detecting vehicle motion; and (iii) a vehicle-mount computer in communication with the computer touch screen and the motion detector. The vehicle-mount computer includes a central processing unit and memory. The vehicle-mount computer's central processing unit is configured to store information associated with user-selected information from the computer touch screen. Further, the vehicle-mount computer's central processing unit is configured to receive vehicle-motion information from the motion detector. Moreover, the vehicle-mount computer's central processing unit is configured to control the display of a zoomed view of the user-selected information on the computer touch screen in response to the motion detector's detection of motion.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 13/902,144 for a System and Method for Display of Information Using a Vehicle-Mount Computer filed May 24, 2013 (and published Nov. 27, 2014 as U.S. Patent Application Publication No. 2014/0350783), now U.S. Pat. No. 9,037,344. Each of the foregoing patent application, patent publication, and patent is hereby incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates generally to computer systems, and, more specifically, to systems and methods for displaying information using a vehicle-mount computer during vehicle motion.


BACKGROUND

Businesses have achieved greater productivity in recent years by deploying mobile computing devices into the field to assist workers. For example, workers use vehicle-mount computers to display important information in the field. Vehicle-mount computers are computing devices that are specially designed to mount to a vehicle and be used by the vehicle operator. For instance, several types of vehicle-mount computers are available for installation and use in commercial vehicles including forklifts, warehouse vehicles, and delivery trucks and vans.


Vehicle-mount computer systems typically utilize a screen for displaying information to a vehicle operator or other occupant. The computer system may incorporate a touch screen, or other input device, so that the user can select desired information for display. A forklift operator may, for example, view inventory information, delivery location information, and delivery navigation information directly from the cabin of the forklift using a vehicle-mount computer. The various types of user-selected information may be displayed either individually on the full area of a vehicle-mount computer screen or simultaneously on the computer screen using split-screen or otherwise partitioned views.


Although businesses have effectively employed vehicle-mount computers to increase worker productivity and improve the inbound, internal, and outbound flow of resources, challenges exist relating to the display of information on vehicle-mount computer screens when a vehicle is in motion. When a vehicle is in motion, the information displayed must be easily read by the vehicle operator at a glance and, therefore, difficulties can result when multiple types of information are displayed simultaneously. Moreover, when in motion, the operator may prefer that only certain information be displayed on the vehicle-mount computer, such as, for example, navigation information or other information associated with a delivery or pick-up. Although a vehicle operator could potentially manually configure a vehicle-mount computer to display the desired information prior to placing the vehicle in motion, this is inefficient, particularly for vehicles that are continually picking up and delivering multiple items.


Therefore, a need exists for improved systems and methods for displaying information using a vehicle-mount computer so that information of interest selected by a driver will be automatically displayed during vehicle motion. More particularly, there exists a need for a system to obtain and store user-selected information on a vehicle-mount computer screen and automatically provide an easily-readable, zoomed view of the selected information when the vehicle is in motion.


SUMMARY

Accordingly, in one aspect, the present invention embraces a system for displaying information using a vehicle-mount computer, including a computer touch screen for inputting and displaying information, a motion detector for detecting vehicle motion, and a vehicle-mount computer in communication with the computer touch screen and the motion detector, the vehicle-mount computer including a central processing unit and memory. The vehicle-mount computer's central processing unit is configured to store information associated with user-selected information from the computer touch screen, receive vehicle-motion information from the motion detector, and control the display of user-selected information on the computer touch screen in response to the motion detector's detection of motion.


In an exemplary embodiment, the user-selected information is within a user-selected area on the computer touch screen.


In another exemplary embodiment, the user-selected area is a rectangle.


In yet another exemplary embodiment, in response to the motion detector's detection of vehicle motion, the vehicle-mount computer's central processing unit controls the display such that the aspect ratio of the user-selected area on the computer touch screen corresponds to the aspect ratio of the computer touch screen.


In yet another exemplary embodiment, the user-selected area is selected by a user on the computer touch screen.


In yet another exemplary embodiment, the configuration of the user-selected area is predetermined by the vehicle-mount computer.


In yet another exemplary embodiment, the vehicle-mount computer includes a network interface.


In yet another exemplary embodiment, the motion detector includes an accelerometer, a GPS locator, a gyroscope, and/or a compass.


In yet another exemplary embodiment, the central processing unit is configured to execute application programs.


In yet another exemplary embodiment, the user-selected information displayed on the computer touch screen covers at least about 50% of the display area of the computer touch screen.


In yet another exemplary embodiment, the user-selected information includes navigation information.


In yet another exemplary embodiment, the user-selected information includes delivery information.


In yet another exemplary embodiment, the vehicle-mount computer's central processing unit is configured to continually determine whether the vehicle is in motion.


In yet another exemplary embodiment, the vehicle-mount computer's central processing unit displays the full amount of information from the computer touch screen if the motion detector indicates that the vehicle is not in motion.


In another aspect, the present invention embraces a method for displaying information using a vehicle-mount computer including selecting information on a computer touch screen of the vehicle-mount computer, monitoring vehicle motion with a motion detector, and after detecting vehicle motion, displaying a zoomed view of the user-selected information on the computer touch screen display.


In an exemplary embodiment, the user-selected information is within a user-selected area on the computer touch screen.


In another exemplary embodiment, the aspect ratio of the zoomed view of the user-selected area on the computer touch screen corresponds to the aspect ratio of the computer touch screen.


In yet another exemplary embodiment, the motion detector includes an accelerometer, a GPS locator, a gyroscope, and/or a compass.


In yet another exemplary embodiment, the configuration of the user-selected area is predetermined by the vehicle-mount computer.


In yet another aspect, the present invention embraces a method for displaying data using a vehicle-mount computer including selecting an area on a computer touch screen of the vehicle-mount computer, storing the selected area, monitoring vehicle motion with a motion detector, and after detecting vehicle motion, displaying information associated with the selected area on the computer touch screen when the vehicle is moving.


The foregoing, as well as other objectives and advantages of the invention, and the manner in which the same are accomplished, are further specified within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic block diagram illustrating components for an exemplary system for displaying information using a vehicle-mount computer according to the present invention.



FIG. 2 illustrates an exemplary user-selected area and user-selected information from a computer touch screen according to the present invention.



FIG. 3 illustrates the communication of certain elements of the exemplary embodiment for displaying information using a vehicle-mount computer according to the present invention.



FIG. 4 depicts an exemplary user-selected area according to the present invention being displayed in a zoomed view on the vehicle-mount computer during vehicle motion or potential vehicle motion.





DETAILED DESCRIPTION

The present invention embraces systems and methods for displaying information. In particular, the present invention embraces systems and methods for displaying information using a vehicle-mount computer during vehicle motion.


In an exemplary embodiment, the system for displaying information using a vehicle-mount computer according to the present invention may include a vehicle-mount computer having a central processing unit, a system bus, a main memory, a mass storage device, an operating system stored on the mass storage device and executed by the central processing unit, and a computer touch screen for receiving input from a user and displaying information. The components of the vehicle-mount computer may be connected and in communication with each other by way of the system bus. The exemplary system also includes a motion detector connected to and in communication with the vehicle-mount computer.


In another exemplary embodiment, the method for displaying information using a vehicle-mount computer according to the present invention includes providing a vehicle-mount computer having a central processing unit, a system bus, a main memory, a mass storage device, an operating system stored on the mass storage device and executed by the central processing unit, and a computer touch screen for receiving input from a user and displaying information. The exemplary method according to the present invention further includes the steps of connecting a motion detector to the vehicle-mount computer and, upon the motion detector's detection of vehicle motion, displaying, via the vehicle-mount computer's central processing unit, a zoomed view of user-selected information on the computer touch screen.
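By way of a non-limiting illustration, this exemplary method may be summarized in a short sketch. The following Python code uses hypothetical names (MotionDetector, show_zoomed_view, show_full_view, run_display_loop) that are introduced solely for illustration and are not part of the disclosure.

```python
# Minimal sketch of the exemplary method: monitor the motion detector and,
# when vehicle motion (or potential motion) is detected, display a zoomed
# view of the user-selected information. All names are illustrative.
import time

def run_display_loop(motion_detector, touch_screen, selected_area, poll_interval=0.25):
    """Continually determine whether the vehicle is in motion and switch the display."""
    while True:
        if motion_detector.vehicle_in_motion():
            # Vehicle in motion (or made ready to move): show the zoomed view
            # of the user-selected area.
            touch_screen.show_zoomed_view(selected_area)
        else:
            # Vehicle not in motion: restore the full amount of information.
            touch_screen.show_full_view()
        time.sleep(poll_interval)
```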


Non-limiting examples of typical vehicles that may employ the system and method for displaying information using a vehicle-mount computer according to the present invention include forklifts, cranes, delivery trucks, and similar industrial vehicles (e.g., vehicles used in industrial operations, factory or warehouse settings, and the like). References in the disclosure to particular types of vehicles are not intended to limit the disclosure to particular vehicles.


Referring now to the drawings, FIG. 1 is a schematic block diagram illustrating components of an exemplary system 10 for displaying information using a vehicle-mount computer. Vehicle-mount computer 20 includes a mass storage device 40 for storing an operating system 45 and various application programs 50. The mass storage device 40 may store other types of information as well.


As illustrated in FIG. 1, operating system 45 of the exemplary embodiment consists of software that controls the overall operation of the vehicle-mount computer 20, including process scheduling and management, process protection, and memory management. Examples of suitable operating systems include, but are not limited to, WINDOWS® 7 and WINDOWS® EMBEDDED COMPACT (i.e., WINDOWS® CE) from MICROSOFT® CORPORATION of Redmond, Wash., and the LINUX® open source operating system. Typically, operating system 45 is loaded by booting the vehicle-mount computer 20 and is executed directly by the central processing unit 25.


Application programs 50 (FIG. 1) include any number of executable software programs designed to assist the vehicle operator in the performance of specific tasks. Application programs 50 may load automatically upon execution of operating system 45 or in response to an input from the vehicle operator.


Main memory 30 (FIG. 1) provides for storage of instructions and information directly accessible by central processing unit 25. Main memory 30 may be configured to include random-access memory 32 (RAM) and read-only memory 34 (ROM). The ROM 34 may permanently store firmware or a basic input/output system (BIOS), which provides first instructions to vehicle-mount computer 20 when it is booted. RAM 32 may serve as temporary and immediately accessible storage for operating system 45 and application programs 50.


Mass storage device 40 (FIG. 1) may be any of the various types of computer components capable of storing large amounts of data in a persisting (i.e., non-volatile) and machine-readable manner. Typically, mass storage device 40 may be a hard disk drive. Alternatively, mass storage device 40 may be a solid state drive, optical drive, removable flash drive or any other component with similar storage capabilities.


As illustrated in FIG. 1, computer touch screen 70 may be provided for inputting and displaying information using vehicle-mount computer 20. Computer touch screen 70 is operably connected to, and in communication with, vehicle-mount computer 20. Touch screen 70 may display information to users in the form of text or graphical output generated by vehicle-mount computer 20. Persons having skill in the art will appreciate that computer touch screen 70 may incorporate any appropriate touch screen technology having the ability to sense touch (e.g., resistive, capacitive, etc.) and that is conducive to the operating environment of the vehicle. Although touch screen 70 is illustrated in FIG. 1, other input devices (e.g., keyboard or mouse) or display devices may be utilized in connection with vehicle-mount computer 20.


As depicted in FIG. 1, an exemplary embodiment of the vehicle-mount computer 20 of the system 10 for displaying information using a vehicle-mount computer may also include network interface 60. Network interface 60 is operably connected to communications network 85, enabling vehicle-mount computer 20 to communicate with communications network 85. Communications network 85 may include any collection of computers or communication devices interconnected by communication channels. The communication channels may be wired or wireless. Examples of such communication networks include, without limitation, local area networks, the Internet, and cellular networks. The connection to the communications network 85 allows vehicle-mount computer 20 to communicate with other network nodes. For example, a central dispatcher could send instructions (e.g., a delivery schedule) from a scheduling server to the vehicle operator via the communications network 85.
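As a non-limiting sketch of the dispatcher example above, a delivery schedule might be retrieved from a scheduling server over communications network 85 as follows. The server URL, JSON structure, and use of HTTP are assumptions made solely for illustration and are not part of the disclosure.

```python
# Hypothetical sketch: the vehicle-mount computer retrieves a delivery
# schedule published by a central dispatcher's scheduling server over the
# communications network. The URL and expected JSON fields are illustrative.
import json
import urllib.request

def fetch_delivery_schedule(server_url="http://dispatch.example.com/schedule"):
    """Return the current delivery schedule for this vehicle as a list of stops."""
    with urllib.request.urlopen(server_url, timeout=5) as response:
        # Example expected shape: [{"stop": 1, "address": "...", "item": "..."}, ...]
        return json.loads(response.read().decode("utf-8"))
```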


Motion detector 90 (FIG. 1) may include any number of sensors or other appropriate devices that detect vehicle movement or potential vehicle movement. Motion detector 90 is operably connected to and in communication with vehicle-mount computer 20. Those having skill in the art will appreciate that any of a number of sensors may be utilized to detect vehicle movement including, but not limited to, an accelerometer, a GPS locator, a gyroscope, a compass, or some appropriate combination of a number of sensors or devices. Sensors could also monitor potential vehicle movement, such as when the vehicle is placed into gear or otherwise made ready for movement by the user. As the term is used herein, vehicle motion specifically embraces the concept of actual motion as well as potential motion.
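A non-limiting sketch of such a motion detector, combining several sensor inputs with a potential-motion signal (e.g., the vehicle being placed into gear), is shown below. The threshold values and attribute names are assumptions introduced only for illustration.

```python
# Illustrative motion detector combining accelerometer, GPS speed, and
# transmission state. Thresholds and attribute names are assumptions.
class MotionDetector:
    def __init__(self, accel_threshold=0.5, speed_threshold=0.5):
        self.accel_threshold = accel_threshold   # m/s^2
        self.speed_threshold = speed_threshold   # m/s
        self.acceleration = 0.0                  # latest accelerometer magnitude
        self.gps_speed = 0.0                     # latest GPS-derived speed
        self.in_gear = False                     # vehicle placed into gear

    def vehicle_in_motion(self):
        """True for actual motion or potential motion (e.g., vehicle in gear)."""
        return (self.acceleration > self.accel_threshold
                or self.gps_speed > self.speed_threshold
                or self.in_gear)
```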


Central processing unit 25 is configured to store user-selected information from the computer touch screen 70. As illustrated in the exemplary embodiment (FIG. 1), the central processing unit 25 may execute application programs 50 to at least temporarily store information relating to a user-selected area, such as a rectangle, input from computer touch screen 70.


As illustrated in FIG. 2, the vehicle-mount computer's 20 central processing unit 25, along with other components of system 10 such as application programs 50, may be configured to store user-selected information 96 associated with a user-selected area 95 from computer touch screen 70. Although the user-selected area 95 as depicted in FIG. 2 consists of a rectangle, the user-selected area 95 could consist of any configuration.
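A non-limiting sketch of storing a rectangular user-selected area 95 captured from two touch points on computer touch screen 70 follows. The Rectangle type and the two-touch selection gesture are assumptions made for illustration.

```python
# Illustrative storage of a rectangular user-selected area defined by a
# touch-down point and a touch-up point. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class Rectangle:
    left: int
    top: int
    right: int
    bottom: int

def store_selected_area(touch_down, touch_up):
    """Build the user-selected rectangle from two (x, y) touch events."""
    left, right = sorted((touch_down[0], touch_up[0]))
    top, bottom = sorted((touch_down[1], touch_up[1]))
    return Rectangle(left, top, right, bottom)
```

For example, store_selected_area((40, 60), (360, 220)) yields the rectangle spanning (40, 60) to (360, 220).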



FIG. 3 illustrates how certain of the above-described elements of the exemplary embodiment for displaying information using a vehicle-mount computer may communicate in order to display user-selected information. As set forth above, the vehicle-mount computer's 20 central processing unit 25 is configured to receive vehicle-motion information from motion detector 90. The vehicle-mount computer's 20 central processing unit 25 is configured to control the storage and display of a zoomed view 100 (FIG. 4) of the user-selected information 96 on the computer touch screen 70 in response to the motion detector's 90 detection of vehicle motion.


As illustrated in FIG. 4, the user-selected information 96 may include information on the computer touch screen 70 that the vehicle operator prefers to be displayed in a zoomed view 100 on the vehicle-mount computer 20 during vehicle motion. Such information may include, but is not limited to, navigation information, pick-up information related to an item, or other information related to a scheduled delivery. As set forth in FIG. 4 and described above, the user-selected information 96 within user-selected area 95 will be automatically displayed on the computer touch screen 70 during vehicle motion. More particularly, the user-selected information 96 will be automatically displayed in a zoomed view 100 (i.e., a magnified view) on the vehicle-mount computer 20 such that the user-selected information 96 is provided in a more easily-readable format when the vehicle is in motion.
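One non-limiting way to produce the zoomed view 100 is to crop the screen contents to the user-selected area 95 and magnify the result to the full screen size. The sketch below uses the Pillow imaging library purely for illustration; its use, and the example screen size, are assumptions rather than part of the disclosure.

```python
# Illustrative rendering of the zoomed (magnified) view: crop the pixels
# inside the user-selected area and scale them up to fill the touch screen.
from PIL import Image

def render_zoomed_view(screen_image, selected_box, screen_size=(800, 480)):
    """Crop the selected area and magnify it to the full screen size.

    screen_image: PIL.Image of the current full-screen contents.
    selected_box: (left, top, right, bottom) of the user-selected area.
    """
    cropped = screen_image.crop(selected_box)
    return cropped.resize(screen_size, Image.LANCZOS)
```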


In a non-limiting embodiment, the rectangle or other outline utilized for the user-selected area 95 may be configured such that the aspect ratio of the user-selected area 95 on the computer touch screen 70 corresponds to the aspect ratio of the computer touch screen 70 in order to facilitate display of the zoomed view 100.
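A non-limiting sketch of this aspect-ratio adjustment is given below: the selection rectangle is grown about its center until its width-to-height ratio equals that of the screen, so the zoomed view fills the display without distortion. The function and variable names are assumptions made for illustration.

```python
# Illustrative aspect-ratio matching: expand the user-selected rectangle so
# its aspect ratio equals the touch screen's aspect ratio.
def match_aspect_ratio(box, screen_size):
    """Grow the selection box to the screen's aspect ratio, keeping its center."""
    left, top, right, bottom = box
    width, height = right - left, bottom - top
    screen_w, screen_h = screen_size
    target_ratio = screen_w / screen_h

    if width / height < target_ratio:
        # Selection is narrower than the screen ratio: widen it about its center.
        new_width = height * target_ratio
        cx = (left + right) / 2
        left, right = cx - new_width / 2, cx + new_width / 2
    else:
        # Selection is wider than (or equal to) the screen ratio: make it taller.
        new_height = width / target_ratio
        cy = (top + bottom) / 2
        top, bottom = cy - new_height / 2, cy + new_height / 2
    return left, top, right, bottom
```

For a 4:3 screen, for instance, match_aspect_ratio((0, 0, 100, 100), (800, 600)) widens the square selection to roughly 133 by 100.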


* * *

To supplement the present disclosure, this application incorporates entirely by reference the following patents, patent application publications, and patent applications: U.S. Pat. Nos. 6,832,725; 7,159,783; 7,413,127; 8,390,909; 8,294,969; 8,408,469; 8,408,468; 8,381,979; 8,408,464; 8,317,105; 8,366,005; 8,424,768; 8,322,622; 8,371,507; 8,376,233; 8,457,013; 8,448,863; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0193407; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2012/0318869; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0068840; U.S. Patent Application Publication No. 2013/0070322; U.S. Patent Application Publication No. 2013/0075168; U.S. Patent Application Publication No. 2013/0056285; U.S. Patent Application Publication No. 2013/0075464; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2010/0225757; U.S. patent application Ser. No. 13/347,219 for an OMNIDIRECTIONAL LASER SCANNING BAR CODE SYMBOL READER GENERATING A LASER SCANNING PATTERN WITH A HIGHLY NON-UNIFORM SCAN DENSITY WITH RESPECT TO LINE ORIENTATION, filed Jan. 10, 2012 (Good); U.S. patent application Ser. No. 13/347,193 for a HYBRID-TYPE BIOPTICAL LASER SCANNING AND DIGITAL IMAGING SYSTEM EMPLOYING DIGITAL IMAGER WITH FIELD OF VIEW OVERLAPPING FIELD OF FIELD OF LASER SCANNING SUBSYSTEM, filed Jan. 10, 2012 (Kearney et al.); U.S. patent application Ser. No. 13/367,047 for LASER SCANNING MODULES EMBODYING SILICONE SCAN ELEMENT WITH TORSIONAL HINGES, filed Feb. 6, 2012 (Feng et al.); U.S. patent application Ser. No. 13/400,748 for a LASER SCANNING BAR CODE SYMBOL READING SYSTEM HAVING INTELLIGENT SCAN SWEEP ANGLE ADJUSTMENT CAPABILITIES OVER THE WORKING RANGE OF THE SYSTEM FOR OPTIMIZED BAR CODE SYMBOL READING PERFORMANCE, filed Feb. 21, 2012 (Wilz); U.S. patent application Ser. No. 13/432,197 for a LASER SCANNING SYSTEM USING LASER BEAM SOURCES FOR PRODUCING LONG AND SHORT WAVELENGTHS IN COMBINATION WITH BEAM-WAIST EXTENDING OPTICS TO EXTEND THE DEPTH OF FIELD THEREOF WHILE RESOLVING HIGH RESOLUTION BAR CODE SYMBOLS HAVING MINIMUM CODE ELEMENT WIDTHS, filed Mar. 28, 2012 (Havens et al.); U.S. patent application Ser. No. 13/492,883 for a LASER SCANNING MODULE WITH ROTATABLY ADJUSTABLE LASER SCANNING ASSEMBLY, filed Jun. 10, 2012 (Hennick et al.); U.S. patent application Ser. No. 13/367,978 for a LASER SCANNING MODULE EMPLOYING AN ELASTOMERIC U-HINGE BASED LASER SCANNING ASSEMBLY, filed Feb. 7, 2012 (Feng et al.); U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.); U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.); U.S. patent application Ser. No. 
13/780,158 for a Distraction Avoidance System, filed Feb. 28, 2013 (Sauerwein); U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/785,177 for a Dimensioning System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/780,196 for Android Bound Service Camera Initialization, filed Feb. 28, 2013 (Todeschini et al.); U.S. patent application Ser. No. 13/792,322 for a Replaceable Connector, filed Mar. 11, 2013 (Skvoretz); U.S. patent application Ser. No. 13/780,271 for a Vehicle Computer System with Transparent Display, filed Feb. 28, 2013 (Fitch et al.); U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney); U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson); U.S. patent application Ser. No. 13/750,304 for Measuring Object Dimensions Using Mobile Computer, filed Jan. 25, 2013; U.S. patent application Ser. No. 13/471,973 for Terminals and Methods for Dimensioning Objects, filed May 15, 2012; U.S. patent application Ser. No. 13/895,846 for a Method of Programming a Symbol Reading System, filed Apr. 10, 2013 (Corcoran); U.S. patent application Ser. No. 13/867,386 for a Point of Sale (POS) Based Checkout System Supporting a Customer-Transparent Two-Factor Authentication Process During Product Checkout Operations, filed Apr. 22, 2013 (Cunningham et al.); U.S. patent application Ser. No. 13/888,884 for an Indicia Reading System Employing Digital Gain Control, filed May 7, 2013 (Xian et al.); U.S. patent application Ser. No. 13/895,616 for a Laser Scanning Code Symbol Reading System Employing Multi-Channel Scan Data Signal Processing with Synchronized Digital Gain Control (SDGC) for Full Range Scanning, filed May 16, 2013 (Xian et al.); U.S. patent application Ser. No. 13/897,512 for a Laser Scanning Code Symbol Reading System Providing Improved Control over the Length and Intensity Characteristics of a Laser Scan Line Projected Therefrom Using Laser Source Blanking Control, filed May 20, 2013 (Brady et al.); and U.S. patent application Ser. No. 13/897,634 for a Laser Scanning Code Symbol Reading System Employing Programmable Decode Time-Window Filtering, filed May 20, 2013 (Wilz, Sr. et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method comprising: selecting an area of a screen of a computer that is positioned in a vehicle; after selecting the area on the screen, storing the selected area; monitoring the vehicle's motion with a motion detector; and after detecting vehicle motion, displaying information in the selected area on the screen when the vehicle is moving.
  • 2. The method of claim 1, wherein the displayed information is navigation information.
  • 3. The method of claim 1, wherein the displayed information is delivery information.
  • 4. A system, comprising: a screen for displaying information; an input device for interacting with a computer; a motion detector for detecting vehicle motion; and a computer having a central processing unit and memory in communication with the screen, the input device, and the motion detector; wherein the central processing unit is configured to store navigation information; wherein the central processing unit is configured to receive vehicle-motion information from the motion detector; wherein the central processing unit is configured to control the display of the navigation information on the screen in response to the motion detector's detection of motion; wherein the navigation information is within a user-selected area on the screen display; and wherein the configuration of the user-selected area is selected by a user on the screen display using the input device.
  • 5. The system of claim 4, wherein the computer is a mobile computing device and the input device comprises a touch screen.
  • 6. The system of claim 5, wherein the user-selected area comprises a rectangle.
  • 7. The system of claim 5, wherein in response to the motion detector's detection of vehicle motion the central processing unit is configured to control the screen display such that the aspect ratio of the user-selected area on the computer touch screen corresponds to the aspect ratio of the touch screen.
  • 8. The system of claim 5, wherein the computer is configured for mounting in a vehicle.
  • 9. The system of claim 5, wherein the configuration of the user-selected area is predetermined by the computer.
  • 10. The system of claim 4, wherein the computer comprises a network interface.
  • 11. The system of claim 4, wherein the motion detector comprises an accelerometer, a gps locator, a gyroscope, and/or a compass.
  • 12. The system of claim 4, wherein the central processing unit is configured to execute application programs.
  • 13. The system of claim 4, wherein the user-selected information displayed on the screen covers at least about 50% of the display area of the screen.
  • 14. The system of claim 4, wherein: the central processing unit is configured to store delivery information; and the central processing unit is configured to control the display of the delivery information on the screen in response to the motion detector's detection of motion.
  • 15. The system of claim 4, wherein the computer's central processing unit is configured to continually determine whether the vehicle is in motion.
  • 16. The system of claim 4, wherein the central processing unit displays the full amount of information from the screen if the motion detector indicates that the vehicle is not in motion.
  • 17. A method, comprising: selecting information contained within a user-selected area on a computer screen that is positioned in a vehicle; monitoring vehicle motion with a motion detector; and after detecting vehicle motion, displaying a zoomed view of the selected information contained within the user-selected area on the computer screen display; wherein the configuration of the user-selected area is selected by a user.
  • 18. The method of claim 17, wherein the computer screen comprises a computer touch screen and the selected information comprises delivery information.
  • 19. The method of claim 18, wherein the aspect ratio of the zoomed view of the user-selected area on the computer touch screen corresponds to the aspect ratio of the computer touch screen.
  • 20. The method of claim 17, wherein the motion detector comprises an accelerometer, a gps locator, a gyroscope, and/or compass.
US Referenced Citations (427)
Number Name Date Kind
4821029 Logan et al. Apr 1989 A
5359515 Weller et al. Oct 1994 A
5689682 Peasley et al. Nov 1997 A
5850209 Lemke et al. Dec 1998 A
5949345 Beckert et al. Sep 1999 A
6094609 Arjomand Jul 2000 A
6226570 Hahn May 2001 B1
6574531 Tan et al. Jun 2003 B2
6690940 Brown et al. Feb 2004 B1
6832725 Gardiner et al. Dec 2004 B2
7082365 Sheha Jul 2006 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7489303 Pryor Feb 2009 B1
7567861 Inagaki Jul 2009 B2
7640101 Pair et al. Dec 2009 B2
7726575 Wang et al. Jun 2010 B2
7983840 Pair et al. Jul 2011 B2
8077143 Panabaker et al. Dec 2011 B2
8078359 Small et al. Dec 2011 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8346426 Szybalski Jan 2013 B1
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8514172 Panabaker et al. Aug 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
20020085043 Ribak Jul 2002 A1
20030125873 Yamaguchi et al. Jul 2003 A1
20040243307 Geelen Dec 2004 A1
20070063048 Havens et al. Mar 2007 A1
20080211779 Pryor Sep 2008 A1
20090085863 Panabaker et al. Apr 2009 A1
20090134221 Zhu et al. May 2009 A1
20100090816 Hirsch et al. Apr 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100191457 Harada Jul 2010 A1
20110001614 Ghneim Jan 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120062455 Panabaker et al. Mar 2012 A1
20120109516 Miyazaki et al. May 2012 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120242687 Choi Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Liu et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
Foreign Referenced Citations (15)
Number Date Country
0913802 May 1999 EP
2805845 Nov 2014 EP
2398050 Aug 2004 GB
2490059 Oct 2012 GB
2517824 Mar 2015 GB
H0950235 Feb 1997 JP
2006017478 Jan 2006 JP
9843192 Oct 1998 WO
03057522 Jul 2003 WO
2011140591 Nov 2011 WO
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014058087 Apr 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (92)
Entry
Great Britain Search and Exam Report in Application GB1408963.1; Dated Dec. 3, 2014; 8 pages.
U.S. Appl. No. 14/519,179 for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages.
U.S. Appl. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014, (Ackley et al.); 39 pages.
U.S. Appl. No. 14/453,019 for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages.
U.S. Appl. No. 14/452,697 for Interactive Indicia Reader, filed Aug. 6, 2014, (Todeschini); 32 pages.
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages.
U.S. Appl. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.); 8 pages.
U.S. Appl. No. 14/513,808 for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages.
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages.
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages.
U.S. Appl. No. 14/483,056 for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages.
U.S. Appl. No. 14/531,154 for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 14/340,627 for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Reublinger et al.); 41 pages.
U.S. Appl. No. 14/676,327 for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages.
U.S. Appl. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages.
U.S. Appl. No. 14/327,827 for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages.
U.S. Appl. No. 14/334,934 for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages.
U.S. Appl. No. 14/619,093 for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages.
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/614,706 for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages.
U.S. Appl. No. 14/628,708 for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/529,563 for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augumented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/695,364 for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 44 pages.
U.S. Appl. No. 14/664,063 for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/527,191 for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/535,764 for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages.
U.S. Appl. No. 14/687,289 for System for Communication Via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/674,329 for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/695,923 for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages.
U.S. Appl. No. 29/513,410 for Electronic Device filed Dec. 30, 2014 (Nguyen et al.); 10 pages.
U.S. Appl. No. 29/513,411 for Electronic Device filed Dec. 30, 2014 (Nguyen et al.); 9 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
Great Britain Second Exam Report in Application GB1408963.1; Dated Nov. 20, 2015; 3 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/462,801 for Mobile Computing Device With Data Cognition Software, filed on Aug. 19, 2014 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/724,134 for Electronic Device With Wireless Path Selection Capability filed May 28, 2015 (Wang et al.); 42 pages.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages.
U.S. Appl. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages.
U.S. Appl. No. 14/724,849 for Method of Programming the Default Cable Interface Software in an Indicia Reading Device filed May 29, 2015 (Barten); 29 pages.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages.
U.S. Appl. No. 14/722,608 for Interactive User Interface for Capturing a Document in an Image Signal filed May 27, 2015 (Showering et al.); 59 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 14/614,796 for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages.
U.S. Appl. No. 29/516,892 for Table Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 14/578,627 for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages.
U.S. Appl. No. 14/573,022 for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages.
U.S. Appl. No. 14/724,908 for Imaging Apparatus Having Imaging Assembly filed May 29, 2015 (Barber et al.); 39 pages.
U.S. Appl. No. 14/519,195 for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages.
U.S. Appl. No. 14/519,211 for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages.
U.S. Appl. No. 14/519,233 for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages.
U.S. Appl. No. 14/679,275 for Dimensioning System Calibration Systems and Methods filed Apr. 6, 2015 (Laffargue et al.); 47 pages.
U.S. Appl. No. 14/744,633 for Imaging Apparatus Comprising Image Sensor Array Having Shared Global Shutter Circuitry filed Jun. 19, 2015 (Wang); 65 pages.
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages.
U.S. Appl. No. 14/519,249 for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages.
U.S. Appl. No. 14/744,836 for Cloud-Based System for Reading of Decodable Indicia filed Jun. 19, 2015 (Todeschini et al.); 26 pages.
U.S. Appl. No. 14/398,542 for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Bian et al.); 22 pages.
U.S. Appl. No. 14/405,278 for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages.
U.S. Appl. No. 14/745,006 for Selective Output of Decoded Message Data filed Jun. 19, 2015 (Todeschini et al.); 36 pages.
U.S. Appl. No. 14/568,305 for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/580,262 for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages.
U.S. Appl. No. 14/590,024 for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages.
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages.
U.S. Appl. No. 14/748,446 for Cordless Indicia Reader With a Multifunction Coil for Wireless Charging and EAS Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages.
U.S. Appl. No. 14/529,857 for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages.
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages.
U.S. Appl. No. 14/662,922 for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages.
U.S. Appl. No. 14/596,757 for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages.
U.S. Appl. No. 14/533,319 for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages.
AVTech, Website at http://www.avtech.com.hk/eng/AVI321.htm; downloaded on Jun. 3, 2014; 1 page.
AVTech, “AVI311 PTZ Network Camera”, Jan. 2, 2013, 2 pages.
European Exam Report in related EP Application No. 14160386.0, Dated Jul. 29, 2016, 5 pages.
Search Report in counterpart European Application No. 14160386.0 dated Jul. 31, 2015, pp. 1-3.
Great Britain Exam Report in Application GB1408963.1; Dated Jan. 16, 2017; 4 pages. [GB 2490059 and US 2011/0001614 previously cited].
Related Publications (1)
Number Date Country
20150239348 A1 Aug 2015 US
Continuations (1)
Number Date Country
Parent 13902144 May 2013 US
Child 14707037 US