System and method for display of information using a vehicle-mount computer

Information

  • Patent Grant
  • Patent Number
    10,272,784
  • Date Filed
    Thursday, June 15, 2017
  • Date Issued
    Tuesday, April 30, 2019
Abstract
A system and method display information using a vehicle-mount computer. The system includes (i) an input device and a display device for inputting and displaying information; (ii) a motion detector for detecting vehicle motion; (iii) a proximity sensor for detecting proximity to an item; and (iv) a vehicle-mount computer in communication with the input device, the display device, the motion detector, and the proximity sensor, the vehicle-mount computer including a central processing unit and memory. The vehicle-mount computer's central processing unit is configured to store information associated with user-selected information from the input device and to display a zoomed view of the user-selected information on the display device. Further, the vehicle-mount computer's central processing unit is configured to override screen-blanking when user-selected information is displayed.
Description
FIELD OF THE INVENTION

The present invention relates generally to computer systems, and, more specifically, to systems and methods for displaying information using a vehicle-mount computer during vehicle motion.


BACKGROUND

Businesses have achieved greater productivity in recent years by deploying mobile computing devices into the field to assist workers. For example, workers use vehicle-mount computers to display important information in the field. Vehicle-mount computers are computing devices that are specially designed to mount to a vehicle and be used by the vehicle operator. For instance, several types of vehicle-mount computers are available for installation and use in commercial vehicles including forklifts, warehouse vehicles, cranes, and delivery trucks and vans.


Vehicle-mount computer systems typically utilize a screen for displaying information to a vehicle operator or other occupant. The computer system may incorporate a touch screen, or other input device, so that the user can select desired information for display. A forklift operator may, for example, view inventory information regarding shipped or inventoried products, location information regarding the next item to be loaded for shipping, and navigation information relating to the item to be loaded directly from the cabin of the forklift using a vehicle-mount computer. The various types of user-selected information may be displayed either individually on the full area of a vehicle-mount computer screen or simultaneously on the computer screen using split-screen or otherwise partitioned views.


For safety reasons, vehicle-mount computer systems may incorporate a screen blanking or lock-out feature to prevent a driver of a vehicle from viewing or otherwise interacting with the vehicle-mount computer system while the vehicle is in motion. To prevent distractions that may cause accidents, the blanking or lock-out feature may disable all aspects of the computer system, preventing all interaction by the driver during vehicle motion, or otherwise during potential vehicle motion such as when the vehicle is put into gear. Disabling the computer system or blanking the computer screen is undesirable, however, because the driver is then unable to access relevant information during vehicle motion, such as delivery information relating to an item.


Although businesses have effectively employed vehicle-mount computers to increase worker productivity and improve the inbound, internal, and outbound flow of resources, challenges exist relating to the safe and effective display of information on vehicle-mount computer screens when a vehicle, such as a forklift, is in motion or is potentially in motion. When a vehicle is in motion, the information displayed must be readable by the vehicle operator at a glance so that the screen blanking safety feature becomes unnecessary. Moreover, when the vehicle is in motion, the driver may prefer that only certain information be displayed on the vehicle-mount computer such as, for example, information relating to the next item that will be picked up.


Although a vehicle operator could potentially configure a vehicle-mount computer to display certain desired information prior to placing the vehicle in motion, the driver would then have to remember the relevant information during vehicle motion after the screen blanking feature or lock-out feature is engaged. This is particularly inefficient for industrial vehicles that are continually picking up and delivering items.


Therefore, a need exists for improved systems and methods for displaying information using a vehicle-mount computer so that the computer screen blanking feature is overridden and information of interest that has been selected by a driver will be automatically displayed during vehicle motion. More particularly, there exists a need for a system to (1) obtain and store user-selected information on a vehicle-mount computer screen; (2) bypass computer screen blanking; and (3) provide an easily-readable, centered and zoomed view of the selected information when the vehicle is in motion.


Further, there exists a need for improved systems and methods for displaying information using a vehicle-mount computer that are adaptive as to the information of interest selected by a driver. More particularly, there exists a need for systems and methods that can automatically display additional information associated with user-selected information that might be more relevant when the vehicle is in motion or otherwise at various locations in relation to the item of interest.


SUMMARY

Accordingly, in one aspect, the present invention embraces a system for displaying information using a vehicle-mount computer, including a computer touch screen for inputting and displaying information, a motion detector for detecting vehicle motion, and a vehicle-mount computer in communication with the computer touch screen and the motion detector, the vehicle-mount computer including a central processing unit and memory. The vehicle-mount computer's central processing unit is configured to store information associated with user-selected information from the computer touch screen, receive vehicle-motion information from the motion detector, and control the display of user-selected information on the computer touch screen. The vehicle-mount computer's central processing unit also includes a blanking feature that blanks the computer touch screen in response to the motion detector's detection of motion unless the computer touch screen is displaying user-selected information.
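

In other words, on each motion event the central processing unit makes one decision: blank the computer touch screen unless user-selected information has been stored, in which case blanking is overridden and the stored selection is displayed. The following Python sketch illustrates only that decision; the class and method names are illustrative assumptions and are not part of the disclosed system.

```python
class Display:
    """Stand-in for the computer touch screen; a real device driver would differ."""

    def blank(self):
        print("screen blanked")

    def show_zoomed(self, info):
        print(f"zoomed view: {info}")

    def show_normal_ui(self):
        print("full user interface shown")


class ScreenController:
    """Blank-on-motion logic with the user-selection override (illustrative names)."""

    def __init__(self, display):
        self.display = display
        self.user_selected_info = None  # stored selection, if any

    def store_selection(self, info):
        # Store information associated with the user's selection from the touch screen.
        self.user_selected_info = info

    def on_motion_event(self, vehicle_in_motion):
        if not vehicle_in_motion:
            self.display.show_normal_ui()        # vehicle stopped: normal operation
        elif self.user_selected_info is None:
            self.display.blank()                 # motion and nothing selected: blank
        else:
            self.display.show_zoomed(self.user_selected_info)  # override blanking


controller = ScreenController(Display())
controller.on_motion_event(vehicle_in_motion=True)   # -> screen blanked
controller.store_selection("Pallet 1234 - Dock 7")
controller.on_motion_event(vehicle_in_motion=True)   # -> zoomed view: Pallet 1234 - Dock 7
```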


In an exemplary embodiment, the vehicle-mount computer's central processing unit is configured to control the display of a zoomed view of the user-selected information on the computer touch screen.


In another exemplary embodiment, the zoomed view covers at least about 50% of the display area of the computer touch screen.


In yet another exemplary embodiment, the vehicle-mount computer's central processing unit is configured to control the display of a zoomed view of the user-selected information and additional information associated with the user-selected information on the computer touch screen in response to the motion detector's detection of motion.


In yet another exemplary embodiment, the user-selected information displayed includes additional information associated with the user-selected information.


In yet another exemplary embodiment, the additional information associated with the user-selected information includes delivery navigation information.


In yet another exemplary embodiment, the additional information associated with the user-selected information includes information regarding an item for pickup.


In yet another exemplary embodiment, the item for pickup is a pallet of goods.


In yet another exemplary embodiment, the system includes a proximity sensor for detecting vehicle location, and the vehicle-mount computer's central processing unit is configured to receive vehicle-location information from the proximity sensor and, in response to the vehicle-location information, to display on the computer touch screen additional information associated with the user-selected information.


In yet another exemplary embodiment, the vehicle-mount computer includes a network interface.


In yet another exemplary embodiment, the vehicle-mount computer receives the additional information associated with the user-selected information through the network interface.


In yet another exemplary embodiment, the user-selected information is within a user-selected area on the computer touch screen.


In yet another exemplary embodiment, the user-selected area includes a circle.


In yet another exemplary embodiment, the configuration of the user-selected area is predetermined by the vehicle-mount computer.


In yet another exemplary embodiment, the vehicle-mount computer includes a network interface.


In yet another exemplary embodiment, the motion detector includes an accelerometer, a GPS locator, a gyroscope, and/or a compass.


In another aspect, the invention embraces a method for displaying data using a vehicle-mount computer, including monitoring vehicle motion with a motion detector; blanking the computer touch screen in response to the detection of vehicle motion unless the computer touch screen is displaying user-selected information; selecting user-selected information on a computer touch screen of the vehicle-mount computer; and, after the step of selecting user-selected information, storing the user-selected information in the vehicle-mount computer.


In an exemplary embodiment, the method includes displaying additional information associated with the user-selected information on the computer touch screen.


In another exemplary embodiment, the vehicle-mount computer includes a network interface.


In yet another aspect, the invention embraces a method for displaying data using a vehicle-mount computer, including monitoring vehicle motion with a motion detector; blanking the computer touch screen in response to the detection of vehicle motion unless the computer touch screen is displaying user-selected information; selecting user-selected information on a computer touch screen of the vehicle-mount computer; after the step of selecting user-selected information, storing the user-selected information in the vehicle-mount computer; monitoring vehicle proximity to a location with a proximity sensor; and, after detecting vehicle proximity to a location, displaying additional information associated with the user-selected information on the computer touch screen in response to certain detected locations.


The foregoing, as well as other objectives and advantages of the invention, and the manner in which the same are accomplished, are further specified within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic block diagram illustrating components for an exemplary system for displaying information using a vehicle-mount computer according to the present invention.



FIG. 2 illustrates exemplary user-selected information within a user-selected area from a computer touch screen according to the present invention.



FIG. 3 illustrates the communication of certain elements of the exemplary embodiment for using a vehicle-mount computer according to the present invention.



FIG. 4 depicts exemplary user-selected information according to the present invention being displayed in a zoomed view on the vehicle-mount computer during vehicle motion.



FIG. 5 illustrates the communication of certain elements of the exemplary embodiment for using a vehicle-mount computer according to the present invention.



FIG. 6 depicts additional information related to exemplary user-selected information according to the present invention being displayed on the vehicle-mount computer during vehicle motion or potential vehicle motion.





DETAILED DESCRIPTION

The present invention embraces systems and methods for displaying information. In particular, the present invention embraces systems and methods for displaying information using a vehicle-mount computer during vehicle motion.


In an exemplary embodiment, the system for displaying information using a vehicle-mount computer according to the present invention may include a vehicle-mount computer having a central processing unit, a system bus, a main memory, a mass storage device, an operating system stored on the mass storage device and executed by the central processing unit, and a computer touch screen for receiving input from a user and displaying information. The components of the vehicle-mount computer may be connected and in communication with each other by way of the system bus. The exemplary system may also include a motion detector connected to and in communication with the vehicle-mount computer. Moreover, the exemplary system could include a proximity sensor connected to and in communication with the vehicle-mount computer.
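

Purely as an illustration of how these components might be composed in software, the sketch below models the vehicle-mount computer and its attached devices as plain Python objects; every name is an assumption made for illustration, and the disclosure does not prescribe any particular data structures.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TouchScreen:
    width_px: int = 1024
    height_px: int = 768


@dataclass
class MotionDetector:
    sensors: List[str] = field(default_factory=lambda: ["accelerometer", "gps"])


@dataclass
class ProximitySensor:
    sensors: List[str] = field(default_factory=lambda: ["gps"])


@dataclass
class VehicleMountComputer:
    """Illustrative composition of the components described above."""
    touch_screen: TouchScreen
    motion_detector: MotionDetector
    proximity_sensor: Optional[ProximitySensor] = None  # present in some embodiments
    stored_selection: Optional[str] = None               # user-selected information


vmc = VehicleMountComputer(TouchScreen(), MotionDetector(), ProximitySensor())
print(vmc.motion_detector.sensors)   # ['accelerometer', 'gps']
```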


In another exemplary embodiment, the method for displaying information using a vehicle-mount computer according to the present invention includes providing a vehicle-mount computer having a central processing unit, a system bus, a main memory, a mass storage device, an operating system stored on the mass storage device and executed by the central processing unit, and a computer touch screen for receiving input from a user and displaying information. The exemplary method according to the present invention may further include the step of connecting a motion detector to the vehicle-mount computer for detecting vehicle motion. The exemplary method according to the present invention may further include the step of displaying a zoomed view of user-selected information on the computer touch screen. Further, the exemplary method according to the present invention may include the step of overriding lock-out or screen blanking during vehicle motion.


In yet another exemplary embodiment, the method for displaying information using a vehicle-mount computer according to the present invention may include providing a vehicle-mount computer having a central processing unit, a system bus, a main memory, a mass storage device, an operating system stored on the mass storage device and executed by the central processing unit, and a computer touch screen for receiving input from a user and displaying information. The exemplary method according to the present invention may further include the steps of connecting to the vehicle-mount computer a motion detector for detecting vehicle motion and a proximity sensor for detecting vehicle location in relation to items of interest. The exemplary method may further include the steps of displaying a zoomed view of user-selected information on the computer touch screen and overriding screen-blanking upon the detection of vehicle motion. The exemplary method may also include the step of, upon the motion detector's detection of vehicle motion or the proximity sensor's detection of certain specified locations, displaying additional information relating to the user-selected items of interest.


Non-limiting examples of typical vehicles that may employ the system and method for displaying information using a vehicle-mount computer according to the present invention include forklifts, cranes, delivery trucks and similar industrial vehicles (e.g., vehicles used in industrial operations, factory or warehouse settings, and the like). References in the disclosure to particular types of vehicles are not intended to limit the disclosure to particular vehicles.


Referring now to the drawings, FIG. 1 is a schematic block diagram illustrating components of an exemplary system 10 for displaying information using a vehicle-mount computer. Vehicle-mount computer 20 includes a mass storage device 40 for storing an operating system 45 and various application programs 50. The mass storage device 40 may store other types of information as well.


As illustrated in FIG. 1, operating system 45 of the exemplary embodiment consists of software that controls the overall operation of the vehicle-mount computer 20, including process scheduling and management, process protection, and memory management. Examples of suitable operating systems include, but are not limited to, WINDOWS® 7 and WINDOWS® EMBEDDED COMPACT (i.e., WINDOWS® CE) from MICROSOFT® CORPORATION of Redmond, Wash., and the LINUX® open source operating system. Typically, operating system 45 is loaded by booting the vehicle-mount computer 20 and is executed directly by the central processing unit 25.


Application programs 50 (FIG. 1) include any number of executable software programs designed to assist the vehicle operator in the performance of specific tasks. Application programs 50 may load automatically upon execution of operating system 45 or in response to an input from the vehicle operator.


Main memory 30 (FIG. 1) provides for storage of instructions and information directly accessible by central processing unit 25. Main memory 30 may be configured to include random-access memory 32 (RAM) and read-only memory 34 (ROM). The ROM 34 may permanently store firmware or a basic input/output system (BIOS), which provides first instructions to vehicle-mount computer 20 when it is booted. RAM 32 may serve as temporary and immediately accessible storage for operating system 45 and application programs 50.


Mass storage device 40 (FIG. 1) may be any of the various types of computer components capable of storing large amounts of data in a persisting (i.e., non-volatile) and machine-readable manner. Typically, mass storage device 40 may be a hard disk drive. Alternatively, mass storage device 40 may be a solid state drive, optical drive, removable flash drive or any other component with similar storage capabilities.


As illustrated in FIG. 1, computer touch screen 70 may be provided for inputting and displaying information using vehicle-mount computer 20. Computer touch screen 70 is operably connected to, and in communication with, vehicle-mount computer 20. Touch screen 70 may display information to users in the form of text or graphical output generated by vehicle-mount computer 20. Persons having skill in the art will appreciate that computer touch screen 70 may incorporate any appropriate touch screen technology having the ability to sense touch (e.g., resistive, capacitive, etc.) and that is conducive to the operating environment of the vehicle. Although touch screen 70 is illustrated in FIG. 1, other input devices (e.g., keyboard or mouse) or display devices may be utilized in connection with vehicle-mount computer 20.


As depicted in FIG. 1, an exemplary embodiment of the vehicle-mount computer 20 of the system 10 for displaying information using a vehicle-mount computer may also include network interface 60. Network interface 60 is operably connected to communications network 85, enabling vehicle-mount computer 20 to communicate with communications network 85. Communications network 85 may include any collection of computers or communication devices interconnected by communication channels. The communication channels may be wired or wireless. Examples of such communication networks include, without limitation, local area networks, the Internet, and cellular networks. The connection to communications network 85 allows vehicle-mount computer 20 to communicate with other network nodes. For example, a central dispatcher could send instructions (e.g., a delivery schedule for pickup and drop off) from a scheduling server to the vehicle operator via communications network 85.
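

For example, a delivery schedule pushed by a central dispatcher could arrive as a small structured message. The sketch below parses one possible JSON encoding of such a schedule; the message format and field names are assumptions for illustration and are not specified by the disclosure.

```python
import json

# Example payload a scheduling server might push to the vehicle-mount computer over
# the communications network. The field names and structure are assumptions only.
payload = """
{
  "type": "delivery_schedule",
  "stops": [
    {"item": "Pallet 1234", "action": "pickup",  "location": "Aisle 12, Bay 3"},
    {"item": "Pallet 5678", "action": "dropoff", "location": "Dock 7"}
  ]
}
"""

schedule = json.loads(payload)
for stop in schedule["stops"]:
    print(f'{stop["action"]:>7}: {stop["item"]} at {stop["location"]}')
```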


Motion detector 90 (FIG. 1) may include any number of sensors or other appropriate devices that detect vehicle movement or potential vehicle movement. Motion detector 90 is operably connected to and in communication with vehicle-mount computer 20. Those having skill in the art will appreciate that any of a number of sensors may be utilized to detect vehicle movement including, but not limited to, an accelerometer, a GPS locator, a gyroscope, a compass, or some appropriate combination of a number of sensors or devices. Sensors could also monitor potential vehicle movement, such as when the vehicle is placed into gear or otherwise made ready for movement by the user. As the term is used herein, vehicle motion specifically embraces the concept of actual motion as well as potential motion.
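

Because vehicle motion is defined here to include potential motion, a motion detector in software terms might simply combine an accelerometer reading with the transmission state, as in the sketch below; the threshold value and names are assumptions for illustration, not a prescribed implementation.

```python
ACCEL_THRESHOLD_MPS2 = 0.5  # assumed threshold; a real system would tune this per vehicle


def vehicle_in_motion(accel_magnitude_mps2, gear_engaged):
    """Return True on actual movement or potential movement.

    Actual movement is inferred from the accelerometer magnitude; potential
    movement from the vehicle being placed into gear, per the definition above.
    """
    return accel_magnitude_mps2 > ACCEL_THRESHOLD_MPS2 or gear_engaged


print(vehicle_in_motion(0.1, gear_engaged=True))    # True: potential motion (in gear)
print(vehicle_in_motion(0.1, gear_engaged=False))   # False: stationary
print(vehicle_in_motion(1.2, gear_engaged=False))   # True: actual motion
```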


As depicted in FIG. 1, an exemplary embodiment of the vehicle-mount computer 20 of the system 10 for displaying information using a vehicle-mount computer may also include proximity sensor 80. Proximity sensor 80 may be operably connected to and in communication with vehicle-mount computer 20. Those having skill in the art will appreciate that any of a number of sensors may be utilized to detect vehicle proximity to a selected item or location including, but not limited to, a GPS locator and/or some appropriate combination of other sensors or devices. Thus, in some embodiments, the motion detector 90 could operate as a proximity sensor 80, or the two elements may share certain sensors to perform their functions.
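

Where the proximity sensor 80 is GPS-based, proximity to an item or location reduces to a distance check between two coordinate fixes. The sketch below uses the standard haversine formula for that check; the function names and the 25-metre trigger radius are assumptions for illustration only.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine formula)."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def near_item(vehicle_fix, item_fix, radius_m=25.0):
    # radius_m is an assumed trigger distance; a real system would make it configurable.
    return haversine_m(*vehicle_fix, *item_fix) <= radius_m


print(near_item((35.2271, -80.8431), (35.2272, -80.8432)))  # True (roughly 14 m apart)
```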


Central processing unit 25 is configured to store information associated with user-selected information within a user-selected area from the computer touch screen 70. As illustrated in the exemplary embodiment (FIG. 1), the central processing unit 25 may execute application programs 50 to at least temporarily store information relating to user-selected information within a user-selected area, such as a circle, input from computer touch screen 70.


As illustrated in FIG. 2, the vehicle-mount computer's 20 central processing unit 25, along with other components of system 10, such as application programs 50, may be configured to store user-selected information 96 associated with a user-selected area 95 from computer touch screen 70. Although the user-selected area 95, as depicted in FIG. 2, consists of a circle, the user-selected area 95 could have any configuration.
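

As a purely illustrative sketch of how a circular user-selected area 95 might be represented in software, the following Python snippet stores a circle by its centre and radius and collects the on-screen items that fall inside it; all names and coordinates are assumptions, not part of the disclosure.

```python
import math
from dataclasses import dataclass


@dataclass
class SelectionCircle:
    """A circular user-selected area drawn on the touch screen (illustrative)."""
    cx: float      # centre x, in screen pixels
    cy: float      # centre y, in screen pixels
    radius: float  # radius, in screen pixels

    def contains(self, x, y):
        return math.hypot(x - self.cx, y - self.cy) <= self.radius


# Items currently laid out on the touch screen: (text, x, y). Values are invented.
screen_items = [("Pallet 1234 - Dock 7", 400, 300), ("Shift summary", 900, 650)]

circle = SelectionCircle(cx=420, cy=310, radius=120)
selected = [text for text, x, y in screen_items if circle.contains(x, y)]
print(selected)  # ['Pallet 1234 - Dock 7'] -- this is what the computer would store
```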



FIG. 3 illustrates how certain of the above described elements of the exemplary system 10 for displaying information using a vehicle-mount computer may communicate in order to display user-selected information. As set forth above, the vehicle-mount computer's 20 central processing unit 25 is configured to receive vehicle-motion information from motion detector 90. The vehicle-mount computer's 20 central processing unit 25 is configured to disable or override computer lock-out or screen-blanking when user-selected information 96 has been selected and stored. Computer lock-out, or screen blanking, is a security feature that may otherwise be triggered by vehicle motion or potential vehicle motion. The vehicle-mount computer's 20 central processing unit 25 is further configured to control the storage and display of a zoomed view 100 (FIG. 4) of the user-selected information 96 on the computer touch screen 70 in response to the motion detector's 90 detection of vehicle motion.


As illustrated in FIG. 4, the user-selected information 96 within user-selected area 95 may include information on the computer touch screen 70 that the vehicle operator prefers be displayed in a zoomed view 100 on the vehicle-mount computer 20 during vehicle motion. Such information may include, but is not limited to, a particular pallet or other item that is scheduled for pickup or delivery. As set forth in FIG. 4 and described above, computer screen blanking will be bypassed and the user-selected information 96 within user-selected area 95 will be displayed on the computer touch screen 70 during vehicle motion or potential vehicle motion. More particularly, the user-selected information 96 will be displayed in a zoomed view 100 (i.e., a magnified view) on the vehicle-mount computer 20 such that the user-selected area 95 is provided in a more easily-readable format when the vehicle is in motion.
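

One exemplary embodiment above calls for the zoomed view 100 to cover at least about 50% of the display area. Purely as an illustration (not prescribed by the disclosure), the magnification needed to reach that coverage can be computed from the selection and screen dimensions; the function name and the sample dimensions below are assumptions.

```python
def zoom_factor(sel_w, sel_h, screen_w, screen_h, min_coverage=0.5):
    """Smallest uniform scale at which the magnified selection covers at least
    `min_coverage` of the screen area, without overflowing either dimension."""
    # Scale needed for the magnified selection to reach the target screen area.
    area_scale = (min_coverage * screen_w * screen_h / (sel_w * sel_h)) ** 0.5
    # Never scale past the point where the selection no longer fits on screen.
    fit_scale = min(screen_w / sel_w, screen_h / sel_h)
    return min(area_scale, fit_scale)


# A 240 x 180 px selection on a 1024 x 768 px display:
print(round(zoom_factor(240, 180, 1024, 768), 2))  # 3.02 -> about 50% of the screen
```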



FIG. 5 illustrates how certain of the above described elements of another exemplary system 10 for displaying information using a vehicle-mount computer may communicate in order to display user-selected information. The vehicle-mount computer's 20 central processing unit 25 is configured to control the storage and display of a zoomed view 100 of user-selected information 96 within user-selected area 95 on the computer touch screen 70. The vehicle-mount computer's 20 central processing unit 25 is configured to receive vehicle-motion information from motion detector 90. The vehicle-mount computer's 20 central processing unit 25 is also configured to disable or override computer lock-out or screen-blanking when user-selected information 96 has been selected, stored, and displayed. Additionally, as depicted in FIG. 6, the vehicle-mount computer's 20 central processing unit 25 may be configured to display, when the vehicle is in motion, additional associated information 105 that is relevant to, or more relevant than, the user-selected information 96.


As a non-limiting example, and as illustrated in FIG. 6, the user-selected information 96 that is displayed in a zoomed view 100 may be information relating to a specific pallet or item that is to be picked up by a vehicle driver. When the vehicle moves, the information on the computer screen 70 could be changed or supplemented to display more relevant additional information 105, including but not limited to information such as the location of the pallet to be picked up. Through the sensors in communication with the vehicle-mount computer 20 (i.e., proximity sensor 80 and/or motion detector 90), when the vehicle arrives within a certain distance of the item, the computer screen 70 could display other relevant information, such as a waypoint along the travel path. The computer screen 70 could also, for example, switch back to the originally zoomed view 100 when the vehicle operator arrives within a certain distance of the item.
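

The behaviour just described amounts to choosing a view from the current motion state and the distance to the item of interest: the full interface while stationary, additional information 105 (such as the pallet location or a waypoint) while en route, and the zoomed view 100 again once the vehicle is close to the item. The sketch below is one possible rendering of that choice; the view names, function name, and the 25-metre arrival radius are assumptions, not part of the disclosure.

```python
def choose_view(in_motion, distance_to_item_m, arrival_radius_m=25.0):
    """Pick which view to show; distances and view names are illustrative only."""
    if not in_motion:
        return "normal_ui"             # stationary: full interface available
    if distance_to_item_m <= arrival_radius_m:
        return "zoomed_selection"      # near the item: switch back to the zoomed view
    return "additional_info"           # en route: pallet location / next waypoint


for distance in (500.0, 120.0, 10.0):
    print(distance, "->", choose_view(True, distance))
# 500.0 -> additional_info
# 120.0 -> additional_info
# 10.0  -> zoomed_selection
```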


In a non-limiting embodiment, the communications network 85, in communication with the network interface 60, central processing unit 25, and/or other appropriate elements of the system 10, may facilitate the transmission of instructions, such as a pickup or delivery schedule, as well as relevant additional information 105 relating to the item listed on the schedule, which may be displayed on the computer touch screen 70.


To supplement the present disclosure, this application incorporates entirely by reference the following patents, patent application publications, and patent applications: U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127; U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,294,969; U.S. Pat. No. 8,408,469; U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,381,979; U.S. Pat. No. 8,408,464; U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,322,622; U.S. Pat. No. 8,371,507; U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,448,863; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0193407; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2012/0318869; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0068840; U.S. Patent Application Publication No. 2013/0070322; U.S. Patent Application Publication No. 2013/0075168; U.S. Patent Application Publication No. 2013/0056285; U.S. Patent Application Publication No. 2013/0075464; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2010/0225757; U.S. patent application Ser. No. 13/347,219 for an OMNIDIRECTIONAL LASER SCANNING BAR CODE SYMBOL READER GENERATING A LASER SCANNING PATTERN WITH A HIGHLY NON-UNIFORM SCAN DENSITY WITH RESPECT TO LINE ORIENTATION, filed Jan. 10, 2012 (Good); U.S. patent application Ser. No. 13/347,193 for a HYBRID-TYPE BIOPTICAL LASER SCANNING AND DIGITAL IMAGING SYSTEM EMPLOYING DIGITAL IMAGER WITH FIELD OF VIEW OVERLAPPING FIELD OF FIELD OF LASER SCANNING SUBSYSTEM, filed Jan. 10, 2012 (Kearney et al.); U.S. patent application Ser. No. 13/367,047 for LASER SCANNING MODULES EMBODYING SILICONE SCAN ELEMENT WITH TORSIONAL HINGES, filed Feb. 6, 2012 (Feng et al.); U.S. patent application Ser. No. 13/400,748 for a LASER SCANNING BAR CODE SYMBOL READING SYSTEM HAVING INTELLIGENT SCAN SWEEP ANGLE ADJUSTMENT CAPABILITIES OVER THE WORKING RANGE OF THE SYSTEM FOR OPTIMIZED BAR CODE SYMBOL READING PERFORMANCE, filed Feb. 21, 2012 (Wilz); U.S. patent application Ser. No. 13/432,197 for a LASER SCANNING SYSTEM USING LASER BEAM SOURCES FOR PRODUCING LONG AND SHORT WAVELENGTHS IN COMBINATION WITH BEAM-WAIST EXTENDING OPTICS TO EXTEND THE DEPTH OF FIELD THEREOF WHILE RESOLVING HIGH RESOLUTION BAR CODE SYMBOLS HAVING MINIMUM CODE ELEMENT WIDTHS, filed Mar. 28, 2012 (Havens et al.); U.S. patent application Ser. No. 13/492,883 for a LASER SCANNING MODULE WITH ROTATABLY ADJUSTABLE LASER SCANNING ASSEMBLY, filed Jun. 10, 2012 (Hennick et al.); U.S. patent application Ser. No. 13/367,978 for a LASER SCANNING MODULE EMPLOYING AN ELASTOMERIC U-HINGE BASED LASER SCANNING ASSEMBLY, filed Feb. 7, 2012 (Feng et al.); U.S. patent application Ser. No. 
13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.); U.S. patent application Ser. No. 13/780,356 for a Mobile Device Having Object-Identification Interface, filed Feb. 28, 2013 (Samek et al.); U.S. patent application Ser. No. 13/780,158 for a Distraction Avoidance System, filed Feb. 28, 2013 (Sauerwein); U.S. patent application Ser. No. 13/784,933 for an Integrated Dimensioning and Weighing System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/785,177 for a Dimensioning System, filed Mar. 5, 2013 (McCloskey et al.); U.S. patent application Ser. No. 13/780,196 for Android Bound Service Camera Initialization, filed Feb. 28, 2013 (Todeschini et al.); U.S. patent application Ser. No. 13/792,322 for a Replaceable Connector, filed Mar. 11, 2013 (Skvoretz); U.S. patent application Ser. No. 13/780,271 for a Vehicle Computer System with Transparent Display, filed Feb. 28, 2013 (Fitch et al.); U.S. patent application Ser. No. 13/736,139 for an Electronic Device Enclosure, filed Jan. 8, 2013 (Chaney); U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson); U.S. patent application Ser. No. 13/750,304 for Measuring Object Dimensions Using Mobile Computer, filed Jan. 25, 2013; U.S. patent application Ser. No. 13/471,973 for Terminals and Methods for Dimensioning Objects, filed May 15, 2012; U.S. patent application Ser. No. 13/895,846 for a Method of Programming a Symbol Reading System, filed Apr. 10, 2013 (Corcoran); U.S. patent application Ser. No. 13/867,386 for a Point of Sale (POS) Based Checkout System Supporting a Customer-Transparent Two-Factor Authentication Process During Product Checkout Operations, filed Apr. 22, 2013 (Cunningham et al.); U.S. patent application Ser. No. 13/888,884 for an Indicia Reading System Employing Digital Gain Control, filed May 7, 2013 (Xian et al.); U.S. patent application Ser. No. 13/895,616 for a Laser Scanning Code Symbol Reading System Employing Multi-Channel Scan Data Signal Processing with Synchronized Digital Gain Control (SDGC) for Full Range Scanning, filed May 16, 2013 (Xian et al.); U.S. patent application Ser. No. 13/897,512 for a Laser Scanning Code Symbol Reading System Providing Improved Control over the Length and Intensity Characteristics of a Laser Scan Line Projected Therefrom Using Laser Source Blanking Control, filed May 20, 2013 (Brady et al.); and U.S. patent application Ser. No. 13/897,634 for a Laser Scanning Code Symbol Reading System Employing Programmable Decode Time-Window Filtering, filed May 20, 2013 (Wilz, Sr. et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A system comprising: a memory; an input unit; a display unit; a motion detector; a processing unit coupled to the memory, the input unit, the display unit, and the motion detector, wherein the processing unit is configured to: store user-selected information corresponding to information selected by a user via the input unit from amongst items displayed on the display unit and wherein the user-selected information is indicative of the information to be displayed on the display unit when the motion detector detects a motion; access motion information from the motion detector; identify, upon accessing motion information, no user-selected information being stored in the memory and based on the identification control the display unit to blank all of the items to be displayed on the display unit; and determine, upon accessing motion information, the user-selected information being stored in the memory and based on the determination control the display unit to override blanking of all the items on the display unit and display user-selected information on the display unit.
  • 2. The system of claim 1, wherein the processing unit is configured to control the display unit for displaying a zoomed view of the user-selected information.
  • 3. The system of claim 2, wherein the zoomed view covers at least about 50% of display area of the display unit.
  • 4. The system of claim 2, wherein the processing unit is configured to control the display unit for displaying a zoomed view of the user-selected information and additional information associated with the user-selected information in response to detection of motion by the motion detector.
  • 5. The system of claim 1, wherein the user-selected information is one of: a navigational aid, an assistive information corresponding to picking of an item, and a zoomed view of textual information to be displayed on the display unit when the motion detector detects the motion.
  • 6. The system of claim 1, comprising a proximity sensor; wherein the processing unit is configured to receive location information from the proximity sensor; and wherein, in response to location information, the processing unit is configured to display on the display device additional information associated with the user-selected information.
  • 7. The system of claim 1, wherein the user-selected information is within a user-selected area on the display unit and wherein the user-selected area comprises any one of a circle and a rectangle encompassing information to be displayed on the display unit.
  • 8. The system of claim 1, wherein configuration of the display unit for the user-selected area is predetermined by the processing unit.
  • 9. The system of claim 1, wherein the motion detector comprises an accelerometer, a gps locator, a gyroscope, and/or a compass.
  • 10. A method comprising: monitoring motion associated with one of a computing system and a vehicle with a motion detector; determining an availability of user-selected information corresponding to information selected by a user, the information from amongst items to be displayed on a display unit and wherein the user-selected information is indicative of the information to be displayed on the display unit when the motion detector detects a motion; blanking all the items to be displayed on the display unit in response to the detection of motion if the user-selected information is determined to be not available; overriding blanking of all of the items to be displayed on the display unit and displaying the user-selected information in response to the detection of motion if the user-selected information is determined to be available.
  • 11. The method of claim 10, comprising displaying one of: a navigational aid, an assistive information corresponding to picking of an item, and a zoomed view of textual information to be displayed on the display unit when the motion detector detects the motion.
  • 12. The method of claim 10, comprising supplementing information displayed on the display unit with additional information, based on the detection of the motion by the motion detector.
  • 13. The method of claim 10, comprising selecting an area on the display unit for providing the user-selected information and wherein the provided user-selected information is to be displayed in a zoomed view on the display unit upon detection of the motion by the motion detector.
  • 14. The method of claim 10, comprising: receiving, from a proximity sensor, location information corresponding to an object; and displaying, in response to location information, on the display device additional information associated with the user-selected information.
  • 15. A non-transient computer readable medium comprising instructions executable by a processing unit to perform the steps of: monitoring motion associated with a computing system by a motion detector; determining an availability of user-selected information corresponding to information selected by a user, the information from amongst items to be displayed on a display unit and wherein the user-selected information is indicative of the information to be displayed on the display unit when the motion detector detects a motion; blanking all the items to be displayed on the display unit in response to the detection of motion if the user-selected information is determined to be not available; overriding blanking of all the items to be displayed on the display unit and displaying the user-selected information in response to the detection of motion if the user-selected information is determined to be available.
  • 16. The non-transient computer readable medium of claim 15, comprising instructions executable by the processing unit for displaying one of: a navigational aid, an assistive information corresponding to picking of an item, and a zoomed view of information to be displayed on the display unit when the motion detector detects the motion.
  • 17. The non-transient computer readable medium of claim 15, comprising instructions executable by the processing unit for supplementing information displayed on the display unit with additional information based on the detection of the motion by the motion detector.
  • 18. The non-transient computer readable medium of claim 15, comprising instructions executable by the processing unit for selecting an area on the display unit for providing the user-selected information and wherein the provided user-selected information is to be displayed in a zoomed view on the display unit upon detection of the motion by the motion detector.
  • 19. The non-transient computer readable medium of claim 15, comprising instructions for: receiving, from a proximity sensor, location information corresponding to an object; and displaying, in response to location information, on the display device additional information associated with the user-selected information.
  • 20. The non-transient computer readable medium of claim 15, comprising instructions for displaying a zoomed view of the user-selected information on the display unit in response to the detection of the motion if the user-selected information is determined to be available.
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 14/571,358 for a System and Method for Display of Information Using a Vehicle-Mount Computer filed Dec. 16, 2014 (and published Apr. 9, 2015 as U.S. Patent Publication No. 2015/0100196), now U.S. Pat. No. 9,682,625, which claims the benefit of U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer filed May 24, 2013 (and published Nov. 27, 2014 as U.S. Patent Application Publication No. 2014/0350782), now U.S. Pat. No. 8,918,250. Each of the foregoing patent applications, patent publications, and patents is hereby incorporated by reference in its entirety.

US Referenced Citations (642)
Number Name Date Kind
4821029 Logan et al. Apr 1989 A
5359515 Weller et al. Oct 1994 A
5689682 Peasley et al. Nov 1997 A
5850209 Lemke et al. Dec 1998 A
5949345 Beckert et al. Sep 1999 A
6094609 Arjomand Jul 2000 A
6226570 Hahn May 2001 B1
6574531 Tan et al. Jun 2003 B2
6690940 Brown et al. Feb 2004 B1
6832725 Gardiner et al. Dec 2004 B2
7050907 Janky et al. May 2006 B1
7082365 Sheha et al. Jul 2006 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7489303 Pryor Feb 2009 B1
7492255 Morris Feb 2009 B1
7567861 Inagaki Jul 2009 B2
7640101 Pair et al. Dec 2009 B2
7726575 Wang et al. Jun 2010 B2
7983840 Pair et al. Jul 2011 B2
8077143 Panabaker et al. Dec 2011 B2
8078359 Small et al. Dec 2011 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8346426 Szybalski Jan 2013 B1
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8514172 Panabaker et al. Aug 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Wang Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9443123 Hejl Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey May 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9360304 Chang et al. Jul 2016 B2
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van Volkinburg et al. Aug 2016 B2
9423318 Lui et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9682625 Hollifield Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
20020085043 Ribak Jul 2002 A1
20030125873 Yamaguchi et al. Jul 2003 A1
20070063048 Havens et al. Mar 2007 A1
20080211779 Pryor Sep 2008 A1
20090085863 Panabaker et al. Apr 2009 A1
20090134221 Zhu et al. May 2009 A1
20100090816 Hirsch et al. Apr 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100191457 Harada Jul 2010 A1
20110001614 Ghneim Jan 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120062455 Panabaker et al. Mar 2012 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120242687 Choi Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150239348 Chamberlin Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150310243 Ackley Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr et al. Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Linwood Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 DiPiazza et al. Jun 2016 A1
20160192051 DiPiazza et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggert et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160316190 McCloskey et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Germaine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van Horn et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 d'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Jonas et al. Jul 2017 A1
20170193727 Van Horn et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
Foreign Referenced Citations (11)
Number Date Country
0913802 May 1999 EP
2805845 Nov 2014 EP
2398050 Aug 2004 GB
2490059 Oct 2012 GB
2517824 Mar 2015 GB
H0950235 Feb 1997 JP
9843192 Oct 1998 WO
03057522 Jul 2003 WO
2011140591 Nov 2011 WO
2013163789 Nov 2013 WO
2014058087 Apr 2014 WO
Non-Patent Literature Citations (6)
Entry
AVTech, Website at http://www.avtech.com.hk/eng/AVI321.htm; downloaded on Jun. 3, 2014; 1 page. Previously provided in parent application.
AVTech, “AVI311 PTZ Network Camera”, Jan. 2, 2013, 2 pages. Previously provided in parent application.
Search Report in counterpart European Application No. 14160386.0 dated Jul. 31, 2015, pp. 1-3. Previously provided in parent application.
European Exam Report in related EP Application No. 14160386.0, dated Jul. 29, 2016, 5 pages. Previously provided in parent application.
Great Britain Search and Exam Report in Application GB1408963.1; dated Dec. 3, 2014; 8 pages. Previously provided in parent application.
Great Britain Second Exam Report in Application GB1408963.1; dated Nov. 20, 2015; 3 pages. Previously provided in parent application.
Related Publications (1)
Number Date Country
20180001769 A1 Jan 2018 US
Continuations (2)
Number Date Country
Parent 14571358 Dec 2014 US
Child 15624084 US
Parent 13902110 May 2013 US
Child 14571358 US