Indicia-reading systems having an interface with a user's nervous system

Information

  • Patent Grant
  • Patent Number
    10,303,258
  • Date Filed
    Monday, November 21, 2016
  • Date Issued
    Tuesday, May 28, 2019
Abstract
Indicia-reading systems that interface with a user's nervous system include a device with electrodes capable of detecting electromagnetic signals produced in the brain or skeletal muscles of a user. The systems also include a computer with a processor and memory. The computer is configured to monitor the electromagnetic signals that are detected by the electrodes. The computer is also configured to perform operations in response to certain monitored electromagnetic signals. The computer may be an indicia-reading device configured to acquire indicia information in response to certain detected electromagnetic signals. The computer may also be a vehicle-mounted computer configured to sound an alarm in response to certain detected electromagnetic signals.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. patent application Ser. No. 14/735,717 for Indicia-Reading Systems Having an Interface with a User's Nervous System filed Jun. 10, 2015, now U.S. Pat. No. 9,507,974. Each of the foregoing patent application and patent is hereby incorporated by reference in its entirety.


FIELD OF THE INVENTION

The present invention relates to the field of indicia-reading systems and, more specifically, to indicia-reading systems that interface with a user's nervous system.


BACKGROUND

Indicia readers, such as barcode scanners, are typically configured to acquire information from indicia and then decode that information for use in data systems. Traditional indicia-reading systems embrace various kinds of devices used to read indicia, including handheld barcode scanners.


Handheld indicia-reading devices, such as handheld barcode scanners and mobile computers, are currently used in numerous environments for various applications (e.g., warehouses, delivery vehicles, hospitals, etc.). In this regard, a large percentage of retailers, notably grocery stores and general consumer merchandisers, currently rely on barcode technology to improve the efficiency and reliability of the checkout process. Traditionally, a user interacts with a handheld indicia-reading device via a trigger or a touchscreen display.


More recently, wearable computing devices (e.g., GOOGLE GLASS™ from Google, Inc.) have been developed. Wearable computing devices may be used in indicia-reading systems. As these types of devices become more common, the options through which users can interface with these devices and systems will change and expand as the demand for hands-free interfaces grows stronger.


Current hands-free interface options for computing systems include optical gesture recognition (i.e., mathematical interpretation of human motion by a computing device). Gesture recognition can originate from any bodily motion or state, but commonly originates from the hands. A gesture interface provides a useful building block for a hands-free interface, but it does not offer a completely hands-free experience: it is more accurately a touch-free interface that still requires free hands for gesturing.


A technology that does offer the possibility of a completely hands-free and touch-free interface is the brain-computer interface. For example, electroencephalography (EEG) can be used to detect electrical activity in the brain. Traditional EEG testing in a medical or laboratory environment involves flat metal discs (electrodes) attached directly to the scalp to measure the electrical activity of the brain (i.e., to measure brain waves). Traditional EEG testing equipment is inadequate for more mainstream applications, however, because it requires shaving the head, affixing gelled electrodes to the scalp, and the like.


Recent advances in EEG, however, have opened the reading of electrical signals produced by the brain to more mainstream applications. For instance, companies such as Emotiv Systems, an Australian electronics company, have brought EEG devices to market that do not require shaving a user's head or gels of any kind to measure the electrical activity of the brain. One such device is the EMOTIV INSIGHT™ from Emotiv Systems.


Another technology that can facilitate a hands-free or touch-free interface without requiring optical recognition of gestures is electromyography (EMG). EMG is a technique for evaluating and recording the electrical activity produced by skeletal muscles. EMG is performed using an instrument called an electromyograph to produce a record of activity called an electromyogram.


Recent advances in EMG have likewise opened the reading of electrical signals produced by skeletal muscles to more mainstream applications. Companies such as Thalmic Labs, Inc. of Ontario, Canada have brought commercial EMG devices to market that are unobtrusive for a user to wear. These devices can connect wirelessly (via, for example, BLUETOOTH® protocols) to most modern-day devices.


While traditional methods of user interaction with indicia-reading devices (such as via a trigger or touchscreen interface) are generally effective, such traditional methods are neither completely hands-free nor touch-free.


Therefore, a need exists for more efficient and effective user interfaces for indicia-reading systems, including but not limited to indicia-reading systems that interface with a user's nervous system.


SUMMARY

Accordingly, in one aspect, the present invention embraces an indicia-reading system having an interface with a user's nervous system. The system may include a headset with electrodes capable of detecting electromagnetic signals produced in the brain of a user. The system may also include an indicia reader in communication with the headset, including a central processing unit and memory, an indicia-capturing subsystem for acquiring information about indicia within the indicia-capturing subsystem's field of view, and an indicia-decoding subsystem configured for decoding indicia information acquired by the indicia-capturing subsystem. The indicia reader may be configured to monitor the electromagnetic signals detected by the headset.


In an exemplary embodiment, the indicia-reading system may include an indicia reader configured to perform an operation in response to electromagnetic signals detected by the headset.


In another exemplary embodiment, the indicia reader operation that may be performed in response to electromagnetic signals detected by the headset is acquiring information about indicia within the indicia-capturing subsystem's field of view.


In yet another exemplary embodiment, the indicia reader operation that may be performed in response to electromagnetic signals detected by the headset is placing the indicia reader into a different mode.


In yet another exemplary embodiment, the detected signals produced in the brain of the user may correspond to a facial expression.


In yet another exemplary embodiment, the detected signals produced in the brain of the user may correspond to a wink.


In yet another exemplary embodiment, the detected signals produced in the brain of the user may correspond to mental commands.


In yet another exemplary embodiment, the communication between the headset and the indicia reader may be wireless communication.


In yet another exemplary embodiment, the indicia reader may be a wearable computer.


In another aspect, the present invention may include an indicia-reading system having an interface with a user's nervous system including a band comprising electrodes capable of detecting electromagnetic signals produced in the skeletal muscles of a user. The system may also include an indicia reader in communication with the band, comprising a central processing unit and memory, an indicia-capturing subsystem for acquiring information about indicia within the indicia-capturing subsystem's field of view, and an indicia-decoding subsystem configured for decoding indicia information acquired by the indicia-capturing subsystem. The indicia reader may be configured to monitor the electromagnetic signals detected by the band.


In an exemplary embodiment, the indicia reader may be configured to perform an operation in response to electromagnetic signals detected by the band.


In another exemplary embodiment, the indicia reader operation in response to electromagnetic signals detected by the band may be acquiring information about indicia within the indicia-capturing subsystem's field of view.


In yet another exemplary embodiment, the indicia reader operation in response to electromagnetic signals detected by the band may be placing the indicia reader into a different scanning mode.


In yet another exemplary embodiment, the detected signals produced in the skeletal muscles of a user may correspond to arm or hand gestures.


In yet another exemplary embodiment, the detected gesture may be a snap of two fingers.


In yet another exemplary embodiment, the detected gesture may be a clenched fist.


In yet another exemplary embodiment, the detected gesture may be a combination of hand, finger, or arm movements.


In yet another exemplary embodiment, the band may be an arm band configured to be worn on the user's forearm.


In another aspect, the present invention may include a vehicle safety system having an interface with a user's nervous system including a headset with electrodes capable of detecting electromagnetic signals produced in the brain of a user. The system may also include a vehicle computer including a central processing unit and memory in communication with the headset. The vehicle computer may be configured to monitor the electromagnetic signals detected by the headset.


In an exemplary embodiment, the vehicle computer may be configured to perform an operation in response to electromagnetic signals detected by the headset.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 graphically depicts a user wearing certain components of an exemplary indicia-reading system according to the present invention.



FIG. 2 graphically depicts a user wearing certain components of another exemplary indicia-reading system according to the present invention.



FIG. 3 is a block diagram illustrating certain components of an exemplary indicia-reading system according to the present invention.



FIG. 4 is a block diagram illustrating certain components of an exemplary system that interfaces with a user's nervous system according to the present invention.





DETAILED DESCRIPTION

The present invention embraces systems that interface with a user's nervous system. In particular, the present invention embraces hands-free indicia-reading systems that interface with a user's nervous system. Although indicia-reading systems are typically referred to herein, a person having skill in the art will recognize that the systems that interact with a user's nervous system may be utilized in other environments as set forth herein (e.g., for use with vehicle safety systems).


The term indicia as used herein is intended to refer broadly to various types of machine-readable indicia, including barcodes, QR codes, matrix codes, 1D codes, 2D codes, RFID tags, characters, etc. The indicia are typically graphical representations of information (e.g., data) such as product numbers, package tracking numbers, or personnel identification numbers. The use of indicia readers to input data into a system, rather than manual data entry, results in generally faster and more reliable data entry.


An exemplary indicia-reading system according to the present invention may include an electroencephalograph in the form of a headset that a user will wear and an indicia-reading device in electronic communication with the headset. When certain brainwave activity is detected by the headset, the system is configured to trigger operations of the indicia reader.


In another exemplary embodiment, an indicia-reading system according to the present invention may include an electromyograph band that a user may wear on their arm and an indicia-reading device in electronic communication with the band. When certain skeletal muscle activity is detected by the band, the system is configured to trigger operations of the indicia reader.


Non-limiting examples of typical indicia-reading devices may include handheld computers, handheld scanners, wearable computers, and similar products. Preferably, a wearable computer may be used in the exemplary embodiments disclosed herein for ease of user interface. References in the disclosure to particular types of devices are not intended to limit the disclosure to particular devices.


Referring now to the drawings, FIG. 1 depicts a user 101 wearing an exemplary indicia-reading system 100 having an interface with a user's nervous system, specifically an indicia-reading system having an interface with a user's brain.


The exemplary indicia-reading system 100 includes an indicia-reading wearable computer 102 (e.g., GOOGLE GLASS™ from Google, Inc.). Although a certain type of wearable computer 102 is depicted, various types of wearables or other kinds of devices that read indicia may alternatively be used (e.g., hand-held indicia readers such as trigger-type readers and mobile computing devices like smartphones).


The wearable computer 102 of the exemplary indicia-reading system 100 may include an indicia-capturing subsystem 103 (FIG. 1 and FIG. 3). In some instances, indicia-capturing subsystem 103 may include laser scanning subsystems that sweep light beams (e.g., a laser beam) across a scan path (i.e., a field of view), and then receive the optical signals that reflect or scatter off the indicium. Typically, in this type of embodiment the optical signal is received using a photoreceptor (e.g., photodiode) and is converted into an electrical signal. The electrical signal is an electronic representation of the indicia information (e.g., the data represented by the indicia). When in the form of an electrical signal, this information can be processed (e.g., decoded) by an indicia-decoding subsystem 104.


In other instances, the indicia-capturing subsystem 103 (FIG. 1 and FIG. 3) may include an imaging subsystem (e.g., the built-in camera of a smartphone, tablet, or wearable computer such as GOOGLE GLASS™) or some combination of an imaging subsystem and a laser scanning subsystem. The imaging subsystem captures digital images of objects within the subsystem's field of view 124 (FIG. 3) (e.g., 1D, 2D, and Postal barcodes).


When the indicia information takes the form of a digital image, the indicia information is typically processed by an indicia-decoding subsystem 104 (FIG. 3) through the use of image-processing software (e.g., optical character recognition (OCR) technology), which can both identify the presence of indicia in the digital image and decode the indicia. The components of indicia-decoding subsystem 104 are known in the art and may include a storage memory 118 for conveying the signal to a central processing unit (CPU) 117 for digital signal processing. The exemplary wearable computer 102 may also include random access memory (RAM) 119, a read only memory (ROM) 120, and a mass storage device 121 (e.g., flash memory, a hard drive, etc.) with an operating system 122 and applications programs 123 (FIG. 3).
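

By way of illustration only, the following minimal sketch outlines the capture-then-decode flow described above. The camera and decoder objects, their method names, and the DecodedIndicia record are hypothetical stand-ins for the indicia-capturing subsystem 103 and indicia-decoding subsystem 104, not the API of any particular vendor.

from dataclasses import dataclass

@dataclass
class DecodedIndicia:
    symbology: str   # e.g., "CODE128" or "QR" (illustrative labels)
    data: str        # decoded payload
    center: tuple    # (x, y) pixel location within the field of view

def read_indicia(camera, decoder):
    """Capture one frame and return any indicia decoded from it."""
    frame = camera.capture_frame()           # digital image of the field of view 124
    return decoder.locate_and_decode(frame)  # list of DecodedIndicia, possibly empty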


The exemplary indicia-reading system 100 also includes an EEG headset 105 (e.g., the EMOTIV INSIGHT™ from Emotiv Systems, the NEUROSKY® EEG biosensor from NeuroSky of San Jose, California, or similar devices) with a number of electrodes 106 capable of detecting signals produced in the brain of a user 101. For instance, EEG headset 105 is capable of producing a graph measurement of a user's 101 brain waves. The electrodes 106 are, for example, disks that capture electrical activity from the brain and convey it out through an amplifier.


As EEG technology has progressed, researchers (e.g., researchers at Emotiv Systems) have applied the technology to create high-fidelity brain-computer interface systems that can read and interpret conscious and non-conscious thoughts as well as emotions. In this regard, the electrodes 106 of the exemplary indicia-reading system 100 can be used to record the resulting brain waves during a user's 101 concentration. Thereafter, the electrical activity of the user's 101 brain waves can be correlated, based upon the recorded pattern, to the user's 101 state of mind or to when the user 101 performs a facial expression (e.g., a wink, a smile, a frown, etc.).


A communication module pair 107A, 107B may be included respectively in the wearable computer 102 and the headset 105 of the exemplary indicia-reading system 100 for data communication. The wireless communication may include, but is not limited to, ZIGBEE® and BLUETOOTH® protocols. Although wireless communication is preferred (e.g., to provide the user with a greater range of motion), a wired connection may also be used.


Through the interface between the headset 105 and the wearable computer 102, EEG brainwave activity can be communicated in near real-time. For example, a software application program 123 running on the wearable computer 102 can monitor the user's 101 brainwave activity. The wearable computer 102 can be configured to trigger a scan event in the indicia-capturing subsystem 103 when a triggering event is detected, using a software application program (such as, for example, SWIFTDECODER MOBILE™ barcode decoding software from Honeywell International, Inc.). By way of example, the relevant events to trigger a scan event in the indicia-capturing subsystem 103 may include a facial gesture such as a strong blink by the user 101, or a mental command such as when the user 101 focuses intensely on a particular location or imagines pushing a barcode away.
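

A minimal monitoring loop of the kind this paragraph describes might look like the following sketch. The headset event stream, the event names, and the confidence field are assumptions for illustration rather than the API of any particular EEG SDK.

TRIGGER_EVENTS = {"strong_blink", "push_mental_command"}  # illustrative event names

def monitor_and_scan(headset, reader, min_confidence=0.8):
    """Trigger a scan whenever a qualifying brainwave event is detected."""
    for event in headset.events():   # near real-time stream from headset 105
        if event.name in TRIGGER_EVENTS and event.confidence >= min_confidence:
            reader.trigger_scan()    # scan event for indicia-capturing subsystem 103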


In addition to a trigger for a scan event to the indicia-capturing subsystem 103, mental commands or gesture commands can also be used to trigger any other operational feature in the indicia reader 102, such as putting it into a different mode (e.g., presentation scanning), turning on and off the indicia reader's illumination feature, or any other feature that the indicia reader supports.


The software programs 123 can also, for example, be configured to recognize the direction in which a user 101 is looking in order to determine which indicia to return to the indicia-decoding subsystem 104 when multiple indicia are present in the field of view 124 (FIG. 3). The trigger command may be used to initially begin the scan event, and all indicia in the field of view 124 could then be captured by the indicia-capturing subsystem 103 and decoded by the indicia-decoding subsystem 104. If multiple indicia are present in the field of view, the indicia closest to the direction in which the user 101 is looking can be captured by the indicia-capturing subsystem 103 and returned to the indicia-decoding subsystem 104.
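

One simple way to realize such a selection is a nearest-to-gaze rule, sketched below under the assumption that each decoded indicia carries a pixel-coordinate center (as in the earlier sketch) and that a gaze point in the same coordinate frame is available from the monitored activity.

import math

def select_by_gaze(decoded, gaze_point):
    """Return the decoded indicia whose center lies closest to the gaze point."""
    if not decoded:
        return None
    gx, gy = gaze_point
    return min(decoded, key=lambda d: math.hypot(d.center[0] - gx, d.center[1] - gy))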



FIG. 2 depicts certain components of another exemplary indicia-reading system 200 according to the present invention. The components of the exemplary system 200 depicted at FIG. 2 also provide an interface with a user's nervous system, but rather than interfacing with a user's brain as with system 100, the indicia-reading system 200 interfaces with a user's skeletal muscle activity.


A person having skill in the art will recognize that the relevant discussion with regard to the interface with a user's brain described above and depicted at FIGS. 1 and 3 is applicable to the indicia-reading system 200 that interacts with a user's skeletal muscle activity. Referring to the indicia reading system 200, the system may have elements 202, 203, 204, 205, 206, 207A, and 207B, which operate in a similar manner as corresponding elements 102, 103, 104, 105, 106, 107A, and 107B of indicia-reading system 100.


The exemplary indicia-reading system 200 includes an electromyography (EMG) band 205 that a user 201 may wear on their forearm (e.g., the MYO™ EMG armband developed by Thalmic Labs, Inc. of Ontario, Canada or related devices). Similar to the discussion regarding EEG technology above, researchers (e.g., researchers at Thalmic Labs) have applied EMG technology to read the electrical activity of a user's muscles to allow for control of a device. In this regard, the band 205 contains a number of electrodes 206 that can read the electrical activity of a user's muscles.


An indicia-reading device, for example wearable computer 202, is in electronic communication with the band 205. The communication channels may be wired or wireless, but preferably include wireless communication using a wireless communication module pair 207A, 207B.


When certain skeletal muscle activity is detected by the band 205, the system 200 is configured to trigger operations of an indicia reader 202 using hardware and software programs of the type described above with reference to FIGS. 1 and 3. The trigger event may be based upon the detected electrical activity of the user's muscles, such as when the user 201 performs an arm or hand gesture. The trigger may include, for example, a snap of two fingers, a rotation of the arm, a clench of the fist, a touch of two fingers, or various other combinations of hand, finger, or arm movements.
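

A gesture-dispatch table of the following form could implement this behavior. The gesture labels and the band's classified-gesture stream are illustrative assumptions, since commercial EMG devices each expose their own event APIs.

GESTURE_ACTIONS = {
    "finger_snap": lambda reader: reader.trigger_scan(),
    "fist_clench": lambda reader: reader.set_mode("presentation"),
    "arm_rotate":  lambda reader: reader.toggle_illumination(),
}

def handle_gestures(band, reader):
    """Dispatch reader operations in response to classified muscle gestures."""
    for gesture in band.gestures():          # classified skeletal-muscle activity
        action = GESTURE_ACTIONS.get(gesture.name)
        if action is not None:
            action(reader)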


In addition to a trigger for a scan event, muscular activity commands can also be used to trigger other operations in the indicia reader (e.g., wearable computer 202) such as putting it into a different mode (e.g., presentation scanning), turning on and off the indicia reader's illumination feature, or any other feature that the indicia reader supports. The EMG band 205 could also be used to holster an arm-mounted device/computer in addition to providing a gesture recognition system.


In another exemplary embodiment, systems that interface with a user's nervous system may be utilized to control or monitor vehicles such as forklifts, cranes, delivery trucks and similar industrial vehicles (e.g., vehicles used in industrial operations, factory or warehouse settings, and the like). References in the disclosure to particular types of vehicles are not intended to limit the disclosure to particular vehicles.



FIG. 4 is a block diagram illustrating certain components of an exemplary system 300 that interfaces with a user's nervous system and that may be utilized to control or monitor vehicles, such as forklifts, according to the present invention. The exemplary system 300 is related to detecting a safety-related incident quickly after it has occurred or to preventing it before it occurs.


Some vehicle safety systems may use inertial sensors, cameras, or other sensors to detect safety-related events. The exemplary system 300 utilizes a user's brain response to an incident to trigger a notification/alarm or responsive action by the vehicle. Such events may include, but are not limited to, the imminent collision of a forklift and a person, an operator that is losing focus on a particular task at hand, or a driver falling asleep at the wheel. The system 300 is related to detecting these events and their warning signs. Further, system 300 may be utilized to prevent the occurrence of safety incidents.


The exemplary system 300 includes an EEG headset 305 which may be of the type described above with regard to indicia-reading system 100. The headset 305 includes electrodes 306 that conduct electrical activity, capture it from the brain of a user 301, and convey it out through an amplifier. A communication module 307 may be included for data communication.


The system 300 may also include a vehicle computer 320 which may be mounted within the applicable vehicle. Rather than a vehicle-mounted computer, other computing devices may alternatively be used (e.g., wearable or handheld computing devices). Exemplary vehicle computer 320 includes a mass storage device 340 (e.g., a solid state drive, optical drive, removable flash drive or any other component with similar storage capabilities) for storing an operating system 345 (e.g., WINDOWS® 7 and WINDOWS® EMBEDDED COMPACT (i.e., WINDOWS® CE) from MICROSOFT® CORPORATION of Redmond, Wash., and the LINUX® open source operating system) and various application programs 350. The mass storage device 340 may store other types of information as well.


Main memory 330 provides for storage of instructions and information directly accessible by central processing unit (CPU) 325. Main memory 330 may be configured to include random-access memory 332 (RAM) and read-only memory 334 (ROM). The ROM 334 may permanently store firmware or a basic input/output system (BIOS), which provide first instructions to vehicle-mount computer 320 when it is booted. RAM 332 may serve as temporary and immediately accessible storage for operating system 345 and application programs 350.


As illustrated in FIG. 4, computer touch screen 370 may be provided for inputting and displaying information using vehicle-mount computer 320. Computer touch screen 370 is operably connected to, and in communication with, vehicle-mount computer 320. Although touch screen 370 is illustrated in FIG. 4, other input devices (e.g., keyboard or mouse) or display devices may be utilized in connection with vehicle mount computer 320. The vehicle computer 320 may also include speaker 380 or other types of internal or external sound output devices.


As depicted in FIG. 4, the vehicle-mount computer 320 of the exemplary system 300 may also include network interface 365. Network interface 365 is operably connected to communications network 385, enabling vehicle-mount computer 320 to communicate with communications network 385. Communications network 385 may include any collection of computers or communication devices interconnected by communication channels. The communication channels may be wired or wireless (e.g., using BLUETOOTH® protocols). Examples of such communication networks include, without limitation, local area networks, the Internet, and cellular networks.


The connection to the communications network 385 allows vehicle computer 320 to communicate with the headset 305. The vehicle computer 320 may also be in communication with vehicle systems 381 such as a controlled braking system (e.g., wired or wireless communication). As described above with regard to system 100, the EEG headset 305 can monitor the user's 301 EEG activity in near real time and transmit the activity to the vehicle computer 320. The EEG headset 305 allows for the monitoring of attention, focus, engagement, interest, excitement, affinity, relaxation and stress, all of which can be used to make inferences into the activity being performed by the user 301.


In one embodiment, a sudden detection of high excitement from the user 301 could be used to trigger vehicle systems 381 such as a controlled braking system on a forklift, given that a state of high excitement could be due to someone stepping in front of the vehicle. The predictive braking system could go into effect before the forklift operator had time to consciously process what has happened and engage in an appropriate response.
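

The threshold logic implied here might be sketched as follows. The normalized excitement metric, its threshold value, and the vehicle interface are assumptions for illustration; a deployed system would need per-user calibration and debouncing.

EXCITEMENT_SPIKE = 0.9   # assumed normalized threshold in [0, 1]

def monitor_excitement(headset, vehicle):
    """Engage controlled braking on a sudden spike in the excitement metric."""
    for sample in headset.metrics():              # e.g., {"excitement": 0.95, ...}
        if sample.get("excitement", 0.0) > EXCITEMENT_SPIKE:
            vehicle.engage_controlled_braking()   # vehicle system 381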


In another embodiment, a user 301 might be operating a piece of heavy machinery and start to lose focus on the task at hand. The vehicle system 381 would then either perform a controlled slow down or completely stop the machinery until the operator 301 has given the task of operation their full attention.


In another embodiment, a motor vehicle user 301 could be monitored using the headset 305 for signs of drowsiness, which would sound an alarm through speakers 380, or a vehicle system 381 (e.g., an ignition lock-out system) could prevent the user 301 from operating the vehicle until the state of alertness was improved. A person having skill in the art will recognize that system 300 could be configured for use in multiple different safety/vehicle situations, and system 300 is not limited to the exemplary configurations referenced above.
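

In the same illustrative style, a drowsiness monitor could combine both responses. The drowsiness score and the speaker and ignition handles below are hypothetical stand-ins for headset 305, speaker 380, and vehicle system 381.

DROWSY_THRESHOLD = 0.7   # assumed normalized threshold in [0, 1]

def monitor_drowsiness(headset, speaker, ignition):
    """Sound an alarm and lock out the ignition when drowsiness is detected."""
    for sample in headset.metrics():
        if sample.get("drowsiness", 0.0) > DROWSY_THRESHOLD:
            speaker.sound_alarm()   # audible warning through speakers 380
            ignition.lock_out()     # prevent operation until alertness improves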


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266;
  • U.S. Pat. Nos. 7,159,783; 7,413,127;
  • U.S. Pat. Nos. 7,726,575; 8,294,969;
  • U.S. Pat. Nos. 8,317,105; 8,322,622;
  • U.S. Pat. Nos. 8,366,005; 8,371,507;
  • U.S. Pat. Nos. 8,376,233; 8,381,979;
  • U.S. Pat. Nos. 8,390,909; 8,408,464;
  • U.S. Pat. Nos. 8,408,468; 8,408,469;
  • U.S. Pat. Nos. 8,424,768; 8,448,863;
  • U.S. Pat. Nos. 8,457,013; 8,459,557;
  • U.S. Pat. Nos. 8,469,272; 8,474,712;
  • U.S. Pat. Nos. 8,479,992; 8,490,877;
  • U.S. Pat. Nos. 8,517,271; 8,523,076;
  • U.S. Pat. Nos. 8,528,818; 8,544,737;
  • U.S. Pat. Nos. 8,548,242; 8,548,420;
  • U.S. Pat. Nos. 8,550,335; 8,550,354;
  • U.S. Pat. Nos. 8,550,357; 8,556,174;
  • U.S. Pat. Nos. 8,556,176; 8,556,177;
  • U.S. Pat. Nos. 8,559,767; 8,599,957;
  • U.S. Pat. Nos. 8,561,895; 8,561,903;
  • U.S. Pat. Nos. 8,561,905; 8,565,107;
  • U.S. Pat. Nos. 8,571,307; 8,579,200;
  • U.S. Pat. Nos. 8,583,924; 8,584,945;
  • U.S. Pat. Nos. 8,587,595; 8,587,697;
  • U.S. Pat. Nos. 8,588,869; 8,590,789;
  • U.S. Pat. Nos. 8,596,539; 8,596,542;
  • U.S. Pat. Nos. 8,596,543; 8,599,271;
  • U.S. Pat. Nos. 8,599,957; 8,600,158;
  • U.S. Pat. Nos. 8,600,167; 8,602,309;
  • U.S. Pat. Nos. 8,608,053; 8,608,071;
  • U.S. Pat. Nos. 8,611,309; 8,615,487;
  • U.S. Pat. Nos. 8,616,454; 8,621,123;
  • U.S. Pat. Nos. 8,622,303; 8,628,013;
  • U.S. Pat. Nos. 8,628,015; 8,628,016;
  • U.S. Pat. Nos. 8,629,926; 8,630,491;
  • U.S. Pat. Nos. 8,635,309; 8,636,200;
  • U.S. Pat. Nos. 8,636,212; 8,636,215;
  • U.S. Pat. Nos. 8,636,224; 8,638,806;
  • U.S. Pat. Nos. 8,640,958; 8,640,960;
  • U.S. Pat. Nos. 8,643,717; 8,646,692;
  • U.S. Pat. Nos. 8,646,694; 8,657,200;
  • U.S. Pat. Nos. 8,659,397; 8,668,149;
  • U.S. Pat. Nos. 8,678,285; 8,678,286;
  • U.S. Pat. Nos. 8,682,077; 8,687,282;
  • U.S. Pat. Nos. 8,692,927; 8,695,880;
  • U.S. Pat. Nos. 8,698,949; 8,717,494;
  • U.S. Pat. Nos. 8,717,494; 8,720,783;
  • U.S. Pat. Nos. 8,723,804; 8,723,904;
  • U.S. Pat. Nos. 8,727,223; D702,237;
  • U.S. Pat. Nos. 8,740,082; 8,740,085;
  • U.S. Pat. Nos. 8,746,563; 8,750,445;
  • U.S. Pat. Nos. 8,752,766; 8,756,059;
  • U.S. Pat. Nos. 8,757,495; 8,760,563;
  • U.S. Pat. Nos. 8,763,909; 8,777,108;
  • U.S. Pat. Nos. 8,777,109; 8,779,898;
  • U.S. Pat. Nos. 8,781,520; 8,783,573;
  • U.S. Pat. Nos. 8,789,757; 8,789,758;
  • U.S. Pat. Nos. 8,789,759; 8,794,520;
  • U.S. Pat. Nos. 8,794,522; 8,794,526;
  • U.S. Pat. Nos. 8,798,367; 8,807,431;
  • U.S. Pat. Nos. 8,807,432; 8,820,630;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0138685;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0056285;
  • U.S. Patent Application Publication No. 2013/0070322;
  • U.S. Patent Application Publication No. 2013/0075168;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0200158;
  • U.S. Patent Application Publication No. 2013/0256418;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0278425;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306730;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0341399;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0002828;
  • U.S. Patent Application Publication No. 2014/0008430;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0027518;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061305;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0075846;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078342;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0084068;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100774;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0108682;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0160329;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166757;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0166760;
  • U.S. Patent Application Publication No. 2014/0166761;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175169;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0175174;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0203087;
  • U.S. Patent Application Publication No. 2014/0204268;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/436,337 for an Electronic Device, filed Nov. 5, 2012 (Fitch et al.);
  • U.S. patent application Ser. No. 13/771,508 for an Optical Redirection Adapter, filed Feb. 20, 2013 (Anderson);
  • U.S. patent application Ser. No. 13/852,097 for a System and Method for Capturing and Preserving Vehicle Event Data, filed Mar. 28, 2013 (Barker et al.);
  • U.S. patent application Ser. No. 13/902,110 for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Hollifield);
  • U.S. patent application Ser. No. 13/902,144, for a System and Method for Display of Information Using a Vehicle-Mount Computer, filed May 24, 2013 (Chamberlin);
  • U.S. patent application Ser. No. 13/902,242 for a System For Providing A Continuous Communication Link With A Symbol Reading Device, filed May 24, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 13/912,262 for a Method of Error Correction for 3D Imaging Device, filed Jun. 7, 2013 (Jovanovski et al.);
  • U.S. patent application Ser. No. 13/912,702 for a System and Method for Reading Code Symbols at Long Range Using Source Power Control, filed Jun. 7, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 13/922,339 for a System and Method for Reading Code Symbols Using a Variable Field of View, filed Jun. 20, 2013 (Xian et al.);
  • U.S. patent application Ser. No. 13/927,398 for a Code Symbol Reading System Having Adaptive Autofocus, filed Jun. 26, 2013 (Todeschini);
  • U.S. patent application Ser. No. 13/930,913 for a Mobile Device Having an Improved User Interface for Reading Code Symbols, filed Jun. 28, 2013 (Gelay et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,681 for an Electronic Device Enclosure, filed Jul. 2, 2013 (Chaney et al.);
  • U.S. patent application Ser. No. 13/933,415 for an Electronic Device Case, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/459,785 for a Scanner and Charging Base, filed Jul. 3, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,823 for a Scanner, filed Jul. 3, 2013 (Zhou et al.);
  • U.S. patent application Ser. No. 13/947,296 for a System and Method for Selectively Reading Code Symbols, filed Jul. 22, 2013 (Rueblinger et al.);
  • U.S. patent application Ser. No. 13/950,544 for a Code Symbol Reading System Having Adjustable Object Detection, filed Jul. 25, 2013 (Jiang);
  • U.S. patent application Ser. No. 13/961,408 for a Method for Manufacturing Laser Scanners, filed Aug. 7, 2013 (Saber et al.);
  • U.S. patent application Ser. No. 14/018,729 for a Method for Operating a Laser Scanner, filed Sep. 5, 2013 (Feng et al.);
  • U.S. patent application Ser. No. 14/019,616 for a Device Having Light Source to Reduce Surface Pathogens, filed Sep. 6, 2013 (Todeschini);
  • U.S. patent application Ser. No. 14/023,762 for a Handheld Indicia Reader Having Locking Endcap, filed Sep. 11, 2013 (Gannon);
  • U.S. patent application Ser. No. 14/035,474 for Augmented-Reality Signature Capture, filed Sep. 24, 2013 (Todeschini);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/055,234 for Dimensioning System, filed Oct. 16, 2013 (Fletcher);
  • U.S. patent application Ser. No. 14/053,314 for Indicia Reader, filed Oct. 14, 2013 (Huck);
  • U.S. patent application Ser. No. 14/065,768 for Hybrid System and Method for Reading Indicia, filed Oct. 29, 2013 (Meier et al.);
  • U.S. patent application Ser. No. 14/074,746 for Self-Checkout Shopping System, filed Nov. 8, 2013 (Hejl et al.);
  • U.S. patent application Ser. No. 14/074,787 for Method and System for Configuring Mobile Devices via NFC Technology, filed Nov. 8, 2013 (Smith et al.);
  • U.S. patent application Ser. No. 14/087,190 for Optimal Range Indicators for Bar Code Validation, filed Nov. 22, 2013 (Hejl);
  • U.S. patent application Ser. No. 14/094,087 for Method and System for Communicating Information in an Digital Signal, filed Dec. 2, 2013 (Peake et al.);
  • U.S. patent application Ser. No. 14/101,965 for High Dynamic-Range Indicia Reading System, filed Dec. 10, 2013 (Xian);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/154,207 for Laser Barcode Scanner, filed Jan. 14, 2014 (Hou et al.);
  • U.S. patent application Ser. No. 14/165,980 for System and Method for Measuring Irregular Objects with a Single Camera filed Jan. 28, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/166,103 for Indicia Reading Terminal Including Optical Filter filed Jan. 28, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/250,923 for Reading Apparatus Having Partial Frame Operating Mode filed Apr. 11, 2014, (Deng et al.);
  • U.S. patent application Ser. No. 14/257,174 for Imaging Terminal Having Data Compression filed Apr. 21, 2014, (Barber et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/274,858 for Mobile Printer with Optional Battery Accessory filed May 12, 2014 (Marty et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/300,276 for METHOD AND SYSTEM FOR CONSIDERING INFORMATION ABOUT AN EXPECTED RESPONSE WHEN PERFORMING SPEECH RECOGNITION, filed Jun. 10, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/305,153 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 16, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/310,226 for AUTOFOCUSING OPTICAL IMAGING DEVICE filed Jun. 20, 2014 (Koziol et al.);
  • U.S. patent application Ser. No. 14/327,722 for CUSTOMER FACING IMAGING SYSTEMS AND METHODS FOR OBTAINING IMAGES filed Jul. 10, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/329,303 for CELL PHONE READING MODE USING IMAGE TIMER filed Jul. 11, 2014 (Coyle);
  • U.S. patent application Ser. No. 14/333,588 for SYMBOL READING SYSTEM WITH INTEGRATED SCALE BASE filed Jul. 17, 2014 (Barten);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/336,188 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES, Filed Jul. 21, 2014 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/340,716 for an OPTICAL IMAGER AND METHOD FOR CORRELATING A MEDICATION PACKAGE WITH A PATIENT, filed Jul. 25, 2014 (Ellis);
  • U.S. patent application Ser. No. 14/342,544 for Imaging Based Barcode Scanner Engine with Multiple Elements Supported on a Common Printed Circuit Board filed Mar. 4, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/345,735 for Optical Indicia Reading Terminal with Combined Illumination filed Mar. 19, 2014 (Ouyang);
  • U.S. patent application Ser. No. 14/355,613 for Optical Indicia Reading Terminal with Color Image Sensor filed May 1, 2014 (Lu et al.);
  • U.S. patent application Ser. No. 14/370,237 for WEB-BASED SCAN-TASK ENABLED SYSTEM AND METHOD OF AND APPARATUS FOR DEVELOPING AND DEPLOYING THE SAME ON A CLIENT-SERVER NETWORK filed Jul. 2, 2014 (Chen et al.);
  • U.S. patent application Ser. No. 14/370,267 for INDUSTRIAL DESIGN FOR CONSUMER DEVICE BASED SCANNING AND MOBILITY, filed Jul. 2, 2014 (Ma et al.);
  • U.S. patent application Ser. No. 14/376,472, for an ENCODED INFORMATION READING TERMINAL INCLUDING HTTP SERVER, filed Aug. 4, 2014 (Lu);
  • U.S. patent application Ser. No. 14/379,057 for METHOD OF USING CAMERA SENSOR INTERFACE TO TRANSFER MULTIPLE CHANNELS OF SCAN DATA USING AN IMAGE FORMAT filed Aug. 15, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/460,387 for APPARATUS FOR DISPLAYING BAR CODES FROM LIGHT EMITTING DISPLAY SURFACES filed Aug. 15, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/460,829 for ENCODED INFORMATION READING TERMINAL WITH WIRELESS PATH SELECTION CAPABILITY, filed Aug. 15, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/446,387 for INDICIA READING TERMINAL PROCESSING PLURALITY OF FRAMES OF IMAGE DATA RESPONSIVELY TO TRIGGER SIGNAL ACTIVATION filed Jul. 30, 2014 (Wang et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 29/492,903 for an INDICIA SCANNER, filed Jun. 4, 2014 (Zhou et al.); and
  • U.S. patent application Ser. No. 29/494,725 for an IN-COUNTER BARCODE SCANNER, filed Jun. 24, 2014 (Oberpriller et al.).


In the specification and/or figures, typical embodiments and environments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. An indicia-reading system having an interface with a user's nervous system, comprising: a band comprising electrodes capable of detecting electrical signals produced in the skeletal muscles of a user when the user makes one or more facial gestures; and a hands-free indicia reader in communication with the band, comprising (i) a central processing unit and memory, (ii) an indicia capturing subsystem for acquiring information about indicia within the indicia-capturing subsystem's field of view, and (iii) an indicia-decoding subsystem configured for decoding indicia information acquired by the indicia-capturing subsystem; wherein the hands-free indicia reader is configured to monitor the electrical signals detected by the band and to perform a scanning operation in response to an electrical signal detected by the band.
  • 2. The system according to claim 1, wherein the scanning operation comprises acquiring information about indicia within the indicia-capturing subsystem's field of view.
  • 3. The system according to claim 1, wherein the scanning operation comprises placing the indicia reader into a different scanning mode, the different scanning mode comprising activating or deactivating a feature supported by the indicia reader.
  • 4. The system according to claim 1, wherein the facial gesture comprises a blink or a wink by the user.
  • 5. The system according to claim 1, wherein the facial gesture comprises the user focusing intensely on a particular location.
  • 6. The system according to claim 5, wherein the particular location comprises a barcode.
  • 7. The system according to claim 1, wherein the facial gesture comprises looking in a particular direction.
  • 8. The system according to claim 1, wherein the system is configured to recognize, based at least in part on the electrical signals, a direction in which the user is looking.
  • 9. The system according to claim 8, wherein the system is configured to identify an indicia from among a plurality of indicia present in the field of view based at least in part on the direction in which the user is looking, and to cause the indicia reader to perform a scanning operation in respect of the identified indicia.
  • 10. An indicia-reading system having an interface with a user's nervous system, comprising: a band comprising electrodes capable of detecting electrical signals produced in the skeletal muscles of a user when the user makes one or more facial gestures; and a hands-free indicia reader in communication with the band, comprising (i) a central processing unit and memory, (ii) an imaging module for capturing images of indicia within the imaging module's field of view, and (iii) an indicia-decoding subsystem configured for decoding indicia within images captured by the imaging module; wherein the system is configured to monitor the electrical signals detected by the band and to detect a triggering event; and in response to detecting the triggering event, to cause the imaging module to capture an image and cause the indicia-decoding subsystem to attempt to locate and decode an indicia in the captured image.
  • 11. The system according to claim 10, wherein the indicia reader is configured to perform an operation in response to electrical signals detected by the band.
  • 12. The system according to claim 11, wherein the indicia reader operation is capturing images of indicia within the imaging module's field of view.
  • 13. The system according to claim 11, wherein the indicia reader operation is placing the indicia reader into a different scanning mode, the different scanning mode comprising activating or deactivating a feature supported by the indicia reader.
  • 14. The system according to claim 10, wherein the facial gesture comprises a blink or a wink by the user.
  • 15. The system according to claim 10, wherein the facial gesture comprises looking in a particular direction.
  • 16. The system according to claim 10, wherein the system is configured to recognize, based at least in part on the electrical signals, a direction in which the user is looking.
  • 17. The system according to claim 10, wherein the facial gesture comprises the user focusing intensely on a particular location.
  • 18. The system according to claim 17, wherein the particular location comprises a barcode.
  • 19. The system according to claim 18, wherein the system is configured to identify an indicia from among a plurality of indicia present in the field of view based at least in part on the direction in which the user is looking, and to cause the indicia reader to perform a scanning operation in respect of the identified indicia.
Related Publications (1)
  • US 2017/0139484 A1, May 2017 (US)
Continuations (1)
  • Parent: U.S. Appl. No. 14/735,717, filed Jun. 2015 (US)
  • Child: U.S. Appl. No. 15/357,044 (US)