The present invention generally relates to human machine interfaces and more particularly relates to hands-free interactive devices for use in a vehicle.
Generally speaking, modern-day society in the developed world has been significantly influenced by technological advances in cellular phones, smartphones, and other types of mobile devices. When mobile devices are used in a vehicle, however, a driver can easily become distracted when his or her attention is diverted away from the primary responsibility of safely operating the vehicle. Every year, hundreds of thousands of accidents and thousands of fatalities on U.S. roads and highways are attributed to distracted driving. A large percentage of these distractions are caused by the use of cell phones and smartphones.
Many states have instituted laws prohibiting certain uses of mobile devices while driving, such as texting while driving. Despite the good intentions behind these laws, however, many drivers continue to be distracted by their mobile devices while operating a vehicle.
Numerous devices have been developed over the years to attempt to reduce the level of driver distraction by simplifying certain actions, such as answering a cell phone call, placing a cell phone call, talking on the cell phone, etc. Many of these driver-assistance devices are configured to be mounted on the dashboard or windshield of the vehicle and can therefore impede the driver's view, creating another unsafe driving condition.
Therefore, a need exists for improved human machine interfaces for use in a vehicle. Particularly, human machine interfaces can be developed, as described in the present disclosure, to reduce the level of driver distraction and enable hands-free usage without interfering with the driver's view of his or her surroundings. Such devices can be used with any type of vehicle, such as an automobile, truck, delivery van, tractor trailer, etc.
Accordingly, in one aspect, the present invention embraces human machine interfaces for use in a vehicle. The human machine interfaces can be hands-free devices allowing the driver to maintain his or her hands on the vehicle's steering wheel at all times.
In an exemplary embodiment according to the teachings of the present disclosure, a human machine interface (HMI) comprises a housing configured to be mounted on a vehicle. A sensor, disposed in the housing, is configured to sense image input from a driver of the vehicle. A microphone, also disposed in the housing, is configured to receive speech input from the driver. A speaker, which is also disposed in the housing, is configured to provide audio output to the driver. The human machine interface further comprises a processing device disposed within the housing and coupled with the sensor, microphone, and speaker. The processing device is configured to process the image input and speech input.
In another exemplary embodiment according to the teachings of the present disclosure, another human machine interface (HMI) is described. The HMI comprises a housing configured to be mounted inside the cabin of a vehicle. The HMI further includes a gesture sensing device disposed in the housing. The gesture sensing device is configured to sense gestures of a driver of the vehicle. A microphone, disposed in the housing, is configured to receive speech input from the driver. A speaker, also disposed in the housing, is configured to provide audio output to the driver. The HMI further includes a heads-up display (HUD) projector disposed in the housing. The HUD projector is configured to project an image onto a windshield of the vehicle. Also, the HMI comprises a processing device disposed within the housing and electrically coupled with the gesture sensing device, microphone, speaker, and HUD projector. The processing device is configured to process gesture and speech input received from the driver. Also, the processing device is configured to generate audio output to be provided to the driver via the speaker and visual output to be provided to the driver via the HUD projector.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
The present invention embraces hands-free human machine interfaces that can be used in a vehicle. The example human machine interface (HMI) devices disclosed herein may be responsive to a user (preferably a driver of the vehicle seated in the driver's seat). In some embodiments, the HMI devices may also be responsive to other passengers within the cabin of the vehicle. The HMI devices of the present invention can receive both speech input and gesture input from a user (e.g., the driver of the vehicle).
The HMI devices disclosed herein enable the driver to remain focused on the primary responsibility of operating the vehicle, while allowing the driver to interact with other devices, such as mobile devices and global positioning system (GPS) devices. By establishing a short range communication channel (e.g., Bluetooth™) between the HMI device and the driver's mobile device (e.g., cell phone, smartphone, GPS device, etc.), the HMI device enables the driver to receive calls, place calls, request directions, etc. without physically touching the HMI device or mobile device. Instead, the driver can control the mobile device via the HMI device using voice commands and gestures.
Gesture input may be sensed by monitoring the positioning of the driver's head, eyes, mouth, shoulders, arms, hands, and/or fingers. Gesture input may also be sensed by monitoring the movement of the driver, including various dynamic actions such as tilting the head, blinking the eyes, moving the mouth, waving the hand, lifting one or more fingers, etc.
The HMI devices described in the present disclosure are intended to be easy to use without excessively distracting the driver while operating a vehicle. The driver can interact with the HMI devices hands-free. Also, the driver can receive information without looking away from the road.
Also, the HMI devices disclosed herein can be incorporated into a rear view mirror assembly, which can replace an existing rear view mirror of a vehicle. In other implementations, the HMI devices can be incorporated in a housing that is configured to be attached to an existing rear view mirror of the vehicle. By combining the HMI device with a rear view mirror assembly, according to the teachings of the present disclosure, the driver's ability to see through the windshield is not obstructed by supplemental devices.
In an exemplary embodiment, as shown in
As shown in the embodiment of
Attached to a bottom side of the housing 12 is an interface unit 24. The front side of the interface unit 24 (facing toward the rear of the vehicle) may include, for example, a microphone 26, a speaker 28, a rear facing camera 30, a first gesture sensor 32, and a second gesture sensor 34. In some embodiments, the interface unit 24 may include a single gesture sensing element or any number of gesture sensing elements instead of the two gesture sensors 32, 34 as shown. The interface unit 24 may also be configured to include multiple speakers in place of the single speaker 28.
The gesture sensors 32, 34, rear facing camera 30, and/or front facing camera 40 may comprise any suitable type of image sensing technology. For example, the sensors and cameras may use any combination of optical lenses, apertures, light sensors, image sensors, infrared sensors, or other suitable image sensing and processing devices.
In operation, the HMI 10 is configured to receive input from the user (e.g., the driver). Input may be in the form of voice commands from the driver received through the microphone 26. Other types of input may include gestures that are sensed by the first and second gesture sensors 32, 34. In addition to user-initiated input, the HMI 10 also detects other types of input, such as images sensed by the rear facing camera 30 and front facing camera 40.
The HMI 10 processes these inputs to perform various tasks. For example, user inputs may be processed to perform cellular phone actions, such as answering a cell phone call, placing a cell phone call to a particular person, determining a cell phone number of a particular person from a contact list stored in the cell phone, ending a cell phone call, adjusting volume levels, etc. In addition, user inputs may be processed by the HMI 10 to perform GPS actions, such as requesting information regarding the location of the vehicle, entering a destination to request navigational instructions to the destination, etc.
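By way of a non-limiting illustration, the routing of recognized user commands to cellular and GPS actions described above can be sketched as a simple dispatch table. All function and command names below are hypothetical; the present disclosure does not prescribe any particular implementation:

```python
# Hypothetical sketch of command dispatch, assuming commands arrive as
# normalized strings from the speech/gesture recognizers.

def answer_call():
    return "call answered"

def end_call():
    return "call ended"

def start_navigation(destination):
    return f"navigating to {destination}"

# Dispatch table mapping recognized command phrases to actions.
COMMANDS = {
    "answer call": answer_call,
    "end call": end_call,
}

def dispatch(command):
    """Route a recognized command string to its handler."""
    if command.startswith("navigate to "):
        return start_navigation(command[len("navigate to "):])
    handler = COMMANDS.get(command)
    if handler is None:
        return "unrecognized command"
    return handler()
```

A production implementation would forward such actions over the short range link to the paired mobile device rather than returning strings.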
The HMI 10 provides output for the user in the form of audio signals via the speaker 28 and visual signals via the HUD projector 42. For cellular services, the speaker 28 may be configured to provide audio signals received from the caller on the other end of the “line.” The speaker 28 may also be configured to play various phone-type tones to indicate an incoming call, a disconnected call, a busy signal, etc. Regarding GPS services, the speaker 28 may be configured to provide voice instructions regarding the location of the vehicle, turn-by-turn instructions to reach a particular destination, distance to the particular destination, speed of the vehicle, etc.
Furthermore, according to some embodiments, the HMI 10 may include other output in addition to the audio output radiating from the speaker 28. Particularly, the HUD projector 42 may be configured to project images to assist the driver. The HUD projector 42 is preferably configured to project images onto the windshield of the vehicle, but may also be configured in some embodiments to project images on the mirror 22.
Regarding cellular services, the HUD projector 42 may be configured to provide information associated with the other caller. For example, information such as the caller's name, phone number, picture, etc. may be projected by the HUD projector 42, preferably in an area on the windshield or mirror 22 that does not obstruct the driver's view.
Regarding GPS services, the HUD projector 42 may be configured to provide information relating to the location and/or coordinates of the vehicle, directions to a particular destination, etc. The information regarding the location of the vehicle may include a name of a state, city, or county in which the vehicle is located, a name of a street, road, or highway on which the vehicle is currently operating, etc. Information regarding directions may include the name or names of one or more upcoming streets, roads, or highways on which the vehicle is to be turned, arrows showing upcoming turning directions, lines and road names showing intersecting roads, etc.
The HMI 10 is not only capable of assisting with cellular and GPS services, but may also be configured to provide other information to the driver that is not in response to user-initiated commands. For example, the rear facing camera 30 and front facing camera 40 are capable of capturing images that can be processed to determine the status of the vehicle. The HMI 10 may be configured to process the images from the cameras 30, 40 to determine whether an accident is imminent, whether objects in the general path of the vehicle may be obstacles, if the vehicle appears to be operated unsafely, if traffic laws are not being followed or may potentially be broken, etc.
When certain situations are recognized, the HMI 10 may be configured to provide an alert, warning, or other type of signal to the driver. The type of signal and characteristics of the signal provided to the user may be dependent on the imminence of an accident, severity of an unsafe condition, etc. The alerts, warnings, and other signals may be provided as audio signals (e.g., beeps, tones, verbal instructions, etc.) via the speaker 28 and/or as visual signals (e.g., flashing lights, warning terminology, symbols, etc.) via the HUD projector 42.
It should be noted that the HMI 10 may be mounted inside the cabin of the vehicle in a conventional manner near the top center of the windshield. In other embodiments, the HMI 10 may be mounted at the location of one or both of the side mirrors on the vehicle, particularly if the vehicle is a tractor trailer and the trailer would obstruct the view through a conventional rear view mirror. The HMI 10 may also be mounted on any rear view mirror, side mirror, or other mirror located inside or outside the cabin of any type of vehicle. Furthermore, the HMI 10 may be mounted on the windshield, side mirrors, gas tank, or frame of a motorcycle.
The internal circuitry 46 of the HMI 10 includes a processing device 50 configured to perform numerous operations. The processing device 50 may be a general-purpose or specific-purpose processor or microcontroller for controlling the operations and functions of the HMI 10. In some implementations, the processing device 50 may include a plurality of processors for performing different functions within the HMI 10.
The processing device 50 may include analog-to-digital converters for converting analog signals to digital signals. For example, analog audio signals may be received from the microphone 26 and/or analog video images may be received from the gesture sensors 32, 34 and/or cameras 30, 40. The processing device 50 may also include digital-to-analog converters for converting digital signals to analog signals for output to the speaker 28 and/or HUD projector 42.
The circuitry 46 further comprises memory 52, which may include volatile and non-volatile memory. In some embodiments, the memory 52 may store software programs allowing the processing device 50 to execute various functions as described herein. The memory 52 may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units, each including a tangible storage medium. The various storage units may include any combination of volatile memory and non-transitory, non-volatile memory. For example, volatile memory may comprise random access memory (RAM), dynamic RAM (DRAM), etc. Non-volatile memory may comprise read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, etc. The storage units may be configured to store any combination of information, data, instructions, software code, etc.
A power source 54 disposed in the housing 12 may be used to provide electrical power to the circuitry 46. For example, the power source 54 may include one or more batteries or may include an electrical adapter connected to the vehicle's battery. When the power source 54 is embodied as an electrical adapter, the adapter is capable of converting the vehicle battery's voltage (e.g., about 12 volts) to an appropriate voltage level (e.g., about 2.5 volts) for powering the circuitry 46.
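As a rough, idealized illustration of the voltage step-down described above, a lossless buck regulator converting the nominal 12-volt battery voltage to a 2.5-volt logic supply would operate at a duty cycle of roughly 21 percent. The sketch below ignores conversion losses and input-voltage swing, which a real adapter must accommodate:

```python
def ideal_buck_duty_cycle(v_in, v_out):
    """Duty cycle D = Vout / Vin for an ideal (lossless) buck converter."""
    if not 0 < v_out <= v_in:
        raise ValueError("buck converter requires 0 < Vout <= Vin")
    return v_out / v_in

# Nominal 12 V vehicle battery stepped down to a 2.5 V logic supply.
duty = ideal_buck_duty_cycle(12.0, 2.5)  # about 0.208
```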
The circuitry 46 further includes user input devices, such as the microphone 26, rear facing camera 30, and forward facing camera 40. A gesture sensing unit 56 is another user input device that operates in coordination with the first and second gesture sensors 32, 34 to obtain gesture input. The user input received from the user input devices is provided to the processing device 50, which analyzes the input to carry out its various tasks. The processing device 50 then provides output to the user through output devices, such as the speaker 28 and HUD projector 42.
In some embodiments, the input devices and output devices may include fewer devices or more devices than what is illustrated in
The circuitry 46 shown in
The short range communication unit 58 may utilize Bluetooth™, Bluetooth low energy, or other short range wireless radio frequency technology to create a piconet with one or more mobile devices (e.g., cellular devices, smartphones, personal digital assistants (PDAs), etc.). In this manner, the user can control the one or more mobile devices by way of the HMI 10 using spoken commands and/or gestures.
Also, the short range communication unit 58 may be configured with near field communication (NFC) capabilities. When a mobile device is brought within a short range (e.g., within about 10 cm) of the HMI 10 or is tapped against the HMI 10, a link is established between the mobile device and the HMI 10, allowing further communication between the two devices. In some embodiments, an additional NFC unit and antenna may be incorporated in the housing 12 to enable NFC operations. According to other implementations, the short range communication unit 58 may be configured with radio frequency identification (RFID) technology to establish a link between the mobile device and HMI 10 when the two devices are brought into proximity with one another.
The GPS communication unit 62 may communicate with GPS satellites to determine the position and travel direction of the vehicle with respect to Earth coordinates. The GPS communication unit 62 may be configured to store and/or download road maps and determine turn-by-turn directions to a specific destination.
The task execution component 70, according to some embodiments, may comprise an ordered listing of executable instructions for implementing logical functions. The instructions can be embodied in any non-transitory, computer-readable medium for use by an instruction execution system or device, such as a computer-based system, processor-controlled system, etc.
As shown in the embodiment of
The gesture recognition module 72 is configured to receive images from the first and second gesture sensors 32, 34 and determine when the driver makes certain predefined gestures for communicating various commands. The gesture recognition module 72 may be configured to recognize the positioning of the driver's hands and fingers, movement of the driver's eyes, head, hands, etc. The detected gestures and/or interpretations of the gestures can be communicated to the processing device 50 as specific commands for further processing.
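As one hypothetical sketch of the kind of processing the gesture recognition module 72 might perform, a hand-wave gesture can be detected by counting direction reversals in the tracked horizontal position of the driver's hand across frames. The function name, thresholds, and normalized-coordinate assumption below are illustrative only:

```python
def detect_wave(x_positions, min_reversals=2, min_travel=0.05):
    """Classify a horizontal hand wave from per-frame x coordinates
    (normalized 0..1) by counting direction reversals whose travel
    exceeds a small noise threshold."""
    reversals = 0
    direction = 0          # -1 moving left, +1 moving right, 0 unknown
    last = x_positions[0]
    for x in x_positions[1:]:
        delta = x - last
        if abs(delta) < min_travel:
            continue       # ignore sub-threshold jitter
        new_dir = 1 if delta > 0 else -1
        if direction and new_dir != direction:
            reversals += 1
        direction = new_dir
        last = x
    return reversals >= min_reversals
```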
The speech recognition module 74 is configured to receive speech signals from the microphone 26 and determine specific vocal commands. The received speech is analyzed with respect to predefined audible commands to determine the user's commands. The commands are communicated to the processing device 50 for further processing.
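A minimal sketch of matching transcribed speech against a set of predefined audible commands follows, using fuzzy string similarity to tolerate small transcription errors. The command set and cutoff value are hypothetical:

```python
import difflib

# Hypothetical predefined command vocabulary.
PREDEFINED_COMMANDS = [
    "answer call",
    "end call",
    "place call",
    "start navigation",
]

def match_command(utterance, cutoff=0.6):
    """Match a transcribed utterance against the predefined command set,
    returning the best match or None if nothing is close enough."""
    normalized = utterance.lower().strip()
    matches = difflib.get_close_matches(normalized, PREDEFINED_COMMANDS,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else None
```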
The accident analysis module 76 is configured to receive visual input from the front facing camera 40. Images are analyzed in real time to determine if an accident is imminent. The obstacle warning module 78 may be configured to receive visual input from the front facing camera 40 and/or the rear facing camera 30. Images are analyzed by the obstacle warning module 78 to determine if one or more objects pose a threat as potentially being an obstacle in an estimated path of the vehicle based on the current status of the vehicle.
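One common monocular heuristic for the kind of imminence estimate performed by the accident analysis module 76 and obstacle warning module 78 derives a time-to-collision from the frame-to-frame growth of an object's apparent size: a closing object's image expands, and the expansion rate bounds the time remaining. The sketch below assumes object widths (e.g., in pixels) from a detector and a known inter-frame interval; names and the threshold are illustrative:

```python
def time_to_collision(width_prev, width_curr, frame_dt):
    """Estimate time-to-collision (seconds) from the growth of an object's
    apparent width between two frames, frame_dt seconds apart.
    For an object closing at constant speed, TTC = dt / (s - 1),
    where s = width_curr / width_prev is the scale expansion."""
    if width_curr <= width_prev:
        return float("inf")  # object is not closing
    expansion = width_curr / width_prev
    return frame_dt / (expansion - 1.0)

def is_threat(width_prev, width_curr, frame_dt, ttc_threshold=2.0):
    """Flag an object as an imminent obstacle when TTC drops below threshold."""
    return time_to_collision(width_prev, width_curr, frame_dt) < ttc_threshold
```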
The navigation module 80 is configured to operate in conjunction with the GPS communication unit 62 to receive input regarding the earth location of the vehicle, speed of the vehicle, etc. The navigation module 80 is also configured to receive audible commands from the user via the microphone 26 and speech recognition module 74. The navigation module 80 processes the user commands relevant to navigation or other GPS related information. For example, the user may speak a command to start a navigational mode and speak a destination or address. The navigation module 80 may determine directions or other information based on the current status of the vehicle, commands, pre-stored road maps, etc.
The HUD video generation module 82 and HUD combiner module 84 may operate together to provide images that are projected onto the windshield of the vehicle. Based on the services being provided, the HUD video generation module 82 generates different types of video images for assisting the driver in various ways. If the accident analysis module 76 or obstacle warning module 78 detects a potential issue, the HUD video generation module 82 may create video images to accentuate or highlight the potential issue.
The HUD combiner module 84 is configured to obtain live images, such as those received from the front facing camera 40. The live images are combined with the images produced by the HUD video generation module 82. The HUD combiner module 84 can match up or align the generated images with the live images such that the driver can essentially see an overlay of supplemental information (e.g., highlighted potential obstacles, speed of vehicle, directional instructions, cellular caller information, etc.) on the windshield. The HUD combiner module 84 can be adjusted for the specific shape or design of the windshield and angle at which the HUD projector 42 is directed to the windshield.
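The align-and-blend operation attributed to the HUD combiner module 84 can be sketched as compositing a generated overlay onto a live frame at a calibrated offset. Grayscale grids stand in for images, the offsets stand in for the per-windshield and projector-angle calibration, and all names are hypothetical:

```python
def combine_overlay(live, overlay, offset_row=0, offset_col=0, alpha=0.6):
    """Blend a generated HUD overlay onto a live frame (grayscale grids,
    pixel values 0..255).  Overlay pixels of 0 are treated as transparent;
    offset_row/offset_col model the alignment calibration for the
    windshield shape and projector angle."""
    rows, cols = len(live), len(live[0])
    out = [row[:] for row in live]          # copy; do not mutate input
    for r, orow in enumerate(overlay):
        for c, val in enumerate(orow):
            rr, cc = r + offset_row, c + offset_col
            if val and 0 <= rr < rows and 0 <= cc < cols:
                out[rr][cc] = round(alpha * val + (1 - alpha) * live[rr][cc])
    return out
```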
It should be noted that the task execution component 70 may include fewer or more modules than those shown in
Additional modules may be incorporated into the HMI 10. For example, driver fatigue sensors may be used in cooperation with the gesture sensors 32, 34 to determine if the driver's eyes are drooping or if the driver's head is nodding. Also, music selection systems may be incorporated into the HMI 10 to enable the user to select a radio station or select a song stored on a compact disc, flash memory device, MP3 player, or other medium or device in communication with the HMI 10, allowing the driver to select music hands-free.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
Number | Name | Date | Kind |
---|---|---|---|
6832725 | Gardiner et al. | Dec 2004 | B2 |
7128266 | Marlton et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Suzhou et al. | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8736909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
20040246607 | Watson et al. | Dec 2004 | A1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedraro | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140100813 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140121438 | Kearney | May 2014 | A1 |
20140121445 | Ding et al. | May 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Lui et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van Horn et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071818 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150085127 | Kramer | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chang et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
Number | Date | Country |
---|---|---|
2013163789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
Entry |
---|
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Appl. No. 14/462,801 for Mobile Computing Device With Data Cognition Software, filed on Aug. 19, 2014 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/596,757 for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages. |
U.S. Appl. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages. |
U.S. Appl. No. 14/662,922 for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages. |
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 14/614,796 for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages. |
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 14/578,627 for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages. |
U.S. Appl. No. 14/573,022 for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages. |
U.S. Appl. No. 14/529,857 for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages. |
U.S. Appl. No. 14/519,195 for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages. |
U.S. Appl. No. 14/519,211 for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages. |
U.S. Appl. No. 14/519,233 for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages. |
U.S. Appl. No. 14/533,319 for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 14/748,446 for Cordless Indicia Reader With a Multifunction Coil for Charging and EAS Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages. |
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages. |
U.S. Appl. No. 14/519,249 for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages. |
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages. |
U.S. Appl. No. 14/398,542 for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Sian et al.); 22 pages. |
U.S. Appl. No. 14/405,278 for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages. |
U.S. Appl. No. 14/590,024 for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages. |
U.S. Appl. No. 14/568,305 for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
U.S. Appl. No. 14/580,262 for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages. |
U.S. Appl. No. 14/519,179 for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages. |
U.S. Appl. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014, (Ackley et al.); 39 pages. |
U.S. Appl. No. 14/453,019 for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages. |
U.S. Appl. No. 14/452,697 for Interactive Indicia Reader, filed Aug. 6, 2014, (Todeschini); 32 pages. |
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages. |
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 14/513,808 for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages. |
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages. |
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages. |
U.S. Appl. No. 14/483,056 for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages. |
U.S. Appl. No. 14/531,154 for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 14/340,627 for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Rueblinger et al.); 41 pages. |
U.S. Appl. No. 14/676,327 for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages. |
U.S. Appl. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages. |
U.S. Appl. No. 14/327,827 for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages. |
U.S. Appl. No. 14/334,934 for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages. |
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages. |
U.S. Appl. No. 14/619,093 for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages. |
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/614,706 for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages. |
U.S. Appl. No. 14/628,708 for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/529,563 for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/695,364 for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 44 pages. |
U.S. Appl. No. 14/664,063 for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/527,191 for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/535,764 for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages. |
U.S. Appl. No. 14/687,289 for System for Communication Via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/674,329 for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages. |
U.S. Appl. No. 14/695,923 for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages. |
Number | Date | Country | |
---|---|---|---|
20160325677 A1 | Nov 2016 | US |