The present invention relates to improvements in speech recognition.
In environments where mobile devices are performing voice recognition, many factors in the environment can negatively impact speech recognition performance. For example, when such systems are utilized in an environment wherein the ambient noise level changes from location to location (e.g., on a factory or warehouse floor) the ability of the mobile device to perform accurate speech recognition can vary depending upon the user's proximity to noise sources such as production machinery.
Therefore, a need exists for a mechanism to cope with variable sources of audible noise that interferes with accurate speech recognition.
Accordingly, in one aspect, the present invention embraces a mechanism for changing gain and other audio system characteristics based upon location of the portable device.
In an example embodiment, a device has a network interface that receives a set of instructions from a server, the instructions comprising a sequence of at least one location and audio properties associated with the at least one location. An audio circuit receives audio signals picked up by a microphone and processes the audio signals in a manner defined by the audio properties associated with the at least one location. A speech recognition module receives processed signals from the audio circuit and carries out a speech recognition process thereupon.
In accord with certain example embodiments, audio signals picked up by the microphone are stored and conveyed to a server. In accord with certain example embodiments, the speech recognition module utilizes a user template that characterizes speech of a particular user to enhance recognition accuracy. In accord with certain example embodiments, the audio circuit comprises an amplifier and where the gain of the amplifier is set by the audio properties for the at least one location. In accord with certain example embodiments, the audio circuit comprises a noise comparison circuit that compares the audio with a noise model defined by the audio properties, and where the audio from the microphone is discarded if the audio matches the noise model. In accord with certain example embodiments, the audio properties for the at least one location are loaded after receiving a confirmation that the terminal has arrived at the at least one location. In accord with certain example embodiments, the confirmation comprises an audio signal picked up by the microphone. In accord with certain example embodiments, a speech synthesizer synthesizes speech instruction from the set of instructions.
In another example embodiment, a portable terminal has a wireless network interface that receives a set of instructions from a server, the instructions comprising a sequence of at least one location and audio properties associated with the at least one location. An audio circuit receives audio signals picked up by a microphone and processes the audio signals in a manner defined by the audio properties associated with the at least one location. The audio circuit has an amplifier and the gain of the amplifier is set by the audio properties for the at least one location. The audio circuit may include a noise comparison circuit that compares the audio with a noise model defined by the audio properties, and where the audio is discarded if the audio matches the noise model. A speech recognition module receives processed signals from the audio circuit and carries out a speech recognition process thereupon. A speech synthesizer synthesizes speech instruction from the set of instructions.
In accord with certain example embodiments, audio signals picked up by the microphone are stored and conveyed to a server. In accord with certain example embodiments, the speech recognition module utilizes a user template that characterizes speech of a particular user to enhance recognition accuracy. In accord with certain example embodiments, the audio properties for the at least one location are loaded after receiving a confirmation that the terminal has arrived at the at least one location. In accord with certain example embodiments, the confirmation comprises an audio signal picked up by the microphone.
In another example embodiment, a method of processing speech signals at a portable terminal involves: receiving a set of instructions from a server, the set of instructions including at least one location, a set of actions to be carried out at the at least one location, and a set of audio processing parameters associated with the at least one location; synthesizing a speech command to proceed to the at least one location; receiving a speech signal from a microphone confirming arrival at the at least one location; loading the audio processing parameters associated with the at least one location; and processing speech signals received from the microphone using the audio processing parameters associated with the at least one location.
In certain example methods, audio signals picked up by the microphone are stored. In accord with certain example embodiments, the speech recognition module utilizes a user template that characterizes speech of a particular user to enhance speech recognition accuracy. In accord with certain example embodiments, the audio processing parameters include an amplifier gain, and where the amplifier gain establishes the gain of an amplifier that amplifies signals from the microphone. In accord with certain example embodiments, the method further involves comparing audio signals received at the microphone with a noise model defined by the audio processing parameters, and where the audio from the microphone is discarded if the audio matches the noise model. In accord with certain example embodiments, the audio processing parameters include at least one of a compression value, and a frequency response parameter that processes signals from the microphone. In accord with certain example embodiments, the audio properties for the at least one location are loaded after receiving a confirmation that the terminal has arrived at the at least one location.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
The present invention embraces a mechanism for utilizing workflow progression information to control various audio characteristics used in processing a speech signal received from a user before the signal is passed to speech recognition components.
In an exemplary embodiment, a system such as the Vocollect™ system produced by Honeywell International, Inc. (e.g., including a portable device) is utilized to conduct various transactions. In one example, as depicted by system 10 in
In one example, the system 10 may be utilized in a warehouse or production floor to provide working instructions for user 14. For example, user 14 may be responsible for picking items from shelves in a warehouse to help fulfill a customer's order. In another example, user 14 may be responsible for picking production parts and delivering them to other workers on a factory floor. In either case, the user works from “pick instructions” conveyed by the terminal 18 to the user 14.
While in the usage area (e.g., a warehouse), the wireless portable terminal 18 communicates with server 26 to transfer many types of information. If the terminal 18 knows its location or workflow progression at all times, it can relay that location information along with the aforementioned data. One example of data tagged with this location information is the local noise level or an actual audio sample of the noise.
As user 14 operates and moves about the area, the system can ‘learn’ where the noise levels are highest and lowest, as well as what the characteristics of that noise are, and adapt itself accordingly to improve recognition. For instance, knowing that a certain area is particularly noisy, the system can automatically adjust the input gain applied to signals from microphone 34 and/or adjust a noise model to better cope with the ambient noise levels in the environment.
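As one illustrative sketch of this 'learning' step, the terminal could tag each ambient-noise measurement with the location at which it was captured and average the samples per location; the function names and units below are assumptions for illustration, not part of the described system:

```python
import statistics

# Hypothetical log: location -> list of measured ambient noise levels (dB).
# The terminal accumulates these for upload to the server, which can then
# "learn" which locations are noisiest.
noise_log = {}

def record_noise(location, level_db):
    """Store a noise measurement tagged with the location where it was taken."""
    noise_log.setdefault(location, []).append(level_db)

def learned_level(location):
    """Average noise level observed at a location, or None if uncharacterized."""
    samples = noise_log.get(location)
    return statistics.mean(samples) if samples else None

record_noise("aisle 7", 78.0)
record_noise("aisle 7", 82.0)
record_noise("aisle 12", 61.0)
```

A location with no recorded samples simply returns None, matching the point below that only the noisier locations may need characterization.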
In one example embodiment that is common for use of this type of system, the user 14 starts off receiving a set of pick instructions from the server which might include, for example, the task of picking three items as follows:
Pick quantity 2 from aisle 7, bin 4.
Pick quantity 1 from aisle 8, bin 43.
Pick quantity 3 from aisle 12, bin 77.
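A pick-instruction set of this kind, with per-location audio properties attached by the server, might be represented as in the following sketch; all field names and values are hypothetical:

```python
# Hypothetical representation of the three-item pick above. Each step carries
# the audio properties the terminal should apply at that location.
pick_instructions = [
    {"aisle": 7,  "bin": 4,  "quantity": 2,
     "audio": {"gain_db": -6.0, "noise_model": "machinery_a"}},
    {"aisle": 8,  "bin": 43, "quantity": 1,
     "audio": {"gain_db": 0.0, "noise_model": None}},
    {"aisle": 12, "bin": 77, "quantity": 3,
     "audio": {"gain_db": -3.0, "noise_model": "conveyor"}},
]

def prompt_for(step):
    """Synthesized prompt directing the user to the next aisle."""
    return f"go to aisle {step['aisle']} and then say 'ready'"
```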
It is noted that the present teachings are not limited to “pick instructions” per se, but rather are applicable to any set of instructions used to direct a user 14 in accomplishing a task, or to any situation where a user is moving around while utilizing speech recognition. Returning to this example, the pick information is conveyed to the wireless portable terminal 18, which aids the user 14 in completing the pick by first telling the user (by speech synthesis and/or display) to proceed to aisle 7. The message might be “go to aisle 7 and then say ‘ready’”. The user 14 can then proceed to aisle 7 and acknowledge arrival at that location (aisle 7) by saying “ready” into microphone 34. At this point, the wireless portable terminal 18 will know that the user is at aisle 7. In one embodiment, the terminal 18 can now monitor the ambient sounds for use in characterizing the environment of aisle 7. Data representing the ambient noise at aisle 7 can be stored for later transmission back to the server 26, sent immediately if the network connection is solid, or queued to be sent later.
Also, once the user has confirmed the location (aisle 7), if the location is known to have a high background noise level, the terminal 18 can reduce the audio gain, adjust frequency response, adjust compression, or utilize one or more noise models to improve the quality of the speech recognition. This knowledge, in certain examples, comes as part of the pick instructions directing the user 14 to each particular location.
Once the user 14 has confirmed his or her location at aisle 7, terminal 18 can provide instructions to go to bin 4 with an instruction such as “go to bin 4 and then say ‘ready’”. The user acknowledges arrival by saying “ready”. When the user's location is refined (i.e., the user is now at bin 4), the audio characteristics may be further adjusted if desired (or a single set of characteristics can be used for the entire aisle). For example, one end of aisle 7 may be close to a noisy machine while the other end of the aisle may be considerably quieter. In other examples, an average can be used for each general location, or the noise level can be characterized with any degree of precision desired.
Once the user has acknowledged arrival by saying “ready”, the terminal 18 provides the instruction “pick quantity 2 and then say ‘ready’”. When the user says “ready”, the terminal 18 proceeds to the second location (aisle 8, bin 43) and after that provides a similar set of instructions for the third location (aisle 12, bin 77). After the pick is completed, the user may receive a final instruction to deliver all items to a particular location. Once that is accomplished, the user may again say “ready” and a new pick instruction set will be downloaded to the terminal 18 from the server 26.
When one or more noise models are used in the above example, the noise model(s) may define the characteristics of noise present at a particular location in such a manner that when the terminal 18 receives an audio signal the signal is first checked against the noise model. If there is a match to the noise model, the audio is presumed invalid and is marked as noise and not interpreted as spoken words.
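One minimal way to realize such a noise-model check is to compare a band-energy profile of the incoming audio against the model and discard the audio on a close match; the cosine-similarity measure, the band-energy representation, and the threshold below are assumptions for illustration:

```python
import math

def spectral_similarity(a, b):
    """Cosine similarity between two band-energy profiles (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def is_noise(audio_profile, noise_model, threshold=0.95):
    """Treat the audio as noise (to be discarded, not interpreted as words)
    if it closely matches the location's noise model."""
    return spectral_similarity(audio_profile, noise_model) >= threshold

machinery  = [0.9, 0.8, 0.1, 0.05]  # assumed band energies of local machinery
speechlike = [0.1, 0.3, 0.9, 0.7]   # assumed band energies of speech
```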
While ideally, every possible location might be characterized, in practice it may be that significant benefit is only obtained by characterizing a few of the noisier locations in a particular environment.
With reference to
It is noted that the changes that can be made to adapt to various locations are not limited to gain settings or a single noise model. Gain, frequency response, compression settings, and noise models are among the audio processing characteristics that can be manipulated in accord with the present teachings. Moreover, even within a noise model, individual variables might be adjusted rather than replacing the complete model (though that could be considered equivalent). Even within the recognizer's search algorithms, the weighting of noise models relative to the other words being matched can be adjusted so that a noise model is more likely to be accepted as a “match” to the audio overall, even without changing anything about the noise model itself.
The word “terminal” as used herein can be interpreted as a wireless headset that is connected to a processor that is not portable. In such an example, the microphone moves around with the user, but the processor (recognizer or dialog engine) does not. In other embodiments, a similar system could be implemented without WiFi, in which case the user plugs in a headset at the start to get the information, completes the route, and then plugs in afterwards to upload the results of the route. Many variations will occur to those skilled in the art without departing from the present teachings.
Speech recognition can also be made more reliable by using a limited vocabulary—for example, “start”, “ready”, “complete”, “back” and numerical digits 0-9 and perhaps a few additional commands.
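A sketch of such a constrained vocabulary, assuming a recognizer whose hypotheses can be filtered against an allowed set of commands:

```python
# Hypothetical command set: a few control words plus the digits 0-9, as
# suggested above. Restricting the recognizer to this set makes confusions
# far less likely than with an open vocabulary.
VOCABULARY = {"start", "ready", "complete", "back"} | {str(d) for d in range(10)}

def accept(hypothesis):
    """Accept a recognition hypothesis only if it is an allowed command."""
    return hypothesis.lower() in VOCABULARY
```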
Since the speech recognition is desirably both highly reliable and carried out at the portable terminal 18, a limited vocabulary and individualized training are desirable, but should not be considered limiting on the present teachings.
If the system is individualized to each user, the user's particular speech recognition template that characterizes the user's speech can be downloaded at 110 from the server to the portable device. After this, a set of pick instructions (or other instructions containing locations and audio settings relating to each or certain of the locations) is downloaded at 114. The pick operation (for example) can then begin, starting with the first destination at 118. At 122, the portable terminal generates a speech (and/or displayed) instruction that directs the user to go to this destination. The user can then proceed to the designated location and confirm arrival at that location at 126.
Now that the terminal 18 knows that it has arrived at the designated location, terminal 18 can load and apply gain and noise model information from the pick instructions at 130 for use until the user moves to a new location. The portable terminal and the user can now proceed with a dialog in which the terminal 18 conveys instructions to the user 14 at 134 in the form of speech and/or displayed text telling the user 14 what operation(s) to carry out at the location, and in which the user provides confirmations that are used to confirm completion of actions at 138. The sequence of 134 and 138 may be repeated as the user proceeds through the dialog.
Once the user has completed the pick operation and has acknowledged such (e.g., by saying “ready”), the portable terminal determines if the destination is the last in the pick instruction at 142. If not, the process proceeds to the next destination at 146 and control passes back to 122. But, if the last destination has been processed at 142, the terminal generates instructions to return to a designated area (e.g., a shipping department) with the picked items at 150. Further instructions may be generated as required for a particular setting. At this point, the user may also receive a new set of pick instructions at 154 and the process begins again for the new pick instructions starting at 114. Many variations will occur to those skilled in the art upon consideration of the present teachings.
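The loop just described (reference numerals 118 through 150) can be sketched as follows; the say, listen, and apply_audio callables are hypothetical stand-ins for the terminal's speech-synthesis, speech-recognition, and audio-configuration functions:

```python
def run_pick(instructions, say, listen, apply_audio):
    """Walk the user through each destination in a pick-instruction set."""
    picked = []
    for step in instructions:                        # 118/146: each destination
        say(f"go to aisle {step['aisle']}, bin {step['bin']} "
            "and then say 'ready'")                  # 122: direct the user
        while listen() != "ready":                   # 126: wait for arrival
            pass
        apply_audio(step.get("audio", {}))           # 130: load gain/noise model
        say(f"pick quantity {step['quantity']} and then say 'ready'")  # 134
        while listen() != "ready":                   # 138: confirm completion
            pass
        picked.append((step["aisle"], step["bin"], step["quantity"]))
    say("return picked items to the shipping department")  # 150: final prompt
    return picked
```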
Turning now to
In the present examples, gain and noise model processing has been described, but other audio processing could also be implemented and adjusted by location (e.g., equalization, amplitude compression, filtering, etc.) without limitation.
Referring to
In carrying out this process, the currently received audio can be used locally at the portable terminal 18 in the processing, by calculating a gain and noise model, for example, based on the currently received audio. This calculated gain and noise model can be used if significantly different from those stored for the particular location in certain embodiments. In other embodiments, the actual gain may be the average of the calculated gain and that received with the pick instruction. The noise model used may similarly be a combination of the noise model saved with the pick instructions and the currently calculated noise model. Many variations will occur to those skilled in the art upon consideration of the present teachings.
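One way to sketch this blending: use the locally measured gain when it diverges sharply from the stored value, and average the two otherwise. The threshold and the equal weighting are assumptions, not part of the described system:

```python
def effective_gain(stored_db, measured_db, threshold_db=6.0):
    """Blend the gain stored in the pick instructions with the gain
    calculated from the currently received audio. If the measurement
    differs significantly from the stored value, trust the measurement;
    otherwise use the average of the two."""
    if abs(measured_db - stored_db) > threshold_db:
        return measured_db
    return (stored_db + measured_db) / 2.0
```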
The functions discussed above are carried out by processor 308 utilizing programming stored in the memory 312 and 316. In this example, particular functional modules depicted in RAM 316 represent the various functions discussed. Operating system 350 carries out the functions normally associated with an operating system (e.g., Linux or Android). The speech recognition module 354 carries out speech processing to convert speech received via the microphone 34 to a message understood by the terminal 18. The speech synthesis module 358 generates synthesized speech that is conveyed to the user via headset 30. The user template 362 provides information that is used by the speech recognition module 354 to improve the accuracy of recognition of speech by a particular user. Pick instructions are stored as data at 366; as described, they are parsed to generate speech and to load the various audio processing parameters used by audio processing module 370, in conjunction with other audio circuits such as 374, to set gain, noise model, etc. The speech recognition module may be implemented as a hardware module or as a processor utilizing speech recognition processes defined by 354. Many variations are possible without departing from the present teachings.
In the present embodiments, the location information is first provided by the server in the pick instructions and confirmed by the user upon arrival at the location. In certain embodiments, the location information can also be provided by or supplemented by GPS data using a GPS receiver forming a part of the terminal 18 (not shown) or other position determination mechanisms without limitation. The GPS information can be used to enhance the accuracy of the user's location or can be used independently without limitation.
In accord with certain embodiments, each location may not have to be characterized for audio parameters. The audio parameters may be represented as deviations from a normal setting (e.g., instructions on how much gain to add or subtract from normal), and the normal setting (e.g., gain) may be suitable for a wide variety of inputs.
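Representing the audio parameters as deviations from a normal setting might look like the following sketch, where only the characterized (noisier) locations carry explicit offsets; names and values are illustrative:

```python
# Hypothetical normal gain and per-location deviations. Locations that have
# not been characterized fall back to the normal setting, so most of the
# facility needs no explicit entry.
NORMAL_GAIN_DB = 0.0
gain_offsets = {"aisle 7": -6.0, "aisle 12": -3.0}

def gain_for(location):
    """Normal gain plus any stored deviation for this location."""
    return NORMAL_GAIN_DB + gain_offsets.get(location, 0.0)
```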
In certain example embodiments, the usual workflow is that the terminal prompts “go to aisle X and then say ready” and then “go to bin Y and then say ready”. The audio characteristics are applied according to one example at the location after the user confirms that location. However, variations can be implemented without departing from the present teachings.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
While the present discussion uses example embodiments shown as flow charts, hardware equivalents are also possible. Also, the order of certain operations of the flow charts may be modified without departing from the present teachings.
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
Number | Name | Date | Kind |
---|---|---|---|
6088669 | Maes | Jul 2000 | A |
6832725 | Gardiner et al. | Dec 2004 | B2 |
7128266 | Zhu et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
7864969 | Ma et al. | Jan 2011 | B1 |
8005680 | Kommer | Aug 2011 | B2 |
8285344 | Kahn et al. | Oct 2012 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Liu | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8763909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8862146 | Shatsky et al. | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber et al. | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9224022 | Ackley et al. | Dec 2015 | B2 |
9224027 | Van Horn et al. | Dec 2015 | B2 |
D747321 | London et al. | Jan 2016 | S |
9230140 | Ackley | Jan 2016 | B1 |
9250712 | Todeschini | Feb 2016 | B1 |
9258033 | Showering | Feb 2016 | B2 |
9262633 | Todeschini et al. | Feb 2016 | B1 |
9310609 | Rueblinger et al. | Apr 2016 | B2 |
D757009 | Oberpriller et al. | May 2016 | S |
9342724 | McCloskey | May 2016 | B2 |
9375945 | Bowles | Jun 2016 | B1 |
D760719 | Zhou et al. | Jul 2016 | S |
9390596 | Todeschini | Jul 2016 | B1 |
D762604 | Fitch et al. | Aug 2016 | S |
D762647 | Fitch et al. | Aug 2016 | S |
9412242 | Van Horn et al. | Aug 2016 | B2 |
D766244 | Zhou et al. | Sep 2016 | S |
9443123 | Hejl | Sep 2016 | B2 |
9443222 | Singel et al. | Sep 2016 | B2 |
9478113 | Xie et al. | Oct 2016 | B2 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20100306711 | Kahn et al. | Dec 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20110257974 | Kristjansson | Oct 2011 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedrao | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140100813 | Showering | Jan 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131438 | Kearney | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131445 | Ding et al. | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140133525 | Desclos et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Lui et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140278392 | Ramabadran et al. | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140303970 | Bell et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van Horn et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071819 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chen et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150220753 | Zhu et al. | Aug 2015 | A1 |
20150254485 | Feng et al. | Sep 2015 | A1 |
20150317998 | Lee | Nov 2015 | A1 |
20150327012 | Bian et al. | Nov 2015 | A1 |
20160014251 | Hejl | Jan 2016 | A1 |
20160040982 | Li et al. | Feb 2016 | A1 |
20160042241 | Todeschini | Feb 2016 | A1 |
20160057230 | Todeschini et al. | Feb 2016 | A1 |
20160109219 | Ackley et al. | Apr 2016 | A1 |
20160109220 | Laffargue | Apr 2016 | A1 |
20160109224 | Thuries et al. | Apr 2016 | A1 |
20160112631 | Ackley et al. | Apr 2016 | A1 |
20160112643 | Laffargue et al. | Apr 2016 | A1 |
20160124516 | Schoon et al. | May 2016 | A1 |
20160125217 | Todeschini | May 2016 | A1 |
20160125342 | Miller et al. | May 2016 | A1 |
20160125876 | Schroeter et al. | May 2016 | A1 |
20160133253 | Braho et al. | May 2016 | A1 |
20160171720 | Todeschini | Jun 2016 | A1 |
20160178479 | Goldsmith | Jun 2016 | A1 |
20160180678 | Ackley et al. | Jun 2016 | A1 |
20160189087 | Morton et al. | Jun 2016 | A1 |
20160125873 | Braho et al. | Jul 2016 | A1 |
20160227912 | Oberpriller et al. | Aug 2016 | A1 |
20160232891 | Pecorari | Aug 2016 | A1 |
20160292477 | Bidwell | Oct 2016 | A1 |
20160294779 | Yeakley et al. | Oct 2016 | A1 |
20160306769 | Kohtz et al. | Oct 2016 | A1 |
20160314276 | Sewell et al. | Oct 2016 | A1 |
20160314294 | Kubler et al. | Oct 2016 | A1 |
20160316293 | Klimanis | Oct 2016 | A1 |
Number | Date | Country |
---|---|---|
104599677 | May 2015 | CN |
2013163789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
2014133525 | Sep 2014 | WO |
2014143491 | Sep 2014 | WO |
Entry |
---|
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch For a Mobile Electronic Device filed Jun. 16, 2015 (Bamdringa); 38 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned. |
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
Extended Search Report in related European Application No. 17191991.3 dated Mar. 16, 2018, pp. 1-8 [U.S. Patent Publication No. 2011/0257974 previously cited.]. |
Number | Date | Country | |
---|---|---|---|
20180090134 A1 | Mar 2018 | US |