Systems and methods for managing dated products

Information

  • Patent Grant
  • Patent Number: 10,896,403
  • Date Filed: Monday, July 18, 2016
  • Date Issued: Tuesday, January 19, 2021
Abstract
A mobile device can receive information from a computer; at least partially in response to the receiving of the information from the computer, the mobile device can communicate with a worker regarding a product type; the mobile device can receive information from the worker via voice, wherein the information received from the worker can identify a date or other product information associated with a product of the product type; and the mobile device can provide the product information to the computer. The mobile device can provide a second voice prompt to the worker, wherein the second voice prompt can request a quantity of products of the product type that are marked with the product information; then the mobile device can receive a quantity from the worker via voice; and the mobile device can report to the computer.
Description
FIELD OF THE INVENTION

The present invention generally relates to managing products by associated product information and, more particularly, to automated systems and methods for use in managing dated products.


BACKGROUND

It is typically desirable in retail and wholesale settings to properly manage dated products (e.g., products with freshness, expiration or sale-by dates, or the like) in order to avoid financial losses associated with unsold products being out-of-date. For example, it is typically desirable to promote the sale of products that are approaching their sale-by dates by making such products more visible or offering discounts on them. As another example, it is typically undesirable for out-of-date products to be unintentionally sold to customers. However, properly managing dated products in retail or other settings can be very time consuming.


Therefore, there is a desire for improved systems and methods for managing dated products.


SUMMARY

In one aspect, the present invention embraces a method of managing products by associated dates, the method comprising: providing information from a computer to a headset; providing, by the headset and at least partially in response to the providing of the information from the computer, a first voice prompt to a worker requesting a date associated with at least one product of a product type; receiving, by the headset, a date from the worker via voice; providing, by the headset and in response to the receiving of the date, a second voice prompt to the worker requesting a quantity of products of the product type that are marked with the date; receiving, by the headset, a quantity from the worker via voice; and providing, by the headset, information indicative of the date and the quantity to the computer.


In one embodiment, the method comprises determining, by the computer, whether the date associated with the at least one product is valid.


In one embodiment, the date is a first date, and the method comprises: providing, by the headset, a third voice prompt to the worker requesting a second date associated with at least one product of the product type; then receiving, by the headset, a second date from the worker via voice; providing, by the headset in response to the headset receiving the second date, a fourth voice prompt to the worker requesting a quantity of products of the product type that are marked with the second date; and receiving, by the headset, a quantity from the worker via voice.


In one embodiment, the product type is a first product type, and the method comprises: receiving, by the headset, an indication that there are no more products of the first product type from the worker via voice; providing, by the headset, a fifth voice prompt to the worker requesting a date associated with at least one product of a second product type; then receiving, by the headset, a third date from the worker via voice; providing, by the headset in response to the headset receiving the third date, a sixth voice prompt to the worker requesting a quantity of products of the second product type that are marked with the third date; and then receiving, by the headset, a quantity from the worker via voice.


In one aspect, the present invention embraces a method of managing products by associated dates, the method comprising: identifying, by a mobile device, a product type to a worker; receiving, by the mobile device, a date associated with at least one product of the product type from the worker via voice; and providing, by the mobile device, information indicative of the date associated with the at least one product to a computer.


In an embodiment, the mobile device comprises a headset.


In an embodiment, the method comprises identifying, by the mobile device, the product type to the worker via voice.


In an embodiment, the method comprises determining, by the computer, whether the date associated with the at least one product is valid.


In an embodiment, the method comprises requesting, by the mobile device, a quantity of products of the product type that are marked with the date associated with the at least one product.


In an embodiment, the product type is a first product type, and the method comprises: identifying, by the mobile device, the first product type to the worker via a first voice prompt; providing, by the mobile device and in response to the receiving of the date, a second voice prompt to the worker requesting a quantity of products of the first product type that are marked with the date; receiving, by the mobile device, a quantity from the worker via voice; identifying, by the mobile device, a second product type to the worker via a third voice prompt; receiving, by the mobile device, a date associated with at least one product of the second product type from the worker via voice; and requesting, by the mobile device, a quantity of products of the second product type that are marked with the date associated with the at least one product of the second product type via a fourth voice prompt.


In an embodiment, the date is a first date, and the method comprises: providing, by the mobile device and in response to the receiving of the date, a second voice prompt to the worker requesting a quantity of products of the first product type that are marked with the date; providing, by the mobile device, a third voice prompt to the worker requesting a second date associated with at least one product of the product type; then receiving, by the mobile device, a second date from the worker via voice; providing, by the mobile device in response to the mobile device receiving the second date, a fourth voice prompt to the worker requesting a quantity of products of the product type that are marked with the second date; and receiving, by the mobile device, a quantity from the worker via voice.


In an embodiment, the product type is a first product type, and the method comprises: the mobile device receiving from the worker via voice an indication that there are no more products of the first product type; the mobile device providing a fifth voice prompt to the worker, the fifth voice prompt requesting a date associated with at least one product of a second product type; then the mobile device receiving a third date from the worker via voice; in response to the mobile device receiving the third date, the mobile device providing a sixth voice prompt to the worker, the sixth voice prompt requesting a quantity of products of the second product type that are marked with the third date; and then the mobile device receiving a quantity from the worker via voice.


In one aspect, the present invention embraces a method of managing products by associated product information, the method comprising: identifying, by a mobile device, a product type to a worker; receiving, by the mobile device, product information associated with at least one product of the product type from the worker via voice; and providing, by the mobile device, information indicative of the product information associated with the at least one product to a computer.


In an embodiment, the mobile device comprises a headset.


In an embodiment, the method comprises identifying, by the mobile device, the product type to the worker via voice.


In an embodiment, the method comprises determining, by the computer, whether the product information is valid.


In an embodiment, the product information is a date associated with the at least one product.


In an embodiment, the date is a first date, and the method comprises: providing, by the mobile device and in response to the receiving of the date, a second voice prompt to the worker requesting a quantity of products of the first product type that are marked with the date; providing, by the mobile device, a third voice prompt to the worker requesting a second date associated with at least one product of the product type; then receiving, by the mobile device, a second date from the worker via voice; providing, by the mobile device in response to the mobile device receiving the second date, a fourth voice prompt to the worker requesting a quantity of products of the product type that are marked with the second date; and receiving, by the mobile device, a quantity from the worker via voice.


In an embodiment, the method comprises requesting, by the mobile device, a quantity of products of the product type that are marked with the product information.


In an embodiment, the product type is a first product type, and the method comprises: identifying, by the mobile device, a second product type to the worker; receiving, by the mobile device, product information associated with at least one product of the second product type from the worker via voice; and requesting, by the mobile device, a quantity of products of the second product type that are marked with the product information associated with the at least one product of the second product type.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a representative headset assembly, in accordance with an embodiment of this disclosure.



FIG. 2 is a diagram of a system that includes the headset assembly of FIG. 1, in accordance with an embodiment.



FIGS. 3A and 3B illustrate a flow diagram of a method performed by the system of FIG. 2 for managing products by associated product information (e.g., managing dates associated with products), in accordance with an embodiment.



FIG. 4 is a diagram of an electronics module of the headset assembly of FIG. 1, in accordance with an embodiment.



FIG. 5 is a diagram of a computer of the system of FIG. 2, in accordance with an embodiment.





DETAILED DESCRIPTION

The present disclosure is generally directed to an automated product management system, wherein in one embodiment the system is configured to assist a worker in handling dated products by engaging in a verbal dialog with the worker. In one example, the worker can engage products with both of his or her hands while simultaneously fully participating in the verbal dialog with the product management system, which can enhance the productivity of the worker. More generally and in one example, the present disclosure embraces a method of managing products by associated product information. As one example, the product information can be dates respectively associated with the products, as discussed in greater detail below.


In an embodiment of this disclosure, the product management system can include one or more mobile devices that can be in the form of headset assemblies. Each of the mobile devices, or headset assemblies, can comprise a wireless-enabled voice recognition device that is configured to be used in a hands-free manner. Alternatively, the mobile devices can be manually carried or mounted to a movable piece of equipment, such as a cart being used by a worker.


In FIG. 1, an example of a mobile device in the form of a headset assembly 100 is shown as including an electronics module 110 and a headset 115. Whereas the mobile device described in this detailed description is frequently referred to as the headset assembly 100, a variety of different types of suitable mobile devices are within the scope of this disclosure, such as smartphones, smartwatches or other suitable devices.


In the embodiment shown in FIG. 1, the headset 115 includes a headband 117 for securing the headset to the worker's head. Alternatively, the headband 117 or other suitable fastening or mounting features can be configured to fit in an ear, over an ear, or otherwise be designed to support the headset 115. The headset 115 can further include at least one speaker 120, and one or more microphones 125, 126. For example, the main microphone 125 can be configured for being proximate to the worker's mouth, for converting voice sounds from the worker into an electrical signal. In contrast, the optional secondary microphone 126 can be configured for being distant from the worker's mouth, for use in receiving and cancelling out environmental sounds to enhance voice recognition associated with the main microphone 125.


The electronics module 110 can contain or otherwise carry several components of the headset assembly 100 to reduce the weight and/or size of the headset 115. In some embodiments, the electronics module 110 can include one or more of a rechargeable or long-life battery, keypad, Bluetooth® antenna, printed circuit board assembly (PCBA), and any other suitable electronics, or the like. The electronics module 110 can be mounted to a worker's torso (e.g., via a lapel clip and/or lanyard) or in any other suitable location. The headset 115 can be connected to the electronics module 110 via a communication link, such as a small audio cable 130 or a wireless link. The headset assembly 100 can be used to support multiple workflows in multiple markets, including grocery retail, direct store delivery, wholesale, etc. In some embodiments, the headset assembly 100 has a low profile that seeks not to be intimidating to a customer in a retail setting. That is, the headset 115 can be relatively minimalistic in appearance in some embodiments, or alternatively the headset 115 can have a larger profile in other embodiments. The electronics module 110 can be used with a wide variety of differently configured headsets, such as Vocollect™ headsets.


The electronics module 110 can be configured to read a unique identifier (I.D.) of the headset 115. The headset I.D. can be stored in an electronic circuitry package that is part of the headset 115, and the headset electronic circuitry package can be configured to at least partially provide the connection (e.g., communication path(s)) between the electronics module 110 and headset features (e.g., the one or more speakers 120 and microphones 125, 126). In one embodiment, the audio cable 130 includes multiple conductors or communication lines for signals which can include a speaker +, speaker −, main microphone, secondary microphone, and grounds. The electronics module 110 can utilize a user-configurable attachment feature 140, such as a plastic loop and/or other suitable features, for at least partially facilitating attachment of the electronics module to the worker. When a wireless link between the headset 115 and electronics module 110 is used, such as a Bluetooth type of communication link, the headset 115 can include a small lightweight battery. The wireless communication link can provide wireless signals suitable for exchanging voice communications. In an embodiment (not shown), the electronics module 110 can be integrated into the headset 115 rather than being remote from, and connected to, the headset 115. Accordingly, the mobile device, which may more specifically be in the form of the headset assembly 100, or the like, may include multiple pieces with separate housings or can be substantially contained in, or otherwise be associated with, a single housing.


In the embodiment shown in FIG. 2, the headset assembly 100 is part of a distributed product management system 200, or the like, configured for providing communications with a worker. The worker can be wearing the headset 115 on her or his head so that the speaker 120 is proximate to the worker's ear, and the microphone 125 is proximate to the worker's mouth. As shown in FIG. 2, the system 200 further includes a terminal, server computer 500, or the like, connected to the electronics module 110 via one or more communication paths 215 that can comprise a wireless link 215, such as a Bluetooth® connection. The computer 500 can be one or more computers, such as a series of computers connected to one another in wired and/or wireless manner over a network, such as WLAN, to form a distributed computer system. The computer 500 can comprise a retail store computer having applications and data for managing operations of the retail store (e.g., an enterprise system, such as a retail management system, inventory management system or the like), including inventory control and other functions, such as point of sale functions.


In an embodiment, the computer 500 is configured to simultaneously interface with multiple of the headset assemblies 100, and thereby the workers respectively associated with the headset assemblies, to simultaneously provide one or more work tasks or workflows that can be related to the products or other items handled by the workers in a workplace (e.g., a retail store). The computer 500 can be located at one facility (e.g., the retail store) or be distributed at geographically distinct facilities. Furthermore, the computer 500 may include a proxy server. Therefore, the computer 500 is not limited in scope to a specific configuration. For example, and alternatively, each of the headset assemblies 100 can substantially be a stand-alone device, such that the computer 500 or suitable features thereof are part of the headset assembly. Usually, however, to have sufficient database capability to simultaneously handle large amounts of information that can be associated with multiple headset assemblies 100 being operated simultaneously, the computer 500 typically comprises a server computer configured to simultaneously interface with multiple of the headset assemblies.


In an embodiment, voice templates can be stored in the computer 500 to recognize worker voice interactions and convert the interaction into text-based data and commands for interaction with at least one software application or module being executed on at least one processor or processing unit of the computer 500. The functions ascribed to individual elements of the system 200 can be performed in one or more other locations in further embodiments. For example, computer 500 can perform voice recognition in one embodiment, or the electronics module 110 can perform voice recognition utilizing the voice templates. In one embodiment, the first stages of voice recognition can be performed on the electronics module 110, with further stages performed on the computer 500. In further embodiments, raw audio can be transmitted from the electronics module 110 to the computer 500 where the final stages of voice recognition are completed.
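For illustration only (and not as part of the claimed subject matter), the staged voice-recognition options described above can be sketched as a simple pipeline. The function names and the front-end/back-end split are illustrative assumptions, not APIs named in this disclosure.

```python
def recognize(audio_frames, front_end, back_end=None):
    """Run recognition on the electronics module alone, or split it
    between the module and the server computer.

    `front_end` models the stage executed on the electronics module
    (e.g., first stages of recognition, or full decoding); `back_end`,
    when provided, models the final stages completed on the computer
    500. Both callables are hypothetical stand-ins.
    """
    intermediate = front_end(audio_frames)
    if back_end is None:
        # The electronics module performs all recognition locally.
        return intermediate
    # Partial results (or raw audio features) are finished on the server.
    return back_end(intermediate)
```

The same sketch covers all three arrangements described above: full recognition on the computer 500, full recognition on the electronics module 110, or a split between the two.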



FIGS. 3A and 3B illustrate a flow or block diagram of a method 300 of operations performed by the system 200, such as in response to middleware being executed on the computer 500, in accordance with an embodiment. The method 300 is illustrative of how the system 200 can interact with a representative worker to facilitate management of products such as, but not limited to, dated products. Each block of the method 300 may be schematically indicative of at least one step and/or associated element(s) of the system 200. In the following, as an example and not for the purpose of limiting the scope of this disclosure, the method 300 is at times described in the context of a retail establishment containing different types of products ("product types") (e.g., bread, dairy products, vegetables, beverages, etc.), wherein the products can be in the form of packages that are each individually marked with product information that may comprise dates (e.g., products with freshness, expiration or sale-by dates, or the like). As another example, the dated products can be supported on retail store shelves or endcaps, contained in bins, or be in any other suitable conventional configuration, such as by being in warehouse or other suitable settings. For example and not for the purpose of limiting the scope of this disclosure, one of the product types can be verbally identified as "gallon cartons of a predetermined brand of whole milk," and there can be numerous products or cartons of that product type, with each of the cartons being marked with a sale-by date, and some of the cartons having different sale-by dates as compared to one another. Accordingly and as another example, a product type can be verbally identified by its brand name, size and sometimes also by one or more of a variety, type or style of the product (e.g., Green Giant® Asiago Parmesan Risotto Vegetables, 11 oz.) and/or by any other suitable, unique product identifier such as a Universal Product Code (UPC), a portion thereof, or the like.


In one embodiment, the method 300 can be performed (e.g., looped through) in serial fashion for each product type of a plurality of product types, wherein the products of the different product types can be in the form of packages that are each individually marked with, or otherwise associated with, dates, such as discussed above. As a precursor to, or an early part of, the method 300, the system 200 may serially identify product types to be subjected to the method 300. The system 200 may identify such a product type solely for the purposes of fulfilling the workflow of method 300 for the product type, or the product type can be identified for additional workflow purposes. For example, the method 300 can be performed substantially simultaneously with one or more other workflows for the subject product type, such as restocking, facing/blocking, price relabeling and/or any other suitable workflows, or the like.
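For illustration only, the serial, per-product-type looping described above can be sketched as follows. The function names are hypothetical stand-ins and are not part of this disclosure.

```python
def run_workflow_serially(product_types, perform_method_300):
    """Loop through method 300 once per product type, in serial fashion.

    `perform_method_300` stands in for the entire per-type dialog
    (blocks 302 through 326) and returns whatever the enterprise
    system records for that product type; both names are illustrative
    assumptions.
    """
    results = {}
    for product_type in product_types:
        results[product_type] = perform_method_300(product_type)
    return results
```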


In the method 300, the provision of each of the numerous verbal prompts (e.g., at blocks 302, 306, 310, 312, 320 and/or 322) can comprise the speaker 120 converting or transforming an audio signal, which is provided by respective features of the system 200, to a voice-sound for being received by the worker; and the receipt of each of the verbal responses (e.g., as a precursor to blocks 304, 308, 314 and/or 324) can comprise the microphone 125 converting or transforming a voice sound, which is provided by the worker, to an electrical signal that is provided to respective features of the system 200. The mobile device or headset assembly 100 can provide the verbal prompts substantially in response to, or at least partially in response to, receiving information from the computer 500. For example, the headset assembly 100 can provide the verbal prompts in real time/immediately in response to receiving information from the computer 500 and/or there may be a brief or any suitable time lag, or queuing, associated with the headset assembly 100 providing the verbal prompts. Alternatively, one or more of the verbal prompts and/or verbal responses can be replaced by or substituted with other responses, such as nonverbal (e.g., visual) prompts and/or nonverbal (e.g., typed and/or scanned) responses, or the like.


In one embodiment, the method 300 can optionally begin at block 302. At block 302, at least one verbal location prompt can be provided by the system 200 to the worker, and this verbal prompt can comprise information about a location at which a first product type of a plurality of product types is located. As an example, the verbal prompt of block 302 can include information about an aisle, shelf and/or any other suitable location at which the first product type is located. After the provision of the verbal prompt at block 302, processing control can be transferred to block 304. Associated with or as a precursor to block 304, the system 200 may receive a verbal response from the worker, and the verbal response can be a verbal location verification, or the like, comprising one or more of the words “ready,” “okay,” and/or any other suitable verbal response for indicating that the worker is proximate the location indicated at block 302 for the first product type. Block 304 can be configured to be operative so that, in response to the system 200 not receiving an appropriate verbal response to the verbal prompt of block 302, such as within a predetermined timeframe, processing control is returned to block 302. In response to the system 200 receiving an appropriate verbal response to the verbal prompt of block 302, such as within a predetermined timeframe, processing control can be transferred from block 304 to block 306.
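For illustration only, the prompt-and-confirm behavior of blocks 302 and 304 (repeat the location prompt until an appropriate verbal response arrives within the predetermined timeframe) can be sketched as follows. The `speak` and `listen` callables are hypothetical stand-ins for the headset's text-to-speech and speech-recognition interfaces; none of these names appear in this disclosure.

```python
def prompt_until_confirmed(prompt_text, listen, speak,
                           timeout_s=10.0, accepted=("ready", "okay")):
    """Repeat a voice prompt until an accepted confirmation is heard.

    `speak` converts prompt text to voice-sound for the worker;
    `listen` returns the recognized utterance, or None if no
    appropriate response arrives within `timeout_s` seconds. On a
    timeout (None), processing control returns to the prompt, as with
    blocks 302 and 304.
    """
    while True:
        speak(prompt_text)                 # e.g., block 302
        response = listen(timeout_s)       # e.g., block 304
        if response is not None and response.lower() in accepted:
            return response
```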


Generally described, for blocks 306 and/or 308, or the like, the system 200 can be configured to participate in a verbal dialog with the worker so that both the system 200 and the worker are directing their attention to the same product type. For example, in an alternative embodiment, the worker may originally identify the first product type and verbally or otherwise provide information for the first product type to the system 200, and in response the system 200 may provide some sort of confirmation regarding the first product type, such as by providing a verbal verification comprising identifying information for the first product type and/or one or more of the words "ready," "okay," and/or any other suitable response for indicating that both the system 200 and the worker are directing their attention to the same product type.


More specifically and in accordance with the embodiment shown in FIGS. 3A and 3B, at block 306 at least one verbal product prompt can be provided by the system 200 to the worker, and this verbal prompt can comprise a verbal identification of the first product type. Examples of suitable verbal identifications of product types are discussed above. After the provision of the verbal prompt at block 306, processing control can be transferred to block 308. Associated with or as a precursor to block 308, the system 200 may receive a verbal response from the worker. In response to the system 200 receiving an appropriate verbal response to the verbal prompt of block 306, such as within a predetermined timeframe, processing control can be transferred from block 308 to a respective one of blocks 310 and 312.


At least partially reiterating from above, an appropriate verbal response associated with block 308 can be a verbal response from the worker indicating that the worker has at least identified the location where the first product type is supposed to be located. For example, an appropriate verbal response from the worker for block 308 can be that there are no products of the first product type present in the location identified at block 302. In that case, processing control can be transferred from block 308 to block 310. At block 310, an enterprise or other suitable system associated with the computer 500 can be notified about the lack of products of the first product type, the system 200 can verbally prompt the worker to restock the first product type or take other corrective action, and/or the method 300 may end for the first product type.


As another example and at least partially reiterating from above, an appropriate verbal response associated with block 308 can be a verbal response from the worker comprising one or more of the words "ready," "okay," and/or any other suitable verbal verification for indicating that the worker has identified one or more products of the first product type at the location identified at block 302. For example, associated with or as a precursor to block 308, the system 200 may receive a verbal response from the worker, and the verbal response can be in the form of a confirmation comprising verbal identification of the first product type. As discussed above, examples of suitable verbal identifications of product types can include brand name, size, variety and/or other suitable product identifiers such as a Universal Product Code (UPC), a portion thereof, or the like. Block 308 can be configured to be operative so that, in response to the system 200 not receiving an appropriate verbal response to the verbal prompt of block 306, such as within a predetermined timeframe, processing control is transferred back to block 304. In response to the system 200 receiving an appropriate verbal response to the verbal prompt of block 306, such as within a predetermined timeframe, processing control can be transferred from block 308 to block 312.


At block 312, at least one verbal date prompt can be provided by the system 200 to the worker, and this verbal prompt can comprise a request for a date (e.g., freshness, expiration or sale-by date, or the like) associated with (e.g., printed, stamped or attached to the packaging of) at least one product of the first product type. After the provision of the verbal prompt at block 312, processing control can be transferred to block 314. Associated with or as a precursor to block 314, the system 200 may receive a verbal response from the worker. Block 314 can be configured to be operative so that, in response to the system 200 not receiving an appropriate verbal response to the verbal prompt of block 312, such as within a predetermined timeframe, processing control is transferred back to block 312. In response to the system 200 receiving an appropriate verbal response to the verbal prompt of block 312, such as within a predetermined timeframe, processing control can be transferred from block 314 to a respective one of blocks 316 and 318.


An appropriate verbal response associated with block 314 can be a verbal response from the worker that is indicative of a date associated with a first product of the first product type. In response to such a response, processing control can be transferred from block 314 to block 316. At block 316, the system 200 may determine whether the date received in association with block 314 is valid. As an example, a date verbally received from a worker in association with block 314 can be valid if that date is not earlier than the present day (e.g., a verbally received date can be valid if it is the present calendar day or a future calendar day).
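For illustration only, the validity check described for block 316 can be sketched as follows. The function name and the ISO date format are illustrative assumptions; in particular, it is assumed that the recognizer has already normalized the worker's utterance into a text string.

```python
from datetime import date, datetime

def is_valid_product_date(spoken_date, today=None):
    """Return True if the recognized date is the present calendar day
    or a future calendar day, per the block 316 validity check.

    `spoken_date` is assumed to be an ISO-format string (e.g.,
    "2021-01-19") produced by the voice-recognition stage; the format
    and the function name are illustrative, not part of the method.
    """
    today = today or date.today()
    try:
        parsed = datetime.strptime(spoken_date, "%Y-%m-%d").date()
    except ValueError:
        return False  # unparseable utterances are treated as invalid
    return parsed >= today
```

A negative result here corresponds to the transfer of processing control to block 320 (outdated-product handling); a positive result corresponds to the transfer to block 322.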


In response to a negative determination being made at block 316, processing control can be transferred to block 320. At block 320, an enterprise system, or the like, associated with the computer 500 can be notified that there are outdated products of the first product type and/or the system 200 can verbally prompt the worker to remove the outdated products of the first product type from their location so that they are no longer available for sale, or the like. Processing control can be transferred from block 320 back to block 312. Alternatively, the step or process of block 320 can follow the step or process of block 322, or other suitable provisions can be made, so that the enterprise system, or the like, associated with the computer 500 can be notified of the quantity of outdated products of the first product type.


In response to a positive determination being made at block 316, processing control can be transferred to block 322. At block 322, at least one verbal quantity prompt can be provided by the system 200 to the worker, and this verbal prompt can comprise a request for a quantity of the products of the first type that are marked with the date most recently received by the system 200 in association with block 314. Processing control can be transferred from block 322 to block 324. Associated with or as a precursor to block 324, the system 200 may receive a verbal response from the worker. Block 324 can be configured to be operative so that, in response to the system 200 not receiving an appropriate verbal response to the verbal prompt of block 322, such as within a predetermined timeframe, processing control is returned to block 322.


An appropriate verbal response associated with block 324 can be a verbal response from the worker that is indicative of product information. At block 324, the verbal response from the worker, which is indicative of product information, can be a quantity (e.g., “one,” “two,” “three” or another suitable whole number) of the first type of products that are marked with the date most recently received by the system 200 in association with block 314. In response to the system 200 receiving such a verbal response, processing control can be transferred to block 326. At block 326, an enterprise system, or the like, associated with the computer 500 can be notified of the quantity of the first type of products that are marked with the date most recently received by the system 200 in association with block 314.


Processing control can be transferred from block 326 to block 312. At the second occurrence of block 312, at least one verbal date prompt can be provided by the system 200 to the worker, and this verbal prompt can comprise a request for another date associated with at least one product of the first product type. The system 200 can be configured and/or workers may be trained so that at the second and subsequent occurrences of block 312 for a product type, dates already processed by the system 200 for the product type are not repeated.


The loop comprising blocks 312, 314, 316, 322, 324 and 326 can be repeated for each differently dated group of products of the first product type until processing control is transferred to block 318. For example, in response to the system 200 receiving a verbal response that is from the worker and is associated with block 314, and the verbal response being indicative of there being no more products of the first type for which dates have not been provided in association with block 314, processing control is transferred to block 318. At block 318, an enterprise system, or the like, associated with the computer 500 can be notified, for example, that the method 300 has been completed for the first type of products, and the method 300 may end for the first product type.
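The loop of blocks 312 through 326 for a single product type can be sketched as below; the `ask` and `notify` callables, the "no more" sentinel handling, the ISO date format, and the notification tuples are illustrative assumptions used only to show the control flow.

```python
from datetime import date

def capture_dates_for_product_type(product, ask, notify, today):
    """One pass of the date-capture loop (blocks 312-326) for a product
    type. `ask(prompt)` returns the worker's verbal response; `notify`
    reports to the enterprise system. The loop exits to block 318 when
    the worker indicates no more dated groups remain."""
    seen = set()
    while True:
        reply = ask(f"date for {product}?")           # blocks 312 / 314
        if reply == "no more":                        # exit to block 318
            notify(("complete", product))
            return
        spoken = date.fromisoformat(reply)
        if spoken in seen:                            # dates already processed
            continue                                  # for this type repeat not
        seen.add(spoken)
        if spoken < today:                            # block 316 -> block 320
            notify(("outdated", product, spoken))
            continue
        qty = int(ask(f"quantity marked {spoken}?"))  # blocks 322 / 324
        notify(("count", product, spoken, qty))       # block 326
```

Under these assumptions, an outdated date produces an "outdated" notification, a valid date produces a quantity prompt and a "count" notification, and "no more" ends the pass for that product type.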


The system 200 can be configured so that after the method 300 has been completed for the first type of products, the method is automatically completed in series for other products of the plurality of dated products, such as a second type of products, and then a third type of products, and so on. The method 300 can be described as being schematically illustrative of one or more software modules that may be executed on the computer 500 and/or headset assembly 100, and such one or more modules may be referred to as a date capture module, or the like. As another example, block 316 can be described as being schematically illustrative of one or more software modules that may be executed on the computer 500 and/or headset assembly 100, and such one or more modules may be referred to as a validity module, or the like. Similarly, one or more other blocks of the method 300 may be characterized as being schematically illustrative of one or more other software modules for being executed on the computer 500 and/or headset assembly 100, or the like. One or more steps or blocks of the method 300 can be omitted or rearranged in a suitable manner, and suitable additional blocks or steps may be incorporated into the method 300.


In accordance with one aspect of this disclosure, the method 300 can schematically represent middleware that accepts a list of products that need to have their expiration dates, or other suitable dates, verified. The list can be a product master list, a separate list, or the like. In one embodiment, the method 300 can schematically represent a date capture workflow that can be interleaved into (e.g., performed substantially simultaneously with) various other workflows, including stocking, so that a worker need not travel through the retail store for the sole purpose of verifying or otherwise managing the product dates. The workflow associated with the method 300 can prompt (e.g., verbally) the worker with a location to go to and a product to verify, and can wait for the worker's confirmation (e.g., verbal confirmation) that they are handling the correct product; upon such product verification, the worker can be prompted (e.g., verbally) for an associated date. Upon valid (e.g., verbal) date entry, the worker can be prompted (e.g., verbally) for the quantity of products matching the entered date. A valid date can be any date that is the present day's date or any future date. Upon quantity entry, the workflow returns to prompt (e.g., verbally) again for a date until the worker says "no more," or the like, and the workflow then moves to the next item or product requiring date capture. All information captured can be sent to and stored in the middleware for processing, such as for integration with point-of-sale data, alert triggering and/or other suitable actions.
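By way of illustration, a captured entry sent to such middleware can be modeled as a simple record, with the middleware storing records and optionally triggering alerts; the record fields, the class names, and the near-expiry alert rule below are illustrative assumptions, not the disclosed processing.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DateCaptureRecord:
    """One captured (product, date, quantity) entry, as sent to the
    middleware for storage and later processing (e.g., point-of-sale
    integration or alert triggering)."""
    product: str
    marked_date: date
    quantity: int

class MiddlewareStore:
    def __init__(self):
        self.records = []

    def accept(self, record, today):
        """Store a captured record and return any alert it triggers;
        the two-day near-expiry rule is purely illustrative."""
        self.records.append(record)
        if (record.marked_date - today).days <= 2:
            return ("near-expiry", record.product)
        return None
```

A record dated two days out would trigger the illustrative near-expiry alert, while a record dated well in the future would simply be stored.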


In an embodiment shown in FIG. 4, the electronics module 110 includes an enclosure, such as a plastic case, with a connector 410 that mates with a complementary mating connector (not shown) on audio cable 130. An internal path 415, or bus, can be used to communicate between multiple components within the electronics module 110. In one embodiment, an input speech pre-processor (ISPP) 420 converts input speech into pre-processed speech feature data. An input speech encoder (ISENC) 425 encodes input speech for transmission to a remote terminal for reconstruction and playback and/or recording. A raw input audio sample packet formatter 430 transmits the raw input audio to a remote terminal, such as computer system 500, using an application-layer protocol that facilitates communications between the computer system and headset 115 as the transport mechanism. For the purposes of the transport mechanism, the formatter 430 can be abstracted to a codec type referred to as Input Audio Sample Data (IASD). An output audio decoder (OADEC) 435 decodes encoded output speech and audio for playback in the headset 115. A raw output audio sample packet reader 440 operates to receive raw audio packets from the remote terminal using the transport mechanism. For the purposes of the transport mechanism, this reader 440 can be abstracted to a codec type referred to as Output Audio Sample Data (OASD). A command processor 445 adjusts the headset 115 hardware (e.g., input hardware gain level) under control of a remote computer or terminal 500. A query processor 450 allows the computer 500 to retrieve information regarding headset operational status and configuration. Path 415 is also coupled to network circuitry 455 to communicate via a wired or wireless protocol with the computer or terminal 500. The ISPP 420, ISENC 425, and raw input audio formatter 430 are sources of communication packets used in the transport mechanism; the OADEC 435 and raw output audio reader 440 are packet sinks. The command and query processors 445, 450 are both packet sinks and packet sources (in general, they generate acknowledgement or response packets).
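The source/sink roles of these components can be summarized as a simple classification; the short component labels below follow the abstractions named in the text (ISPP, ISENC, IASD, OADEC, OASD), while the table layout itself is an illustrative assumption.

```python
# Illustrative summary of the transport-mechanism roles described above:
# the ISPP, ISENC, and raw input (IASD) formatter originate packets; the
# OADEC and raw output (OASD) reader consume them; the command and query
# processors do both, since they emit acknowledgement/response packets.
SOURCES = {"ISPP", "ISENC", "IASD formatter", "command processor", "query processor"}
SINKS = {"OADEC", "OASD reader", "command processor", "query processor"}

def roles(component):
    """Return the transport-mechanism roles of a named component."""
    r = []
    if component in SOURCES:
        r.append("source")
    if component in SINKS:
        r.append("sink")
    return r
```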


In an embodiment shown in FIG. 5, the computer system 500 implements components and methods of the distributed headset 100. Each of the following components may be used in various combinations, in various embodiments. For example, the computer system 500 can include one or more of a processing unit 502, memory 503, removable storage 510, and non-removable storage 512. Although the example computing device is illustrated and described as computer system 500, the computing device can be in different forms in different embodiments. For example, the computing device can also be a laptop, desktop, server, smartphone, tablet, headset, smartwatch, or other computing device including the same or similar elements as illustrated and described with regard to FIG. 5. Devices such as smartphones, tablets, headsets, and smartwatches are generally collectively referred to as mobile devices. Further, although the various data storage elements are illustrated as part of the computer 500, the storage can also or alternatively include cloud-based storage accessible via a network, such as the Internet.


Memory 503 can include volatile memory 514 and non-volatile memory 508. Computer 500 can include, or have access to a computing environment that includes, a variety of computer-readable media, such as volatile memory 514 and non-volatile memory 508, removable storage 510 and non-removable storage 512. Computer storage includes random access memory (RAM), read only memory (ROM), erasable programmable read-only memory (EPROM) & electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD ROM), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of storing computer-readable instructions.


The computer 500 can include or have access to a computing environment that includes input 506, output 504, and at least one communication device or connection 516 (e.g., a transceiver, or the like, for providing a communication connection (e.g., at least partially providing the wireless line 215 (FIG. 2))). Output 504 can include a display device, such as a touchscreen, that also can serve as an input device. The input 506 can include one or more of a touchscreen, touchpad, mouse, keyboard, camera, one or more device-specific buttons, one or more sensors integrated within or coupled via wired or wireless data connections to the computer 500, and other input devices. The computer 500 can operate in a networked environment using a communication connection to connect to one or more remote computers, such as database servers. The remote computer can include a personal computer (PC), server, router, network PC, a peer device or other common network node, or the like. The communication connection can include a Local Area Network (LAN), a Wide Area Network (WAN), cellular, WiFi, Bluetooth, or other networks.


Computer-readable instructions stored on a computer-readable medium are executable by the processing unit 502 of the computer 500. A hard drive, CD-ROM, and RAM are some examples of articles including a non-transitory computer-readable medium such as a storage device. The terms computer-readable medium and storage device do not include carrier waves. For example, a computer program 518 capable of providing a generic technique to perform access control check for data access and/or for doing an operation on one of the servers in a component object model (COM) based system can be included on a CD-ROM and loaded from the CD-ROM to a hard drive. The computer-readable instructions allow computer 500 to provide generic access controls in a COM based computer network system having multiple users and servers.


An aspect of this disclosure is the provision of numerous examples that can be configured in a variety of combinations and subcombinations. In a first example, a method of managing products, such as by associated dates, comprises a mobile device identifying a product type to a worker; the mobile device receiving from the worker, via voice, a date associated with at least one product of the product type; and the mobile device providing, to a computer, information indicative of the date associated with the at least one product.


A second example comprises the first example, wherein the mobile device comprises a headset.


A third example comprises the first example, wherein the mobile device identifies the product type to the worker via voice.


A fourth example comprises the first example and the computer determining whether the date associated with the at least one product is valid.


A fifth example comprises the first example and the mobile device requesting, via voice, a quantity of products of the product type that are marked with the date associated with the at least one product.


A sixth example comprises a system for managing products by associated dates, the system comprising: a mobile device configured to receive information from a worker via voice, and provide information to the worker via voice; and a computer configured to communicate at least indirectly with the mobile device, the computer including a date capture module configured to facilitate collection of dates associated with product types, wherein for each product type the date capture module is configured to: cause the mobile device to verbally communicate with the worker regarding the product type, and receive, from the mobile device, information indicative of a date associated with at least one product of the product type.


A seventh example comprises the sixth example, wherein the mobile device comprises a headset.


An eighth example comprises the sixth example, wherein the computer further comprises a validity module that is configured to determine whether the date associated with the at least one product is valid.


A ninth example comprises the sixth example, wherein: the date capture module is configured to cause the mobile device to provide at least one voice prompt to the worker; and the at least one voice prompt comprises information regarding the product type.


A tenth example comprises the ninth example, wherein the at least one voice prompt comprises a request for the date associated with the at least one product.


An eleventh example comprises the ninth example, wherein: the at least one voice prompt comprises a first voice prompt; and the date capture module is configured to: cause the mobile device to provide a second voice prompt requesting a quantity of products of the product type that are marked with the date associated with the at least one product, and receive, from the mobile device, information indicative of the quantity of products of the product type that are marked with the date associated with the at least one product.


A twelfth example comprises a computer for managing products by associated dates, the computer comprising: a communication device configured to communicate at least indirectly with a mobile device; a processor; and a date capture module configured to be executed by the processor to facilitate collection of dates associated with product types, wherein for each product type the date capture module is configured to: cause the mobile device to verbally communicate with a worker regarding the product type, and receive, from the mobile device, information indicative of a date associated with at least one product of the product type.


A thirteenth example comprises the twelfth example, wherein the date capture module is configured to cause the mobile device to provide information regarding the product type to the worker via voice.


A fourteenth example comprises the twelfth example, wherein the computer further comprises a validity module configured to be executed by the processor to determine whether the date associated with the at least one product is valid.


  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGUMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES CROSS-REFERENCE TO RELATED APPLICATIONS filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method of managing products by associated product information, the method comprising: transforming, by a mobile device, a first electrical signal corresponding to a first voice prompt to a worker into a first sound signal to identify a product type to the worker; transforming, by the mobile device, a second sound signal, received from the worker in response to the first sound signal, into a second electrical signal indicative of a confirmation of the product type; transforming, by the mobile device, a third sound signal into a third electrical signal, wherein the third sound signal, received from the worker in response to the confirmation of the product type, is indicative of product information associated with at least one product of the confirmed product type; providing, by the mobile device, information indicative of a date associated with the at least one product, to a computer; determining, by the computer, the date associated with the at least one product of the confirmed product type is valid based on determining that the date is equal to or later than a current date; in response to determining that the date associated with the at least one product of the confirmed product type is valid, transforming, by the mobile device, a fourth electrical signal corresponding to a second voice prompt to the worker into a fourth sound signal to request a quantity of products of the product type that are marked with the validated date; and in response to determining that the date associated with the at least one product of the confirmed product type is invalid based on determining that the date is earlier than the current date, generating a notification indicating that the at least one product of the confirmed product type is expired.
  • 2. The method of claim 1, wherein the mobile device comprises a headset.
  • 3. The method of claim 1, wherein the product information is the date associated with the at least one product.
  • 4. The method of claim 3, wherein the date is a first date, and the method further comprises: transforming, by the mobile device, a fifth electrical signal corresponding to a third voice prompt to the worker into a fifth sound signal to request a second date associated with at least one product of the product type; transforming, by the mobile device, a sixth sound signal indicative of the second date into a sixth electrical signal; transforming, by the mobile device in response to the mobile device receiving the second date, a seventh electrical signal corresponding to a fourth voice prompt to the worker into a seventh sound signal to request a quantity of products of the product type that are marked with the second date; and transforming, by the mobile device, an eighth sound signal indicative of the quantity into an eighth electrical signal.
  • 5. The method of claim 1, further comprising transforming, by the mobile device, the fourth electrical signal corresponding to the second voice prompt to the worker into the fourth sound signal to request the quantity of products of the product type that are marked with the product information.
  • 6. The method of claim 5, wherein the product type is a first product type, and the method further comprises: transforming, by the mobile device, a fifth electrical signal corresponding to a third voice prompt to the worker into a fifth sound signal to identify a second product type to the worker; transforming, by the mobile device, a sixth sound signal into a sixth electrical signal indicative of product information associated with at least one product of the second product type; and transforming, by the mobile device, a seventh electrical signal corresponding to a fourth voice prompt to the worker into a seventh sound signal to request a quantity of products of the second product type that are marked with the product information associated with the at least one product of the second product type.
  • 7. A system for managing products by associated product information, the system comprising: a mobile device comprising a first processor configured to: transform a first electrical signal corresponding to a first voice prompt to a worker into a first sound signal to identify a product type to the worker; transform a second sound signal, received from the worker in response to the first sound signal, into a second electrical signal indicative of a confirmation of the product type; transform a third sound signal into a third electrical signal, wherein the third sound signal, received from the worker in response to the confirmation of the product type, is indicative of product information associated with at least one product of the confirmed product type; provide information indicative of a date associated with the at least one product of the confirmed product type, to a computer; the computer comprising a second processor configured to: determine that the date associated with the at least one product of the confirmed product type is valid based on determining that the date is equal to or later than a current date; and wherein the first processor is further configured to: in response to the determination that the date associated with the at least one product of the confirmed product type is valid, transform a fourth electrical signal corresponding to a second voice prompt to the worker into a fourth sound signal to request a quantity of products of the product type that are marked with the validated date; and wherein the second processor is further configured to: in response to the determination that the date associated with the at least one product of the confirmed product type is invalid based on determining that the date is earlier than the current date, generate a notification indicating that the at least one product of the confirmed product type is expired.
  • 8. The system of claim 7, wherein the mobile device further comprises a headset.
  • 9. The system of claim 7, wherein the product information is the date associated with the at least one product.
  • 10. The system of claim 9, wherein the date is a first date, and the first processor is further configured to: transform a fifth electrical signal corresponding to a third voice prompt to the worker into a fifth sound signal to request a second date associated with at least one product of the product type; transform a sixth sound signal indicative of the second date into a sixth electrical signal; transform, in response to the reception of the second date, a seventh electrical signal corresponding to a fourth voice prompt to the worker into a seventh sound signal to request a quantity of products of the product type that are marked with the second date; and transform an eighth sound signal indicative of the quantity into an eighth electrical signal.
  • 11. The system of claim 7, wherein the first processor is further configured to transform the fourth electrical signal corresponding to the second voice prompt to the worker into the fourth sound signal to request the quantity of products of the product type that are marked with the product information.
  • 12. The system of claim 11, wherein the product type is a first product type, and wherein the first processor is further configured to: transform a fifth electrical signal corresponding to a third voice prompt to the worker into a fifth sound signal to identify a second product type to the worker; transform a sixth sound signal into a sixth electrical signal indicative of product information associated with at least one product of the second product type; and transform a seventh electrical signal corresponding to a fourth voice prompt to the worker into a seventh sound signal to request a quantity of products of the second product type that are marked with the product information associated with the at least one product of the second product type.
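As an illustrative aid only (not part of the granted claims), the date-validation dialog recited in claim 1 can be sketched in Python; the function names and prompt wording below are hypothetical, and only the decision rule (a date is valid when equal to or later than the current date) comes from the claims:

```python
from datetime import date

def is_valid_date(marked: date, today: date) -> bool:
    # Per claim 1: a date is valid when it is equal to or later
    # than the current date.
    return marked >= today

def next_action(product_type: str, marked: date, today: date) -> str:
    """Decide the dialog step that follows receipt of a marked date.

    Valid date -> second voice prompt requesting a quantity of products
    of the product type marked with the validated date.
    Invalid date -> notification that the product is expired.
    """
    if is_valid_date(marked, today):
        return f"How many {product_type} are marked {marked.isoformat()}?"
    return f"ALERT: {product_type} marked {marked.isoformat()} is expired"
```

In this sketch the mobile device would render the returned string as the fourth sound signal (a voice prompt) when the date is valid, or surface it as the expiration notification otherwise.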
US Referenced Citations (475)
Number Name Date Kind
6021392 Lester Feb 2000 A
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7464873 Spencer et al. Dec 2008 B2
7516120 Ghazaleh Apr 2009 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8933791 Vargo Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9443123 Hejl Jan 2016 B2
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9449205 Vargo Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9679321 Pitzel Jun 2017 B1
10108824 Vargo Oct 2018 B2
20040117243 Chepil Jun 2004 A1
20070063048 Havens et al. Mar 2007 A1
20080046114 White Feb 2008 A1
20080097876 White Apr 2008 A1
20090134221 Zhu et al. May 2009 A1
20090303001 Brumer Dec 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100265880 Rautiola et al. Oct 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120150699 Trapp Jun 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20130037613 Soldate Feb 2013 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130080289 Roy et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270341 Janneh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140148947 Levesque May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140165614 Manning Jun 2014 A1
20140166754 Vargo Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267609 Laffargue Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140278391 Braho et al. Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150102913 Vargo Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chang et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189270 Mellott et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160260148 High Sep 2016 A1
20160292477 Bidwell Oct 2016 A1
20160292633 Griffin Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160350709 Taylor Dec 2016 A1
20170004334 Vargo Jan 2017 A1
Foreign Referenced Citations (6)
Number Date Country
103745342 Apr 2014 CN
3043300 Jul 2016 EP
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (32)
Entry
Office Action in related European Application No. 17181028.6 dated Sep. 18, 2018, pp. 1-7 [All references previously cited].
Extended Search Report in related European Application No. 17181028.6 dated Sep. 11, 2017, pp. 1-8.
Ackerman, “Voice Recognition Systems Technology at Work in Today's Warehouse/Distribution Facilities”, Prologis Supply Chain Review, Mar. 1, 2006, Denver, CO, Retrieved from the Internet: URL:http://www.prologis.com/docs/research/supply chain/Voice_Recognition_Systems_-_March 2006.pdf, 12 pages [Cited in EP Search Report].
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
Decision to Refuse for European Application No. 17181028.6, dated Feb. 5, 2020, 13 pages.
Summons to Attend Oral Proceedings for European Application No. 17181028.6, dated Sep. 5, 2019, 11 pages.
U.S. Patent Application for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.), U.S. Appl. No. 13/367,978.
U.S. Patent Application for Indicia Reader filed Apr. 1, 2015 (Huck), U.S. Appl. No. 14/676,109.
Related Publications (1)
Number Date Country
20180018623 A1 Jan 2018 US