Systems and methods for worker resource management

Information

  • Patent Grant
  • Patent Number
    10,909,490
  • Date Filed
    Monday, October 12, 2015
  • Date Issued
    Tuesday, February 2, 2021
Abstract
A worker resource management system may include a voice-directed mobile terminal that enables a dialog between a user and the voice-directed mobile terminal. At least one computer may be in communication with the mobile terminal. The computer can include a worker resource management module that receives and records user activity from the voice-directed mobile terminal. The worker resource management module can identify user productivity patterns and provide work assessment predictions based at least in part upon the user activity that is received and recorded. Management can make worker resource decisions in response to the user productivity patterns identified or the work assessment predictions provided by the worker resource management module.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of Indian Patent Application No. 2944/DEL/2014 for SYSTEMS AND METHODS FOR WORKER RESOURCE MANAGEMENT filed Oct. 15, 2014. The foregoing patent application is hereby incorporated by reference in its entirety.


FIELD

Embodiments of the present invention relate to the field of worker resource management and, more specifically, to worker resource management in a warehouse environment.


BACKGROUND

Wearable, mobile, and/or portable computer terminals are used for a wide variety of tasks. Such terminals allow workers to maintain mobility, while providing the user with desirable computing, data gathering, and data-processing functions. Furthermore, such terminals often provide a communication link to a larger, more centralized computer system.


One example of a particular use environment for a wearable terminal is in connection with a warehouse management system (WMS). A WMS generally involves product distribution and inventory management. One example of a commercial management system is VOCOLLECT VOICE SOLUTIONS™ from Honeywell International, Inc.


An overall integrated management system may utilize a central computer system that runs a program for product tracking/management and for order-filling via shipping. A plurality of mobile terminals may be employed within the system so that workers may communicate with the central system in relation to product handling and other related tasks.


One particularly efficient system is a voice-directed system that utilizes a voice-directed workflow. More specifically, to provide an interface between the central computer system and the workers or other users, such wearable terminals and the central systems to which they are connected are often voice-driven or speech-driven (e.g., operated or controlled at least in part using human speech). A bi-directional communication stream of information (i.e., a dialog) may be exchanged, typically over a wireless network, between the wireless wearable terminals and the central computer system. Information received by each wireless wearable terminal from the central system may be translated from text into voice instructions or commands for the corresponding worker. The mobile terminals and voice-directed work provide significant efficiency in the performance of the workers' tasks. Specifically, using such terminals, the data-processing work is done virtually hands-free, without cumbersome equipment to juggle or paperwork to carry around.


Typically, in order to communicate in a voice-driven system, the worker wears a headset which is communicatively coupled to a wearable or portable terminal. The headset has a microphone for voice data entry and an ear speaker for playing instructions (e.g., voice instructions). Through the headset, the workers are able to receive voice instructions regarding assigned tasks, ask questions, report the progress of tasks, and report working conditions such as inventory shortages.


Therefore, an overall integrated management system generally involves a combination of a central computer system for tracking and management, mobile devices (e.g., wearable terminals), and the users who use and interface with the computer system. Such users may be in the form of workers/operators such as order fillers and pickers (e.g., selection operators picking and placing items), or supervisors that access and monitor the system information. The workers handle the manual aspects of the integrated management system under the command and control of information transmitted from the central computer system to the wireless wearable terminal.


An illustrative example of a set of tasks suitable for a wireless wearable terminal with voice capabilities may involve initially welcoming the worker to the computerized inventory management system and defining a particular task or order, for example, filling a load for a particular delivery vehicle scheduled to depart from a warehouse at a certain specified time. The worker may then answer with the particular area (e.g., “working in freezer area”) in which he will be working to fill that given order. The worker may then be directed to pick items to fill a pallet or bin used for the order.


The system may vocally direct the worker to a particular aisle and bin to pick a particular quantity of an item. The worker may vocally confirm the locations visited, the number of picked items, and/or various other information relating to worker activities. Once the bin or pallet is filled, the system may then direct the worker to a loading dock or bay for a particular truck or other delivery vehicle that will receive that order. As will be appreciated, the specific communications between the wireless wearable terminal and the central computer system for such voice-directed work can be task-specific and highly variable.


In addition to responding to inquiries or confirming the completion of certain tasks, the terminals may also allow the workers to interface with the computer system for other activities such as when they are starting/ending a shift (i.e., logging in or out of the system), and when starting/ending a break activity. For example, in order to indicate the beginning of a break activity the worker may report to the computer system through the headset using standard break vocabulary such as “take a break” followed by the type of break the worker wishes to take (e.g., lunch break, coffee break, etc.).


In existing management systems, workers may be checked and monitored by management based upon their performance with regard to multiple parameters. These parameters can include, but are not limited to, the workers' work rate (i.e., the pace at which the worker is performing their assigned tasks) and the workers' break durations. It is difficult in existing systems, however, to ascertain the workers' idle duration around reported break activities (i.e., before and after breaks). Determining worker idle duration around reported break activities is useful information for a supervisor because workers may generally cease or slow work activity before reporting break activities and/or after reporting returning from break activities. Although such worker idleness around break activities may affect the workers' overall work rate, in some cases workers may be able to achieve an acceptable work rate without having attracted attention from the supervisor.


In the described situation where the reported work rate remains at acceptable levels notwithstanding worker idle duration around reported break activities, the management system would not be providing potentially useful information to the supervisor regarding whether the work assigned to the worker is less than his or her capability, or if the workload in general can be increased for all the workers. This adversely affects the warehouse performance in terms of completing the work in a stipulated time period, and can result in problems for warehouse supervisors such as delayed assignments, required overtime, and related expenses causing cost overruns.


In other related situations, a worker may be working in an assigned team of workers for a given task relating to, for example, preparing an order for delivery to a customer. One worker in the group may be trying to slow the pace of the team's work, and it is difficult to identify or flag this worker in real time. Identification may be possible based upon a periodic work rate report, but by the time the report issues, harm has already been done to the warehouse operations (e.g., to the delivery schedules). Existing management systems and methods do not provide an effective way to determine, in real time, which worker is causing delay.


Notwithstanding the benefits that a warehouse management system can provide, at times there are delays in completing work assignments. Problems in completing a given assignment can have ripple effects across a full shift, and even a whole day's work schedule (or beyond). This affects planning in terms of the resources needed, such as the workers that would be needed to complete the work in the remaining period of time. Also, with delays there may be a requirement for communicating a new estimated time of arrival (ETA) to a customer. Even if a calculation can be done to assess the delay, it becomes critical to monitor the situation going forward.


In some situations, delays may become visible only towards a shift's end, or at a periodic situational evaluation by a supervisor. It might be too late at this point to take any remedial measures based upon the delay, and even if measures are taken there may be cost overruns in worker overtime or customer dissatisfaction (or both).


There could be other times when the opposite situation occurs in a warehouse environment; i.e., when given work assignments are completed too soon, leaving some workers idle for a period of time (e.g., for a two-hour period). This may be referred to as a problem of plenty. In this situation, there are resource management problems relating to work assessment, work allocation, work production, etc. In existing systems, however, these situations would not be identified until after the problem has occurred, resulting in a loss of man-hours which otherwise could have been put to better use.


Another situation which can arise is a combination of the two problems previously identified. A warehouse could be divided among teams working in different regions, or there could be an allocation of groups of workers per truck route. While one team may be struggling to finish assigned work towards the end of a shift, the other team may have become idle an hour earlier. In such cases, there is a need for a system which can forewarn of the onset of a problematic situation relating to resource management.


As set forth above, while the utilization of voice-directed mobile terminals and management systems tends to improve worker efficiency, weaknesses remain in current systems with respect to achieving optimal worker resource allocation. Accordingly, a need exists for management systems and methods that analyze worker productivity based at least in part on worker activity data retrieved from a voice-directed mobile terminal. A need also exists for systems and methods for work assessment predictions based at least in part on worker activity data retrieved from a voice-directed mobile terminal.


SUMMARY

Accordingly, in one aspect, the present invention embraces a worker resource management system including a voice-directed mobile terminal for facilitating a dialog between a user and the voice-directed mobile terminal. The system may include a computer in communication with the voice-directed mobile terminal, the computer including a worker resource analysis module. The worker resource analysis module may be configured to receive user activity information from the voice-directed mobile terminal, and identify user productivity patterns based at least in part upon the user activity information.


In one exemplary embodiment, the system includes a visual display in communication with the computer.


In another exemplary embodiment, the visual display provides reports corresponding to user productivity patterns.


In yet another exemplary embodiment, the visual display provides alerts corresponding to user productivity patterns.


In yet another exemplary embodiment, the worker resource analysis module is configured to classify user activity information into groups including user workflow tasks, user sign-in activity, user sign-out activity, user break activity, and/or user region changes.


In yet another exemplary embodiment, the user productivity patterns include user break duration, user idle time after sign-in, user idle time before sign-off, user idle time before beginning break activity, and/or user idle time after returning from break activity.


In yet another exemplary embodiment, the user productivity patterns are identified at fixed interval time periods immediately preceding the current identification time.


In yet another exemplary embodiment, the user productivity patterns identified are flagged based upon the most recent interval period immediately preceding a current identification time.


In another aspect, the present invention embraces a worker resource management system including a voice-directed mobile terminal for facilitating a dialog between a user and the voice-directed mobile terminal. The system may also include a computer in communication with the voice-directed mobile terminal, the computer including a worker resource analysis module. The worker resource analysis module may be configured to receive user activity information from the voice-directed mobile terminal, and provide work assessment predictions based at least in part upon user activity information received.


In one exemplary embodiment, the system includes a visual display in communication with the computer.


In another exemplary embodiment, the visual display provides reports or alerts corresponding to the work assessment predictions.


In yet another exemplary embodiment, the work assessment predictions include information that more workers are needed in a region.


In yet another exemplary embodiment, the work assessment predictions include information that a delivery vehicle will be delayed beyond scheduled departure time.


In yet another exemplary embodiment, the work assessment predictions are based upon the number of work units remaining in a warehouse region, the number of workers present in a warehouse region's workforce, and/or the rate at which work is being completed in a warehouse region.


In yet another aspect, the present invention embraces a method for managing worker resources including transmitting task data from a server computer to a voice-directed mobile terminal in communication with the server. Speech-based instructions associated with the task data may be provided to a user using the voice-directed mobile terminal. User activity information may be received from the voice-directed mobile terminal. The user activity information may be analyzed to identify user productivity patterns or provide work assessment predictions. Worker resource management decisions may be implemented by management in response to the user activity information analysis.


In one exemplary embodiment, the user productivity patterns include user break duration, user idle time after sign-in, user idle time before sign-off, user idle time before beginning break activity, and/or user idle time after returning from break activity.


In another exemplary embodiment, the worker resource decisions include providing a productivity compliance alert to a worker based upon break duration compliance.


In yet another exemplary embodiment, the worker resource decisions include transferring a worker to a second work region from a first work region.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a worker using an exemplary voice-directed mobile terminal in accordance with one embodiment of a worker resource management system of the present disclosure.



FIG. 2 is a block diagram illustrating certain components of an exemplary worker resource management system according to the present disclosure.



FIG. 3 is a graphical illustration depicting an exemplary warehouse region divided into time interval segments.



FIG. 4 is a graphical illustration depicting time interval segments for an exemplary warehouse region in line format.





DETAILED DESCRIPTION

Embodiments of the present invention embrace systems and methods for worker resource management. The exemplary worker resource systems track and provide supervisors or other management with timely updates, analysis, and predictions relating to workforce management so that problems can be identified and addressed in real-time. Typically, at least a portion of the analyzed data is generated by, or used in connection with, a voice-directed mobile terminal.



FIG. 1 depicts an exemplary voice-directed mobile terminal 10 that may be used with embodiments of the worker resource management system according to the present invention. The voice-directed mobile terminal 10 may be a wearable device, which may be worn by a worker 11 (e.g., on a belt 14 as shown), or by some other user or operator. This allows for hands-free operation. The voice-directed mobile terminal 10 might also be manually carried or otherwise mounted on a piece of equipment such as an industrial vehicle (e.g., a forklift). The worker 11 is shown in FIG. 1 operating a pallet jack 13, which is a piece of transportation equipment that may be utilized by a worker in a warehouse environment.


The use of the descriptive term “terminal” is not limiting and may include any similar computer, device, machine, smartphone, smartwatch, indicia reader, combination, or system. Furthermore, the voice-directed mobile terminal may include multiple pieces with separate housings or may be contained in a single housing similar to the embodiment shown in FIG. 1. Therefore, the terminal may also include multiple wearable pieces. Alternatively, some or all of the terminal functionality may be incorporated into the headset, which may include all the features required to communicate with a server or external computer. Therefore, the exact form of the voice-directed mobile terminal utilized to practice the present systems and methods is not limited to only the embodiments shown in the drawings.


Although the present application may generally reference “users” that interface with the exemplary systems of the present disclosure, the descriptive term “worker” or “operator” as set forth herein may be more specifically used in reference to workers/operators that perform work on the floor in a manufacturing environment or work on the floor of a warehouse (e.g., fillers, pickers, etc.). Such workers/operators would typically be the users of mobile terminal 10 in connection with the exemplary system. Similarly, other “users” that interface with the exemplary systems may be described using the descriptive term “supervisor.” As set forth herein, “supervisor” is generally in reference to a supervisor of workers/operators. The supervisors would generally have access to the graphical interface or display of the exemplary system as described below. The use of the descriptive terms “worker/operator” and “supervisor” in relation to users of the exemplary systems are not limiting and may include any similar member of an organization (staff member, manager, etc.).


The voice-directed mobile terminal 10 is typically a voice-driven device in that it includes speech interfaces to permit a worker 11 to communicate, using speech or voice, with an external computer such as server computer 20 as illustrated in FIG. 2. Typically, the voice-directed mobile terminal's speech interfaces are configured to be capable of permitting multiple different workers to communicate with the server computer as illustrated in FIG. 2 (e.g., using speech-recognition technology that recognizes different English dialects, different languages, etc.).


It will be appreciated by a person of ordinary skill in the art that the server computer 20 may be one or, more typically, a plurality of computers having software stored thereon. The server computer 20 may run one or more system software packages for handling/executing a particular task or set of tasks, such as inventory and warehouse management systems (which are available in various commercial forms), or any other systems where multiple tasks are handled by multiple workers. The server computer 20 may be any of a variety of different computers, including both client and server computers working together, and/or databases, and/or systems necessary to interface with multiple voice-directed mobile terminals 10 and associated with multiple different workers, to provide the work tasks that may be related to the products or other items handled in the voice-directed work environment.


The server computer 20 may include a Warehouse Management System (WMS), a database, and a Web application (not explicitly shown). The server computer 20 might also include a computer for programming and managing the individual voice-directed mobile terminals 10. The server computer 20 may be located at one facility or be distributed at geographically distinct facilities. Furthermore, the server computer 20 may include a proxy server. Therefore, the server computer 20 is not limited in scope to a specific configuration.


Alternatively, the voice-directed mobile terminals 10 may be stand-alone devices which interface directly with a worker 11 without a server computer. Therefore, various aspects of the present disclosure might be handled with voice-directed mobile terminals only. Usually, however, to have sufficient database capability to handle large amounts of information, a server computer is desirable.


In an exemplary embodiment, the voice-directed mobile terminal 10 communicates with the server computer 20 using a wireless communication link 22 (FIG. 2). The wireless link may be established through an appropriate wireless communication format (e.g., 802.11b/g/n) and may use one or more wireless access points that are coupled to the server computer 20 and accessed by the voice-directed mobile terminal 10. To allow the workers 11 to communicate with the system, one or more peripheral devices, including a headset 16 (e.g., earpiece, earbuds, etc.), are coupled to the voice-directed mobile terminal 10.


The headset 16 may be coupled to the voice-directed mobile terminal 10 through a wired connection such as cord 18 or by a wireless headset connection illustrated in FIG. 1 as reference numeral 19 (e.g., using the BLUETOOTH wireless protocol). The headset 16 may be worn on the head of the user/worker 11 and may use a microphone 21 for directing voice responses and activity reports to the voice-directed mobile terminal 10. A headset speaker 17 provides (e.g., plays) voice commands to the worker 11. The voice-directed mobile terminal 10 thus carries on a speech dialog with a worker 11 and provides hands-free operation and voice-directed movement throughout a warehouse or other facility.


It will be appreciated by a person of ordinary skill in the art that, although exemplary embodiments presented herein incorporate voice-direction techniques, the present disclosure is not limited to speech-directed terminals. The present disclosure embraces any terminal that carries on a dialog via speech, text (e.g., through a keyboard), gestures, or other communicative activity, with a worker/operator (or other user).


The server computer 20 includes a tasking module 25 for transmitting specific task data (e.g., picking instructions, training information, scheduling information, or other information associated with a request for a worker to perform some task or provide some information) to the voice-directed mobile terminal 10. Typically, the tasking module 25 is a software module stored on the server computer 20. Alternatively, the tasking module 25 may be a hardware module, or a combination of hardware and software.


The voice-directed mobile terminal 10 may use the task data received from the tasking module 25 to generate audio outputs at the headsets and speakers. For example, text data may be converted using a text-to-speech (TTS) interface to provide voice direction to a worker. Speech input or feedback from a worker is generated at the headset microphone 21 and transmitted to the voice-directed mobile terminal 10 where it is processed by speech recognition circuitry or other speech processing circuitry (e.g., speech recognition software). Any data that is obtained from the voice dialog (e.g., worker speech data) may then be relayed to the server computer 20.


For example, in a worker resource management system 100, the voice-directed mobile terminal 10 receives instructions (e.g., task data) from the tasking module 25 and converts those instructions into an audio transmission (e.g., audio file) to be heard by a worker/operator 11 via a speaker 17. The worker executes the audio instructions and, for example, goes to a designated location and picks a designated product or performs some other task communicated by the audio instructions. The worker 11 then replies into the microphone 21, in a spoken language, such as with a verification of a location and/or a product, and the audio reply is converted to a useable data format (e.g., worker speech data) to be sent back and processed by the server computer 20. That is, in the voice-directed or speech directed work context, the worker 11 maintains a speech dialog (e.g., workflow dialog) with the voice-directed mobile terminal 10 and/or server computer 20 to execute and complete a variety of tasks.
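

By way of illustration only, the following is a minimal Python sketch of one turn of the workflow dialog described above. All names (TaskingModule, Terminal, speak, listen) are hypothetical stand-ins; the patent does not define a programming interface, and a real system would use actual text-to-speech and speech-recognition components.

    class TaskingModule:
        """Stand-in for the server-side tasking module 25 (hypothetical API)."""
        def __init__(self, tasks):
            self.tasks = list(tasks)
            self.log = []  # worker speech data relayed back to the server

        def next_task(self):
            return self.tasks.pop(0) if self.tasks else None

        def record(self, task, reply):
            self.log.append((task, reply))

    class Terminal:
        """Stand-in for the voice-directed mobile terminal 10 (hypothetical API)."""
        def speak(self, text):
            # A real terminal would convert the task data to speech (TTS)
            # and play it through the headset speaker 17.
            print(f"[TTS to headset] {text}")

        def listen(self):
            # A real terminal would run speech recognition on the worker's
            # reply at the headset microphone 21; a canned result is used here.
            return "aisle 12, bin 4, quantity 3 confirmed"

    def dialog_turn(tasking, terminal):
        """One turn of the speech dialog: task data out, worker speech data back."""
        task = tasking.next_task()
        if task is not None:
            terminal.speak(task)         # voice direction to the worker
            reply = terminal.listen()    # worker reply recognized as text
            tasking.record(task, reply)  # relayed to the server computer 20

    tasking = TaskingModule(["Go to aisle 12, bin 4; pick 3 units"])
    dialog_turn(tasking, Terminal())
    print(tasking.log)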


In order to identify worker productivity patterns, all worker 11 dialog interactions through the system 100 may be recorded with a timestamp and maintained on the server computer 20 by a worker resource analysis module 30. Accordingly, the server computer 20 maintains information or data relating to the user/worker 11 activity or inactivity. For example, when the operator 11 begins his or her shift, the activities of the worker 11 are recorded from that point until the operator 11 logs out of the system 100 at the end of the shift.


Activities of the worker 11 that are recorded may be classified by the worker resource analysis module 30. Classification types can include, for example, the particular workflow on which the worker was working (e.g., selection, replenishment, etc.), sign-in, sign-out, break, region changes, etc.


Typically, the worker resource analysis module 30 is a software module stored on the server computer 20. Alternatively, the worker resource analysis module 30 may be a hardware module, or a combination of hardware and software.


The worker resource analysis module 30 generates, based at least in part upon an analysis of the worker activity dialog between the voice-directed mobile terminal 10 and the worker 11, productivity data. The productivity data relates to the analysis of information relating to user/worker 11 activity or inactivity.


The productivity information provided by the worker resource analysis module 30 includes determinations relating to idle time around worker sign-on/off and worker breaks. In this regard, the worker resource analysis module 30 may calculate the time that elapses between the tasking module 25 assigning work and user 11 activities such as sign-on, break, and sign-off events. All such calculations, except for sign-on data, may be based on the 24-hour period preceding the current time at which an analysis occurs.


The following information, which could aid the supervisor in taking action regarding resource issues, is calculated by the worker resource analysis module 30 from various activities of the worker 11 (a code sketch follows the list below):

    • Break duration: The time (e.g., in minutes) between an operator issuing the “take a break” command and returning from break.
    • Idle time after signing in: The time (e.g., in minutes) between the operator signing in with his or her password and the first operator activity on an assignment.
    • Idle time before signing off: The time (e.g., in minutes) between the last pick (if the assignment is still in progress) or other operator activity on an assignment and the operator issuing the “sign off” command.
    • Idle time before break: The time (e.g., in minutes) between the last pick (if the assignment is still in progress) or other operator activity on an assignment and the operator issuing the “take a break” command.
    • Idle time after taking break: The time (e.g., in minutes) between the operator returning from break and the first operator activity on an assignment, or a pick if the assignment is in progress.
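

As a minimal illustration of how these values could be derived, the Python sketch below computes the metrics above from a timestamped activity log. The event names and log schema are assumptions made for illustration; the patent does not specify a data format.

    from datetime import datetime

    def minutes(delta):
        return delta.total_seconds() / 60.0

    def idle_metrics(events):
        """Derive break/idle metrics from a chronologically ordered event log.

        `events` is a list of (timestamp, kind) pairs, where kind is one of
        "sign_on", "pick", "take_break", "end_break", or "sign_off".
        """
        def next_pick(i):
            return next((t for t, k in events[i + 1:] if k == "pick"), None)

        def last_pick(i):
            return next((t for t, k in reversed(events[:i]) if k == "pick"), None)

        metrics = {}
        for i, (ts, kind) in enumerate(events):
            if kind == "sign_on" and next_pick(i):
                metrics["idle_after_sign_in"] = minutes(next_pick(i) - ts)
            elif kind == "take_break":
                if last_pick(i):
                    metrics["idle_before_break"] = minutes(ts - last_pick(i))
                end = next((t for t, k in events[i + 1:] if k == "end_break"), None)
                if end:
                    metrics["break_duration"] = minutes(end - ts)
            elif kind == "end_break" and next_pick(i):
                metrics["idle_after_break"] = minutes(next_pick(i) - ts)
            elif kind == "sign_off" and last_pick(i):
                metrics["idle_before_sign_off"] = minutes(ts - last_pick(i))
        return metrics

    day = datetime(2015, 10, 12)
    log = [(day.replace(hour=8, minute=0), "sign_on"),
           (day.replace(hour=8, minute=12), "pick"),
           (day.replace(hour=10, minute=55), "pick"),
           (day.replace(hour=11, minute=0), "take_break"),
           (day.replace(hour=11, minute=20), "end_break"),
           (day.replace(hour=11, minute=32), "pick"),
           (day.replace(hour=16, minute=40), "pick"),
           (day.replace(hour=17, minute=0), "sign_off")]
    print(idle_metrics(log))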


The above-noted calculations, and corresponding reports or alerts, can be evaluated by the module 30 at fixed time intervals; i.e., running every “X” minutes. Each run could consider the system 100 activity based upon the current time period minus “X” number of minutes. This would ensure that no stale data or past activity creeps into the current productivity report or alert that is provided to the supervisor for addressing a problem.


By way of example, a worker 11 may enter a “take a break” activity at 11:00 am. The worker's break activity may extend beyond ten minutes, while the acceptable break duration for evaluation purposes may only be five minutes. System evaluation by the worker resource analysis module 30 may be scheduled to occur at five-minute intervals. Thus, when the subsequent evaluation occurs at 11:06 am, the worker 11 would be flagged by the system 100 (e.g., the supervisor would be alerted). At the next evaluation at 11:11 am, if the worker had not returned from break, the worker would remain flagged by the system. Alternatively, if the worker 11 has returned by this time, the system 100 would assume normalcy and no further action would be taken. This is possible because only a delta of system activity between 11:06 am and 11:11 am was considered by the system (i.e., the current time minus “X” number of minutes, where X is equal to five minutes in this example).
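

A minimal sketch of such a fixed-interval evaluation is given below, mirroring the five-minute example above. The bookkeeping structure for open breaks is an assumption; the patent does not prescribe one.

    from datetime import datetime, timedelta

    MAX_BREAK = timedelta(minutes=5)  # acceptable break duration ("X" = 5 here)

    # Assumed bookkeeping: worker -> time the "take a break" command was issued.
    # An entry is removed when the worker reports returning from break.
    open_breaks = {"worker_11": datetime(2015, 10, 12, 11, 0)}

    def evaluate(now):
        """One scheduled run: flag workers whose open break exceeds MAX_BREAK.

        Because only the current open-break state is inspected, each run in
        effect considers just the delta of activity since the previous run.
        """
        return [w for w, started in open_breaks.items() if now - started > MAX_BREAK]

    print(evaluate(datetime(2015, 10, 12, 11, 6)))   # ['worker_11'] is flagged
    open_breaks.pop("worker_11")                     # worker returns from break
    print(evaluate(datetime(2015, 10, 12, 11, 11)))  # [] means normalcy assumed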


Exemplary implementation scenarios where workers' productivity/idleness patterns can be monitored by the worker resource analysis module 30 are:

    • Workers' idle time before sign-off. Condition: A worker's last activity is more than “X” minutes before his or her sign-off time.
    • Workers taking longer breaks. Condition: Workers are taking breaks longer than “X” minutes.
    • Workers' idle time before break. Condition: A worker's last activity is more than “X” minutes before the start of his or her break time.
    • Workers' idle time after break. Condition: A worker's next activity is more than “X” minutes after the end of his or her break time.


With repeated evaluations, the most recent deviation can be flagged by the system. The frequency of the evaluation period could be set in proportion to how critical the monitored activity is.


The productivity data generated by the worker resource analysis module 30 may be viewed by a workforce supervisor (overseeing, for example, the performance of picking operators on a warehouse floor) on a display device 40 (e.g., an LCD monitor) that is in communication with the server computer 20. The communication will typically be wireless, using a wireless communication method (e.g., SMS or text messaging, electronic mail, etc.). The worker resource management system 100 may display the productivity data in raw form or in a compiled form (e.g., a summary report). In this regard, a supervisor may be provided with information regarding the productivity of the workforce (e.g., selection operators working a warehouse floor), a selected subgroup of the workforce, or an individual member of the workforce. In this way, the exemplary worker resource management system 100 according to the present disclosure can provide timely information relating to worker productivity.


Typically, the worker resource management system 100 is configured to receive and display at least a portion of the productivity data in real time, thereby allowing the workforce manager to take immediate corrective action to remedy the reported problem.


A supervisor may utilize the relevant productivity information to, for example, manage operator downtime and break compliance with productivity alerts; use these alerts and various charted data to determine the cause of missed goal rates or work schedules; and/or make informed management decisions and take personnel actions at the right time, instead of waiting for shift-end reports to identify anomalies in work patterns.


In another exemplary embodiment, worker resource management system 100 can track and compare worker task and activity progress across multiple warehouse regions at given points or intervals of time. For example, worker resource analysis module 30 of system 100 can provide information relating to whether an assigned group/team of workers 11 (e.g., Team 1) assigned work in one warehouse region (e.g., Region 1) is performing at a faster rate than a team of workers 11 (e.g., Team 2) in another region (e.g., Region 2) such that the workers 11 of Team 1 will be completing their selected tasks a certain time period (e.g., “X” number of minutes) before the workers 11 of Team 2 will complete their assigned tasks. In this regard, the worker resource analysis module 30 of system 100 can forewarn supervisors regarding the onset of problematic situations with worker resource management reporting and allow the supervisor to take appropriate corrective action. For example, the system 100 can report (e.g., via display device 40) if the team of workers 11 working in an exemplary Region 1 is performing better/faster than the workers 11 in Region 2 such that Region 1 work will be completed at a certain time (i.e., “X” minutes) before Region 2 work, which would therefore yield an excess of workers 11 in Region 1.


Some features of the exemplary system 100 include the capability to predict workforce shortages as well as surplus, real-time alerts/reports in response to changes, the ability to integrate with third-party applications, and the ability to monitor/manage the whole of warehouse operations.


The worker resource analysis module 30 of the exemplary system 100 may generally use the following information in generating reports/alerts: the work units remaining in a given warehouse region, the current number of workers present in a region's workforce, the current time for assigned work completion in a region, and/or the rate at which work is being completed in a given region (e.g., in units/hour). The exemplary system 100 can observe the described metrics in real-time and provide timely updates to supervisors relating to workload.


In one example for a specific use environment, the system 100 can analyze worker 11 activities/tasks in order to predict when workers 11 will finish picking tasks in a warehouse for items relating to a given delivery route, so that the route can then depart for delivery. The worker's loading activities and time may be taken into account for more accurate predictions relating to delivery vehicle loading completion and subsequent delivery vehicle departure.


In order for the worker resource analysis module 30 of the system 100 to compare worker 11 task progress across multiple warehouse regions or across multiple vehicle loading projects at certain given points of time, warehouse operational hours can be divided into finite intervals or periods of time (i.e., chunks of time). For example, FIG. 3 illustrates an exemplary warehouse region as a 24-hour time period 50, and then further divides that period into equal, two-hour time intervals 55.



FIG. 4 illustrates the time intervals 60 in line format stretching into the future from present time, which is designated as “Now.” For example, if the current time (i.e., Now) is 1:00 pm, the next two-hour interval 62 runs to 3:00 pm; the next four-hour interval 64 runs to 5:00 pm; the next six-hour interval 66 runs to 7:00 pm; and the next eight-hour interval 68 runs to 9:00 pm. With intervals 60 established, worker resource analysis requiring a time window may then be considered.
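

As a brief sketch (assumed, not prescribed by the patent), the interval end times can be computed directly from the current time:

    from datetime import datetime, timedelta

    def interval_ends(now, width_hours=2, count=4):
        """End times of the "Next 02-hrs", "Next 04-hrs", ... windows from Now."""
        return [now + timedelta(hours=width_hours * n) for n in range(1, count + 1)]

    # With Now = 1:00 pm, the windows end at 3:00, 5:00, 7:00, and 9:00 pm.
    for end in interval_ends(datetime(2015, 10, 12, 13, 0)):
        print(end.strftime("%I:%M %p"))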


In the item/product loading context for a delivery vehicle, workload for a given workforce region/route can be identified based upon the tasking module 25 assignment of the quantity of items to be picked by a worker 11 for the assignments that have a route delivery/departure falling within the given time intervals 60. The workload calculation for a subsequent interval (e.g., subsequent interval 64) will be inclusive of the workload of the previous interval (e.g., interval 62). For example, if interval 62 has workload of ten work units, interval 64 will have a workload of ten units plus the additional workload units that are included beyond the duration of interval 62.
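

The cumulative workload rule can be sketched in a few lines; the per-window unit counts below are illustrative only:

    from itertools import accumulate

    # Items to be picked for departures falling in each successive window (assumed).
    new_units = [10, 5, 14, 9, 13]
    # Per the text, each interval's workload includes that of all previous intervals.
    workloads = list(accumulate(new_units))
    print(workloads)  # [10, 15, 29, 38, 51]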


The operators/workers 11 that are needed for a given region/route can be determined from the worker 11 activity recorded by the server computer 20 based upon factors including the quantity of items to be picked for the respective interval period, the actual rate at which items are being picked by workers 11, and the number of currently active/signed in operators 11.


Table 1 (below) is an exemplary operator requirement table noting the operators required for a given workload in exemplary regions (1-3), based upon departures falling within the given intervals. For example, in the “Next” columns for Region 1, worker excess or shortage is listed respectively as −8, −8, 2, 11, and 24. This reveals an excess of eight operators until the “Next 06-hrs” interval when, due to the number of vehicles scheduled for departure, the work to be completed rises such that, in addition to the eight operators, Region 1 now requires two additional workers to fulfill the Region's workload requirements.


TABLE 1

Region    Operators    Next      Next      Next      Next      Next
Number    Working      02-hrs    04-hrs    06-hrs    08-hrs    10-hrs
1         8            −8        −8        2         11        24
2         8            −8        −8        6         18        34
3         9            −9        −9        3         13        27


In order for the worker resource management system 100 to issue advance reports or alerts regarding the onset of potentially problematic situations with worker resource management, a number of factors are taken into account by the worker resource analysis module 30. These include the amount of work remaining (e.g., reported in workload units), the current number of workers 11 in the workforce (e.g., the number of users currently signed in), the current projected time of work completion, and the current rate of work being accomplished (e.g., calculated in work-units/hour). These factors provide a basis for calculating the current demand for resources (e.g., time, workers, etc.). Certain exemplary factors/calculations that may be determined/reported through the worker resource analysis module 30 are set forth as follows (a code sketch follows the list below):

    • Projected departure date:

      Calculation: Projected Departure = (Work Remaining) ÷ (Work Rate × Operators Required) + Current Time.
    • Projected departure delay: The time difference (e.g., in minutes) between the expected departure date/time and the projected departure date/time of the delivery vehicle carrying the assignment or route items.

      Calculation: Delay = Projected Departure − Expected Departure.
    • Operators required (REQ): The number of operators that are needed to work on a route or region to complete the route by the expected departure date and time.

      Calculation: REQ = Workload ÷ (Work Rate).
    • Workload: The total number of items remaining to process during the chosen interval (e.g., expressed in items-per-minute for the items remaining for all assignments in the route).
    • Work Rate: The number of items processed per minute (the average processing rate of active operators, or the region goal rate if there are no active operators).
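

A minimal Python sketch of these calculations is given below. The numeric inputs are illustrative only, and the staffing figure supplied to the projected-departure formula is an assumption; the final lines show how a row of Table 1 could be reproduced from per-interval REQ values.

    from datetime import datetime, timedelta

    def operators_required(workload, work_rate):
        """REQ = Workload / Work Rate (items/min over items/min per operator)."""
        return workload / work_rate

    def projected_departure(work_remaining, work_rate, operators, now):
        """Projected Departure = Work Remaining / (Work Rate x Operators) + Current Time."""
        return now + timedelta(minutes=work_remaining / (work_rate * operators))

    def projected_delay(expected, projected):
        """Positive when the projected departure is later than expected."""
        return projected - expected

    now = datetime(2015, 10, 12, 13, 0)
    print(operators_required(workload=20, work_rate=2.0))  # 10.0 operators needed
    proj = projected_departure(work_remaining=240, work_rate=2.0, operators=8, now=now)
    print(proj.time())                                            # 13:15:00
    print(projected_delay(datetime(2015, 10, 12, 13, 10), proj))  # 0:05:00 (5-minute delay)

    # Shortage (positive) or excess (negative) per interval, as in Table 1:
    # operators required for each interval's cumulative workload, minus operators working.
    working = 8                            # Region 1 operators working
    req_per_interval = [0, 0, 10, 19, 32]  # hypothetical REQ per window
    print([req - working for req in req_per_interval])  # [-8, -8, 2, 11, 24]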


Based on the given factors/calculations, the worker resource analysis module 30 can provide useful reports or alerts to supervisors. For example, reports/alerts may be provided to a supervisor indicating that “X” number of additional workers are needed in a region, that “X” number of workers are surplus in a region, that delivery departure in a region would be delayed by “X” minutes/hours beyond the current scheduled departure time, that a delivery for a region would be ready “X” minutes/hours before the current scheduled departure time, and/or that the work assigned in a given region would be complete after “X” hours.


Providing the noted reports or alerts to supervisors well enough in advance would give supervisors time to make appropriate worker resource adjustments and avoid worker resource management problems. To counter adverse situations, steps could be taken including transferring “X” number of workers from a region where there is a surplus to a region where workers are currently needed, or transferring “X” number of workers to the trucks/routes scheduled to depart soonest, followed by a transfer to the trucks/routes scheduled to depart next, etc.


The worker resource analysis module 30 of the exemplary system 100 therefore provides reports/alerts that improve warehouse management as respective regions/routes can be managed for completion at almost the same time. The exemplary system 100 also provides for less disparity in work completion percentage, allows delivery vehicles to depart at a known/scheduled time, allows workforce and other resources to be more effectively utilized, and allows for greater work progress given that worker re-allocation can be monitored in real time.


Although exemplary embodiments of the present disclosure relate to a warehouse setting, it will be appreciated by a person of ordinary skill in the art that the present disclosure embraces systems and methods that may be used in connection with other environments. For example, and without intending to limit the present disclosure, the systems and methods according to the present disclosure may be used in a retail store setting, a pharmacy setting, or a transport vehicle. The term warehouse, therefore, is used in its broadest sense and is not intended to limit the application of the disclosure to a particular physical environment.


It will be appreciated that the present disclosure additionally embraces methods associated with the embodiments of the systems according to the present disclosure.


To supplement the present disclosure, this application incorporates entirely by reference the following patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266;
  • U.S. Pat. Nos. 7,159,783; 7,413,127;
  • U.S. Pat. Nos. 7,726,575; 8,294,969;
  • U.S. Pat. Nos. 8,317,105; 8,322,622;
  • U.S. Pat. Nos. 8,366,005; 8,371,507;
  • U.S. Pat. Nos. 8,376,233; 8,381,979;
  • U.S. Pat. Nos. 8,390,909; 8,408,464;
  • U.S. Pat. Nos. 8,408,468; 8,408,469;
  • U.S. Pat. Nos. 8,457,013; 8,459,557;
  • U.S. Pat. Nos. 8,469,272; 8,474,712;
  • U.S. Pat. Nos. 8,479,992; 8,490,877;
  • U.S. Pat. Nos. 8,517,271; 8,523,076;
  • U.S. Pat. Nos. 8,528,818; 8,544,737;
  • U.S. Pat. Nos. 8,548,242; 8,548,420;
  • U.S. Pat. Nos. 8,550,335; 8,550,354;
  • U.S. Pat. Nos. 8,550,357; 8,556,174;
  • U.S. Pat. Nos. 8,556,176; 8,556,177;
  • U.S. Pat. Nos. 8,559,767; 8,599,957;
  • U.S. Pat. Nos. 8,561,895; 8,561,903;
  • U.S. Pat. Nos. 8,561,905; 8,565,107;
  • U.S. Pat. Nos. 8,571,307; 8,579,200;
  • U.S. Pat. Nos. 8,583,924; 8,584,945;
  • U.S. Pat. Nos. 8,587,595; 8,587,697;
  • U.S. Pat. Nos. 8,588,869; 8,590,789;
  • U.S. Pat. Nos. 8,596,539; 8,596,542;
  • U.S. Pat. Nos. 8,596,543; 8,599,271;
  • U.S. Pat. Nos. 8,599,957; 8,600,158;
  • U.S. Pat. Nos. 8,600,167; 8,602,309;
  • U.S. Pat. Nos. 8,608,053; 8,608,071;
  • U.S. Pat. Nos. 8,611,309; 8,615,487;
  • U.S. Pat. Nos. 8,616,454; 8,621,123;
  • U.S. Pat. Nos. 8,622,303; 8,628,013;
  • U.S. Pat. Nos. 8,628,015; 8,628,016;
  • U.S. Pat. Nos. 8,629,926; 8,630,491;
  • U.S. Pat. Nos. 8,635,309; 8,636,200;
  • U.S. Pat. Nos. 8,636,212; 8,636,215;
  • U.S. Pat. Nos. 8,636,224; 8,638,806;
  • U.S. Pat. Nos. 8,640,958; 8,640,960;
  • U.S. Pat. Nos. 8,643,717; 8,646,692;
  • U.S. Pat. Nos. 8,646,694; 8,657,200;
  • U.S. Pat. Nos. 8,659,397; 8,668,149;
  • U.S. Pat. Nos. 8,678,285; 8,678,286;
  • U.S. Pat. Nos. 8,682,077; 8,687,282;
  • U.S. Pat. Nos. 8,692,927; 8,695,880;
  • U.S. Pat. Nos. 8,698,949; 8,717,494;
  • U.S. Pat. Nos. 8,717,494; 8,720,783;
  • U.S. Pat. Nos. 8,723,804; 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Pat. No. D702,237;
  • U.S. Pat. Nos. 8,740,082; 8,740,085;
  • U.S. Pat. Nos. 8,746,563; 8,750,445;
  • U.S. Pat. Nos. 8,757,495; 8,760,563;
  • U.S. Pat. Nos. 8,763,909; 8,777,108;
  • U.S. Pat. Nos. 8,777,109; 8,779,898;
  • U.S. Pat. Nos. 8,781,520; 8,783,573;
  • U.S. Pat. Nos. 8,789,757; 8,789,758;
  • U.S. Pat. Nos. 8,789,759; 8,794,520;
  • U.S. Pat. Nos. 8,794,522; 8,794,525;
  • U.S. Pat. Nos. 8,794,526; 8,798,367;
  • U.S. Pat. Nos. 8,807,431; 8,807,432;
  • U.S. Pat. Nos. 8,820,630; 8,822,848;
  • U.S. Pat. Nos. 8,824,692; 8,824,696;
  • U.S. Pat. Nos. 8,842,849; 8,844,822;
  • U.S. Pat. Nos. 8,844,823; 8,849,019;
  • U.S. Pat. Nos. 8,851,383; 8,854,633;
  • U.S. Pat. Nos. 8,866,963; 8,868,421;
  • U.S. Pat. Nos. 8,868,519; 8,868,802;
  • U.S. Pat. Nos. 8,868,803; 8,870,074;
  • U.S. Pat. Nos. 8,879,639; 8,880,426;
  • U.S. Pat. Nos. 8,881,983; 8,881,987;
  • U.S. Pat. Nos. 8,903,172; 8,908,995;
  • U.S. Pat. Nos. 8,910,870; 8,910,875;
  • U.S. Pat. Nos. 8,914,290; 8,914,788;
  • U.S. Pat. Nos. 8,915,439; 8,915,444;
  • U.S. Pat. Nos. 8,916,789; 8,918,250;
  • U.S. Pat. Nos. 8,918,564; 8,925,818;
  • U.S. Pat. Nos. 8,939,374; 8,942,480;
  • U.S. Pat. Nos. 8,944,313; 8,944,327;
  • U.S. Pat. Nos. 8,944,332; 8,950,678;
  • U.S. Pat. Nos. 8,967,468; 8,971,346;
  • U.S. Pat. Nos. 8,976,030; 8,976,368;
  • U.S. Pat. Nos. 8,978,981; 8,978,983;
  • U.S. Pat. Nos. 8,978,984; 8,985,456;
  • U.S. Pat. Nos. 8,985,457; 8,985,459;
  • U.S. Pat. Nos. 8,985,461; 8,988,578;
  • U.S. Pat. Nos. 8,988,590; 8,991,704;
  • U.S. Pat. Nos. 8,996,194; 8,996,384;
  • U.S. Pat. Nos. 9,002,641; 9,007,368;
  • U.S. Pat. Nos. 9,010,641; 9,015,513;
  • U.S. Pat. Nos. 9,016,576; 9,022,288;
  • U.S. Pat. Nos. 9,030,964; 9,033,240;
  • U.S. Pat. Nos. 9,033,242; 9,036,054;
  • U.S. Pat. Nos. 9,037,344; 9,038,911;
  • U.S. Pat. Nos. 9,038,915; 9,047,098;
  • U.S. Pat. Nos. 9,047,359; 9,047,420;
  • U.S. Pat. Nos. 9,047,525; 9,047,531;
  • U.S. Pat. Nos. 9,053,055; 9,053,378;
  • U.S. Pat. Nos. 9,053,380; 9,058,526;
  • U.S. Pat. Nos. 9,064,165; 9,064,167;
  • U.S. Pat. Nos. 9,064,168; 9,064,254;
  • U.S. Pat. Nos. 9,066,032; 9,070,032;
  • U.S. Design Pat. No. D716,285;
  • U.S. Design Pat. No. D723,560;
  • U.S. Design Pat. No. D730,357;
  • U.S. Design Pat. No. D730,901;
  • U.S. Design Pat. No. D730,902;
  • U.S. Design Pat. No. D733,112;
  • U.S. Design Pat. No. D734,339;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0265880;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. Patent Application Publication No. 2014/0231500;
  • U.S. Patent Application Publication No. 2014/0232930;
  • U.S. Patent Application Publication No. 2014/0247315;
  • U.S. Patent Application Publication No. 2014/0263493;
  • U.S. Patent Application Publication No. 2014/0263645;
  • U.S. Patent Application Publication No. 2014/0267609;
  • U.S. Patent Application Publication No. 2014/0270196;
  • U.S. Patent Application Publication No. 2014/0270229;
  • U.S. Patent Application Publication No. 2014/0278387;
  • U.S. Patent Application Publication No. 2014/0278391;
  • U.S. Patent Application Publication No. 2014/0282210;
  • U.S. Patent Application Publication No. 2014/0284384;
  • U.S. Patent Application Publication No. 2014/0288933;
  • U.S. Patent Application Publication No. 2014/0297058;
  • U.S. Patent Application Publication No. 2014/0299665;
  • U.S. Patent Application Publication No. 2014/0312121;
  • U.S. Patent Application Publication No. 2014/0319220;
  • U.S. Patent Application Publication No. 2014/0319221;
  • U.S. Patent Application Publication No. 2014/0326787;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0344943;
  • U.S. Patent Application Publication No. 2014/0346233;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0353373;
  • U.S. Patent Application Publication No. 2014/0361073;
  • U.S. Patent Application Publication No. 2014/0361082;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0003673;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0009610;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028102;
  • U.S. Patent Application Publication No. 2015/0028103;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0048168;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053766;
  • U.S. Patent Application Publication No. 2015/0053768;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0060544;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0063676;
  • U.S. Patent Application Publication No. 2015/0069130;
  • U.S. Patent Application Publication No. 2015/0071819;
  • U.S. Patent Application Publication No. 2015/0083800;
  • U.S. Patent Application Publication No. 2015/0086114;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0099557;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0129659;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0136854;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0144701;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0169925;
  • U.S. Patent Application Publication No. 2015/0169929;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178534;
  • U.S. Patent Application Publication No. 2015/0178535;
  • U.S. Patent Application Publication No. 2015/0178536;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0181093;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGUMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A worker resource management system, comprising:
    a voice-directed mobile terminal for facilitating a voice communication between a user and the voice-directed mobile terminal, wherein the user is assigned a picking task in one of multiple warehouse regions and each warehouse region has a delivery vehicle with a scheduled departure time; and
    a computer in communication with the voice-directed mobile terminal, the computer including a worker resource analysis module and a tasking module for transmitting a pick task instruction to the voice-directed mobile terminal, wherein the worker resource analysis module is configured to:
    (i) record the voice communication between the user and the voice-directed mobile terminal with a corresponding timestamp that indicates when the voice communication occurred, based at least in part upon the voice communication received from the voice-directed mobile terminal over a communication link,
    (ii) record user activity information from the voice-directed mobile terminal based on the recorded voice communication at the time the voice communication occurred, wherein the user activity information comprises at least one of a break duration, a user idle time after sign-in, a user idle time before sign-off, a user idle time before beginning break activity, and a user idle time after returning from break activity,
    (iii) identify user productivity patterns for the user based at least in part upon the user activity information,
    (iv) provide an alert corresponding to the user productivity patterns at predefined intervals based on the break duration taken by the user and a break period predefined by the worker resource analysis module, and
    (v) provide a report of a projected departure time of the delivery vehicle from each of the multiple warehouse regions relative to the scheduled departure time based on a function of an amount of work remaining in the warehouse region, number of users operating in the warehouse region, current rate of work being accomplished, and current projected time of work completion.
  • 2. The system of claim 1, comprising a visual display in communication with the computer.
  • 3. The system of claim 2, wherein the visual display provides reports corresponding to the user productivity patterns.
  • 4. The system of claim 2, wherein the visual display provides the alerts corresponding to the user productivity patterns.
  • 5. The system of claim 1, wherein the worker resource analysis module is configured to classify user activity information into groups comprising user workflow tasks, user sign-in activity, user sign-out activity, user break activity, and/or user region changes.
  • 6. The system of claim 1, wherein the user productivity patterns are identified at fixed interval time periods immediately preceding a current identification time.
  • 7. The system of claim 6, wherein the user productivity patterns identified are based on 24-hour time periods immediately preceding the current identification time.
  • 8. The system of claim 6, wherein the user productivity patterns identified are based on five minute time periods immediately preceding the current identification time.
  • 9. The system of claim 6, wherein the user productivity patterns identified are flagged based upon the most recent interval period immediately preceding the current identification time.
  • 10. A worker resource management system, comprising:
    a voice-directed mobile terminal for facilitating a voice communication between a user and the voice-directed mobile terminal, wherein the user is assigned a picking task in one of multiple warehouse regions and each warehouse region has a delivery vehicle with a scheduled departure time; and
    a computer in communication with the voice-directed mobile terminal, the computer including a worker resource analysis module and a tasking module for transmitting a pick instruction to the voice-directed mobile terminal, wherein the worker resource analysis module is configured to:
    (i) record the voice communication between the user and the voice-directed mobile terminal with a corresponding timestamp based at least in part upon the voice communication received from the voice-directed mobile terminal over a communication link,
    (ii) record user activity information from the voice-directed mobile terminal based on recorded voice communication at the time the voice communication occurred, wherein the user activity information comprises at least one of a break duration, a user idle time after sign-in, a user idle time before sign-off, a user idle time before beginning break activity, and a user idle time after returning from break activity,
    (iii) provide work assessment predictions based at least in part upon the user activity information received and recorded,
    (iv) provide an alert corresponding to the work assessment predictions at predefined intervals based on the break duration taken by the user and a break period predefined by the worker resource analysis module, and
    (v) provide a report of a projected departure time of the delivery vehicle from each of the multiple warehouse regions relative to the scheduled departure time based on a function of an amount of work remaining in the warehouse region, number of users operating in the warehouse region, current rate of work being accomplished, and current projected time of work completion.
  • 11. The system of claim 10, comprising a visual display in communication with the computer.
  • 12. The system of claim 11, wherein the visual display provides reports of alerts corresponding to the work assessment predictions.
  • 13. The system of claim 10, wherein the work assessment predictions comprise information that more workers are needed in the warehouse region.
  • 14. The system of claim 10, wherein the work assessment predictions comprise information that the delivery vehicle from the warehouse region will be delayed beyond scheduled departure time.
  • 15. The system of claim 10, wherein the work assessment predictions are based upon a number of work units remaining in the warehouse region, a number of users present in a warehouse region's workforce, and the rate at which work is being completed in the warehouse region.
  • 16. The system of claim 10, wherein the work assessment predictions comprise determining idle time duration around break duration.
  • 17. A method for managing worker resources, comprising:
    transmitting task data from a server computer to a voice-directed mobile terminal in communication with the server computer;
    providing speech-based instructions associated with task data to a user using the voice-directed mobile terminal, wherein the user is assigned a picking task in one of multiple warehouse regions and each warehouse region has a delivery vehicle with a scheduled departure time;
    recording a voice communication between the user and the voice-directed mobile terminal with a corresponding timestamp based at least in part upon the voice communication received from the voice-directed mobile terminal over a communication link;
    recording user activity information from the voice-directed mobile terminal based on recorded voice communication at the time the voice communication occurred, wherein the user activity information comprises at least one of a break duration, a user idle time after sign-in, a user idle time before sign-off, a user idle time before beginning break activity, and a user idle time after returning from break activity;
    analyzing user activity information to
    (i) identify user productivity patterns,
    (ii) provide work assessment predictions;
    (iii) provide an alert corresponding to the work assessment predictions at predefined intervals based on the break duration taken by the user and a predefined break period;
    (iv) provide a report of a projected departure time of the delivery vehicle from each of the multiple warehouse regions relative to the scheduled departure time based on a function of an amount of work remaining in the warehouse region, number of users operating in the warehouse region, current rate of work being accomplished, and current projected time of work completion; and
    implementing worker resource decisions in response to the analysis of the user activity information.
  • 18. The method of claim 17, wherein the worker resource decisions comprise providing a productivity compliance alert to a worker related to break duration compliance based on the corresponding timestamps.
  • 19. The method of claim 17, wherein the worker resource decisions comprise transferring a worker to a second work region from a first work region.
  • 20. The method of claim 17, wherein the worker resource decisions are based upon a number of work units remaining in the warehouse region, a number of users present in a warehouse region's workforce, and the rate at which work is being completed in the warehouse region.
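Claims 1(v), 10(v), and 17 above recite a departure-time report computed as a function of the work remaining in a warehouse region, the number of users operating there, the current rate of work, and the projected time of work completion; claims 1(iv) and 18 recite a break-compliance alert. The following Python sketch illustrates one possible reading of those computations. It is not an implementation from the patent: every name (RegionSnapshot, projected_departure, departure_report, break_alert_due, and their fields) is hypothetical, and the simple rate model (units remaining divided by the aggregate per-user pick rate) is an assumed interpretation of the claimed function.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class RegionSnapshot:
    # Hypothetical per-region state; field names are illustrative, not from the patent.
    region: str
    work_units_remaining: int      # amount of work remaining in the warehouse region
    active_users: int              # number of users operating in the region
    units_per_user_hour: float     # current rate of work being accomplished, per user
    scheduled_departure: datetime  # scheduled departure time of the region's vehicle


def projected_departure(snapshot: RegionSnapshot, now: datetime) -> datetime:
    """Project when the region's remaining work completes, i.e. the earliest
    time its delivery vehicle could depart (assumed reading of claim 1(v))."""
    aggregate_rate = snapshot.active_users * snapshot.units_per_user_hour  # units/hour
    if aggregate_rate <= 0:
        # No pickers active: the projection is unbounded, so report one day out.
        return now + timedelta(days=1)
    hours_remaining = snapshot.work_units_remaining / aggregate_rate
    return now + timedelta(hours=hours_remaining)


def departure_report(snapshots: list[RegionSnapshot], now: datetime) -> list[dict]:
    """One row per warehouse region: projected vs. scheduled departure."""
    rows = []
    for s in snapshots:
        projected = projected_departure(s, now)
        rows.append({
            "region": s.region,
            "scheduled": s.scheduled_departure,
            "projected": projected,
            "delayed": projected > s.scheduled_departure,
        })
    return rows


def break_alert_due(break_duration: timedelta, predefined_break: timedelta) -> bool:
    """Alert when the break actually taken exceeds the predefined break period
    (assumed reading of the alert recited in claims 1(iv) and 18)."""
    return break_duration > predefined_break
```

Under this sketch, a report row with delayed set to True corresponds to the prediction of claim 14 that the delivery vehicle from the warehouse region will be delayed beyond its scheduled departure time, and a region showing a delay could prompt the worker transfer described in claim 19.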
Priority Claims (1)
Number Date Country Kind
2944/DEL/2014 Oct 2014 IN national
US Referenced Citations (499)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8532282 Bracey Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9443123 Hejl Jan 2016 B2
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9261398 Amundsen et al. Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262664 Soule, III et al. Feb 2016 B2
9274806 Barten Mar 2016 B2
9282501 Wang et al. Mar 2016 B2
9292969 Laffargue et al. Mar 2016 B2
9298667 Caballero Mar 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9319548 Showering et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9342827 Smith May 2016 B2
9355294 Smith et al. May 2016 B2
9367722 Xian et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
9396375 Qu et al. Jul 2016 B2
9398008 Todeschini et al. Jul 2016 B2
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9407840 Wang Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418252 Nahill et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9448610 Davis et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9582696 Barber et al. Feb 2017 B2
9616749 Chamberlin Apr 2017 B2
9618993 Murawski et al. Apr 2017 B2
D785636 Oberpriller et al. May 2017 S
9715614 Todeschini et al. Jul 2017 B2
9728188 Rosen Aug 2017 B1
9734493 Gomez et al. Aug 2017 B2
20020129139 Ramesh Sep 2002 A1
20070063048 Havens et al. Mar 2007 A1
20070080930 Logan Apr 2007 A1
20090006164 Kaiser Jan 2009 A1
20090134221 Zhu et al. May 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100265880 Rautiola et al. Oct 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110237287 Klein Sep 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120197678 Ristock Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120253548 Davidson Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130325763 Cantor Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140058801 Deodhar Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267609 Laffargue Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140278391 Braho et al. Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chang et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150178523 Gelay et al. Jun 2015 A1
20150178534 Jovanovski et al. Jun 2015 A1
20150178535 Bremer et al. Jun 2015 A1
20150178536 Hennick et al. Jun 2015 A1
20150178537 El Akel et al. Jun 2015 A1
20150181093 Zhu et al. Jun 2015 A1
20150181109 Gillet et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150236984 Sevier Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150261643 Caballero et al. Sep 2015 A1
20150312780 Wang et al. Oct 2015 A1
20150324623 Powilleit Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160092805 Geisler Mar 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20170200108 Au Jul 2017 A1
20180091654 Miller Mar 2018 A1
20190114572 Gold Apr 2019 A1
20190124388 Schwartz Apr 2019 A1
Foreign Referenced Citations (5)
Number Date Country
3009968 Apr 2016 EP
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (34)
Entry
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augumented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Bandringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012 (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Table Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/676,109 for Indicia Reader; In re: Huck, filed Apr. 1, 2015.
Office Action for European Application No. 15189657.8 dated May 12, 2017, 6 pages.
Search Report and Written Opinion in counterpart European Application No. 15189657.8 dated Feb. 5, 2016, pp. 1-7.
Office Action in related European Application No. 15189657.8 dated May 12, 2017, pp. 1-6 [All references previously cited.].
Annex to the communication dated Jan. 3, 2019 for EP Application No. 15189657.9.
Annex to the communication dated Jul. 6, 2018 for EP Application No. 15189657.9.
Annex to the communication dated Nov. 19, 2018 for EP Application No. 15189657.9.
Decision to Refuse European Application No. 15189657.9, dated Jul. 6, 2018, 2 pages.
Summons to attend Oral Proceedings for European Application No. 15189657.9, dated Jan. 3, 2019, 2 pages.
Related Publications (1)
Number Date Country
20160117627 A1 Apr 2016 US