Systems and methods for scheduling call center agents using quality data and correlation-based discovery

Information

  • Patent Grant
  • Patent Number
    7,864,946
  • Date Filed
    Wednesday, February 22, 2006
  • Date Issued
    Tuesday, January 4, 2011
Abstract
Systems and methods for scheduling call center agents are provided. An exemplary system for scheduling call center agents includes an agent computing device that is capable of obtaining quality scores of agents. The quality score is a measurement of quality that the agents provide to a call center. The agent computing device is capable of transmitting the quality scores of agents over a network. The system further includes a manager computing device that is capable of: receiving the quality scores of agents over the network, receiving a scheduled interval, receiving a quality goal for the scheduled interval, the quality goal being a desired measurement of quality that the agents collectively provide to the call center, determining a quality goal score for the scheduled interval based on the received quality scores of agents, and determining a schedule for the agents based on the quality goal, the quality goal score, and the scheduled interval.
Description
TECHNICAL FIELD

The present invention is generally related to scheduling employees of a company and, more particularly, is related to systems and methods for scheduling call center agents.


BACKGROUND

Existing forecasting and scheduling solutions allow call center managers to forecast workload and schedule the right number of skilled agents at the right times to meet their service goals. However, existing solutions do not factor quality of performance in determining a schedule. Therefore, call center supervisors have had to use inefficient manual processes, including manual scheduling of agents, to ensure that customers receive the desired service quality.


In addition, call centers have access to data about the quality of calls that are serviced. This data is collected based on a sophisticated set of voice recordings, forms filled out by supervisors, and other means for automatically and semi-automatically evaluating how well a call/customer was serviced. This data is kept as call details (e.g., time of day/date, type of call, customer information, etc.) and as agent details (e.g., name of agent, skills, experience, etc.). In addition, call centers can also have access to details regarding the actual schedule of agent operations (e.g., days, dates that agents worked, including their work shifts and break times, agent skills) and the actual record of call statistics overall (e.g., call service levels achieved, queue sizes, abandonment rates—all for various times of days, days of weeks, specific dates, etc.).


SUMMARY OF THE INVENTION

Systems and methods for scheduling call center agents are provided. Briefly described, one embodiment of such a system, among others, comprises an agent computing device that is capable of obtaining quality scores of agents and transmitting the quality scores of agents over a network. The quality score is a measurement of quality that the agents provide to a call center. The system further comprises a manager computing device that is capable of receiving the quality scores of agents over the network, receiving a scheduled interval, and receiving a quality goal for the scheduled interval. The quality goal is a desired measurement of quality that the agents collectively provide to the call center. The manager computing device is further capable of determining a quality goal score for the scheduled interval based on the received quality scores of agents, and determining a schedule for the agents based on the quality goal, the quality goal score, and the scheduled interval.


An embodiment of a method, among others, can be broadly summarized as comprising the steps of: obtaining quality scores of agents; defining a scheduled interval; defining a quality goal for the scheduled interval; determining a quality goal score for the scheduled interval based on the quality scores of the agents; and determining a schedule for the agents based on the quality goal, the quality goal score, and the scheduled interval.


Embodiments of the present invention can also be viewed as providing methods for optimizing call center performance. In this regard, one embodiment of such a method, among others, can be broadly summarized as comprising the steps of: obtaining quality performance of agents data that includes information on the quality of service and quality characteristics of the agent; obtaining call center operations data that includes information on statistics and details of a call center; correlating the quality performance of agents data and the call center operations data; identifying correlation-based discovery; and optimizing call center performance based on the correlation-based discovery.


Other systems, methods, features, and advantages of the present invention will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present invention, and be protected by the accompanying claims.





BRIEF DESCRIPTION OF THE DRAWINGS

Many aspects of the invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.



FIG. 1 is a schematic view of an embodiment of a system in which customers can communicate to call center agents of a company.



FIG. 2 is a block diagram of embodiments of agent and manager computing devices such as shown in FIG. 1.



FIG. 3 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a graphical user interface for supervisors to calculate quality scores of agents.



FIG. 4 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates quality scoring of agents.



FIG. 5 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a graphical user interface for configuring parameters used in calculating a schedule.



FIG. 6 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a Gantt chart having a calculated schedule.



FIG. 7 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that identifies potential problems of a calculated schedule, such as shown in FIG. 6.



FIGS. 8A-B are flow diagrams that illustrate operation of an embodiment of an agent-manager system, such as shown in FIG. 1, in scheduling call center agents.



FIG. 9 is a block diagram that illustrates operation of an embodiment of a performance correlation application, such as shown in FIG. 2.



FIG. 10 is a block diagram that illustrates operation of an embodiment of a performance correlation application, such as shown in FIG. 9.



FIG. 11 is a flow diagram that illustrates operation of an embodiment of an agent scheduling application, such as shown in FIG. 2.



FIGS. 12A-B are flow diagrams that illustrate operation of an embodiment of a manager scheduling application, such as shown in FIG. 2.



FIG. 13 is a block diagram that illustrates operation of an embodiment of a manager scheduling application, such as shown in FIG. 2.





DETAILED DESCRIPTION

Disclosed herein are systems and methods that involve the scheduling of call center agents. In particular, the scheduling of agents can be achieved using a quality goal for a scheduled interval. In some embodiments, a schedule can be generated using statistical correlation of historical data associated with a call center and the agents. In this regard, correlation-based discovery (or patterns) can be analyzed to help supervisors improve the quality of service provided by the agents. For example, the supervisors may be interested in how quality of service (poor versus good quality of service) may be correlated with other contact center statistics. These statistics can include agent skill group, schedule information (time of day, date, etc.), and queue load information, for example.


Exemplary systems are first discussed with reference to the figures. Although these systems are described in detail, they are provided for purposes of illustration only and various modifications are feasible. After the exemplary systems have been described, examples of display diagrams and operations of the systems are provided to explain the manner in which the scheduling of the call center agents and the correlation of historical data can be achieved.


Referring now in more detail to the figures, FIG. 1 is a schematic view of an embodiment of a system in which customers can communicate to call center agents of a company. As indicated in this figure, the system 10 comprises one or more customer premises 13, 16, 19, a network 21 (e.g., PSTN and/or cellular), and a company premises 23, which includes a call center 26. Call center 26 includes an agent-manager system 27 having agent computing devices 29, 31 and 33 that communicate with a manager computing device 39 via a network 36 (e.g., local area network (LAN) or wide area network (WAN) or virtual private network (VPN)). The agent and manager computing devices 29, 31, 33, 39 can, for instance, comprise desktop computers (PCs) or Macintosh computers that can be connected to communication devices, such as headsets, microphones, and headphones, among others. The agent and manager computing devices are further described in relation to FIG. 2. The customer premises 13, 16, 19 include, for example, telephones, cellular phones, and any other communication devices that communicate to the call center 26 via the network 21.



FIG. 2 is a block diagram of embodiments of agent and manager computing devices such as shown in FIG. 1. As indicated in FIG. 2, each agent computing device 29, 31, 33 comprises a processing device 41, memory 43, one or more user interface devices 55, one or more input/output (I/O) devices 57, and one or more networking devices 59, each of which is connected to a local interface 53. The processing device 41 can include any custom made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors associated with the agent computing device 29, 31, 33, a semiconductor-based microprocessor (in the form of a microchip), or a macroprocessor. The memory 43 can include any one or a combination of volatile memory elements (e.g., random access memory (RAM), such as DRAM, SRAM, etc.) and non-volatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.).


The one or more user interface devices 55 comprise elements with which the user (e.g., agent) can interact with the agent computing device 29, 31, 33. Where the agent computing device 29, 31, 33 comprises a personal computer (e.g., desktop or laptop computer) or similar device, these components can comprise those typically used in conjunction with a PC, such as a keyboard and mouse.


The one or more I/O devices 57 comprise components used to facilitate connection of the agent computing device to other devices and therefore, for instance, comprise one or more serial, parallel, small computer system interface (SCSI), universal serial bus (USB), or IEEE 1394 (e.g., Firewire™) connection elements. The networking devices 59 comprise the various components used to transmit and/or receive data over the network 36, where provided. By way of example, the networking devices 59 include a device that can communicate both inputs and outputs, for instance, a modulator/demodulator (e.g., modem), a radio frequency (RF) or infrared (IR) transceiver, a telephonic interface, a bridge, and a router, as well as a network card, etc.


Memory 43 comprises various programs (in software and/or firmware), including an operating system (O/S) 46 and an agent scheduling application 49. The O/S 46 controls the execution of programs, including the agent scheduling application 49, and provides scheduling, input-output control, file and data management, memory management, communication control and related services. The agent scheduling application 49 facilitates the process for transmitting quality scores of agents. Typically, the process involves receiving evaluations corresponding to the agents and transmitting the evaluations via the network 36, and receiving the schedule of the agents based on configured quality goals and calculated correlation-based discovery. Operation of the agent scheduling application 49 is described in relation to FIGS. 8A-B and 11.


The architecture of the manager computing device 39 is similar to the architecture of the agent computing device 29, 31, 33 described above. That is, the manager computing device 39 includes a processing device 61, memory 63, one or more user interface devices 75, one or more I/O devices 77, and one or more networking devices 79, each of which is connected to a local interface 73. The user interface device 75 is connected to a display device 76, which displays a display diagram 78. Examples of the display diagrams 78 are further described in relation to FIGS. 3-7.


The memory 63 includes agent schedule database 62, quality performance database 64, call center operations database 65, quality score database 67, manager scheduling application 69, and performance correlation application 70. Typically, the agent schedule database 62 includes, but is not limited to, information on shift details and shift-related information of the agents; the quality performance database 64 includes, but is not limited to, information on the quality of service and quality characteristics of the agents; the call center operations database 65 includes, but is not limited to, information on statistics and details of a call center; and the quality score database 67 includes, but is not limited to, the quality scores of the agents.


The performance correlation application 70 provides statistical correlation of historical data associated with a call center and the agents. Correlation-based discovery (or patterns) is then analyzed to help supervisors improve the quality of service provided by the agents. Operation of the performance correlation application 70 is described in relation to FIGS. 8A-B, 9, and 10. The manager scheduling application 69 provides a schedule for the call center agents based on configured quality goals, calculated quality scores of the agents, and calculated correlation-based discovery. Operation of the manager scheduling application 69 is described in relation to FIGS. 8A-B, 12A-B, and 13.



FIG. 3 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a graphical user interface for supervisors to calculate quality scores of agents. By activating the manager scheduling application 69, a display diagram 78a entitled “Quality and Scheduling” is displayed on the display device 76. The diagram includes file menus 81 that include Agents 93, Schedule 105, and Scorecard 83. By selecting the file menu Scorecard 83, option “Quality Score Calculator” 86 appears.


By selecting option “Quality Score Calculator” 86, display diagram 78b entitled “Configure: Quality Score Calculator” is displayed. This diagram is used for configuring the calculation of quality scores of the agents. The Quality Score Calculator display diagram 78b includes a “Look Back Days” option 89 and a “Quality Score Calculation Mode” option 91. The look back days option 89 enables the user to calculate quality scores of the agents over a number of past days, for example, the past seven days. In other embodiments, the quality score can be calculated over past hours, weeks, a specific queue of the day, a specific day of the week, holidays, and events, for example.


In this example, each agent is evaluated on a daily basis based on evaluations and is given a quality score for each day. The quality score is stored in the quality score database 67 of the manager computing device 39. The agent could also be evaluated on different types of work or evaluated on one type of work performed in multiple intervals. Hence, an agent can receive a single quality score for one type of work or multiple quality scores for different types of work. For example, an agent might provide high quality when interacting during English language calls, but low quality when interacting during Spanish language calls.


The quality score calculation mode 91 enables the user to configure a mode of calculating the quality scores of the agents for the number of days entered in the “look back days” option 89. For example, the mode of calculation can be the average score of the agents during the previous 7-day interval. Other modes of calculation include, but are not limited to, the median score, lowest score, and highest score, for example.
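For illustration only, the look-back calculation described above could be implemented as in the following sketch; the function name and data layout are assumptions and are not taken from the patent text.

```python
from datetime import date, timedelta
from statistics import mean, median

def quality_score(daily_scores, look_back_days=7, mode="average", today=None):
    """Combine an agent's daily evaluation scores over a look-back window.

    daily_scores: dict mapping date -> that day's quality score.
    mode: "average", "median", "lowest", or "highest".
    """
    today = today or date.today()
    window_start = today - timedelta(days=look_back_days)
    scores = [s for d, s in daily_scores.items() if window_start <= d < today]
    if not scores:
        return None  # no evaluations in the look-back window
    if mode == "average":
        return mean(scores)
    if mode == "median":
        return median(scores)
    if mode == "lowest":
        return min(scores)
    if mode == "highest":
        return max(scores)
    raise ValueError(f"unknown calculation mode: {mode}")
```

A per-queue or per-work-type variant could keep separate daily_scores dictionaries for each type of work, matching the observation above that an agent may receive multiple quality scores.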



FIG. 4 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates quality scoring of agents. By selecting agent score option 94 of the file menu Agents 93 on display diagram 78c, display diagram 78d appears. In particular, diagram 78d includes first name 96, middle initial 99, last name 101, and quality score 103.



FIG. 5 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a graphical user interface for configuring parameters used in calculating a schedule. By selecting goals option 106 of the file menu schedule 105 on display diagram 78e, display diagram 78f appears on the display device 76. The diagram 78f includes schedule option 107 for scheduling agents on specified intervals. The intervals include a start date 108 and end date 110, such as Jul. 4, 2005 at 5:00 am and Jul. 11, 2005 at 5:00 pm, respectively. A supervisor can select a “Visualization of Contact Center Performance” option 112 for identifying patterns that potentially indicate why and when certain poor performance occurs repeatedly. The patterns can be correlated with exogenous events such as high average handling time (AHT) or long queue times. The operation of the visualization of contact center performance is further described in relation to FIGS. 8A-B, 9, and 10.


In service goal section 109, a “Make Goal Constant” option 113 enables the supervisor to set the same goal for the specified intervals being scheduled. A service level option 116 enables the supervisor to configure a level of service that the agents should be providing during the specified intervals. For example, the supervisor can specify that the agents should answer 80% (option 119) of the incoming calls within 30 seconds (option 121) or that each agent should answer a call on average (option 123) within, for example, two seconds of an incoming call (option 126).


A quality goal section 111 enables a user to schedule, for example, 2 agents (option 131) or a percentage of the agents (option 133) having at least a quality score of 80 (option 136) during the past working 7-day period. As mentioned above, the quality score is calculated based on evaluations of performance given to each agent. In this example, the quality score indicates the quality of service that the agents have demonstrated during the past 7-day period. The display diagram 78f further includes action features 139 such as “Ok”, “Cancel”, “Apply”, and “Help”.
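The parameters configured through options 107-136 could be captured in a simple structure such as the following sketch; the class and field names are assumptions chosen only to mirror the options described above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ServiceGoal:
    percent_answered: float   # e.g., 80 (% of incoming calls)
    within_seconds: int       # e.g., 30 seconds

@dataclass
class QualityGoal:
    min_quality_score: int                 # e.g., 80
    min_agents: Optional[int] = None       # "minimum agents with quality" form
    min_percent: Optional[float] = None    # "percent agents with quality" form

@dataclass
class ScheduleParameters:
    start: datetime           # e.g., Jul 4, 2005 at 5:00 am
    end: datetime             # e.g., Jul 11, 2005 at 5:00 pm
    service_goal: ServiceGoal
    quality_goal: QualityGoal
```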


After the supervisor completes configuring the parameters used in calculating a schedule, the user can select the “Ok” button of the action features 139 to start calculating the schedule for the agents. The operation of scheduling agents based on quality data and correlation-based discovery is further described in relation to FIGS. 8A-B, 12A-B, and 13.



FIG. 6 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that illustrates a Gantt chart having a calculated schedule. The display diagram 78h can appear by selecting the “Ok” button 139 in FIG. 5 or by selecting calendar option 114 of the file menu schedule 105 on display diagram 78g in FIG. 6, for example. The Gantt chart 78h includes staffing differentials 118, staffing with quality 120, service level 122, number of agents scheduled 124, date and time 125, and list of employees 127.


The Gantt chart 78h provides various activities for each employee, which can be color coded. For example, employee 10 should come into work at 5:30 am and begin answering calls (130), get a break (132) from 8:00-8:30 am, answer calls (130) from 8:30-10:30 am, and should attend a manager meeting (134) from 10:30-11:00 am. There is no intention to limit the invention to this particular type of activity. Other types of activities may include training and installation of an application, for example. The staffing with quality section 120 provides a total number of agents staffed for a 15-minute interval who are meeting their quality goal. Exclamation marks 135 next to the employees 10 and 7 indicate that they have not met their quality goal. The supervisor can click on an exclamation mark for more information.



FIG. 7 is an exemplary display diagram from the display device of a manager computing device, such as shown in FIG. 2, that identifies potential problems of the calculated schedule such as shown in FIG. 6. By selecting the “Ok” button 139 in FIG. 5 or selecting calendar option 114 of the file menu schedule 105 on display diagram 78g in FIG. 6, for example, the display diagram 78j within display diagram 78i shows a schedule having identified potential problems with the calculated schedule. For example, schedule issues folder 143 is expanded and includes general issues folder 144, assignment rule issues folder 156, queue issues folder 159, shift issues folder 161, event issues folder 163, and visualization issues folder 166.


The general issues folder 144 can include, for example, problem 146 indicating that only three agents are scheduled on Jul. 4, 2005 at 7:30 am, but that a corresponding minimum quality agents constraint requires 4 agents. Additionally, problem 149 indicates that only two agents are scheduled on Jul. 5, 2005 at 7:00 am, but the minimum quality agents constraint requires 4 agents; problem 151 indicates that only one agent is scheduled on Jul. 7, 2005 at 4:45 pm, but the minimum quality agents constraint requires 2 agents; and problem 153 indicates that no agents are scheduled to work between Jul. 10, 2005 at 3:00 am and Jul. 11, 2005 at 5:00 am.


The visualization issues folder 166 can include, for example: discovery (or pattern) 167 that indicates employees 9 and 10 repeatedly perform poorly half an hour before and after lunch; pattern 168 that indicates employee 1 repeatedly performs poorly from 5:00 am to 9:00 am on the 4th of July and performs very well afterwards; and pattern 169 that indicates employee 7 repeatedly performs poorly from 5:00 am to 7:30 am in the mornings. Patterns 167, 168, 169 can affect the calculated schedule. By providing such patterns, the supervisor can manually change the calculated schedule to improve the performance of the agents. Additionally or alternatively, a schedule can be automatically or semi-automatically calculated using such patterns.



FIGS. 8A-B are flow diagrams that illustrate operation of an embodiment of an agent-manager system, such as shown in FIG. 1, in scheduling call center agents. With the agent-manager system 27, a schedule can be automatically calculated based on the quality goals, quality scores of agents, and correlation-based discovery. Beginning with block 171, a schedule is executed at a call center and in block 173, call center operation data and agent schedule data associated with the schedule are identified. In block 175, quality performance of agents that worked during the scheduled interval is calculated. It should be noted that the identified call center operation data, agent schedule data, and calculated quality performance of agents data are stored in memory 63 and can be accumulated as more schedules are calculated and executed. In block 177, a performance correlation application 70 is executed to correlate at least two of the following stored data: quality performance of agents data, agent schedule data, and call center operation data. The operation of the performance correlation application 70 is further described in relation to FIGS. 9 and 10.


In block 179, the agent-manager system 27 determines quality scores of the agents based on the past working interval. In block 183, an interval is defined for another schedule, and in block 185, a quality goal can be defined for the call center performance. It should be noted that a quality goal can be defined for agents based on the correlation-based discovery calculated by the performance correlation application 70. For example, to assist a supervisor in defining a quality goal (and/or a service goal) of the call center during the Christmas season, the supervisor may access historical patterns on the performance of the call center during the last two Christmas seasons. The performance during the last two Christmas seasons may influence the supervisor as the levels of quality goal and service goal are configured.


In block 187, the quality goal score for each schedule of the agents is calculated based on the quality scores of the agents. In block 189, the schedule is calculated based on the defined quality goal, the calculated quality goal score, and the correlation-based discovery. As mentioned above, the supervisor can manually change the calculated schedule to improve the performance of the agents based on the correlation-based discovery. Additionally or alternatively, a schedule can be automatically or semi-automatically calculated using the correlation-based discovery that relates to the performance of the agents and the scheduled interval. In block 190, the flow diagram returns to block 171 where the schedule is executed.



FIG. 9 is a block diagram that illustrates operation of an embodiment of a performance correlation application, such as shown in FIG. 2. Agent schedule database 62 includes, but is not limited to, parameters such as skills, day, date, breaks, meetings, and training. Quality performance database 64 includes, but is not limited to, parameters such as call quality scores and agent name. Call center operation database 65 includes, but is not limited to, parameters such as call identification, call type, agent name, call stats (e.g., time/date percentage of calls answered, average handling times, call volumes, wait times, abandonment rates), and temporal details (e.g., events occurring within a certain time period of each other).


The parameters obtained from the databases 62, 64, 65 are inputted into the performance correlation application 70, which identifies patterns (or correlation-based discovery) that show why and/or when certain poor performance occurs repeatedly and how it is correlated with exogenous events such as high AHT or long queue times. The identified patterns are communicated to the I/O interface, such as the display device 76 shown in FIG. 2, or to a manager scheduling application 69 such as shown in FIG. 2. The operation of the performance correlation application 70 involves visualization of contact center performance by way of pre-computed correlations between contact center, agent schedule, and agent performance parameters. The performance correlation application can provide n-way correlations between poor/good quality of agent measurements and other call center details. Thus, the performance correlation application can provide a statistical examination of time-indexed data to find correlations between the quality of agents measured historically and other call center details.



FIG. 10 is a block diagram that illustrates operation of an embodiment of a performance correlation application, such as shown in FIGS. 2 and 9. The performance correlation application 70 identifies statistically valid correlations between the quality of agents and other call data. In block 190, parameters from the quality performance database, call center operations database, and agent schedule database are inputted into the performance correlation application. In block 192, at least two parameters are inputted into the performance correlation application 70, such as the call quality of an agent and other parameters related to the call center and agents. In block 194, the inputted parameters can be pre-process filtered to remove outlier data (for example, removal of all data that is greater than two standard deviations away from the mean/median). In an alternative embodiment, the pre-process filtering can allow selective user-induced filtering on other parameters (for example, looking only at data relating to a particular agent).
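A minimal sketch of the pre-process filtering in block 194, assuming the paired data arrives as (quality, X) tuples and that outliers are judged against the mean of the quality axis; these assumptions are illustrative only.

```python
from statistics import mean, stdev

def prefilter_outliers(pairs, num_std=2.0):
    """Drop (quality, x) pairs whose quality value lies more than
    num_std standard deviations from the mean of the quality values."""
    qualities = [q for q, _ in pairs]
    if len(qualities) < 2:
        return list(pairs)          # not enough data to estimate spread
    m, s = mean(qualities), stdev(qualities)
    if s == 0:
        return list(pairs)          # all values identical; nothing to drop
    return [(q, x) for q, x in pairs if abs(q - m) <= num_std * s]
```

User-induced filtering on other parameters (for example, restricting the pairs to a particular agent) would simply be an additional predicate applied before this step.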


In block 195, the filtered data represents two axes of information, for example, call quality and some second axis (X). This two-axis data is inputted into a Pearson r linear correlation formula, which computes the value of r, known as the correlation coefficient. The correlation coefficient ranges from −1 (perfect negative correlation) to 0 (no correlation) to 1 (perfect positive correlation). In addition, statistical significance p (e.g., p<0.050) is computed based on the total amount of data used to compute r and on the probability of obtaining the value of r from random data. This significance value p, the total data set size N, and the correlation coefficient r constitute the outputs of the correlation engine, as shown in block 197. In block 198, a user can adjust the thresholds on the desired levels of r magnitude, significance value p, data set size minima N, and other filters, such as filtering to a single agent or a set of agents, for data to be used in the correlation analyses. In addition, a user can also turn the pre-process filter on or off and input other combinations of paired values.
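The computation in blocks 195-198 can be sketched as follows, here using SciPy's pearsonr routine to obtain the correlation coefficient r and the two-sided significance value p; the threshold defaults are illustrative, not values prescribed by the patent.

```python
from scipy.stats import pearsonr

def correlate(quality, x, r_threshold=0.5, p_threshold=0.05, min_n=30):
    """Correlate call-quality values with a second axis X and apply
    user-adjustable thresholds on |r|, p, and the data set size N."""
    n = len(quality)                      # total data set size N
    r, p = pearsonr(quality, x)           # r in [-1, 1], two-sided p-value
    significant = abs(r) >= r_threshold and p <= p_threshold and n >= min_n
    return {"r": r, "p": p, "N": n, "significant": significant}
```

The caller can first pass the raw pairs through prefilter_outliers (above) or skip it, mirroring the option to turn the pre-process filter on or off.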


In some embodiments, the operation of the performance correlation application can be described as a method for taking an exemplary quality of service measurement of a specific call or interval, and mining the exemplar to indicate other call episodes that are statistically correlated with the exemplar in terms of quality of service and in terms of one or more of the set X details. For example, a manager discovers a poor call of Joe's on Wednesday afternoon and requests a correlation between Joe's poor calls across a range of days. Based on the results of the correlation, the manager finds a list of poor calls by Joe on Wednesdays. The list indicates to the manager that Joe consistently does poorly on Wednesdays.


Another correlation computational method is clustering, which takes all evaluated and unevaluated calls and clusters them, or partitions them into sets, based on one or more parameters that are statistically correlated with quality of service. Each “cluster” can be visualized for the user, preferably annotating which calls are evaluated and which are unevaluated. Yet another correlation computational method is statistical trend analysis, which looks for temporal trends in correlated sets of calls to show that quality of service is increasing or decreasing in a statistically significant manner over time.
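One possible sketch of the statistical trend analysis fits a line to quality scores over time and reports a trend only when the slope is statistically significant; the use of SciPy's linregress and the specific threshold are assumptions for illustration.

```python
from scipy.stats import linregress

def quality_trend(time_values, scores, p_threshold=0.05):
    """Detect a temporal trend in quality scores.

    time_values: numeric time axis (e.g., days since the start of the period).
    scores: quality scores observed at those times.
    """
    result = linregress(time_values, scores)   # slope, intercept, rvalue, pvalue, stderr
    if result.pvalue > p_threshold:
        return "no statistically significant trend"
    return "increasing" if result.slope > 0 else "decreasing"
```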



FIG. 11 is a flow diagram that illustrates operation of an embodiment of an agent scheduling application, such as shown in FIG. 2. The agent scheduling application 49 stores evaluations for agents, as indicated in block 191, and receives a request for information to transmit evaluations over a network, as indicated in block 193. Based on the requested information, the agent scheduling application 49 determines the type of evaluations to be transmitted over the network, as indicated in block 196. The agent scheduling application 49 transmits the determined evaluations over the network, as indicated in block 199.



FIGS. 12A-B are flow diagrams that illustrate operation of an embodiment of a manager scheduling application, such as shown in FIG. 2. Beginning with block 201, the manager scheduling application 69 transmits request information for evaluations of agents over a network, and receives evaluations of agents over the network based on the request information, as indicated in block 203. In block 205, quality scores are determined for the agents based on the received evaluations. In block 207, the manager scheduling application 69 receives defined intervals for scheduling the agents, a defined service goal, and a defined quality goal.


In block 209, the manager scheduling application 69 determines a quality goal score for the defined interval based on the determined quality scores of the agents. In block 210, correlation-based discovery from the performance correlation application can be displayed so that a user can view and manually change the schedule based on the correlation-based discovery. In an alternative embodiment, the correlation-based discovery is inputted into the manager scheduling application to assist in determining the schedule. One or more schedules are determined for scheduling the agents, as indicated in block 211, based on the quality goal score of the schedules, the received quality goal for the scheduled interval, and the correlation-based discovery. In block 213, the one or more schedules are displayed for the manager on a display device 76. In block 216, the one or more schedules are transmitted to the agent computing device over the network.



FIG. 13 is a block diagram that illustrates operation of an embodiment of a manager scheduling application, such as shown in FIG. 2. Scheduled interval 107, quality goal 111, and quality scores from database 67 are inputted into the Required Agents Relaxation Algorithm 223, which outputs the required quality staffing 226 for a scheduled interval. The Quality Goal Score Formula 229 and the Required Agents Relaxation Algorithm 223 produce a quality goal score 231 for each schedule and send the quality goal score to the Search Engine 233. The Search Engine 233 is able to generate numerous schedules 236, 239, evaluate each schedule, and then return the best schedule generated 241. In this case, one criterion of a schedule is meeting the quality goal. Each schedule that is generated and evaluated should have such a quality goal score; the schedule with a quality goal score equal to or greater than the defined quality goal should be returned as the “optimal schedule” 241.
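The selection step performed by the Search Engine 233 can be sketched as follows, where the scoring functions are assumed to be supplied by the Quality Goal Score Formula 229 and by the engine's overall schedule evaluation; the function names are illustrative only.

```python
def best_schedule(candidates, quality_goal_score, overall_score, quality_goal):
    """Among candidate schedules, return the best-scoring one whose
    quality goal score is at least the defined quality goal."""
    meeting_goal = [s for s in candidates if quality_goal_score(s) >= quality_goal]
    pool = meeting_goal or candidates   # fall back if no candidate meets the goal
    return max(pool, key=overall_score)
```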


There are two types of quality goals that are handled differently by the manager scheduling application. The first type of quality goal can be referred to as minimum agents with quality. This type of goal specifies that at least X agents are present in every interval with a quality score of at least Y. The second type of quality goal can be referred to as percent agents with quality. This type of goal specifies that at least X percent of the scheduled agents in every interval have a quality score of at least Y.
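The two goal types can be checked per interval as in the following sketch, assuming the staffing and quality staffing counts described in the following paragraphs have already been computed; the function names are assumptions for illustration.

```python
def meets_minimum_agents_goal(quality_staffing, min_agents):
    """Minimum agents with quality: at least X agents in the interval
    have a quality score of at least Y (already counted in quality_staffing)."""
    return quality_staffing >= min_agents

def meets_percent_agents_goal(quality_staffing, staffing, min_percent):
    """Percent agents with quality: at least X percent of the scheduled
    agents in the interval have a quality score of at least Y."""
    if staffing == 0:
        return False
    return 100.0 * quality_staffing / staffing >= min_percent
```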


In the scheduled interval, staffing 236 can be computed for every potential schedule by summing the total number of agents skilled and scheduled to be working on a particular queue in that interval. If an agent is skilled to work on multiple queues in the scheduled interval, the agent should be counted as a full agent towards the staffing number on each queue. For example, five agents are skilled and scheduled to work on the customer service queue for the interval of 8:00 am-8:15 am. Seven agents are skilled and scheduled to work on the sales queue. Three agents are skilled and scheduled to work on both queues. This should result in a total of eight agents and ten agents qualified to be working on the customer service queue and sales queue, respectively.


Next, quality staffing 226 can be computed for the scheduled interval via a similar method. The quality staffing 226 is the total number of agents skilled and scheduled to be working on a particular queue in that interval who have a quality score of at least Y, as specified in the goal. If an agent is skilled to work on multiple queues in an interval, the agent should be counted as a full agent towards the quality staffing number on each queue where the agent meets the quality goal. For example, one of the five agents skilled to work the customer service queue has a quality score of at least 75. Two of the seven agents skilled to work on the sales queue have a quality score of at least 85. Of the three agents skilled to work both queues, one meets the quality score for customer service, and one meets the quality score for sales. This should result in a total of two agents and three agents qualified to be working on the customer service queue and sales queue, respectively.
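The staffing and quality staffing counts of the worked example above might be computed as in the following sketch; the agent data layout (a list of records carrying each agent's queues and per-queue quality scores, consistent with the earlier note that an agent may have different scores for different types of work) is an assumption for illustration.

```python
from collections import defaultdict

def staffing_per_queue(agents):
    """Staffing: every agent skilled and scheduled on a queue counts as a
    full agent on that queue, even when skilled on several queues."""
    staffing = defaultdict(int)
    for agent in agents:
        for queue in agent["queues"]:
            staffing[queue] += 1
    return dict(staffing)

def quality_staffing_per_queue(agents, min_score_by_queue):
    """Quality staffing: same count, restricted to agents whose quality
    score for that queue meets the minimum specified in the goal."""
    staffing = defaultdict(int)
    for agent in agents:
        for queue in agent["queues"]:
            score = agent["scores"].get(queue)
            if score is not None and score >= min_score_by_queue.get(queue, 0):
                staffing[queue] += 1
    return dict(staffing)
```

With the five customer service agents, seven sales agents, and three dual-skilled agents of the example, staffing_per_queue would return eight for the customer service queue and ten for the sales queue.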


Alternatively, a manager may care about the portion of time that agents actually spend on the queue instead of just the number of agents skilled to work that queue. In that case, the staffing 236 and quality staffing 226 can be computed differently. The staffing 236 can be determined as the equivalent number of dedicated agents needed to provide the same service level as the current collection of multi-skilled agents on the queue. The quality staffing 226 can be determined as the sum of the contributions to the queue of each of the agents skilled to work that queue.


The Required Agents Relaxation Algorithm 223 computes the required agents 226 with quality for the scheduled interval. In the case of the minimum agents goal, this calculation simply returns the at least X agents specified in the quality goal. For example, the quality goal for the customer service queue specifies at least three agents with a quality score of 75. Therefore, the required agents should be three for the scheduled interval.


The Quality Goal Score Formula 229 uses the required quality staffing value 226 and the quality staffing value 239 for each interval to compute the quality goal score 231, which reflects how closely this potential schedule meets the quality goal. If the quality staffing 239 is greater than or equal to the required quality staffing 226, the goal is clearly met. If the quality staffing 239 is less than the required quality staffing 226, there is a deviation from the goal.
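The patent text does not give the exact formula, but one plausible sketch of a quality goal score that is highest when every interval meets its required quality staffing and decreases with the size of the shortfall is shown below; the shape of the formula is an assumption for illustration.

```python
def quality_goal_score(quality_staffing_by_interval, required_by_interval):
    """Return 1.0 when every interval has at least its required quality
    staffing; each interval's proportional shortfall lowers the score."""
    deviations = []
    for interval, required in required_by_interval.items():
        if required == 0:
            deviations.append(0.0)   # relaxed or trivially met interval
            continue
        actual = quality_staffing_by_interval.get(interval, 0)
        shortfall = max(0, required - actual)
        deviations.append(shortfall / required)
    if not deviations:
        return 1.0
    return 1.0 - sum(deviations) / len(deviations)
```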


In the case of the Percent Agents goal, the algorithm can be more complex, such as by using the concept of relaxation and multiple passes to avoid over-constraining the search. Relaxation is a heuristic search technique that employs solving a simplified version of a problem as an intermediate step to solving the complete problem. In this case, early in the search, the quality goal is relaxed so the Required Agents Relaxation Algorithm 223 returns 0. This allows the Search Engine 233 to explore schedules that staff the call center optimally and meet service goals without restricting the Search Engine 233 to evaluating only schedules that meet the quality goal.


Then, at the beginning of a later pass, the Search Engine 233 sets the Required Agents Relaxation Algorithm 223 to compute the required quality staffing 226. At this point, the algorithm computes the required quality staffing 226 needed to meet the quality goal and caches it. The Search Engine 233 then instructs the Algorithm 223 to disable relaxation so that the cached value is returned instead of 0. The Search Engine 233 should also set the Algorithm 223 to re-compute the required quality staffing 226 before each subsequent pass. For example, if ten staffing 236 and three quality staffing 239 are scheduled to work on the sales queue for the interval of 8:00 AM-8:15 AM, the Required Agents Relaxation Algorithm 223 returns zero required quality staffing 226 while relaxation is enabled. Once relaxation is disabled, if the quality goal is twenty percent (20%) of staffing, the required quality staffing is two, which is 20% of the ten staffing 236. Once the required agents 226 are computed for the percent agents goal, the values are used by the Quality Goal Score Formula 229 just as in the minimum agents goal method stated above.
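The relaxation behaviour described above can be sketched as a small class; the method names and the rounding of the percentage requirement are assumptions for illustration.

```python
class RequiredAgentsRelaxation:
    """While relaxed, report a required quality staffing of 0 so the search
    is not over-constrained; once relaxation is disabled, return the cached
    requirement computed from the percent-agents quality goal."""

    def __init__(self, quality_goal_percent):
        self.quality_goal_percent = quality_goal_percent
        self.relaxed = True
        self._cached = {}

    def compute_requirements(self, staffing_by_interval):
        # Called at the start of a later pass: cache the percentage of the
        # interval's staffing that must meet the quality score.
        self._cached = {
            interval: round(staffing * self.quality_goal_percent / 100.0)
            for interval, staffing in staffing_by_interval.items()
        }

    def required_quality_staffing(self, interval):
        if self.relaxed:
            return 0
        return self._cached.get(interval, 0)
```

In the example above, compute_requirements caches a requirement of two for the 8:00 AM-8:15 AM interval (20% of ten staffing), while required_quality_staffing keeps returning zero for as long as relaxation remains enabled.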


Finally, the application displays status messages indicating when the goal is met and when it is not met, and displays the staffing for each 15-minute interval that meets the quality goal. This allows users to validate that their goals are being met by the created schedule.


It should be noted that in the foregoing description, any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.


It should be emphasized that the above-described embodiments of the present invention are merely possible examples of implementations, set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiments of the invention without departing substantially from the spirit and principles of the invention. All such modifications and variations are intended to be included herein within the scope of this disclosure and the present invention and protected by the following claims.

Claims
  • 1. A method for scheduling call center agents, comprising the steps of: obtaining quality scores of agents, the quality scores being a measurement of quality that each agent provides to a call center; obtaining quality performance of agents data that includes information on a quality of service and quality characteristics of the agent; obtaining call center operations data that includes information on statistics and details of a call center; defining a scheduled interval; defining a quality goal for the scheduled interval, the quality goal being a desired measurement of quality that the agents collectively provide to the call center; determining a quality goal score for the scheduled interval based on the quality scores of the agents; discovering a correlation between the quality performance of agents data and the call center operations data, wherein the correlation is computed using one or a combination of the following methods: Pearson linear correlation, clustering computing method, and statistical trend analysis, wherein the computation using Pearson linear correlation outputs correlation coefficient r and statistical measurement p with data set size N; and determining a schedule for the agents based on the correlation, the quality goal, the quality goal score, and the scheduled interval.
  • 2. The method as defined in claim 1, further comprising receiving evaluations of the agents.
  • 3. The method as defined in claim 2, further comprising determining the quality scores of the agents based on the received evaluations of the agents.
  • 4. The method as defined in claim 1, wherein the step of defining the quality goal includes defining a minimum quality score, and defining either a number of minimum agents having the minimum quality score or a percentage of agents in the scheduled interval having the minimum quality score.
  • 5. The method as defined in claim 4, further comprising determining whether each of the agents is skilled to work on multiple queues in the scheduled interval.
  • 6. The method as defined in claim 5, further comprising determining the total number or the percentage of agents skilled and available to be working on a particular queue in the scheduling interval who have the minimum quality score.
  • 7. The method as defined in claim 6, wherein the step of determining the sum of contributions to the queue for each of the agents includes the portion of time that the agents spend on the queue.
  • 8. The method as defined in claim 6, wherein the step of determining the percentage of agents includes a relaxation and multiple passes technique.
  • 9. The method as defined in claim 5, further comprising determining the sum of contributions to the queue for each of the agents.
  • 10. A system for scheduling call center agents, comprising: an agent computing device that is capable of obtaining quality scores of agents, the quality scores being a measurement of quality that the agents provide to a call center, the agent computing device transmitting the quality scores of agents over a network; and
  • 11. The system as defined in claim 10, wherein the manager computing device receives evaluations of the agents over the network from the agent computing device.
  • 12. The system as defined in claim 11, wherein the manager computing device determines the quality scores of the agents based on the received evaluations of the agents.
  • 13. The system as defined in claim 10, wherein the quality goal includes a minimum quality score, and either a number of minimum agents having the minimum quality score or a percentage of agents in the scheduled interval having the minimum quality score.
  • 14. The system as defined in claim 13, wherein the manager computing device is further operative to determining whether each of the agents are skilled to work on multiple queues in the scheduled interval based on the received quality scores of the agents.
  • 15. The system as defined in claim 14, wherein the manager computing device that is further operative to determining the total number or the percentage of agents skilled and available to be working on a particular queue in the scheduled interval who have the minimum quality score.
  • 16. The system as defined in claim 15, wherein determining the sum of contributions to the queue for each of the agents includes the portion of time that the agents spend on the queue.
  • 17. The system as defined in claim 15, wherein determining the percentage of agents includes a relaxation and multiple passes technique.
  • 18. The system as defined in claim 14, wherein the manager computing device is further operative to determining the sum of contributions to the queue for each of the agents.
  • 19. A system for scheduling call center agents, comprising: means for obtaining quality scores of agents, the quality scores being a measurement of quality that the agents provide to a call center; means for obtaining quality performance of agents data that includes information on a quality of service and quality characteristics of the agent; means for obtaining call center operations data that includes information on statistics and details of a call center; means for defining a scheduled interval; means for defining a quality goal for the scheduled interval, the quality goal being a desired measurement of quality that the agents collectively provide to the call center; means for determining a quality goal score for the scheduled interval based on the quality scores of the agents; means for discovering a correlation between the quality performance of agents data and the call center operations data, wherein the correlation is computed using one or a combination of the following methods: Pearson linear correlation, clustering computing method, and statistical trend analysis, wherein the computation using Pearson linear correlation outputs correlation coefficient r and statistical measurement p with data set size N; and means for determining a schedule for the agents based on the correlation, the quality goal, the quality goal score, and the scheduled interval.
  • 20. The system as defined in claim 19, wherein the quality goal includes a minimum quality score, and either a number of minimum agents having the minimum quality score or a percentage of agents in the scheduled interval having the minimum quality score.