System and method for integrated workforce and analytics

Information

  • Patent Grant
  • Patent Number
    8,396,732
  • Date Filed
    Wednesday, May 2, 2007
  • Date Issued
    Tuesday, March 12, 2013
Abstract
Systems and methods of integrating workforce management and contacts analysis are disclosed. An exemplary method comprises receiving content data derived from classification of a plurality of recorded agent contacts. The contact content data is correlated with past time periods. The method also comprises identifying at least one pattern in the contact content data. The pattern is based on the contact classifications. The method also comprises receiving historical workload data from a contact router, and generating a workload forecast based on the historical workload data and the identified pattern.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to call centers.


BACKGROUND

The business of a contact center is to provide rapid and efficient interaction between agents and customers (or prospective customers). Existing solutions require the purchase of multiple hardware and software components, typically from different vendors, to achieve the business goals of the contact center. The use of separate systems of components leads to a variety of problems. For instance, each system typically has its own method of configuration and its own user interface. Thus, exchanging data between the systems requires additional work by someone at the contact center.


Furthermore, contact centers are continually tasked with striking a balance between service quality, efficiency, effectiveness, revenue generation, cost cutting, and profitability. As a result, today's contact center agents are charged with mastering multiple data sources and systems, delivering consistent service across customer touch points, up-selling, cross-selling, and saving at-risk customers, while winning new ones.


The systems and methods described herein provide integrated solutions for performing workforce management and analysis of recorded interactions. Combining these two functionalities as a unified integrated solution, delivered through a single platform, enables users to gain more insight and make smarter decisions faster about sales, service, and overall operations. This takes contact center tools beyond the traditional “suite” approach to a true single workforce optimization platform.


As can be seen, while each technology segment delivers value, integration is the key: together the segments deliver greater impact than the sum of their individual parts. Utilizing them separately limits the contact center's potential to become a strategic business asset.





BRIEF DESCRIPTION OF THE DRAWINGS

Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure.



FIG. 1 is a block diagram of a contact center environment.



FIG. 2 is a high-level view of components in one embodiment of an integrated contact center system.



FIG. 3 shows a process performed by the mining function of FIG. 2.



FIG. 4 shows a process performed by the pattern recognition function of FIG. 2.



FIG. 5 is a data flow diagram showing component interactions in one embodiment of a method or system of integrating the forecaster and contacts analysis components of FIG. 2.



FIG. 6 is a data flow diagram showing component interactions in yet another embodiment of a method or system of integrating the forecaster and contacts analysis components of FIG. 2.



FIG. 7 is an object diagram showing component interactions in one embodiment of a method or system of integrating the forecaster and contacts analysis components of FIG. 2.



FIG. 8 is a screen shot of an activity view displayed by one embodiment of the tracking component of FIG. 2.



FIG. 9 shows another integration point between the tracking component and contacts analysis components of FIG. 2.



FIG. 10 shows an integration point between the scheduler and contacts analysis components of FIG. 2.





SUMMARY OF THE INVENTION

Systems and methods of integrating workforce management and contacts analysis are disclosed. An exemplary method comprises receiving content data derived from classification of a plurality of recorded agent contacts. The contact content data is correlated with past time periods. The method also comprises identifying at least one pattern in the contact content data. The pattern is based on the contact classifications. The method also comprises receiving historical workload data from a contact router, and generating a workload forecast based on the historical workload data and the identified pattern.


Another exemplary method comprises monitoring a stream of application events from an application monitor. Each event is associated with agent workstation activity. The method also comprises determining a count of agent work units represented by the application event stream. The method also comprises receiving historical workload data from a contact router, and generating a workload forecast based on the historical workload data and the count of agent work units.


Another exemplary method comprises capturing a plurality of contacts made by an agent. The method also comprises analyzing at least a portion of the contacts to derive contact description information. The method also comprises displaying a timeline. The method also comprises displaying, in visual correlation with the timeline, a plurality of activities performed by the agent. At least a portion of the activities are associated with one of the captured contacts. The method also comprises displaying, for those activities associated with one of the captured contacts, a contact attribute. The contact attribute is displayed in visual proximity to the activity.


DETAILED DESCRIPTION


FIG. 1 is a block diagram of a contact center environment 100. The contact center 100 is staffed by agents who handle incoming and/or outgoing contacts. Although the traditional and most common form of contact is by phone, other types of contacts are becoming more common (e.g., text chat, web collaboration, email, and fax). An agent workspace includes an agent phone 110 and a workstation computer 120. A network 130 connects one or more of the workstations 120.


A contact router 140 distributes incoming contacts to available agents. When the contacts are made by traditional phone lines, the contact router 140 operates by connecting outside trunk lines 150 to agent trunk lines 160. In this environment, the contact router 140 may be implemented by an automatic call distributor (ACD), which queues calls until a suitable agent is available. Other types of contacts, such as Voice over Internet Protocol (VoIP) calls and computer-based contacts (e.g., chat, email) are routed over one or more data networks. These contacts are distributed over network 130 to one of the agent workstations 120.


During a customer contact, the agent interacts with one or more applications running on the workstation 120. Example workstation applications give the agent access to customer records, product information, ordering status, and transaction history. The applications may access one or more business databases (not shown) via the network 130.


A contact recorder 170 provides the ability to capture or record contacts of many different types, including traditional and IP telephony environments, text chat, web collaboration, email, and fax. A recorded contact may consist of multiple streams of data. One stream may be considered a “content” stream: on a voice call, the content stream is a digitized voice stream; on a text chat contact, the content stream is text.
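
As a rough illustration of how a recorded contact comprising multiple streams might be represented, the following Python sketch defines a contact carrying a content stream; the class and field names are illustrative assumptions rather than structures described in this disclosure.

```python
# Hypothetical sketch of a recorded contact with multiple captured streams;
# names and stream kinds are illustrative only.
from dataclasses import dataclass, field
from typing import List, Literal

@dataclass
class Stream:
    kind: Literal["voice", "text", "screen", "router_event"]  # stream type
    data: bytes                                               # captured content

@dataclass
class RecordedContact:
    contact_id: str
    channel: str                          # e.g. "phone", "chat", "email"
    streams: List[Stream] = field(default_factory=list)

# Example: a voice call whose content stream is a digitized voice stream
call = RecordedContact("c-1001", "phone", [Stream("voice", b"\x00\x01")])
print(call.channel, [s.kind for s in call.streams])
```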


While on a call with a customer, the agent interacts with one or more applications 180 running on the workstation 120. Examples are applications that give the agent access to customer records, product information, ordering status, transaction history, etc. The applications may access one or more enterprise databases (not shown) via the network 130.


The contact center 100 also includes a computer-implemented integrated contact center system 200, described in further detail in connection with FIG. 2.



FIG. 2 is a high-level view of components in one embodiment of an integrated contact center system 200. The integrated system 200 includes at least a work force manager (WFM) component 210 and a contacts analysis component 220. An integrated contact center system such as system 200 allows contact center analysts to quickly access the right information. Such an integrated system allows valuable and previously undiscovered information to be uncovered. This new level of visibility into contact center operations should allow personnel to make better decisions faster.


The WFM 210 performs many functions related to the agent workforce. The functionality of the entire WFM 210 is typically divided among several applications, executables, processes, or services. A forecast component (230) predicts future workload based on past workload, where workload includes contact volume and handle time for intervals throughout a historical period. A scheduler component (240) calculates staffing levels and agent schedules based on predicted workload. A tracking component (250) provides a contact center supervisor or manager with information about agent activities and agent-customer interactions, both historical and real-time. An adherence component (260) supplies the supervisor with information on how well each agent complies with contact center policies.


Contacts database 270 stores contacts recorded by contact recorder 170. The database may include descriptive information as well as recorded content. Contact center supervisors and quality analysts can tap into these recorded interactions to review, evaluate, and score agent performance.


Contacts analysis component 220 consists of a mining function 280 and a pattern recognition function 290. The mining function 280 mines the streams of unstructured data within a recorded interaction to produce structured data. Structured data includes any data format that is understood and easily accessible from an application, for example, a database schema, software data structures, etc. Pattern recognition function 290 correlates instances of structured data with other instances of structured data to extract additional meaning from the interactions. The structured data used by pattern recognition function 290 includes data produced by the mining function 280, as well as other forms of structured data provided as input to contacts analysis component 220.


Examples of analysis include categorizing calls based on content, analyzing a call against an expected call pattern and reporting exceptions to the pattern, and providing a visualization layer for recorded interactions that displays other data attributes such as agent activities coincident with call events.


In one embodiment, integrated system 200 also includes one or more of a performance manager, an evaluation manager, and a development manager. The evaluation manager allows various types of employee performance review processes to be managed (e.g., 360-degree reviews). The performance manager receives data from the evaluation manager and presents the performance data to the contact center manager through various scorecard views. The development manager tracks employee learning and development and detects a need for training.


It should be noted that embodiments of one or more of the systems described herein could be used to perform an aspect of speech analytics (i.e., the analysis of recorded speech or real-time speech), which can be used to perform a variety of functions, such as automated call evaluation, call scoring, quality monitoring, quality assessment, and compliance/adherence. By way of example, speech analytics can be used to compare a recorded interaction to a script (e.g., a script that the agent was to use during the interaction). In other words, speech analytics can be used to measure how well agents adhere to scripts, and to identify which agents are “good” salespeople and which ones need additional training. As such, speech analytics can be used to find agents who do not adhere to scripts. In yet another example, speech analytics can measure script effectiveness, identify which scripts are effective and which are not, and find, for example, the section of a script that displeases or upsets customers (e.g., based on emotion detection). As another example, compliance with various policies can be determined. Such may be the case in, for example, the collections industry, which is highly regulated and where agents must abide by many rules. The speech analytics of the present disclosure may identify when agents are not adhering to their scripts and guidelines. This can potentially improve collection effectiveness and reduce corporate liability and risk.


In this regard, various types of recording components can be used to facilitate speech analytics. Specifically, such recording components can perform one or more of various functions such as receiving, capturing, intercepting, and tapping of data. This can involve the use of active and/or passive recording techniques, as well as the recording of voice and/or screen data.


It should be noted that speech analytics can be used in conjunction with such screen data (e.g., screen data captured from an agent's workstation/PC) for evaluation, scoring, analysis, adherence, and compliance purposes, for example. Such integrated functionality can improve the effectiveness and efficiency of, for example, quality assurance programs. For example, the integrated function can help companies to locate appropriate calls (and related screen interactions) for quality monitoring and evaluation. This type of “precision” monitoring improves the effectiveness and productivity of quality assurance programs.


Another aspect that can be accomplished involves fraud detection. In this regard, various manners can be used to determine the identity of a particular speaker. In some embodiments, speech analytics can be used independently and/or in combination with other techniques for performing fraud detection. Specifically, some embodiments can involve identification of a speaker (e.g., a customer) and correlating this identification with other information to determine whether, for example, a fraudulent claim is being made. If such potential fraud is identified, some embodiments can provide an alert. For example, the speech analytics of the present disclosure may identify the emotions of callers. The identified emotions can be used in conjunction with identifying specific concepts to help companies spot either agents or callers/customers who are involved in fraudulent activities.


Referring back to the collections example outlined above, by using emotion and concept detection, companies can identify which customers are attempting to mislead collectors into believing that they are going to pay. The earlier the company is aware of a problem account, the more recourse options it may have. Thus, the speech analytics of the present disclosure can function as an early warning system to reduce losses.


Also included in this disclosure are embodiments of integrated workforce optimization platforms, as discussed in U.S. patent application Ser. No. 11/359,356, filed on Feb. 22, 2006, entitled “Systems and Methods for Workforce Optimization,” and U.S. patent application Ser. No. 11/540,185, filed on Sep. 29, 2006, entitled “Systems and Methods for facilitating Contact Center Coaching,” both of which are hereby incorporated by reference in their entireties. At least one embodiment of an integrated workforce optimization platform integrates: (1) Quality Monitoring/Call Recording—voice of the customer; the complete customer experience across multimedia touch points; (2) Workforce Management—strategic forecasting and scheduling that drives efficiency and adherence, aids in planning, and helps facilitate optimum staffing and service levels; (3) Performance Management—key performance indicators (KPIs) and scorecards that analyze and help identify synergies, opportunities and improvement areas; (4) e-Learning—training, new information and protocol disseminated to staff, leveraging best practice customer interactions and delivering learning to support development; (5) Analytics—deliver insights from customer interactions to drive business performance; and/or (6) Coaching—feedback to promote efficient performance. By way of example, the integrated workforce optimization process and system can include planning and establishing goals—from both an enterprise and center perspective—to ensure alignment and objectives that complement and support one another. Such planning may be complemented with forecasting and scheduling of the workforce to ensure optimum service levels. Recording and measuring performance may also be utilized, leveraging quality monitoring/call recording to assess service quality and the customer experience.


Contacts analysis component 220 will now be discussed in further detail in connection with FIGS. 3 and 4. FIG. 3 shows a process performed by mining function 280. In this example, a recorded interaction 300 includes three streams of unstructured data 310, each containing audio content from one speaker: Agent1 (320A1), Agent2 (320A2), and Customer (320C). Other embodiments can include different numbers of unstructured data streams. Mining function 280 identifies structured data contained within the unstructured audio streams 310. Two types of structured data that can be mined from an unstructured audio stream are speaker attributes and speech events.


Attributes are identified characteristics associated with a segment of the unstructured stream 310. Some attributes describe the contact, such as inbound or outbound. Other attributes relate to speech, for example, language, speaker gender, speaker identifier (e.g., by agent role/customer role, or voice recognition to identify particular agents). In the example interaction of FIG. 3, mining function 280 has identified certain segments of the audio stream 310 as being male and female.


In addition to identifying stream attributes, mining function 280 also identifies events within an unstructured stream. Events are associated with a start time and a duration (or a start time and a stop time). Some events are linguistic in nature, while others are not. Examples of linguistic speech events include words/phrases (e.g., “supervisor”, “Bank of America”) and proximate words/phrases (e.g., “transfer me” and “supervisor” found within 5 seconds of each other). Examples of non-linguistic speech events include emotion/stress, and silence.


In the example interaction of FIG. 3, mining function 280 has identified five different speech events: a word event “greeting” (330W1) at time t1; a stress event (330E1) at t2; a silence event (330S1) at t3; an emotional event “empathy” (330E2) at t4; and a word event “competitor” (330W2) at t5.
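
The mined attributes and events described above can be pictured as simple records carrying a type, an optional label, a start time, and a duration. The following Python sketch is one hypothetical representation; the field names and the example timestamps are assumptions, not values taken from FIG. 3.

```python
# Hypothetical records for mined speaker attributes and speech events.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeakerAttribute:
    name: str          # e.g. "gender", "language", "speaker_role"
    value: str         # e.g. "female", "en-US", "agent"
    start: float       # offset into the stream, in seconds
    duration: float

@dataclass
class SpeechEvent:
    kind: str            # "word", "phrase", "emotion", "stress", "silence"
    label: Optional[str] # e.g. "greeting", "competitor"; None for silence/stress
    start: float         # events carry a start time...
    duration: float      # ...and a duration (or, equivalently, a stop time)

# Events loosely modeled on the example interaction of FIG. 3 (times invented)
events = [
    SpeechEvent("word", "greeting", 1.0, 0.8),
    SpeechEvent("stress", None, 42.5, 3.0),
    SpeechEvent("silence", None, 60.0, 7.5),
    SpeechEvent("emotion", "empathy", 75.2, 2.1),
    SpeechEvent("word", "competitor", 90.4, 0.6),
]
print(len(events), "events mined")
```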


Although mining an unstructured interaction to produce structured events and/or attributes is useful, at this level these events and attributes do not represent information in the context in which it was actually utilized. Thus, this level does not yet capture information that has semantic meaning to the business, i.e., business intelligence. In FIG. 4, pattern recognition function 290 performs a process in which correlations between multiple events and/or attributes are identified. The identified correlations, or patterns, have a higher value of semantic meaning for the business enterprise as compared to individual events and attributes.



FIG. 4 shows patterns identified in a recorded contact 400. The unstructured audio stream has already been mined to identify two speakers, a customer and an agent. The recorded contact in FIG. 4 is therefore shown as containing two logical streams, one for the customer (410C) and the other for the agent (410A). The mining function 280 has also identified several speech events: phrase event “greeting” (420P1); word event “competitor” (420W1); emotion event “stress” (420E1); phrase event “supervisor” (420P2); and phrase event “transfer to supervisor” (420P3).


Two additional streams of structured data are also included in interaction 400. One structured data stream (430) includes events from contact router 140 (see FIG. 1), such as a CallTransfer 430T. The other structured data stream (440) includes events from agent workstations 120. In this example, a workstation event such as event 440A specifies a particular application running on the agent workstation, as well as a particular screen within the application. In other implementations, the events may also include application inputs (e.g., keystrokes, mouse clicks).


Pattern recognition function 290 analyzes the structured data within multiple captured streams to identify one or more patterns. In the example interaction of FIG. 4, pattern recognition function 290 has identified an Angry Customer pattern (450) from the sequence of structured data. The Angry Customer pattern starts with stress event 420E1, followed by the phrase event “transfer to supervisor” 420P3 as spoken by the agent, followed by a contact router Transfer event 430T. These patterns are defined by business rules which may be based on generic templates and then customized for the specific business enterprise.
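
One way to picture such a business rule is as an ordered list of predicates applied to a merged, time-ordered event list. The Python sketch below is a minimal, hypothetical encoding of the Angry Customer pattern; the event fields and the matching strategy are assumptions, not the rule engine of this disclosure.

```python
# Minimal rule-based pattern check over a merged, time-ordered event list.
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    source: str   # "speech", "router", or "workstation"
    kind: str     # e.g. "stress", "phrase", "transfer"
    label: str    # e.g. "transfer to supervisor" (empty if not applicable)
    speaker: str  # "agent", "customer", or "" for non-speech events
    time: float   # seconds from start of contact

# A pattern is an ordered list of predicates; each must match a later event.
ANGRY_CUSTOMER = [
    lambda e: e.kind == "stress",
    lambda e: e.kind == "phrase" and e.label == "transfer to supervisor"
              and e.speaker == "agent",
    lambda e: e.source == "router" and e.kind == "transfer",
]

def matches(pattern, events: List[Event]) -> bool:
    """True if the predicates match events in order (not necessarily adjacent)."""
    it = iter(sorted(events, key=lambda e: e.time))
    for pred in pattern:
        if not any(pred(e) for e in it):   # consume events until one matches
            return False
    return True

events = [
    Event("speech", "stress", "", "customer", 30.0),
    Event("speech", "phrase", "transfer to supervisor", "agent", 45.0),
    Event("router", "transfer", "", "", 50.0),
]
print(matches(ANGRY_CUSTOMER, events))   # True -> tag contact "Angry Customer"
```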


Speech analytics is typically deployed as a mechanism to extract business intelligence out of the conversations captured in the contact center. For example, the most common applications include: contact classification (billing, technical support, etc.); contact disposition (resolution of issue); topic trending (mentions of competitor, product, etc.); business process improvement (identification of repetitive activities for automation, etc.); issue identification (to reduce churn); and automated contact scoring.



FIG. 5 is a data flow diagram showing component interactions in one embodiment of a method or system of integrating forecaster 230 and contacts analysis component 220. Forecaster 230 receives historical workload data (510) from contact router 140. Workload data 510 describes past contacts coming into, and going out of, the contact center, for example, contact volume, handle time, time to answer, etc. Workload data is typically tracked per queue, in correlation with time periods. This workload data 510 provided by contact router 140 is not based on the content of the contact.


In contrast, forecaster 230 also receives historical contact data (520), from contacts analysis component 220, which includes information derived from the content of the contact. In one embodiment, contacts analysis component 220 derives this information by performing various classifications, for example, classifying contacts according to topic, disposition (e.g., resolved, open, etc.), and emotional content (e.g., “stressed”, “angry”). In one embodiment, classification applies speech analysis techniques to voice calls to extract keywords which suggest call topic and/or disposition, to identify emotional content, etc. In another embodiment, classification applies text analysis to email, instant messages, text chat, etc. to extract keywords which suggest call topic, disposition, and/or emotional content.
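
A simple way to picture keyword-based classification is a lookup of topic and disposition keyword sets against the contact text (for example, a chat transcript or speech-to-text output). The Python sketch below is illustrative only; the keyword lists and category names are assumptions.

```python
# Hypothetical keyword-driven classification of contact text.
TOPIC_KEYWORDS = {
    "billing": {"invoice", "bill", "charge", "refund"},
    "technical support": {"error", "crash", "install", "reset"},
}
DISPOSITION_KEYWORDS = {
    "resolved": {"resolved", "fixed", "thank you for your help"},
    "open": {"call back", "escalate", "still broken"},
}

def classify(text: str) -> dict:
    words = text.lower()
    topic = next((t for t, kws in TOPIC_KEYWORDS.items()
                  if any(k in words for k in kws)), "other")
    disposition = next((d for d, kws in DISPOSITION_KEYWORDS.items()
                        if any(k in words for k in kws)), "unknown")
    return {"topic": topic, "disposition": disposition}

print(classify("I was charged twice on my last invoice, please issue a refund"))
# {'topic': 'billing', 'disposition': 'unknown'}
```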


Like workload data 510, contact content data 520 also includes correlation with time periods. In some embodiments, these time periods include intraday, daily, weekly, and monthly.


Forecaster 230 examines historical workload data 510 and contact content data 520 to identify regularly occurring patterns and trends in the contact center workload, and produces a prediction of future workload 530. Patterns describe repeating occurrences of similar content at related time periods. One example of a contact content pattern is a number of contacts that were classified into the same topic category (similar content), and that occur during the early evening shift (related time periods).


Using only historical workload data 510 from contact router 140, forecaster 230 is limited to identifying patterns such as “more call volume on Monday mornings,” “longer handle time after lunch,” etc. However, the reason why there are more calls on Monday, or handle times are longer after lunch, is not clear from this data, since it doesn't include any information about the content of the contact.


The addition of contact content data 520 allows forecaster 230 to identify additional and more detailed patterns such as “more billing contacts during last week of month” and “more disposition=Issue Resolved at start of shift.” Forecaster 230 can then take these patterns into account when producing the workload forecast 530, so that more agents are assigned to billing during the last week of the month. Thus, the workload forecast 530 produced by this integrated system of FIG. 5 is more accurate than a workload forecast produced by a conventional WFM system.
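
The following Python sketch illustrates, in highly simplified form, how content-derived patterns could supplement router workload history when producing a forecast; the moving-average baseline and the billing uplift rule are assumptions, not the forecasting algorithm of this disclosure.

```python
# Hypothetical blend of router workload history with a content-derived pattern.
from statistics import mean

# Historical call volume per interval, keyed by (week_of_month, weekday, hour)
history = {
    (4, "Mon", 9): [120, 130, 125],   # last week of month, Monday 09:00
    (1, "Mon", 9): [90, 95, 92],
}
# Content pattern mined from classified contacts: share of "billing" topic
billing_share = {(4, "Mon", 9): 0.45, (1, "Mon", 9): 0.20}

def forecast(interval, billing_uplift=0.10):
    base = mean(history[interval])              # baseline from router data only
    share = billing_share.get(interval, 0.0)    # content-derived pattern
    # Reserve extra billing-skilled staffing when billing contacts dominate
    return {"expected_volume": base,
            "billing_volume": base * share,
            "suggested_billing_uplift": base * share * billing_uplift}

print(forecast((4, "Mon", 9)))
```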



FIG. 6 is a data flow diagram showing component interactions in yet another embodiment of a method or system of integrating forecaster 230 and contacts analysis component 220. As contacts are distributed to agents for handling, events 610 are collected from various sources, such as contact router 140 and an application monitor 620. For example, as the agent takes calls throughout a workday, the contact router 140 reports changes in the state of the agent's phone as ACD events, and as an agent interacts with various applications on his workstation 120, application monitor 620 tracks and reports application events. In one implementation, the granularity of application events is application-level, so that events describe when applications start and exit, and when a user switches from one application to another. In another implementation, the granularity of application events is screen-level, so that events describe a particular screen displayed within an application. In yet another implementation, application events are low-level, including input and/or output associated with each application (e.g., keystrokes, mouse clicks, and screen updates).


Contact recorder 170 records these event streams, as well as the contact content stream 630 (e.g., the voice stream for a voice call, or a data stream for computer contact such as email, chat, etc.). As described above, contacts analysis component 220 performs various analyses on the recorded streams to produce structured data and to identify patterns.


In the embodiment of FIG. 6, contacts analysis component 220 also analyzes recorded workstation events and determines how many agent work units these events represent. A work unit is determined by mapping sequences of events to business transactions, which are considered agent work units 640. For example, a particular sequence of screens within a customer relationship manager (CRM) application may indicate a “new customer record creation,” and another sequence might indicate a “new order creation,” where “creating a new customer” and “creating a new order” are defined as agent work units. (More details of mapping events to business transactions can be found in U.S. patent application Ser. No. 11/359,319 “System and Method for Detecting and Displaying Business Transactions.”)
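
The sketch below illustrates one hypothetical way to map screen-event sequences to agent work units by scanning for known sequences; the screen names and mapping rules are assumptions.

```python
# Hypothetical mapping of workstation screen-event sequences to work units.
WORK_UNIT_RULES = {
    ("crm:search", "crm:new_customer", "crm:save"): "new customer record creation",
    ("crm:customer", "crm:new_order", "crm:confirm"): "new order creation",
}

def count_work_units(screen_events):
    """Count work units by scanning the event stream for known screen sequences."""
    counts = {}
    for pattern, unit in WORK_UNIT_RULES.items():
        n = len(pattern)
        for i in range(len(screen_events) - n + 1):
            if tuple(screen_events[i:i + n]) == pattern:
                counts[unit] = counts.get(unit, 0) + 1
    return counts

stream = ["crm:search", "crm:new_customer", "crm:save",
          "crm:customer", "crm:new_order", "crm:confirm"]
print(count_work_units(stream))
# {'new customer record creation': 1, 'new order creation': 1}
```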


Determining agent work units 640 for non-phone contacts allows contacts analysis component 220 to produce a measure which is analogous to a handle time for a phone contact. This type of measure for non-phone activities is not readily available in conventional call center software.


Contacts analysis component 220 also produces, from agent workstation events, workstation productivity data 650 indicating the amount of time an agent spends in productive computer activities, in non-productive computer activities, or both. In one embodiment, productive/non-productive is determined at the level of particular workstation applications (e.g., the CRM and corporate email applications are productive, while games and web browsers are not). In another embodiment, productive/non-productive is determined at a finer level of granularity. For example, certain screens within an application may be considered productive while others are not, or certain web pages or Internet addresses may be considered productive while others are not.
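
A minimal sketch of deriving workstation productivity data from application events might classify time per application against productive and non-productive lists; the application names and event shape below are assumptions.

```python
# Hypothetical productivity summary from (application, seconds_in_focus) events.
PRODUCTIVE = {"crm", "corporate_email"}
NON_PRODUCTIVE = {"games", "web_browser"}

def productivity_summary(events):
    """events: list of (application, seconds_in_focus) tuples."""
    summary = {"productive": 0.0, "non_productive": 0.0, "unclassified": 0.0}
    for app, seconds in events:
        if app in PRODUCTIVE:
            summary["productive"] += seconds
        elif app in NON_PRODUCTIVE:
            summary["non_productive"] += seconds
        else:
            summary["unclassified"] += seconds
    return summary

print(productivity_summary([("crm", 1800), ("web_browser", 300), ("games", 120)]))
```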


These two measures (640, 650) are provided to forecaster 230, which uses this data as a basis for determining historical workload for non-phone activities. Forecaster 230 analyzes this historical workload for non-phone activities, as well as historical phone workload data 660 from contact router 140, and identifies workload patterns and trends. From this information, forecaster 230 produces a prediction of future workload 670, which is used by the scheduler 240 to produce a schedule.



FIG. 7 is an object diagram showing component interactions in one embodiment of a method or system of integrating work force management and contact analysis. A tracking component 250 of WFM 210 receives reports of agent activities (710) throughout the workday. The activity 710 information typically includes an agent identifier, an activity source, an activity code, a start time, and a duration. Typical activity codes are Activity_Avail, Activity_Talk, Activity_AfterCallWork, Activity_Break, and Activity_Email. Examples of activity reporting sources are contact router 140 and a workstation application activity monitor 720. In one embodiment, the reporting components report device-specific events, and WFM 210 maps the events to agent activities. In the simplified view of FIG. 7, the reporting components are shown as reporting agent activities rather than events.
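
The following sketch shows one hypothetical shape for an agent activity record and a device-event-to-activity mapping of the kind described above; the ACD event names and timestamps are illustrative, not taken from the disclosure.

```python
# Hypothetical agent activity record and ACD-event-to-activity mapping.
from dataclasses import dataclass

@dataclass
class Activity:
    agent_id: str
    source: str        # e.g. "contact_router", "workstation_monitor"
    code: str          # e.g. "Activity_Talk", "Activity_Email"
    start: str         # ISO-8601 timestamp
    duration_sec: int

# Device-specific router events mapped to activity codes by the WFM
ACD_EVENT_TO_ACTIVITY = {
    "agent_ready": "Activity_Avail",
    "call_connected": "Activity_Talk",
    "wrapup_started": "Activity_AfterCallWork",
}

a = Activity("agent-42", "contact_router",
             ACD_EVENT_TO_ACTIVITY["call_connected"],
             "2007-05-02T09:15:00", 240)
print(a)
```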


Contact recorder 170 captures (730) a portion, or all of, agent-customer contacts for storage in contacts database 270. Contacts analysis component 220 analyzes the contacts to derive information describing the contact. Some embodiments analyze the contacts to determine contact attributes (740) such as call time and duration, a queue identifier, and call transfer events (e.g., call was initially answered by a customer service agent but was then transferred to a specialist agent). Some embodiments of contacts analysis component 220 further analyze the contact to extract content information (750) such as contact topic and contact disposition. Some embodiments of contacts analysis component 220 also analyze the contact to determine the occurrence of specific speech events 760 (e.g., keywords).


In the embodiment of FIG. 7, information provided by contacts analysis component 220 is used by a user interface of tracking component 250 to supplement activity information displayed to a contact center manager or supervisor. FIG. 8 is a screen shot of an activity view (810) displayed by one embodiment of tracking component 250. Tracking component 250 displays activities of one or more agents in a timeline view. Blocks (820) displayed in correlation with a timeline axis (830) represent activities, with blocks of different colors/shades/fills/etc. representing different types of activities. Activities for each agent are displayed on a different line.


When an agent activity is also associated with a recorded contact, tracking component 250 displays contact analysis information in conjunction with the activity. For example, contact attributes 740 can be displayed in visual proximity with the recorded interaction, as shown in FIG. 8. In one embodiment, contact attributes 740 are not displayed by default, but a user activates display of the attributes, for example, by clicking an icon or button. As another example, activity blocks of different colors can represent different contact topics. As yet another example, the occurrence of keywords within an activity can be represented by various small icons placed within the activity block.



FIG. 9 shows another integration point between tracking component 250 and contacts analysis component 220. Tracking component 250 provides a user a graphical view of actual workload as compared to forecast workload, on a per-queue basis. Contact center supervisors closely monitor this view so they can respond to changes in call volume or handle time. When a change in contact volume or handle time is detected, a supervisor typically investigates the root cause. For example, increases in call volume may be related to weather, product defects, promotions, policy changes, competitor activities, etc. Increases in call handle time may have similar causes or may be caused by slow systems.


In integrated contact center system 200, a user can quickly “drill down” from a particular queue statistic to one or more recorded contact interactions associated with the queue, and to any analysis data produced by contacts analysis component 220 that is associated with those contacts. When a user requests more detailed information on a queue statistic, tracking component 250 retrieves (910) related analysis data, and displays this additional detail to the user. For example, a queue with an increased average handle time may be associated with interactions that include unusual keywords, and this information is then provided to the user who “drills down” from the average handle time. In this manner, the user better understands the root cause of a deviation between actual workload and forecast, and can therefore make better decisions on how to respond.



FIG. 10 shows yet another integration point between WFM 210 and contacts analysis component 220, this one involving scheduler 240. Scheduler 240 produces agent schedules using predicted workload 1010, service goals 1020, and agent information 1030. Agent information is typically provided from an agent database 1040, and includes information about skills, quality scores, and previously scheduled work activities. In integrated system 200, contacts analysis component 220 provides additional information used by scheduler 240, such as a correlation or relationship 1050 between agent interaction scores 1060 and agent shifts/work patterns 1070.


Typically, contact center personnel such as supervisors and quality analysts play back some of the interactions and review, evaluate, and score agent quality in various categories (product knowledge, selling, listening, etc.). However, only a relatively small portion of recorded interactions can be scored using this manual method, since it is time consuming. Thus, a percentage of calls are usually randomly selected for manual scoring.


In one embodiment, contacts analysis component 220 performs automatic call scoring on a larger portion, or even all, of the recorded interactions. For automatic scoring, interaction scores 1060 may be relative to each other rather than absolute as compared to a standard. Manual scoring may then be performed on a portion of the automatically scored interactions. For example, if contacts analysis component 220 classifies interactions into buckets having a 1 to 5 scale, then an “intelligent selection” feature may select X % of each bucket for manual scoring.
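
The “intelligent selection” idea can be sketched as stratified sampling: take X% of each automatic-score bucket for manual scoring. The Python below is illustrative; the sampling percentage and bucket handling are assumptions.

```python
# Hypothetical stratified selection of auto-scored contacts for manual scoring.
import random
from collections import defaultdict

def select_for_manual_scoring(auto_scores, percent=0.10, seed=0):
    """auto_scores: dict of contact_id -> bucket (1..5). Returns selected ids."""
    random.seed(seed)
    buckets = defaultdict(list)
    for contact_id, bucket in auto_scores.items():
        buckets[bucket].append(contact_id)
    selected = []
    for bucket, ids in buckets.items():
        k = max(1, round(len(ids) * percent))   # at least one contact per bucket
        selected.extend(random.sample(ids, k))
    return selected

scores = {f"c{i}": (i % 5) + 1 for i in range(100)}   # fake auto-scored contacts
print(select_for_manual_scoring(scores))
```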


After scores 1060 are produced (manually, automatically, or in combination), a correlation engine within contacts analysis component 220 looks for correlations between scores 1060 and agent shifts and agent work patterns. Such correlations, when found, suggest that interaction quality increases or decreases with certain shift or work patterns. For example, the interaction quality of most agents may decline after handling particular contact types for more than a specific length of time, or a certain mix of contact types may lead to a decline in interaction quality (possibly because the context switch from one contact to the next is difficult for the agent).


The correlation data 1050 is provided to scheduler 240, which can use the data to generate schedules that lead to higher quality interactions. Identified correlations can be tracked within the database repository as dynamic data relationships. Since these are non-structured and very dynamic, this relationship may be represented via a dynamic association that could be programmatically configured rather than being represented as data structures per se.


One embodiment of the correlation engine will now be described in more detail. This embodiment utilizes an agent schedule database, a quality performance database, and a contact center operation database when determining correlations between scores 1060 and shifts and work patterns. The agent schedule database may include data such as skills, day, date, breaks, meetings, and training. The quality performance database may include data such as contact quality scores and agent names. The contact center operation database may include data such as contact identification, contact type, agent name, contact stats (e.g., time/date percentage of contacts answered, average handling times, contact volumes, wait times, abandonment rates), and timing details (e.g., events occurring within a certain time period of each other).


The correlation engine identifies patterns (or correlation-based discovery) that show why and/or when certain poor performance occurs repeatedly and is correlated with exogenous events such as high average handle time (AHT) or long queue times. The correlation engine can provide n-way correlations between poor/good quality of agent measurements and other contact center details. Thus, the correlation engine can provide a statistical examination of time-indexed data to find correlations between quality of agents measured historically and other contact center details.


The correlation engine identifies statistically valid correlations between agent quality scores and other contact data. The input data can be pre-processed or filtered to remove outlier data (for example, removal of all data that is greater than two standard deviations away from the mean/median). In an alternative embodiment, the pre-processing can allow selective user-induced filtering on other parameters (for example, looking only at data relating to a particular agent).


The filtered data, representing two axes of information, is provided as input to a Pearson r linear correlation formula, which computes the value of r, known as the correlation coefficient. The correlation coefficient ranges from −1 (perfect negative correlation) to 0 (no correlation) to 1 (perfect positive correlation). In addition, statistical significance p (e.g., p<0.050) is computed based on the total amount of data used to compute r and on the probability of obtaining the value of r from random data. This significance value p, the total data set size N, and the correlation coefficient r constitute the outputs of the correlation engine, as shown in block 197. In block 198, a user can adjust the thresholds on desired levels of r magnitude, significance value p, data set size minima N, and other filters such as filtering to a single agent or a set of agents for data to be used in the correlation analyses. In addition, a user can also turn the pre-process filter on or off and input other combinations of paired values.
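
As a concrete illustration of this correlation step, the sketch below filters values more than two standard deviations from the mean and then computes the Pearson correlation coefficient r, significance p, and data set size N using SciPy; the example data and thresholds are assumptions.

```python
# Hypothetical outlier filtering plus Pearson r / p-value / N computation.
import numpy as np
from scipy import stats

def correlate(x, y, sd_limit=2.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Keep only pairs where both values lie within sd_limit standard deviations
    keep = ((np.abs(x - x.mean()) <= sd_limit * x.std()) &
            (np.abs(y - y.mean()) <= sd_limit * y.std()))
    xf, yf = x[keep], y[keep]
    r, p = stats.pearsonr(xf, yf)     # correlation coefficient and significance
    return {"r": r, "p": p, "N": int(keep.sum())}

# e.g. x = hours into shift for each scored contact, y = quality score
hours = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
scores = [4.8, 4.7, 4.5, 4.4, 4.1, 4.0, 3.8, 3.6, 3.5, 1.0]  # last point is an outlier
print(correlate(hours, scores))
```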


Clustering is another computational method used to determine correlation. Clustering takes all evaluated and unevaluated contacts and partitions them into sets, based on one or more parameters that are statistically correlated with quality of service. A contact center manager can visualize each “cluster” for the user and preferably annotate which contacts are evaluated and which are unevaluated. Yet another correlation computational method is statistical trend analysis, which looks for temporal trends in correlated sets of contacts to show that quality of service is increasing or decreasing statistically significantly over time.
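
The clustering approach can be sketched with an off-the-shelf algorithm such as k-means, partitioning evaluated and unevaluated contacts on parameters correlated with quality of service; the features and library choice below are assumptions, not the clustering method of this disclosure.

```python
# Hypothetical k-means partition of contacts on quality-correlated parameters.
import numpy as np
from sklearn.cluster import KMeans

# rows: [handle_time_sec, hold_time_sec] for evaluated and unevaluated contacts
X = np.array([[180, 10], [200, 15], [190, 12],      # short, low-hold contacts
              [600, 120], [650, 110], [620, 130]])  # long, high-hold contacts
evaluated = [True, False, True, False, False, True]

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for i, (label, was_eval) in enumerate(zip(labels, evaluated)):
    # Each "cluster" can be visualized with evaluated/unevaluated contacts annotated
    print(f"contact {i}: cluster {label}, evaluated={was_eval}")
```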

Claims
  • 1. A computer-implemented method of integrating workforce management and contacts analysis, comprising: in a computer-implemented contact analysis system: receiving contact content data for a plurality of recorded agent contacts, the contact content data correlated with past time periods; analyzing the contact content data to produce a classification for each of the plurality of recorded agent contacts based on emotional content of the plurality of recorded agent contacts; identifying at least one pattern in the classifications for each of the plurality of recorded agent contacts, the pattern determined from rules based on the contact classifications; in a computer-implemented workforce management system: receiving the at least one identified pattern from the contact analysis system; receiving historical workload data from a contact router; and generating a workload forecast based on the historical workload data and the at least one identified pattern.
  • 2. The method of claim 1, further comprising: in the workforce management system, generating a schedule for a plurality of agents based on the workload forecast.
  • 3. The method of claim 1, further comprising: in the contact analysis system, further classifying each of a plurality of recorded agent contacts into one of a plurality of categories according to at least one of call disposition or call topic.
  • 4. The method of claim 1, wherein the recorded contact is text-based, and further comprising: in the contact analysis system: performing text analysis on the contact to identify at least one keyword; and classifying the contact into one of a plurality of categories based on the identified keyword.
  • 5. The method of claim 1, wherein the recorded contact is a voice call, and further comprising: in the contact analysis system: performing speech analysis on the contact to identify at least one keyword; and classifying the contact into one of a plurality of categories based on the identified keyword.
  • 6. A computer-implemented method of integrating workforce management and contacts analysis, comprising the steps of: in a computer-implemented contact analysis system: receiving contact content data for a plurality of recorded agent contacts, the contact content data correlated with past time periods; analyzing the contact content data to produce a classification for each of the plurality of recorded agent contacts based on emotional content of the plurality of recorded agent contacts; identifying at least one pattern in the classifications for each of the plurality of recorded agent contacts, the pattern determined from rules based on the contact classifications; monitoring a stream of application events from an application monitor, each event associated with agent workstation activity; determining a count of agent work units represented by the application event stream; in a computer-implemented workforce management system: receiving the at least one identified pattern from the contact analysis system; receiving historical workload data from a contact router; and generating a workload forecast based on the historical workload data, the at least one identified pattern, and the count of agent work units.
  • 7. The method of claim 6, further comprising: in the contact analysis system, producing workstation productivity data from the application events.
  • 8. The method of claim 7, further comprising: in the workforce management system, generating the workload forecast based on the historical workload data, the count of agent work units, and the workstation productivity data.
  • 9. The method of claim 7, further comprising: in the workforce management system: determining historical workload for non-phone activities from the agent work units and the workstation productivity data; and identifying workload patterns by analyzing the historical workload for non-phone activities and historical workload for phone activities.
  • 10. The method of claim 6, wherein the determining step further comprises: determining the count of agent work units represented by the application event stream by mapping a sequence of events within the stream to agent work units.
  • 11. The method of claim 6, wherein the application events comprise application-level events.
  • 12. The method of claim 6, wherein the application events comprise screen-level events.
  • 13. The method of claim 6, wherein the application events comprise application input/output events.
  • 14. A method of viewing data associated with recorded agent interactions from a window that displays agent activity information, the method comprising the steps of: in a computer-implemented contact analysis system: capturing a plurality of contacts made by an agent; receiving contact content data for the plurality of captured agent contacts, the contact content data correlated with past time periods; analyzing the contact content data to produce a classification for each of the plurality of captured agent contacts based on emotional content of the plurality of captured agent contacts; identifying at least one pattern in the classifications for each of the plurality of recorded agent contacts, the pattern determined from rules based on the contact classifications; in a computer-implemented workforce management system: receiving the at least one identified pattern from the contact analysis system; receiving historical workload data from a contact router; generating a workload forecast based on the historical workload data and the at least one identified pattern; displaying a timeline; displaying, in visual correlation with the timeline, a plurality of activities performed by the agent, at least a portion of the activities associated with one of the captured contacts; and displaying, for those activities associated with one of the captured contacts, a contact attribute, in visual proximity to the activity.
  • 15. The method of claim 14, wherein the analyzing step further comprises: analyzing at least a portion of the contacts to identify a contact attribute, wherein the contact attribute comprises at least one of call time, call duration, and queue identifier.
  • 16. The method of claim 14, wherein the analyzing step further comprises: analyzing at least a portion of the contacts to extract content information, wherein the content information comprises at least one of topic and disposition.
  • 17. The method of claim 14, wherein the analyzing step further comprises: analyzing at least a portion of the contacts to determine an occurrence of a speech event, wherein the speech event comprises a predefined keyword.
  • 18. The method of claim 14, further comprising: in the workforce management system: displaying a plurality of queue statistics, each queue statistic associated with a queue in a plurality of queues; and receiving a selection of one of the queue statistics; and displaying contact analysis data for a contact in a plurality of recorded contacts, the recorded contacts associated with the queue that is associated with the selected queue statistic, responsive to receiving the selection.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 60/798,683, filed May 8, 2006 and hereby incorporated by reference herein.

US Referenced Citations (179)
Number Name Date Kind
3594919 De Bell et al. Jul 1971 A
3705271 De Bell et al. Dec 1972 A
4510351 Costello et al. Apr 1985 A
4684349 Ferguson et al. Aug 1987 A
4694483 Cheung Sep 1987 A
4763353 Canale et al. Aug 1988 A
4815120 Kosich Mar 1989 A
4924488 Kosich May 1990 A
4953159 Hayden et al. Aug 1990 A
5016272 Stubbs et al. May 1991 A
5101402 Chiu et al. Mar 1992 A
5117225 Wang May 1992 A
5210789 Jeffus et al. May 1993 A
5239460 LaRoche Aug 1993 A
5241625 Epard et al. Aug 1993 A
5267865 Lee et al. Dec 1993 A
5299260 Shaio Mar 1994 A
5311422 Loftin et al. May 1994 A
5315711 Barone et al. May 1994 A
5317628 Misholi et al. May 1994 A
5347306 Nitta Sep 1994 A
5388252 Dreste et al. Feb 1995 A
5396371 Henits et al. Mar 1995 A
5432715 Shigematsu et al. Jul 1995 A
5465286 Clare et al. Nov 1995 A
5475625 Glaschick Dec 1995 A
5485569 Goldman et al. Jan 1996 A
5491780 Fyles et al. Feb 1996 A
5499291 Kepley Mar 1996 A
5535256 Maloney et al. Jul 1996 A
5572652 Robusto et al. Nov 1996 A
5577112 Cambray et al. Nov 1996 A
5590171 Howe et al. Dec 1996 A
5597312 Bloom et al. Jan 1997 A
5619183 Ziegra et al. Apr 1997 A
5696906 Peters et al. Dec 1997 A
5717879 Moran et al. Feb 1998 A
5721842 Beasley et al. Feb 1998 A
5742670 Bennett Apr 1998 A
5748499 Trueblood May 1998 A
5778182 Cathey et al. Jul 1998 A
5784452 Carney Jul 1998 A
5790798 Beckett, II et al. Aug 1998 A
5796952 Davis et al. Aug 1998 A
5809247 Richardson et al. Sep 1998 A
5809250 Kisor Sep 1998 A
5825869 Brooks et al. Oct 1998 A
5835572 Richardson, Jr. et al. Nov 1998 A
5862330 Anupam et al. Jan 1999 A
5864772 Alvarado et al. Jan 1999 A
5884032 Bateman et al. Mar 1999 A
5907680 Nielsen May 1999 A
5918214 Perkowski Jun 1999 A
5923746 Baker et al. Jul 1999 A
5933811 Angles et al. Aug 1999 A
5944791 Scherpbier Aug 1999 A
5948061 Merriman et al. Sep 1999 A
5958016 Chang et al. Sep 1999 A
5964836 Rowe et al. Oct 1999 A
5978648 George et al. Nov 1999 A
5982857 Brady Nov 1999 A
5987466 Greer et al. Nov 1999 A
5990852 Szamrej Nov 1999 A
5991373 Pattison et al. Nov 1999 A
5991796 Anupam et al. Nov 1999 A
6005932 Bloom Dec 1999 A
6009429 Greer et al. Dec 1999 A
6014134 Bell et al. Jan 2000 A
6014647 Nizzari et al. Jan 2000 A
6018619 Allard et al. Jan 2000 A
6035332 Ingrassia et al. Mar 2000 A
6038544 Machin et al. Mar 2000 A
6039575 L'Allier et al. Mar 2000 A
6057841 Thurlow et al. May 2000 A
6058163 Pattison et al. May 2000 A
6061798 Coley et al. May 2000 A
6072860 Kek et al. Jun 2000 A
6076099 Chen et al. Jun 2000 A
6078894 Clawson et al. Jun 2000 A
6091712 Pope et al. Jul 2000 A
6108711 Beck et al. Aug 2000 A
6122665 Bar et al. Sep 2000 A
6122668 Teng et al. Sep 2000 A
6130668 Stein Oct 2000 A
6138139 Beck et al. Oct 2000 A
6144991 England Nov 2000 A
6146148 Stuppy Nov 2000 A
6151622 Fraenkel et al. Nov 2000 A
6154771 Rangan et al. Nov 2000 A
6157808 Hollingsworth Dec 2000 A
6171109 Ohsuga Jan 2001 B1
6182094 Humpleman et al. Jan 2001 B1
6195679 Bauersfeld et al. Feb 2001 B1
6201948 Cook et al. Mar 2001 B1
6211451 Tohgi et al. Apr 2001 B1
6225993 Lindblad et al. May 2001 B1
6230197 Beck et al. May 2001 B1
6236977 Verba et al. May 2001 B1
6244758 Solymar et al. Jun 2001 B1
6282548 Burner et al. Aug 2001 B1
6286030 Wenig et al. Sep 2001 B1
6286046 Bryant Sep 2001 B1
6288753 DeNicola et al. Sep 2001 B1
6289340 Purnam et al. Sep 2001 B1
6289382 Bowman-Amuah Sep 2001 B1
6301462 Freeman et al. Oct 2001 B1
6301573 McIlwaine et al. Oct 2001 B1
6324282 McIlwaine et al. Nov 2001 B1
6347374 Drake et al. Feb 2002 B1
6351467 Dillon Feb 2002 B1
6353851 Anupam et al. Mar 2002 B1
6360250 Anupam et al. Mar 2002 B1
6370574 House et al. Apr 2002 B1
6404857 Blair et al. Jun 2002 B1
6411989 Anupam et al. Jun 2002 B1
6418471 Shelton et al. Jul 2002 B1
6459787 McIlwaine et al. Oct 2002 B2
6487195 Choung et al. Nov 2002 B1
6493758 McLain Dec 2002 B1
6502131 Vaid et al. Dec 2002 B1
6510220 Beckett, II et al. Jan 2003 B1
6535909 Rust Mar 2003 B1
6542602 Elazar Apr 2003 B1
6546405 Gupta et al. Apr 2003 B2
6560328 Bondarenko et al. May 2003 B1
6583806 Ludwig et al. Jun 2003 B2
6606657 Zilberstein et al. Aug 2003 B1
6606744 Mikurak Aug 2003 B1
6665644 Kanevsky et al. Dec 2003 B1
6674447 Chiang et al. Jan 2004 B1
6683633 Holtzblatt et al. Jan 2004 B2
6697858 Ezerzer et al. Feb 2004 B1
6724887 Eilbacher et al. Apr 2004 B1
6728695 Pathria et al. Apr 2004 B1
6738456 Wrona et al. May 2004 B2
6757361 Blair et al. Jun 2004 B2
6772396 Cronin et al. Aug 2004 B1
6775377 McIlwaine et al. Aug 2004 B2
6792575 Samaniego et al. Sep 2004 B1
6810414 Brittain Oct 2004 B1
6820083 Nagy et al. Nov 2004 B1
6823384 Wilson et al. Nov 2004 B1
6870916 Henrikson et al. Mar 2005 B2
6901438 Davis et al. May 2005 B1
6904449 Quinones Jun 2005 B1
6959078 Eilbacher et al. Oct 2005 B1
6965886 Govrin et al. Nov 2005 B2
7023979 Wu et al. Apr 2006 B1
7043008 Dewan May 2006 B1
7085728 Sarlay et al. Aug 2006 B2
7203285 Blair Apr 2007 B2
7222075 Petrushin May 2007 B2
8078486 Mclean et al. Dec 2011 B1
20010000962 Rajan May 2001 A1
20010032335 Jones Oct 2001 A1
20010043697 Cox et al. Nov 2001 A1
20020038363 MacLean Mar 2002 A1
20020052948 Baudu et al. May 2002 A1
20020065911 von Klopp et al. May 2002 A1
20020065912 Catchpole et al. May 2002 A1
20020128925 Angeles Sep 2002 A1
20020143925 Pricer et al. Oct 2002 A1
20020152305 Jackson et al. Oct 2002 A1
20020165954 Eshghi et al. Nov 2002 A1
20020184069 Kosiba et al. Dec 2002 A1
20030014491 Horvitz et al. Jan 2003 A1
20030055883 Wiles, Jr. Mar 2003 A1
20030079020 Gourraud et al. Apr 2003 A1
20030144900 Whitmer Jul 2003 A1
20030154240 Nygren et al. Aug 2003 A1
20030163360 Galvin Aug 2003 A1
20040100507 Hayner et al. May 2004 A1
20040165717 McIlwaine et al. Aug 2004 A1
20050043986 McConnell et al. Feb 2005 A1
20050138560 Lee et al. Jun 2005 A1
20070195944 Korenblit et al. Aug 2007 A1
20070198323 Bourne et al. Aug 2007 A1
20080002823 Fama et al. Jan 2008 A1
20080004933 Gillespie Jan 2008 A1
Foreign Referenced Citations (6)
Number Date Country
0453128 Oct 1991 EP
0773687 May 1997 EP
0989720 Mar 2000 EP
2369263 May 2002 GB
WO 9843380 Nov 1998 WO
WO 0016207 Mar 2000 WO
Non-Patent Literature Citations (118)
Entry
“Customer Spotlight: Navistar International,” Web page, unverified print date of Apr. 1, 2002.
“DKSystems Integrates QM Perception with OnTrack for Training,” Web page, unverified print date of Apr. 1, 2002, unverified cover date of Jun. 15, 1999.
“OnTrack Online Delivers New Web Functionality,” Web page, unverified print date of Apr. 2, 2002, unverified cover date of Oct. 5, 1999.
“PriceWaterouseCoopers Case Study The Business Challenge,” Web page, unverified cover date of 2000.
Abstract, net.working: “An Online Webliography,” Technical Training pp. 4-5 (Nov.-Dec. 1998).
Adams et al., “Our Turn-of-the-Century Trend Watch” Technical Training pp. 46-47 (Nov./Dec. 1998).
Barron, “The Road to Performance: Three Vignettes,” Technical Skills and Training pp. 12-14 (Jan. 1997).
Bauer, “Technology Tools: Just-in-Time Desktop Training is Quick, Easy, and Affordable,” Technical Training pp. 8-11 (May/Jun. 1998).
Beck et al., “Applications of A1 in Education,” AMC Crossroads vol. 1: 1-13 (Fall 1996) Web page, unverified print date of Apr. 12, 2002.
Benson and Cheney, “Best Practices in Training Delivery,” Technical Training pp. 14-17 (Oct. 1996).
Bental and Cawsey, “Personalized and Adaptive Systems for Medical Consumer Applications,” Communications ACM 45(5): 62-63 (May 2002).
Benyon and Murray, “Adaptive Systems: from intelligent tutoring to autonomous agents,” pp. 152, Web page, unknown date.
Blumenthal et al., “Reducing Development Costs with Intelligent Tutoring System Shells,” pp. 1-5, Web page, unverified print date of Apr. 9, 2002, unverified cover date of Jun. 10, 1996.
Brusilosy et al., “Distributed intelligent tutoring on the Web,” Proceedings of the 8th World Conference of the AIED Society, Kobe, Japan, Aug. 18-22, pp. 1-9 Web page, unverified print date of Apr. 12, 2002, unverified cover date of Aug. 18-22, 1997.
Brusilovsky and Pesin, ISIS-Tutor: An Intelligent Learning Environment for CD/ISIS Users, pp. 1-15 Web page, unverified print date of May 2, 2002.
Brusilovsky, “Adaptive Educational Systems on the World-Wide-Web: A Review of Available Technologies,” pp. 1-10, Web Page, unverified print date of Apr. 12, 2002.
Byrnes et al., “The Development of a Multiple-Choice and True-False Testing Environment on the Web,” pp. 1-8, Web page, unverified print date of Apr. 12, 2002, unverified cover date of 1995.
Calvi and DeBra, “Improving the Usability of Hypertext Coursewae through Adaptive Linking,”ACM, unknown page numbers (1997).
Coffey, “Are Performance Objectives Really Necessary?” Technical Skills and Training pp. 25-27 (Oct. 1995).
Cohen, “Knowledge Management's Killer App,” pp. 1-11, Web page, unverified print date of Sep. 12, 2002, unverified cover date of 2001.
Cole-Gomolski, “New Ways to manage E-Classes,” Computerworld 32(48):4344 (Nov. 30, 1998).
Cross: “Sun Microsystems—the SunTAN Story,” Internet Time Group 8 (© 2001).
Cybulski and Linden, “Teaching Systems Analysis and Design Using Multimedia and Patterns,” unknown date, unknown source.
De Bra et al., “Adaptive Hypermedia: From Systems to Framework,”ACM (2000).
De Bra, “Adaptive Educational Hypermedia on the Web,” Communications ACM 45(5):60-61 (May 2002).
Dennis and Gruner, “Computer Managed Instruction at Arthur Andersen & Company: A Status Report,” Educational Technical pp. 7-16 (Mar. 1992).
Diessel et al., “Individualized Course Generation: A Marriage Between CAL and ICAL,” Computers Educational 22(1/2) 57-65 (1994).
Dyreson, “An Experiment in Class Management Using the World Wide Web,” pp. 1-12, Web page, unverified print date of Apr. 12, 2002.
E Learning Community, “Excellence in Practice Award: Electronic Learning Technologies,” Personal Learning Network pp. 1-11, Web page, unverified print date of Apr. 12, 2002.
Eklund and Brusilovsky, “The Value of Adaptivity in Hypermedia Learning Environments: A Short Review of Empirical Evidence,” pp. 1-8, Web page, unverified print date of May 2, 2002.
e-Learning the future of learning, THINQ Limited, London, Version 1.0 (2000).
Eline, “A Trainer's Guide to Skill Building,” Technical Training pp. 34-41 (Sep./Oct. 1998).
Eline, “Case Study: Bridging the Gap in Canada's IT Skills,” Technical Skills and Training pp. 23-25 (Jul. 1997).
Eline “Case Study: IBT's Place in the Sun,” Technical Training pp. 12-17 (Aug./Sep. 1997).
Fritz, “CB templates for productivity: Authoring system templates for trainers,” Emedia Professional 10(8): 66-78 (Aug. 1997).
Fritz, “ToolBook II: Asymetrix's updated authoring software tackles the Web,” Emedia Professional 10(20): 102-106 (Feb. 1997).
Gibson et al., “A Comparative Analysis of Web-Based Testing and Evaluation Systems,” pp. 1-8, Web page, unverified print date of Apr. 11, 2002.
Halberg and DeFiore, “Curving Toward Performance: Following a Hierarchy of Steps Toward a Performance Orientation,” Technical Skills and Training pp. 9-11 (Jan. 1997).
Harsha, “Online Training ‘Sprints’ Ahead,” Technical Training pp. 27-29 (Jan./Feb. 1999).
Heideman, “Training Technicians for a High-Tech Future: These six steps can help develop technician training for high-tech work,” pp. 11-14 (Feb./Mar. 1995).
Heideman, “Writing Performance Objectives Simple as A-B-C (and D),” Technical Skills and Training pp. 5-7 (May/Jun. 1996).
Hollman, “Train Without Pain: The Benefits of Computer-Based Training Tools,” pp. 1-11, Web page, unverified print date of Mar. 20, 2002, unverified cover date of Jan. 1, 2000.
Klein, “Command Decision Training Support Technology,” Web page, unverified print date of Apr. 12, 2002.
Koonce, “Where Technology and Training Meet,” Technical Training pp. 10-15 (Nov./Dec. 1998).
Kursh, “Going the distance with Web-based training,” Training and Development 52(3): 50-53 (Mar. 1998).
Larson, “Enhancing Performance Through Customized Online Learning Support,” Technical Skills and Training pp. 25-27 (May/Jun. 1997).
Linton et al., “Owl: A Recommender System for Organization-Wide Learning,” Educational Technology & Society 3(1): 62-76 (2000).
Lucadamo and Cheney, “Best Practices in Technical Training,” Technical Training pp. 21-26 (Oct. 1997).
McNamara, “Monitoring Solutions: Quality Must be Seen and Heard,” Inbound/Outbound pp. 66-67 (Dec. 1989).
Merrill, “The New Component Design Theory: Instruction design for courseware authoring,” Instructional Science 16:19-34 (1987).
Minton-Eversole, “IBT Training Truths Behind the Hype,” Technical Skills and Training pp. 15-19 (Jan. 1997).
Mizoguchi, “Intelligent Tutoring Systems: The Current State of the Art,” Trans. IEICE E73(3):297-307 (Mar. 1990).
Mostow and Aist, “The Sounds of Silence: Towards Automated Evaluation of Student Learning in a Reading Tutor that Listens,” American Association for Artificial Intelligence, Web page, Aug. 1997.
Mullier et al., “A Web-based Intelligent Tutoring System,” pp. 1-6, Web page, unverified print date of May 2, 2002.
Nash, Database Marketing, 1993, pp. 158-165, 172-185, McGraw Hill, Inc. USA.
Nelson et al. “The Assessment of End-User Training Needs,” Communications ACM 38(7):27-39 (Jul. 1995).
O'Herron, “CenterForce Technologies CenterForce Analyzer,” Web page, unverified print date of Mar. 2, 2002, unverified cover date of Jun. 1, 1999.
O'Roark, “Basic Skills Get a Boost,” Technical Training pp. 10-13 (Jul./Aug. 1998).
Pamphlet, On Evaluating Educational Innovations, authored by Alan Lesgold, unverified cover date of Mar. 5, 1998.
Papa et al., “A Differential Diagnostic Skills Assessment and Tutorial Tool,” Computer Education 18(1-3):45-50 (1992).
PCT International Search Report, International Application No. PCT/US03/02541, mailed May 12, 2003.
Phaup, “New Software Puts Computerized Tests on the Internet: Presence Corporation announces breakthrough Question Mark™ Web Product,” Web page, unverified print date of Apr. 1, 2002.
Phaup, “QM Perception™ Links with Integrity Training's WBT Manager™ to Provide Enhanced Assessments of Web Based Courses,” Web page, unverified print date of Apr. 1, 2002, unverified cover date of Mar. 25, 1999.
Phaup, “Question Mark Introduces Access Export Software,” Web page, unverified print date of Apr. 2, 2002, unverified cover date of Mar. 1, 1997.
Phaup, “Question Mark Offers Instant Online Feedback for Web Quizzes and Questionnaires: University of California assists with Beta Testing, Server scripts now available on high-volume users,” Web page, unverified print date of Apr. 1, 2002, unverified cover date of May 6, 1996.
Piskurich, “Now-You-See-'Em, Now-You-Don't Learning Centers,” Technical Training pp. 18-21 (Jan./Feb. 1999).
Read, “Sharpening Agents' Skills,” pp. 1-15, Web page, unverified print date of Mar. 20, 2002, unverified cover date of Oct. 1, 1999.
Reid, “On Target: Assessing Technical Skills,” Technical Skills and Training pp. 6-8 (May/Jun. 1995).
Stormes, “Case Study: Restructuring Technical Training Using ISD,” Technical Skills and Training pp. 23-26 (Feb./Mar. 1997).
Tennyson, “Artificial Intelligence Methods in Computer-Based Instructional Design,” Journal of Instructional Development 7(3): 17-22 (1984).
The Editors, Call Center, “The Most Innovative Call Center Products We Saw in 1999,” Web page, unverified print date of Mar. 20, 2002, unverified cover date of Feb. 1, 2000.
Tinoco et al., “Online Evaluation in WWW-based Courseware,” ACM pp. 194-198 (1997).
Uiterwijk et al., “The virtual classroom,” InfoWorld 20(47): 64-67 (Nov. 23, 1998).
Unknown Author, “Long-distance learning,” InfoWorld 20(36): 76 (1998).
Untitled, 10th Mediterranean Electrotechnical Conference vol. 1 pp. 124-126 (2000).
Watson and Belland, “Use of Learner Data in Selecting Instructional Content for Continuing Education,” Journal of Instructional Development 8(4):29-33 (1985).
Weinschenk, “Performance Specifications as Change Agents,” Technical Training pp. 12-15 (Oct. 1997).
Witness Systems promotional brochure for eQuality entitled “Bringing eQuality to Business”.
Witness Systems promotional brochure for eQuality entitled “Building Customer Loyalty Through Business-Driven Recording of Multimedia Interactions in your Contact Center,” (2000).
Aspect Call Center Product Specification, “Release 2.0”, Aspect Telecommunications Corporation, May 23, 1998 798.
Metheus X Window Record and Playback, XRP Features and Benefits, 2 pages Sep. 1994 LPRs.
“Keeping an Eye on Your Agents,” Call Center Magazine, pp. 32-34, Feb. 1993 LPRs & 798.
Anderson, Interactive TV's New Approach, The Standard, Oct. 1, 1999.
Ante, Everything You Ever Wanted to Know About Cryptography Legislation . . . (But Were Too Sensible to Ask), PC World Online, Dec. 14, 1999.
Berst, It's Baa-aack. How Interactive TV is Sneaking Into Your Living Room, The AnchorDesk, May 10, 1999.
Berst, Why Interactive TV Won't Turn You on (Yet), The AnchorDesk, Jul. 13, 1999.
Borland and Davis, US West Plans Web Services on TV, CNETNews.com, Nov. 22, 1999.
Brown, Let PC Technology Be Your TV Guide, PC Magazine, Jun. 7, 1999.
Brown, Interactive TV: The Sequel, NewMedia, Feb. 10, 1998.
Cline, Déjà vu—Will Interactive TV Make It This Time Around?, DevHead, Jul. 9, 1999.
Crouch, TV Channels on the Web, PC World, Sep. 15, 1999.
D'Amico, Interactive TV Gets $99 set-top box, IDG.net, Oct. 6, 1999.
Davis, Satellite Systems Gear Up for Interactive TV Fight, CNETNews.com, Sep. 30, 1999.
Diederich, Web TV Data Gathering Raises Privacy Concerns, ComputerWorld, Oct. 13, 1998.
Digital Broadcasting, Interactive TV News.
EchoStar, MediaX Mix Interactive Multimedia With Interactive Television, PR Newswire, Jan. 11, 1999.
Furger, The Internet Meets the Couch Potato, PC World, Oct. 1996.
Hong Kong Comes First with Interactive TV, SCI-TECH, Dec. 4, 1997.
Interactive TV Overview TimeLine, Interactive TV News.
Interactive TV Wars Heat Up, Industry Standard.
Needle, Will the Net Kill Network TV? PC World Online, Mar. 10, 1999.
Kane, AOL-Tivo: You've Got Interactive TV, ZDNN, Aug. 17, 1999.
Kay, E-Mail in Your Kitchen, PC World Online, Mar. 28, 1996.
Kenny, TV Meets Internet, PC World Online, Mar. 28, 1996.
Linderholm, Avatar Debuts Home Theater PC, PC World Online, Dec. 1, 1999.
Mendoza, Order Pizza While You Watch, ABCNews.com.
Moody, WebTV: What's the Big Deal?, ABCNews.com.
Murdorf, et al., Interactive Television—Is There Life After the Internet?, Interactive TV News.
Needle, PC, TV or Both?, PC World Online.
Interview with Steve Perlman, CEO of Web-TV Networks, PC World Online.
Press, Two Cultures, The Internet and Interactive TV, Universite de Montreal.
Reuters, Will TV Take Over Your PC?, PC World Online.
Rohde, Gates Touts Interactive TV, InfoWorld, Oct. 14, 1999.
Ross, Broadcasters Use TV Signals to Send Data, PC World Oct. 1996.
Schlisserman, Is Web TV a Lethal Weapon?, PC World Online.
Stewart, Interactive Television at Home: Television Meets the Internet, Aug. 1998.
Swedlow, Computer TV Shows: Ready for Prime Time?, PC World Online.
Wilson, U.S. West Revisits Interactive TV, Interactive Week, Nov. 28, 1999.
Provisional Applications (1)
Number Date Country
60798683 May 2006 US