Interface and method of designing an interface

Information

  • Patent Grant
  • Patent Number: 7,139,369
  • Date Filed: Thursday, August 29, 2002
  • Date Issued: Tuesday, November 21, 2006
Abstract
A method of designing an interface system that allows users to map the representation of their task directly to the interface. There are three major phases to the Customer-Centric Approach to Interface Design (C-CAID). First, end-users' tasks are categorized to determine the frequency of the reasons, or tasks, for which users interact with a particular system. These reasons and their relative frequencies are then used to design interface options that emphasize the users' task categories. Finally, the customer-centric interface designs are evaluated and compared with existing system interfaces using usability tests with actual users performing the tasks. The results of the usability tests are used to pinpoint the task-option combinations that do not work well and that should be revised. Benefits of this customer-centric design are improved system performance and increased user satisfaction.
Description
TECHNICAL FIELD OF THE INVENTION

The present invention relates to a customer-centric approach to interface design (C-CAID), such as for Interactive Voice Response (IVR) systems. The customer-centric approach to IVR menu design produces menu options that closely match the various tasks that customers are trying to accomplish when they access an IVR system. The menu options are grouped and ordered by the frequency of occurrence of specific customer tasks, and they are worded in the language used by the customer to facilitate customer understanding of the actual choice provided.


BACKGROUND OF THE INVENTION

Every year, millions of customers call various customer service centers looking for assistance with various tasks that they want to accomplish or looking for answers to various inquiries. The end goal for both the customer and the customer service center is to route the customer's call to an organizational representative who can best accomplish the customer's task, while minimizing the number of misdirected calls. Presently, most customer calls are answered by an IVR whose primary function is to direct the call to an appropriate service center. To get a specific call to the correct center, the customers have to map or correlate their reason for calling onto the applicable IVR menu choices.


A major shortcoming of many prior art interface design methods is that these methods simply map customer service departments onto an organizational hierarchy and allocate tasks to these departments based upon this organizational structure. Interface design is often accomplished with little or no empirical data or user-centered methodology to guide the design process. If there is a department that handles new accounts, “New Accounts” becomes a menu option and incoming calls are routed to that department. The remaining services provided by the various organizational service centers are allocated accordingly. The interface design is thus forcibly fit onto the existing organizational structure.


Another shortcoming of this approach is that many of the organizational structure names do not translate easily into language constructs that a user would readily understand. For example, a service organization entitled “High Speed Internet Access Modalities” when converted into a menu option would not convey the same information content to the average user as “Ordering a new ISDN Line”.


The underlying problem common to all of the prior art systems is that no user data or user information is input into systems development in the early design stages. The resulting user interface merely mirrors an organization, and no customization or optimization is folded into the design of the interface. The resulting menu is simply imposed on the user with no consideration or forethought for the actual requirements of the user or how best to address the user's needs.


The goal of an effective interface design methodology is to use as much user input as possible to ensure that users are presented with choices that they need and are able to understand, while simultaneously minimizing and potentially eliminating unproductive time spent looking at and having to choose between selections that do not address or solve the specific problems that customers want solved or addressed. The choice of style, type, structure and language used in the interface design all have a significant impact upon performance.


In contrast to traditional methods of interface design, the current methodology takes into consideration end-user task frequency and applies the language of the user in the design of the interface menu options. It is much easier for a user to understand and make an intelligent and informed selection of a requested service or task in their own words and in the language of more common usage, rather than listening to recitations of technical jargon and organization-specific descriptions. In a conventional system, a user may have to hazard a guess (with the resulting possibility of an incorrect guess) as to which category applies to their particular situation. Such guessing is minimized by the present interface design method.


Therefore, a method is needed to design an interface that maximizes the performance of the users during operation, reduces misrouted calls, and ensures that the users arrive at their desired destination after successfully (i.e., rapidly and directly) navigating through the menuing system. The present customer-centric design methodology has been shown to improve system performance by mapping the user's task representations to the task options of the user interface. The C-CAID approach allows customers to make the correct menu option selection and reach the correct service department without requiring additional assistance or intervention from an operator or customer service representative.


Following the C-CAID approach results in a user interface that the customer can easily understand and successfully navigate. In other words, the interface according to the present invention is designed to be customer friendly. This increases system performance, reduces operational costs, improves customer satisfaction, and leads to general improvements in overall efficiency.


SUMMARY OF THE DESCRIPTION

Accordingly, the present invention is directed to a method for designing a user interface that takes into consideration the user's input in mapping specific tasks to the interface.


It is an object of the present invention to provide a customer-centric method for designing a menu based option interface. Accordingly, the present invention provides for identifying, categorizing, producing, using, evaluating and comparing the system interface by employing usability tests with actual user task performance data to optimize the system user interface.


The present invention is directed to a method for designing an interface system comprising identifying reasons for a user to interact with the interface system, categorizing the reasons into task categories based at least upon commonality of subject matter, producing menu options based upon the task categories, and organizing the menu options based upon the frequency of occurrence of the task categories.


The method is further directed to evaluating the designed interface system utilizing usability tests to optimize the interface system. Further, the usability tests compare menu options of the designed interface system with an existing interface system. Yet further, evaluating comprises identifying ineffective task categories and menu option combinations.


According to further features of the invention, producing the menu options comprises utilizing customer-centric terminology to produce the menu options and the identified reasons are mapped to an associated menu option that addresses responses to queries represented by the reasons.


According to a feature of the present invention, the interface system comprises an Interactive Voice Response (IVR) system.


Additionally, identifying reasons addresses the reasons why a user may access the interface system and defines a relationship between tasks related to the reasons and menu options.


According to a feature of the present invention, the organization of menu options locates the high frequency tasks earlier in the sequence of menu options.


Further, a prediction model based upon expected user task volume and task frequency is utilized to provide an indication of how an interface system will perform. In addition, the cumulative response time (CRT) and routing accuracy are used to evaluate performance of the interface system.


The present invention is directed to a method for designing an interface system, comprising utilizing reasons a user is interfacing with a system to define tasks, determining frequency of task occurrence, categorizing tasks in accordance with the frequency of task occurrence, and utilizing the categorized tasks to design an interface system options menu.


According to a further feature, the tasks are categorized based upon task frequency and call volume.


Additionally, user relevant terms are used in the menu options.


Further, evaluating the performance of the interface system can be by use of an analytic scoring model.


Additionally, utilizing of the categorized tasks can include grouping and ordering the menu options in accordance with a frequency of task occurrence.


Yet further, the menu options can be ordered so that higher frequency tasks are positioned higher in the sequence of options.


Additionally, a prediction model based upon expected user task volume and task frequency can be utilized to provide an indication of how the interface system will perform.


According to the present invention, cumulative response time (CRT) and routing accuracy can be used to evaluate performance of the interface system.


The present invention also is directed to an interface system for performing tasks related to a user's reasons for interacting with the interface system, the interface system including an ordered set of menu options, wherein the order of the menu options of the set of menu options is determined in accordance with task frequency and each menu option is defined in terms of customer relevant terminology.


Additionally, the menu options can represent tasks categorized according to task frequency and call volume.


Furthermore, the interface can comprise an interactive voice response system.


The foregoing objects are achieved by the present invention. Additional features and advantages of the present invention will be set forth in the description to follow, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the methods particularly pointed out in the written description and claims hereof, together with the appended drawings.


In the following description, the C-CAID of the present invention will be described as applied to an IVR for a telecommunication company small business call center. However, the C-CAID design approach of the present invention is not limited in application to small business call center IVR design. The present design approach may be used in any organization or business unit and is not limited to telephone or telecommunications companies. It can be used for any type of company or governmental unit having need for an interface, such as an IVR, with which others, such as members of the public (i.e., users), can readily and efficiently interact.


In addition to the IVR system described in this specification, the C-CAID approach may be used to design any man/machine interface, such as one between a computer and a computer operator. In particular, this method is also directly applicable to the design of an internet web site map. The C-CAID model can be applied to a wide variety of interface types and modalities. C-CAID could be applied to the design of any human/machine interface such as, merely as non-limiting examples, a computer menuing system, an airline reservation system, an on-line shopping or electronic commerce web site, etc.


The C-CAID methodology could also be used in optimizing the interface between two or more machines or devices. C-CAID could be used to identify those machine tasks that occur at a greater relative frequency than others, and the system could be designed to focus on these tasks and optimize their operations. For example, additional processing power and a reallocation of greater memory resources to perform these higher frequency-of-occurrence tasks could be incorporated into the system design.


The C-CAID methodology used by the present invention is organization and system independent. Further, it should be noted that the data assembled, processed and folded into an IVR design can be updated as often as the designer or organization wants. In other words, the IVR can be updated yearly, quarterly, weekly, etc. The system can also be adapted to acquire data and measurement parameters in real time and correspondingly adapt the interface dynamically in light of the acquired information.


It is to be understood that both the foregoing general description and the following detailed description are only exemplary and explanatory, rather than limiting, and are intended to provide further explanation of the invention as claimed.


The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrating one embodiment of the invention. The drawings, together with the description, serve to explain the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

A more complete understanding of the present embodiments and advantages thereof may be acquired by referring to the following description taken in conjunction with the accompanying drawings, in which like reference numbers indicate like features, and wherein:


The present invention is illustrated by way of example, and not by way of limitation, by the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:



FIG. 1 is a flowchart representing the C-CAID methodology;



FIG. 2 is a representation of a CRT/Routing Matrix;



FIG. 3 shows a sample calculation of CRT;



FIG. 4 shows the percentage of calls made to small business centers; and



FIG. 5 shows a sample questionnaire.





DETAILED DESCRIPTION OF THE INVENTION

The C-CAID process provides a statistically optimized and validated design methodology based upon empirical data support to help a design team apply the needs and desires of the user to the design of interface options.


Studies have shown that customers are generally task oriented. As an illustrative example, a recent study showed that all of the customers knew the reason why they were calling a specific center (they knew their task), but less than half knew which department to ask for. Also, about half of the customers who did ask for a specific department were guessing, and these guesses were incorrect.


Consequently, customers who are able to map (i.e. correlate) their reason for calling (i.e. their task) directly to an IVR menu option get routed to the correct call center. Customers who cannot correctly map their task to the corresponding menu option might well make an incorrect selection and generate a misdirected call, which wastes resources for both the customer and the organization utilizing the IVR.


When applying the customer-centric approach to IVR design, it is very important to define IVR menu options using customer-relevant terms that are extracted or derived from real-world scenarios. This ensures that the users can understand the menu options and consequently make an intelligent and informed decision as to what they need and where best to satisfy that need in the overall system. To assure that technical terms will be understandable and salient to users, data must be collected from the various call/service centers to design the menu options using the customer's own wording.


When a first iteration of the customer-centric design is complete, one must evaluate its potential performance through an analytic scoring model. The customer-centric menu options tend to be combinations of the various task categories. The analytic scoring model uses several customer statements for each of the task categories used in the design process and determines the percentage of those statements that map to the corresponding option. The menu designer is trying to map a sample of the original statements of purpose onto the menu options so that a high percentage of customers will comprehend the options. This allows the customer to readily ascertain the menu option most pertinent to their specific needs.
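As a non-limiting illustration, the analytic scoring model described above can be sketched as follows. The sketch assumes that a sample of customer statements has been collected for each task category and that some mapping procedure (manual judging or an automated classifier, represented here by a placeholder function) assigns each statement to a menu option; the score for a category is simply the percentage of its sampled statements that land on the intended option.

    def analytic_score(statements_by_category, intended_option, map_statement_to_option):
        # statements_by_category: dict of category code -> list of sampled customer statements
        # intended_option: dict of category code -> the menu option the design intends for it
        # map_statement_to_option: placeholder callable that assigns a statement to a menu option
        scores = {}
        for category, statements in statements_by_category.items():
            hits = sum(1 for s in statements
                       if map_statement_to_option(s) == intended_option[category])
            scores[category] = 100.0 * hits / len(statements)
        return scores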



FIG. 1 is a flow chart of the customer-centric methodology flow. Step S1 is the initial data collection step where any data, information or metrics pertinent to the interface design is collected.


Table 1 shows a breakdown of the collected sample by call center type, the location of the call center facilities, and the number of calls collected from each call center site.









TABLE 1
Number of calls to small business call centers

  Call Center Type   Call Center Location   Number of Calls
  AA                 TX1                                561
                     AR1                                222
                     AA Subtotal =                      783
  NN                 MO1                                175
                     TX2                                 31
                     NN Subtotal =                      206
  RR                 TX3                                195
                     TX4                                151
                     OK1                                 77
                     MO2                                174
                     RR Subtotal =                      597
  CC                 MO2                                495
                     CC Subtotal =                      495
  MM                 OK2                                310
                     MM Subtotal =                      310
  TOTAL                                                2391

FIG. 1, step S21, shows that as the service representatives at these centers receive calls, they log the customer's opening statement or reason for calling using an IVR Call Log Sheet that can be found in Table 2.


As is readily apparent, utilizing a Call Log Sheet such as the one shown in Table 2, a service representative can easily record the reason for each of a number of customers' calls and the length of each call.









TABLE 2
Sample Call Log Sheet

  Small Business IVR Call Log Sheet                                Date:
  Service Center:
  CSR:

  Call       Reason/Purpose of Customer's Call                     Call Length (minutes)
  Example    “I'd like to find out the cost of adding two more      3.5
             business phone lines.”
  1
  2
  3
  4
  5
  6
  7
  8
  9
  10


The C-CAID is based upon the user's representation of a task to be performed while interacting with a particular system interface. FIG. 1, step S2, indicates that user-centric statements should be applied to the design of the interface to assure that items are clearly mapped to the task options that the system can present to the user for selection. User-centric task statements are collected from a sample of users and categorized based on the tasks to be performed by the system.


Each task statement is classified or categorized into one of about seventy customer-centric task categories, as shown in step S22 of FIG. 1. These customer-centric task categories were logically derived from a general consumer model. A general consumer model is a model based upon the potential tasks that a customer might require of an organization (e.g., a telephone company), but it does not have to be tailored to the unique organizational structure of a specific telephone company. Subcategories are generated to capture greater detail of the lower-level general task categories. The categories are then rank-ordered by the number of statements they represent. The IVR Call Log Sheets (Table 2) are electronically recorded, and the task categories are coded in spreadsheets for analysis. Examples of customer task statements are illustrated in Table 3; they show how individual customer statements can be grouped into a higher-level category that characterizes the same general information content.
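As a non-limiting illustration, the tallying and rank-ordering of step S22 can be sketched as follows, assuming the logged statements have already been labeled with one of the customer-centric task-category codes. The example labels are illustrative, not actual call-center data.

    from collections import Counter

    # One task-category code per logged call (illustrative labels only).
    labeled_statements = ["B4", "I2", "B4", "C1", "B4", "I1", "D1", "B4", "I2", "A2"]

    counts = Counter(labeled_statements)
    total = sum(counts.values())

    # Rank-order the categories by the number of statements they represent.
    for rank, (code, n) in enumerate(counts.most_common(), start=1):
        print(f"{rank:>2}. {code:<4} {n:>3} statements  {100.0 * n / total:5.1f}%")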


It should be noted that the data collection method mentioned above can be performed either manually or automatically. Statistics can be accumulated via a data extraction or data collection algorithm designed to collect data without changing any of the intent, goals or effects of the present invention. This process automation can be applied to any of the steps found in the process flow diagram depicted in FIG. 1.









TABLE 3
Example customer statements to small business call centers.

  Call   Reason/Purpose of Customer's Call                             Task Category
  1      “I'd like to find out the cost of adding two more business    Get information
         phone lines.”
  2      “I want to order an ISDN line.”                               Acquire services
  3      “I need some help on how to use call forwarding.”             Information request
  4      “I need to get someone to look at my broken phone.”           Fix a service problem
  5      “I called to check on my move orders.”                        Relocate service


Before coding the customer task statements into task categories, it is important to validate all of the task categories, as shown in step S23 of FIG. 1. Table 4 lists the different types of task categories that can be used in the design of the small business customer-centric IVR.









TABLE 4
Customer Statement Categorization list

  Acquire Service
    A0    Unspecified Addition
    A1    Request new phone service (Open an Account)
    A2    Add Optional Services
    A3    Schedule Pending Acquisition
    A4    Get Info about a Pending Acquisition
    A5    Give Info for a Pending Acquisition
    A6    Change a Pending Acquisition
    A7    Cancel a Pending Acquisition
    A8    Reconnect Service
    A9    Acquire Service Temporarily

  Discontinue Service
    D0    Discontinue Unspecified Service
    D1    Disconnect (Close Account)
    D2    Discontinue Optional Service
    D3    Schedule a Pending Deletion
    D4    Get Info about a Pending Deletion
    D5    Give Info for a Pending Deletion
    D6    Change a Pending Deletion
    D7    Cancel a Pending Deletion
    D9    Discontinue Service Temporarily
    D11   Return a Product

  Fix a Service Problem
    F0    Fix an unspecified problem
    F1    Fix a Service
    F2    Fix a Product
    F3    Schedule a Pending Repair
    F4    Get Info about a Pending Repair
    F5    Give Info about a Pending Repair
    F6    Change a Pending Repair
    F7    Cancel a Pending Repair
    F8    Report a Problem
    F9    Fix a Problem

  Relocate Service
    M0    Move unspecified items (I'm moving.)
    M1    Request a Move of Service
    M3    Schedule a Pending Move
    M4    Get Info about a Pending Move
    M5    Give Info for the Pending Move
    M6    Change a Pending Move
    M7    Cancel a Pending Move
    M9    Move Service Temporarily

  Change Account Information, Services, or Service Features
    C0    Change Unspecified
    C1    Change Account Data
    C2    Change Optional Services
    C8    Change a Feature of a Service
    C9    Make a Temporary Change

  Bill Issues
    B0    Unspecified issue with the Bill
    B4    Get Information about the Bill
    B5    Give Information for the Bill
    B6    Dispute the Bill

  Pay Arrangement or Report a Payment
    P0    Inquire about Payment Options
    P1    Set Up Payment Arrangement
    P2    Where to Make a Payment
    P3    Schedule Payment
    P4    Get Info about a Pending Payment
    P5    Give Info for a Pending Payment
    P6    Change a Pending Payment
    P7    Cancel a Pending Payment

  Information Requests
    I0    Nature of inquiry unspecified
    I1    On Services
    I2    My Account
    I3    Other Service Providers
    I11   Other Company Offices
    I12   Name/Address/Number

  Long Distance/Other Service Provider
    L1    Add a Carrier/Provider
    L2    Restore a Carrier/Provider
    L4    Get Info from the Carrier/Provider
    L5    Give Info to the Carrier/Provider
    L6    Change Carrier/Provider
    L7    Cancel a Carrier/Provider
    L8    Billing Issue for a Carrier/Provider
    L9    Payment for a Carrier/Provider
    L11   PIC Letter


Validation is the process whereby one tests all of the proposed call mapping categories to make sure that they adequately cover all of the reasons that a customer may call the organization for which the IVR is being developed. The interface designer wants to make sure that the IVR can handle any possible reason that a customer would be calling.


Different subsidiaries of a company may have different task categories than the ones used for the small business customer-centric IVR. The task categories listed in Table 4 were originally developed for the consumer customer-centric IVR. However, it may be necessary to test the proposed call mapping categories to make sure that they adequately cover all the reasons a customer calls the subsidiary for which the IVR is being developed. This process involves taking a sample of randomized, customer reasons for calling a center and mapping these reasons to the proposed categories. Going through this process allows the designer to amend the task categories to better fit the call center that will interface with the IVR. This approach allows adding or deleting various categories, reorganizing them entirely, or reorganizing any combination thereof.
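As a non-limiting illustration, the validation step (FIG. 1, step S23) can be sketched as follows. The classify function is a hypothetical placeholder for whatever manual or automated mapping is used; the sketch simply measures what fraction of a random sample of customer reasons falls into an existing category and returns the unmapped reasons so that the category list can be amended.

    def validate_categories(sample_reasons, classify, valid_codes):
        # sample_reasons: randomized sample of customer reasons for calling (strings)
        # classify: placeholder callable mapping a reason to a category code (or None)
        # valid_codes: set of proposed task-category codes (e.g., those in Table 4)
        unmapped = [r for r in sample_reasons if classify(r) not in valid_codes]
        coverage = 1.0 - len(unmapped) / len(sample_reasons)
        return coverage, unmapped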


The metrics tabulated in Table 5 through Table 9 illustrate the most frequent reasons that small business customers call the different call centers listed in Table 1.









TABLE 5
Why customers call AA.

  Category   Description                                     Category %
  B4         Get Information about the Bill                     33.9%
  I2         Get Information on my Account                      17.5%
  C1         Change Account Data                                 6.8%
  D1         Disconnect (Close Account)                          5.4%
  B6         Dispute the Bill                                    5.3%
  I1         Get Information on Services                         4.7%
  D2         Discontinue Optional Service                        3.7%
  A2         Add Optional Services                               2.6%
  P4         Get Info about a Pending Payment                    2.6%
  L6         Change Carrier/Provider                             2.1%
  I11        Get Information on other Company Offices            1.8%
  L2         Restore a Carrier/Provider                          1.3%
  P1         Set Up Payment Arrangement                          1.3%
  A8         Reconnect Service                                   1.2%
  P5         Give Info for a Pending Payment                     1.2%
  F1         Fix a Service                                       1.1%
  P0         Inquire about Payment Options                       0.9%
  D4         Get Info about a Pending Deletion                   0.8%
  I0         Nature of inquiry unspecified                       0.8%
  I12        Get Information on a Name/Address/Number            0.8%
  M1         Request a Move of Service                           0.8%
  C2         Change Optional Services                            0.7%
  L7         Cancel a Carrier/Provider                           0.7%
  A4         Get Info about a Pending Acquisition                0.4%
  A1         Request new phone service                           0.3%
  L5         Give Info to the Carrier/Provider                   0.3%
  P3         Schedule Payment                                    0.3%
  A5         Give Info for a Pending Acquisition                 0.1%
  C8         Change a Feature of a Service                       0.1%
  D9         Discontinue Service Temporarily                     0.1%
  F2         Fix a Product                                       0.1%
  F8         Report a Problem                                    0.1%
  L4         Get Info from the Carrier/Provider                  0.1%
  M4         Get Info about a Pending Move                       0.1%
  P2         Where to Make a Payment                             0.1%
  Total                                                        100.0%
  A0         Unspecified Addition                                0.0%
  A3         Schedule Pending Acquisition                        0.0%
  A6         Change a Pending Acquisition                        0.0%
  A7         Cancel a Pending Acquisition                        0.0%
  A9         Acquire Service Temporarily                         0.0%
  B5         Give Information for the Bill                       0.0%
  C0         Change Unspecified                                  0.0%
  D0         Discontinue Unspecified Service                     0.0%
  D11        Return a Product                                    0.0%
  D5         Give Info for a Pending Deletion                    0.0%
  F0         Fix an unspecified problem                          0.0%
  F3         Schedule a Pending Repair                           0.0%
  F9         Fix a Problem                                       0.0%
  I3         Get Information on other Service Providers          0.0%
  L1         Add a Carrier/Provider                              0.0%
  L11        PIC Letter                                          0.0%
  L8         Billing Issue for a Carrier/Provider                0.0%
  M5         Give Info for the Pending Move                      0.0%
  M9         Move Service Temporarily                            0.0%




TABLE 6
Why customers call CC.

  Category   Description                                     Category %
  L6         Change Carrier/Provider                            29.5%
  I2         Get Information on my Account                      20.5%
  L4         Get Info from the Carrier/Provider                 10.6%
  B4         Get Information about the Bill                      6.9%
  L2         Restore a Carrier/Provider                          6.0%
  I1         Get Information on Services                         5.4%
  L7         Cancel a Carrier/Provider                           4.3%
  L1         Add a Carrier/Provider                              2.6%
  B6         Dispute the Bill                                    2.2%
  C1         Change Account Data                                 1.9%
  A2         Add Optional Services                               1.7%
  C2         Change Optional Services                            1.1%
  I11        Get Information on other Company Offices            1.1%
  L5         Give Info to the Carrier/Provider                   1.1%
  D2         Discontinue Optional Service                        0.9%
  A4         Get Info about a Pending Acquisition                0.6%
  A1         Request new phone service                           0.4%
  F1         Fix a Service                                       0.4%
  I12        Get Information on a Name/Address/Number            0.4%
  P4         Get Info about a Pending Payment                    0.4%
  A7         Cancel a Pending Acquisition                        0.2%
  D11        Return a Product                                    0.2%
  F9         Fix a Problem                                       0.2%
  I3         Get Information on other Service Providers          0.2%
  L8         Billing Issue for a Carrier/Provider                0.2%
  M1         Request a Move of Service                           0.2%
  P1         Set Up Payment Arrangement                          0.2%
  P5         Give Info for a Pending Payment                     0.2%
  Total                                                        100.0%
  A0         Unspecified Addition                                0.0%
  A3         Schedule Pending Acquisition                        0.0%
  A5         Give Info for a Pending Acquisition                 0.0%
  A6         Change a Pending Acquisition                        0.0%
  A8         Reconnect Service                                   0.0%
  A9         Acquire Service Temporarily                         0.0%
  B5         Give Information for the Bill                       0.0%
  C0         Change Unspecified                                  0.0%
  C8         Change a Feature of a Service                       0.0%
  D0         Discontinue Unspecified Service                     0.0%
  D1         Disconnect (Close Account)                          0.0%
  D4         Get Info about a Pending Deletion                   0.0%
  D5         Give Info for a Pending Deletion                    0.0%
  D9         Discontinue Service Temporarily                     0.0%
  F0         Fix an unspecified problem                          0.0%
  F2         Fix a Product                                       0.0%
  F3         Schedule a Pending Repair                           0.0%
  F8         Report a Problem                                    0.0%
  I0         Nature of inquiry unspecified                       0.0%
  L11        PIC Letter                                          0.0%
  M4         Get Info about a Pending Move                       0.0%
  M5         Give Info for the Pending Move                      0.0%
  M9         Move Service Temporarily                            0.0%
  P0         Inquire about Payment Options                       0.0%
  P2         Where to Make a Payment                             0.0%
  P3         Schedule Payment                                    0.0%






TABLE 7
Why customers call RR.

  Category   Description                                     Frequency %
  I1         Get Information on Services                        20.4%
  A2         Add Optional Services                              16.9%
  I2         Get Information on my Account                       8.4%
  C1         Change Account Data                                 7.1%
  A4         Get Info about a Pending Acquisition                5.4%
  D2         Discontinue Optional Service                        4.7%
  D1         Disconnect (Close Account)                          3.8%
  C2         Change Optional Services                            3.3%
  I11        Get Information on other Company Offices            3.0%
  M1         Request a Move of Service                           3.0%
  B4         Get Information about the Bill                      2.8%
  L6         Change Carrier/Provider                             2.6%
  F1         Fix a Service                                       2.4%
  A1         Request new phone service                           1.9%
  A5         Give Info for a Pending Acquisition                 1.9%
  A8         Reconnect Service                                   1.6%
  L5         Give Info to the Carrier/Provider                   1.6%
  L1         Add a Carrier/Provider                              1.2%
  C8         Change a Feature of a Service                       1.0%
  B6         Dispute the Bill                                    0.9%
  A7         Cancel a Pending Acquisition                        0.7%
  L4         Get Info from the Carrier/Provider                  0.5%
  A3         Schedule Pending Acquisition                        0.3%
  D4         Get Info about a Pending Deletion                   0.3%
  D5         Give Info for a Pending Deletion                    0.3%
  L2         Restore a Carrier/Provider                          0.3%
  M4         Get Info about a Pending Move                       0.3%
  P2         Where to Make a Payment                             0.3%
  P5         Give Info for a Pending Payment                     0.3%
  A0         Unspecified Addition                                0.2%
  A6         Change a Pending Acquisition                        0.2%
  A9         Acquire Service Temporarily                         0.2%
  B5         Give Information for the Bill                       0.2%
  C0         Change Unspecified                                  0.2%
  D0         Discontinue Unspecified Service                     0.2%
  D9         Discontinue Service Temporarily                     0.2%
  F2         Fix a Product                                       0.2%
  F3         Schedule a Pending Repair                           0.2%
  F8         Report a Problem                                    0.2%
  L11        PIC Letter                                          0.2%
  M5         Give Info for the Pending Move                      0.2%
  M9         Move Service Temporarily                            0.2%
  P0         Inquire about Payment Options                       0.2%
  Total                                                        100.0%
  D11        Return a Product                                    0.0%
  F0         Fix an unspecified problem                          0.0%
  F9         Fix a Problem                                       0.0%
  I0         Nature of inquiry unspecified                       0.0%
  I12        Get Information on a Name/Address/Number            0.0%
  I3         Get Information on other Service Providers          0.0%
  L7         Cancel a Carrier/Provider                           0.0%
  L8         Billing Issue for a Carrier/Provider                0.0%
  P1         Set Up Payment Arrangement                          0.0%
  P3         Schedule Payment                                    0.0%
  P4         Get Info about a Pending Payment                    0.0%







TABLE 8
Why customers call MM.

  Category   Description                                     Frequency %
  I1         Get Information on Services                         36.1
  I11        Get Information on other Company Offices            14.0
  A2         Add Optional Services                               13.7
  I2         Get Information on my Account                        6.0
  A4         Get Info about a Pending Acquisition                 5.3
  B4         Get Information about the Bill                       3.2
  M1         Request a Move of Service                            3.2
  F1         Fix a Service                                        2.1
  F2         Fix a Product                                        2.1
  C1         Change Account Data                                  1.8
  A1         Request new phone service                            1.4
  C2         Change Optional Services                             1.4
  D2         Discontinue Optional Service                         1.4
  M4         Get Info about a Pending Move                        1.4
  D11        Return a Product                                     1.1
  L4         Get Info from the Carrier/Provider                   1.1
  A3         Schedule Pending Acquisition                         0.7
  A5         Give Info for a Pending Acquisition                  0.7
  C8         Change a Feature of a Service                        0.7
  F0         Fix an unspecified problem                           0.7
  A8         Reconnect Service                                    0.4
  A9         Acquire Service Temporarily                          0.4
  D1         Disconnect (Close Account)                           0.4
  F8         Report a Problem                                     0.4
  M5         Give Info for the Pending Move                       0.4
  P5         Give Info for a Pending Payment                      0.4
  Total                                                         100.0
  A0         Unspecified Addition                                 0.0
  A6         Change a Pending Acquisition                         0.0
  A7         Cancel a Pending Acquisition                         0.0
  B5         Give Information for the Bill                        0.0
  B6         Dispute the Bill                                     0.0
  C0         Change Unspecified                                   0.0
  D0         Discontinue Unspecified Service                      0.0
  D4         Get Info about a Pending Deletion                    0.0
  D5         Give Info for a Pending Deletion                     0.0
  D9         Discontinue Service Temporarily                      0.0
  F3         Schedule a Pending Repair                            0.0
  F9         Fix a Problem                                        0.0
  I0         Nature of inquiry unspecified                        0.0
  I12        Get Information on a Name/Address/Number             0.0
  I3         Get Information on other Service Providers           0.0
  L1         Add a Carrier/Provider                               0.0
  L11        PIC Letter                                           0.0
  L2         Restore a Carrier/Provider                           0.0
  L5         Give Info to the Carrier/Provider                    0.0
  L6         Change Carrier/Provider                              0.0
  L7         Cancel a Carrier/Provider                            0.0
  L8         Billing Issue for a Carrier/Provider                 0.0
  M9         Move Service Temporarily                             0.0
  P0         Inquire about Payment Options                        0.0
  P1         Set Up Payment Arrangement                           0.0
  P2         Where to Make a Payment                              0.0
  P3         Schedule Payment                                     0.0
  P4         Get Info about a Pending Payment                     0.0





TABLE 9
Why customers call NN.

  Category   Description                                     Frequency %
  A1         Request new phone service                           20.1
  I1         Get Information on Services                         18.0
  M1         Request a Move of Service                           10.1
  A2         Add Optional Services                                7.4
  B4         Get Information about the Bill                       7.4
  I2         Get Information on my Account                        6.3
  A4         Get Info about a Pending Acquisition                 4.8
  C1         Change Account Data                                  4.8
  D1         Disconnect (Close Account)                           3.7
  C2         Change Optional Services                             2.1
  A3         Schedule Pending Acquisition                         1.6
  I11        Get Information on other Company Offices             1.6
  L4         Get Info from the Carrier/Provider                   1.6
  M5         Give Info for the Pending Move                       1.6
  A5         Give Info for a Pending Acquisition                  1.1
  D4         Get Info about a Pending Deletion                    1.1
  L6         Change Carrier/Provider                              1.1
  M4         Get Info about a Pending Move                        1.1
  A8         Reconnect Service                                    0.5
  A9         Acquire Service Temporarily                          0.5
  B6         Dispute the Bill                                     0.5
  D2         Discontinue Optional Service                         0.5
  F2         Fix a Product                                        0.5
  I0         Nature of inquiry unspecified                        0.5
  P1         Set Up Payment Arrangement                           0.5
  P4         Get Info about a Pending Payment                     0.5
  P5         Give Info for a Pending Payment                      0.5
  Total                                                         100.0
  A0         Unspecified Addition                                 0.0
  A6         Change a Pending Acquisition                         0.0
  A7         Cancel a Pending Acquisition                         0.0
  B5         Give Information for the Bill                        0.0
  C0         Change Unspecified                                   0.0
  C8         Change a Feature of a Service                        0.0
  D0         Discontinue Unspecified Service                      0.0
  D11        Return a Product                                     0.0
  D5         Give Info for a Pending Deletion                     0.0
  D9         Discontinue Service Temporarily                      0.0
  F0         Fix an unspecified problem                           0.0
  F1         Fix a Service                                        0.0
  F3         Schedule a Pending Repair                            0.0
  F8         Report a Problem                                     0.0
  F9         Fix a Problem                                        0.0
  I12        Get Information on a Name/Address/Number             0.0
  I3         Get Information on other Service Providers           0.0
  L1         Add a Carrier/Provider                               0.0
  L11        PIC Letter                                           0.0
  L2         Restore a Carrier/Provider                           0.0
  L5         Give Info to the Carrier/Provider                    0.0
  L7         Cancel a Carrier/Provider                            0.0
  L8         Billing Issue for a Carrier/Provider                 0.0
  M9         Move Service Temporarily                             0.0
  P0         Inquire about Payment Options                        0.0
  P2         Where to Make a Payment                              0.0
  P3         Schedule Payment                                     0.0




A customer-centric design approach takes the most frequent reasons and uses them as guidance to identify and define the arrangement of menu topics to reflect this frequency of occurrence.


Table 5 illustrates the task categories or reasons that customers call center AA. The most frequent category is to “Get Information About the Bill” and accounts for 33.9% of the calls. The second most frequent task category occurs in only 17.5% of the calls (“Get Information On My Account”). For the task categories with a frequency above 1.0%, 6 of them are about “Information” and account for 61.7% of the customer calls. This suggests that “Information” should be the prominent topic in a customer-centric IVR menu for small business customer calls to call center AA.


Table 6 illustrates the task categories or reasons that customers call center CC. The top category is to “Change Carrier/Provider” and accounts for 29.5% of the customer calls. The second most frequent task category accounts for 20.5% of the customer calls (“Get Information On My Account”). For the task categories with a frequency above 1.0%, 5 are about “Information” and account for 44.5% of the customer calls, and 5 are about “Carriers” and account for 43.2% of the customer calls. This suggests that “Information” and topics relating to “Carriers” should be prominent topics in a customer-centric IVR menu for customer calls to the call center designated as CC.


Table 7 illustrates the task categories or reasons that customers call RR. The top category is to “Get Information on Services” and accounts for 20.4% of the customer calls. The second most frequent task category is “Add Optional Services” and accounts for 16.9% of the customer calls. For the task categories with a frequency above 1.0%, 5 are about “information” and account for 40.0% of the customer calls; 3 are about “optional services” and account for 24.9% of the customer calls. This suggests that “information” and topics relating to “optional services” should be prominent topics in a customer-centric IVR menu for customer calls to call center RR.


Table 8 illustrates the task categories or reasons that customers call MM. The top category is to “Get Information on Services” and accounts for 36.1% of the customer calls. The second most frequent task category is to get information about other company offices and accounts for 14.0% of the customer calls. The third most frequent category is to add optional services and accounts for 13.7% of the customer calls. For the task categories with a frequency greater than 1.0%, 7 are about “information” and account for 67.1% of the customer calls. This suggests that “information” should be the prominent topic in a customer-centric IVR menu for customer calls to center MM.


Table 9 illustrates the task categories or reasons that customers call NN. The top category is to “Request New Phone Service” and accounts for 20.1% of the customer calls. The second most frequent task category is to get information about services and accounts for 18.0% of the customer calls. The third most frequent category is to request a move of service and accounts for 10.1% of the customer calls. Among these task categories, 7 are about “information” and account for 40.8% of the customer calls; 4 are about “services” and account for 39.7% of the customer calls. This suggests that “information” and topics relating to “services” should be prominent topics in a customer-centric IVR menu for customer calls to call center NN.


An important part of the data collection process shown in FIG. 1 step S1 is to collect end-user volume data as shown in step S3 and to analyze the volume data as shown in step S4.


Often an IVR designer wants to provide customers with a single access telephone number that can be used for many or all of the centers. Such an arrangement reduces misdirected calls because the end user or customer will not have to guess which of the many company telephone numbers to call. So the question becomes: what if one combines the AA, NN, RR, CC and MM call centers under a single customer-access telephone number? In other words, what would the customer-centric call frequency show? To accurately depict a combined and adjusted task frequency, as shown in FIG. 1 step S5, across the various call centers, it is important to consider the call volume generated at each center. The call volume for each call center in the sample is used to weight the frequency of its different task categories. This is done because a call center that handles billing may get twice as much call volume per year as a call center that processes new orders. This volume weighting adjusts the task category percentages to account for different call volumes and provides a more accurate analysis of task frequency.


The first step in adjusting task frequency for call volume is to obtain the total call volume for each small business call center in the sample. Based upon data obtained during a predetermined time period, AA had approximately 610,059 calls, CC had approximately 51,374 calls, RR had approximately 573,871 calls, MM had approximately 40,260 calls, and NN had approximately 308,161 calls. This indicates a 39% weight for AA, a 19% weight for NN, a 36% weight for RR, a 3% weight for CC, and a 3% weight for MM. Table 10 illustrates the relative call volume of the different small business call centers.
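As a non-limiting illustration, the call-volume weighting can be computed directly from the call counts reported above:

    # Sketch of the call-volume weighting step, using the call counts for the sampled period.
    call_volume = {"AA": 610_059, "NN": 308_161, "RR": 573_871, "CC": 51_374, "MM": 40_260}

    total_calls = sum(call_volume.values())            # 1,583,725 calls in the sample
    weights = {center: count / total_calls for center, count in call_volume.items()}

    for center, weight in weights.items():
        print(f"{center}: {weight:.0%}")               # AA ~39%, NN ~19%, RR ~36%, CC ~3%, MM ~3%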









TABLE 10
Relative call volume of different small business call centers.

  Small Business    Calls Received During    Relative Call
  Call Center       Time Period              Volume
  AA                  610,059                 39%
  NN                  308,161                 19%
  RR                  573,871                 36%
  CC                   51,374                  3%
  MM                   40,260                  3%
  Total             1,583,725                100%



After call volume weights have been computed, task frequency for each call center type is adjusted by multiplying task frequency by the call volume weight. Table 11 illustrates the steps involved in computing adjusted task frequency based upon volume data, as shown in step S5 of FIG. 1.









TABLE 11
Example computation of adjusted task frequency based on relative call volume.

  Task              AA     AA Call    AA Adj   NN Adj   RR Adj   CC Adj   MM Adj   Overall Adj
  Categories        %      Vol Wght   Freq     Freq     Freq     Freq     Freq     Task Freq
  Get information   .34    .39        13.2%    1.4%     1.0%     0.2%     0.1%     15.9%
  about the bill
  Get information   .05    .39         1.8%    3.4%     7.3%     0.2%     1.1%     13.8%
  on services
  Get information   .18    .39         6.8%    1.2%     3.0%     0.6%     0.2%     11.8%
  on my account



The first column shows the task categories to be adjusted by call volume. As an example, consider the first-row task category “Get Information About the Bill”. The second column in Table 11 (AA %) represents the percentage of calls made to AA for each task category. In this case, 33.9% of the calls to AA were to “Get Information About the Bill” (see Table 5). The third column in Table 11 (AA Call Vol Wght) represents the relative call volume weighting for AA. As shown in Table 10, the relative call weighting for AA was 39%. Column 4 shows the adjusted task frequency, which was computed by multiplying the percentage of calls relating to “Get Information About the Bill” by the relative call volume weight. In this example, the adjusted frequency for calls to AA relating to “Get Information About the Bill” is 13.2% (i.e., 33.9%×39.0%≈13.2%). These steps were also performed for the other small business call centers to obtain the adjusted task frequency values in columns 5–8. The adjusted task frequencies for each call center type are then summed to obtain the overall adjusted task category frequency, which is represented in the last column (i.e., column 9) of Table 11. To complete the example, the overall adjusted task frequency for “Get Information About the Bill” is approximately 15.9% (i.e., 13.2%+1.4%+1.0%+0.2%+0.1%).
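As a non-limiting illustration, the adjusted-frequency computation for the “Get Information About the Bill” (B4) example can be sketched as follows, using the per-center frequencies from Tables 5 through 9 and the call-volume weights from Table 10:

    # Call-volume weights from Table 10 and B4 frequencies from Tables 5-9 (as fractions).
    weights = {"AA": 0.39, "NN": 0.19, "RR": 0.36, "CC": 0.03, "MM": 0.03}
    b4_frequency = {"AA": 0.339, "NN": 0.074, "RR": 0.028, "CC": 0.069, "MM": 0.032}

    # Adjusted frequency per center = task frequency at that center x its call-volume weight.
    adjusted = {center: b4_frequency[center] * weights[center] for center in weights}
    overall = sum(adjusted.values())

    print({center: f"{value:.1%}" for center, value in adjusted.items()})
    print(f"Overall adjusted frequency for B4: {overall:.1%}")   # approximately 15.9%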


The results appear in Table 12 and show the adjusted task category frequencies or reasons customers called all Small Business centers after adjusting or compensating for differences in call volume.









TABLE 12
Why customers call AA, CC, RR, NN and MM.

  Category   Description                                     Combined %
  B4         Get Information about the Bill                     15.92%
  I1         Get Information on Services                        13.85%
  I2         Get Information on my Account                      11.82%
  A2         Add Optional Services                               8.98%
  C1         Change Account Data                                 6.25%
  A1         Request new phone service                           4.67%
  D1         Disconnect (Close Account)                          4.19%
  M1         Request a Move of Service                           3.38%
  D2         Discontinue Optional Service                        3.30%
  A4         Get Info about a Pending Acquisition                3.18%
  L6         Change Carrier/Provider                             2.85%
  I11        Get Information on other Company Offices            2.54%
  B6         Dispute the Bill                                    2.53%
  C2         Change Optional Services                            1.92%
  F1         Fix a Service                                       1.36%
  P4         Get Info about a Pending Payment                    1.14%
  A8         Reconnect Service                                   1.14%
  A5         Give Info for a Pending Acquisition                 0.96%
  L4         Get Info from the Carrier/Provider                  0.89%
  L2         Restore a Carrier/Provider                          0.82%
  P5         Give Info for a Pending Payment                     0.70%
  L5         Give Info to the Carrier/Provider                   0.70%
  D4         Get Info about a Pending Deletion                   0.63%
  P1         Set Up Payment Arrangement                          0.62%
  L1         Add a Carrier/Provider                              0.52%
  C8         Change a Feature of a Service                       0.45%
  A3         Schedule Pending Acquisition                        0.45%
  P0         Inquire about Payment Options                       0.42%
  M4         Get Info about a Pending Move                       0.42%
  I0         Nature of inquiry unspecified                       0.41%
  L7         Cancel a Carrier/Provider                           0.39%
  M5         Give Info for the Pending Move                      0.37%
  I12        Get Information on a Name/Address/Number            0.32%
  F2         Fix a Product                                       0.28%
  A7         Cancel a Pending Acquisition                        0.26%
  P2         Where to Make a Payment                             0.18%
  A9         Acquire Service Temporarily                         0.17%
  D5         Give Info for a Pending Deletion                    0.13%
  F8         Report a Problem                                    0.12%
  D9         Discontinue Service Temporarily                     0.11%
  P3         Schedule Payment                                    0.10%
  A0         Unspecified Addition                                0.06%
  A6         Change a Pending Acquisition                        0.06%
  B5         Give Information for the Bill                       0.06%
  C0         Change Unspecified                                  0.06%
  D0         Discontinue Unspecified Service                     0.06%
  F3         Schedule a Pending Repair                           0.06%
  L11        PIC Letter                                          0.06%
  M9         Move Service Temporarily                            0.06%
  D11        Return a Product                                    0.04%
  F0         Fix an unspecified problem                          0.02%
  F9         Fix a Problem                                       0.01%
  I3         Get Information on other Service Providers          0.01%
  L8         Billing Issue for a Carrier/Provider                0.01%
  Total                                                        100.0%



As is evident, the top task category is “Get Information About the Bill”, and it accounts for 15.92% of all calls. The next most frequent task category is “Get Information on Services”, and it accounts for 13.85% of all calls. In fact, the top three reasons why customers call are to: 1) inquire about the monthly bill; 2) inquire about products, services and prices; or 3) inquire about their account. These three information inquiries are responsible for 41.59% of all calls. Further, among the task categories with a frequency greater than 1.0%, 6 are about “information” and account for 48.45% of all calls, and 8 task categories are about “services” (i.e., new, changing, moving or disconnecting service) and account for 35.54% of the customer calls. This suggests that tasks relating to “information” and “services” should be prominent topics when designing a customer-centric IVR menu for small business call centers.


As seen in FIG. 4, the top ten tasks or reasons customers call small business centers account for 72.2% of the calls in the sample, and the top 14 tasks account for 81.6% of the calls. That is, the overwhelming majority of calls to small business centers can be categorized by fewer than 15 tasks. This means that when designing the IVR menu options (FIG. 1, step S6), one should focus on high-frequency tasks because they are the most likely to be of interest to and requested by the small business customers. The most frequent tasks are located early in the IVR menu selections to quickly address common calls; however, less frequent categories may be grouped with more frequent categories for logical reasons. For example, in the customer-centric design approach to the small business IVR, the frequent task categories are combined into menu options that are ordered by the percentage of calls for which they account (see Table 12). As noted above, there are six categories involving “Get Information.” Three of them (rank orders 1, 2, and 3) concern either the customer's account, the bill, or the products and services, and these three account for 41.59% of the calls. Therefore, these tasks are used to word the Top-Level Menu option “Get information about your account or our services”. All categories relating to the “Get Information” option appear in an associated Second-Level Menu, as shown in Table 13.
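As a non-limiting illustration, the rank-ordering of adjusted task frequencies and the cumulative share covered by the top-ranked tasks can be sketched as follows; only a subset of the Table 12 categories is shown, so the printed cumulative percentages are illustrative rather than the full FIG. 4 values.

    # Adjusted task frequencies (percent) for a subset of the Table 12 categories.
    adjusted_freq = {
        "B4": 15.92, "I1": 13.85, "I2": 11.82, "A2": 8.98, "C1": 6.25,
        "A1": 4.67, "D1": 4.19, "M1": 3.38, "D2": 3.30, "A4": 3.18,
    }

    # Rank-order the categories by adjusted frequency and track the cumulative share.
    ranked = sorted(adjusted_freq.items(), key=lambda item: item[1], reverse=True)

    cumulative = 0.0
    for rank, (code, pct) in enumerate(ranked, start=1):
        cumulative += pct
        print(f"{rank:>2}. {code:<4} {pct:5.2f}%   cumulative {cumulative:6.2f}%")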


According to the data collected, organized and analyzed as discussed above, and as shown in steps S1–S5 of FIG. 1, the menu options of the interface are defined in step S6. A sample of such an IVR menu is shown in Table 13.









TABLE 13
Example Top-Level, Second-Level, and Third-Level IVR Menu Options.

  Top-Level Menu Choice:
    “To get information about your account or our services and prices, press 1.”

  Second-Level Menu Choices (under Top-Level choice 1):
    “For information about an item on your bill, press 1.”
      Third-Level:  “For bill items other than DSL or ISDN, press 1.”
                    “For DSL bill items, press 2.”
                    “For ISDN bill items, press 3.”
                    “For DSL or ISDN bill items, press 4.”
                    “For all other billing questions, press 5.”
    “For information about our services, products, and prices, press 2.”
      Third-Level:  “If you have an existing account, press 1.”
                    “If you are a new account, press 2.”
    “For the information we have about your account, press 3.”
    “For information about a repair or installation order, press 4.”
      Third-Level:  “For repair, press 1.”
                    “For installation, press 2.”
    “For information about an order for additional lines or services, press 5.”
    “For information about high-speed data services, press 6.”
      Third-Level:  “For DSL, press 1.”
                    “For ISDN, press 2.”
                    “For all other questions about high-speed data services, press 3.”



Within any business, there is often an organizational need to route similar requests to different agents for task handling and execution. For example, staffing levels or specialization of training at different call centers may dictate distinguishing among similar generic tasks on some basis that may be transparent, irrelevant or unknown to the customer. In Small Business, for example, billing questions involving high-speed data products are handled separately by agents who are specialized by technology and according to other areas of expertise. Thus, the Second-Level option “Information About an Item On Your Bill” leads to a Third-Level Menu that differentiates ISDN and DSL and “other.”


It should be noted that not all IVR menu categories are included in all IVR designs. For example, task categories such as “Get Information About a Pending Acquisition” and “Get Information About Other Company Offices” were not included as options in the current small business IVR, although these items account for 5.2% of the call volume. Both of these items were, however, included in the customer-centric IVR design.


As FIG. 1 step S7 shows, after the initial user-interface is designed, a prediction model based on expected user task volume and task frequency is used to provide an early indication of how the interface will perform. A predictive comparison of the newly designed system to the existing system is performed to estimate performance gains (see steps S8, S9 and S10 in FIG. 1). Prototypes of the system are then tested on a sample of users performing real-world scenarios (i.e., tasks) extracted from the initially collected customer-centric data.


In an effort to estimate the routing performance of the current IVR design as compared to the customer-centric design, a designer needs to directly compare the predicted routing performance of both IVR designs. Estimates of correct call routing were predicted using two independent judges (i.e., testers). Each judge reviewed 24 task categories that each contained three real-world scenarios, for a total of 72 scenarios. The scenarios were extracted from the initial 2391 customer-task statements that were collected from the customer call centers. The judges read each scenario and attempted to resolve it (i.e., correctly route it) using both the existing and the newly designed customer-centric small business IVR. For example:

    • Task Category=Get information about the bill:
    • Scenario #1=I have a question about the charges on my bill.
    • Scenario #2=I need to know to whom some of these numbers on my bill belong.
    • Scenario #3=I need to know the balance of my account.


The judges' results were converted to a “routing percentage score” (the percentage of scenarios correctly routed within a task category), which was then multiplied by the task frequency adjusted for call volume across call centers. For example, if the two judges examining scenarios correctly routed 5 of 6 scenarios within a task category, then the “routing percentage score” equaled 83.3%. Further, if the task category frequency adjusted for call volume across call centers was 15.92%, then the “Predicted Routing Percentage Score” would be 13.26% (i.e., 83.3%×15.92%=13.26%). The “Predicted Routing Percentage Scores” are summed across the 24 task categories to arrive at the overall predicted routing percentage for each IVR. These scores are used to estimate the performance difference between the current small business IVR and the redesigned customer-centric small business IVR.
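As a non-limiting illustration, the predicted-routing computation can be sketched as follows. The scenario counts passed to the function are illustrative; in the analysis described above, each of the 24 task categories had six judge-scenario trials (two judges times three scenarios).

    def predicted_routing(adjusted_freq, correct_routings, trials_per_category=6):
        # adjusted_freq: dict of category code -> adjusted task frequency (as a fraction)
        # correct_routings: dict of category code -> number of correctly routed scenarios
        overall = 0.0
        for code, freq in adjusted_freq.items():
            routing_score = correct_routings[code] / trials_per_category   # e.g., 5/6 = 83.3%
            overall += routing_score * freq                                # category's predicted score
        return overall

    # Worked example from the text: 83.3% routing x 15.92% frequency, about 13.3%.
    print(f"{predicted_routing({'B4': 0.1592}, {'B4': 5}):.2%}")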


Table 14 describes the menu topics based on call frequency adjusted for call volume.









TABLE 14
Small business customer-centric menu topics based on frequency.

  General Topics         Specific Topics                          Task    Call     Routing %   Predicted
                                                                  Code    Freq     Score       Score
  Get Information        Get information about the bill           B4      15.92%   100%        15.92%
                         Get information on services              I1      13.85%   100%        13.85%
                         Get information on my account            I2      11.82%   100%        11.82%
                         Get information about a pending          A4       3.18%   100%         3.18%
                         acquisition
                         Get information about other company      I11      2.54%   33.3%        0.84%
                         offices
                         Get information about a pending          P4       1.14%   100%         1.14%
                         payment
                         Get information about a carrier          L4       0.89%   33.3%        0.28%
                                                                           Subtotal =          47.03%
  Change your account    Add optional services                    A2       8.98%   83.3%        7.48%
  or change your         Change account information               C1       6.25%   33.3%        2.08%
  services               Delete optional services                 D2       3.30%   100%         3.30%
                         Request a move of service                M1       3.38%   100%         3.38%
                         Change a carrier                         L6       2.85%   100%         2.85%
                         Change optional services                 C2       1.92%   100%         1.92%
                         Restore a carrier                        L2       0.83%   83.3%        0.68%
                         Get information about a pending move     M5       0.37%   33.3%        0.12%
                         Cancel or delete a carrier               L7       0.39%   100%         0.39%
                         Add a carrier                            L1       0.52%   100%         0.52%
                                                                           Subtotal =          22.72%
  Open an account or     Discontinue service                      D1       4.19%   100%         4.19%
  disconnect services    Open a new account                       A1       4.67%   100%         4.67%
                                                                           Subtotal =           8.86%
  Repair                 Fix a service                            F1       1.36%   100%         1.36%
                         Fix a product                            F2       0.28%   100%         0.28%
                                                                           Subtotal =           1.64%
  Discuss your bill      Dispute the bill                         B6       2.53%   100%         2.53%
  or payments            Make payment arrangement                 P1       0.62%   100%         0.62%
                         Reconnect service                        A8       1.14%   66.6%        0.76%
                                                                           Subtotal =           3.91%
                                                                           Grand Total =       84.16%




The “general” and “specific” columns represent the most frequent reasons why customers called small business centers. The “task code” column shows the identifier used when the calls were classified in designing the IVR. The “call frequency” column shows the percentage of calls that were for a particular task, that is, the percentage of callers who called the center to accomplish that task. The “routing percent score” shows the percentage of tasks correctly routed by the two judges. The “predicted score” shows the theoretical prediction of routing performance based on the adjusted call frequency and the routing percentage score.


The results of the analysis support the theoretical prediction that correct call routing performance improves when an existing small business IVR is redesigned using a customer-centric approach. Specifically, the results predict that the customer-centric design shown in Table 14 will correctly route approximately 84% of customers to the appropriate service center (see the Grand Total of 84.16%). For comparison, the predicted routing model was applied to the existing small business IVR. That analysis, shown in Table 15, suggests that approximately 73% of customers will be routed to the appropriate service center with the existing design (see the Grand Total of 73.00%). Therefore, a customer-centric approach has the potential to produce a significant increase (i.e., 84.16% − 73.00% = 11.16 percentage points) in the number of callers who are correctly routed. This comparison illustrates a primary recommended use of this approach: comparing related IVR designs on their relative customer-centric operational performance.









TABLE 15

Existing Small Business IVR: Business-centric prompts, with frequency data.

General Topics            Specific Topics                      Task   Call     Routing   Predicted
                                                               Code   Freq     % Score   Score
Orders and product        Add or make changes to service       A2      8.98%   100%       8.98%
information                                                    C2      1.92%   100%       1.92%
                                                               A8      1.14%    50%       0.57%
                                                               M5      0.37%   66.6%      0.25%
                          Establish new service or move        A1      4.67%   100%       4.67%
                          existing service                     M1      3.38%   100%       3.38%
                          Telephone systems or equipment       I1     13.85%   100%      13.85%
                                                                      Subtotal =         33.62%
Billing Inquiries or      Change long distance carrier         C1      6.25%   16.7%      1.04%
change long distance                                           L6      2.85%   33.3%      0.95%
carrier                                                        L2      0.82%   100%       0.82%
                                                               L1      0.52%   100%       0.52%
                                                               L7      0.39%   100%       0.39%
                          Payment arrangements                 P4      1.14%   100%       1.14%
                                                               P1      0.62%   100%       0.62%
                          Completely disconnect service        B4     15.92%   100%      15.92%
                          or all other billing questions       I2     11.82%   100%      11.82%
                                                               D2      3.30%   33.3%      1.10%
                                                               B6      2.53%   100%       2.53%
                                                               L4      0.89%   100%       0.89%
                                                                      Subtotal =         37.74%
Repair                    Fix a product or service             F1      1.36%   100%       1.36%
                                                               F2      0.28%   100%       0.28%
                                                                      Subtotal =          1.64%
                                                                      Grand Total =      73.00%

The C-CAID approach to interface design has many tangible and useful results, including increased user satisfaction and optimized system performance. One of the biggest problems associated with menu option systems is the occurrence of misdirected calls, that is, incoming calls that are routed to the wrong servicing organization. Misdirected calls are a significant problem in both business and consumer markets. An example of the impact that these misdirected calls may have on a business is discussed below.


It is estimated that approximately 31% of the total calls to a company's small business call centers (i.e., AA, CC, RR, MM and NN) are misdirects. In 1998, there were approximately 6.2 million calls to those small business call centers, resulting in approximately 1.9 million misdirected calls. A recent study indicates that each misdirected call costs approximately $0.91. Thus, the annual cost of misdirected calls is approximately $1.75 million (1.9 million misdirected calls times $0.91 each). The projected cost savings from implementing the customer-centric small business IVR design is estimated at $400,000, based on the 25% predicted reduction in misdirects for the customer-centric small business IVR.
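

As a rough sanity check of the arithmetic in the preceding paragraph, the cost model can be sketched as follows. The variable names are illustrative, and all inputs are the approximate values quoted above; this is a sketch, not part of the original disclosure.

```python
# Rough reconstruction of the misdirected-call cost estimate above.
# All inputs are approximate values quoted in the text; variable names are illustrative.

total_calls        = 6_200_000  # annual calls to the small business call centers (1998)
misdirect_rate     = 0.31       # ~31% of calls are misdirected
cost_per_misdirect = 0.91       # dollars per misdirected call

misdirected_calls = total_calls * misdirect_rate            # ~1.9 million calls
annual_cost       = misdirected_calls * cost_per_misdirect  # ~$1.75 million

# Projected savings scale with the predicted reduction in misdirects; the text
# quotes a rounded estimate of roughly $400,000 at a 25% predicted reduction.
predicted_reduction = 0.25
projected_savings   = annual_cost * predicted_reduction

print(f"misdirects: {misdirected_calls:,.0f}")
print(f"annual cost: ${annual_cost:,.0f}")
print(f"projected savings: ~${projected_savings:,.0f}")
```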


After design of the C-CAID based interface system, the system was tested. Participants in the test were instructed to make two phone calls: one call to the existing small business IVR and a second call to the customer-centric IVR. Participants were presented with real-world tasks generated from example scenarios obtained from the gathered data. Example tasks might require participants to obtain information about a bill or to order an optional phone service. Participants were told to select category options from either the current or the redesigned IVR to resolve the particular scenario with which they were presented. After completing both phone calls, participants were administered a short questionnaire to assess their impression of each system. An example questionnaire can be found in FIG. 5. Internal administrations of this survey found that customers greatly preferred an IVR designed using the C-CAID methodology over IVRs that were not.


Two objective measures are used to evaluate the performance of the customer-centric IVR: cumulative response time (CRT) and a routing accuracy score, both described in detail below. This evaluation step is shown in FIG. 1, step S11.


Two objective measures are collected during usability testing: cumulative response time (CRT) and task accuracy. CRT is a measure of the time per task that end-users spend interacting with each system. Task accuracy measures how effectively the goals of the end-user were mapped to each system via the user interface. These two metrics (CRT and task accuracy) are then graphed into a CRT/accuracy matrix (S11 of FIG. 1) that depicts end-user performance for these systems in a convenient and easy to understand manner. The CRT/accuracy matrix is a very powerful tool for the designer to evaluate the strengths and weaknesses of a specific interface design. The designer can observe and evaluate the specific tasks that end-users perform poorly while using the interface, and from this evaluation can revise the interface to improve performance. If a particular task does not effectively map to the associated interface options, then one can expect poor navigation accuracy, higher CRTs, or both. The CRT/accuracy matrix is instrumental in identifying the deficiencies of an existing interface system and then serves as a guide and feedback monitor to indicate how introduced changes positively or negatively impact the system as a result of the redesign activity (S12 of FIG. 1). The goal of the retesting efforts is to determine whether performance on the redesigned interface options improved and whether the redesign negatively impacted unchanged aspects of the interface design.


To calculate the cumulative response time, the time in seconds that a subject spends interacting with the IVR is measured. Participants (i.e., subjects) are not credited or penalized for listening to the entire IVR menu before making a selection. For example, if the main menu of the IVR is 30 seconds in length, and the participant listens to the whole menu and then makes a selection, they receive a CRT score of 0. If the participant listens to only part of a menu, hears their choice, and barges in, choosing an option before the whole menu plays, then they receive a negative CRT score. For instance, if they choose option 3 fifteen seconds into a six-option, 30-second menu, they receive a CRT score of −15. Conversely, if they repeat the menu after hearing it once, and then choose option three on the second playing of that menu, they receive a CRT score of +15 for that menu.
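

In other words, the per-menu score is the elapsed time from the start of the menu until the selection, minus the full menu duration, and these scores are summed over every menu level visited. The sketch below is illustrative only: the function names are assumptions, and the replay example assumes the selection was made about fifteen seconds into the second playing, consistent with the option-3 example above.

```python
# Illustrative sketch of the CRT scoring rule described above (names are assumptions).
# Per menu level: CRT = elapsed time from menu start until selection - full menu duration.
# Listening to the whole menu once scores 0, barging in scores negative, repeating scores positive.

def menu_crt(elapsed_until_selection, menu_duration):
    """CRT for a single menu level, in seconds."""
    return elapsed_until_selection - menu_duration

def total_crt(levels):
    """Sum CRT over every (elapsed, duration) menu level visited, regardless of routing accuracy."""
    return sum(menu_crt(elapsed, duration) for elapsed, duration in levels)

# The 30-second menu examples from the text:
print(menu_crt(30, 30))  #   0: listened to the whole menu, then selected
print(menu_crt(15, 30))  # -15: barged in fifteen seconds into the menu
print(menu_crt(45, 30))  # +15: repeated the menu, then selected on the second playing
```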


Referring to the examples illustrated in FIG. 3, the first subject took two seconds after the end of each announcement to make the selection, resulting in a score of 4. The second subject repeated the announcement and made both selections after two seconds, resulting in a score of 12. The third subject made the selection before completion of each announcement and thus received a score of −4. Once the subject reaches their final destination in the IVR (regardless of accuracy), the CRT scores for each menu level are summed, and that becomes the total CRT score for that subject on that particular task. This metric accurately measures the total time a user is interfacing with the system, regardless of menu length.


Using this method, it was found that longer IVR menus do not necessarily mean that the participant spends more time in the system. In fact, the customer-centric consumer IVR has menus that measured approximately 90 seconds in length, while the current consumer version's menus were about 30 seconds in length. However, although the customer-centric design is almost three times longer than the current consumer IVR, the averaged CRT scores for the customer-centric and current IVRs were not significantly different. The likely explanation is that subjects received better, more detailed descriptions in the customer-centric design and therefore did not need to repeat the menus as often.


Route scoring provides information as to whether or not the user is routed to the correct call center. A score of −1, 0, or 1 is awarded for each task. A score of −1 indicates that a subject actively chose an IVR path that did not route them to the correct destination for the task that they were trying to complete. A score of 0 indicates that the subject reached a point in the IVR and made no selection, either by hanging up or by timing out. Finally, a score of 1 indicates that the subject successfully navigated the IVR for a given task and was routed to the correct destination.


Both CRT and routing accuracy (i.e., route scoring) are used to create a CRT/Routing matrix that is used to estimate performance on specific IVR menu options. CRT is plotted on the Y axis and route scoring on the X axis. An example CRT/Routing matrix is depicted in FIG. 2. From this matrix one can determine which tasks are handled well by the IVR, and which tasks are not. Data points located in the upper left quadrant of the matrix are optimal performers because there was both high routing accuracy and a short CRT. In this case, the short CRT time shows that participants are confident that their selected task option matches their goal or reason for calling. Therefore, participants are “barging in” to the menuing system, spending less time with the IVR, and their call is correctly routed. In contrast, data points that fall in the lower right quadrant represent tasks that did not perform well because there was both poor routing accuracy and a long CRT. In this case, participants had to repeat the menus and still did not choose the correct menu options in the IVR.
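

One way to operationalize the matrix is to average each task's route scores and CRTs across participants and classify the result by quadrant. The sketch below is only an illustration; the cutoff values and the function name are assumptions, not thresholds given in the text.

```python
# Illustrative classification of tasks into CRT/routing-matrix quadrants.
# The cutoffs are assumptions; the text does not specify numeric thresholds.

def classify_task(avg_route_score, avg_crt, route_cutoff=0.5, crt_cutoff=0.0):
    """Place one task's averaged results into a quadrant of the CRT/routing matrix.

    avg_route_score: mean of the -1/0/+1 route scores for the task (X axis)
    avg_crt:         mean cumulative response time in seconds (Y axis)
    """
    accurate = avg_route_score >= route_cutoff  # most participants reached the right destination
    fast     = avg_crt <= crt_cutoff            # negative CRT: participants barged in early
    if accurate and fast:
        return "optimal: accurate routing and short CRT"
    if accurate:
        return "accurate but slow: consider shortening or rewording the menu"
    if fast:
        return "fast but misrouted: the option label is likely misleading"
    return "poor: misrouted with long CRT - candidate for redesign"

# Example: a task most participants routed correctly while barging in early
print(classify_task(avg_route_score=0.8, avg_crt=-6.0))
```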


As shown in FIG. 1, step S12, the CRT/routing matrix provides a powerful tool for evaluating the strengths and weaknesses of the IVR design. For example, a designer can look at the specific tasks that participants perform poorly, and redesign or reword aspects of these IVR menu choices in order to improve performance. That is, if a particular task does not effectively map to the associated menu option, then one can expect poor routing accuracy or high CRTs. The CRT/routing matrix can be used to identify deficiencies and tweak the IVR design for optimal performance.


Upon completion of the redesigned IVR, the modified design is retested using a user sample similar to the initial testing sample, as shown in step S13 of FIG. 1. The purpose of the retest is to evaluate any changes made to the IVR based on the results of the CRT/routing matrix. The goals of the retesting effort are to determine whether performance of the redesigned menu items improves and whether the changes had a negative impact on unchanged menus in the IVR. Evaluation of IVR performance using the CRT/routing matrix and retesting are iterative processes and should continue until the design maximizes performance or improvement goals are met.


After the initial user-interface is designed, a prediction model based on expected user task volume and task frequency is used to provide an early indication of how the interface will perform. This establishes a baseline against which the performance of the new interface can be gauged.


The newly designed interface is then compared to the previous interface, and performance gains can be estimated relative to it.


The resulting system prototypes are then tested on a sample of users. The users perform real-world scenarios extracted from the initially collected customer-centric data to test the system.


As also shown in FIG. 1, upon completion of the redesign test (S13) an updated interface can be designed using the same principles by looping back to step S6 or to another appropriate point. Thus, the interface can be periodically improved.


While the invention has been described with reference to a preferred embodiment, it is understood that the words which have been used herein are words of description and illustration, rather than words of limitation. Changes may be made without departing from the scope and spirit of the invention in its aspects. Although the invention has been described with reference to a particular method, the invention is not intended to be limited to the particulars disclosed herein; rather, the invention extends to all functionally equivalent structures, methods and uses.


For example, the interface can be an interface between a user and a computer application or between a user and an e-mail system. The interface can also be between two communicating machines or devices; the communications layers can be optimized so that the machines are set up to handle the statistically more frequent tasks ahead of communications that occur less often. In addition, the updating of the interface design may be done at various time increments. Additionally, the information gathered is not limited to customer service requests for a telephonic system; it can be data gathered from an online e-commerce system that identifies the items or preferences that the end user accesses most frequently.


Accordingly, the invention is now defined with respect to the attached claims.

Claims
  • 1. A method for evaluating performance of a user interface design, comprising: developing a prediction model for the user interface design based upon expected user task volume and task frequency by determining an estimated percentage of correctly routed calls and multiplying the estimated percentage by a task frequency adjusted for call volume across call centers to obtain a predicted routing percentage score, the task frequency adjusted for call volume comprising a sum of a relative call volume for each call center multiplied by a task frequency at each call center; andevaluating potential performance of the user interface design by applying the prediction model.
  • 2. The method of claim 1, further comprising comparing a prediction model of a newly designed interface with a prediction model of a current interface to estimate performance gains resulting from the newly designed interface.
  • 3. The method of claim 1, in which the developing the prediction model further comprises summing predicted routing percentage scores across all task categories to obtain an overall predicted routing percentage for the interface design.
  • 4. The method of claim 1, in which the determining the estimated percentage further comprises attempting to route a call based upon historical data, the attempt including routing while using the user interface design.
  • 5. A computer readable medium storing a computer program for evaluating performance of a user interface design, comprising: a prediction model code segment that develops a prediction model for the user interface design based upon expected user task volume and task frequency by determining an estimated percentage of correctly routed calls and multiplying the estimated percentage by a task frequency adjusted for call volume across call centers to obtain a predicted routing percentage score, and summing predicted routing percentage scores across all task categories to obtain an overall predicted routing percentage for the user interface design, andan evaluation code segment that evaluates potential performance of the user interface design by applying the prediction model.
  • 6. The medium of claim 5, further comprising a performance analysis code segment that compares a prediction model of a newly designed interface with a prediction model of a current interface to estimate performance gains resulting from the newly designed interface.
  • 7. The medium of claim 5, in which the prediction model code segment estimates the percentage by attempting to route a call based upon historical data, the attempt including routing while using the user interface design.
  • 8. The medium of claim 5, in which the task frequency adjusted for call volume comprises a sum of a relative call volume for each call center multiplied by a task frequency at each call center.
  • 9. A method for modeling performance of a user interface design, comprising: developing a prediction model for the user interface design based upon expected user task volume and task frequency by determining an estimated percentage of correctly routed calls and multiplying the percentage by the task frequency adjusted for call volume across call centers to obtain a predicted routing percentage score, and summing predicted routing percentage scores across all task categories to obtain an overall predicted routing percentage for the interface design.
  • 10. The method of claim 9, further comprising comparing a prediction model of a newly designed interface with a prediction model of a current interface to estimate performance gains resulting from the newly designed interface.
  • 11. The method of claim 9, in which the determining the estimated percentage further comprises attempting to route a call based upon historical data, the attempt including routing while using the user interface design.
  • 12. The method of claim 9, in which the task frequency adjusted for call volume comprises a sum of a relative call volume for each call center multiplied by a task frequency at each call center.
  • 13. A computer readable medium storing a computer program for modeling performance of a user interface design, comprising: a prediction model code segment that develops a prediction model for the user interface design based upon expected user task volume and task frequency by determining an estimated percentage of correctly routed calls and multiplying the percentage by a task frequency adjusted for call volume across call centers to obtain a predicted routing percentage score,wherein the task frequency adjusted for call volume comprises a sum of a relative call volume for each call center multiplied by a task frequency at each call center.
  • 14. The medium of claim 13, further comprising a performance analysis code segment that compares a prediction model of a newly designed interface with a prediction model of a current interface to estimate performance gains resulting from the newly designed interface.
  • 15. The medium of claim 13, in which the prediction model code segment further comprises summing predicted routing percentage scores across all task categories to obtain an overall predicted routing percentage for the user interface design.
  • 16. The medium of claim 13, in which the prediction model code segment estimates the percentage by attempting to route a call based upon historical data, the attempt including routing while using the user interface design.
RELATED PATENT APPLICATION

This patent application is a continuation of U.S. patent application Ser. No. 09/532,038, entitled INTERFACE AND METHOD OF DESIGNING AN INTERFACE with inventors Robert R. Bushey et al. and filed Mar. 21, 2000, now U.S. Pat. No. 6,778,643.

US Referenced Citations (175)
Number Name Date Kind
4310727 Lawser Jan 1982 A
4694483 Cheung Sep 1987 A
4761542 Kubo et al. Aug 1988 A
4922519 Daudelin May 1990 A
4930077 Fan May 1990 A
4964077 Eisen et al. Oct 1990 A
5115501 Kerr May 1992 A
5181259 Rorvig Jan 1993 A
5204968 Parthasarathi Apr 1993 A
5206903 Kohler et al. Apr 1993 A
5263167 Conner, Jr. et al. Nov 1993 A
5299260 Shaio Mar 1994 A
5311422 Loftin et al. May 1994 A
5323452 Dickman et al. Jun 1994 A
5327529 Fults et al. Jul 1994 A
5335268 Kelly, Jr. et al. Aug 1994 A
5335269 Steinlicht Aug 1994 A
5371673 Fan Dec 1994 A
5388198 Layman et al. Feb 1995 A
5420975 Blades et al. May 1995 A
5479488 Lennig et al. Dec 1995 A
5495567 Iizawa et al. Feb 1996 A
5500795 Powers et al. Mar 1996 A
5519772 Akman et al. May 1996 A
5530744 Charalambous Jun 1996 A
5533107 Irwin et al. Jul 1996 A
5535321 Massaro et al. Jul 1996 A
5537470 Lee Jul 1996 A
5553119 McAllister et al. Sep 1996 A
5561711 Muller Oct 1996 A
5566291 Boulton et al. Oct 1996 A
5586060 Kuno et al. Dec 1996 A
5586171 McAllister et al. Dec 1996 A
5586219 Yufik Dec 1996 A
5594791 Szlam et al. Jan 1997 A
5600781 Root et al. Feb 1997 A
5615323 Engel et al. Mar 1997 A
5633909 Fitch May 1997 A
5657383 Gerber et al. Aug 1997 A
5659724 Borgida et al. Aug 1997 A
5666400 McAllister et al. Sep 1997 A
5668856 Nishimatsu et al. Sep 1997 A
5671351 Wild et al. Sep 1997 A
5675707 Gorin et al. Oct 1997 A
5684870 Maloney et al. Nov 1997 A
5684872 Flockhart et al. Nov 1997 A
5706334 Balk et al. Jan 1998 A
5710884 Dedrick et al. Jan 1998 A
5727950 Cook et al. Mar 1998 A
5729600 Blaha et al. Mar 1998 A
5734709 DeWitt et al. Mar 1998 A
5740549 Reilly et al. Apr 1998 A
5757644 Jorgensen et al. May 1998 A
5758257 Herz et al. May 1998 A
5771276 Wolf Jun 1998 A
5790117 Halviatti et al. Aug 1998 A
5793368 Beer Aug 1998 A
5802526 Fawcett et al. Sep 1998 A
5806060 Borgida et al. Sep 1998 A
5808908 Ghahramani Sep 1998 A
5809282 Cooper et al. Sep 1998 A
5812975 Komori et al. Sep 1998 A
5819221 Kondo et al. Oct 1998 A
5821936 Shaffer et al. Oct 1998 A
5822397 Newman Oct 1998 A
5822744 Kesel Oct 1998 A
5825856 Porter et al. Oct 1998 A
5825869 Brooks et al. Oct 1998 A
5832428 Chow et al. Nov 1998 A
5832430 Lleida et al. Nov 1998 A
5835565 Smith et al. Nov 1998 A
5848396 Gerace Dec 1998 A
5864605 Keshav Jan 1999 A
5864844 James et al. Jan 1999 A
5870308 Dangelo et al. Feb 1999 A
5872865 Normile et al. Feb 1999 A
5873068 Beaumont et al. Feb 1999 A
5884029 Brush, II et al. Mar 1999 A
5899992 Iyer et al. May 1999 A
5903641 Tonisson May 1999 A
5905774 Tatchell et al. May 1999 A
5920477 Hoffberg et al. Jul 1999 A
5923745 Hurd Jul 1999 A
5943416 Gisby Aug 1999 A
5953406 LaRue et al. Sep 1999 A
5963965 Vogel Oct 1999 A
5974253 Nahaboo et al. Oct 1999 A
5991735 Gerace Nov 1999 A
5999611 Tatchell et al. Dec 1999 A
5999908 Abelow Dec 1999 A
6014638 Burge et al. Jan 2000 A
6016336 Hanson Jan 2000 A
6026381 Barton, III et al. Feb 2000 A
6032129 Greef et al. Feb 2000 A
6035283 Rofrano Mar 2000 A
6035336 Lu et al. Mar 2000 A
6038560 Wical Mar 2000 A
6044355 Crockett et al. Mar 2000 A
6052693 Smith et al. Apr 2000 A
6055542 Nielsen et al. Apr 2000 A
6058163 Pattison et al. May 2000 A
6058179 Shaffer et al. May 2000 A
6058435 Sassin et al. May 2000 A
6061433 Polcyn et al. May 2000 A
6067538 Zorba et al. May 2000 A
6088429 Garcia Jul 2000 A
6099320 Papadopoulos Aug 2000 A
6104790 Narayanaswami Aug 2000 A
6128380 Shaffer et al. Oct 2000 A
6134315 Galvin Oct 2000 A
6134530 Bunting et al. Oct 2000 A
6148063 Brennan et al. Nov 2000 A
6157808 Hollingsworth Dec 2000 A
6160877 Tatchell et al. Dec 2000 A
6161130 Horvitz et al. Dec 2000 A
6163607 Bogart et al. Dec 2000 A
6166732 Mitchell et al. Dec 2000 A
6170011 Macleod Beck et al. Jan 2001 B1
6173053 Bogart et al. Jan 2001 B1
6173279 Levin et al. Jan 2001 B1
6201948 Cook et al. Mar 2001 B1
6212502 Ball et al. Apr 2001 B1
6219665 Shiomi Apr 2001 B1
6230197 Beck et al. May 2001 B1
6236955 Summers May 2001 B1
6236990 Geller et al. May 2001 B1
6243375 Speicher Jun 2001 B1
6249579 Bushnell Jun 2001 B1
6263052 Cruze Jul 2001 B1
6269153 Carpenter et al. Jul 2001 B1
6278976 Kochian Aug 2001 B1
6282404 Linton Aug 2001 B1
6289084 Bushnell Sep 2001 B1
6292909 Hare Sep 2001 B1
6295551 Roberts et al. Sep 2001 B1
6296376 Kondo et al. Oct 2001 B1
6308172 Agrawal et al. Oct 2001 B1
6330326 Whitt Dec 2001 B1
6332154 Beck et al. Dec 2001 B1
6336109 Howard Jan 2002 B1
6349290 Horowitz et al. Feb 2002 B1
6353661 Bailey, III Mar 2002 B1
6353825 Ponte Mar 2002 B1
6357017 Bereiter et al. Mar 2002 B1
6366879 Coxhead et al. Apr 2002 B1
6374260 Hoffert et al. Apr 2002 B1
6389400 Bushey et al. May 2002 B1
6389403 Dorak, Jr. et al. May 2002 B1
6389538 Gruse et al. May 2002 B1
6400807 Hewitt et al. Jun 2002 B1
6405149 Tsai et al. Jun 2002 B1
6405159 Bushey et al. Jun 2002 B1
6405170 Phillips et al. Jun 2002 B1
6411687 Bohacek et al. Jun 2002 B1
6434714 Lewis et al. Aug 2002 B1
6448980 Kumar et al. Sep 2002 B1
6483523 Feng Nov 2002 B1
6487277 Beyda et al. Nov 2002 B1
6516051 Sanders Feb 2003 B1
6564197 Sahami et al. May 2003 B1
6598022 Yuschik Jul 2003 B1
6618715 Johnson et al. Sep 2003 B1
6694482 Arellano et al. Feb 2004 B1
20010014863 Williams III. Aug 2001 A1
20010041562 Elsey et al. Nov 2001 A1
20020055868 Dusevic et al. May 2002 A1
20020133394 Bushey et al. Sep 2002 A1
20020196277 Bushey et al. Dec 2002 A1
20030026409 Bushey et al. Feb 2003 A1
20030143981 Kortum et al. Jul 2003 A1
20030156706 Koehler et al. Aug 2003 A1
20030158655 Obradovich et al. Aug 2003 A1
20040006473 Mills et al. Jan 2004 A1
20040032935 Mills et al. Feb 2004 A1
20040042592 Knott et al. Mar 2004 A1
Foreign Referenced Citations (2)
Number Date Country
0033548 Jun 2000 WO
0073968 Dec 2000 WO
Related Publications (1)
Number Date Country
20030156133 A1 Aug 2003 US
Continuations (1)
Number Date Country
Parent 09532038 Mar 2000 US
Child 10230728 US