1. Field of the Invention
This invention relates generally to software design tools and, more particularly, to displaying key performance indicators for an executed application in a user interface of an application design tool.
2. Description of the Background Art
As shown in
The present invention provides a system and method that closes the loop in the full application lifecycle by providing key performance indicators for an executed application back to the design phase. Specifically, the present invention enables an application designer/developer to design an application using a design tool, deploy the application for execution, and subsequently view statistics related to the execution within the design tool.
The application lifecycle begins with the designer creating an application flow for an application using an application design tool. The application flow represents the structure and flow of the application. The application flow includes a sequence of nodes that represents functions in the application.
After the design is complete, the design tool generates software code for the designed application. In generating the software code, the design tool inserts markers in the application code that demark the boundaries between nodes in the application flow of the designed application. When the application is executed, a log is created in accordance with the markers such that data is logged for the nodes traversed during the execution of the application.
Using the logged data, execution-related statistics are calculated for one or more of the nodes. These statistics are provided to the design tool. The designer is then able to see statistics related to the performance of individual nodes within the design tool user interface (UI). The statistics are preferably displayed in conjunction with the application flow in the UI.
In one embodiment, the designer can elect to see performance statistics only for nodes that satisfy certain performance thresholds (e.g., “show nodes that resulted in a problem during at least 30% of the executions of such node”). In response to a designer entering threshold requirements, the design tool identifies nodes that satisfy such thresholds (based on the statistics) and displays statistical information in the UI for only such nodes. This enables designers to easily find nodes that could be improved through the design or other application specification modifications.
The present invention can be used to design and optimize a variety of different applications, including, but not limited to, IVR applications, web applications, mobile phone applications, and multi-modal communication applications.
FIGS. 6a-6c are screen shots that illustrate an application flow in a design tool UI.
The system 200 includes a Design Tool 210 that provides an interface via which a user can graphically design an application, such as an IVR application, a web application, a mobile phone application, or a multi-modal communication application. The Design Tool runs on either a server or a stand-alone computer. The Design Tool 210 includes a Code Generation Module 215 that generates application code 225 based on the application design 212. As will be discussed in more detail with respect to
The application code is deployed and executed on an Application Server 235, which represents one server or a network of servers. In executing the application code 225, the Application Server 235 logs data associated with the execution of the application. The logs 240 are stored in Database 250 by Analytics Server 260.
Analytics Server 260 processes the logged data to generate performance indicator values or other statistics related to the execution of the application code 225. The Analytics Server 260 includes (i) a Log Processor 255 that preprocesses logged data before it stores the data in Database 250, (ii) a Database Interface 265 that retrieves data from Database 250, and (iii) a Performance Indicator Processor 270 that processes statistics requested by the Design Tool 210. The Analytics Server 260 also may include one or more additional modules (not shown) related to rendering other reports based on the logged data (i.e., reports not requested by the Design Tool), but such modules are not relevant to the present invention. Execution-related statistics are requested by the Design Tool 210 via an Analytics Server Interface 220.
Referring to
After the designer completes the design, the Code Generation Module 215 of the Design Tool 210 generates application code for the designed application (step 320). In generating the application code, the Code Generation Module 215 maps nodes illustrated in the application flow to one or more lines of code. After the Code Generation Module 215 has identified the lines of code corresponding to each node in the application flow, the Code Generation Module 215 inserts markers in the application code that demark the boundaries between nodes and that indicate the type of information that will be logged for each node (step 330). In one embodiment, the markers are in the form of log statements.
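For illustration, the following minimal Python sketch shows how a code generation module might wrap the lines of code mapped to a node with start and end marker log statements of the kind described below; the function name, the generic log(...) call, and the sample node code are assumptions and do not represent the actual Code Generation Module 215.

    def insert_node_markers(node_id, node_code_lines):
        # Hypothetical sketch: wrap the generated code for one design node with
        # a start ("UIndst") marker and an end ("UIndnd") marker in the format
        # of the example log statements shown below. The generic log(...) call
        # emitted here stands in for whatever logging facility the target
        # language provides and is an assumption.
        start_marker = 'log("EVNT=UIndst|NAME=%s")' % node_id
        end_marker = ('log("EVNT=UIndnd|NAME=%s|RSLT=" + node_result + '
                      '"|INFO=" + result_info)') % node_id
        return [start_marker] + list(node_code_lines) + [end_marker]

    # Illustrative use on two assumed lines of generated IVR code:
    marked = insert_node_markers("Welcome", ["play_prompt()", "collect_input()"])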
The application code is then deployed on the Application Server 235 and executed (step 340). For each instance of the application executed, the Application Server 235 logs data in accordance with the node markers/log statements (step 350). Consequently, data is logged for each design node (i.e., application flow node) traversed during the execution.
The Analytics Server 260 obtains the data logs from the Application Server 235 (step 360). In one embodiment, the Analytics Server uses a scheduling mechanism to periodically (e.g., once a day) retrieve data logs from the Application Server 235, which may be a single server or a group of servers. The Log Processor 255 on the Analytics Server 260 “preprocesses” the data (i.e., it does some preliminary processing) (step 370) before it is stored in Database 250 (step 380).
Below is an example of log statements for a node:
EVNT=UIndst|NAME=node_name
[... node-specific log details]
EVNT=UIndnd|NAME=node_name|RSLT=node_result|INFO=result_info
Below are examples of logged events for an IVR application. In these examples, the start of an execution of a design node is logged as a “UIndst” event, and the end of the execution of a node is logged as a “UIndnd” event (in accordance with the example log statements above).
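A minimal sketch of how such pipe-delimited marker lines might be parsed during preprocessing is shown below; the function name and the sample attribute values are assumptions, not the actual behavior of the Log Processor 255.

    def parse_marker_event(line):
        # Hypothetical sketch: split a pipe-delimited marker line, such as
        # "EVNT=UIndnd|NAME=node_name|RSLT=node_result|INFO=result_info", into
        # a dictionary of attribute/value pairs.
        event = {}
        for field in line.strip().split("|"):
            if "=" in field:
                key, value = field.split("=", 1)
                event[key] = value
        return event

    # Illustrative use (the RSLT and INFO values are assumptions):
    sample = parse_marker_event("EVNT=UIndnd|NAME=Welcome|RSLT=SUCCESS|INFO=ok")
    # sample == {"EVNT": "UIndnd", "NAME": "Welcome", "RSLT": "SUCCESS", "INFO": "ok"}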
Referring to
Once the Design Tool establishes a secure connection with the Analytics Server 260, the Design Tool can then fetch data from the Analytics Server 260. To do so, the Design Tool 210 requests execution-related statistics (e.g., performance data) for an application from the Analytics Server 260 (step 420). The request includes the node IDs associated with the application (step 430). In the preferred embodiment, the request also includes definitions for the statistics desired (e.g., definitions for desired key performance indicators). This enables designers to define key performance indicators and other statistics after a designed application is compiled and execution-related data is logged (i.e., an "after-the-fact" definition). The variables in the definitions are attributes logged during the execution process, but the way the attributes are used in a definition can be determined subsequent to the logging. For example, an after-the-fact task definition of "success" and "failure" for a node named "Welcome" can be built from the "RSLT" and "INFO" attributes discussed above.
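A minimal sketch of such a definition follows; the Python form, the predicate structure, and the assumed RSLT values ("SUCCESS", "ERROR", "HANGUP") are illustrative assumptions rather than the actual definition syntax used by the Design Tool 210.

    # Hypothetical after-the-fact definition of "success" and "failure" for the
    # "Welcome" node, expressed as predicates over the logged UIndnd attributes.
    # The RSLT values tested here are assumptions for illustration only.
    welcome_definition = {
        "node": "Welcome",
        "success": lambda event: event.get("RSLT") == "SUCCESS",
        "failure": lambda event: event.get("RSLT") in ("ERROR", "HANGUP"),
    }

    # Illustrative use with a parsed UIndnd event (attribute values assumed):
    event = {"EVNT": "UIndnd", "NAME": "Welcome", "RSLT": "SUCCESS", "INFO": "ok"}
    is_success = welcome_definition["success"](event)   # True for this sample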
In an alternate embodiment, the request can include predefined statistics known to the Analytics Server 260.
In response to receiving the request from the Design Tool, the Analytics Server 260 calculates the statistics defined in the request for each of the nodes identified in the request (step 440). Calculations are based on the preprocessed logged data stored in the Database 250. The request may restrict the logged data used in the calculations to a certain set of parameters, such as a specified date range, a specified number of calls, etc. In the preferred embodiment, the Analytics Server 260 has a processor dedicated to calculating statistics requested by the Design Tool 210. In the system illustrated in
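The per-node calculation can be pictured with the minimal sketch below; the function, the date-range restriction, and the assumption that preprocessing adds a comparable "date" field to each event are illustrative and do not represent the actual Performance Indicator Processor 270.

    def node_success_rate(events, node_id, success, start=None, end=None):
        # Hypothetical sketch: compute the fraction of executions of one design
        # node that satisfy a "success" predicate, optionally restricted to a
        # date range. Each event is assumed to be a dict of logged attributes
        # (EVNT, NAME, RSLT, ...) plus a "date" field added during preprocessing.
        total = succeeded = 0
        for event in events:
            if event.get("EVNT") != "UIndnd" or event.get("NAME") != node_id:
                continue
            if start is not None and event["date"] < start:
                continue
            if end is not None and event["date"] > end:
                continue
            total += 1
            if success(event):
                succeeded += 1
        return succeeded / total if total else None

    # Illustrative use with the "Welcome" definition sketched above:
    # rate = node_success_rate(all_events, "Welcome", welcome_definition["success"])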
After the calculations are complete, the Analytics Server 260 provides the requested execution-related statistics for the designed application back to the Design Tool 210 (step 450). Statistics are provided for each node ID listed in the request. In the preferred embodiment, the Analytics Server 260 provides the statistics within a structured XML document, wherein separate statistics are provided for each node. The XML document is a representation of raw data and compiled statistics. Preferably, the Design Tool 210 persistently stores the statistics so that the Design Tool 210 only needs to download such statistics from Analytics Server 260 once (until new or updated statistics are desired).
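As a minimal sketch, the Design Tool side of this step might read and cache the per-node statistics as shown below; the XML element and attribute names (stats, node, id, stat) are assumptions, since no particular schema is prescribed here.

    import xml.etree.ElementTree as ET

    def load_node_statistics(xml_text):
        # Hypothetical sketch: read per-node statistics from a structured XML
        # document and index them by node ID so the Design Tool can persist and
        # reuse them without re-downloading. The element names are assumptions.
        stats_by_node = {}
        for node in ET.fromstring(xml_text).findall("node"):
            stats_by_node[node.get("id")] = {
                stat.get("name"): float(stat.get("value"))
                for stat in node.findall("stat")
            }
        return stats_by_node

    # Illustrative document (structure and values assumed):
    doc = ('<stats><node id="Welcome">'
           '<stat name="success_rate" value="0.72"/>'
           '</node></stats>')
    cached = load_node_statistics(doc)   # {"Welcome": {"success_rate": 0.72}}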
The designer can now elect to see the execution-related statistics within the UI of the Design Tool 210. In the preferred embodiment, the designer can elect to see statistics within the display of the application flow in the UI. In response to the designer indicating that he/she would like to see the execution-related statistics (step 510), the Design Tool 210 prompts the user to enter one or more thresholds that determine for which nodes statistics will be displayed (step 520). Examples of threshold requirements for an IVR application are:
FIG. 6b illustrates pop-up window 620 via which a user can enter values for the above-described threshold requirements.
In response to the user entering the thresholds, the Design Tool 210 retrieves the execution-related statistics provided by the Analytics Server 260, and uses such data to identify the nodes in the application flow that satisfy the threshold requirements (step 530). The Design Tool 210 then displays execution-related statistical information for the identified nodes, preferably adjacent the nodes in the display of the application flow (step 540).
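A minimal sketch of this filtering step is shown below; the threshold form (a minimum problem rate, echoing the 30% example above) and the statistic name "problem_rate" are assumptions.

    def nodes_meeting_threshold(stats_by_node, stat_name, minimum):
        # Hypothetical sketch: return the IDs of nodes whose value for a given
        # statistic meets or exceeds a designer-entered threshold, so that
        # statistical information is displayed only for those nodes.
        return [node_id for node_id, stats in stats_by_node.items()
                if stats.get(stat_name, 0.0) >= minimum]

    # Illustrative use, echoing the "problem during at least 30% of executions"
    # example above (node names and values assumed):
    flagged = nodes_meeting_threshold(
        {"Welcome": {"problem_rate": 0.45}, "Main Menu": {"problem_rate": 0.10}},
        "problem_rate", 0.30)   # ["Welcome"]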
Showing performance data and/or other execution-related statistical information for only select nodes (i.e., those nodes associated with statistics that meet the threshold requirements) makes it easier for a designer to identify the nodes that could be improved through design or other application specification modifications. However, in an alternate embodiment, performance data and/or other statistics are displayed for all nodes.
In the preferred embodiment, the statistics are reflected in performance meters, such as the performance meters 630 illustrated in
In one embodiment, the user can mouse over a performance meter to see additional information (e.g., additional execution-related statistics) related to the performance of the node. Examples of such additional information include:
Table 640 illustrates an example of additional information displayed for a node 635 when a user mouses over the performance meter. Specifically, Table 640 shows the total number of calls that entered node 635, the number/percentage that completed successfully, the number/percentage that ended in a hang up, the number/percentage that resulted in a live agent request, and the number/percentage of errors. The left column is the number of calls received at the node that are associated with the threshold condition, the middle column shows the percentages relative to calls that entered the node, and the right column shows the percentages relative to all calls in the system.
When the user elects to see node statistics, the Design Tool 210 may illustrate general information related to the statistics in the Design Tool 210 UI. In
The display of the performance meters/statistics can be toggled on and off as desired by the designer. In one embodiment, a designer can also generate an application flow specification that includes the performance meters so that key performance data can be presented to clients.
As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the above disclosure of the present invention is intended to be illustrative and not limiting of the invention.