Multidimensional metrics-based annotation

Information

  • Patent Grant
  • Patent Number
    8,261,181
  • Date Filed
    Thursday, March 30, 2006
  • Date Issued
    Tuesday, September 4, 2012
Abstract
Persistent annotations are created on a scorecard that combines multi-dimensional as well as fixed value data. The annotations, uniquely defined by the scorecard view definition and by the retrieved scorecard data, are independent of the data's dimensionality, enabling persistence of the annotations with the data and definition even when the scorecard is reconfigured. The annotations may include a “bubble-up” feature, where a hierarchical structure of the scorecard is inherited by the annotations. Threaded discussions and updated document lists are enabled around the annotations with appropriate permissions and/or credentials.
Description
BACKGROUND

Key Performance Indicators, also known as KPI or Key Success Indicators (KSI), help an organization define and measure progress toward organizational goals. Once an organization has analyzed its mission, identified all its stakeholders, and defined its goals, it needs a way to measure progress toward those goals. Key Performance Indicators are used to provide those measurements.


Scorecards are used to provide detailed and summary analysis of KPIs and aggregated KPIs such as KPI groups, objectives, and the like. Scorecard calculations are typically specific to a defined hierarchy of the above-mentioned elements, selected targets, and status indicator schemes. Business logic applications that generate, author, and analyze scorecards are typically enterprise applications with multiple users (subscribers), designers, and administrators. It is not uncommon for organizations to provide their raw performance data to a third party and receive scorecard representations, analysis results, and similar reports.


Many scorecard applications add a shared discussion mechanism to the same page as a scorecard and have users leave notes to each other. However, the comments may not necessarily reflect the data that is being shown in the scorecard. Moreover, after analysis has been conducted and the data has been manipulated to a specific configuration, it may be impossible to associate that configuration with a specific end-user annotation. In addition to discussion threads, other annotations may increase the effectiveness of the scorecard experience for users.


It is with respect to these and other considerations that the present invention has been made.


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.


Aspects are directed to providing annotation capability in which annotations, uniquely defined by a scorecard view definition and by data returned by an underlying query, are enabled independent of the data's dimensionality. A report view structure is also provided for annotations such that the annotations may be filtered and combined across scorecard dimensions (rows or columns).


These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary computing operating environment;



FIG. 2 illustrates a system where example embodiments may be implemented;



FIG. 3 illustrates an example scorecard architecture according to embodiments;



FIG. 4 illustrates a screenshot of an example scorecard;



FIG. 5 illustrates a screenshot of another example scorecard with annotation capability turned off and on;



FIG. 6 illustrates report view presentation of an annotation based on a discussion thread;



FIG. 7 illustrates a persistent architecture of annotations in a scorecard system; and



FIG. 8 illustrates a logic flow diagram of a process for using annotations within a scorecard architecture in a business logic system.





DETAILED DESCRIPTION

As briefly described above, a scorecard system enabling annotations independent of data dimensionality is provided allowing the annotations to be used with different scorecard configurations, filters, and the like. In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.


Referring now to the drawings, aspects and an exemplary operating environment will be described. FIG. 1 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. While the embodiments will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules.


Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


Embodiments may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.


With reference to FIG. 1, one exemplary system for implementing the embodiments includes a computing device, such as computing device 100. In a basic configuration, the computing device 100 typically includes at least one processing unit 102 and system memory 104. Depending on the exact configuration and type of computing device, the system memory 104 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. System memory 104 typically includes an operating system 105 suitable for controlling the operation of a networked personal computer, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash. The system memory 104 may also include one or more software applications such as program modules 106, scorecard application 120, and annotation module 122. Scorecard application 120 manages business evaluation methods, computes KPI's, and provides scorecard data to reporting applications. In some embodiments, scorecard application 120 may itself generate reports based on metric data.


Annotation module 122 manages creation, persistent storage, and presentation of annotations within the scorecard application 120. Annotation module 122 may be an integrated part of scorecard application 120 or a separate application. Scorecard application 120 and annotation module 122 may communicate between themselves and with other applications running on computing device 100 or on other devices. Furthermore, scorecard application 120 and annotation module 122 may be executed in an operating system other than operating system 105. This basic configuration is illustrated in FIG. 1 by those components within dashed line 108.


The computing device 100 may have additional features or functionality. For example, the computing device 100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 1 by removable storage 109 and non-removable storage 110. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 104, removable storage 109 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 100. Any such computer storage media may be part of device 100. Computing device 100 may also have input device(s) 112 such as keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 114 such as a display, speakers, printer, etc. may also be included. These devices are well known in the art and need not be discussed at length here.


The computing device 100 may also contain communication connections 116 that allow the device to communicate with other computing devices 118, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 116 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.


Referring to FIG. 2, a system where example embodiments may be implemented is illustrated. System 200 may comprise any topology of servers, clients, Internet service providers, and communication media. Also, system 200 may have a static or dynamic topology. The term “client” may refer to a client application or a client device employed by a user to perform business logic operations. Scorecard service 202, database server 204, and report server 206 may also be one or more programs or a server machine executing programs associated with the server tasks. Both clients and application servers may be embodied as a single device (or program) or a number of devices (programs). Similarly, data sources may include one or more data stores, input devices, and the like.


A business logic application may be run centrally on scorecard service 202 or in a distributed manner over several servers and/or client devices. Scorecard service 202 may include implementation of a number of information systems such as performance measures, business scorecards, and exception reporting. A number of organization-specific applications including, but not limited to, financial reporting, analysis, marketing analysis, customer service, and manufacturing planning applications may also be configured, deployed, and shared in system 200. In addition, the business logic application may also be run in one or more client devices and information exchanged over network(s) 210.


Data sources 212, 214, 216, and 218 are examples of a number of data sources that may provide input to scorecard service 202 through database server 204. Additional data sources may include SQL servers, databases, non-multi-dimensional data sources such as text files or EXCEL® sheets, multi-dimensional data sources such as data cubes, and the like. Database server 204 may manage the data sources, optimize queries, and the like.


According to some embodiments, data sources 212, 214, and 216 may store or provide annotations in addition to scorecard metric information. According to other embodiments, data source 218 may store annotations independent of the data such that the annotations can persist across dimensions, scorecard configurations, etc. This enables a subscriber to filter annotations when building a scorecard or dynamically after the scorecard is built.


Users may interact with scorecard service 202 running the business logic application from client devices 222, 224, and 226 over network(s) 210. In one embodiment, additional applications that consume scorecard-based data may reside on scorecard service 202 or client devices 222, 224, and 226. Examples of such applications and their relation to the scorecard application are provided below in conjunction with FIG. 3.


Report server 206 may include reporting applications, such as charting applications, alerting applications, analysis applications, and the like. These applications may receive scorecard data from scorecard service 202 and provide reports directly or through scorecard service 202 to clients.


Network(s) 210 may include a secure network such as an enterprise network, or an unsecure network such as a wireless open network. Network(s) 210 provide communication between the nodes described above. By way of example, and not limitation, network(s) 210 may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.


Many other configurations of computing devices, applications, data sources, data distribution and analysis systems may be employed to implement a business logic application automatically generating dashboards with scorecard metrics and subordinate reporting.


Now referring to FIG. 3, example scorecard architecture 300 is illustrated. Scorecard architecture 300 may comprise any topology of processing systems, storage systems, source systems, and configuration systems. Scorecard architecture 300 may also have a static or dynamic topology.


Scorecards are a simple method of evaluating organizational performance. The performance measures may vary from financial data such as sales growth to service information such as customer complaints. In a non-business environment, student performances and teacher assessments are other examples of performance measures for which scorecards may be employed. In the exemplary scorecard architecture 300, a core of the system is scorecard engine 308. Scorecard engine 308 may be an application that is arranged to evaluate performance metrics. Scorecard engine 308 may be loaded into a server, executed over a distributed network, executed in a client device, and the like.


In addition to performing scorecard calculations, scorecard engine 308 may also provide report parameters associated with a scorecard to other applications 318. The report parameters may be determined based on a subscriber request or a user interface configuration. The user interface configuration may include a subscriber credential or a subscriber permission attribute. The report parameters may include a scorecard identifier, a scorecard view identifier, a row identifier, a column identifier, a page filter, a performance measure group identifier, or a performance measure identifier. The performance measure may be a KPI, a KPI group, or an objective. The page filter determines a period and an organizational unit for application of the scorecard calculations.
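To make the parameter list above concrete, the following Python sketch models a report parameter bundle as a simple record. It is illustrative only; the field names and the dictionary-based page filter are assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of the report parameters listed above; all field
# names are assumptions for illustration, not identifiers from the patent.
@dataclass
class ReportParameters:
    scorecard_id: str
    view_id: str
    row_id: Optional[str] = None
    column_id: Optional[str] = None
    page_filter: dict = field(default_factory=dict)  # e.g. {"period": "2006-Q1", "org_unit": "Asia"}
    measure_group_id: Optional[str] = None
    measure_id: Optional[str] = None                 # a KPI, KPI group, or objective

params = ReportParameters(
    scorecard_id="manufacturing-eval",
    view_id="default-view",
    page_filter={"period": "2006-Q1", "org_unit": "Asia"},
    measure_id="inventory-turns",
)
```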


Data for evaluating various measures may be provided by a data source. The data source may include source systems 312, which provide data to a scorecard cube 314. Source systems 312 may include multi-dimensional databases such as an Online Analytical Processing (OLAP) database, other databases, individual files, and the like, that provide raw data for generation of scorecards. Scorecard cube 314 is a multi-dimensional database for storing data to be used in determining Key Performance Indicators (KPIs) as well as generated scorecards themselves. As discussed above, the multi-dimensional nature of scorecard cube 314 enables storage, use, and presentation of data over multiple dimensions such as compound performance indicators for different geographic areas, organizational groups, or even for different time intervals. Scorecard cube 314 has a bi-directional interaction with scorecard engine 308 providing and receiving raw data as well as generated scorecards.


Scorecard database 316 is arranged to operate in a similar manner to scorecard cube 314. In one embodiment, scorecard database 316 may be an external database providing redundant back-up database service.


Scorecard builder 302 may be a separate application, a part of the performance evaluation application, and the like. Scorecard builder 302 is employed to configure various parameters of scorecard engine 308 such as scorecard elements, default values for actuals, targets, and the like. Scorecard builder 302 may include a user interface such as a web service, a Graphical User Interface (GUI), and the like.


Strategy map builder 304 is employed for a later stage in the scorecard generation process. As explained below, scores for KPIs and parent nodes such as Objective and Perspective may be presented to a user in the form of a strategy map. Strategy map builder 304 may include a user interface for selecting graphical formats, indicator elements, and other graphical parameters of the presentation.


Data sources 306 may be another source for providing raw data to scorecard engine 308. Data sources may comprise a mix of several multi-dimensional and relational databases or other Open Database Connectivity (ODBC)-accessible data source systems (e.g. Excel, text files, etc.). Data sources 306 may also define KPI mappings and other associated data.


Scorecard architecture 300 may include scorecard presentation 310. This may be an application to deploy scorecards, customize views, coordinate distribution of scorecard data, and process web-specific applications associated with the performance evaluation process. For example, scorecard presentation 310 may include a web-based printing system, an email distribution system, and the like. A user interface for scorecard presentation 310 may also include an overview of available scorecards for a subscriber to select from. Scorecard presentation 310 may further include a matrix or a list presentation of the scorecard data. The scorecard presentation and one or more zones for other applications may be displayed in an integrated manner.


Annotation module 320 is configured to interact with scorecard engine 308, scorecard presentation 310, and other applications 318, and to manage generation, storage, and presentation of annotations across different scorecard configurations, report views, and subscriber-defined filters. According to embodiments, annotations may be created on a scorecard that combines multi-dimensional as well as fixed value data. The annotations are uniquely defined by the scorecard view definition and by data brought back from an underlying query, independent of the data's dimensionality. Thus, as the scorecard is reconfigured with additional columns and/or rows, the annotations can remain with the data and definition.
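One way to read "uniquely defined by the scorecard view definition and by data brought back from an underlying query" is that an annotation is keyed by the view definition plus the dimension members returned with the cell's data, rather than by a row or column position. The Python sketch below illustrates that reading; all class and field names are assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical key: the annotation is addressed by the view definition and by
# the dimension members returned with the cell's data, not by row/column
# position, so reconfiguring the scorecard does not orphan it.
@dataclass(frozen=True)
class AnnotationKey:
    view_definition_id: str
    member_coordinates: frozenset  # e.g. {("Geography", "Asia"), ("Time", "2006-02-21")}

@dataclass
class Annotation:
    key: AnnotationKey
    author: str
    created: str          # ISO timestamp of the entry's time point
    kind: str             # "comment", "discussion-entry", "document-list"
    body: str

key = AnnotationKey(
    view_definition_id="contoso-regional-view",
    member_coordinates=frozenset(
        {("Geography", "Asia"), ("Time", "2006-02-21"), ("KPI", "Ad Campaign")}
    ),
)
note = Annotation(key, author="analyst1", created="2006-02-21T09:00:00",
                  kind="comment", body="Campaign launch slipped one week.")
```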


Threaded discussions and updated document lists are enabled around the annotations with appropriate permissions and/or credentials. Utilization of annotation data stores independent from a scorecard server is also made possible. According to some embodiments, annotations may be rolled up across dimensions (rows or columns) to be used in reporting scenarios (e.g. “Show me all comments for North America for a specific product and a specific time period”).


Other applications 318 may include any application that receives data associated with a report parameter and consumes the data to provide a report, perform analysis, provide alerts, perform further calculations, and the like. The data associated with the report parameter includes content data and metadata. Other applications may be selected based on the report parameter, a subscriber request, or a user interface configuration. The user interface configuration may include a subscriber credential or a subscriber permission attribute. Other applications 318 may include a graphical representation application, a database application, a data analysis application, a communications application, an alerting application, or a word processing application.



FIG. 4 illustrates a screenshot of an example scorecard. As explained before, Key Performance Indicators (KPIs) are specific indicators of organizational performance that measure a current state in relation to meeting the targeted objectives. Decision makers may utilize these indicators to manage the organization more effectively.


Once created, a KPI definition may be used across several scorecards. This is useful when different scorecard managers have a KPI in common. Sharing the KPI definition may ensure a standard definition is used for that KPI. Despite the shared definition, each individual scorecard may utilize a different data source and data mappings for the actual KPI.


Each KPI may include a number of attributes. Some of these attributes include frequency of data, unit of measure, trend type, weight, and other attributes. The frequency of data identifies how often the data is updated in the source database (cube). The frequency of data may include: Daily, Weekly, Monthly, Quarterly, and Annually.


The unit of measure provides an interpretation for the KPI. Some of the units of measure are: Integer, Decimal, Percent, Days, and Currency. These examples are not exhaustive, and other elements may be added without departing from the scope of the invention.


A trend type may be set according to whether an increasing trend is desirable or not. For example, increasing profit is a desirable trend, while increasing defect rates is not. The trend type may be used in determining the KPI status to display and in setting and interpreting the KPI banding boundary values. The trend arrows displayed in scorecard 400 indicate how the numbers are moving this period compared to last. If in this period the number is greater than last period, the trend is up regardless of the trend type. Possible trend types may include: Increasing Is Better, Decreasing Is Better, and On-Target Is Better.
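The trend arrow and trend type interact as described above: the arrow reflects only the raw movement from last period to this period, while the trend type decides whether that movement is desirable. A minimal sketch, with assumed function names:

```python
# Minimal sketch: the arrow reflects raw movement; the trend type decides
# whether that movement is good. Names are illustrative assumptions.
def trend_arrow(current: float, previous: float) -> str:
    if current > previous:
        return "up"
    if current < previous:
        return "down"
    return "flat"

def movement_is_desirable(arrow: str, trend_type: str) -> bool:
    if trend_type == "Increasing Is Better":
        return arrow == "up"
    if trend_type == "Decreasing Is Better":
        return arrow == "down"
    return arrow == "flat"          # "On-Target Is Better"

print(trend_arrow(120, 100), movement_is_desirable("up", "Decreasing Is Better"))  # up False
```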


Weight is a positive integer used to qualify the relative value of a KPI in relation to other KPIs. It is used to calculate the aggregated scorecard value. For example, if an Objective in a scorecard has two KPIs, where the first KPI has a weight of 1 and the second has a weight of 3, the second KPI is essentially three times more important than the first, and this weighted relationship is part of the calculation when the KPIs' values are rolled up to derive the value of their parent Objective.
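The weighting rule can be pictured as a weighted average when KPI values roll up to their parent Objective. The exact formula is not spelled out in the text, so the normalization below is an assumption; the 1-versus-3 weights mirror the example above.

```python
# Hedged sketch of a weight-based rollup: the parent Objective's value is the
# weighted average of its child KPI scores. The exact formula is an assumption;
# the patent only states that weights qualify relative importance.
def rollup(kpi_scores_and_weights):
    total_weight = sum(w for _, w in kpi_scores_and_weights)
    return sum(score * w for score, w in kpi_scores_and_weights) / total_weight

# Objective with two KPIs: weight 1 and weight 3, so the second counts 3x.
print(rollup([(0.90, 1), (0.60, 3)]))  # 0.675
```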


Other attributes may contain pointers to custom attributes that may be created for documentation purposes or used for various other aspects of the scorecard system such as creating different views in different graphical representations of the finished scorecard. Custom attributes may be created for any scorecard element and may be extended or customized by application developers or users for use in their own applications. They may be any of a number of types including text, numbers, percentages, dates, and hyperlinks.


One of the benefits of defining a scorecard is the ability to easily quantify and visualize performance in meeting organizational strategy. By providing a status at an overall scorecard level, and for each perspective, each objective or each KPI rollup, one may quickly identify where one might be off target. By utilizing the hierarchical scorecard definition along with KPI weightings, a status value is calculated at each level of the scorecard.


First column of scorecard 400 shows example elements: perspective 420 “Manufacturing” with objectives 422 and 424 (“Inventory” and “Assembly,” respectively) reporting to it. Second column 402 in scorecard 400 shows results for each measure from a previous measurement period. Third column 404 shows results for the same measures for the current measurement period. In one embodiment, the measurement period may include a month, a quarter, a tax year, a calendar year, and the like.


Fourth column 406 includes target values for specified KPIs on scorecard 400. Target values may be retrieved from a database, entered by a user, and the like. Column 408 of scorecard 400 shows status indicators.


Status indicators 430 convey the state of the KPI. An indicator may have a predetermined number of levels. A traffic light is one of the most commonly used indicators. It represents a KPI with three-levels of results—Good, Neutral, and Bad. Traffic light indicators may be colored red, yellow, or green. In addition, each colored indicator may have its own unique shape. A KPI may have one stoplight indicator visible at any given time. Indicators with more than three levels may appear as a bar divided into sections, or bands. Column 416 includes trend type arrows as explained above under KPI attributes. Column 418 shows another KPI attribute, frequency.
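A three-level indicator such as the traffic light can be viewed as banding the KPI's actual-to-target ratio against boundary values. The thresholds in this sketch are invented for illustration and are not taken from the patent.

```python
# Illustrative banding for a three-level traffic-light indicator; the
# boundary values (0.95 and 0.80) are invented, not taken from the patent.
def traffic_light(actual: float, target: float,
                  good_band: float = 0.95, neutral_band: float = 0.80) -> str:
    ratio = actual / target if target else 0.0
    if ratio >= good_band:
        return "Good (green)"
    if ratio >= neutral_band:
        return "Neutral (yellow)"
    return "Bad (red)"

print(traffic_light(actual=48, target=50))   # Good (green): 48/50 = 0.96
print(traffic_light(actual=42, target=50))   # Neutral (yellow): 42/50 = 0.84
```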



FIG. 5 illustrates a screenshot of another example scorecard with annotations emphasizing the “bubble-up” capability of the annotations. According to some embodiments, a hierarchy representation may be stored for filter values. For example, the time dimension may have day, week, month, quarter, or year values. An annotation entry for a specific time value may then be presented (depending on filtering parameters) in multiple time value selections. Similar hierarchical multidimensionality may be implemented for other parameters such as products, geographies, corporate structures, and the like.
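The hierarchical "bubble-up" along the time dimension can be pictured as mapping an entry's time point to every coarser member of a stored day/month/quarter/year hierarchy, so the same annotation surfaces at each of those filter selections. A minimal sketch under that assumption:

```python
from datetime import date

# Minimal sketch of time-dimension bubble-up: a single entry time point maps
# to every coarser member of an assumed day/month/quarter/year hierarchy, so
# the annotation surfaces at each of those filter selections.
def time_ancestors(d: date):
    quarter = (d.month - 1) // 3 + 1
    return {
        "day": d.isoformat(),
        "month": f"{d.year}-{d.month:02d}",
        "quarter": f"{d.year} Q{quarter}",
        "year": str(d.year),
    }

print(time_ancestors(date(2006, 2, 21)))
# {'day': '2006-02-21', 'month': '2006-02', 'quarter': '2006 Q1', 'year': '2006'}
```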


Diagram 500 shows an example scorecard for time period 2006 Q1 (502). The scorecard is a typical scorecard with hierarchically structured KPI's and objectives, actuals, targets, and status indicators for different organizational units for a fictional company, Contoso, Ltd. Annotation capability is turned on in the scorecard as indicated by the legend section. The legend section may provide additional detailed information about annotations to a subscriber such as different types, etc. Annotation capability may be turned on or off for a number of reasons. Security level of a client device or the system, subscriber permission levels, system resources, and the like, are examples of why the capability may not always be provided. Scorecard application UI may provide a number of ways to turn on annotation capability such as control icons or buttons.


The columns in the scorecard represent different corporate units: the Asian subsidiary, the Latin American subsidiary, and the North American subsidiary. Cells 506 and 508 of the scorecard include annotations 510 associated with them, as indicated by the dark colored triangles on their corners. According to some embodiments, annotations are associated with scorecard elements such as cells, but are independent of scorecard data dimensionality (i.e. scorecard configuration). Thus, annotations are uniquely defined by the scorecard view definition and by data returned by an underlying query. This way, the annotations can remain with the data and the definition as the scorecard is reconfigured with additional or fewer columns and/or rows.


The annotations may include discussion threads, comments, document lists, and the like. For example, the annotations 510 shown in the scorecard indicate progress of advertising campaigns for different corporate units. Moreover, each annotation has a time point associated with its entry. Using the time point, annotations may be stored, indexed, ordered, or presented based on a filtering parameter associated with the time dimension. This capability provides for a feature of the annotations referred to herein as the “bubble-up” feature. For example, the two annotations for the Asian subsidiary with entry time points of Jan. 1 and Feb. 21 of 2006 may be ordered or indexed based on their time points. Furthermore, each annotation may bubble up in report views where the time dimension parameter is monthly, quarterly, or annual. For example, the Asian subsidiary annotations may be presented in a view filtered for Q1 of 2006 or for 2006 as a whole year. According to another embodiment, a scorecard application providing annotation capability may also filter the annotations based on a number of other parameters including, but not limited to, subscriber permission level(s), subscriber preferences, temporal selection, geographic selection, and the like. For example, for annotations that include discussion threads, the application creating the scorecard may restrict the subscribers who are allowed to add to the discussion thread based on their permission level for a particular cell. Similarly, viewing (or presentation in a report view UI) of the annotations may also be filtered based on similar parameters.


Actions that may be performed on the scorecard in FIG. 5 may include adding an annotation, viewing an annotation, updating an annotation, or removing an annotation. As explained above, any of these actions may be limited based on permission levels. Temporal or geographic filtering may include combining or limiting entries of discussion threads based on predefined or subscriber specified selection such as Asia and Latin America, or Q1 of 2006.


Moreover, a multiselect function may be provided according to some embodiments. Multiple cells may be selected graphically or otherwise for combining annotations associated with those cells in a report view presentation. The combination does not have to be a simple addition of all annotations of the selected cells, however. Through an interactive process, the annotations associated with the selected cells may be combined employing any logic operation such as AND, OR, and the like.
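Combining annotations across a multiselect with logic operations can be pictured as set operations over each cell's annotation set: OR as union, AND as intersection. The sketch below is illustrative; the annotation identifiers are invented.

```python
# Illustrative multiselect combination: OR gathers annotations attached to any
# selected cell, AND keeps only those shared by every selected cell.
def combine_annotations(per_cell_annotations, op="OR"):
    sets = [set(a) for a in per_cell_annotations]
    if not sets:
        return set()
    if op == "AND":
        result = sets[0]
        for s in sets[1:]:
            result &= s
        return result
    return set().union(*sets)   # default: OR

asia  = {"ad-campaign-jan", "ad-campaign-feb"}
latam = {"ad-campaign-feb", "supply-delay"}
print(combine_annotations([asia, latam], op="OR"))   # union of both cells' annotations
print(combine_annotations([asia, latam], op="AND"))  # {'ad-campaign-feb'}
```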


Embodiments are not limited to the example scorecard layouts, annotation types and indications described above. Annotation capability in a scorecard may be provided in many other ways using the principles described herein.



FIG. 6 illustrates report view presentation of an annotation based on a discussion thread. Diagram 600 includes scorecard application UI 602 with scorecard 604 and report view UI 606 with annotation 630.


Scorecard application UI 602 is the scorecard presentation screen of a scorecard application. It presents example scorecard 604 for “Manufacturing Evaluation” for the first quarter of 2005 (Q1-2005). Elements of scorecard 604 such as KPIs, objectives, columns, indicators, and the like have been described previously. One of the cells of scorecard 604 includes annotation 630, which represents a discussion thread associated with the cell.


As mentioned previously, a preview of the annotation (in this case the discussion thread) may be provided in the scorecard view, for example by right-clicking on the selected cell. According to some embodiments, the annotation may also be presented in a report view format as shown in report view UI 606. The report view editor lists all entries of the discussion thread in a predefined format. The entries in the discussion thread are typically from subscribers with sufficient permission. The viewing of the report view UI 606 may also be filtered, however, based on the permission level of the viewer and other parameters such as time, geography, and the like. According to other embodiments, a subscriber may be enabled to further filter the annotation presentation by specifying or modifying filter parameters after the report view has been presented.



FIG. 7 illustrates a persistent architecture of annotations in a scorecard system. As diagram 700 shows, the annotation architecture includes at its core an annotations data store 702. Annotations stored independently of scorecard data dimensionality may be filtered by geography, time, person, and the like, before being provided to a scorecard configuration 704. As explained previously, scorecards may be built by reconfiguring elements of existing scorecards. Thus, a scorecard built from an existing one may also inherit annotations associated with the elements of the original scorecard.


Once the new scorecard configuration is put together, the annotations may then be provided to individual KPIs 706 (KPI 1 through 3). For example, an original scorecard for the North American business unit of an enterprise may include a document list associated with sales targets for a particular year. A subscriber may build a new scorecard for worldwide sales of the whole enterprise, bringing in the sales KPIs from individual scorecards for regional business units. When the North American sales KPI is pulled into the new scorecard, the annotations associated with the sales target element may also be inherited, making the document list available to the viewers of the new scorecard.
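Because annotations are keyed to scorecard elements rather than to a particular layout, pulling a KPI into a new scorecard can be imagined as simply re-querying the same annotation store with that element's identifier. A hedged sketch with invented names:

```python
# Hedged sketch of annotation inheritance: building a new scorecard from an
# existing KPI re-queries the same store by element identifier, so the
# original document list "comes along" without copying anything.
class AnnotationStore:
    def __init__(self):
        self._by_element = {}   # element id -> list of annotations

    def add(self, element_id, annotation):
        self._by_element.setdefault(element_id, []).append(annotation)

    def for_elements(self, element_ids):
        return [a for e in element_ids for a in self._by_element.get(e, [])]

store = AnnotationStore()
store.add("sales-target-na-2006", {"kind": "document-list", "body": "2006 NA sales plan.docx"})

# A new worldwide scorecard pulls in the North American sales KPI:
worldwide_elements = ["sales-target-na-2006", "sales-target-asia-2006"]
print(store.for_elements(worldwide_elements))
```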


The example implementations of annotations, scorecards, and views in FIGS. 4 through 7 are intended for illustration purposes only and should not be construed as a limitation on embodiments. Other embodiments may be implemented using the principles described herein.



FIG. 8 illustrates a logic flow diagram of a process for using annotations within a scorecard architecture in a business logic system. Process 800 may be implemented in a business logic application.


Process 800 begins with operation 802, where a request to enter an annotation is received. Elements of a scorecard that can be annotated may be graphically indicated (e.g. distinct color, marker, etc.) in the scorecard presentation. A subscriber may indicate his/her request to add an annotation by clicking on a marker, activating a button, and the like. In some embodiments, the subscriber may enter an annotation multi-dimensionally by filtering and/or clicking on a cell that represents a multi-dimensional value (for example, the Asian subsidiary in FIG. 5). Annotations may then be stored, indexed, ordered, or presented based on their dimensionality. Processing advances from operation 802 to decision operation 804.


At decision operation 804, a determination is made whether the requesting subscriber has sufficient permission level(s). For example, a discussion thread for a KPI may be restricted to executive level subscribers only. Thus, a subscriber without the requisite permission level may not be allowed to add to the discussion thread for that particular KPI. If the decision at operation 804 is negative, processing may terminate. If the subscriber has proper permission level(s), processing proceeds to operation 806.


At operation 806, the annotation is received. As described previously, the annotation may include a comment, an entry for a discussion thread, an entry for a document list, and the like. Processing moves from operation 806 to operation 808.


At operation 808, the annotation is stored such that a persistent structure for the annotation is preserved. Thus, the stored annotation can be viewed or otherwise used in a reporting structure independent of data dimensionality. The discussion associated with FIG. 6 explains in more detail how persistent annotations can be used across dimensions and configurations. The annotation may be stored in a scorecard data store in conjunction with the scorecard data, or it may be stored in an independent annotation data store. Once the annotation is stored, it may be made available to the whole scorecard architecture for use in different configurations and reporting presentations. The second portion of the flowchart, beginning with operation 810, shows such a use of the annotation after it has been stored.
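Taken together, operations 802 through 808 amount to: accept the request, verify the subscriber's permission for the element, and persist the annotation. The sketch below follows that flow; the permission model and all names are assumptions for illustration.

```python
# Minimal sketch of operations 802-808 as described above: accept an
# annotation request, check the subscriber's permission for the element,
# then persist it. The permission model and names are assumptions.
REQUIRED_LEVEL = {"revenue-kpi-target": "executive"}   # per-element restriction (assumed)
LEVEL_RANK = {"viewer": 0, "analyst": 1, "executive": 2}

def add_annotation(store, element_id, subscriber_level, annotation):
    required = REQUIRED_LEVEL.get(element_id, "viewer")
    if LEVEL_RANK[subscriber_level] < LEVEL_RANK[required]:
        return False                                        # decision operation 804: reject
    store.setdefault(element_id, []).append(annotation)     # operations 806-808: receive and store
    return True

store = {}
print(add_annotation(store, "revenue-kpi-target", "analyst", "Q1 numbers look soft"))    # False
print(add_annotation(store, "revenue-kpi-target", "executive", "Q1 numbers look soft"))  # True
```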


Operation 810 is illustrated as following operation 808 with a dashed line. This connection is intended to show that other operations may take place between the two operations, or one may not necessarily follow the other in a chronological order. At operation 810, a request to view the annotation is received. Examples of such a request include a real time action by a subscriber viewing the scorecard, a reporting application preparing a report presentation that includes the annotations, and the like. Processing advances from operation 810 to decision operation 812.


At decision operation 812, a determination is made whether the requesting subscriber has sufficient permission level(s). For example, a discussion thread or individual annotation for a particular scorecard element (e.g. target values for a specific KPI) may be restricted to executive level subscribers only. Thus, a subscriber without the requisite permission level may not be allowed to view the whole discussion thread or portions of the thread associated with that particular scorecard element. If the decision at operation 812 is negative, processing may return to operation 810. If the subscriber has proper permission level(s), processing proceeds to operation 814.


At operation 814, one or more filters are applied to the annotation(s). Filters may include temporal filters, logic combinations of scorecard elements (cells), subscriber credentials, geographic filters, organizational architecture filters, and the like. For example, a discussion thread may be filtered to include entries for Q1 of 2006 only and for North America and Asia business units of an enterprise. As discussed before, the filters may combine different conditions in any logic combination such as AND, OR, etc. Processing moves from operation 814 to operation 816.
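Operation 814 can combine conditions such as a time window and a set of business units, as in the Q1 2006 / North America and Asia example above. A small illustration with invented entries and field names:

```python
# Illustration of operation 814: keep discussion entries for Q1 2006 AND for
# the North America or Asia business units. Entries and field names are invented.
entries = [
    {"unit": "Asia",          "quarter": "2006 Q1", "text": "Campaign launched."},
    {"unit": "Latin America", "quarter": "2006 Q1", "text": "Budget pending."},
    {"unit": "Asia",          "quarter": "2006 Q2", "text": "Mid-year review."},
]

def apply_filters(entries, quarter, units):
    return [e for e in entries if e["quarter"] == quarter and e["unit"] in units]

print(apply_filters(entries, "2006 Q1", {"North America", "Asia"}))
# [{'unit': 'Asia', 'quarter': '2006 Q1', 'text': 'Campaign launched.'}]
```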


At operation 816, the annotation(s) is (are) presented in the requested format, such as a report presentation. Annotations may be presented in many formats including, but not limited to, pop-ups, separate reports, separate displays (e.g. scorecard view is presented on one display, a selected annotation is presented on another display), and the like. After operation 816, processing moves to a calling process for further actions.


The operations included in process 800 are for illustration purposes. Using persistent annotations within a scorecard architecture may be implemented by similar processes with fewer or additional steps, as well as in a different order of operations, using the principles described herein.


The above specification, examples and data provide a complete description of the manufacture and use of the composition of the embodiments. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims and embodiments.

Claims
  • 1. A computer-readable storage medium device having computer-executable instructions which when executed perform a method for recording and presenting annotations in a scorecard, the method executed by the computer-executable instructions comprising: receiving a plurality of annotations corresponding to an associated scorecard element, each of the plurality of annotations including a dimension parameter; storing the plurality of annotations based on the associated scorecard element and the dimension parameter such that the plurality of annotations are independent from a configuration of the scorecard; receiving a selection of multiple elements of the scorecard, wherein receiving the selection of multiple elements of the scorecard comprises receiving a request to view any annotations associated with the selected elements; and providing the plurality of annotations associated with the selected elements of the scorecard for a report view presentation based on a filtering of the dimension parameter, wherein providing the plurality of annotations for the report view presentation based on the filtering of the dimension parameter comprises: combining each selected element's corresponding annotations, sorting the combined annotations based on the dimension parameter, and presenting the combined and sorted annotations in the report view presentation.
  • 2. The computer-readable storage medium device of claim 1, wherein the filtering of the dimension parameter includes a filtering of at least one from a set of: time, product, geographic configuration, and organizational structure.
  • 3. The computer-readable storage medium device of claim 1, further comprising confirming a permission attribute based on a subscriber credential and the scorecard element.
  • 4. The computer-readable storage medium device of claim 1, further comprising filtering individual entries of a discussion thread prior to the report view presentation based on a predefined set of conditions.
  • 5. The computer-readable storage medium device of claim 4, wherein filtering based on the predefined set of conditions comprises filtering based on the predefined set of conditions that: is determined by one of a subscriber and an administrator, and includes at least one from a set of selection by geography, selection by subscriber credential, and selection by time.
  • 6. A system for using persistent scorecard annotations in a business logic system, the system comprising: a scorecard application, operatively associated with at least one computing system comprising a memory storage and a processing unit coupled to the memory storage, configured to compute scorecard metrics and provide a scorecard presentation based on the computed scorecard metrics; and an annotation module, operatively associated with the at least one computing system, configured to: record at least one annotation associated with a scorecard element in a scorecard, the at least one annotation associated with the scorecard element being further associated with at least one time value, restrict the at least one annotation if the annotation was not received from a permitted source allowed to annotate a cell of the scorecard, provide an indication of the at least one annotation within the associated scorecard element in the scorecard, the indication comprising a displayed icon within the scorecard element, receive a request to view, simultaneously, any annotations associated with multiple scorecard elements in the scorecard; combine each annotation associated with the multiple scorecard elements in the scorecard; filter the combined annotations based on a filtering parameter, the filtering parameter comprising a temporal parameter, and provide the combined and filtered annotations for a report view presentation, the combined and filtered annotations being provided in multiple time values based on the temporal parameter.
  • 7. The system of claim 6, wherein the annotation module is further configured to provide a preview of the at least one annotation as part of the scorecard presentation.
  • 8. The system of claim 6, wherein the annotation module is further configured to enable a subscriber to perform at least one action from a set of read, create, delete, and update actions associated with the at least one annotation based on a permission attribute assigned to the subscriber.
  • 9. The system of claim 6, wherein the annotation module is configured to filter the recorded at least one annotation dynamically when at least one of an element and a configuration of the scorecard is modified.
  • 10. The system of claim 6, wherein the annotation module is further configured to combine a plurality of annotations associated with distinct cells of the scorecard employing a logic operation.
  • 11. The system of claim 6, wherein the annotation module is integrated with the scorecard application.
  • 12. A method for recording and presenting annotations in a scorecard system, the method comprising: receiving, by a computer, a plurality of annotations corresponding to a scorecard element of a scorecard, each of the plurality of annotations including a dimension parameter; storing the plurality of annotations based on the scorecard element and the dimension parameter such that the plurality of annotations are independent from a configuration of the scorecard; receiving a selection of multiple elements of the scorecard, wherein receiving the selection of multiple elements of the scorecard comprises receiving a request to view any annotations associated with the selected elements of the scorecard; and providing the plurality of annotations associated with the selected elements of the scorecard for a report view presentation based on a filtering of the dimension parameter, wherein providing the plurality of annotations for the report view presentation based on the filtering of the dimension parameter comprises: combining each selected element's corresponding annotations, sorting the combined annotations based on the dimension parameter, and presenting the combined and sorted annotations in the report view presentation.
  • 13. The method of claim 12, wherein the filtering of the dimension parameter includes a filtering of at least one from a set of: time, product, geographic configuration, and organizational structure.
  • 14. The method of claim 12, further comprising confirming a permission attribute based on the element of the scorecard.
  • 15. The method of claim 12, further comprising confirming a permission attribute based on a subscriber credential.
  • 16. The method of claim 12, further comprising filtering individual entries of a discussion thread prior to the report view presentation based on a predefined set of conditions.
  • 17. The method of claim 16, wherein filtering based on the predefined set of conditions comprises filtering based on the predefined set of conditions that is determined by a subscriber and includes at least one from a set of selection by geography, selection by subscriber credential, and selection by time.
  • 18. The method of claim 16, wherein filtering based on the predefined set of conditions comprises filtering based on the predefined set of conditions that is determined by an administrator and includes at least one from a set of selection by geography, selection by subscriber credential, and selection by time.
Related Publications (1)
Number Date Country
20070234198 A1 Oct 2007 US