Service architecture based metric views

Information

  • Patent Grant
  • Patent Number
    8,321,805
  • Date Filed
    Tuesday, January 30, 2007
  • Date Issued
    Tuesday, November 27, 2012
Abstract
Scorecard associated content is provided to limited user interfaces of desktop visualization applications on a user's computing device (e.g. a mobile computing device, an ultra-mobile computing device, a personal digital assistant, an in-car computing system, and a tablet computing device) for delivery of personalized and scalable metrics. Users are enabled to set up personalized metric views based on predefined or user-defined desktop visualization applications employing indicators, partial report views, audio, video, and the like. Data delivery attributes from local or remote data sources can be set for deployment of the desktop visualization applications in a service based architecture. Computing device visualization applications may also be used to activate local or remote applications for various scorecard operations.
Description
BACKGROUND

Key Performance Indicators (KPIs) are quantifiable measurements that reflect the critical success factors of an organization ranging from income that comes from return customers to percentage of customer calls answered in the first minute. Key Performance Indicators may also be used to measure performance in other types of organizations such as schools, social service organizations, and the like. Measures employed as KPI within an organization may include a variety of types such as revenue in currency, growth or decrease of a measure in percentage, actual values of a measurable quantity, and the like.
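The KPI measure types above can be modeled as a small data structure. The following is a hypothetical sketch; the class, field, and example names are illustrative, not part of the patent:

```python
from dataclasses import dataclass
from enum import Enum

class MeasureType(Enum):
    CURRENCY = "currency"        # e.g. revenue in a currency
    PERCENTAGE = "percentage"    # e.g. growth or decrease of a measure
    ACTUAL_VALUE = "actual"      # raw value of a measurable quantity

@dataclass
class KPI:
    """A quantifiable measurement reflecting a critical success factor."""
    name: str
    measure_type: MeasureType
    value: float

# Example KPIs reflecting the measures mentioned above
revenue = KPI("Income from return customers", MeasureType.CURRENCY, 1_250_000.0)
first_minute = KPI("Calls answered in first minute", MeasureType.PERCENTAGE, 87.5)
```

A real scorecard system would attach far more metadata to each KPI; this only captures the measure-type distinction drawn in the text.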


An average business user may need to visit several data systems before they are able to find the information (such as KPIs, aggregations, metric presentation, and the like) they need. The high cost of data access may add overhead to the decision-making process, driving up costs and the frequency of decisions made on incomplete data, and lowering the overall Return On Investment (ROI) on the data that is collected for the enterprise.


On the other hand, desktop visualization application user interfaces with limited functionality and presentation (e.g. “gadgets” or “widgets”) are becoming popular components of operating system desktops. For example, analog or digital clocks, weather presentations, news feeds, and the like are commonly found on many computer users' screens. These user interfaces provide a simple tool for the user to access selected information (e.g. traffic information for a selected area). Use of widgets or gadgets is expanding to numerous areas such as medical information (e.g. doctors can monitor a particular patient's vital information).


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.


Embodiments are directed to providing content to desktop visualization applications (“gadgets”) on a user's desktop for delivery of personalized and scalable metrics. Configurable user interfaces within a service-based architecture enable users to set up personalized metric views and data delivery attributes. The user interfaces may be coupled with applications that have expanded interaction capability and can provide more detail if requested by the user.


These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example scorecard architecture;



FIG. 2 illustrates a screenshot of an example scorecard;



FIG. 3 illustrates a screenshot of an example scorecard application user interface;



FIG. 4 illustrates a screenshot of an example desktop visualization application for performance metrics;



FIG. 5 illustrates a screenshot of an indicator selection wizard for a desktop visualization application for performance metrics;



FIG. 6 illustrates another screenshot of an indicator selection wizard for a desktop visualization application for performance metrics;



FIG. 7 illustrates example desktop visualization applications and expansion to various applications from a selected desktop visualization application;



FIG. 8 illustrates an example service-based architecture in which the desktop visualization application for performance metrics may be deployed;



FIG. 9 is a diagram of a networked environment where embodiments may be implemented;



FIG. 10 is a block diagram of an example computing operating environment, where embodiments may be implemented; and



FIG. 11 illustrates a logic flow diagram for a process of employing desktop visualization applications for performance metrics in a service-based architecture.





DETAILED DESCRIPTION

As briefly described above, users are enabled to set up limited desktop user interfaces providing personalized and scalable metric information. In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.


While the embodiments will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a personal computer, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules.


Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


Embodiments may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.


Referring to FIG. 1, an example scorecard architecture is illustrated. The scorecard architecture may comprise any topology of processing systems, storage systems, source systems, and configuration systems. The scorecard architecture may also have a static or dynamic topology.


Scorecards are an easy method of evaluating organizational performance. The performance measures may vary from financial data such as sales growth to service information such as customer complaints. In a non-business environment, student performance and teacher assessments are other examples of performance measures that can employ scorecards. In the exemplary scorecard architecture, a core of the system is scorecard engine 108. Scorecard engine 108 may be application software that is arranged to evaluate performance metrics. Scorecard engine 108 may be loaded into a server, executed over a distributed network, executed in a client device, and the like.


Data for evaluating various measures may be provided by a data source. The data source may include source systems 112, which provide data to a scorecard cube 114. Source systems 112 may include multi-dimensional databases such as OLAP databases, other databases, individual files, and the like, that provide raw data for generation of scorecards. Scorecard cube 114 is a multi-dimensional database for storing data to be used in determining Key Performance Indicators (KPIs) as well as generated scorecards themselves. As discussed above, the multi-dimensional nature of scorecard cube 114 enables storage, use, and presentation of data over multiple dimensions such as compound performance indicators for different geographic areas, organizational groups, or even for different time intervals. Scorecard cube 114 has a bi-directional interaction with scorecard engine 108, providing and receiving raw data as well as generated scorecards.


Scorecard database 116 is arranged to operate in a similar manner to scorecard cube 114. In one embodiment, scorecard database 116 may be an external database providing redundant back-up database service.


Scorecard builder 102 may be a separate application or a part of a business logic application such as the performance evaluation application, and the like. Scorecard builder 102 is employed to configure various parameters of scorecard engine 108 such as scorecard elements, default values for actuals, targets, and the like. Scorecard builder 102 may include a user interface such as a web service, a GUI, and the like.


Strategy map builder 104 is employed for a later stage in the scorecard generation process. As explained below, scores for KPIs and other metrics may be presented to a user in the form of a strategy map. Strategy map builder 104 may include a user interface for selecting graphical formats, indicator elements, and other graphical parameters of the presentation.


Data Sources 106 may be another source for providing raw data to scorecard engine 108. Data sources 106 may also define KPI mappings and other associated data.


Additionally, the scorecard architecture may include scorecard presentation 110. This may be an application to deploy scorecards, customize views, coordinate distribution of scorecard data, and process web-specific applications associated with the performance evaluation process. For example, scorecard presentation 110 may include a web-based printing system, an email distribution system, and the like. In some embodiments, scorecard presentation 110 may be an interface that is used as part of the scorecard engine to export data and/or views to a desktop visualization application enabling visualization of performance metrics (e.g. using composite objects).



FIG. 2 illustrates a screenshot of an example scorecard with status indicators 230. As explained before, Key Performance Indicators (KPIs) are specific indicators of organizational performance that measure a current state in relation to meeting the targeted objectives. Decision makers may utilize these indicators to manage the organization more effectively.


When creating a KPI, the KPI definition may be used across several scorecards. This is useful when different scorecard managers might have a shared KPI in common. This may ensure a standard definition is used for that KPI. Despite the shared definition, each individual scorecard may utilize a different data source and data mappings for the actual KPI.
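The shared-definition idea above might be sketched as a single KPI definition object referenced by multiple scorecards, each supplying its own data source and mapping. All names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KPIDefinition:
    """Shared definition: one standard meaning across scorecards."""
    name: str
    unit: str

@dataclass
class ScorecardKPI:
    """Per-scorecard binding: same definition, different data mapping."""
    definition: KPIDefinition
    data_source: str   # e.g. an OLAP cube or database identifier
    mapping: str       # e.g. a measure path in that source

# One standard definition shared by two scorecards with different sources
sales_growth = KPIDefinition("Sales Growth", "Percent")
scorecard_a = ScorecardKPI(sales_growth, "cube://sales_emea", "[Measures].[Growth]")
scorecard_b = ScorecardKPI(sales_growth, "cube://sales_apac", "[Measures].[Growth]")
```

Because both scorecards point at the same frozen definition object, the KPI's meaning cannot drift between them, while each retains its own data binding.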


Each KPI may include a number of attributes. Some of these attributes include frequency of data, unit of measure, trend type, weight, and other attributes.


The frequency of data identifies how often the data is updated in the source database (cube). The frequency of data may include: Daily, Weekly, Monthly, Quarterly, and Annually.


The unit of measure provides an interpretation for the KPI. Some of the units of measure are: Integer, Decimal, Percent, Days, and Currency. These examples are not exhaustive, and other elements may be added without departing from the scope of the invention.


A trend type may be set according to whether an increasing trend is desirable or not. For example, increasing profit is a desirable trend, while increasing defect rates is not. The trend type may be used in determining the KPI status to display and in setting and interpreting the KPI banding boundary values. The arrows displayed in the scorecard of FIG. 2 indicate how the numbers are moving this period compared to last. If in this period the number is greater than last period, the trend is up regardless of the trend type. Possible trend types may include: Increasing Is Better, Decreasing Is Better, and On-Target Is Better.
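The rule above separates two questions: which way the number moved, and whether that movement is good. A minimal sketch (function names and the on-target test are illustrative assumptions):

```python
from enum import Enum

class TrendType(Enum):
    INCREASING_IS_BETTER = 1
    DECREASING_IS_BETTER = 2
    ON_TARGET_IS_BETTER = 3

def trend_arrow(previous: float, current: float) -> str:
    """Arrow direction depends only on movement versus last period."""
    if current > previous:
        return "up"
    if current < previous:
        return "down"
    return "flat"

def trend_is_desirable(previous: float, current: float,
                       trend_type: TrendType, target: float = 0.0) -> bool:
    """Whether the movement is good depends on the trend type."""
    if trend_type is TrendType.INCREASING_IS_BETTER:
        return current > previous
    if trend_type is TrendType.DECREASING_IS_BETTER:
        return current < previous
    # On-Target Is Better: moving closer to the target is desirable
    return abs(current - target) <= abs(previous - target)

# Increasing defect rates: the arrow is "up", but that is not desirable
assert trend_arrow(10, 12) == "up"
assert not trend_is_desirable(10, 12, TrendType.DECREASING_IS_BETTER)
```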


Weight is a positive integer used to qualify the relative value of a KPI in relation to other KPIs. It is used to calculate the aggregated scorecard value. For example, if an Objective in a scorecard has two KPIs, the first KPI has a weight of 1, and the second has a weight of 3, the second KPI is essentially three times more important than the first, and this weighted relationship is part of the calculation when the KPI values are rolled up to derive the values of their parent metric.
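Using the example of two KPIs with weights 1 and 3, the roll-up can be sketched as a weighted average. This is a minimal illustration of the weighting arithmetic, not the patented calculation:

```python
def rollup(scores_and_weights):
    """Aggregate child KPI scores into a parent metric value,
    using weights as relative importance."""
    total_weight = sum(w for _, w in scores_and_weights)
    return sum(score * w for score, w in scores_and_weights) / total_weight

# Two KPIs under one Objective: weights 1 and 3, so the second KPI
# contributes three times as much to the parent value.
kpis = [(0.40, 1), (0.80, 3)]
parent = rollup(kpis)  # (0.40*1 + 0.80*3) / 4 = 0.70
```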


Other attributes may contain pointers to custom attributes that may be created for documentation purposes or used for various other aspects of the scorecard system such as creating different views in different graphical representations of the finished scorecard. Custom attributes may be created for any scorecard element and may be extended or customized by application developers or users for use in their own applications. They may be any of a number of types including text, numbers, percentages, dates, and hyperlinks.


One of the benefits of defining a scorecard is the ability to easily quantify and visualize performance in meeting organizational strategy. By providing a status at an overall scorecard level, and for each perspective, each objective, or each KPI roll-up, one may quickly identify where one might be off target. By utilizing the hierarchical scorecard definition along with KPI weightings, a status value is calculated at each level of the scorecard.


First column of the scorecard shows example top level metric 236 “Manufacturing” with its reporting KPIs 238 and 242 “Inventory” and “Assembly”. Second column 222 in the scorecard shows results for each measure from a previous measurement period. Third column 224 shows results for the same measures for the current measurement period. In one embodiment the measurement period may include a month, a quarter, a tax year, a calendar year, and the like.


Fourth column 226 includes target values for specified KPIs on the scorecard. Target values may be retrieved from a database, entered by a user, and the like. Column 228 of the scorecard shows status indicators 230.


Status indicators 230 convey the state of the KPI. An indicator may have a predetermined number of levels. A traffic light is one of the most commonly used indicators. It represents a KPI with three-levels of results—Good, Neutral, and Bad. Traffic light indicators may be colored red, yellow, or green. In addition, each colored indicator may have its own unique shape. A KPI may have one stoplight indicator visible at any given time. Other types of indicators may also be employed to provide status feedback. For example, indicators with more than three levels may appear as a bar divided into sections, or bands. Column 232 includes trend type arrows as explained above under KPI attributes. Column 234 shows another KPI attribute, frequency.
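The three-level traffic light, and the more general banded indicator, might be computed from banding boundaries like this. The boundary values and function names are illustrative assumptions:

```python
def traffic_light(score, bad_below=0.5, good_at=0.8):
    """Map a score to the three-level Good/Neutral/Bad indicator."""
    if score >= good_at:
        return "green"   # Good
    if score >= bad_below:
        return "yellow"  # Neutral
    return "red"         # Bad

def band(score, boundaries):
    """Generalized indicator with more than three levels (a bar divided
    into sections): return the index of the band the score falls in."""
    return sum(score >= b for b in boundaries)

assert traffic_light(0.9) == "green"
assert traffic_light(0.2) == "red"
assert band(0.65, [0.25, 0.5, 0.75]) == 2
```

In a real implementation each colored indicator could also carry its own shape, so status remains legible without color.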



FIG. 3 is a screenshot of an example scorecard application with an example scorecard. The example scorecard application may be part of a business logic service that collects, processes, and analyzes performance data from various aspects of an organization.


The user interface of the scorecard application as shown in the screenshot includes controls 354 for performing actions such as formatting of data, view options, actions on the presented information, and the like. The main portion of the user interface displays scorecard 358 “Contoso Corporate Scorecard”. The scorecard includes metrics such as “Internet Sales Amount”, “Internet Order Quantity”, “Customer Count”, and the like in column 362. Columns 364 and 366 respectively display actuals and targets for the category of “Accessories” for each of the listed metrics. Columns 368 and 372 display the actuals for the categories “Bikes” and “Female” (referring to female bikes).


Side panel 352 titled “Workspace Browser” provides a selection of available KPIs as well as elements of the scorecard such as indicators and reports that are associated with the selected scorecard. The other side panel 356 provides additional details about available scorecard elements such as a collapsible list of KPIs, targets, and dimension combinations. A scorecard application, as discussed in further detail below, may include additional aspects of the scorecard such as different visualizations, linked information (geography, time and date, contact information, etc.), commentary, and so on.


According to some embodiments, data and/or views of portions of the presented scorecard may be exported to a desktop visualization application for providing personalized and scalable metric information to a user.



FIG. 4 illustrates a screenshot of an example desktop visualization application for performance metrics. As described above, a user interface of a scorecard application typically includes a number of controls and elements conveying significant amounts of information to the user and enabling the user to interact with various aspects of performance metric computation. On the other hand, desktop visualization applications may be used to convey limited information associated with selected aspects of performance metrics.


Due to their limited character, information conveyed by desktop visualization applications is typically highly visual (e.g. a clock screen, a gauge, a number of flags, and the like). Thus, desktop visualization applications for performance metrics according to embodiments may utilize elements that present a selected aspect of the information in a simplistic and visual manner. As discussed in more detail below, such elements may include gauges, traffic lights, flags, small bar graphs, and the like. Colors, shapes, video, and audio may also be used to convey information associated with a performance metric. For example, a video feed gadget may provide a live or recorded view of a production line that is the subject of the performance metric. A colored pie chart shaped desktop visualization application may provide up-to-date status of a selected metric (actual vs. target(s)).


While desktop visualization applications include limited and simple user interfaces relative to full-capacity application user interfaces, they may also vary in complexity. Example desktop visualization application 482 illustrates a more complex implementation, where a portion of a scorecard is illustrated by the desktop visualization application. This enables the user to get a glimpse of some of the metrics, their actuals and targets, as well as statuses. Unlike a scorecard application user interface, however, desktop visualization application 482 provides limited controls for the user. For example, the user cannot click on individual scorecard elements and access the underlying data, make modifications, access annotations, and the like.


Some controls may be provided to the user, nevertheless. For example, a small information panel 484 may be opened up adjacent to the desktop visualization application 482 in response to the user clicking on the desktop visualization application and provide highlights of the scorecard calculations with the latest data (e.g. customer complaints are up 20.5%, over budget on FY06 spend by 30.4%, etc.).


Another example control is information panel 486, which provides a listing of issues and warnings associated with the scorecard to the user. The list of issues and warnings of information panel 486 and the highlights of information panel 484 may include graphical symbols (e.g. icons) in addition to the textual information. Two items are highlighted in information panels 484 and 486. The highlighted items illustrate another feature that may be provided according to some embodiments.


The desktop visualization application or its information panels may be associated with applications on the client device or remote applications provided by a hosted business service. These applications may be activated by the user selecting an item from the information panels (or the desktop visualization application itself) such as “Discounting out of tolerances” 485 on information panel 484 or “issues” 487 on information panel 486. The user can then perform more detailed operations associated with the selected items.



FIG. 5 illustrates a screenshot of an indicator selection wizard for a desktop visualization application for performance metrics. Desktop visualization applications according to embodiments may be provided with preselected indicators coupled to predefined metrics or analysis results. According to other embodiments, the user may be enabled to select one or more indicators to be used by a desktop visualization application and define scorecard elements (and/or analysis results) to be reflected by the selected indicators. The indicator selection wizard is one method of implementing such a selection.


Category panel 592 lists available indicators by category in a collapsible list format. The list of available indicators may also be provided using other formats. In the example screenshot, “Miscellaneous” category 596 is selected under the main group of Standard Indicators.


Template panel 594 includes visual representations of available indicators in this category. The indicators include circled check marks, check marks, flags, pie chart icons, and the like. As mentioned previously, desktop visualization applications may include single or multiple indicators. The indicators employ geometric units to visualize business performance and show magnitude, patterns of structured and unstructured data, interrelationships, causalities, and dependencies. Through visualizing outputs of quantitative models, business users may be enabled to make faster, more relevant decisions based on data that is readily interpreted.


The example indicator selection wizard may be part of an embeddable authoring user interface for generating performance metric based visualizations. For example, the wizard may provide a selection of objects from a graphics application such as VISIO® by MICROSOFT CORP. of Redmond, Wash.



FIG. 6 illustrates another screenshot of an indicator selection wizard for a desktop visualization application for performance metrics. In the example screenshot of FIG. 6, All Indicators category is selected by the user causing all available indicators to be listed in Template panel 694.


The example indicators in Template panel 694 include bar indicators, traffic light indicators, and traffic sign indicators, which utilize shape and color to convey performance metric information to the user.



FIG. 7 illustrates example desktop visualization applications and expansion to various applications from a selected desktop visualization application. Desktop visualization application 702 is an example of a single indicator desktop visualization application that shows the status of a metric with a bar indicator. The bar indicator may fill up based on a comparison of an actual and a target associated with the metric.


Desktop visualization application 704 is another single indicator desktop visualization application. A circular meter style indicator provides a visual representation of a metric's status to the user. Desktop visualization application 706 is an example of a miniature chart type desktop visualization application. A simplified bar chart may display the status of multiple metrics or multiple targets of a single metric, or even the temporal change of a metric over a defined date range. An indicator arrow may further provide trend information.


Desktop visualization application 708 is an example of a composite desktop visualization application. The desktop visualization application includes a thermometer indicator showing the status of a metric. A background color of the desktop visualization application and/or a fill color of the thermometer indicator may further reflect the status according to a color code (e.g. green for on target, yellow for slightly off target, and red for off target). The desktop visualization application may further be configured to adjust its shape according to a status of the metric. For example, the outline of the desktop visualization application may take traffic sign style shapes based on the status (e.g. round for on target, triangular for slightly off target, hexagonal for off target).
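The color coding and shape adaptation described above could be sketched as a simple status classification plus a style lookup. The tolerance value and all names are illustrative assumptions; the colors and shapes follow the example in the text:

```python
STATUS_STYLE = {
    # status: (fill color, gadget outline shape)
    "on_target":    ("green",  "round"),
    "slightly_off": ("yellow", "triangular"),
    "off_target":   ("red",    "hexagonal"),
}

def gadget_style(actual, target, tolerance=0.05):
    """Classify a metric's status and return the gadget's color and shape."""
    shortfall = (target - actual) / target
    if shortfall <= 0:
        status = "on_target"
    elif shortfall <= tolerance:
        status = "slightly_off"
    else:
        status = "off_target"
    return STATUS_STYLE[status]

assert gadget_style(100, 90) == ("green", "round")
assert gadget_style(80, 100) == ("red", "hexagonal")
```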



FIG. 7 further illustrates another feature of performance metric based desktop visualization applications according to embodiments. For example, the user may be provided an option to activate a communication application 712, a presentation application 714, or a trend analysis application 716. The applications may include a number of local applications on the client device or remote applications hosted by a business service. According to further embodiments, hypertext links may also be provided in response to a user selection associated with the desktop visualization application (through a drop down menu, selection of a portion of the desktop visualization application, and so on).



FIG. 8 illustrates an example service-based architecture in which the desktop visualization application for performance metrics may be deployed. Service based architectures are an increasingly popular style for building software applications that use services available in a network such as the web. They promote loose coupling between software components so that they can be reused. Applications in a service based architecture are built based on services. A service is an implementation of a well-defined business functionality, and such services can then be consumed by clients in different applications or business processes.


A service based architecture allows for the reuse of existing assets where new services can be created from an existing infrastructure of systems. In other words, it enables businesses to leverage existing investments by allowing them to reuse existing applications, and promises interoperability between heterogeneous applications and technologies. Service based architectures provide a level of flexibility in the sense that services are software components with well-defined interfaces that are implementation-independent. An important aspect of service based architecture is the separation of the service interface from its implementation. Such services are consumed by clients that are not concerned with how these services will execute their requests. Services are commonly self-contained (perform predetermined tasks) and loosely coupled. Furthermore, services can be dynamically discovered, and composite services can be built from aggregates of other services.


A service based architecture uses a find-bind-execute paradigm. In this paradigm, service providers register their service in a public registry. This registry is used by consumers to find services that match certain criteria. If the registry has such a service, it provides the consumer with a contract and an endpoint address for that service. Service based applications are typically distributed multi-tier applications that have presentation, business logic, and persistence layers. Services are the building blocks of service based applications.
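The find-bind-execute paradigm can be sketched with a minimal in-memory registry. This is a toy illustration under assumed names; real registries involve formal contracts and endpoint metadata:

```python
class Registry:
    """Public registry: providers register services, consumers find them."""
    def __init__(self):
        self._services = {}

    def register(self, criteria, endpoint, contract):
        self._services[criteria] = {"endpoint": endpoint, "contract": contract}

    def find(self, criteria):
        """Return the contract and endpoint for a matching service, if any."""
        return self._services.get(criteria)

# Provider registers a (hypothetical) scorecard data service
registry = Registry()
registry.register("scorecard-data",
                  endpoint=lambda kpi: {"kpi": kpi, "status": "green"},
                  contract="get(kpi: str) -> dict")

# Consumer: find the service, bind to its endpoint, then execute
entry = registry.find("scorecard-data")
result = entry["endpoint"]("Inventory")
assert result["status"] == "green"
```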


In FIG. 8, data associated with performance metric calculations may be stored and provided by scorecard database(s) 846 managed by scorecard database server 851. Scorecard database server 851 may manage exchange of scorecard data based on granular, role-based permissions. Source data for metric calculations and statistical analyses may be provided by data sources 848.


Data sources 848 may include business models database server 852, analysis services database server(s) 853, tables server 854, lists server, files server 856 (e.g. text files, spreadsheet files, and the like), and so on. The data sources may be managed by one or more servers of any type discussed herein. The scorecard database server and data source servers may communicate with servers managing performance metric services through a secure network communication protocol such as HTTPS 838.


Performance metric services may include scorecard service managed by scorecard server(s) 840. Scorecard server(s) 840 may also provide web services. Reporting services may be provided by one or more reporting servers 842. Reporting services may include providing results of statistical analyses, performance metric computations, presentations, and the like in various formats based on subscriber permissions, profiles, client devices, and client applications. According to an example embodiment, data may be provided to one or more desktop visualization applications installed on a user's desktop such that a limited presentation is portrayed on the user's desktop.


Moreover, shared services servers 844 may manage shared services that enable individual users to access the scorecard services, presentations, and data through client devices 836. Client devices 836 may include specialized applications or web applications to facilitate the communication through a secure protocol such as HTTPS 838.


Scorecard computations may also be performed in coordination with scorecard server(s) 840 by a client application on client device 834 communicating with the scorecard servers through HTTPS 838. As illustrated by reporting servers 842 and shared services servers 844, some or all of the servers at different levels of the architecture may support web farming.


Referring now to the following figures, aspects and exemplary operating environments will be described. FIG. 9, FIG. 10, and the associated discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.



FIG. 9 is a diagram of a networked environment where embodiments may be implemented. The system may comprise any topology of servers, clients, Internet service providers, and communication media. Also, the system may have a static or dynamic topology. The term “client” may refer to a client application or a client device employed by a user to perform operations associated with rendering performance metric data using geometric objects. While a networked business logic system may involve many more components, relevant ones are discussed in conjunction with this figure.


In a typical operation according to embodiments, the business logic service may be provided centrally from server 972 or in a distributed manner over several servers (e.g. servers 972 and 974) and/or client devices. Server 972 may include implementation of a number of information systems such as performance measures, business scorecards, and exception reporting. A number of organization-specific applications including, but not limited to, financial reporting/analysis, booking, marketing analysis, customer service, and manufacturing planning applications may also be configured, deployed, and shared in the networked system.


Data sources 961-963 are examples of a number of data sources that may provide input to server 972. Additional data sources may include SQL servers, databases, non-multi-dimensional data sources such as text files or EXCEL® sheets, multi-dimensional data sources such as data cubes, and the like.


Users may interact with the server running the business logic service from client devices 965-967 over network 970. Users may also directly access the data from server 972 and perform analysis on their own machines. In some embodiments, users may set up personalized desktop visualization applications displayed on the client devices 965-967 that receive data (and/or views) from the business logic service and provide scalable views of metrics.
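A personalized, scalable metric view of the kind described might be modeled as follows. The class and field names are hypothetical, not drawn from the patent; "scalable" is illustrated here as trimming the metric list to fit smaller displays.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MetricView:
    """Hypothetical model of a user's personalized desktop metric view."""
    user: str
    metrics: List[str] = field(default_factory=list)  # metric names the user tracks
    indicator: str = "gauge"                          # e.g. gauge, meter, progress bar

    def scale(self, max_items: int) -> List[str]:
        """Scalable view: show fewer metrics on smaller client devices."""
        return self.metrics[:max_items]
```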


Client devices 965-967 or servers 972 and 974 may be in communications with additional client devices or additional servers over network 970. Network 970 may include a secure network such as an enterprise network, an unsecure network such as a wireless open network, or the Internet. Network 970 provides communication between the nodes described herein. By way of example, and not limitation, network 970 may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.


Many other configurations of computing devices, applications, data sources, data distribution and analysis systems may be employed to implement providing personalized and scalable performance metric information using desktop visualization applications in a service based architecture. Furthermore, the networked environments discussed in FIG. 9 are for illustration purposes only. Embodiments are not limited to the example applications, modules, or processes. A networked environment may be provided in many other ways using the principles described herein.


With reference to FIG. 10, a block diagram of an example computing operating environment is illustrated, such as computing device 1000. In a basic configuration, the computing device 1000 typically includes at least one processing unit 1002 and system memory 1004. Computing device 1000 may include a plurality of processing units that cooperate in executing programs. Depending on the exact configuration and type of computing device, the system memory 1004 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. System memory 1004 typically includes an operating system 1005 suitable for controlling the operation of a networked personal computer, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash. The system memory 1004 may also include one or more software applications such as program modules 1006, business logic application 1022, desktop visualization application 1024, and optional presentation application 1026.


Business logic application 1022 may be any application that processes and generates scorecards and associated data. Desktop visualization application 1024 may be a limited user interface associated with a module within business logic application 1022, an independent module, or the operating system. Desktop visualization application 1024 may provide personalized and scalable performance metric information based on data received from business logic application 1022 or another source. Presentation application 1026 or business logic application 1022 may be associated with desktop visualization application 1024 such that user interfaces of either application may be activated upon user selection of an aspect of desktop visualization application 1024. Presentation application 1026 or business logic application 1022 may be executed in an operating system other than operating system 1005. This basic configuration is illustrated in FIG. 10 by those components within dashed line 1008.
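One way to realize the association between visualization elements and a presentation or business logic application is a dispatch table keyed by the kind of element selected. The handler names and return values below are assumptions for the sketch, not part of the disclosure.

```python
# Illustrative handlers standing in for a presentation application and a
# business logic (scorecard) application; names are hypothetical.
def open_presentation(metric):
    return f"presentation:{metric}"

def open_scorecard(metric):
    return f"scorecard:{metric}"

HANDLERS = {"chart": open_presentation, "kpi": open_scorecard}

def on_select(element_kind, metric):
    """Activate the application associated with the selected element, if any."""
    handler = HANDLERS.get(element_kind)
    return handler(metric) if handler else None
```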


The computing device 1000 may have additional features or functionality. For example, the computing device 1000 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 10 by removable storage 1009 and non-removable storage 1010. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 1004, removable storage 1009 and non-removable storage 1010 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 1000. Any such computer storage media may be part of device 1000. Computing device 1000 may also have input device(s) 1012 such as keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 1014 such as a display, speakers, printer, etc. may also be included. These devices are well known in the art and need not be discussed at length here.


The computing device 1000 may also contain communication connections 1016 that allow the device to communicate with other computing devices 1018, such as over a network in a distributed computing environment, for example, an intranet or the Internet. Communication connection 1016 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.


The claimed subject matter also includes methods. These methods can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations of devices of the type described in this document.


Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be with a machine that performs a portion of the program.



FIG. 11 illustrates a logic flow diagram for a process of employing a desktop visualization application for performance metrics in a service-based architecture. Process 1100 may be implemented in a service architecture based business logic service.


Process 1100 begins with optional operation 1102, where a desktop visualization application is configured. The application may be configured by a user or by a hosted business service automatically based on a user profile. Processing advances from optional operation 1102 to optional operation 1104.


At optional operation 1104, the desktop visualization application is registered with the hosted business service. Registration may include authorization based on a user permission, user profile, or a network communication status. Processing proceeds from optional operation 1104 to operation 1106.
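The authorization criteria named for registration (user permission, user profile, network communication status) could be checked as a simple conjunction. This helper is a sketch under that assumption; the patent does not prescribe how the checks are combined.

```python
def authorize_registration(has_permission: bool,
                           profile_valid: bool,
                           network_connected: bool) -> bool:
    """Authorize registration with the hosted business service only when the
    user permission, user profile, and network status checks all pass."""
    return has_permission and profile_valid and network_connected
```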


At operation 1106, performance metric data is received from the hosted business service. A portion of the data may also be received from a local data store. Processing moves from operation 1106 to operation 1108.


At operation 1108, a visualization based on the received performance metric data is rendered. As described previously, one or more indicators, icons, miniature charts, audio, and video may be used to render the visualization. Processing advances to optional operation 1110 from operation 1108.
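For instance, a stoplight-style indicator (one of the indicator types the disclosure mentions, alongside gauges, meters, and progress bars) might be chosen from the ratio of actual to target values. The thresholds below are illustrative assumptions; the patent leaves such parameters to configuration.

```python
def indicator_for(actual: float, target: float) -> str:
    """Map a KPI's actual-vs-target ratio to a traffic-light indicator.
    The 1.0 and 0.8 cutoffs are example values, not from the patent."""
    ratio = actual / target
    if ratio >= 1.0:
        return "green"
    if ratio >= 0.8:
        return "yellow"
    return "red"
```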


At optional operation 1110, a user selection is received by the desktop visualization application. The user selection may be the user clicking on a portion of the visualization, selecting a checkbox, a radio button, or any other icon. The selection may also include the user clicking on a portion of a rendered view (e.g. a partial scorecard view). Processing advances to optional operation 1112 from optional operation 1110.


At optional operation 1112, an application associated with the rendered visualization is activated in response to the user selection. The application may be a local application or a remote application managed by the hosted business service. After operation 1112, processing moves to a calling process for further actions.


The operations included in process 1100 are for illustration purposes. Rendering visualizations by a desktop visualization application based on performance metric data from a hosted business service may be implemented by similar processes with fewer or additional steps, as well as in different order of operations using the principles described herein.
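Taken together, the operations of process 1100 can be sketched as a pipeline. The `service` object, renderer, and selection callback here are stand-ins, since the patent does not define a concrete programming interface for the hosted business service.

```python
def run_process_1100(service, user_profile, render, get_selection):
    """Sketch of process 1100; the service methods are assumed names."""
    config = service.configure(user_profile)     # optional operation 1102
    service.register(config)                     # optional operation 1104
    data = service.fetch_metrics(config)         # operation 1106
    visualization = render(data)                 # operation 1108
    selection = get_selection(visualization)     # optional operation 1110
    if selection is not None:
        service.activate(selection)              # optional operation 1112
    return visualization
```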


The above specification, examples and data provide a complete description of the manufacture and use of the composition of the embodiments. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims and embodiments.

Claims
  • 1. A method to be executed at least in part in a computing device for rendering a performance metric based visualization, the method comprising: receiving performance metric data at a desktop visualization application from a hosted business service, wherein the desktop visualization application is registered with the hosted business service; providing a list of user-selectable indicators associated with visualizing the performance metric data; receiving a selection of at least one user-selectable indicator; receiving a definition of at least one performance metric data element to be reflected by the at least one user-selectable indicator; rendering a visualization based on the received performance metric data by the desktop visualization application; receiving a first user selection of the at least one performance indicator; providing, in response to receiving the first user selection, an information panel providing additional information associated with the at least one performance indicator; receiving a second user selection of the additional information; and activating at least one application associated with the additional information.
  • 2. The method of claim 1, further comprising: enabling a user to configure at least one of a layout and a data delivery attribute of the desktop visualization application; and registering the desktop visualization application with the hosted business service.
  • 3. The method of claim 1, further comprising: authorizing data delivery to the desktop visualization application based on at least one from a set of: a user permission, a user profile, and a network connection status.
  • 4. The method of claim 1, further comprising: receiving a user selection by the desktop visualization application; andactivating an application associated with the received user selection.
  • 5. The method of claim 4, wherein activating the application associated with the received user selection comprises activating one of a local application and a remote application managed by the hosted business service.
  • 6. The method of claim 5, wherein activating the local application and the remote application comprises activating one from a set of: a presentation application, an analysis application, a communication application, a spreadsheet application, and a graphics application.
  • 7. The method of claim 1, further comprising employing, by the desktop visualization application, at least one from a set of: an indicator set, a color scheme, a shape scheme, a video stream, and audio to render the visualization.
  • 8. The method of claim 7, wherein employing the indicator set comprises employing one of: a gauge, a flag, a meter, a progress bar, a pie chart, a traffic sign, a thermometer, and a smiley face.
  • 9. The method of claim 1, further comprising: caching the received data at a client device executing the desktop visualization application.
  • 10. The method of claim 1, further comprising: partitioning service providers for performance metric services using virtualized instances.
  • 11. The method of claim 1, wherein the desktop visualization application is provided as a plug-in module.
  • 12. The method of claim 1, wherein the desktop visualization application is configured to receive one of a live data feed and an aggregation of dependent data feeds.
  • 13. The method of claim 1, wherein the desktop visualization application is configured to be launched from a thin client using user defined parameters.
  • 14. The method of claim 1, further comprising: tracking desktop visualization application activities for billing a user by at least one of the hosted business service and a third service provider.
  • 15. A system for rendering a performance metric based visualization, the system comprising: a memory; a processor coupled to the memory, wherein the processor is configured to: provide, to a desktop visualization application, a list of user-selectable indicators associated with visualizing the performance metric data; receive a selection of at least one user-selectable indicator; configure at least one parameter to be reflected by the at least one user-selectable indicator of the desktop visualization application based on one of a user input and a preset profile; authorize data delivery to the desktop visualization application based on a user permission; provide performance metric data to the desktop visualization application; enable caching of the performance metric data; enable rendering of a visualization based on the provided performance metric data; receive a first user selection of the at least one performance indicator; provide, in response to receiving the first user selection, an information panel providing additional information associated with the at least one performance indicator; receive a second user selection of the additional information; and activate at least one application associated with the additional information.
  • 16. The system of claim 15, wherein the client device includes one of: a mobile computing device, an ultra-mobile computing device, a personal digital assistant, an in-car computing system, and a tablet computing device.
  • 17. The system of claim 15, wherein the processor is further configured to automatically provide at least one of a service update and a security update to the desktop visualization application.
  • 18. A computer-readable storage medium with instructions stored thereon which when executed perform a method for rendering a performance metric based visualization, the method executed by the instructions comprising: providing at least one user-selectable indicator associated with the performance metric data to a desktop visualization application; configuring a desktop visualization application including the at least one user-selectable indicator at least in part based on a user input; registering the desktop visualization application with a hosted business service; providing performance metric data to the desktop visualization application from the hosted business service; enabling caching of the performance data at a client device executing the desktop visualization application; enabling rendering of a visualization by the desktop visualization application based on the provided performance metric data, wherein a rendered visualization conveys a trend of the performance metric data and a comparison of actual data values associated with the performance metric data with target data values associated with the performance metric data; providing to the desktop visualization one of a live media and recorded media of a subject of the performance metric; activating an information panel adjacent to the desktop visualization application in response to receiving a first user selection through the desktop visualization, the information panel conveying additional information associated with the performance metric data, wherein the additional information comprises at least one of: issues, warnings, and highlights associated with the performance metric data; receiving a user selection of the additional information; activating at least one application associated with the additional information; and tracking activities associated with the visualization for billing the user by the hosted business service.
  • 19. The computer-readable storage medium of claim 18, wherein the desktop visualization application includes an embeddable user interface embedded into an application on the client device executing the desktop visualization application.
  • 20. The computer-readable storage medium of claim 18, wherein the hosted business service further provides an online collaboration service associated with the visualization.
Related Publications (1)
Number Date Country
20080184130 A1 Jul 2008 US