Service-oriented architecture (SOA) is an approach to information technology (IT) infrastructure design that provides methods for systems development and integration in which systems group functionality around business processes and package it as interoperable services. An SOA infrastructure also allows different applications to exchange data with one another as the applications participate in business processes. Service-orientation aims at a loose coupling of services with the operating systems, programming languages, and other technologies that underlie applications. SOA separates functions into distinct units, or services, which developers make accessible over a network so that a user can combine and reuse them in the production of business applications. These services communicate with each other by passing data from one service to another, or by coordinating an activity between two or more services.
Exemplary embodiments of a system and method for determining the quality of a service provided in a services registry will be described in detail with reference to the following figures, in which like numerals refer to like elements, and wherein:
An exemplary system and method are presented for determining the quality of a service catalogued within a services registry. The system and method provide a rating and scoring mechanism that supplies a set of characteristics which a consumer of the service concerned, i.e., a service consumer, can use to determine an overall rating. The embodiments described go beyond a simple weighted scoring technique and provide a set of axes and associated scales as a weighting technique to specifically measure the quality rating of services as defined in a service-oriented architecture (SOA). Specifically, an embodiment of a configurable user rating system will be described that incorporates multiple sources of quality, including user ratings, testing results, operational monitoring quality, contract management, and the like. As a result, confidence is created for a service consumer of a service by helping them to understand the quality of the services being consumed.
Time-based metrics, e.g., a certain percentage increase or decrease in a parameter over time, are also supported. Examples include defect rates remaining low, test coverage increasing, and so on. How a dimension changes in a temporal sense may impact the overall actual or perceived service quality rating.
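For illustration only, one possible way to fold such temporal changes into a rating may be sketched as follows; the function name, tolerance threshold, and multipliers below are hypothetical and not part of the described embodiment:

```python
def trend_adjustment(current, previous, tolerance=0.05):
    """Return a multiplier reflecting whether a 0-to-1 quality value is
    improving, stable, or degrading between two sample points."""
    if previous == 0:
        return 1.0  # no baseline to compare against
    change = (current - previous) / previous
    if change > tolerance:
        return 1.1   # an improving trend slightly boosts the rating
    if change < -tolerance:
        return 0.9   # a degrading trend slightly penalizes it
    return 1.0       # stable within tolerance

# Example: test coverage rose from 0.70 to 0.80 (a ~14% increase),
# so the trend multiplier rewards the improvement.
print(trend_adjustment(0.80, 0.70))  # 1.1
```

The multiplier could then be applied to the corresponding dimension's score before aggregation, so that a stagnating or worsening dimension drags on the overall rating.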
The SOA testing environment 130 may provide a defects rating dimension 132, and a test coverage rating dimension 134 to the SOA repository 110. The defects 132 may include, for example, bugs and issues. The SOA testing environment 130 tracks the defects 132, which each may have a set of properties, such as priority, severity, time-to-solve, developer or customer defect, and the like. The SOA testing environment 130 may provide an aggregation report related to the service 112 based on these properties. The service quality rating 124 from the defects perspective may be computed using aggregation techniques (e.g., low number of defects and lower severity and priority are better). The HP SOA Systinet software available from Hewlett-Packard Company is an exemplary product that integrates with defect management systems to trace defects and incidents 132 of a service.
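By way of a non-limiting illustration, the defect aggregation described above (fewer defects with lower severity and priority yielding a higher rating) might be sketched as follows; the weighting scheme and the `expected_max` normalization constant are hypothetical:

```python
def defect_quality(defects, expected_max=20):
    """Aggregate a list of defects into a 0-to-1 quality value.
    Each defect is a (priority, severity) pair, each on a 1 (low) to
    5 (high) scale; fewer defects with lower priority and severity
    score higher."""
    if not defects:
        return 1.0
    # Weight each defect by how serious it is (0.2 to 1.0).
    burden = sum((p / 5.0 + s / 5.0) / 2.0 for p, s in defects)
    # Normalize against an assumed worst-case backlog, clamping at 0.
    return max(0.0, 1.0 - burden / expected_max)

# Two minor defects barely dent the rating; a backlog of
# high-priority, high-severity defects drives it toward zero.
print(defect_quality([(1, 1), (2, 1)]))
print(defect_quality([(5, 5)] * 30))
```

Properties such as time-to-solve or developer-versus-customer origin could be folded in as further per-defect weights in the same manner.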
The test coverage rating dimension 134 may be the number of tests and a coverage percentage, such as 80% coverage of a service during testing. The service quality rating 124 from the test perspective can be higher with a higher number of tests, i.e., greater test coverage. The HP SOA Systinet software available from Hewlett-Packard Company is an exemplary product that integrates with the SOA testing environment 130 that maintains and manages tests and their results.
The service management system 140 may provide insight into the number of incidents raised against a service. This incident frequency can be expressed as the incidents rating dimension 142 and provided to the SOA repository 110. The incidents 142 may be help desk issues that occur when the service 112 is being deployed 162, for example.
The monitoring system 150 monitors 154 the operations 160 to provide an operational usage rating dimension 152, i.e., operational usage, to the SOA repository 110. The formula to compute the quality of the operational usage 152 may be user-defined based on runtime properties. For example, a runtime property may include the percentage uptime (e.g., 99.99%) where quality is measured on a 0 (0%) to 1 (100%) scale. Alternatively, a runtime property may include the average response time for the service, where the quality is computed based on the variance of the runtime response time versus an agreed-upon Service Level Agreement (SLA).
The service consumer 170 may provide a user rating rating dimension 172, i.e., a user rating, and a usage rating dimension 174, i.e., usage, to the SOA repository 110. The user rating can be as simple as a 1-5 rating scale. Alternatively, the user rating can be a more complex multi-criteria rating where service consumers score a service across multiple dimensions, such as reliability, availability, and response time. Each service consumer 170 of the SOA repository 110 may express their own perception of the service quality (e.g., based on their own experience, behind-the-scenes knowledge, and the like). The service quality rating 124 from the user rating perspective may be computed as an average (or minimum or maximum) of all service consumers' ratings combined with the service consumers' credibility.
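The credibility-weighted average described above might, purely as an illustration, be computed as follows; the 1-5 input scale and the mapping onto the 0-to-1 quality scale are assumptions for this sketch:

```python
def user_rating_quality(ratings):
    """Combine consumer ratings (1-5 scale) with per-consumer
    credibility weights (0-1), then normalize to the 0-to-1 scale.
    `ratings` is a list of (score, credibility) pairs."""
    total_weight = sum(cred for _, cred in ratings)
    if total_weight == 0:
        return 0.0
    weighted = sum(score * cred for score, cred in ratings)
    # Credibility-weighted average on the 1-5 scale, mapped to 0-1.
    return (weighted / total_weight - 1) / 4.0

# A highly credible 5-star rating outweighs a low-credibility
# 1-star rating from a less trusted consumer.
print(user_rating_quality([(5, 0.9), (1, 0.1)]))
```

A minimum or maximum variant, as also contemplated above, would simply replace the weighted average with `min` or `max` over the scores.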
Additional exemplary rating dimensions are shown in
Regarding the contract and reuse dimension 206, a contract management system may capture service reuse based on a service level agreement (SLA). The service quality rating 124 from the contract and reuse perspective may be computed from the number of contracts (i.e., a higher number of contracts may be better) and SLA properties (e.g., availability, response time, and the like). The overall quality of the service 112 may affect the quality received by the service consumer that is in the contract 114 recursively. A service consumer may further correlate and/or combine the SLA information with the operational usage rating dimension 152 and the user rating rating dimension 174.
The lifecycle stage 204 may be based on a web services policy (WS-Policy) (e.g., a policy that needs to be fulfilled) and an approval policy (e.g., a configurable number of approvers needs to approve a stage change). Each lifecycle stage may have a different quality rating defined by its purpose (e.g., the development stage has a lower quality rating compared to the production stage). The service quality rating 124 from the lifecycle perspective may be computed from the quality of the current lifecycle stage (which may be configurable). Additionally, other properties, such as the age of the service in the lifecycle stage 204 and the number of approvers, may be taken into account. For example, the lifecycle used in the HP SOA Systinet software is composed of configurable lifecycle stages 204.
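As a hypothetical illustration of a configurable stage-to-quality mapping with an approver adjustment, consider the following sketch; the stage values and per-approver bonus are invented for the example:

```python
# Hypothetical, organization-configurable stage-to-quality mapping.
STAGE_QUALITY = {
    "development": 0.25,
    "qa": 0.5,
    "staging": 0.75,
    "production": 1.0,
}

def lifecycle_quality(stage, approvers=0, bonus_per_approver=0.02):
    """Base quality from the current lifecycle stage, nudged upward
    by the number of approvers who signed off on the stage change,
    capped at 1.0."""
    base = STAGE_QUALITY.get(stage, 0.0)
    return min(1.0, base + approvers * bonus_per_approver)

print(round(lifecycle_quality("qa", approvers=3), 2))  # 0.56
```

The age of the service in its current stage could be incorporated analogously, e.g., as a further additive or multiplicative adjustment.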
Regarding the source of service dimension 202, the service 112 may be added to the SOA repository 110 from different sources. For example, the service 112 may be imported from other systems (e.g., universal description, discovery and integration (UDDI), application management systems, such as HP Business Availability Center (BAC), and the like). Alternatively, the service 112 may go through the whole lifecycle in the SOA repository 110 (i.e., development (Dev), quality assurance (QA), staging, production, and the like). The service consumer 170 may typically be able to place more trust in imported services 112 because the imported services 112 may need to be trusted already as a prerequisite to their import.
Eight exemplary rating dimensions are described above for illustration purposes. One skilled in the art will appreciate that other types of rating dimensions can be equally applied.
Referring back to
The system 100 may also include visualization and reporting 180 that includes elements, such as service portfolio management 182, service quality 184, and searches and sorting 186.
As shown in
Table 1 shown below summarizes an exemplary technique for calculating the 0-to-1 scales for each of the exemplary dimensions outlined above.
In addition to defining the axis of service quality and the associated scales, the system 100 provides a mechanism for quality computation. Specifically, the system 100 accounts for the fact that quality levels are not static and need to be recalculated over time as the service 112 is being used or as the service 112 goes through lifecycle stages 204. Additionally, dynamic calculation may be needed because new services may be introduced into the environment or services may be decommissioned. The service quality rating 124 may be recalculated several times per day or per week, or when the administrator manually forces a quality computation to be executed.
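The weighted scoring aggregation across dimensions might be sketched as follows; the dimension names and weights are illustrative stand-ins for the organization-configurable category weightings:

```python
def aggregate_quality(dimension_scores, weights):
    """Weighted aggregate of per-dimension 0-to-1 scores into a single
    service quality rating; the weights are organization-configurable
    and normalized so the result stays on the 0-to-1 scale."""
    total = sum(weights[d] for d in dimension_scores)
    return sum(score * weights[d] / total
               for d, score in dimension_scores.items())

# Illustrative weighting: defects count double relative to the others.
weights = {"defects": 2.0, "coverage": 1.0, "user_rating": 1.0}
scores = {"defects": 0.9, "coverage": 0.8, "user_rating": 0.6}
print(round(aggregate_quality(scores, weights), 3))  # 0.8
```

Re-running such a computation on a schedule (or on an administrator's demand, as described above) keeps the rating current as dimension scores change, and services may be added to or removed from the input set between runs.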
The system 100 improves the confidence level of potential service consumers prior to service usage. The following are a few exemplary usages that the aggregated service quality rating 124 score may also deliver to an organization.
More focused searches can be performed by the service consumer 170, for example, to find services that have a quality level above or below a certain threshold level (N). Sorting of the services 112 returned from the SOA repository 110 may be done more effectively using the service quality rating 124.
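Such a threshold search combined with a best-first sort can be sketched as follows; the catalog contents and function name are hypothetical:

```python
def find_services(services, threshold):
    """Return the names of services whose quality rating meets the
    threshold, sorted best-first; `services` maps a service name to
    its 0-to-1 quality rating."""
    return sorted((name for name, q in services.items() if q >= threshold),
                  key=lambda name: services[name], reverse=True)

catalog = {"billing": 0.92, "inventory": 0.55, "shipping": 0.78}
print(find_services(catalog, 0.7))  # ['billing', 'shipping']
```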
Further, reports can be easily generated to show the services 112 in the SOA repository 110 that are above or below a given quality level. Additionally, trend reports based on time may be generated for the service quality rating 124. A service quality rating score (not shown) may be calculated at an aggregate level. For example, the quality of a given information technology (IT) service portfolio may be computed. Similarly, the quality of all services (i.e., the quality of an SOA) may be computed using the system 100.
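Rolling individual ratings up to a portfolio-level score could, for instance, use a simple mean; other aggregation policies (minimum, or weighting by usage) would fit the same shape:

```python
def portfolio_quality(service_ratings):
    """Roll individual 0-to-1 service quality ratings up into a
    portfolio-level score; an unweighted mean is one possible policy."""
    if not service_ratings:
        return 0.0
    return sum(service_ratings) / len(service_ratings)

print(round(portfolio_quality([0.9, 0.7, 0.8]), 2))  # 0.8
```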
The SOA testing environment 130 provides the defects rating dimension 132 and the test coverage rating dimension 134 to the SOA repository 110 at block 312. The defects rating dimension 132 includes a set of properties including a priority, a severity, a time-to-solve, and whether a defect is a developer defect or a customer defect. An organization may, for instance, decide to place higher emphasis or importance on defects that were generated by the end customer versus those coming internally from the Quality Assurance (QA) department. The SOA testing environment 130 provides an aggregation report related to the service 112 based on the set of properties using aggregation techniques. The test coverage rating dimension 134 includes a number of tests and a coverage percentage of the service 112. The service quality rating 124 associated with the test coverage rating dimension 134 is higher with a higher number of tests.
The service management system 140 provides the incidents rating dimension 142 to the SOA repository 110 at block 314. The incidents rating dimension 142 takes into account help desk issues that occur when the service 112 is being deployed, for instance the number and/or severity of such issues.
The monitoring system 150 monitors 154 the operations 160 to provide the operational usage rating dimension 152 to the SOA repository 110 at block 316. The monitoring system 150 uses a formula that is user-defined based on runtime properties to compute the service quality rating 124 of the operational usage rating dimension 152.
The SOA repository 110 also accepts the user rating rating dimension 172 that is provided by the service consumer 170 (block 318). The user rating rating dimension 172 is based on the experience and knowledge of the service consumer 170 with respect to the service 112. The service quality rating 124 of the user rating rating dimension 172 is computed as an average of service ratings submitted by multiple service consumers combined with credibility ratings of the multiple service consumers.
The rating calculation engine 120 provides a scale of axes on a real number scale from 0 to 1 to provide visualization of the service quality rating 124 of the one or more services 112 (block 320). The method 300 further uses a specific and customizable formula to compute the service quality rating 124 for each axis with an aggregated score using a weighted scoring technique (block 322). The method 300 ends at 324.
The memory 402 may include random access memory (RAM) or similar types of memory. The secondary storage device 412 may include a hard disk drive, floppy disk drive, CD-ROM drive, or other types of non-volatile data storage, and may correspond with various databases or other resources. The processor 414 may execute information stored in the memory 402, the secondary storage 412, or received from the Internet or other network 418. The input device 416 may include any device for entering data into the computer 400, such as a keyboard, keypad, cursor-control device, touch-screen (possibly with a stylus), or microphone. The display device 410 may include any type of device for presenting a visual image, such as, for example, a computer monitor, flat-screen display, or display panel. The output device 408 may include any type of device for presenting data in hard copy format, such as a printer, or other types of output devices including speakers or any device for providing data in audio form. The computer 400 can possibly include multiple input devices, output devices, and display devices.
Although the computer 400 is shown with various components, one skilled in the art will appreciate that the computer 400 can contain additional or different components. In addition, although aspects of an implementation consistent with the method for effectively determining the quality of a service provided in a services registry are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on or read from other types of computer program products or computer-readable media, such as secondary storage devices, including hard disks, floppy disks, or CD-ROM; or other forms of RAM or ROM. The computer-readable media may include instructions for controlling the computer 400 to perform a particular method.
There has been described an embodiment of a system for determining the quality of a service provided in a services registry. The system includes a service-oriented architecture (SOA) repository that takes as input information from a plurality of data sources that map to a plurality of rating dimensions. The SOA repository includes one or more services to be offered to a service consumer. The system further includes a rating calculation engine that receives service characteristics for a service and calculates, based on category weightings and rating rules that are customizable by an organization, a service quality rating for provision to service consumers that takes into account the plurality of rating dimensions. The rating calculation engine recalculates the service quality rating over time as the service is being used and as the service goes through application development lifecycle stages.
The plurality of rating dimensions may include a defects rating dimension, a test coverage rating dimension, an incidents rating dimension, an operational usage rating dimension, a user rating rating dimension, a contract and reuse rating dimension, a lifecycle stage rating dimension, and a source of service rating dimension.
The rating calculation engine may, in some embodiments, provide a normalized scale of axes (e.g., on a real number scale from 0 to 1) in order to provide improved insight into (e.g., visualization of) the service quality rating of the one or more services. A specific and customizable formula is used to compute the service quality rating for each axis with an aggregated score using a weighted scoring technique.
Also described has been an embodiment of a method for determining the quality of a service provided in a services registry. The method includes providing a service-oriented architecture (SOA) repository that takes as input information from a plurality of data sources that map to a plurality of rating dimensions. The SOA repository includes one or more services to be offered to a service consumer. The method further includes providing a rating calculation engine that receives service characteristics for a service, using the rating calculation engine to calculate, based on category weightings and rating rules that are customizable by an organization, a service quality rating for provision to service consumers that takes into account the plurality of rating dimensions, and using the rating calculation engine to recalculate the service quality rating over time as the service is being used and as the service goes through lifecycle stages.
Further, an embodiment of a computer-readable medium has been described that provides instructions for determining the quality of a service provided in a services registry. The instructions include providing a service-oriented architecture (SOA) repository that takes as input information from a plurality of data sources that map to a plurality of rating dimensions. The SOA repository includes one or more services to be offered to a service consumer. The instructions further include providing a rating calculation engine that receives service characteristics for a service, using the rating calculation engine to calculate, based on category weightings and rating rules that are customizable by an organization, a service quality rating for provision to service consumers that takes into account the plurality of rating dimensions, and using the rating calculation engine to recalculate the service quality rating over time as the service is being used and as the service goes through lifecycle stages.
While the system and method for effectively determining the quality of a service provided in a services registry have been described in connection with an exemplary embodiment, those skilled in the art will understand that many modifications in light of these teachings are possible, and this application is intended to cover variations thereof.