1. Technical Field
The present invention relates generally to computer software and in particular to computer software development. Still more particularly, the present invention relates to analyzing software quality during computer software design and development.
2. Description of the Related Art
Computer software developers are continually developing new software and/or updating existing software to meet customer demands. Oftentimes, software is developed for specific customers, with whom the software developer contracts to provide the software, and the customer expects software to be delivered that is both fully functional and of a determinable level of quality. When software developers develop (or update) a computer software program or application, however, there is rarely any predictive analysis performed by which the developer is able to ascertain whether the quality of that software meets the expectations of the customer.
Occasionally, during testing of newly developed software, the quality of the software does not meet the customer expected quality, and the software developers (i.e., executives of the software development company) may be forced to defer release of the software product until product quality improves. Alternatively, the software developer may agree to release the product in order to gain a particular business advantage (e.g., first to market), without assurances that the product will meet the required quality for the customers. This decision (or business practice) may ultimately result in substantial costs/expense to the developer should the software prove to be of sub-standard quality (from the customers' perspective).
For example, with conventional software implementation, the cost of fixing a defect found by customers within released software may range between $5,000 and $50,000 per defect, depending on complexity. Post-release expenses are incurred as the developer is forced to carry out re-design, re-engineering, re-coding, or re-testing of the software product. Additionally, customer support expenses vary and may cost the company between $250 and $2,500 each time a customer phones in for support or for a software fix. In addition, certain intangible costs (i.e., costs that are not immediately quantifiable) may be incurred by the company as well. When a delivered software product fails to meet the quality expectations of a customer, the company loses the goodwill associated with customer satisfaction, and it is customer satisfaction that leads to repeat business.
As a result, a comprehensive, consistent, repeatable, and reliable business process is essential for more fully understanding the quality of software that will be released and the likelihood of success when deployed in the customer environment.
Developers today rely on verification or quality assurance teams that track individual indicators, each with a different level of usefulness for understanding the quality of the software product during development. Several different tools are available to help with various aspects of software testing. However, no single reliable approach exists that is generally applicable to all software development processes, as conventional methods provide a large range of approaches, some of which are product-specific and not generally applicable.
The existing methodologies for predicting the quality of software each utilize only objective measures for their predictive analysis (see, for example, the article entitled "Is this Software Done?", found in Software Testing and Quality Engineering Magazine, Volume 4, Issue 2, March/April 2002). Virtually all of these methodologies depend upon defects identified during testing to perform a risk assessment. Another example is the Rayleigh prediction model, described in chapter 7 of Stephen H. Kan's Metrics and Models in Software Quality Engineering (ISBN 0-201-72915-6), which discusses software metrics.
Obtaining a better understanding of how clients will view the quality of a particular piece of software may be crucial in some software deployments. Consequently, being able to understand and consistently quantify “quality risks” before software is released to customers is of utmost importance to the software development process. Clearly, a method for better prediction of the quality of software during software development will be a welcome improvement.
Disclosed is a method, system, and computer program product for providing predictive quality analysis during software creation/development. A measurement method is provided that utilizes a combination of objective measures and subjective inputs to determine a final quality of the software that is being developed. In addition to using objective measures in a unique, consistent, and deliberate fashion, subjective measures are also utilized to increase/improve the validity of the predictive quality analysis. The subjective elements utilized are ones that provide good indicators of the likelihood of success and customer satisfaction from a software quality standpoint.
The above as well as additional objectives, features, and advantages of the present invention will become apparent in the following detailed written description.
The invention itself, as well as a preferred mode of use, further objects, and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
The present invention provides a method, system, and computer program product for providing predictive quality analysis during software creation/development. A measurement instrument is provided that utilizes a combination of objective measures and subjective inputs to determine a final quality of the software that is being developed. In addition to using objective measures in a unique, consistent, and deliberate fashion, subjective measures are also utilized to increase/improve the validity of the predictive quality analysis. The subjective elements utilized are ones that provide good indicators of the likelihood of success and customer satisfaction from a software quality standpoint.
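As one illustrative, non-limiting sketch of how objective measures and subjective inputs might be combined into a single predictive quality score, consider the following example (the function name, measure names, weights, and values below are hypothetical and are not taken from the embodiment):

```python
# Hypothetical sketch: blending objective measures (e.g., defect-related data)
# with subjective inputs (e.g., reviewer ratings) into one predictive quality
# score. All names, weights, and values are illustrative assumptions.

def predictive_quality_score(objective_measures, subjective_inputs,
                             objective_weight=0.6, subjective_weight=0.4):
    """Return a 0-100 quality score from dictionaries of 0-100 values."""
    obj_avg = sum(objective_measures.values()) / len(objective_measures)
    subj_avg = sum(subjective_inputs.values()) / len(subjective_inputs)
    return objective_weight * obj_avg + subjective_weight * subj_avg

# Example usage with illustrative values.
objective = {"defect_density": 70, "test_coverage": 85}
subjective = {"design_process_rating": 60, "customer_interaction_rating": 75}
print(predictive_quality_score(objective, subjective))  # prints 73.5
```

The particular weighting scheme above is only one possible way to combine the two classes of inputs; the embodiment described herein instead accumulates weighted points through a series of spreadsheets, as detailed below.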
With reference now to the figures, and in particular to FIG. 1, there is depicted a block diagram of an exemplary computer system within which features of the invention may advantageously be implemented.
Located within memory 103 and executed on processor 101 are a number of software components, including operating system (O/S) 120 and software development quality analysis (SDQA) utility 124. SDQA utility 124 is the principal software component that enables the implementation of the quality analysis/assessment features provided by the present invention. While described in the context of a computer system, the described features of the invention may be implemented without use of a computer system.
According to the illustrative embodiment, implementation of the invention involves a user executing the SDQA utility 124 and entering a series of inputs requested within the SDQA utility. It is noted that, while the illustrative embodiment of the invention is described with specific reference to a computer-executed process via the SDQA utility, the functionality associated with the invention is not necessarily limited to implementation with a computer or within a computing environment. The calculations of interest in determining the quality of developed software may be completed utilizing pen and paper, an abacus or other non-electronic counting tool, an electronic adding tool, such as a calculator, as well as a computer device, which may be hand-held (or portable) or a desktop computer device. For simplicity in describing the invention as well as providing a context for generating spreadsheets utilized to enter the subjective data and perform the calculations of the quality of software, a computer implemented method is described that includes use of the SDQA utility within a computer system. This specific implementation is, however, not meant to imply any limitations on the invention.
Several major areas (or phases of development) are identified and programmed within the SDQA utility. Compiling information for each of these areas is required for the SDQA utility to provide a comprehensive analysis of the quality of the software. The major areas identified apply to any software, and, as such, the SDQA utility is generally applicable for analyzing any developed software.
These major areas are listed below along with a brief description of their respective functionality:
The invention provides a method for utilizing objective and subjective criteria to predict the end customer's view of the quality of software. In the illustrative embodiment, the invention employs a consistent and sophisticated process to address software quality issues by having quality assurance teams review the development process and interact with the SDQA utility to produce a final, quantitative, quality analysis result.
The methodology presented by the described embodiment of the invention employs a consistent and sophisticated process to address software quality issues by having quality assurance teams answer questions concerning: (1) particular development methodologies utilized; (2) whether or not industry standard best practices were employed; (3) the type of customer interaction that occurred; (4) different areas of project "churn"; and others. Among the software quality issues analyzed by the quality assurance teams are the following: (1) How is consistency ensured?; (2) How does one validate that the software that is about to be shipped/released is of a high quality?; (3) Are the risks quantifiable?; (4) What assurances can be offered to clients that the release being shipped is trustworthy?; (5) Have the more intangible elements been taken into account, rather than only identifying the numbers of defects?
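By way of an illustrative, non-limiting sketch, the questions answered by the quality assurance teams might be captured in a simple machine-readable form such as the following (the field names and phase labels are assumptions introduced for illustration only):

```python
# Hypothetical representation of the subjective questions answered by the
# quality assurance teams. The question topics follow the issues listed above;
# the record structure and phase labels are illustrative assumptions.
quality_questions = [
    {"phase": "design", "question": "Which development methodology was used?"},
    {"phase": "design", "question": "Were industry standard best practices employed?"},
    {"phase": "requirements", "question": "What type of customer interaction occurred?"},
    {"phase": "all", "question": "How much project churn occurred in this area?"},
]

# Print the questionnaire grouped by development phase.
for q in quality_questions:
    print(f"[{q['phase']}] {q['question']}")
```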
Each spreadsheet of
The first two columns within the spreadsheets are default, pre-populated columns, i.e., columns with specific action items and other information of relevance provided therein. For example, the action column provides a detailed list of each action that is analyzed within the quality assessments for that particular development phase. Each individual action listed in column 2 may have one or more associated selections, which are separated by the rows of the spreadsheet. Thus, within each row are a number of selections associated with each action. Each selection has assigned to it a total number of available points, indicated within the "points available" column. For instance, as a part of the action described as "methodology used", there are four possible selections, each having an associated number of available points. These selections and associated points are: (1) interaction design/outside-in design/etc.—10 points; (2) brainstorming—5 points; (3) ad-hoc—0 points; and (4) what's a design?—negative 10 points. In this illustration, the last element, "what's a design?", is a rhetorical question indicating that no design was actually made prior to developing the software. That is, the developers simply began writing code without having a design to work from. In such situations/scenarios, the overall quality of a given product is going to be worse than if formal designs were done and inspected. Thus, having this element in the development process results in an award of a negative 10 rating. For each selection, the team member enters the number of weighted points associated with the particular action within the "points earned" column.
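As an illustrative sketch of how such an action and its selections might be represented, consider the following example; the dictionary layout and helper function are assumptions, while the selections and point values are those given in the "methodology used" example above:

```python
# Sketch of one row group from the spreadsheet: the "methodology used" action
# and its selections with their available points, as described above.
methodology_used = {
    "action": "methodology used",
    "selections": {
        "interaction design/outside-in design/etc.": 10,
        "brainstorming": 5,
        "ad-hoc": 0,
        "what's a design?": -10,  # no design at all is penalized
    },
}

def points_earned(action, chosen_selection):
    """Return the weighted points the team member enters for a selection."""
    return action["selections"][chosen_selection]

print(points_earned(methodology_used, "brainstorming"))  # prints 5
```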
When the SDQA utility is first initiated, the utility may prompt the user for information specific to the software to be analyzed. This information entered by the user may be utilized to select specific actions (from a large database of possible actions) to include within the spreadsheet analysis. The SDQA utility generates the series of spreadsheet-type GUIs, similar to the GUIs illustrated by
In order to analyze the quality of the software, the developer enters the number of points associated with each row of selections within each of the respective series of spreadsheets. In one implementation, when the SDQA utility is initially executed and the spreadsheet view is opened, the cursor is immediately positioned within the "points earned" column of each GUI, and the user is able to select a particular point total for each action and enter that point total within the "points earned" column.
When the user completes entry of each of the required point totals for each action, the SDQA utility calculates a total of the user's entries to yield the total number of quality points earned by the developer in developing the software. In one implementation, the SDQA utility then completes a comparison of the points total against the scale, and the SDQA utility generates an output indicating whether or not the software meets the required quality. This latter feature requires entry of a threshold value below which the required quality is not met for the software. The threshold value is pre-selected by the developer given the requirements of the customer to whom the software is being shipped. One key advantage of the business process provided by the invention over existing methods is the consistency and comprehensiveness of the factors that go into the predictive analysis.
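A minimal, non-limiting sketch of this total-and-compare step is shown below; the function name, sample entries, and threshold value are hypothetical, the threshold being pre-selected by the developer as described above:

```python
# Sketch: sum the user's "points earned" entries and compare the total
# against a developer-selected quality threshold. Values are illustrative.
def meets_required_quality(points_earned_entries, threshold):
    """Return the total points and whether the total meets the threshold."""
    total = sum(points_earned_entries)
    return total, total >= threshold

entries = [10, 5, 0, 8, 7]  # example per-action "points earned" entries
total, ok = meets_required_quality(entries, threshold=25)
print(total, "meets required quality" if ok else "below required quality")
```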
The points entered are totaled by each spreadsheet to yield an area sub-total, and the area-specific sub-totals are summed together to yield an overall total for the entire design, development, and test processes. As shown at the bottom of
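The per-spreadsheet sub-totals and the overall total described above might be computed along the following lines (an illustrative sketch; the area names and point values are placeholders):

```python
# Sketch: each development area (one spreadsheet) yields a sub-total, and
# the sub-totals are summed into an overall total for the whole process.
area_points = {
    "design": [10, 5, 8],       # points earned per action in the design area
    "development": [7, 9, 4],
    "test": [6, 10, 3],
}

area_subtotals = {area: sum(points) for area, points in area_points.items()}
overall_total = sum(area_subtotals.values())
print(area_subtotals, overall_total)
```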
According to the illustrative embodiment, a maximum total of 767 points is possible when utilizing the series of spreadsheets with the illustrated action items of
In an alternative implementation, no actual predetermined "required quality" or quality level is assigned, and the resulting total/number is utilized solely to provide an assessment of the quality of the product. The business needs, customer needs, etc., for the software are then evaluated to determine whether the risks associated with shipping a product of perhaps marginal quality, as indicated by the assessment, are acceptable.
Also, in one implementation, individual development teams are able to tweak the spreadsheets based on the team's own set of criteria, such that the scale shown and/or utilized in the illustrative embodiments is not a "hard and fast, one size fits all" component. For example, if an initial development team is developing a component that will only be used by other, internal product development teams and, therefore, will not go through a system test phase, the "spreadsheet" of that initial development team will be a subset of the full series of spreadsheets and will be different than that of a team that is developing an end product that will be shipped directly to external customers.
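Such team-specific tailoring, for example omitting the system test phase for an internally consumed component, might be expressed as a simple filter over the available phases (an illustrative sketch; the phase names are assumptions):

```python
# Sketch: a team developing an internal component may use a subset of the
# full series of spreadsheets, e.g., dropping the system test phase.
full_phases = ["design", "development", "unit test", "system test"]

def phases_for_team(ships_to_external_customers):
    """Return the development phases whose spreadsheets the team completes."""
    if ships_to_external_customers:
        return list(full_phases)
    return [p for p in full_phases if p != "system test"]

print(phases_for_team(False))  # ['design', 'development', 'unit test']
```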
When the development process is completed at block 308, all of the required information is provided to SDQA utility at block 310. The SDQA utility then calculates the point total for the specific development process, indicated at block 312, and analyzes the total against the preset quality threshold(s) at block 314. SDQA utility determines at block 316 whether the required quality threshold level is met. When the level has been met, SDQA utility provides the developer a quantitative feedback result indicating that the software product meets the required levels of quality, as shown at block 318, and, in response, the developer prepares to ship the software to the customer, as indicated at block 320. Otherwise the software is referred back to the development team for further work, as shown at block 322. In one embodiment, the additional work required and/or performed is directed by the individual scores for each spreadsheet. Areas that score the worst are revisited by the software developers.
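The embodiment in which additional work is directed by the individual spreadsheet scores might, for example, identify the worst-scoring areas as sketched below (the normalization of earned points by available points is an assumption introduced for illustration):

```python
# Sketch: rank areas by earned points relative to available points so that
# the lowest-scoring areas can be revisited by the software developers first.
def areas_needing_rework(earned_by_area, available_by_area, count=2):
    """Return the 'count' areas with the lowest earned/available ratios."""
    ratios = {a: earned_by_area[a] / available_by_area[a] for a in earned_by_area}
    return sorted(ratios, key=ratios.get)[:count]

earned = {"design": 40, "development": 55, "test": 20}
available = {"design": 60, "development": 70, "test": 50}
print(areas_needing_rework(earned, available))  # ['test', 'design']
```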
As a final matter, it is important to note that, while an illustrative embodiment of the present invention has been, and will continue to be, described in the context of a fully functional computer system with installed management software, those skilled in the art will appreciate that the software aspects of an illustrative embodiment of the present invention are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of signal bearing media include recordable type media, such as floppy disks, hard disk drives, and CD-ROMs, and transmission type media, such as digital and analog communication links.
While the invention has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.