1. Technical Field
The present invention relates generally to techniques for integrating software test automation into the design and development of a software product or feature.
2. Description of the Related Art
The Software Development Life Cycle (SDLC) is a well-known concept in software engineering and refers to the process of creating or altering software systems. The SDLC is a logical process typically implemented by an entity and its employees (or consultants) to develop an information system, and it usually includes several phases, such as planning, analysis, design and implementation. A typical software development life cycle comprises a sequence of stages in which the output of each stage becomes the input for the next. A representative sequence might be as follows: project definition, user requirements definition, system requirements definition, analysis and design, system build/prototyping (including coding and testing), and maintenance.
Traditionally, automated software testing is not an organizational focus but, rather, a by-product of ad-hoc automated tests from a quality assurance (QA) team. The development of such tests does not follow a repeatable process, nor is such software testing traditionally regarded as a primary role of any individual within a software development team. As such, no concrete procedure has been established within the software engineering industry to provide test automation that produces reliable, repeatable results.
Further, automated testing systems and methods are well-known in the prior art, as evidenced by the following representative patents: U.S. Pat. Nos. 6,662,312, 6,301,701, 6,002,869, 5,513,315 and 5,751,941. Known prior art testing frameworks also include solutions such as STAF (the Software Testing Automation Framework).
This disclosure describes a software-based business process to automate manual testing of software applications. The “feature automation” process establishes concrete roles and responsibilities that are enforced for each member of a development team, and it gives visibility into each phase of development. This allows for predictable, high-quality and repeatable results. According to the process, software tools are used to execute automated tests and to collect results for analysis.
The feature automation process defines step-by-step instructions for involving automation engineers and for defining, implementing and reviewing software test automation during the development of a feature or product. This process seamlessly integrates the roles of automation engineers and other resources into the software development life cycle (SDLC). An enterprise (and, in particular, management) first creates a dedicated automation team and, as necessary or desirable, allocates resources and builds expertise within this team. The feature automation team preferably works with the product/feature team to enable the latter team to better understand the roles of the automation engineers and to further facilitate transparency into the product/feature requirements, design and implementation activities. The feature automation process enables an associated quality assurance (QA) team to offload (to the feature automation team) the responsibilities of writing test scripts, creating an automation framework and test designs, and implementing and maintaining test code. The process ensures that all stakeholders are involved in reviewing the automation framework and test design prior to test implementation, which enhances the reusability of the framework and the stability of the test runs.
Preferably, the feature automation process is defined by a set of external review checkpoints, each of which includes one or more feature automation activities. The checkpoints preferably include: feature kick-off, high level review, detailed review, development and debugging, and the integration/test suite execution. Unlike the prior art, where software automation does not begin until late in the feature development life cycle, according to the described technique the requirements and design for automation begin at a much earlier phase.
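By way of illustration only, the checkpoint sequence described above can be modeled as an ordered data structure that a tracking tool might consume. The following Python sketch is not part of the process definition; the `Checkpoint` class and list name are hypothetical, and the activities shown for the high level review are taken from the phases described later in this disclosure:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Checkpoint:
    """One external review checkpoint and its feature automation activities."""
    name: str
    activities: List[str] = field(default_factory=list)


# The five checkpoints named in the text, in order.
FEATURE_AUTOMATION_CHECKPOINTS = [
    Checkpoint("feature kick-off"),
    Checkpoint("high level review",
               ["review feature design", "contribute/review test plans"]),
    Checkpoint("detailed review"),
    Checkpoint("development and debugging"),
    Checkpoint("integration/test suite execution"),
]
```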
In particular, using the approach described herein, the feature automation process begins much earlier in the development of the product/feature, and automation activities become integrated into the overall SDLC instead of merely being a late stage of the cycle. This process improves coordination from within and outside the automation team, improves the automation development framework, reduces the automation development life cycle, improves code quality and maintainability, improves code and test documentation, and reduces training time for new automation team members.
The foregoing has outlined some of the more pertinent features of the invention. These features should be construed to be merely illustrative. Many other beneficial results can be attained by applying the disclosed invention in a different manner or by modifying the invention as will be described.
For a more complete understanding of the present invention and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
The reader should be familiar with basic terminology of software engineering. The feature automation process of this disclosure preferably includes a number of high level steps or phases that are illustrated in
Turning now to the more detailed aspects, and also with reference to
The high level review 200 preferably includes two major phases: a review feature design phase 202, and a contribute/review test plans phase 204. Each of these will now be described.
With reference now to
The next phase is the detailed review 300, which has a number of sub-phases including a review test cases phase 302, a design automation tests phase 304, an identify common functionality phase 306, a request product hooks phase 308, an automation design review phase 310, and an automation project plan/development task lists phase 312. Each of the phases will now be described.
With reference now to
The automation design review phase 310 begins at step 338 with the feature automation team owning the design of the feature automation. At step 340, the feature automation key reviewer reviews the proposed framework, shared library additions and a sample test case implementation for one or more of the following types of tests as applicable to the type of feature: an acceptance test suite, a functional test suite, and a stress test suite. At step 342, the feature automation team reviews the design with the key reviewer in one or more phases. Finally, the automation project plan/development task lists phase 312 begins with the feature automation team creating a formal project plan or development task lists, preferably with time and cost estimates. This is step 344. At step 346, the automation plan and estimates are provided to the feature team lead to be added to the feature development plan. This enables design progress to be tracked. This completes the detailed review phase 300.
The development/debug phase 400 has a number of sub-phases: a shared library additions phase 402, an automation test implementation phase 404, an automation code reviews phase 406, a triage automation issues phase 408, and a debug automation tests phase 410. Each of these phases will now be described.
Referring now to
The automation code reviews phase 406 is then initiated. It begins at step 428 with the feature automation team reviewing the code with the key reviewer(s), who may be from the current feature automation team or elsewhere. At step 430, a review is carried out to confirm that any applicable coding standards are met. At step 432, any documentation is reviewed. At step 434, a check is carried out to confirm complete test coverage for each test case. At step 436, error handling methods are evaluated. At step 438, the framework is tested. At step 440, the shared library implementation is tested. At step 442, any additional tools that may be required are then tested. At step 444, any appropriate process (such as PyChecker) is run to check for errors. The code is then checked back in to the code branch at step 446. At step 448, the common library for the feature is identified. At step 450, the changes or additions to the shared library (if any) are made. At step 452, when the code is complete for tests, the team meets with QA and obtains approval on test logic coverage. This completes the automation code review phase 406.
The triage automation issues phase 408 begins at step 454. At this stage, any automation issues (i.e., bugs) that are related to the feature are identified and a determination is made regarding the source of the issue. At step 456, the team involves the QA team and then, if necessary, the development team to attempt to determine whether the test failure is due to a product issue and how it might be addressed. This completes the triage automation issues phase 408. Finally, the debug automation tests phase 410 then involves step 460, during which the team investigates any failures and provides appropriate fixes until all tests pass. This completes the development/debug phase 400.
Although not meant to be limiting, preferably one or more off-the-shelf tools may be used for developing and reviewing automation code. These include Eclipse (a software development environment), PyDev (a plugin that enables users to use Eclipse for Python development), PyChecker (a tool for finding bugs in Python source code) and PyLint (a Python tool that checks if a module satisfies a coding standard). These tools are merely representative.
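By way of example only, a review script might invoke such a checker non-interactively during the automation code reviews phase. The following Python sketch is illustrative and not part of the disclosure; the helper name and the example module path are assumptions:

```python
import subprocess


def run_checker(checker_cmd, module_path):
    """Run a static checker (e.g., PyLint) on module_path.

    Returns a (exit_code, report_text) tuple; PyLint, for example,
    exits non-zero when it emits messages about the module.
    """
    result = subprocess.run(
        list(checker_cmd) + [module_path],
        capture_output=True,
        text=True,
    )
    return result.returncode, result.stdout


# Hypothetical usage:
#   run_checker(["pylint"], "feature_tests.py")
```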
The end game phase 500 has a number of sub-phases: an automation code integration phase 502, a test suite execution phase 504, and an update test suite inventory phase 506. Each of these phases will now be described.
Referring now to
The disclosed subject matter has many advantages. The described methodology may be used by any software product development organization and easily integrated into the software development lifecycle. This has the benefit of streamlining the SDLC process and reducing the overall workload for each member of the development team while improving overall product quality. Automation provides the following additional benefits: it facilitates the definition of an automation team for new product development; by identifying product bugs earlier in the development cycle, it reduces the product development lifecycle by weeks or months where periodic (e.g., bi-annual) releases are required; it provides continuous feedback on product quality; it enables feature teams to work more cohesively to deliver a higher quality product; and it provides the ability to define and enable a robust and reliable automation platform for any software development project. The methodology provides a repeatable process that can be used or tailored to define a software test automation environment in an organization's software product development initiatives. It further defines a new software organizational model to enable an entity to move towards achieving software quality assurance (QA) through test automation.
The process can be used in many ways. Organizations in the software product testing industry can use this methodology to define and execute their test automation strategies. Organizations involved in complex software product development can use the methodology to define and execute their automation development activities. Others, such as software process consulting organizations may use the methodology to improve product quality and reduce testing costs for their clients.
Although not meant to be limiting, step 512 in
In one embodiment, the framework daemon has a number of supporting tools, namely, executable modules, that provide the various functions required. For example, the framework daemon runs test suites (or batches), records the results in a database, and stores node images and logs in a web-accessible directory on a local machine's web server. The daemon preferably emails all exceptions, failures, crashes and the like to a target set of email recipients, or otherwise exports such information to a bug tracking system. Thus, the framework provides a functional, cluster level regression test running harness that is highly configurable and that is test language agnostic. The framework also is capable of running white-box testing. Any test that can run on a series of hosts, or even a single host, can be run within the automated test framework.
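The daemon's core loop described above (run suites, record each result, and collect failures for notification) can be sketched as follows. This is a purely illustrative Python sketch, not the actual daemon: result recording and failure notification are passed in as callables so that a database writer or an email exporter could be substituted, and all names are hypothetical:

```python
def run_batch(suites, record_result, notify_failures):
    """Run every test in every suite; record results and report failures.

    suites          -- mapping of suite name to a list of (test_name, callable)
    record_result   -- callable(suite, test, status, detail), e.g. a DB writer
    notify_failures -- callable(failures), e.g. an email or bug-tracker export
    """
    failures = []
    for suite_name, tests in suites.items():
        for test_name, test_fn in tests:
            try:
                test_fn()
                record_result(suite_name, test_name, "PASS", None)
            except Exception as exc:  # failures and crashes alike are recorded
                record_result(suite_name, test_name, "FAIL", str(exc))
                failures.append((suite_name, test_name, str(exc)))
    if failures:
        notify_failures(failures)
    return failures
```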
Referring now to
As illustrated in
Referring now to
The tools layer 205 components comprise a distributed command handler module 223 that is a wrapper to the tools in the layer. The module 223 preferably is accessible via an SSH connection to enable the tools to interact with the cluster non-interactively, or via scripts or an open API. Access to the distributed command handler module preferably requires authentication. A log rotate or “imager” module 225 dumps system logs, checks for error messages in the cluster node logs, and images databases across the cluster nodes after a test is complete. A daemon master module 227 is a script that allows granular control over daemons on the system under test, such as starting and stopping. A database snapshot module 229 is provided to grab a snapshot 231 of the databases spread out across the SUT and to pull those snapshots back to the local machine. The snapshot module 229 is a tool that actually images the SUT, and it may operate with the log rotate module to get the logs from all of the SUT nodes in between the various test runs of a given test suite. The snapshot module 229 also grabs an image of a current database on all the nodes in the system, copies those files back to the local machine, and stores them alongside the logs for that test/suite run. The snapshot module may also verify a cluster's integrity. A build installer module 233 is a script that leverages the distributed command handler module to install the defined build across the target cluster, including any and all required configuration scripts. The module 233 may be implemented as a script that automatically determines the required values based on information in a configuration database. A module 235 wipes or cleans the target system non-interactively, formatting the disks, database, logs, files, and the like, if necessary. A health monitor 237 is a script that verifies the integrity of a running system, also checking for valid/invalid processes, swap usage, and the like.
The health monitor 237 may be called at any point to check on the SUT. Finally, a gateway mount verification module 239 is used to verify the health of the various gateways and to perform various access methods against those gateways; this module thus operates to verify availability of a given cluster (and the nodes within that cluster). The module may also be used as a mounting system for the test runner module to call to mount required resources.
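The distributed command handler idea described above (running one command across every cluster node over SSH and collecting the results) can be sketched as follows. This is an illustrative Python sketch only: the use of the system `ssh` client, the function name, and the host names in the usage comment are all assumptions, and a real handler would add the authentication noted above:

```python
import subprocess


def run_on_cluster(nodes, command, ssh=("ssh",)):
    """Run a shell command on every node non-interactively via SSH.

    Returns a mapping of node name to (exit_code, stdout). The ssh
    argument is the client command prefix, parameterized for testing.
    """
    results = {}
    for node in nodes:
        proc = subprocess.run(
            list(ssh) + [node, command],
            capture_output=True,
            text=True,
        )
        results[node] = (proc.returncode, proc.stdout)
    return results


# Hypothetical usage:
#   run_on_cluster(["node1", "node2"], "uptime")
```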
It should be noted that the tools shown in
As noted above, any of the machines illustrated may run on different hardware, different software, or both. As a result, the framework is highly scalable. It is flexible, easily extensible, and preferably test-language and client-platform agnostic. This implementation ensures that the framework can run any test, in any language.
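The test-language-agnostic contract can be sketched as follows: a test is simply any executable command, and the framework judges pass or fail purely by its exit status, so the test itself may be written in any language. This is an illustrative sketch under that assumption, not the actual framework implementation:

```python
import subprocess


def run_test(argv, timeout=None):
    """Run a test given as an argv command list; pass means exit status 0.

    Because only the exit status is inspected, the test binary or script
    may be written in any language the host can execute.
    """
    proc = subprocess.run(argv, capture_output=True, timeout=timeout)
    return proc.returncode == 0
```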
While the process flow diagrams and the above description provide a particular order of operations performed by certain embodiments of the invention, it should be understood that such order is exemplary, as alternative embodiments may perform the operations in a different order, combine certain operations, overlap certain operations, or the like. References in the specification to a given embodiment indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic.
While the present invention has been described in the context of a method or process, the subject matter herein also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including an optical disk, a CD-ROM, and a magneto-optical disk, a read-only memory (ROM), a random access memory (RAM), a magnetic or optical card, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus. As noted above, a given implementation of the present invention is software written in a given programming language that runs on a standard hardware platform running an operating system such as Linux.
While given components of the system have been described separately, one of ordinary skill will appreciate that some of the functions may be combined or shared in given instructions, program sequences, code portions, and the like.