To test a software application prior to release, developers employ test programs that apply programmatic inputs to the software application, and measure the results. To ensure that the programmatic inputs of the test program adequately cover various aspects of the software application, the test program may track the execution of source code, such as C++, C#, and SQL stored procedures in the codebase of the software application while the test program is running.
However, in the context of testing online services that employ backend relational databases as well as front and/or middle tier applications, source code tracking may be inadequate. Unlike stand-alone software applications, such online services perform transactions involving many data elements stored in a backend database. The performance of the online service depends on the various possible values for each data element, referred to as the “data domain” for that data element. However, source code tracking may fail to indicate whether the test has covered the full range of possible values in the data domain for each data element, because operations on data elements stored in the database may be handled generically by the same section of code in a front and/or middle tier application, irrespective of the value or type of the data in the data element. Thus, tracking of source code coverage cannot be relied upon to provide an accurate indication of domain data coverage when testing an online service. Untested aspects of an online service may result in unforeseen errors occurring after release, potentially resulting in undesirable downtime, lost revenues, and loss of goodwill with customers.
Testing systems and methods are provided for determining domain data coverage of a test of a codebase. The testing system may include a coverage program having a setup module configured to receive user input indicative of a target domain data table to be monitored during the test. The coverage program may further include a test module configured to programmatically generate a shadow table configured to receive coverage data, and to create one or more triggers on the target domain data table. The triggers may be configured, upon firing, to make entries of coverage data in the shadow table indicating that the trigger was fired during the test. The coverage program may also include an output module configured to compare the shadow table and the target domain data table to produce a coverage result, and to display the coverage result via a graphical user interface.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
In the design phase 18, a developer may program a codebase 24 on the development computer 14 using a development studio program 26. The codebase 24 may be for a software application or software component that interfaces with a relational database. Various data may be exchanged between the codebase 24 and the relational database during use, and the scope of possible values for this data may be referred to as a data domain for the application and database interaction.
Once the codebase 24 has been developed using the development studio program 26 and is ready for testing, the coverage program 12 may be used during the design phase 18 to receive user input of domain data to monitor for coverage scope during testing. For example, the coverage program 12 may include a setup module 32 that may be executed on the development computer 14 during the design phase 18. The setup module 32 may be configured to display a setup interface 36 on a graphical user interface 38 associated with the development computer 14. The setup module 32 may be configured to receive user input indicative of a target domain data table 34 of the relational database to be monitored during the test of the codebase 24, via the setup interface 36. The target domain data table 34 may include possible values for a data element utilized by the codebase and stored in the relational database.
One example of such a setup interface 36 is illustrated in
Returning to
The coverage program 12 may further include a test module 42 that may be executed on the test computer 16 during the pre-testing phase 20, and configured to determine whether the programmatic inputs of the test program 40 adequately cover various aspects of the software application. During the pre-testing phase, the test module 42 may be configured to programmatically generate a shadow table 44 configured to receive coverage data. The size of the shadow table 44 may be compatible with the target domain data table 34, to facilitate joinder of the data in the tables in downstream processing.
The test module 42 may also be configured to create one or more triggers 46 on the target domain data table. The triggers 46 are procedural code that is executed in response to a defined event on a particular table in a database. The triggers 46 may be configured, upon firing, to make entries 48 of coverage data in the shadow table 44 indicating that the trigger was fired during the test. Thus, triggers 46 provide a mechanism to determine coverage of the various discrete values in the target data domain table during the test. It will be appreciated that the generation of the shadow table and triggers occurs programmatically according to stored algorithms that operate upon the user input domain data table 34, as discussed below.
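The mechanism described above, in which a trigger on the target domain data table records coverage entries in the shadow table, may be sketched as follows. This is a minimal illustrative example only: it uses Python's `sqlite3` module to stand in for the relational database, and the table names, column names, and schema (`SETTLEMENT_STATUS_TYPE`, a four-column shadow table) are assumptions chosen for illustration rather than details drawn from the disclosure.

```python
import sqlite3

# In-memory SQLite database stands in for the backend relational database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Target domain data table: possible values for a data element.
CREATE TABLE SETTLEMENT_STATUS_TYPE (
    SI_SETTLEMENT_STATUS_ID INTEGER PRIMARY KEY,
    C_DESCRIPTION TEXT
);
-- Shadow table sized to be joined back to the domain data table.
CREATE TABLE SETTLEMENT_STATUS_TYPE_SHADOW (
    action TEXT,
    referring_table TEXT,
    fired_at TEXT,
    value INTEGER
);
-- Trigger: upon firing, make an entry of coverage data in the shadow
-- table indicating that the trigger was fired during the test.
CREATE TRIGGER trg_settlement_update
AFTER UPDATE ON SETTLEMENT_STATUS_TYPE
BEGIN
    INSERT INTO SETTLEMENT_STATUS_TYPE_SHADOW
    VALUES ('UPDATE', 'SETTLEMENT_STATUS_TYPE',
            datetime('now'), NEW.SI_SETTLEMENT_STATUS_ID);
END;
""")
conn.execute("INSERT INTO SETTLEMENT_STATUS_TYPE VALUES (1, 'SETTLED')")
conn.execute("INSERT INTO SETTLEMENT_STATUS_TYPE VALUES (2, 'HARD DECLINE')")

# Simulated test activity exercises only status 1; the trigger fires once.
conn.execute("UPDATE SETTLEMENT_STATUS_TYPE SET C_DESCRIPTION = 'SETTLED' "
             "WHERE SI_SETTLEMENT_STATUS_ID = 1")
entries = conn.execute(
    "SELECT action, value FROM SETTLEMENT_STATUS_TYPE_SHADOW").fetchall()
print(entries)  # → [('UPDATE', 1)]
```

After the simulated test, the shadow table holds one entry for value 1 and none for value 2, which is the raw coverage data compared against the domain data table downstream.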
As illustrated in
It will be appreciated that in some scenarios, multiple shadow tables may be generated, based on the user input domain data tables to be monitored during a test. For example, for each detected foreign key dependency 60, the test module 42 may be configured to create a respective shadow table 44, each shadow table 44 being configured to store a respective action 70, referring table 72, timestamp 74, and value 76 of a data element linked by the foreign key dependency. Further, the test module 42 may be configured to create the one or more triggers 46 for the multiple shadow tables 44 by creating triggers 46 on the domain data tables 34 that are linked via the one or more foreign key dependencies 60.
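The programmatic detection of foreign key dependencies, and the creation of a respective shadow table and trigger for each referring table, may be sketched as follows. Again this is an illustrative sketch using SQLite, whose `PRAGMA foreign_key_list` reports a table's foreign keys; the `TRANSACTIONS` table, its columns, and the shadow-table naming convention are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE SETTLEMENT_STATUS_TYPE (
    SI_SETTLEMENT_STATUS_ID INTEGER PRIMARY KEY,
    C_DESCRIPTION TEXT
);
-- A table referring to the monitored domain table via a foreign key.
CREATE TABLE TRANSACTIONS (
    T_ID INTEGER PRIMARY KEY,
    STATUS_ID INTEGER REFERENCES SETTLEMENT_STATUS_TYPE(SI_SETTLEMENT_STATUS_ID)
);
""")

TARGET = "SETTLEMENT_STATUS_TYPE"  # user-input target domain data table

# Scan every table's foreign keys for dependencies on the target table.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()]
for table in tables:
    for fk in conn.execute(f'PRAGMA foreign_key_list("{table}")').fetchall():
        if fk[2] != TARGET:      # fk[2] is the referenced (parent) table
            continue
        from_col = fk[3]         # fk[3] is the referring column
        shadow = f"{table}_SHADOW"
        # Respective shadow table: action, referring table, timestamp, value.
        conn.execute(f'''CREATE TABLE "{shadow}" (
            action TEXT, referring_table TEXT, fired_at TEXT, value INTEGER)''')
        # Trigger on the referring table, recording the linked value.
        conn.execute(f'''CREATE TRIGGER "trg_{table}_insert"
            AFTER INSERT ON "{table}"
            BEGIN
                INSERT INTO "{shadow}"
                VALUES ('INSERT', '{table}', datetime('now'), NEW."{from_col}");
            END''')

conn.execute("INSERT INTO TRANSACTIONS VALUES (1, 7)")
covered = conn.execute("SELECT value FROM TRANSACTIONS_SHADOW").fetchall()
print(covered)  # → [(7,)]
```

Here the dependency scan finds one foreign key from `TRANSACTIONS` to the target table, so one shadow table and one trigger are generated, and the trigger records which domain value the test transaction touched.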
Returning to
The output module 50 and/or the test module 42 may be configured to store an output file 56 including the coverage result 52. The output file 56 may, for example, be in XML format, and readable by the output module 50 to display the coverage result 52 on the visualization interface of the graphical user interface 38.
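One possible shape for such an XML output file may be sketched as follows. The element and attribute names (`coverageResult`, `timesFired`, and so on) and the sample rows are hypothetical; the disclosure specifies only that the file may be in XML format, not its schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical coverage rows: (domain value, description, times fired).
rows = [(1, "SETTLED", 3), (2, "HARD DECLINE", 0)]

# Serialize the coverage result as a small XML document.
root = ET.Element("coverageResult", table="SETTLEMENT_STATUS_TYPE")
for value, description, fired in rows:
    ET.SubElement(root, "row", value=str(value),
                  description=description, timesFired=str(fired))
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

An output module could then parse such a file back into rows and render the zero-count entries with a visual highlight, as described below for the visualization interface.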
Turning to
To enable the developer to ascertain the aspects of the domain data table that may not have been adequately covered by the test, the coverage result 52 may include a graphical indication 86 of a lack of coverage for a portion of the data domain. In the illustrated embodiment, the graphical indication 86 is depicted as highlighting in rows where the numerical indication 84 is zero. A zero value indicates that no triggers were fired that would indicate coverage of the corresponding values for SI_SETTLEMENT_STATUS_ID and C_DESCRIPTION in the same row as the zero. Thus, no triggers were fired for the highlighted values such as HARD DECLINE, IMMEDIATE SETTLE DECLINE, etc., in the data domain for the data element SETTLEMENT_STATUS_TYPE, indicating that these values have not been covered by the test.
A developer may utilize the coverage result 52 in several ways. For example, the highlighted rows may be manually investigated by the developer to determine their effect, and if desired, the test program may be modified by the developer to cover one or more of the areas that were not covered in the first run of the test. Alternatively, the highlighted rows may be programmatically communicated to the test program, and the test program may be configured to alter its test suite to cover the highlighted values.
At 104, the method may include programmatically generating a shadow table configured to receive coverage data, the size of the shadow table being compatible with the target domain data table. For example, the shadow table may be sized to be joined to the target domain data table without loss of data in the target domain data table. In some embodiments, the programmatic generation of the shadow table may include detecting one or more foreign key dependencies of the target domain data table. For each detected foreign key dependency, a respective shadow table may be created, each shadow table being configured to store an action, a referring table, a timestamp, and a value of a data element linked by the foreign key dependency. Further, creating the one or more triggers may include programmatically creating triggers on the tables that are linked via the one or more foreign key dependencies. It will be appreciated that the step of programmatically generating a shadow table may be performed on a test computer.
At 106, the method includes creating one or more triggers on the target domain data table, the triggers being configured, upon firing, to make entries of coverage data in the shadow table. As described above, the triggers may be configured to indicate that a value in the data domain was covered by the test, and may be programmatically created on a table that includes a referring foreign key dependency to a monitored data element.
At 108, the method may include running a test on the codebase. At 110, the method may include, during the test, upon firing of a trigger, writing coverage data in the shadow table indicating that the trigger was fired. It will be appreciated that the steps of creating the one or more triggers, running the test, and writing the coverage data to the shadow table may be performed on a test computer.
At 112, the method may include comparing the shadow table and the target domain data table to produce a coverage result. For example, comparing the shadow table and the target domain data table may include joining appropriate data in the shadow table with the target domain data table, to produce the coverage result, as illustrated and described above.
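The comparing step may be sketched as a join between the domain data table and the shadow table, with a count of shadow entries per domain value. This is an illustrative SQLite sketch with hypothetical sample data; a LEFT JOIN is one way (an assumption, not mandated by the disclosure) to keep every domain value in the result so that uncovered values appear with a count of zero.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE SETTLEMENT_STATUS_TYPE (
    SI_SETTLEMENT_STATUS_ID INTEGER PRIMARY KEY,
    C_DESCRIPTION TEXT
);
CREATE TABLE SHADOW (
    action TEXT, referring_table TEXT, fired_at TEXT, value INTEGER
);
""")
conn.executemany(
    "INSERT INTO SETTLEMENT_STATUS_TYPE VALUES (?, ?)",
    [(1, "SETTLED"), (2, "HARD DECLINE"), (3, "IMMEDIATE SETTLE DECLINE")])
# Shadow entries recorded during the test: only value 1 was exercised.
conn.executemany(
    "INSERT INTO SHADOW VALUES ('INSERT', 'TRANSACTIONS', '2024-01-01', ?)",
    [(1,), (1,)])

# LEFT JOIN keeps every domain value; COUNT of matching shadow rows is
# the number of trigger firings, zero for values the test never covered.
result = conn.execute("""
    SELECT d.SI_SETTLEMENT_STATUS_ID, d.C_DESCRIPTION, COUNT(s.value)
    FROM SETTLEMENT_STATUS_TYPE d
    LEFT JOIN SHADOW s ON s.value = d.SI_SETTLEMENT_STATUS_ID
    GROUP BY d.SI_SETTLEMENT_STATUS_ID, d.C_DESCRIPTION
    ORDER BY d.SI_SETTLEMENT_STATUS_ID
""").fetchall()
for status_id, desc, fired in result:
    print(status_id, desc, fired)
# → 1 SETTLED 2
#   2 HARD DECLINE 0
#   3 IMMEDIATE SETTLE DECLINE 0
```

The zero-count rows for HARD DECLINE and IMMEDIATE SETTLE DECLINE correspond to the highlighted uncovered values in the coverage result described above.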
At 114, the method may include displaying the coverage result via the graphical user interface of the coverage program. The coverage result may be in a table format, and may include a numerical or graphical indication of a number of times the trigger was fired, as illustrated in
The above described systems and methods may be used to efficiently determine the coverage of domain data during a test of an application program that utilizes a relational database, by enabling the user to input a data domain table to be monitored, run a test, and then view a visualization of a coverage result.
It will be appreciated that the computing devices described herein may be suitable computing devices configured to execute the programs described herein. For example, the computing devices may be a mainframe computer, personal computer, laptop computer, or other suitable computing device, and may be connected to each other via computer networks, such as a local area network or a virtual private network. These computing devices typically include a processor and associated volatile and non-volatile memory, and are configured to execute programs stored in non-volatile memory using portions of volatile memory and the processor. As used herein, the term “program” refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that computer-readable media may be provided having program instructions stored thereon, which upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.
It will be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof, are therefore intended to be embraced by the claims.