Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
<1.1 Overall Configuration>
Note that in the present embodiment, the test planning assistance apparatus 100 has been described as being composed solely of the server, but it may also be composed using, for example, the personal computer 200 including the display section 40 and the input section 50. This allows the operator to use the personal computer 200 to execute, for example, a process for inputting test cases and test results and a process for displaying the status of the test progress.
<1.2 Test Project>
Next, the concept of the “test project” according to the present embodiment will be described. In software system development, the testing is performed a plurality of times during a period from the start to end of development of one system (product). In some cases, for example, five rounds of testing are performed during the period from the start to end of the development as shown in
Each test project is correlated with a plurality of test specifications as shown in
In addition, each test specification is correlated with a plurality of test cases as shown in
In the software system development, the testing is repeatedly performed as described above, and therefore each test specification is repeatedly used. Specifically, the first round of the testing is performed based on test specifications generated in early stages of the development, and thereafter the same test specifications are used for performing the second and subsequent rounds of the testing. However, the test specifications or test cases are added or deleted in accordance with, for example, addition or deletion of functions during the development.
<1.3 Tables>
Described next are tables held in the database 30 in the present embodiment.
In item fields of the test specification table 31 (regions where data items are stored), data items as described below are stored. Stored in the “TEST SPECIFICATION NO.” field is a number for identifying the test specification, and the number is uniquely assigned in each test project. Stored in the “TEST SPECIFICATION NAME” field is a name by which a developer, a tester, etc., can identify the test specification. Stored in the “VERSION” field is a version of the test specification. Stored in the “SUMMARY” field is a description summarizing the test specification. Stored in the “CREATOR” field is the name of the test specification creator. Stored in the “CREATION DATE” field is the creation date of the test specification. Stored in the “UPDATER” field is the name of the person who last updated the test specification. Stored in the “UPDATE DATE” field is the update date of the test specification. Stored in the “APPROVER” field is the name of the person who approved the details of the test specification. Stored in the “TEST CASE NO.” field is a number for identifying a test case, and the number is uniquely assigned within a test project.
In item fields of the test case table 32, data items as described below are stored. Stored in the “TEST CASE NO.” field is a number for identifying the test case, and the number is uniquely assigned within a test project. Note that the “TEST CASE NO.” field in the test specification table 31 and the “TEST CASE NO.” field in the test case table 32 are linked with each other. Stored in the “CREATOR” field is the name of the test case creator. Stored in the “TEST CATEGORY 1” field is the name of a category into which the test case is categorized in accordance with a predetermined rule. The category name may be “normal system”, “abnormal system” or “load”, for example. Stored in the “TEST CATEGORY 2” field is the name of a category into which the test case is categorized in accordance with a rule different from that for the “TEST CATEGORY 1” field. The category name may be “function” or “boundary value”, for example. Stored in the “TEST METHOD” field is a description explaining a method for executing the testing. Stored in the “TEST DATA” field is a description for specifying data for executing the testing (e.g., a full pathname). Stored in the “TEST DATA SUMMARY” field is a description summarizing the test data. Stored in the “TEST LEVEL” field is the level of the test case. The level may be “unit test”, “join test” or “system test”, for example. Stored in the “RANK” field is the importance level of the test case. The importance level may be “H”, “M” or “L”, for example. Stored in the “DETERMINATION CONDITION” field is a description explaining the criterion for determining a pass or fail in the testing. Stored in the “TEST RESULT ID” field is a number for identifying a result of testing the test case. Stored in the “TEST RESULT” field is the result of the testing. In the present embodiment, the test result may be “success”, “failure”, “untested” or “unexecuted”. Stored in the “REPORTER” field is the name of the person who reported the test result. 
Stored in the “REPORT DATE” field is the report date of the test result. Stored in the “ENVIRONMENT” field is a description explaining a system environment or the like at the time of the testing. Stored in the “REMARKS” field is a description such as a comment on the testing.
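The per-row layout of the test case table 32 can be pictured with a small sketch; the field names below are paraphrased from the description above, and only a subset is shown (the actual table holds more columns, such as the test method, test data, environment, and remarks):

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of one record in the test case table 32.
# Field names are paraphrased; this is not the apparatus's actual schema.
@dataclass
class TestCaseRecord:
    test_case_no: int            # unique within a test project; links to table 31
    creator: str
    test_category_1: str         # e.g. "normal system", "abnormal system", "load"
    test_category_2: str         # e.g. "function", "boundary value"
    test_level: str              # e.g. "unit test", "system test"
    rank: str                    # importance level: "H", "M", or "L"
    test_result: str             # "success", "failure", "untested", or "unexecuted"
    reporter: Optional[str] = None

case = TestCaseRecord(1, "Taro Yamada", "normal system", "function",
                      "unit test", "H", "untested")
print(case.rank)  # -> H
```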
As for the test result, “success” is meant to indicate that the test result is successful (pass), “failure” is meant to indicate that the test result is unsuccessful (fail), “untested” is meant to indicate that the testing is not performed on the test case, and “unexecuted” is meant to indicate that the test case has not yet been tested in the current test phase. In the present embodiment, the details of the test result are used as “test execution information”. Specifically, if the test result is “success” or “failure”, it is understood that the testing has been executed, while if the test result is “untested” or “unexecuted”, it is understood that the testing has not been executed.
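The mapping from the four test-result values onto the test execution information can be sketched as follows; the function name is illustrative, not taken from the source:

```python
# Sketch of the "test execution information" rule described above:
# "success"/"failure" mean the case was executed;
# "untested"/"unexecuted" mean it was not.
EXECUTED_RESULTS = {"success", "failure"}
NOT_EXECUTED_RESULTS = {"untested", "unexecuted"}

def was_executed(test_result: str) -> bool:
    """Return True if the stored result indicates the test case was executed."""
    if test_result in EXECUTED_RESULTS:
        return True
    if test_result in NOT_EXECUTED_RESULTS:
        return False
    raise ValueError(f"unknown test result: {test_result!r}")

print(was_executed("failure"))     # -> True
print(was_executed("unexecuted"))  # -> False
```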
In item fields of the test performance table 33, data items as described below are stored. Stored in the “TEST SPECIFICATION NO.” field is a number for identifying a test specification, and the number is uniquely assigned in each test project. Stored in the “ACTUAL MAN-DAYS” field is the number of man-days spent for test execution in an associated test project. Note that the “TEST SPECIFICATION NO.” in the test specification table 31 and the “TEST SPECIFICATION NO.” in the test performance table 33 are linked with each other.
<1.4 Scheduled Performance Display Dialog>
Described next is a screen (hereinafter, referred to as the “scheduled performance display dialog”) 400 for displaying scheduled test progress, actual test progress, and estimated test progress in the present embodiment.
The optimization process will now be described. The optimization process refers to a process for selecting preferred test cases in order to efficiently perform the testing, considering past test results. The optimization process is executed, for example, when it is estimated that the testing of all test cases will not be completed by a previously scheduled completion day. The test planning assistance apparatus 100 is capable of acquiring the test result for each test case in each test project from the test case table 32. For example, in the case where the test results are acquired as shown in
<1.5 Testing>
<1.5.1 Overall Flow>
Described next is a testing procedure using the test planning assistance apparatus 100 according to the present embodiment.
After the current test project is started, the test case management program 21 is executed in the test planning assistance apparatus 100 to perform a test case management process (step S110). The test case management process is meant to indicate registration of a new test case(s) to the database 30, deletion of an existing test case(s) from the database 30, and correction of the details of the existing test case(s) in the database 30.
When all test cases that are to be executed in the current test project are stored to the database 30 in accordance with the test case management process, the procedure advances to step S120. In step S120, the test schedule generation program 22 is executed in the test planning assistance apparatus 100 to perform a test schedule generation process. In the test schedule generation process, a test progress schedule for the current test project is generated, and a graph indicating the schedule is displayed in the graph area 403 of the scheduled performance display dialog 400.
After step S120 is completed, the procedure advances to step S130, where the testing is executed (step S130). The testing is performed by a worker called a “tester”, based on the test specifications. After the testing is completed, the procedure advances to step S140.
In step S140, the test result management program 23 is executed in the test planning assistance apparatus 100 to perform a test result management process. The test result management process is meant to indicate inputting of a test result(s) to the database 30, and editing (correction) of the test result(s) in the database 30.
After step S140 is completed, the procedure advances to step S150. In step S150, the test result aggregate display program 26 is executed in the test planning assistance apparatus 100 to perform a test result aggregate display process. In the test result aggregate display process, an aggregate of the results of executed testing is displayed in the form of a graph, a table, or the like. For example, the aggregate is displayed in the form of a pie chart as shown in
After step S150 is completed, the procedure advances to step S160. In step S160, the test performance management program 24 is executed in the test planning assistance apparatus 100 to perform a test performance display process. In the test performance display process, a graph indicating actual test progress in the current test project is displayed in the graph area 403 of the scheduled performance display dialog 400.
After step S160 is completed, the procedure advances to step S170, where it is determined whether all the test cases that are to be executed in the current test project have already been tested. If the result is that all the test cases have already been tested, the testing for the current test project is completed. On the other hand, if all the test cases have not yet been tested, the procedure advances to step S180.
In step S180, the project administrator determines whether to adjust the test plan for the current test project. If the project administrator determines not to adjust the test plan, the procedure returns to step S130. On the other hand, if the project administrator determines to adjust the test plan, the procedure advances to step S190.
In step S190, the test estimation program 25 is executed in the test planning assistance apparatus 100 to perform a progress estimation process. In the progress estimation process, a time period (estimated period) required for subsequent test execution in the current test project is calculated, and a graph indicating estimated test progress is displayed in the graph area 403 of the scheduled performance display dialog 400. Also, in the progress estimation process, the test case selection program 27 is executed in the test planning assistance apparatus 100, so that the project administrator can select test cases, considering past test results.
<1.5.2 Test Case Management Process>
In step S230, inputting of a test case(s) by the operator is accepted. The test planning assistance apparatus 100 adds the details of the test case(s) inputted by the operator to the database 30 as a new piece of data. After step S230 is completed, the procedure returns to step S210.
In step S240, it is determined whether “DELETION OF TEST CASE” has been selected (as the process detail). If the determination result is that “DELETION OF TEST CASE” has been selected, the procedure advances to step S250. On the other hand, if “DELETION OF TEST CASE” has not been selected, the procedure advances to step S260.
In step S250, selection of a deletion target test case(s) by the operator is accepted. The test planning assistance apparatus 100 deletes the test case(s) selected by the operator from the database 30. After step S250 is completed, the procedure returns to step S210.
In step S260, it is determined whether “CORRECTION OF TEST CASE” has been selected (as the process detail). If the determination result is that “CORRECTION OF TEST CASE” has been selected, the procedure advances to step S270. On the other hand, if “CORRECTION OF TEST CASE” has not been selected, the test case management process is terminated.
In step S270, correction of a test case(s) by the operator is accepted. The test planning assistance apparatus 100 reflects the details of the test case correction by the operator in the database 30. After step S270 is completed, the procedure returns to step S210.
<1.5.3 Test Schedule Generation Process>
In step S320, the test planning assistance apparatus 100 displays a test progress schedule in the graph area 403 of the scheduled performance display dialog 400 based on the scheduled test case number calculated in step S310. For example, the test progress schedule is displayed in the graph area 403 of the scheduled performance display dialog 400, in the form of a graph as shown in
Note that in the present embodiment, a scheduled test case number calculating section is implemented by step S310, and a test progress schedule display section is implemented by step S320 and the scheduled performance display dialog 400.
<1.5.4 Test Result Management Process>
In step S430, inputting of a test result(s) by the operator is accepted. The test planning assistance apparatus 100 reflects the details of the test result(s) inputted by the operator in the database 30. After step S430 is completed, the procedure returns to step S410.
In step S440, it is determined whether “EDITING OF TEST RESULT” has been selected (as the process detail). If the determination result is that “EDITING OF TEST RESULT” has been selected, the procedure advances to step S450. On the other hand, if “EDITING OF TEST RESULT” has not been selected, the test result management process is terminated.
In step S450, editing of the test result(s) by the operator is accepted. The test planning assistance apparatus 100 reflects the details of the test results edited by the operator in the database 30. After step S450 is completed, the procedure returns to step S410.
<1.5.5 Test Performance Display Process>
In step S520, the test planning assistance apparatus 100 displays actual test progress in the graph area 403 of the scheduled performance display dialog 400, based on the number of rounds of testing executed per day during the current test project and the cumulative number thereof, which are calculated in step S510. For example, the actual test progress is displayed in the form of a graph as shown in
Note that in the present embodiment, an executed test case number acquiring section is implemented by step S510, and an actual test progress display section is implemented by step S520 and the scheduled performance display dialog 400.
<1.5.6 Progress Estimation Process>
In step S620, the test planning assistance apparatus 100 causes the operator to select a test specification that is expected to require the same period of time (man-days) as the test specification that is to be executed, in accordance with a predetermined screen, and thereafter, the test planning assistance apparatus 100 calculates an estimated man-day number, i.e., the number of man-days estimated to be required for test execution, based on the number of past actual man-days spent for the selected test specification. After step S620 is completed, the procedure advances to step S640.
In step S630, the test planning assistance apparatus 100 performs an estimated man-day number calculation process based on the number of past actual man-days spent for the test specification that is to be executed. The estimated man-day number calculation process will be described in detail below. After step S630 is completed, the procedure advances to step S640.
In step S640, the test planning assistance apparatus 100 calculates a time period estimated to be required for test execution by dividing the estimated man-day number calculated in step S620 or S630 by an involved worker number (i.e., the number of workers who execute the testing during the test period). For example, the involved worker number may be previously inputted by the operator (e.g., the project administrator) in accordance with a screen (dialog) as shown in
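The division in step S640 can be sketched as follows; rounding the result up to whole days is an assumption here, since the source does not specify how fractional periods are handled:

```python
import math

# Sketch of step S640: the estimated period is the estimated man-day
# number divided by the involved worker number. Rounding up to whole
# days is an assumption, not stated in the source.
def estimated_period_days(estimated_man_days: float, involved_workers: int) -> int:
    if involved_workers <= 0:
        raise ValueError("involved worker number must be positive")
    return math.ceil(estimated_man_days / involved_workers)

print(estimated_period_days(12.0, 4))  # -> 3
```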
In step S650, the test planning assistance apparatus 100 displays estimated test progress in the graph area 403 of the scheduled performance display dialog 400 based on the estimated time period calculated in step S640. For example, the estimated test progress is displayed in the form of a graph as shown in
Note that in the present embodiment, an estimated man-day number calculating section is implemented by step S630, an estimated period calculating section is implemented by step S640, and an estimated test progress display section is implemented by step S650 and the scheduled performance display dialog 400.
<1.5.7 Estimated Man-Day Number Calculation Process>
When the estimated man-day number calculation process is started, the test planning assistance apparatus 100 calculates the number of actual man-days per test case for each test specification on a project-by-project basis (hereinafter, referred to as the “project-specific actual man-day average number”) based on the details of test results held in the test case table 32 and actual man-day numbers held in the test performance table 33. The calculation is performed as described below.
The test case table 32 holds the test results for each test case on a project-by-project basis. Each test result is one of the following: “success”, “failure”, “untested”, and “unexecuted”. When the test result is “success” or “failure”, it is understood that the test case has been tested. On the other hand, when the test result is “untested” or “unexecuted”, it is understood that the test case has not been tested. In addition, the “TEST CASE NO.” in the test specification table 31 is linked with the “TEST CASE NO.” in the test case table 32. Therefore, for each test specification, it is possible to acquire the number of test cases that have been tested on a project-by-project basis as shown in
In addition, the test performance table 33 holds the actual man-day number for each test specification on a project-by-project basis. Accordingly, it is possible to acquire the actual man-day number for each test specification on a project-by-project basis as shown in
As such, the number of test cases (that have been tested) and the actual man-day number are acquired for each test specification on a project-by-project basis, and therefore by dividing the actual man-day number by the number of test cases, it is possible to calculate the project-specific actual man-day average number. In the example shown in
After step S632 is completed, the procedure advances to step S634. In step S634, the test planning assistance apparatus 100 calculates the number of actual man-days per test case for each test specification (hereinafter, referred to as the “group-specific actual man-day average number”) based on the project-specific actual man-day average number calculated in step S632. Specifically, a sum total of the project-specific actual man-day average numbers is obtained for each test specification, and the sum total is divided by the number of test projects that have already been executed, thereby obtaining the group-specific actual man-day average number. In the example shown in
After step S634 is completed, the procedure advances to step S636. In step S636, the test planning assistance apparatus 100 calculates a requisite man-day number, i.e., the number of man-days required for test execution, for each test specification in the current test project based on the group-specific actual man-day average number calculated in step S634. Specifically, for each test specification, the number of test cases that are to be tested in the current test project is multiplied by the number of actual man-days per test case. In the example shown in
After step S636 is completed, the procedure advances to step S638. In step S638, the test planning assistance apparatus 100 calculates a sum total of the requisite man-day numbers calculated in step S636. As a result, the number of man-days estimated to be required for subsequent test execution in the current test project is calculated. In the example shown in
Note that in the present embodiment, a project-specific actual man-day average number calculating section is implemented by step S632, a group-specific actual man-day average number calculating section is implemented by step S634, a group-specific requisite man-day number calculating section is implemented by step S636, and a group-specific requisite man-day number totalizing section is implemented by step S638. In addition, a first arithmetic section is implemented by steps S632 and S634, and a second arithmetic section is implemented by steps S636 and S638.
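Steps S632 to S638 can be sketched together as follows, under assumed data shapes; the dictionaries stand in for the test case table 32 and the test performance table 33, and all names and values are illustrative:

```python
# Sketch of steps S632-S638. tested[spec][proj] is the number of executed
# test cases per specification per past project; man_days[spec][proj] the
# actual man-day number from the test performance table 33; to_test[spec]
# the number of cases still to be tested in the current project.
def estimated_man_days(tested, man_days, to_test):
    total = 0.0
    for spec, projects in tested.items():
        # S632: project-specific actual man-day average (man-days per case)
        per_case = [man_days[spec][p] / n for p, n in projects.items() if n > 0]
        # S634: group-specific average over the executed projects
        group_avg = sum(per_case) / len(per_case)
        # S636: requisite man-days = cases to test x man-days per case,
        # accumulated over all specifications (S638)
        total += to_test[spec] * group_avg
    return total

tested   = {"SPEC-1": {"P1": 10, "P2": 10}, "SPEC-2": {"P1": 8, "P2": 8}}
man_days = {"SPEC-1": {"P1": 5.0, "P2": 2.5}, "SPEC-2": {"P1": 2.0, "P2": 4.0}}
to_test  = {"SPEC-1": 10, "SPEC-2": 8}
print(estimated_man_days(tested, man_days, to_test))  # -> 6.75
```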
<1.6 Effects>
As described above, according to the present embodiment, each test specification contains a plurality of test cases, and for each test specification in each test project, the number of actual man-days spent for test execution (the actual man-day number) is held as data in the test performance table 33 within the database 30. In addition, the test case table 32 holds past test execution information (which indicates whether the testing has been executed) for each test case. When the test planning assistance apparatus 100 estimates the test progress, the number of man-days required for test execution (the requisite man-day number) for each test case that is to be subsequently executed in the current test project is calculated based on the actual man-day number stored in the test performance table 33 and the test execution information stored in the test case table 32. Thereafter, an overall estimated man-day number is calculated based on the requisite man-day number that is calculated for each test case in accordance with the past performance. Therefore, the requisite man-day number for subsequent test execution can be calculated, considering the difficulty and complexity of test cases that are to be subsequently executed. Thus, it is possible to reduce the difference between the estimated man-day number and the actual man-day number in the test project, compared to the difference conventionally incurred.
For example, in the case where the actual test performance is as shown in
As described above, the difference between the estimated man-day number and the actual man-day number can be reduced, and therefore, for example, it is possible for the project administrator to readily distribute resources, such as workers and devices, and manage test schedules.
In addition, in the present embodiment, a time period (estimated period) required for subsequent test execution is calculated based on the number of involved workers and the estimated man-day number, which is calculated in accordance with the past performance. Therefore, it is possible to reduce the difference between the estimated period and the actual period as compared to the difference conventionally incurred. Thus, it is possible to reduce the risk of delays in test progress.
Further, the scheduled progress, actual performance, and estimation are displayed per test project in the form of a graph in the scheduled performance display dialog 400. Therefore, it is possible for the project administrator to visually obtain the progress of the test project. Thus, it is possible for the project administrator to readily manage the progress of the test project.
Furthermore, when estimating the test progress, it is possible to select preferred test cases in accordance with the optimization process. For example, the optimization process makes it possible to reduce the number of man-days indicated by reference character K2 in
Next, a second embodiment of the present invention will be described. In the first embodiment, the number of man-days required for test execution in the current test project is calculated for each test specification based on the past actual man-day number per test case (see, for example, steps S634 and S636 in
<2.1 Configuration>
The overall system hardware configuration in the present embodiment is the same as that in the first embodiment shown in
In the present embodiment, each test specification is correlated with execution information 80, which indicates an execution result per test as shown in
Next, the skill information table 34 will be described. In general, when the same worker repeatedly tests a given test specification, the more tests he/she experiences, the shorter the time (the number of man-days) required for test execution becomes. This is because the worker becomes familiar with operations for the testing. In the present embodiment, the degree of familiarity (skill) is managed by the skill information table 34 as a “coefficient”. Note that the skill information table 34 is provided for each test specification.
For example, it is assumed that different rounds of testing for a given test specification are executed by workers as shown in
<2.2 Skill Information Table Updating Process>
When the skill information table updating process is started, the test planning assistance apparatus 100 determines whether to update data for “COEFFICIENTS” in the skill information table 34 (step S710). The determination is made based on whether the same worker has consecutively executed the testing for the test specification a plurality of times. Specifically, if the same worker has consecutively executed the testing a plurality of times, the determination is to update the data for “COEFFICIENTS”, and if not, the determination is to not update the data for “COEFFICIENTS”. If the determination result is that the data for “COEFFICIENTS” is to be updated, the procedure advances to step S720, while if the determination result is that the data for “COEFFICIENTS” is not to be updated, the procedure advances to step S750.
In step S720, the “latest coefficient” is calculated. Here, the “latest coefficient” refers to a value representing the ratio between the actual man-day number per test case for the first one of the consecutive rounds of testing currently being executed and the actual man-day number per test case for the latest round of the testing. In the example shown in
After step S720 is completed, the procedure advances to step S730, where an average coefficient value is calculated. In the example shown in
Kave = (Kold × N + Knew) / (N + 1)   (1),
where Kold is the past average coefficient value, N is the number of times the consecutive rounds of testing have been executed in the past, and Knew is the latest coefficient.
In the example shown in
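Equation (1) is a running-average update and can be sketched directly; the example values below are illustrative, not taken from the figures:

```python
# Sketch of equation (1): update the average coefficient from the past
# average Kold, the past count N of consecutive-round executions, and
# the latest coefficient Knew.
def update_average_coefficient(k_old: float, n: int, k_new: float) -> float:
    return (k_old * n + k_new) / (n + 1)

# e.g. a past average of 1.5 over 3 past updates, latest coefficient 2.5
print(update_average_coefficient(1.5, 3, 2.5))  # -> 1.75
```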
After step S730 is completed, the procedure advances to step S740, where the skill information table 34 is updated in terms of the “coefficient” data and the “count” data. In step S740, the “coefficient” data is updated to the average coefficient value Kave calculated in step S730, and the “count” data is updated to a value obtained by adding “1” to the data that has been entered in the “COUNTS” field. In the above example, as for the data concerning the “3RD ROUND” for “ICHIRO SUZUKI” in the skill information table 34 shown in
In step S750, the “count” data in the skill information table 34 is updated. Specifically, the data concerning the first round for the corresponding worker is updated to a value obtained by adding “1” to the data that has been entered. The skill information table updating process ends upon completion of step S750. Note that in the present embodiment, a skill information updating section is implemented by steps S710 to S750.
<2.3 Skill-Considered Estimated Man-Day Number Calculation Process>
The estimated man-day number calculation process is performed only on the test specifications whose test operation status is “BEING TESTED” or “UNEXECUTED”. Accordingly, in the example shown in
When the estimated man-day number calculation process is started, the test planning assistance apparatus 100 calculates an actual man-day reference number for each test specification (step S810). Here, the “actual man-day reference number” is meant to indicate the number of man-days that is used as a reference when calculating the estimated man-day number in consideration of skills. Specifically, when the same worker executes consecutive rounds of testing for a given test specification, the actual man-day reference number refers to the number of actual man-days spent per test case in the first one of the consecutive rounds of testing. In the example shown in
In step S820, the test planning assistance apparatus 100 acquires the number of test cases that are to be executed for each test specification. In the example shown in
In step S830, the test planning assistance apparatus 100 calculates the requisite man-day number for each test specification in consideration of skills. Specifically, the skill-considered requisite man-day number T is calculated by the following equation (2):
T = (Tbase × X) / K   (2),
where Tbase is the actual man-day reference number calculated in step S810, X is the number of test cases acquired in step S820, and K is a coefficient stored in the skill information table 34, regarding the number of consecutive rounds that is to be currently estimated for the corresponding worker.
In the example shown in
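Equation (2) can be sketched as follows; the example values (a reference of 0.5 man-days per case, 20 test cases, and a coefficient of 2.0) are illustrative, not taken from the figures:

```python
# Sketch of equation (2): the skill-considered requisite man-day number
# is the per-case reference man-days Tbase times the case count X,
# divided by the skill coefficient K from the skill information table 34.
def skill_considered_man_days(t_base: float, x: int, k: float) -> float:
    if k <= 0:
        raise ValueError("skill coefficient must be positive")
    return (t_base * x) / k

# a coefficient of 2.0 models a worker who has become twice as fast
print(skill_considered_man_days(0.5, 20, 2.0))  # -> 5.0
```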
After step S830 is completed, the procedure advances to step S840. In step S840, the test planning assistance apparatus 100 calculates a sum total of the requisite man-day numbers calculated in step S830. As a result, the number of man-days estimated to be required for subsequent test execution in the current test project is calculated. After step S840 is completed, the procedure advances to step S640 in
<2.4 Effects>
As described above, according to the present embodiment, the skill information table 34 holds, for each test specification, data indicating each worker's testing skill in relation to the consecutive rounds of testing executed by the worker. The requisite man-day number required for test execution is then calculated based on the skill information stored in the skill information table 34. Therefore, the estimated man-day numbers are calculated, considering the individual workers' testing skills. Thus, it is possible to enhance the accuracy of the test plan for each test project, and minimize the difference between the estimated man-day number and the actual man-day number in the test project.
In addition, the contents of the skill information table 34 are updated each time the same worker executes consecutive rounds of testing for a given test specification. Therefore, data concerning each worker's skill is accumulated as the number of testing rounds increases, so that the estimated man-day numbers are more accurately calculated, considering the individual workers' skills.
Furthermore, in the present embodiment, when the same worker is executing consecutive rounds of testing, the number of actual man-days spent per test case in the first one of the consecutive rounds of testing is determined as the actual man-day reference number. Thereafter, the actual man-day reference number is multiplied by the number of test cases that are to be executed in the current round of testing, and the resultant value is divided by a coefficient indicating the skill, thereby calculating the requisite man-day number. Thus, even if the number of test cases that are to be executed varies from one round of testing to another, the estimated man-day numbers can be accurately calculated, considering the individual workers' skills without being affected by variations in the number of test cases.
<3. Others>
The test planning assistance apparatus 100 is implemented by the programs 21 to 27, which are executed by the CPU 10 for the purpose of test case management and so on, on the premise that there are hardware devices such as the memory 60 and the auxiliary storage 70. For example, part or all of the programs 21 to 27 are provided in the form of a computer-readable recording medium such as a CD-ROM, which has the programs 21 to 27 recorded therein. The user can purchase a CD-ROM having the programs 21 to 27 recorded therein, and load the CD-ROM into a CD-ROM drive (not shown), so that the programs 21 to 27 are read from the CD-ROM and installed into the auxiliary storage 70 of the test planning assistance apparatus 100. In this manner, it is possible to provide the programs in order to cause the computer to execute each step shown in the flowcharts.
Also, in each of the above embodiments, the “TEST CASE NO.” in the test specification table 31 is repeated by the number of test cases as shown in
Furthermore, each of the above embodiments has been described based on the premise that one test project is executed using a plurality of test specifications, each containing a plurality of test cases, as shown in
Furthermore, each of the above embodiments has been described with respect to the example where the test planning assistance apparatus 100 is used for testing in the software system development, but the present invention is not limited to this. For example, the present invention is applicable in testing chemical substances, machinery, instruments and equipment, so long as the testing is repeatedly executed.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.
Note that the present application claims priority to Japanese Patent Application No. 2006-165606, titled “TEST PLANNING ASSISTANCE APPARATUS AND TEST PLANNING ASSISTANCE PROGRAM”, filed on Jun. 15, 2006, and Japanese Patent Application No. 2007-123870, titled “TEST PLANNING ASSISTANCE APPARATUS AND TEST PLANNING ASSISTANCE PROGRAM”, filed on May 8, 2007, which are incorporated herein by reference.
Number | Date | Country | Kind |
---|---|---|---
P2006-165606 | Jun 2006 | JP | national |
P2007-123870 | May 2007 | JP | national |