The present application claims Convention priority from Japanese Patent Application No. 2015-237213 filed on Dec. 4, 2015, the full content of which is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a testing support system, and a testing support method.
2. Related Art
Japanese Patent Laid-open Publication No. 2008-204405 describes that: “In order to prevent the number of test items from being huge at regression testing, the regression testing system automatically extracts portions to be preferentially tested in accordance with modifications. This enables automatic and efficient regression testing”, “The regression testing system is configured to perform testing by storing information concerning test cases in past testing and combining the stored information and the dependence relation obtained by analyzing the target program”, and “With reference to the dependence relation between statements in a software program, which is stored in a storage unit, the information processing apparatus extracts a statement relating to a modified statement; detects a test case associated with the extracted statement from information about the correspondence between past test cases and statements in the software program, which is stored in the storage unit; and executes the detected test case.”
At software development sites, to improve quality and development efficiency, software developers have established development styles in which implementation of source codes and testing are repeatedly performed on the basis of functions. Application software such as Web applications implements a function with plural screens in some cases. Such software is often tested by executing a test scenario describing user operations, including operations causing screen transition (operations on buttons, links, and the like), as a series of sequences.
However, when testing is performed using such a test scenario, it is not easy to find the correspondence between screens and the description of the test scenario or the correspondence between screens and source codes. Accordingly, projects to develop large-scale information processing systems in which many developers are involved particularly require a lot of time and effort to find bugs and specify the causes thereof, thus lowering the software development efficiency.
The present invention is made in light of the aforementioned background, and an object of the invention is to provide a testing support apparatus and a testing support method which enable efficient software testing.
An aspect of the invention to achieve the aforementioned object is a testing support apparatus (an information processing apparatus) which supports testing of software, the apparatus including: a storage unit storing a plurality of test scenarios including description concerning transition of screens, source codes constituting the software, and screen/source code correspondence information representing the correspondence between the screens and the source codes related to generation of the screens; a test scenario division part configured to generate test scenario dividing information, which is information including the results of dividing the description of each of the test scenarios into a plurality of blocks in terms of the screens; and a test scenario execution priority setting part which sets execution priority levels for the plurality of test scenarios based on the test scenario dividing information.
According to the present invention, it is possible to test software efficiently.
Hereinafter, a description is given of an embodiment with reference to the accompanying drawings. In the following description, the term “data base” is sometimes abbreviated as DB.
The testing support apparatus 100 supports testing of software performed by the user. In the description of this embodiment, software as an object to be tested by the testing support apparatus 100 is software providing a Web service through the Internet by way of example. Examples of the user of the testing support apparatus 100 assumed in the following description are persons utilizing information outputted from the testing support apparatus 100, developers or maintenance staff of the software as the testing object, and the like.
When the user tests software by executing plural test scenarios including description of screen transition, the testing support apparatus 100 controls execution of the test scenarios so that the user can perform the testing efficiently. For example, when the user tests screens (hereinafter, referred to as screens under development), which are under development and described in predetermined source codes (a Web description language (HTML (Hypertext Markup Language), XHTML (eXtensible Hypertext Markup Language), or XML (eXtensible Markup Language), for example) or various types of scripts (JavaScript (registered trademark) or the like)), the testing support apparatus 100 sets different execution priority levels for the plural test scenarios and executes the test scenarios in the order of the set execution priority levels. When the software to be tested is of large scale and the test scenario to be executed is significantly long, a certain amount of time is required after the initiation of the test scenario before the screen under development, which the user intends to verify, is executed in some cases. In such a case, the execution priority levels are set so as to preferentially execute the test scenario in which the screen under development is executed earlier than in the other test scenarios. This allows the screen under development to be executed early, so that the user can acquire the test results of the screen under development quickly.
The testing support apparatus 100 sets the aforementioned execution priority levels based on a criterion specified by the user. For example, the testing support apparatus 100 sets the execution priority levels so as to start the test for a screen under development early. Alternatively, by acquiring information concerning the frequency of failure in a screen under development from an execution log obtained at past execution of the test scenarios, the testing support apparatus 100 sets the execution priority levels so that a test scenario with a high frequency of failure is executed with high priority. In this manner, the testing support apparatus 100 sets the execution priority levels based on a criterion specified by the user and therefore tests a screen under development according to the user's needs. To set the aforementioned execution priority levels, the testing support apparatus 100 divides the description of each test scenario on the basis of screens so that the description of the test scenario can be efficiently grasped on a screen-by-screen basis.
As illustrated in
The processor 11 is composed of a central processing unit (CPU) or a micro-processing unit (MPU), for example. The processor 11 reads and executes a program stored in the main storage device 12 to implement various functions of the information processing apparatus 10. The main storage device 12 is a device storing programs and data, and examples thereof are a read only memory (ROM), a random access memory (RAM), and a non-volatile RAM (NVRAM). The auxiliary storage device 13 is a hard disk drive, a solid state drive (SSD), an optical storage device, or a reader/writer for recording media, for example. Programs and data stored in the auxiliary storage device 13 are loaded onto the main storage device 12 when needed.
The input device 14 is a keyboard, a mouse, or a touch panel, for example. The output device 15 is a liquid crystal monitor, a liquid crystal display (LCD), a graphic card, or a speaker, for example. The communication device 16 is a communication interface communicating with another device through the communication network 5, and the examples thereof are a network interface card (NIC) and a wireless communication module. The input and output devices 14 and 15 are not necessarily required, and the information processing apparatus 10 may be configured to input and output information to and from another device through the communication device 16.
Subsequently, a description is given of the function provided for the testing support apparatus 100 and information (data) managed by the testing support apparatus 100 with reference to
As illustrated in
Among the functions illustrated in
The source code acquisition part 102 properly acquires each source code 156 from the project information management apparatus 200 through the communication network 5. Each source code 156 is configured with data described in a Web description language or various types of scripts, for example. The testing support apparatus 100 manages the data on the basis of files which can be handled by the file system, for example.
The screen/source code correspondence information generation part 103 generates the screen/source code correspondence information 151 based on the source codes 156 (source files) stored in the storage unit 150. The screen/source code correspondence information 151 is described in detail later.
The test scenario acquisition part 104 properly acquires test scenarios 154 from the project information management apparatus 200 through the communication network 5. Each test scenario 154 includes plural steps (lines) described in a natural language, for example. The test scenario 154 includes operations which are performed for a Web page (screen) along a predetermined work scenario to cause screen transition. The test scenario 154 includes description to cause screen transition and description about what kind of test is performed for screens.
The test scenario dividing information generation part 105 divides each test scenario 154 into plural blocks on the basis of screens generated based on the source codes 156 and generates the test scenario dividing information 152 including the result of division. The test scenario dividing information 152 is described in detail later.
The test scenario execution priority setting part 106 sets the execution priority levels for the plural test scenarios 154 acquired by the test scenario acquisition part 104 and stores the result of setting the execution priority levels as the test scenario execution priority information 153. The specific method of setting the execution priority levels and the detail of the test scenario execution priority information 153 are described later.
The test scenario execution part 107 sequentially executes the test scenarios 154 in accordance with the execution priority levels set for the respective test scenarios 154. When executing the test scenarios 154, the test scenario execution part 107 controls the Web browser 101 and automatically executes the procedure of input operations based on the test scenarios 154, for example. The software implementing the function of the test scenario execution part 107 is Cucumber (registered trademark) or Turnip (registered trademark), for example. The test scenario execution part 107 may be configured to operate in cooperation with the Web server operating in the content provider apparatus 300, for example.
The execution log acquisition part 108 acquires information generated by the test scenario execution part 107 to execute the test scenarios 154 and stores the acquired information as the execution logs 155. The acquired information includes the start/end time of each step, occurrence of errors, information specifying the location of each error, the detail of each error, execution history, and the like.
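For concreteness, the following is a minimal sketch of how one record of the execution logs 155 might be structured; the field names are assumptions chosen to cover the items listed above and are not prescribed by the embodiment.

```python
# Illustrative record for the execution logs 155 (field names are assumptions).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ExecutionLogEntry:
    scenario_id: str          # which test scenario 154 was executed
    step: str                 # the step (line) of the test scenario
    start_time: datetime      # start time of the step
    end_time: datetime        # end time of the step
    error_occurred: bool = False
    error_location: str = ""  # information specifying the location of the error
    error_detail: str = ""    # the detail of the error
```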
The project information management apparatus 200 stores and manages, in an integrated manner, information concerning projects (projects for development, maintenance, and designing) in which the user is involved. The project information management apparatus 200 includes: a project DB 202 in which information concerning the projects is managed; and a DB management part 201 managing the project DB 202. The DB management part 201 includes a function as a database management system (DBMS), a distributed version management system, or the like, for example. The project information management apparatus 200 manages, in the project DB 202, software and data items managed as a repository in association with metadata (specifications, design specifications, models, diagrams, information operation rules, and the like), for example.
The content provider apparatus 300 includes a content delivery part 301 storing and delivering content data 302. The content data 302 includes data described in a predetermined data format (character data, audio data, image data, video data, and the like), for example. When receiving the aforementioned content acquisition request, the content provider apparatus 300 sends the corresponding content data 302 to the testing support apparatus 100.
The test scenarios 154, execution logs 155, source codes 156, project DB 202, and content data 302 are not necessarily arranged in the aforementioned manner. The arrangement of the data may be properly changed depending on the configuration of the information processing system 1.
In this example, the screen ID 1511 “Sample screen” corresponds to source files (source codes) including sample.html, sample.js, menu.html, sidebar.html, contents.html, and contents.js as illustrated in
The screen ID 1511 “Top screen” corresponds to the source files (source codes) of top.html and top.js. The behavior of a single or plural screen components described in top.html is described in top.js.
The screen ID 1511 “Confirmation screen” corresponds to the source files (source codes) of confirm.html and confirm.js. The behavior of a single or plural screen components described in confirm.html is described in confirm.js.
The screen ID 1511 “Portal screen” corresponds to the source file (source code) of portal.html. The behavior of a single or plural screen components described in portal.html does not need to be defined, and there is no script file corresponding to Portal screen.
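As a rough sketch, the screen/source code correspondence information 151 for the above example could be held as a simple mapping from screen ID to the related source files; the dictionary form and the helper function below are assumptions made for illustration only.

```python
# Sketch of the screen/source code correspondence information 151 for the
# screens listed above (the data structure itself is an assumption).
screen_source_correspondence = {
    "Sample screen": ["sample.html", "sample.js", "menu.html",
                      "sidebar.html", "contents.html", "contents.js"],
    "Top screen": ["top.html", "top.js"],
    "Confirmation screen": ["confirm.html", "confirm.js"],
    "Portal screen": ["portal.html"],  # no corresponding script file
}

def screens_related_to(source_file: str) -> list[str]:
    """Return the screen IDs related to a given source file (e.g. a file under development)."""
    return [screen for screen, files in screen_source_correspondence.items()
            if source_file in files]

print(screens_related_to("top.js"))  # ['Top screen']
```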
The test scenario dividing information 152 illustrated in
With reference to
The number of screen hops for each screen indicates the number of screens which are tested from the beginning of the scenario until the screen of interest is tested. In
When the screen to be tested is screen B and execution of the test scenarios is prioritized based on the number of screen hops as the criterion, execution of the test scenarios is prioritized in order of test scenario (2), test scenario (1), and test scenario (3).
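A minimal sketch of prioritization by the number of screen hops follows; the screen sequences assigned to test scenarios (1) to (3) are invented for illustration and merely reproduce the ordering described above.

```python
# Prioritizing test scenarios by the number of screen hops to the screen under
# development (the scenario contents below are illustrative assumptions).
scenarios = {
    "test scenario (1)": ["screen A", "screen C", "screen B"],
    "test scenario (2)": ["screen A", "screen B", "screen D"],
    "test scenario (3)": ["screen C", "screen D", "screen E", "screen B"],
}

def screen_hops(screens: list[str], target: str) -> int:
    """Number of screens tested from the beginning of the scenario before the target screen."""
    return screens.index(target)

target = "screen B"
ordered = sorted(scenarios, key=lambda name: screen_hops(scenarios[name], target))
print(ordered)  # ['test scenario (2)', 'test scenario (1)', 'test scenario (3)']
```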
The test scenario execution priority information 153 includes the execution priority levels of the test scenarios 154 determined by the test scenario execution priority setting part 106 for each screen based on each criterion.
In
In
In
Herein, when the test of a specific screen under development needs to be started early, for example, the user selects either the number of screen hops or the reach time as the criterion to prioritize the test scenarios 154. When it is necessary to find failure in a specific screen under development early, the user selects the frequency of failure as the criterion to prioritize the test scenarios 154.
The above criteria for priority are shown just by way of example, and the test scenarios 154 may be prioritized based on another criterion. Moreover, the test scenarios 154 may be prioritized by collectively judging the orders of execution priority obtained based on plural criteria for priority, for example. When the orders of execution priority based on two criteria are reversed relative to each other, the order of execution priority based on one of the two criteria is preferentially selected, for example. A test scenario 154 not including the description concerning the screen to be tested may be exempted from the objects to be executed. In this case, a predetermined symbol such as “-” is set in the field of the corresponding test scenario in the test scenario execution priority information 153, for example.
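As a sketch of one possible combination policy (not specified by the embodiment), the ordering by the number of screen hops could be used first, with the past frequency of failure breaking ties, and scenarios that do not include the screen under test could be exempted:

```python
# One possible policy for collectively judging two criteria (an assumption):
# sort by the number of screen hops, break ties by a higher frequency of failure,
# and exempt scenarios that do not include the screen to be tested.
def prioritize(scenarios: list[str],
               hops: dict[str, int],
               failure_frequency: dict[str, float]) -> list[str]:
    candidates = [s for s in scenarios if s in hops]   # exempt scenarios without the screen
    return sorted(candidates,
                  key=lambda s: (hops[s], -failure_frequency.get(s, 0.0)))
```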
Next, a description is given of a process performed in the thus-configured information processing system 1.
The testing support apparatus 100 first performs a process to generate the screen/source code correspondence information 151 (hereinafter, referred to as a screen/source code correspondence information generation process S811) and a process to generate the test scenario dividing information 152 (hereinafter, referred to as a test scenario dividing information generation process S812) (S811, S812). These processes are described in detail later.
Subsequently, the testing support apparatus 100 accepts a user ID (a developer name or the like concerning the development of screens under development) and specification of the criterion (S813). The user can specify one or plural criteria.
Subsequently, the testing support apparatus 100 acquires the source codes associated with the user ID accepted in S813 from the project DB 202 of the project information management apparatus 200 and stores them as the source codes 156 (S814). For example, the testing support apparatus 100 may be configured to narrow down the source codes to be acquired from the project DB 202 based on information accepted from the user, such as development date and time (not described in this example).
The testing support apparatus 100 then specifies source files storing the acquired source codes (S815) and specifies screens corresponding to the source files (hereinafter referred to as screens under development) based on the screen/source code correspondence information 151 (S816).
Subsequently, the testing support apparatus 100 acquires the test scenarios 154 including the description concerning the screens under development specified in S816 with reference to the test scenario dividing information 152 (S817). In this example, the testing support apparatus 100 acquires plural test scenarios 154.
The testing support apparatus 100 sets execution priority levels for the plural test scenarios 154 acquired in S817 based on the criterion accepted in S813 and stores the result of setting as the test scenario execution priority information 153 (S818).
The testing support apparatus 100 then executes the test scenario having the highest execution priority among the unexecuted test scenarios 154 (S819).
The testing support apparatus 100 then determines whether the test concerning the screen under development has been successfully executed (S820). When the determination is true (YES in S820), the process proceeds to S821. When the determination is false (NO in S820), the process proceeds to S822, and the testing support apparatus 100 performs a process to output error information and the like and then terminates the execution of the test scenario 154. The execution of the test scenario 154 may be terminated when the test concerning the screen under development ends as described above, or may be continued to the end of the test scenario 154. In the former configuration, in which the execution of the test scenario 154 is terminated as soon as the test concerning the screen under development ends, the user can acquire the result of testing early, so that the time taken to detect failure can be shortened.
In S821, the testing support apparatus 100 determines whether there is an unexecuted test scenario 154 among the test scenarios 154 concerning the screen under development. When the determination is true (YES in S821), the process returns to S819, and the testing support apparatus 100 starts another unexecuted test scenario 154. When the determination is false (NO in S821), the testing support apparatus 100 terminates the test scenario execution process S800.
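The main loop of S819 to S822 can be sketched as follows; run_scenario_until() is a hypothetical placeholder for driving the Web browser through one test scenario and stopping once the screen under development has been tested.

```python
# Simplified sketch of S819 to S822 (helper names are hypothetical placeholders).
from dataclasses import dataclass

@dataclass
class TestResult:
    scenario: str
    passed: bool
    error_info: str = ""

def run_scenario_until(scenario: str, screen_under_development: str) -> TestResult:
    """Placeholder: execute the scenario via the Web browser and stop after the
    screen under development has been tested."""
    return TestResult(scenario, passed=True)

def execute_in_priority_order(prioritized_scenarios: list[str],
                              screen_under_development: str) -> None:
    for scenario in prioritized_scenarios:                   # S819: highest priority first
        result = run_scenario_until(scenario, screen_under_development)
        if not result.passed:                                # S820: the test failed
            print("error:", result.error_info)               # S822: output error information
            return                                           # terminate execution
        # S821: if an unexecuted test scenario remains, continue with the next one
```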
In the test scenario execution process S800 described above, when the pattern of screen transition of a certain test scenario 154 overlaps the whole or part of the pattern of screen transition of another test scenario 154, the testing support apparatus 100 may be configured to properly select the test scenarios 154 to be executed so as not to execute the overlapping patterns, for improvement in the efficiency of testing. In
The testing support apparatus 100 first acquires a source code that defines a screen (hereinafter, referred to as a screen defining script file) from the stored source codes 156 (source files). For example, the testing support apparatus 100 acquires a screen defining script file that describes a pair of the screen name and a source file name (an HTML file name, an HTML file ID, a script file name, or a script ID, for example) from the source codes 156 (source files) (S911).
The testing support apparatus 100 then acquires a source file name associated with the screen name from the screen defining script file acquired in S911 (S912) and stores the pair of screen name and source file name as the screen/source code correspondence information 151 (S913).
Back to
Back to
The testing support apparatus 100 selects a script ID not selected yet from the script IDs listed in S915.
The testing support apparatus 100 searches the stored source codes 156 (source files) for a script file including the selected script ID and acquires the obtained name (script file name) of the script file (S917). The testing support apparatus 100 stores the pair of the acquired script file name and the screen name described in the screen defining script file as the screen/source code correspondence information 151 (S918).
Back to
In S920, the testing support apparatus 100 determines whether there is a screen defining script file not acquired in S911 among the stored source files 214 (S920). When the determination is true (YES in S920), the process returns to S911, and the testing support apparatus 100 acquires another screen defining script file and performs the same process for the newly acquired screen defining script file. When the determination is false (NO in S920), the process proceeds to S812 of
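The flow of S911 to S920 could be sketched as below; the format of the screen defining script file and the regular expressions used to extract the screen name, HTML file names, and script IDs are assumptions made purely for illustration.

```python
# Rough sketch of the screen/source code correspondence information generation
# process (S911 to S920). The file format and extraction rules are assumptions.
import re
from collections import defaultdict

def build_correspondence(source_files: dict[str, str]) -> dict[str, set[str]]:
    """source_files maps each source file name to its text content."""
    correspondence: dict[str, set[str]] = defaultdict(set)
    for file_name, text in source_files.items():
        m = re.search(r'screenName\s*[:=]\s*"([^"]+)"', text)
        if m is None:
            continue                                   # not a screen defining script file
        screen_name = m.group(1)                       # S911
        correspondence[screen_name].add(file_name)
        # S912/S913: source file names described together with the screen name
        for html_name in re.findall(r'"([\w.]+\.html)"', text):
            correspondence[screen_name].add(html_name)
        # S915 to S918: script IDs listed in the screen defining script file,
        # resolved to the script files that contain them
        for script_id in re.findall(r'scriptId\s*[:=]\s*"([^"]+)"', text):
            for other_name, other_text in source_files.items():
                if other_name.endswith(".js") and script_id in other_text:
                    correspondence[screen_name].add(other_name)
        # S920: repeat for every screen defining script file
    return dict(correspondence)
```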
As illustrated in
A) DISPLAY top.html
B) INPUT USER NAME (id=“hitachi”)
C) INPUT PASSWORD (pass=“pass”)
D) PRESS LOGIN (id=“login”) BUTTON
F) PRESS NEXT (id=“next”) BUTTON
G) “hitachi” IS DISPLAYED IN LOGIN NAME
According to this test scenario 154, after Top screen is tested, the screen transition from Top screen to Confirmation screen is performed. Confirmation screen is then tested, and screen transition from Confirmation screen to Portal screen is performed. Finally, Portal screen is tested. As illustrated in this example, the description of the test scenario 154 does not include information representing the locations partitioning the screens. It is therefore impossible to directly specify the lines at which testing of a certain screen starts and ends.
Hereinafter, a description is given of the test scenario dividing information generation process S812 with reference to
The testing support apparatus 100 then specifies, in the acquired test scenario 154, the description concerning the screen which is to be tested first when the acquired test scenario 154 is executed (S1112). When the test scenario 154 is the test scenario 154 illustrated in
Subsequently, the testing support apparatus 100 determines whether the subsequent lines include description causing screen transition (S1113). When the determination is true (YES in S1113), the process proceeds to S1114. When the determination is false (NO in S1113), the process proceeds to S1116.
For example, when the test scenario 154 is the test scenario illustrated in
In S1114, the testing support apparatus 100 divides the description of the test scenario 154 into blocks on the basis of screens and stores the results of division as the test scenario dividing information 152.
In S1115, the testing support apparatus 100 determines whether there is a subsequent line in the acquired test scenario 154. When the determination is true (YES in S1115), the process returns to S1113, and the testing support apparatus 100 performs the process following S1113 for the subsequent line. When the determination is false (NO in S1115), the process proceeds to S1116.
In the test scenario 154 illustrated by way of example in
In S1116, the testing support apparatus 100 determines whether there is a test scenario 154 not acquired. When the determination is true (YES in S1116), the process returns to S1111. When the determination is false (NO in S1116), the process proceeds to S813 of
By the test scenario dividing information generation process S812, the test scenario 154 illustrated in
In the above description, the test scenario 154 is divided with reference to the description concerning buttons as the screen components but may be divided with reference to another kind of description of link components or the like including characters and images.
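A minimal sketch of this division is shown below, under the assumption that a line causing screen transition can be recognized by a keyword pattern such as "PRESS ... BUTTON"; as noted above, the actual embodiment may also rely on other screen components such as links.

```python
# Sketch of dividing a test scenario into blocks in terms of screens (S1113/S1114).
# The transition-detection rule (a line pressing a button) is an assumption.
def divide_into_blocks(scenario_lines: list[str]) -> list[list[str]]:
    blocks: list[list[str]] = []
    current: list[str] = []
    for line in scenario_lines:
        current.append(line)
        if "PRESS" in line and "BUTTON" in line:   # description causing screen transition
            blocks.append(current)                 # close the block for the current screen
            current = []
    if current:
        blocks.append(current)                     # block for the last screen tested
    return blocks

scenario = [
    'A) DISPLAY top.html',
    'B) INPUT USER NAME (id="hitachi")',
    'C) INPUT PASSWORD (pass="pass")',
    'D) PRESS LOGIN (id="login") BUTTON',
    'F) PRESS NEXT (id="next") BUTTON',
    'G) "hitachi" IS DISPLAYED IN LOGIN NAME',
]
for block in divide_into_blocks(scenario):
    print(block)   # one block per screen: Top screen, Confirmation screen, Portal screen
```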
As described in detail hereinabove, the testing support apparatus 100 of the embodiment generates the test scenario dividing information 152 by dividing the description of each test scenario 154 into plural blocks in terms of screens. The testing support apparatus 100 is thereby enabled to know the description of each test scenario 154 on a screen basis and to specify the part of the description corresponding to the screen under development in the test scenario 154. The testing support apparatus 100 is therefore enabled to determine which test scenario 154 needs to be preferentially executed for the purpose of acquiring the result of testing of the screen under development early, or which test scenario 154 needs to be preferentially executed for the purpose of acquiring information concerning failure of screens under development. By prioritizing the plural test scenarios in accordance with the result of determination, the testing support apparatus 100 is enabled to provide the information required by the user efficiently and quickly.
When the user intends to acquire the result of testing of screen B as the screen under development and to prioritize the four test scenarios 154 illustrated in
Hereinabove, the present invention made by the inventor has been specifically described based on the embodiment. However, it is certain that the present invention is not limited to the aforementioned embodiment and can be variously changed without departing from the scope of the invention. For example, the embodiment is described in detail for easy understanding of the present invention and is not necessarily limited to a system including all of the components described above. Moreover, the components of the embodiment can be removed or replaced, or other components can be added.
The present invention is applicable to various purposes including the purpose of quickly displaying a particular screen at reviewing screens displayed by software, for example.
Moreover, a part or all of each component, function, processing part, processing unit, and the like described above may be implemented by hardware and, for example, may be designed with integrated circuits or the like. Alternatively, each component, function, and the like described above may be implemented by software in such a manner that the processor interprets and executes a program implementing each function thereof. The information of programs, tables, and files implementing the functions and the like can be placed in a recording device such as a memory, a hard disk, a solid state drive (SSD) or a recording medium such as an IC card, an SD card, or a DVD.
The drawings described above illustrate the control and information lines which are thought to be necessary for explanation and do not illustrate all of the control and information lines necessary for implementation. It can be considered that almost all of the components are coupled to each other.