This invention relates to a method of delegating and automating at least part of documentation review to software.
After a software product has been on the market for several years, a review of the current documentation often reveals that the documentation contains errors when compared to the original software. Documentation review is a manual process that relies on the attention and competence of the reviewers. There are currently a few tools that help the reviewers, but such tools are nothing more than collaboration software that allows simultaneous review of an electronic document. In other words, each reviewer can simultaneously review the document, add their respective comments, see the other reviewers' comments, and, where necessary, discuss those comments. There is nothing available that actually frees the reviewer from having to manually check the code samples, command line options, and compatibility/support tables.
The shortcomings of the prior art are overcome and additional advantages are provided through the provision of a method of delegating and automating at least part of the documentation review of software. Embodiments of the present invention propose a tool that automatically identifies errors in software documentation when the documentation is compared to the original software, which is running and assumed to be error-free. In embodiments of the invention, the documentation is split into logical parts that can be associated with specific use cases and automatically validated by software that is able to read the documentation, extract the proper information, code/build the corresponding test application, run the test application, and check the result.
Embodiments of the invention propose a computer-implemented method of software product documentation review that involves, for example, importing a description of the structure of the product documentation for use in determining locations in the product documentation from which to extract code and/or command portions of the product documentation for testing. The method also involves providing pre-defined test case stub files for the product documentation, which test case stub files have tags indicative of locations for insertion of extracted portions of the product documentation for testing.
According to embodiments of the invention, the code and/or command portions are extracted from the product documentation at locations based on the documentation structure description by a parser/validator component and inserted into the test case stub files at locations indicated by the tags. The test case stub files are run with the code and/or command portions inserted, and a validating test report for the product documentation is generated if the test case stub files with the code and/or command portions inserted are runnable. On the other hand, an error test report is generated if the test case stub files with the code and/or command portions inserted are not runnable.
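As a minimal sketch of this flow, and purely for illustration (the function names, the @EXTRACTED@ tag convention and the use of Python below are assumptions, since the invention does not prescribe a particular language or API), the extraction, insertion, execution and reporting steps could be tied together as follows:

import subprocess

def review_documentation(doc_text, structure_description, stub_files, tag="@EXTRACTED@"):
    """Run one pass of automated documentation review.

    structure_description maps a use-case name to a callable that extracts the
    relevant code/command portion from the documentation text; stub_files maps
    the same use-case name to the text of a pre-defined test case stub.
    """
    report = []
    for use_case, extract in structure_description.items():
        portion = extract(doc_text)
        if portion is None:
            report.append((use_case, "error", "expected portion not found in documentation"))
            continue
        # Insert the extracted portion at the location marked by the tag in the stub.
        test_case = stub_files[use_case].replace(tag, portion)
        result = subprocess.run(["python3", "-c", test_case], capture_output=True, text=True)
        report.append((use_case,
                       "validated" if result.returncode == 0 else "error",
                       result.stderr.strip()))
    return report

A validating entry is produced when the filled-in stub runs cleanly; otherwise an error entry records the failure, corresponding to the two kinds of report described above.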
Additional features and advantages are realized through the techniques of the present invention. Other embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with advantages and features, refer to the description and to the drawings.
As a result of the summarized invention, technically we have achieved a solution for implementing a method of software product documentation review that involves checking whether code samples are correct and buildable/runnable by extracting the code samples from the documentation, placing them into a previously defined test case stub, and executing the test cases with an automated test framework.
The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other objects, features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains the preferred embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
According to the method for embodiments of the invention, the documentation is split into logical sections according to the specific software use cases. In particular, every use case maps to a documentation part. Each use case has an associated XML schema and/or grammar rules for the defined features, including a description of how to read/parse the documentation (e.g., how to recognize that a line represents a command and how to determine which of its options are optional and which are mandatory). Code samples are also marked in the documentation and associated with a specific use case.
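Purely as an illustration of such a grammar rule (the square-bracket convention for optional options, the function name, and the 'installer' command below are assumptions, not part of the invention), a parser could recognize a documented command line and separate mandatory options from optional ones as follows:

import re

# Hypothetical grammar rule: a documented command line starts with the command
# name, options in square brackets are optional, everything else is mandatory.
COMMAND_LINE = re.compile(r"^(?P<command>\S+)\s+(?P<options>.+)$")

def parse_documented_command(line):
    """Split a documentation line into command name, mandatory options and optional options."""
    match = COMMAND_LINE.match(line.strip())
    if match is None:
        return None  # the line does not represent a command
    tokens = match.group("options").split()
    mandatory = [t for t in tokens if not t.startswith("[")]
    optional = [t.strip("[]") for t in tokens if t.startswith("[")]
    return {"command": match.group("command"), "mandatory": mandatory, "optional": optional}

# Example: one mandatory option (--target <path>) and one optional option (--silent).
print(parse_documented_command("installer --target <path> [--silent]"))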
The process of automated documentation validation for embodiments of the invention consists of checking various matters for correctness. For example, the automated documentation validation process involves checking that the documentation contains all of the use cases and that they are described correctly according to the defined grammar rules and XML schemas.
The process for embodiments of the invention also involves checking whether code samples are correct and buildable/runnable. This is done by extracting the code samples from the documentation and placing them into a previously defined test case stub. Test cases constructed in that way are executed by an automated test framework (i.e., a component verification test suite).
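As a sketch of the extraction step, assuming purely for illustration that code samples are marked in the documentation with BEGIN/END comment markers carrying the use-case name (the marker syntax is not prescribed by the invention), the marked samples could be collected as follows:

import re

# Hypothetical markers around code samples; the real markup depends on the documentation format.
SAMPLE = re.compile(
    r"<!-- BEGIN SAMPLE: (?P<use_case>[^>]+?) -->\s*\n(?P<code>.*?)\n\s*<!-- END SAMPLE -->",
    re.DOTALL)

def extract_code_samples(doc_text):
    """Return a mapping from use-case name to the code sample marked for that use case."""
    return {m.group("use_case").strip(): m.group("code") for m in SAMPLE.finditer(doc_text)}

# Example with a single marked sample.
doc_text = "<!-- BEGIN SAMPLE: Income Report Generation -->\nreport = build_income_report()\n<!-- END SAMPLE -->"
print(extract_code_samples(doc_text))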
Further, the automated documentation validation process for embodiments of the invention involves checking whether the command line commands described in the documentation work correctly and whether all of the expected options are listed and work as designed. This is done in a similar way as for the code samples, the only difference being the implementation of the test case stubs.
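One way such a command-oriented stub could operate, sketched here with a hypothetical 'installer' command and on the assumption that the product prints its options when invoked with --help, is to compare the documented options against the real command's help output:

import subprocess

def check_documented_options(command, documented_options):
    """Run the real command's help output and report documented options it does not list."""
    result = subprocess.run([command, "--help"], capture_output=True, text=True)
    help_text = result.stdout + result.stderr
    return [option for option in documented_options if option not in help_text]

# Hypothetical use: the documentation for "installer" lists these two options.
missing = check_documented_options("installer", ["--target", "--silent"])
print("options missing from the real command:", missing or "none")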
In addition, the process for embodiments of the invention involves checking the correctness of syntactical/compatibility tables (e.g., tables showing the platform dependencies of commands or the relationships between commands). This is done by parsing the table, building the proper column/row matches, and executing the corresponding commands.
For example, in the first row of the table, the ‘Attribute Name’ column 102 contains ‘Product’, and an ‘X’ appears in the column for the platform ‘AIX’; therefore, when running the specific command on AIX, we should expect an attribute called ‘Product’ in the output. If the attribute called ‘Product’ is missing from the output, it is a bug. The opposite is true if running, for example, on Linux (S/390), because the ‘X’ is absent from the intersection of the appropriate row and column.
More specifically, after parsing the table and constructing an XML schema for each platform, it is possible to validate the output (converted to XML where necessary) for a particular machine. If the validation fails, it is possible to detect mismatches between the documentation and the software output. Assuming that the software is stable and sufficiently tested, those mismatches can be considered documentation problems.
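A minimal sketch of this check follows, using a hypothetical two-attribute table and assuming that the command prints its attributes as 'name: value' lines; a real implementation would derive the per-platform expectations by parsing the documented table (or the per-platform XML schemas) rather than hard-coding them:

# Hypothetical parsed compatibility table: attribute name -> platforms whose column holds an 'X'.
TABLE = {
    "Product": {"AIX", "Windows"},
    "Service Pack": {"Windows"},
}

def check_table_against_output(command_output, current_platform):
    """Compare the attributes in the command output with the documented table for one platform."""
    found = {line.split(":", 1)[0].strip() for line in command_output.splitlines() if ":" in line}
    problems = []
    for attribute, platforms in TABLE.items():
        documented = current_platform in platforms
        present = attribute in found
        if documented and not present:
            problems.append(f"'{attribute}' is documented for {current_platform} but missing from the output")
        if present and not documented:
            problems.append(f"'{attribute}' appears in the output but is not documented for {current_platform}")
    return problems

# On Linux (S/390) the 'X' is absent for 'Product', so its presence in the output is reported as a bug.
print(check_table_against_output("Product: Sample Application", "Linux (S/390)"))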
Embodiments of the invention involve, for example, writing test case stub files (i.e., data files or programs that stand in for the original file), which are pieces of code that will execute on the real software, and then taking the substance extracted from the documentation and testing it on the real software.
Thus, according to embodiments of the invention, code samples are extracted from the documentation and moved into a test case stub file, which is executed later via a test framework on the real software and produces an output that is correct or incorrect, depending on whether or not the code samples are error-free. If the test case fails, it means that the documentation contained an error in the piece of code extracted from it for that test case.
The validation report 210 is an outcome of comparing the documentation 202 to the documentation structure description 204. Thus, for example, if the chapters do not match what is expected (e.g., a chapter is missing), the documentation 202 is structurally broken, and this is immediately reported in the validation report 210.
The test case stub file 212 is also important to the documentation review process. The data extracted from the documentation are simply inserted into the test case stub file 212, as explained in the foregoing example, and thereafter the stub file 212 becomes a test case object 206 that can be executed by the automatic test framework 208. The result of the execution of the test is the test report 214, which indicates whether a particular test case has passed or failed.
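A sketch of how a filled-in stub could be turned into an executable test case and summarized as a test report entry (assuming, purely for illustration, that the stubs are Python scripts and that a return code of zero means success) is:

import pathlib
import subprocess
import tempfile

def execute_test_case(test_case_source):
    """Write the filled-in stub to a temporary file, run it, and return a test report entry."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as handle:
        handle.write(test_case_source)
        path = pathlib.Path(handle.name)
    result = subprocess.run(["python3", str(path)], capture_output=True, text=True)
    return {
        "test_case": path.name,
        "status": "passed" if result.returncode == 0 else "failed",
        "details": result.stderr.strip(),
    }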
For an example of the validation process for embodiments of the invention utilizing a "Sample Application", with reference to the accompanying drawings, consider product documentation 202 having the following structure:
Chapter 1: Command Line Installation
    Default Target Path Installation
    Custom Target Path Installation
Chapter 2: Custom Functions
    Income Report Generation
    Example Usage of ReportGeneration Class:
The documentation structure description file 204 is a file which describes the structural model with which the product documentation 202 must comply, and which is used to find the proper places from which to extract information from the real documentation for test purposes.
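Expressed as data (one possible encoding chosen only for illustration; the invention equally allows an XML description), a structure description for the Sample Application outline above could note, for each chapter and section, whether a command or a code sample is to be extracted:

# Hypothetical structure description for the "Sample Application" documentation above.
STRUCTURE_DESCRIPTION = {
    "Chapter 1: Command Line Installation": {
        "Default Target Path Installation": "command",
        "Custom Target Path Installation": "command",
    },
    "Chapter 2: Custom Functions": {
        "Income Report Generation": None,
        "Example Usage of ReportGeneration Class": "code_sample",
    },
}

def find_structural_errors(documentation_headings):
    """List chapters and sections required by the structure description but absent from the documentation."""
    missing = []
    for chapter, sections in STRUCTURE_DESCRIPTION.items():
        if chapter not in documentation_headings:
            missing.append(chapter)
        missing.extend(section for section in sections if section not in documentation_headings)
    return missing

Any entry returned by such a check would appear in the validation report 210 described above.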
The test case stub file 212 indicates the location where the code sample that was extracted from the documentation 202 can be inserted, and after insertion at the proper location in the stub file 212, in the example, the "Income Reports" test can be executed on the real software for test purposes.
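A hypothetical stub for the "Income Reports" test could look like the following, where an assumed @CODE_SAMPLE@ tag marks the insertion point for the ReportGeneration example extracted from Chapter 2 (the tag name, the 'report' variable and the assertion are illustrative only):

# Hypothetical test case stub; the tag is replaced by the example extracted from the documentation.
INCOME_REPORTS_STUB = """\
# -- test case stub: Income Reports --
@CODE_SAMPLE@
# After the documented example has run, the report object it is assumed to create must exist.
assert report is not None, "ReportGeneration example from the documentation did not produce a report"
"""

def fill_income_reports_stub(extracted_code_sample):
    """Insert the extracted code sample at the location indicated by the tag."""
    return INCOME_REPORTS_STUB.replace("@CODE_SAMPLE@", extracted_code_sample)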
Also referring to the drawings, additional test case stubs 212 can likewise be filled with the extracted data.
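As a purely hypothetical illustration of what such a filled stub could look like (the 'installer' command and the target path are assumed, not taken from the actual product), a command-oriented stub for the "Custom Target Path Installation" use case, with the documented command inserted in place of its tag, might read:

import subprocess

# Hypothetical filled stub: the command line below occupies the position of the insertion tag
# and is assumed to have been extracted from Chapter 1 of the documentation.
EXTRACTED_COMMAND = ["installer", "--target", "/opt/sample_application"]

def test_custom_target_path_installation():
    """Run the documented installation command on the real software and check that it succeeds."""
    result = subprocess.run(EXTRACTED_COMMAND, capture_output=True, text=True)
    assert result.returncode == 0, f"documented installation command failed: {result.stderr}"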
Automatically testing, according to embodiments of the invention, that examples (e.g., command line commands with parameters or code snippets) in product documentation are correct, and/or that the product documentation complies with a defined product documentation structure, applies primarily to textual documentation samples, targets mainly command-line oriented products, and is most valuable and suitable for example-rich documentation.
The flow diagrams depicted herein are just examples. There may be many variations to these diagrams or the steps (or operations) described therein without departing from the spirit of the invention. For instance, the steps may be performed in a differing order, or steps may be added, deleted or modified. All of these variations are considered a part of the claimed invention.
While the preferred embodiment of the invention has been described, it will be understood that those skilled in the art, both now and in the future, may make various improvements and enhancements which fall within the scope of the claims which follow. These claims should be construed to maintain the proper protection for the invention first described.