The present invention describes a system, method, and apparatus for converting business processes into test-centric activity diagrams in order to computationally generate automated test suites for various quality attributes.
Automated test scenario generation has long been a challenge that the software testing industry has been looking to solve. It has been shown that over 30% of the effort in a typical software test life cycle is spent authoring and maintaining test cases. Reducing this effort has a significant impact on the overall cost of a project and on resource optimization.
Classical graph theory is re-purposed to derive test sequences (paths) from the diagram, and additional in-house methods are used to generate further test cases.
U.S. Pat. No. 8,479,164 titled “Automated Test execution plan generation” describes a method to automatically generate test execution plans. Using this test execution plan generation tool, a set of user-configured testing parameters for a software application under test can be obtained. Using the user-configured test parameters and a predefined test execution plan data model, a test execution plan can be automatically generated. This tool consists of a computer program product stored on a physical medium, where the plurality of user-configured testing parameters correlates with at least one of the items contained in a predefined test execution plan data model associated with this tool.
U.S. Pat. No. 6,378,088 titled “Automated test generator” describes a process in which the test generator generates tests by randomly traversing a description of the interface of the program being tested. The test generator is executed by a computer; it represents the interface of the application program as a graph and automatically generates a test that exercises the application program. The generated tests contain randomly selected actions and randomly generated data; when executed, they randomly manipulate the program being tested.
U.S. Pat. No. 7,865,780 titled “Method for test case generation” describes a system that provides randomly generated test cases for a set of interfaces of a piece of software. The method comprises a random test case number generator and a test case generator comprising a parameter value generator. The parameter value generator assigns the parameter value for each interface based on the test case number. The method involves initializing the test case generator with parameter arrays with cardinality and a prime number for each individual parameter of each of the set of interfaces.
EP1926021 titled “Software test case generation” describes an apparatus, a computer program, and a method for test case generation. The apparatus consists of a parser module, which parses a source code into a code model, and an analysis module, which utilizes the code model to determine the possible execution paths. The system also consists of a user-interface module to visualize the possible execution paths in the source code and to allow the user to select an execution path. A generation module is configured to generate a test case for testing the selected execution path. These modules are configured to execute as a computer program.
Path-based test conditions are well known in the art and refer to the method of testing where every possible path that the program could take through the course of its execution is tested. Test Cases in path-based coverage are prepared based on a logical complexity measure. Test sequences or cases can also be generated by expert systems, with the goal of increased automation: by summarizing domain knowledge derived from a manual generation of test cases, the Test Cases and methods are encapsulated in the expert system that is used. (An expert system approach for generating test sequence for CTCS-3 train control system, Zhang Yong et al., Fourth International Conference on Intelligent Control and Information Processing (ICICIP), 2013, IEEE.) Error-based testing refers to using simple programmer error models and focus-directed methods for detecting the effects of errors. (Error-Based Software Testing and Analysis, Howden, 35th Annual Computer Software and Applications Conference Workshops (COMPSACW), 2011, IEEE.) Execution-based testing refers to a method of inferring certain behavioral properties of the product based, in part, on the results of executing the software (product) in a known environment with selected inputs.
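The logical complexity measure referred to above is commonly McCabe's cyclomatic complexity, which bounds the number of linearly independent paths through a control-flow graph. A minimal sketch (the function name and the example graph sizes are illustrative, not drawn from any cited patent):

```python
# Cyclomatic complexity V(G) = E - N + 2P for a control-flow graph with
# E edges, N nodes, and P connected components. It is an upper bound on
# the number of linearly independent paths a path-based suite must cover.

def cyclomatic_complexity(num_edges: int, num_nodes: int, num_components: int = 1) -> int:
    return num_edges - num_nodes + 2 * num_components

# A flow with a single if/else (start, decision, two branches, join, end):
# 6 nodes and 6 edges give 2 independent paths to cover.
paths_needed = cyclomatic_complexity(6, 6)
```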
The present invention discloses a system, a computer-implemented method, and an apparatus for the generation of automated, hybrid test suites for one or more Business Processes to measure one or more quality attributes of a system under test. The Business Processes have tags associated with abstract computing steps; the tags denote quality attributes which the testing must achieve, including Usability, Database Response, Non-mandatory Fields, Mandatory Fields, Network Failure, Popup Blocker, Multiple Iterations, etc., that are applied to the Business Process. The system and computer-implemented method of the present invention have a Processing module, a Parsing module, an Analysis module, a Test Generator, a User-interface, one or more Business Processes with tags, and one or more Test Data Models. The Processing module comprises a Configurator and a Transformer; the Processing module takes in one or more Business Processes with tags via the User-interface and converts them into a Test-Centric Activity Diagram (TCAD). The Parsing module traverses the TCAD to identify one or more types of Nodes and corresponding Edges and to generate one or more Lists that annotate the TCAD for the Analysis module. The Analysis module comprises a Path Traverser and a Custom Traverser; it generates one or more Test Scenarios by representing the various paths through the Business Process under test, in addition to Test Scenarios generated using other tests including Exception-Based, Event-Based, and Expert-System Based tests. The Test Generator takes one or more inputs from storage containing Test Data Models, together with the Test Scenarios generated by the Analysis module, to arrive at an intermediate set of Test Condition Lists, from which automated, hybrid Test Suites including Test Cases, Test Data Placeholders, and Test Scripts are finally generated.
One or more Wireframes are used along with one or more Business Processes with tags as input to the Processing module. A Wireframe is a blueprint of the process, along with a Test Data Model, that is used to generate appropriate Test Suites at the Test Generator. The Configurator combines the Business Processes having tags with the Wireframes and passes this on to the Transformer. The Parsing module traverses the TCAD to identify one or more types of Nodes and corresponding Edges and to generate one or more Lists that annotate the TCAD with Action, Pair, and Decision Lists. A Node can be an Action Node that carries out a specific function, a Fork and Join Node that depicts the existence of Concurrent Test Conditions, or a Decision Node where a condition is tested to decide the path of the Business Process. The Parsing module detects the Node ID and the Incoming and Outgoing connections for the Action, Fork and Join, and Decision Nodes, generating Edges and Lists while parsing. The Action Nodes alone go through an additional check for the presence of tags in order to create tagged Action Objects. An Action List is an array of interconnected actions that provides all possible ways of connecting to each action and also gives the List of Incoming and Outgoing actions. A Pair List is an array of interconnected pairs that provides all possible ways of connecting to each pair and also gives the List of Incoming and Outgoing pairs. A Decision List is an array of interconnected decisions that provides all possible ways of connecting to each decision and also gives the List of Incoming and Outgoing decisions.
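By way of illustration only, the parsing step described above could be sketched as follows; the data layout (node dictionaries, edge pairs) and all identifiers are assumptions for the sketch, not the claimed implementation:

```python
from collections import defaultdict

def parse_tcad(nodes, edges):
    """nodes: {node_id: {"type": "action" | "decision" | "fork_join", "tags": [...]}};
    edges: list of (source_id, target_id) pairs."""
    incoming, outgoing = defaultdict(list), defaultdict(list)
    for src, dst in edges:
        outgoing[src].append(dst)
        incoming[dst].append(src)
    action_list, decision_list, pair_list = [], [], list(edges)
    for node_id, info in nodes.items():
        # Each entry records the Node ID with its Incoming/Outgoing connections.
        entry = {"id": node_id, "in": incoming[node_id], "out": outgoing[node_id]}
        if info["type"] == "action":
            # Action Nodes alone are checked for tags, yielding tagged Action Objects.
            entry["tags"] = info.get("tags", [])
            action_list.append(entry)
        elif info["type"] == "decision":
            decision_list.append(entry)
    return action_list, decision_list, pair_list

# Illustrative fragment: a tagged Action Node feeding a Decision Node.
actions, decisions, pairs = parse_tcad(
    {201: {"type": "action", "tags": ["usability"]}, 503: {"type": "decision"}},
    [(201, 503)],
)
```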
The present invention proposes a system, a computer-implemented method, and an apparatus for the examples of Ordering, Agent Sales, Sales Return from Customer, and Sales Return for Vendor.
The present invention is applied to the Business Process of Ordering, represented by the abstract steps of checking for the presence of obsolete accounts, checking for the balance in the account, drawing funds, checking if there are sufficient funds and, if there are, accepting the order, sending it to a queue for processing, triggering an acknowledgment to the customer, and ending the process. If there is a shortage of funds, the steps are retrying the withdrawal of funds, gathering data, if the retry works, on the count of retry operations executed, checking for reaching a nominal failure point, and verifying a valid user. If retrying the withdrawal is not successful or the user is not valid, the steps are rejecting the order, rolling back order processing, triggering a notification, and ending the process. The Ordering process is assigned a plurality of tags to indicate the quality attributes being tested here, including ‘usability’, the presence of invalid users, and ‘fault injection’.
The Ordering process has a TCAD generated for it, annotated after passing through the Parsing module and Analysis module, including inputs from the Test Data Models. Checking for the balance in the account is based on detailed Test Steps that verify the card number, the expiry date, and the CVV. Drawing funds is based on detailed Test Steps that verify the correctness of obtained card details, the availability of sufficient funds, and the response code, to ensure that a user has sufficient funds. Accepting the order is based on detailed Test Steps that verify the order number, item code, item quantity, details about coupons applied, transaction reference number, total amount, and transactional amount. Sending the order to the queue is based on detailed Test Steps that verify the order number, shipment tracking number, shipment address, and transaction details. Triggering an acknowledgment to the customer, simultaneously, is based on detailed Test Steps that verify the generated acknowledgment number, invoice number, and ordered item and quantity. Retrying the withdrawal of funds, which performs a ‘Usability’ check, is based on detailed Test Steps that verify the order number and card details. Rejecting the order is based on detailed Test Steps that verify the order number, reason for rejection, transaction reference number, and updating of the order status. Rolling back order processing is based on detailed Test Steps that verify the order details and the order status, and verify that the order is not placed in such cases. Triggering a notification to the customer is based on detailed Test Steps that verify the order number, transaction reference number, order status, reason for rejection, mail or mobile number, and user details. The Ordering process has Decision, Action, and Pair Lists generated after going through the Parsing module.
The Ordering business process has hybrid, automated Test Suites generated including Test Cases, Test Data Placeholders for assigning values to the fields of order number, card number, expiry date, CVV number, item code, item quantity, coupon details, transaction amount, transaction reference number, shipment tracking number, shipment to address, shipment from address, acknowledgment number of the order, e-mail, invoice number, mobile number, order status, and reason for rejection, and Test Scripts.
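A minimal sketch of what such Test Data Placeholders might look like once generated; the field names follow the description above, while the `<...>` placeholder syntax and the `bind` helper are illustrative assumptions:

```python
# Test Data Placeholders for the Ordering process (a subset of the fields
# listed above). Each "<...>" slot is later bound to a concrete value from
# a Test Data Model row; unbound fields keep their placeholder.

ordering_placeholders = {
    "order_number": "<ORDER_NUMBER>",
    "card_number": "<CARD_NUMBER>",
    "expiry_date": "<EXPIRY_DATE>",
    "cvv_number": "<CVV_NUMBER>",
    "item_code": "<ITEM_CODE>",
    "item_quantity": "<ITEM_QUANTITY>",
}

def bind(placeholders, row):
    """Fill placeholders with concrete values from one Test Data Model row."""
    return {field: row.get(field, slot) for field, slot in placeholders.items()}

# Illustrative binding with a partial data row.
bound = bind(ordering_placeholders, {"order_number": "ORD-1001", "item_quantity": 2})
```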
The present invention is applied to the Business Process of Agent Sales represented by the abstract steps of (i) Creating an expected sales order, (ii) Confirming and saving the sales order created, (iii) Approving the sales order, (iv) Receiving sales order as letter of credit (L/C), (v) Creating or updating L/C in sales order, (vi) If L/C not updated then proceeding to step xvi, (vii) Checking the credit limit, if exceeding proceeding to step xvi, (viii) Creating delivery if the credit limit has not exceeded, (ix) Saving the delivery number, (x) Creating transfer order and confirming, generating a pick list, (xi) Issuing of post goods (PGI) thus generating a delivery note and a packing list, (xii) Creating invoice, (xiii) Generating a commercial Invoice and printing, (xiv) Releasing for accounting, (xv) Checking accounting documents and ending process, and (xvi) Blocking delivery and ending process.
The Agent Sales process has a TCAD generated for it, annotated after passing through the Parsing module and Analysis module, including inputs from the Test Data Models. Creating an expected sales order is based on detailed Test Steps that verify the sales document type, sales organization, distribution channel, division, sold to party, ship to party, and stock material. Confirming and saving the sales order created is based on detailed Test Steps that verify the selling price to the customer, verify item availability, and validate receipt of the sales order details by letter of credit (L/C). Creating or updating a letter of credit (L/C) is based on detailed Test Steps that verify the letter of credit document type, sold to party, and ship to party, validate the financial document number after saving, and verify the financial document number. Creating delivery if the credit limit has not been exceeded is based on detailed Test Steps that verify the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data from the data destination. Saving the delivery number is based on detailed Test Steps that record the delivery number and validate the delivery number. Creating the transfer order and confirming it by generating a pick list is based on detailed Test Steps that verify the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data. Creating the invoice is based on detailed Test Steps that verify the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data.
The Agent Sales process has Decision, Action, and Pair Lists generated after going through the Parsing module. The Agent Sales has hybrid, automated Test Suites generated including Test Cases, Test Data Placeholders for assigning values to the fields of sales document type, sales organization, distribution channel, division, sold to party, ship to party, material, sales price, sales order number, shipping point, delivery date, warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, from, destination, sales order, invoice number, accounting number, and financial year, and Test Scripts.
The present invention is applied to the Business Process of Sales Return from Customer, represented by the abstract steps of (i) Checking for a return from a customer, (ii) If returns are received from the customer, then creating a return order, (iii) Creating a post goods receipt, (iv) Creating an inspection lot, (v) Inspecting the quality of returned goods, (vi) Recording inspection results, (vii) Checking the quality of goods, (viii) Generating a usage decision if the goods quality is fine, (ix) Moving material to unrestricted stock, (x) Recording defect details if the quality of goods is not fine, (xi) Creating a usage decision, (xii) Moving material to block stock and scrap, and (xiii) Creating a manual inspection lot if the returns are not from a customer and continuing steps v through xii as required.
The Sales Return from Customer has a TCAD generated for it, annotated after passing through the Parsing module and Analysis module, including inputs from the Test Data Models. Creating the return order is based on detailed Test Steps that verify the order number, shipping point, and date, and validate the return delivery after saving. Creating the post goods receipt is based on detailed Test Steps that verify the T-code and the delivery number, save the transaction, and validate the transaction number. Creating an inspection lot is based on detailed Test Steps that verify the T-code, material number, plant, inspection lot number, inspection type, inspection lot quantity, start date, inspection end date, vendor, purchasing organization, and short text. Generating a usage decision if the goods quality is fine is based on detailed Test Steps that verify the T-code, inspection lot number, and usage decision (UD) code; creating a usage decision if the goods quality is not fine is based on detailed Test Steps that verify the T-code, inspection lot number, and UD code.
The Sales Return from Customer process has Decision, Action, and Pair Lists generated after going through the Parsing module. The Sales Return from Customer has hybrid, automated Test Suites generated including Test Cases, Test Data Placeholders for assigning values to the fields of order number, shipping point, date, delivery number, material number, plant, inspection lot number, inspection type, start date, inspection end date, vendor, purchasing organization, and UD code, and Test Scripts.
The present invention is applied to the Business Process of Sales Return for Vendor, represented by the abstract steps of (i) Creating a return purchase order (PO), (ii) Checking approval of the return PO, (iii) If not approved, either canceling or deleting the PO generated, (iv) Creating a return outbound delivery if the return PO is approved, (v) Creating a return post goods issue (PGI), (vi) Verifying the return PGI, (vii) Rectifying any error in the return PGI to proceed further, (viii) Creating a gate pass if the return PGI is correct, and (ix) Creating a credit memo (MIRO).
The Sales Return for Vendor has a TCAD generated for it, annotated after passing through the Parsing module and Analysis module, including inputs from the Test Data Models. Creating a return purchase order (PO) is based on detailed Test Steps that verify the T-code and the document type. Creating a return outbound delivery is based on detailed Test Steps that verify the T-code and the delivery number, and validate the return delivery number after saving. Creating a return post goods issue (PGI) is based on detailed Test Steps that verify the T-code and the delivery number, continuing if there is no error. Creating a gate pass if the return PGI is correct is based on detailed Test Steps that verify the customized T-code for creating a delivery gate pass, verify the delivery number, and verify the gate pass transaction number once saved. Creating a credit memo (MIRO) is based on detailed Test Steps that verify the T-code in MIRO, the return PO number, and the financial year, and validate the PO number in the reference field.
The Sales Return for Vendor process has Decision, Action, and Pair Lists generated after going through the Parsing module. The Sales Return for Vendor has hybrid, automated Test Suites generated including Test Cases, Test Data Placeholders for assigning values to the fields of sales document type, account assignment category, item category, material number, quantity, plant, storage location, purchasing group, purchase requisition number, purchase order, warehouse number, storage type, and storage bin, and Test Scripts.
A TCAD is a focused representation of the Business Process, created with the various tests that should be performed at the various logical points within the Business Process. For example, if there is a ‘Retry’ button within a Web page, a tag that could be associated with it is the tag of ‘Usability’, so that this aspect is tested when the Business Process is run through the testing phase. The TCAD 10 is therefore the cornerstone of the representation that this invention works with. The Parsing module 2 takes the TCAD 10 and traverses it to generate Nodes, Edges, and Lists. The Nodes within the TCAD 10 can be Decision Nodes, Fork and Join Nodes, or Action Nodes; these are the three types of Nodes within a TCAD 10. The Edges are the connectors between the Nodes. The Lists refer to certain attributes that are represented in List form, for example, the decisions that might affect the flow of the Business Process. The Decision, Action, and Pair Lists 11 that are generated by the Parsing module 2 are sent to the Analysis module 3, which has two abstract modules called the Path Traverser 12 and the Custom Traverser 13. The Path Traverser generates Test Scenarios for the coverage of the various logical paths through the Business Process which need to be tested. The output of the Analysis module is a plurality of Test Scenarios. For example, the Path Traverser 12 might generate two valid Test Scenarios for a TCAD: Test Scenario 1 (TS1), where the Nodes 1, 3, 5 need to be traversed, and Test Scenario 2 (TS2), where the Nodes 1, 3, 4, 5 need to be traversed, in order to cover the entire Business Process that is being executed. The Custom Traverser 13, on the other hand, takes inputs from storage holding Event-Based, Expert-System Based, and Exception-Based Test Conditions 14, which are used to construct further Test Scenarios based on the different types of testing methodologies that are known.
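The Path Traverser's behavior on this example can be sketched as a depth-first enumeration of simple paths; the edge set below is an assumed graph consistent with the TS1/TS2 example, not the patented algorithm itself:

```python
# Sketch: depth-first enumeration of all simple paths from the start node
# to the end node of a TCAD, each path yielding one Test Scenario.

def all_paths(graph, start, end, path=None):
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # simple paths only: never revisit a node
            paths.extend(all_paths(graph, nxt, end, path))
    return paths

# Assumed edge set matching the TS1/TS2 example: 1->3, 3->5, 3->4, 4->5.
tcad = {1: [3], 3: [5, 4], 4: [5]}
scenarios = all_paths(tcad, 1, 5)
# Yields TS1 = [1, 3, 5] and TS2 = [1, 3, 4, 5].
```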
The Test Generator 4 takes the Test Scenarios from the Analysis module 3 and converts them into Test Condition Lists. The Test Generator 4 further uses inputs from a Test Data Model storage 19 that may work in lieu of the Wireframes 7. Since the Wireframes 7 are not mandatory, the Test Data Model 15 acts as a second set of clues into how the tests might be generated for a given Business Process. The Test Generator 4 goes on to generate the Test Cases 16, Test Data Placeholders 17, and Test Scripts 18 that can be used to test the Business Process of interest, and these are stored in a local or a remote storage 19 in a plurality of formats including Excel, text files, etc.
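A minimal sketch of this Test Generator step, combining one Test Scenario with Test Data Model entries into a Test Condition List and then a Test Case; all function and field names here are illustrative assumptions, not the claimed implementation:

```python
# Sketch: turn a Test Scenario (an ordered list of node ids) plus per-node
# checks from a Test Data Model into a Test Condition List, then a Test Case.

def to_test_conditions(scenario, step_details):
    """scenario: ordered node ids; step_details: node id -> list of checks."""
    return [(node, check) for node in scenario for check in step_details.get(node, [])]

def to_test_case(case_id, conditions):
    """Render a Test Condition List as a simple Test Case record."""
    return {
        "id": case_id,
        "steps": [f"Verify {check} at node {node}" for node, check in conditions],
    }

# Illustrative per-node checks for the scenario covering nodes 1, 3, 5.
details = {1: ["card number", "expiry date"], 3: ["response code"], 5: ["order status"]}
tc = to_test_case("TC1", to_test_conditions([1, 3, 5], details))
```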
The process starts 200 with the verification of the card 201, in which details such as the debit or credit card number, the expiry date of the card, and the CVV number of the card 201a are verified. The expiry date is then validated against the current date by the system. If the expiry date is earlier than the current date, the account is marked as an obsolete account 216 by the system and the process terminates. The funds are drawn 202, during which the correctness of the obtained card details, the availability of sufficient funds, and the response code are verified 202a, and a decision is made 503 to ensure that the user has sufficient funds. On availability of a sufficient balance 203 in the user's account, the funds are drawn after verifying 503, and for any shortage of funds 204 the system allows a retry 205, which performs a ‘Usability’ check. The order number and card details 205a are also verified consecutively. While retrying 205, if the retry works 505, data is gathered 206 on the number of times the retry is done, to check the validity of the user 504. If there is a shortage of funds 204 or the user is invalid 216, the order gets rejected 210, during which the order number, reason for rejection, and transaction reference number are verified, and the order status is updated 210a. Further, the order processing is rolled back 211 by verifying the order details and order status, which also confirms that the order is not placed 211a in such cases. A notification is triggered to the client 212 after verifying the order details, transaction reference number, order status, reason for rejection, and mail or mobile number; the user details are also verified 212a.
On sufficient availability of funds 203, the order gets accepted 207 by validating the order number, item code, item quantity, details about coupons applied, transaction reference number, and total amount 207a. The transacted amount is verified against the total order value to proceed further. The order is sent to the queue 208 by verifying details such as the order number, shipment tracking number, shipment address, and transaction details 208a, in which the invoice number is also confirmed. Simultaneously, an acknowledgment is triggered to the customer 209 after validating the generated acknowledgment number, order number, email and mobile number, and transaction details 209a. Before ending the process 213, the invoice number and the ordered item and quantity are validated.
In this case, different types of Test Cases are generated, namely Exception-Based, Path-Based, Tag-Based, and Event-Based Test Cases. The ‘order processing’ scenario includes the Action types or Action Nodes ‘card verification’ 201, ‘draw funds’ 202, ‘accept order’ 207, ‘retry’ 205, ‘gather data’ 206, ‘reject order’ 210, and ‘end process’ 213. The tag ‘usability’ 215 is used for the ‘retry’ action 205, the tag ‘fault injection’ 214 is used for ‘accept order’ 207, and the tags ‘database’ and ‘invalid values’ 216 are used for the action ‘gather data’ 206. The Decision Nodes are 503, 504, and 505, the Fork Node is 506, the Join Node is 507, and all Edges are of Action type.
In the Decision Lists 220 are
In the Action Lists 221 are
In the Pair Lists 222 are
The Lists generated are Decision Lists D1, D2, D3, D4, Action Lists A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11, A12, A13, and Pair Lists P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14, P15. For the Business Process of Ordering, the invention generated 10 Test Cases, which include four Path-Based Test Cases, four Tag-Based Test Cases, one Exception-Based Test Case, and one Event-Based Test Case.
The process starts 550 by creating an expected sales order 551 that verifies the sales document type, sales organization, distribution channel, division, sold to party, ship to party, and stock material 551a. The sales order is confirmed and saved 552 by verifying the selling price for the customer and the availability of the ordered item, and validating the received sales order by Letter of Credit (L/C) 552a. If the item is available 552a, the sales order number is generated and said sales order is confirmed 552, saved, and approved 553. Once the sales order is approved 553, the credit limit is checked 554; if not exceeding the limit, the delivery is created 555 by verifying the warehouse number, movement type, material number, order quantity, unit of measure, plant or storage location, storage unit type, and movement data from destination data 555a. The delivery number is saved by clicking on the ‘Save’ button, and is further recorded and validated. The transfer order is generated and confirmed 556 for the delivery created by verifying the values for warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data 556a, during which a pick list is generated 557. Once the delivery has been issued, a post goods issue (PGI) 558 is created by generating a packing list 560 and a delivery note 559. Finally, a bill is created 561 as a commercial invoice 562 by verifying details such as the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data 561a. The generated commercial invoice 562 is saved by selecting the ‘Save’ button, and the invoice number is recorded in the system. The invoice is released for accounting, which verifies the transaction code (T-code) and the invoice number and clicks on a ‘Green Flag’.
The account documents are checked for T-code, accounting number, fiscal year and an ‘enter’ button is pressed for displaying the accounting document.
If the sales order details are received as a letter of credit (L/C) 563, the L/C is created or updated in the sales order 564 by verifying the letter of credit document type, sold to party, and ship to party; the financial document number is saved and recorded, and said financial document number is validated once recorded in the system 564a. If the credit limit has not been exceeded 554, the delivery is created 555, and the same process is followed after the creation of the delivery 555 as explained above. But if the credit limit is exceeded 554, or if the confirmed sales order created is not received as an L/C, then the sales order is blocked for delivery 565. If the Finance department releases the accounting 566 that was blocked for delivery, then the delivery is created 555; else the process ends 567.
Creating the expected sales order 551, confirming the sales order created 552, approving the sales order 553, updating the L/C in the sales order 564, blocking the sales order for delivery 565, and creating the delivery 555 are Activity Nodes. The Decision Nodes are verifying the letter of credit 563, checking the credit limit 554, and releasing the sales order 566. All Edges are Activity Edges except the four Object Edges O1, O2, O3, O4. The other Activity Nodes are creating the transfer order and confirming 556, post goods issue 558, and creating billing 561. The method resulted in the generation of 14 Test Cases, which include ten Path-Based Test Cases, three Error-Based Test Cases, and one Event-Based Test Case. P1, P2, P3, P4, P5, P6, P7, P8, P9, P10, P11, P12, P13, P14, P15, P16, P17, P18, P19, P20, P21, P22 are the Pair points annotated.
In the Decision List 568 are
In the Action List 569 are
In the Pair List 570 are
Further, creating the transfer order and confirming it by generating a pick list is based on detailed Test Steps 556a that verify the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data. Entering the T-code and the delivery number is based on a detailed Test Step that enters the delivery number and hits enter. Entering the associated pick quantity is based on a detailed Test Step that validates the picking quantity. Creating the invoice is based on detailed Test Steps 561a that verify the warehouse number, movement type, material number, quantity, unit of measure, plant or storage location, storage unit type, and movement data. Creating the commercial print invoice is based on detailed Test Steps that click on the ‘Save’ button to save the billing and note down the invoice number. Releasing to accounting is based on detailed Test Steps that verify the T-code and the invoice number and click on the ‘Green Flag’ to release the accounting. Checking the accounting documents is based on detailed Test Steps that verify the T-code, accounting number, and fiscal year, and hit enter to display the accounting document.
The activities involved are started 600 when there is a return from a customer 601, which is an event, and a return delivery is created 602 by verifying the order number, shipping point, and date, and validating the return delivery after recording it into the system 602a. Thus, the customer return 601 is a Decision Node, connected to the return delivery creation 602, which is an Activity Node, by an Activity Edge. A post goods receipt 603 is generated by verifying the T-code and delivery number and saving the transaction 603a. A ‘Post goods return control’ button, when activated, saves the transaction. The process owner for creating the return delivery 602, generating the post goods receipt 603, and generating the inspection lot 604 is a Sales Administrator. After PGI, the returned material is forwarded to a quality inspection 605, for which an inspection lot 604 is created by validating the T-code, material number, plant, inspection lot number, inspection type, quantity of the inspection lot, inspection start date and end date, vendor, purchasing organization, and short text 604a. The Edge connecting the inspection lot 604 Activity Node to the quality inspection 605 Merge Node is an Object Edge O5. If the materials are in good condition 608, a usage decision 610 is taken after verifying the T-code and inspection lot number, selecting the decision, verifying the generated usage decision code (UD code) 610a, and then saving through an inspection lot stock tab. The materials are then moved to an unrestricted stock 611. But if the materials are not in good condition 608, the defects are recorded 609, and then the usage decision 612 is made after validating the T-code, inspection lot number, selected decision, and usage decision code (UD code) 612a, and the goods are moved to block stock and scrapped 613.
The Nodes for return delivery creation 602, post goods receipt 603, creating the inspection lot 606, recording results 607, recording defects 609, the usage decisions 610, 612, material moved to unrestricted stock 611, and material moved to block stock and scrapped 613 are all Activity Nodes. The Decision Nodes are the return from the customer 601 and checking the condition of goods 608. The Merge Node is the quality inspection 605. All other Edges are Activity Edges except 04, which is an Object Edge. A Warehouse Clerk handles the material movement once the usage decisions 610, 612 are taken, in both the case of moving to block stock and scrapping and the case of moving to unrestricted stock.
If the return is not from the customer 601, an inspection lot is created manually 606 by verifying the T-code, material number, plant, inspection lot number, inspection type, inspection lot quantity, start date, inspection end date, vendor, and purchasing organization, followed by the quality inspection 605, and the results are recorded 607. Further, the process continues and, based on the quality of the received materials, said materials are moved either to unrestricted stock 611 or to block stock and scrapped 613. The method resulted in the generation of nine Test Cases, which include four Path-Based Test Cases, one Event-Based Test Case, two Error-Based Test Cases, and two Expert-System-Based Test Cases.
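The node classification just described can be captured as a small lookup table. The sketch below transcribes the node numbers and kinds from the description above; the dictionary layout and variable names are assumptions for illustration only.

```python
# Node classification for the customer-return activity diagram, as described
# in the text. The mapping structure itself is an illustrative assumption.
NODES = {
    601: "Decision",  # return from customer?
    602: "Activity",  # create return delivery
    603: "Activity",  # post goods receipt
    605: "Merge",     # quality inspection
    606: "Activity",  # create inspection lot manually
    607: "Activity",  # record results
    608: "Decision",  # goods in good condition?
    609: "Activity",  # record defects
    610: "Activity",  # usage decision (goods OK)
    611: "Activity",  # move to unrestricted stock
    612: "Activity",  # usage decision (goods not OK)
    613: "Activity",  # move to block stock and scrap
}

decisions = sorted(n for n, kind in NODES.items() if kind == "Decision")
print(decisions)  # [601, 608]
```

A parser emitting such a table gives the traversal stage everything it needs to branch at Decision Nodes and rejoin at the Merge Node.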
In the Decision List 614 are: check if the goods are returned from the Customer; if yes, check whether the materials are OK or not.
In the Action List 615 are
In the Pair List 616 are
Each process step consists of detailed Test Steps. Return from Customer, if yes, is based on detailed Test Steps 602a that verify the order number, shipping point, and date, hit enter, save the return delivery, and validate the return delivery. The post goods receipt has Test Steps 603a that verify the T-code and the delivery number, hit enter, click on the ‘Post goods return control’ button, save the transaction, and validate the transaction number. The detailed Test Steps 604a for the inspection lot verify the T-code, material number, plant, and inspection lot number, hit enter, verify the inspection type, inspection lot quantity, start date, inspection end date, vendor, purchasing organization, and short text, and click on save. The usage decision if the goods are OK (yes) is based on detailed Test Steps 610a that verify the T-code and inspection lot number, select the decision, verify the usage decision code (UD code), click on the inspection lot stock tab, and click on save. The usage decision if the goods are not OK (no) is based on detailed Test Steps 612a that verify the T-code and inspection lot number, select the decision, verify the UD code, click on the inspection lot stock tab, and click on save.
The process starts 300 by creating a return purchase order (PO) 301 by verifying details such as the T-code and document type and checking the “Return” box in the item screen of the item line 301a. The purchase order is approved by the concerned department for further processing. If the return purchase order is approved 302, the post goods issue (PGI) is ready for generation. The return outbound delivery is created 303 by verifying the T-code and delivery number 303a. The return delivery number is then saved and recorded into the system, and then validated 303a. The PGI is created by verifying the T-code and delivery number 304a; the “Post goods returns” control button is then clicked to generate the return Post Goods Issue (PGI) 304. On getting the goods receipt 305, a customized T-code is verified 306a for creating a delivery gate pass 306; this applies to a client having a customized T-code.
Further, the delivery number is verified 306a, the created gate pass transaction is saved, and the gate pass transaction number is verified, if the PGI does not have any errors 305. If the PGI has any errors 305, the errors are fixed and processed again 308 to generate the gate pass 306. The credit memo (MIRO) is created 307 by verifying the T-code, return purchase order number, and financial year 307a, and hitting the enter key. To complete the generation of the credit memo, the transaction document type is selected as “2 Credit Memo”, the purchase order number in the reference field is verified 307a, a check box is selected, and ‘Simulate’ is clicked, which validates the simulations; the ‘POST’ button is then selected, the MIRO number is recorded, and the generated MIRO number is validated. If, however, the return purchase order is not approved 302, said return purchase order is either cancelled or deleted 309.
The Activity Nodes are creating a return purchase order 301, creating the return outbound delivery 303, creating the return post goods issue 304, gate pass creation 306, credit memo creation 307, error rectification 308, and cancellation of the PO 309. The approval of the return purchase order 302 and receiving the post goods issue 305 are Decision Nodes. All Edges generated are Activity Edges. The method resulted in the generation of ten Test Cases, which include four Path-Based Test Cases, two Error-Based Test Cases, and four Expert-System-Based Test Cases.
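Path-Based Test Cases fall out of enumerating end-to-end paths over the diagram. The sketch below is a minimal illustration: the edge list is inferred from the flow described above (it may omit branches present in the actual diagram, so the path count here need not match the four Path-Based Test Cases reported), and the function names are assumptions.

```python
# Inferred edge list for the return-purchase-order diagram; node numbers
# follow the text. The helper name `paths` is an illustrative assumption.
flow = {
    300: [301], 301: [302],
    302: [303, 309],   # approved? yes -> outbound delivery, no -> cancel/delete
    303: [304], 304: [305],
    305: [306, 308],   # PGI ok? yes -> gate pass, errors -> fix and retry
    308: [306], 306: [307],
}
terminals = {307, 309}  # credit memo created, or PO cancelled/deleted


def paths(node, trail):
    """Yield every simple start-to-terminal path; each one is a candidate Test Case."""
    if node in terminals:
        yield trail
        return
    for nxt in flow.get(node, []):
        if nxt not in trail:  # simple paths only: never revisit a node
            yield from paths(nxt, trail + [nxt])


for p in paths(300, [300]):
    print(p)
```

With this inferred edge list the enumeration yields three paths, including the rejection path ending in cancellation 309.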
In the Decision List 320 are: check if the created return purchase order is approved or not; if it is approved, check if the return post goods issue (PGI) is created or not.
In the Action List 321 are
In the Pair List 322 are
As previously mentioned, a TCAD is a focused representation of the Business Process together with the various tests that should be performed at the various logical points within the Program. For example, if there is a ‘Retry’ button within a Web page, one tag that could be associated with it is the tag ‘Usability’, so that aspect needs to be tested when the Business Process is run through the testing phase. The TCAD is therefore the cornerstone of the representation that this invention primarily works with. The Parsing module 98 takes the TCAD and traverses it to generate Nodes, Edges, and Lists. The Nodes within a TCAD fall primarily into three big classes: Decision Nodes, Fork and Join Nodes, and Action Nodes. The Edges are the connectors between the Nodes. The Lists refer to certain attributes that are represented in list form, for example, the decisions that might affect the flow of the Program.
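The kind of output the Parsing module 98 produces can be sketched as typed records. All class, enum, and field names below are illustrative assumptions (the patent does not specify data structures); only the three node classes, the edge connectors, the three lists, and the ‘Usability’ tag example come from the text.

```python
# Hypothetical sketch of a Parsing module's output: typed Nodes, Edges, Lists.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple


class NodeKind(Enum):
    ACTION = "action"
    DECISION = "decision"
    FORK_JOIN = "fork_join"


@dataclass
class Node:
    ident: int
    kind: NodeKind
    label: str
    tags: List[str] = field(default_factory=list)  # e.g. 'Usability' on a Retry button


@dataclass
class ParsedTCAD:
    nodes: List[Node]
    edges: List[Tuple[int, int]]   # connectors between node idents
    decision_list: List[str]       # decisions that affect program flow
    action_list: List[str]
    pair_list: List[Tuple[str, str]]


retry = Node(1, NodeKind.ACTION, "Click 'Retry'", tags=["Usability"])
parsed = ParsedTCAD(nodes=[retry], edges=[], decision_list=[],
                    action_list=[retry.label], pair_list=[])
print(parsed.nodes[0].tags)  # ['Usability']
```

Keeping the lists alongside the graph lets the Analysis module consume decisions and actions without re-walking the diagram.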
The Decision, Action, and Pair Lists from the Parsing module 98 are then fed into the Analysis module 99, which has two sub-modules called the Path Traverser 114 and the Custom Traverser 115. The Path Traverser 114 primarily generates Test Scenarios, and the output of the Analysis module 99 is Test Scenarios. For example, the Path Traverser 114 might generate two valid Test Scenarios for a TCAD: Test Scenario 1 (TS1), where Nodes 1, 3, 5 need to be traversed, and Test Scenario 2 (TS2), where Nodes 1, 3, 4, 5 need to be traversed in order to cover the entire Program that is being executed. The Custom Traverser 115 takes inputs from the storage that holds Event-Based, Expert-System-Based, and Exception-Based Test Conditions 107, which are then used to construct further Test Scenarios based on the different types of known testing methodologies. The Test Generator 100 takes the Test Scenarios from the Analysis module 99 and converts them into Test Condition Lists. The Test Generator 100 further uses the Test Data Model 108 as input from storage 4, which might work in lieu of the Wireframes 106. Since the Wireframes 106 are not mandatory, the Test Data Model 108 acts as a second set of clues as to how the Tests might be generated for a given Business Process. The Test Generator 100 goes on to generate the Test Cases 101, Test Data Placeholders 102, and Test Scripts 103 that can be used to entirely test the Business Process, and these are stored in a local or a remote storage 116 in a plurality of formats, including Excel, text files, etc.
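One plausible way a Path Traverser could produce the TS1/TS2 example above is depth-first enumeration of simple paths; the sketch below reproduces exactly that example. The function name and graph encoding are assumptions, not the patent's implementation.

```python
# Hypothetical Path Traverser core: enumerate every simple path from a start
# node to an end node; each path is one Test Scenario.
from typing import Dict, List


def enumerate_paths(edges: Dict[int, List[int]], start: int, end: int) -> List[List[int]]:
    """Return all simple start-to-end paths via depth-first search."""
    scenarios: List[List[int]] = []

    def dfs(node: int, path: List[int]) -> None:
        if node == end:
            scenarios.append(path)
            return
        for nxt in edges.get(node, []):
            if nxt not in path:  # keep paths simple: no node revisits
                dfs(nxt, path + [nxt])

    dfs(start, [start])
    return scenarios


# Graph matching the worked example: TS1 = 1->3->5, TS2 = 1->3->4->5.
print(enumerate_paths({1: [3], 3: [5, 4], 4: [5]}, 1, 5))
# [[1, 3, 5], [1, 3, 4, 5]]
```

Covering both paths exercises every node of this small graph, which is the coverage goal the example states.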
The above detailed description of the embodiments, and the examples, are for illustrative purposes only and are not intended to limit the scope and spirit of the invention, and its equivalents, as defined by the appended claims. One skilled in the art will recognize that many variations can be made to the invention disclosed in this specification without departing from the scope and spirit of the invention.
This patent application claims priority on and the benefit of U.S. patent application Ser. No. 14/680,132, having a filing date of 7 Apr. 2015, which claims priority on and the benefit of U.S. Provisional Patent Application No. 61/976,522, having a filing date of 8 Apr. 2014.
Number | Date | Country
---|---|---
61/976,522 | Apr 2014 | US
 | Number | Date | Country
---|---|---|---
Parent | 14/680,132 | Apr 2015 | US
Child | 15/874,010 | | US