The present invention relates to documenting a process flow using artificial intelligence (AI), and more particularly to automatic chunking and mapping of system documents to individual blocks in a process diagram.
Many client/customer-facing jobs require a manual review of multiple documents (e.g., user manuals) to determine the actions that need to be taken in response to a service request.
In one embodiment, the present invention provides a computer system that includes a central processing unit (CPU), a memory coupled to the CPU, and one or more computer readable storage media coupled to the CPU. The one or more computer readable storage media collectively contain instructions that are executed by the CPU via the memory to implement a method of generating process flow documentation. The method includes the computer system identifying headings and subheadings in multiple system documents specifying actions required to be taken in response to service requests, which are specified by process flows. The method further includes, based on a similarity score indicating an amount of similarity between (i) a heading or subheading in a system document included in the multiple system documents and (ii) a name of a process block included in a process flow included in the process flows, or based on a matching of a portion of the system document to the process block by a semantic understanding of the portion provided by a machine learning system, the computer system mapping the portion to the process block. The portion is specified by the heading or subheading. The method further includes, based on the mapping of the portion to the process block, the computer system generating a documentation of the process flow so that the documentation includes the portion of the system document.
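By way of example and not limitation, the mapping decision summarized above may be sketched in Python as follows, in which the similarity threshold, the function names, and the placeholder semantic matcher are illustrative assumptions rather than requirements of any embodiment:

from difflib import SequenceMatcher

SIMILARITY_THRESHOLD = 0.8  # illustrative threshold for the similarity criteria


def lexical_similarity(heading: str, block_name: str) -> float:
    """Return a similarity score between a heading or subheading and a process block name."""
    return SequenceMatcher(None, heading.lower(), block_name.lower()).ratio()


def map_portion_to_block(heading: str, portion: str, block_name: str, semantic_match) -> bool:
    """Decide whether the portion specified by the heading maps to the process block.

    semantic_match stands in for the machine learning system that provides a
    semantic understanding of the portion (e.g., semantic model 112).
    """
    if lexical_similarity(heading, block_name) >= SIMILARITY_THRESHOLD:
        return True
    return bool(semantic_match(portion, block_name))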
A computer program product and a method corresponding to the above-summarized computer system are also described and claimed herein.
A manual review of multiple system documents (i.e., technical documents such as user manuals) to determine the actions that need to be done in response to a service request is a tedious, time-consuming, and error-prone task.
Embodiments of the present invention address the aforementioned unique challenges of manually reviewing system documents to determine actions that are required by client/customer facing jobs in response to a service request being raised in a process flow. In one embodiment, a documentation system automatically breaks down the aforementioned system documents into simpler pieces (i.e., chunks the system documents) based on the service requests and maps the simpler pieces to different components (i.e., individual process blocks or activities) of the process flow, which are represented in a business process diagram. The automatic chunking of the documents into simpler pieces (i.e., chunks) and the mapping of the pieces to the components of the process flow allow business analysts and support engineers to select a single relevant document (i.e., a process flow documentation, which is a document relevant to a given service request and is a combination of different pieces of one or more system documents), which saves a significant amount of time in determining the actions required in response to the service request. In one embodiment, the documentation system uses lexical and semantic matching techniques to map the pieces of the documents to the components of the process flow.
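As an illustrative sketch only, the chunking of a system document into simpler pieces keyed by headings may resemble the following Python code; the numbered-heading pattern is an assumption made solely for this example:

import re

# Illustrative heading pattern: numbered headings such as "3.", "3.1", or "3.1.2".
HEADING_RE = re.compile(r"^(?P<number>\d+(?:\.\d+)*)\.?\s+(?P<title>.+)$")


def chunk_document(text: str) -> dict:
    """Break a system document into chunks keyed by heading or subheading."""
    chunks = {}
    current_heading = None
    for line in text.splitlines():
        match = HEADING_RE.match(line.strip())
        if match:
            current_heading = match.group("title").strip()
            chunks[current_heading] = []
        elif current_heading is not None and line.strip():
            chunks[current_heading].append(line.strip())
    # Join the collected lines so that each chunk is the text under one heading.
    return {heading: "\n".join(lines) for heading, lines in chunks.items()}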
In one embodiment, the documentation system identifies common components across different business processes, creates a database for the identified components, and provides a consistent definition of steps needed to map to a business process. In one embodiment, the documentation system uses natural language processing (NLP) techniques to identify process blocks which are identical to each other.
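For example, the identification and grouping of identical or common process blocks across different business processes may be sketched as follows; the normalization rules and the stop-word list are illustrative assumptions, and a production embodiment may instead apply a full NLP toolkit:

import re
from collections import defaultdict

# Illustrative stop words used only for this sketch.
STOP_WORDS = {"the", "a", "an", "of", "for", "to"}


def normalize_block_name(name: str) -> str:
    """Normalize a process block name so that near-identical blocks compare as equal."""
    tokens = re.findall(r"[a-z0-9]+", name.lower())
    return " ".join(token for token in tokens if token not in STOP_WORDS)


def group_common_blocks(blocks) -> dict:
    """Group (process_name, block_name) pairs that refer to the same common component.

    blocks is an iterable of (process_name, block_name) tuples; the returned
    dictionary maps a normalized block name to the processes that contain it and
    may be persisted to the database of identified components.
    """
    groups = defaultdict(list)
    for process_name, block_name in blocks:
        groups[normalize_block_name(block_name)].append(process_name)
    return dict(groups)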
In one embodiment, the documentation system provides for a delivery of the documentation of a particular business process from multiple sections of multiple documents. In one embodiment, the documentation system creates a process document for each user-created service request.
In one embodiment, the documentation system reformats the chunks taken from different system documents or sections of documents and combines the reformatted chunks into a common layout, which provides a uniform customer experience for a customer who views documentations of different process flows.
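As one non-limiting sketch, the combination of reformatted chunks into a common layout may resemble the following; the layout template and the shape of the chunk tuples are assumptions of the example:

# Illustrative common layout; the actual layout used by the documentation system may differ.
SECTION_TEMPLATE = (
    "Section: {heading}\n"
    "Source document: {source}\n"
    "{body}\n"
)


def combine_chunks(service_request: str, chunks) -> str:
    """Combine chunks taken from different system documents into one common layout.

    chunks is an iterable of (source_document, heading, body) tuples that were
    mapped to the process blocks of the service request's process flow.
    """
    parts = ["Process flow documentation for: " + service_request]
    for source, heading, body in chunks:
        parts.append(SECTION_TEMPLATE.format(heading=heading, source=source, body=body.strip()))
    return "\n\n".join(parts)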
System for Generating AI-Based Process Flow Documentation from System Documents
Process flow documentation system 104 receives system documents 114 (e.g., user manuals and user guides) and generates hierarchical JavaScript® Object Notation (JSON) documents (not shown) from the system documents 114. JavaScript is a registered trademark of Oracle America, Inc. located in Redwood Shores, Calif. Process flow documentation system 104 stores the JSON documents in graph database 110. System documents 114 specify actions required to be taken in response to service requests. Sections of system documents 114 are specified by respective nodes in a first set of nodes in the graph database 110. Edges between the nodes that specify the sections of system documents 114 indicate relationships such as hierarchy, similarity score, and the different process blocks of the business process documentations (BPDs) in which the sections are used.
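By way of illustration, the conversion of flat document sections into a hierarchical JSON document may be sketched as follows; the hierarchical heading numbering is an assumption made only for this example:

import json


def sections_to_hierarchical_json(sections) -> str:
    """Convert flat (heading_number, title, body) sections into a hierarchical JSON document.

    sections is a list such as [("1", "Overview", "..."), ("1.1", "Scope", "..."), ...].
    """
    root = {"title": "document", "body": "", "children": []}
    stack = [("", root)]  # (heading_number, node) pairs from the root to the current section
    for number, title, body in sections:
        node = {"title": title, "body": body, "children": []}
        # Pop until the node on top of the stack is the parent of the current heading number.
        while len(stack) > 1 and not number.startswith(stack[-1][0] + "."):
            stack.pop()
        stack[-1][1]["children"].append(node)
        stack.append((number, node))
    return json.dumps(root, indent=2)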
Process flow documentation system 104 retrieves BPDs from a master list 116 of BPDs and stores the BPDs in graph database 110. A BPD specifies a process flow of a service request. Process blocks of the BPDs are specified by respective nodes in a second set of nodes in the graph database 110. The nodes in the second set of nodes (i.e., the nodes specifying the process blocks) are different from the nodes in the aforementioned first set of nodes (i.e., the nodes specifying the sections of the system documents 114). Edges between the nodes that specify the process blocks indicate business process links and hierarchies. An edge between a given first node in the first set of nodes and a given second node in the second set of nodes indicates a relationship of similarities and inclusion (i.e., the section specified by the given first node is matched with the process block specified by the given second node). Graph database 110 allows for a fast retrieval of sections with respect to business processes.
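As an illustrative sketch, the storage of section nodes, process block nodes, and the edges between them may resemble the following Python code, in which the networkx library serves only as an in-memory stand-in for graph database 110 and all node identifiers, attribute names, and scores are hypothetical:

import networkx as nx  # in-memory stand-in for graph database 110 in this sketch

graph = nx.MultiDiGraph()

# First set of nodes: sections of the system documents 114.
graph.add_node("doc1:installation", kind="section", document="user_manual_1", heading="Installation")
graph.add_node("doc1:prerequisites", kind="section", document="user_manual_1", heading="Prerequisites")
graph.add_edge("doc1:installation", "doc1:prerequisites", relation="hierarchy")

# Second set of nodes: process blocks of a BPD.
graph.add_node("bpd1:install_software", kind="process_block", bpd="service_request_bpd_1", name="Install software")
graph.add_edge("bpd1:install_software", "doc1:installation", relation="includes", similarity_score=0.87)

# Fast retrieval of the sections mapped to a given process block.
mapped_sections = [target for _, target, data in graph.out_edges("bpd1:install_software", data=True)
                   if data.get("relation") == "includes"]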
For a given BPD of a process flow for a particular service request, process flow documentation system 104 uses semantic model 112 or NLP module 108 to determine similarity scores between headings in multiple JSON documents and a process block of the given BPD or between sections in multiple JSON documents and the process block. Based on the similarity scores that meet similarity criteria, process flow documentation system 104 maps the corresponding sections of the multiple JSON documents to the process block, and creates a process flow documentation 118 (i.e., a new documentation of the process flow of the service request) by retrieving the mapped sections of the multiple JSON documents from graph database 110 and stitching together the retrieved sections. Process flow documentation system 104 sends the newly created process flow documentation 118 together with a visual representation of the BPD to a display 120 for online viewing by an end user (e.g., a service agent or a business process manager) via a portal. Process flow documentation system 104 generates the aforementioned visual representation of the BPD by using information in the graph database 110.
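As one non-limiting example, the mapping of sections to the process blocks of a BPD based on similarity criteria and the stitching of the mapped sections into process flow documentation 118 may be sketched as follows, where the similarity criteria value and the data shapes are assumptions of the sketch:

from difflib import SequenceMatcher

SIMILARITY_CRITERIA = 0.75  # illustrative value for the similarity criteria


def score(heading: str, block_name: str) -> float:
    """Return a similarity score between a heading and a process block name."""
    return SequenceMatcher(None, heading.lower(), block_name.lower()).ratio()


def document_process_flow(bpd_blocks, json_documents) -> str:
    """Create a documentation of a process flow from sections of multiple documents.

    bpd_blocks is the ordered list of process block names of the BPD, and
    json_documents maps a document name to a {heading: section_text} dictionary.
    """
    parts = []
    for block_name in bpd_blocks:
        parts.append("--- " + block_name + " ---")
        for doc_name, chunks in json_documents.items():
            for heading, section in chunks.items():
                if score(heading, block_name) >= SIMILARITY_CRITERIA:
                    parts.append("[" + doc_name + " / " + heading + "]\n" + section)
    return "\n\n".join(parts)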
The functionality of the components shown in
Process for Generating AI-Based Process Flow Documentation from System Documents
In step 204, process flow documentation system 104 (see
After the prerequisite steps of 202 and 204 are performed, the process of
In step 206, process flow documentation system 104 (see
In step 208, process flow documentation system 104 (see
In the first performance of step 210, process flow documentation system 104 (see
In a first performance of step 212, process flow documentation system 104 (see
In step 214, process flow documentation system 104 (see
In step 216, process flow documentation system 104 (see
In a first performance of step 218, process flow documentation system 104 (see
The process of
In step 222, process flow documentation system 104 (see
In step 224, process flow documentation system 104 (see
In one embodiment, process flow documentation system 104 (see
In one embodiment, multiple performances of step 222 via a loop that starts at step 220 includes process flow documentation system 104 (see
Returning to step 220, if process flow documentation system 104 (see
In step 226 following step 224 or the No branch of step 226, process flow documentation system 104 (see
If process flow documentation system 104 (see
In step 228, process flow documentation system 104 (see
In step 230, process flow documentation system 104 (see
In one embodiment, machine learning system 106 (see
1. Create a dataset from the JSON document in which each line has the form: “category label, paragraph belonging to the category, a paragraph that does not belong to the category”
2. Given two sentences, classify the two sentences as entailing, contradicting, or being neutral to each other. For the classification, the machine learning system 106 (see
3. Train and save the semantic model 112 (see
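A minimal sketch of steps 1-3 above, assuming a scikit-learn classifier stands in for semantic model 112, may resemble the following; the actual model architecture, features, and training data used by an embodiment may differ:

import pickle

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Step 1: dataset lines of the form (category label, paragraph belonging to the
# category, paragraph that does not belong to the category); the content is hypothetical.
dataset = [
    ("password reset", "Reset the user's password from the admin console.",
     "Replace the toner cartridge when the indicator blinks."),
]

# Step 2: turn each line into labeled sentence pairs ("entails" / "neutral" here;
# a "contradicts" class can be added in the same way).
pairs, labels = [], []
for category, positive, negative in dataset:
    pairs.append(category + " [SEP] " + positive)
    labels.append("entails")
    pairs.append(category + " [SEP] " + negative)
    labels.append("neutral")

# Step 3: train and save the model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(pairs, labels)
with open("semantic_model.pkl", "wb") as f:
    pickle.dump(model, f)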
In step 232, process flow documentation system 104 (see
In step 234, process flow documentation system 104 (see
In one embodiment, process flow documentation system 104 (see
Returning to step 228, if process flow documentation system 104 (see
Following step 234 and following the No branch of step 228, the process of
In step 236, process flow documentation system 104 (see
If process flow documentation system 104 (see
In step 238, process flow documentation system 104 (see
In step 240, based on the mapping performed in step 224 or step 234, process flow documentation system 104 (see
In step 242, process flow documentation system 104 (see
In step 244, process flow documentation system 104 (see
If process flow documentation system 104 (see
In one embodiment, the process of
In step (1) in example 300, process flow documentation system 104 (see
In step (3), process flow documentation system 104 (see
Memory 404 includes a known computer readable storage medium, which is described below. In one embodiment, cache memory elements of memory 404 provide temporary storage of at least some program code (e.g., program code 414) in order to reduce the number of times code must be retrieved from bulk storage while instructions of the program code are executed. Moreover, similar to CPU 402, memory 404 may reside at a single physical location, including one or more types of data storage, or be distributed across a plurality of physical systems or a plurality of computer readable storage media in various forms. Further, memory 404 can include data distributed across, for example, a local area network (LAN) or a wide area network (WAN).
I/O interface 406 includes any system for exchanging information to or from an external source. I/O devices 410 include any known type of external device, including a display, keyboard, etc. Bus 408 provides a communication link between each of the components in computer 102, and may include any type of transmission link, including electrical, optical, wireless, etc.
I/O interface 406 also allows computer 102 to store information (e.g., data or program instructions such as program code 414) on and retrieve the information from computer data storage unit 412 or another computer data storage unit (not shown). Computer data storage unit 412 includes one or more known computer readable storage media, where a computer readable storage medium is described below. In one embodiment, computer data storage unit 412 is a non-volatile data storage device, such as, for example, a solid-state drive (SSD), a network-attached storage (NAS) array, a storage area network (SAN) array, a magnetic disk drive (i.e., hard disk drive), or an optical disc drive (e.g., a CD-ROM drive which receives a CD-ROM disk or a DVD drive which receives a DVD disc).
Memory 404 and/or storage unit 412 may store computer program code 414 that includes instructions that are executed by CPU 402 via memory 404 to generate AI-based process flow documentation from a combination of different sections of multiple system documents. Although
Further, memory 404 may include an operating system (not shown) and may include other systems not shown in
As will be appreciated by one skilled in the art, in a first embodiment, the present invention may be a method; in a second embodiment, the present invention may be a system; and in a third embodiment, the present invention may be a computer program product.
Any of the components of an embodiment of the present invention can be deployed, managed, serviced, etc. by a service provider that offers to deploy or integrate computing infrastructure with respect to generating AI-based process flow documentation from a combination of different sections of multiple system documents. Thus, an embodiment of the present invention discloses a process for supporting computer infrastructure, where the process includes providing at least one support service for at least one of integrating, hosting, maintaining and deploying computer-readable code (e.g., program code 414) in a computer system (e.g., computer 102) including one or more processors (e.g., CPU 402), wherein the processor(s) carry out instructions contained in the code causing the computer system to generate AI-based process flow documentation from a combination of different sections of multiple system documents. Another embodiment discloses a process for supporting computer infrastructure, where the process includes integrating computer-readable program code into a computer system including a processor. The step of integrating includes storing the program code in a computer-readable storage device of the computer system through use of the processor. The program code, upon being executed by the processor, implements a method of generating AI-based process flow documentation from a combination of different sections of multiple system documents.
While it is understood that program code 414 for generating AI-based process flow documentation from a combination of different sections of multiple system documents may be deployed by manually loading directly in client, server and proxy computers (not shown) via loading a computer-readable storage medium (e.g., computer data storage unit 412), program code 414 may also be automatically or semi-automatically deployed into computer 102 by sending program code 414 to a central server or a group of central servers. Program code 414 is then downloaded into client computers (e.g., computer 102) that will execute program code 414. Alternatively, program code 414 is sent directly to the client computer via e-mail. Program code 414 is then either detached to a directory on the client computer or loaded into a directory on the client computer by a button on the e-mail that executes a program that detaches program code 414 into a directory. Another alternative is to send program code 414 directly to a directory on the client computer hard drive. In a case in which there are proxy servers, the process selects the proxy server code, determines on which computers to place the proxy servers' code, transmits the proxy server code, and then installs the proxy server code on the proxy computer. Program code 414 is transmitted to the proxy server and then it is stored on the proxy server.
Another embodiment of the invention provides a method that performs the process steps on a subscription, advertising and/or fee basis. That is, a service provider can offer to create, maintain, support, etc. a process of generating AI-based process flow documentation from a combination of different sections of multiple system documents. In this case, the service provider can create, maintain, support, etc. a computer infrastructure that performs the process steps for one or more customers. In return, the service provider can receive payment from the customer(s) under a subscription and/or fee agreement, and/or the service provider can receive payment from the sale of advertising content to one or more third parties.
The present invention may be a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer readable storage medium (or media) (i.e., memory 404 and computer data storage unit 412) having computer readable program instructions 414 thereon for causing a processor (e.g., CPU 402) to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions (e.g., program code 414) for use by an instruction execution device (e.g., computer 102). The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions (e.g., program code 414) described herein can be downloaded to respective computing/processing devices (e.g., computer 102) from a computer readable storage medium or to an external computer or external storage device (e.g., computer data storage unit 412) via a network (not shown), for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card (not shown) or network interface (not shown) in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions (e.g., program code 414) for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations (e.g.,
These computer readable program instructions may be provided to a processor (e.g., CPU 402) of a general purpose computer, special purpose computer, or other programmable data processing apparatus (e.g., computer 102) to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium (e.g., computer data storage unit 412) that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions (e.g., program code 414) may also be loaded onto a computer (e.g. computer 102), other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be accomplished as one step, executed concurrently, substantially concurrently, in a partially or wholly temporally overlapping manner, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
While embodiments of the present invention have been described herein for purposes of illustration, many modifications and changes will become apparent to those skilled in the art. Accordingly, the appended claims are intended to encompass all such modifications and changes as fall within the true spirit and scope of this invention.