The present invention relates to building process models and, more particularly, to pruning irrelevant attributes from the execution traces from which the process models are built.
An execution trace describes events occurring in an instance of some process. These events include tasks that are executed in the process, as well as data values input or output by the tasks. Process mining involves mining a graph of causal behavior from process execution logs and produces a process model as output. A process model may be represented by a causal graph of nodes and edges, where nodes are tasks in a process and edges represent the causality between the tasks. The model may also have gateways that show execution semantics along the edges and nodes of the graph, such as parallelism or exclusive flows.
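As a concrete illustration, a process model of this kind can be held in memory as a directed graph of tasks. The following minimal sketch is purely illustrative; the class and method names are assumptions and do not correspond to any particular process-mining tool. Gateways and execution semantics are omitted for brevity.

```python
# Minimal causal-graph representation of a process model: nodes are
# tasks and directed edges record causality between tasks.
from collections import defaultdict

class ProcessModel:
    def __init__(self):
        # task -> set of tasks it causally precedes
        self.edges = defaultdict(set)

    def add_causality(self, task, successor):
        self.edges[task].add(successor)

    def successors(self, task):
        return sorted(self.edges[task])

model = ProcessModel()
model.add_causality("Chest X-ray", "Heart Failure")
model.add_causality("Heart Failure", "Antianginal Agents")
```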
Process models can be mined from a set of execution traces. A mined process model can be very complex, with many nodes and edges, and may display spaghetti-like behavior in which rarely used or redundant paths clutter the graph. In one example, a process model could represent a pathway, such as a treatment pathway. One way to accomplish this is to find a set of execution traces that lead to a particular outcome and then mine a process model from those traces.
However, execution traces may include events that are not relevant to, or have no impact on, the outcome of the process. Currently, users have to manually sift through execution logs and eliminate irrelevant tasks by hand before mining the logs and building a process model. For any realistic process of substantial size and complexity, this can be very time consuming.
A method for pruning process execution logs includes learning a predictive model from a set of execution traces that characterize a process, wherein the predictive model determines a likelihood of a given instance reaching a specified outcome; identifying attributes in the predictive model that fall below a threshold measure of relevance to the specified outcome using a processor; and removing the identified attributes from the set of execution traces.
An execution log pruning system includes a predictive model module configured to learn a predictive model from a set of execution traces that characterize a process, wherein the predictive model determines a likelihood of a given instance reaching a specified outcome; a relevance module comprising a processor configured to identify attributes in the predictive model that fall below a threshold measure of relevance to the specified outcome; and a pruning module configured to remove the identified attributes from the set of execution traces.
These and other features and advantages will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.
The disclosure will provide details in the following description of preferred embodiments with reference to the following figures wherein:
Embodiments of the present invention provide for the elimination of irrelevant activities (also known as events) in process execution traces that have no impact on the outcome of the process. The present embodiments also remove attributes which are redundant or which otherwise do not meaningfully contribute to the formation of a process model. Once these attributes are removed, process models mined from such execution traces have fewer tasks and provide more clarity in terms of showing the causality of events leading to a particular outcome.
Referring now to the drawings in which like numerals represent the same or similar elements and initially to
Patient1 trace: Diuretics, Antianginal Agents, Crackles on Lung, Pulse Oxygen, Creatinine, Vital
Patient1 outcome: Hospitalized
Patient2 trace: Cardiotonics, Chest X-ray, Potassium, Diuretics, Heart Failure
Patient2 outcome: Not hospitalized
Patient3 trace: Crackles on Lung, Chest X-ray, Cardiotonics, Diuretics, Heart Failure, Potassium, Vital
Patient3 outcome: Hospitalized
Block 104 characterizes the execution traces having the desired outcome as a matrix. Each column of the matrix is an attribute and each row is a different execution trace. In one example, attributes that are present in an execution trace are represented by values of 1, while attributes that are not present are represented by values of 0. As an alternative to the binary-valued matrix, each entry may take a continuous value or a value from a set of discrete ranges, for example representing the number of times the attribute corresponding to that column occurs in the trace corresponding to that row. Block 106 uses machine learning to determine which attributes are correlated with outcomes, creating a predictive model that embodies the attributes most strongly correlated with the outcome of interest.
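The matrix construction of block 104 can be sketched as follows. This is an illustrative Python fragment, not the claimed implementation; it uses the binary presence/absence encoding described above together with the patient traces from the earlier example, and the function name is an assumption.

```python
# Sketch of block 104: encode execution traces as a binary attribute
# matrix, where 1 means the attribute occurs in the trace and 0 means
# it does not.  The trace data mirrors the patient examples above.
traces = {
    "Patient1": ["Diuretics", "Antianginal Agents", "Crackles on Lung",
                 "Pulse Oxygen", "Creatinine", "Vital"],
    "Patient2": ["Cardiotonics", "Chest X-ray", "Potassium",
                 "Diuretics", "Heart Failure"],
    "Patient3": ["Crackles on Lung", "Chest X-ray", "Cardiotonics",
                 "Diuretics", "Heart Failure", "Potassium", "Vital"],
}

def build_matrix(traces):
    """Return (sorted attribute list, {trace_id: row of 0/1 values})."""
    attributes = sorted({a for events in traces.values() for a in events})
    matrix = {
        tid: [1 if a in events else 0 for a in attributes]
        for tid, events in traces.items()
    }
    return attributes, matrix

attributes, matrix = build_matrix(traces)
```

Each row of `matrix` is one execution trace; the count-valued alternative described above would simply replace the 0/1 test with `events.count(a)`.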
For example, a decision tree creates a ranking of attributes that predict the specified outcome. The tree nodes split on the basis of attribute values that lead to a specific outcome. In
Block 108 identifies attributes that are irrelevant to the outcome using the predictive model. This may include identifying only those attributes which have no bearing on the outcome at all, or may be expanded to include attributes with only a small amount of predictive power. For example, if a predictive model has a branch with two possibilities, where one branch has a 51% chance of leading to hospitalization and the other has a 49% chance of leading to hospitalization, then the branch is not strongly predictive of the outcome. A user may set a threshold to determine how weakly predictive an attribute should be to classify the attribute as irrelevant.
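The thresholding of block 108 can be illustrated with a deliberately simplified relevance score. Instead of a full decision tree, the sketch below scores each attribute by how much its presence shifts the outcome probability (mirroring the 51%/49% example above) and flags attributes that fall below a user-set threshold. The function name and the scoring rule are assumptions for illustration only, not the claimed method.

```python
# Sketch of block 108: mark an attribute irrelevant when its presence
# barely changes the probability of the specified outcome.
def irrelevant_attributes(rows, outcomes, attributes, threshold=0.2):
    """rows: list of 0/1 attribute vectors; outcomes: 0/1 labels."""
    irrelevant = []
    for j, attr in enumerate(attributes):
        with_attr = [o for r, o in zip(rows, outcomes) if r[j] == 1]
        without = [o for r, o in zip(rows, outcomes) if r[j] == 0]
        p_with = sum(with_attr) / len(with_attr) if with_attr else 0.0
        p_without = sum(without) / len(without) if without else 0.0
        # A small probability shift (e.g. 0.51 vs. 0.49) means the
        # attribute is only weakly predictive of the outcome.
        if abs(p_with - p_without) < threshold:
            irrelevant.append(attr)
    return irrelevant

attrs = ["Potassium", "Heart Failure"]
rows = [[1, 1], [1, 0], [0, 1], [0, 0]]
outcomes = [1, 0, 1, 0]   # 1 = hospitalized
weak = irrelevant_attributes(rows, outcomes, attrs, threshold=0.2)
```

In this toy data, the outcome tracks Heart Failure exactly while Potassium shifts the probability not at all, so only Potassium is flagged.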
Block 110 removes irrelevant attributes from the execution traces. This process may be performed automatically simply by editing the training matrix, setting to zero any entry that corresponds to an irrelevant attribute. Block 112 then mines the process model from the pruned traces to create a pruned process model.
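The pruning of block 110 can be sketched directly on the traces (equivalently, one could zero out the irrelevant columns of the training matrix as described above). The helper name is illustrative, and the irrelevant set shown is the one identified for the patient example.

```python
# Sketch of block 110: drop every irrelevant attribute from each trace.
def prune_traces(traces, irrelevant):
    irrelevant = set(irrelevant)
    return {tid: [a for a in events if a not in irrelevant]
            for tid, events in traces.items()}

irrelevant = {"Pulse Oxygen", "Creatinine", "Vital",
              "Chest X-ray", "Potassium", "Cardiotonics"}
traces = {
    "Patient1": ["Diuretics", "Antianginal Agents", "Crackles on Lung",
                 "Pulse Oxygen", "Creatinine", "Vital"],
    "Patient2": ["Cardiotonics", "Chest X-ray", "Potassium",
                 "Diuretics", "Heart Failure"],
    "Patient3": ["Crackles on Lung", "Chest X-ray", "Cardiotonics",
                 "Diuretics", "Heart Failure", "Potassium", "Vital"],
}
pruned = prune_traces(traces, irrelevant)
```

The pruned traces match the shortened patient traces listed later in this description; block 112 would then mine the process model from `pruned`.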
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Reference in the specification to “one embodiment” or “an embodiment” of the present principles, as well as other variations thereof, means that a particular feature, structure, characteristic, and so forth described in connection with the embodiment is included in at least one embodiment of the present principles. Thus, the appearances of the phrase “in one embodiment” or “in an embodiment”, as well any other variations, appearing in various places throughout the specification are not necessarily all referring to the same embodiment.
It is to be appreciated that the use of any of the following “/”, “and/or”, and “at least one of”, for example, in the cases of “A/B”, “A and/or B” and “at least one of A and B”, is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of both options (A and B). As a further example, in the cases of “A, B, and/or C” and “at least one of A, B, and C”, such phrasing is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of the third listed option (C) only, or the selection of the first and the second listed options (A and B) only, or the selection of the first and third listed options (A and C) only, or the selection of the second and third listed options (B and C) only, or the selection of all three options (A and B and C). This may be extended, as readily apparent by one of ordinary skill in this and related arts, for as many items listed.
Referring now to
One can use feature selection to determine which attributes 206 are important in predicting hospitalization on this set of execution traces. Here, the term "attribute" indicates a task or data variable that is extracted from the raw data. For example, the task Diuretics is an attribute, and it may have an associated value, such as 3 mg, indicating the dosage of the Diuretics medication prescribed. A process instance or trace may also have instance-level data attributes associated with it.
In the present example, the process model 200 describes potential process flows for patients with heart problems. Each path through the graph represents a different potential execution trace, each ending with the condition of being hospitalized. For example, a patient may have a chest x-ray, may subsequently suffer heart failure, be given antianginal agents, and then may be hospitalized. In an alternative execution trace, the patient may be given diuretics, followed by potassium, followed by creatinine, and then be hospitalized.
An exemplary matrix to represent some execution traces that create the process model 200 is shown below at table 1.
As can be readily seen from
Referring now to
Given the exemplary patient traces shown above, the complete set of tasks that would be extracted from the traces and which could show up in the mined process model 200 are Antianginal Agents, Crackles on Lung, Pulse Oxygen, Creatinine, Vital, Chest X-ray, Potassium, Diuretics, Heart Failure, and Cardiotonics. Of these, only the attributes of Antianginal Agents, Crackles on Lung, Diuretics, and Heart Failure are actually relevant to predicting whether the patient is hospitalized, leaving the attributes of Pulse Oxygen, Creatinine, Vital, Chest X-ray, Potassium, and Cardiotonics as irrelevant.
Referring now to
The predictive model 400 includes a ranking of how attributes 402 impact the outcome, with the most significant attributes closer to the root of the tree. By selecting a subset of the attributes (e.g., those in the top k levels of the model 400), only the most relevant attributes are kept. Irrelevant attributes do not help determine whether a given execution will reach the outcome in question and hence do not appear in the model 400. After the predictive model 400 is created, the irrelevant attributes are eliminated from the execution traces. It is also important to note that the predictive model 400 may not include the entire set of attributes found in the execution traces, because attributes that have no impact on the outcome specified during the training phase are omitted from the model. Therefore, even if one were to select all of the attributes in the predictive model, the selection could still be a subset of the entire attribute set.
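Selecting the top k levels of a tree-shaped predictive model can be sketched as a simple recursive walk. The nested-dictionary tree format below is an assumption for illustration and does not correspond to the output of any particular learning library.

```python
# Sketch of top-k level selection: keep only attributes appearing in
# the first k levels of the decision tree, since attributes nearer the
# root rank as more significant.
def top_k_attributes(tree, k):
    """tree: {"attr": name, "children": [subtree or None, ...]}."""
    if tree is None or k <= 0:
        return set()
    attrs = {tree["attr"]}
    for child in tree.get("children", []):
        attrs |= top_k_attributes(child, k - 1)
    return attrs

tree = {
    "attr": "Heart Failure",
    "children": [
        {"attr": "Diuretics", "children": [None, None]},
        None,
    ],
}
```

With k=1 only the root attribute survives; raising k admits progressively less significant attributes.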
The attributes 402 that are removed from the predictive model 400 are used to prune the execution traces. For example, consider the removal of all attributes that are mere data objects (e.g., identifying the pharmacy a particular drug came from) or which are present in most execution traces (e.g., representing something with low predictive power). The removal of the associated task from the training set simplifies the resulting output model. After pruning, the execution traces described above include:
Patient1 trace: Diuretics, Antianginal Agents, Crackles on Lung
Patient2 trace: Diuretics, Heart Failure
Patient3 trace: Crackles on Lung, Diuretics, Heart Failure
As a result of this pruning, the mined model 300 includes fewer nodes and fewer edges that represent irrelevant causal relationships.
Referring now to
Having described preferred embodiments of a system and method for pruning process execution logs (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments disclosed which are within the scope of the invention as outlined by the appended claims. Having thus described aspects of the invention, with the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
5723283 | Classen | Mar 1998 | A |
6420139 | Classen | Jul 2002 | B1 |
6638739 | Classen | Oct 2003 | B2 |
7895323 | Gupta et al. | Feb 2011 | B2 |
7933762 | Pinto | Apr 2011 | B2 |
7971191 | Guadalupe Castellanos | Jun 2011 | B2 |
7979844 | Srinivasan | Jul 2011 | B2 |
8082220 | Hadad et al. | Dec 2011 | B2 |
8321251 | Opalach et al. | Nov 2012 | B2 |
8396884 | Singh et al. | Mar 2013 | B2 |
8402444 | Ball et al. | Mar 2013 | B2 |
8417715 | Bruckhaus | Apr 2013 | B1 |
8589331 | Duan | Nov 2013 | B2 |
8751273 | Pinto | Jun 2014 | B2 |
8996436 | Ren | Mar 2015 | B1 |
9171259 | Laxmanan | Oct 2015 | B1 |
9280740 | Laxmanan | Mar 2016 | B1 |
9299035 | Lakshmanan | Mar 2016 | B2 |
9355371 | Goodwin | May 2016 | B2 |
20010051934 | Oyanagi | Dec 2001 | A1 |
20030061015 | Ben-Gal | Mar 2003 | A1 |
20030212519 | Campos | Nov 2003 | A1 |
20030236691 | Casati | Dec 2003 | A1 |
20040133531 | Chen | Jul 2004 | A1 |
20050138483 | Hatonen et al. | Jun 2005 | A1 |
20050177414 | Priogin | Aug 2005 | A1 |
20050234688 | Pinto | Oct 2005 | A1 |
20050278705 | Castellanos | Dec 2005 | A1 |
20060173663 | Langheier | Aug 2006 | A1 |
20060293940 | Tsyganskiy et al. | Dec 2006 | A1 |
20070011135 | Chitgupakar | Jan 2007 | A1 |
20080172214 | Col | Jul 2008 | A1 |
20090193039 | Bradley | Jul 2009 | A1 |
20100076799 | Magent | Mar 2010 | A1 |
20100267102 | Begin et al. | Oct 2010 | A1 |
20100281469 | Wang | Nov 2010 | A1 |
20110167412 | Kahlon et al. | Jul 2011 | A1 |
20120049881 | Johnson | Mar 2012 | A1 |
20120066166 | Curbera | Mar 2012 | A1 |
20120101974 | Duan et al. | Apr 2012 | A1 |
20120323827 | Lakshmanan | Dec 2012 | A1 |
20130103441 | Doganata | Apr 2013 | A1 |
20130103719 | Gotz et al. | Apr 2013 | A1 |
20130339088 | Olsen | Dec 2013 | A1 |
20140067732 | Doganata | Mar 2014 | A1 |
20140074764 | Duftler | Mar 2014 | A1 |
20140279769 | Goodwin | Sep 2014 | A1 |
20140365403 | Demuth | Dec 2014 | A1 |
20140365533 | Debray | Dec 2014 | A1 |
20150112710 | Haber | Apr 2015 | A1 |
20150127588 | Lakshmanan | May 2015 | A1 |
20150127589 | Lakshmanan | May 2015 | A1 |
20150227838 | Wang | Aug 2015 | A1 |
Number | Date | Country |
---|---|---|
WO 2013192593 | Dec 2013 | WO |
WO 2014043623 | Mar 2014 | WO |
Entry |
---|
Maggi, et al., “Predictive Monitoring of Business Processes,” arXiv: 1312.4874, Thu, Dec. 19, 2013 19:34:45 GMT, 16 pp., 2013. |
Wang et al., “Trace-Based symbolic analysis for atomicity violations,” Proceedings of the 16th international conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS'10), pp. 328-342, 2010. |
Wang et al., “Symbolic pruning of concurrent program executions,” in: Foundations of Software Engineering, pp. 23-32. ACM, New York (2009). |
Wang et al., “Symbolic predictive analysis for concurrent programs,” in: International Symposium on Formal Methods, pp. 256-272. ACM, New York (2009). |
Arentze, T.A. et al. “Using Decision Tree Induction Systems for Modeling Space-Time Behavior” Geographical Analysis vol. 32 No. 4 [Published 2000] [Retrieved online Jan. 2020] <URL: https://onlinelibrary.wiley.com/doi/epdf/10.1111/j.1538-4632.2000.tb00431.x> (Year: 2000). |
Unnamed, “Artificial Intelligence: Representation and Problem Solving, Induction to Learning & Decision Trees.” CMU, 2007 [Published 2007] [Retrieved May 2020] <URL: https://www.cs.cmu.edu/afs/cs/academic/class/15381-s07/www/slides/041007decisionTrees1.pdf> (Year: 2007). |
Russell et al. “Artificial Intelligence: A Modern Approach” Prentice Hall, 2nd ed. [Published 2003] [Retrieved May 2020] (Year: 2003). |
Brodley, Carla E. “Automatic Selection of Split Criterion during Tree Growing based on Node Location.” MLP '95 [Published 1995] [Retrieved Oct. 2020] <URL: https://www.sciencedirect.com/science/article/pii/B9781558603776500189> (Year: 1995). |
Brodley, Carla E. “Multivariate Decision Trees” Machine Learning 19, p. 45-77 [Published 1995] [Retrieved Oct. 2020] <URL: https://link.springer.com/article/10.1023/A:1022607123649> (Year: 1995). |
Not Listed “IBM SPSS Statistics 20 Documentation” IBM Support [Published 2011] [Retrieved Dec. 2020] <URL: https://www.ibm.com/support/pages/node/724851#en> (Year: 2011). |
Alves De Medeiros, A., et al. “Process Mining Based on Clustering: A Quest for Precision” Business Process Management Workshops. Sep. 2007. (12 Pages). |
Cornelissen, B., et al. “Execution Trace Analysis Through Massive Sequence and Circular Bundle Views” Delft University of Technology—Software Engineering Research Group: Technical Report Series. Feb. 2008. pp. 1-40. |
Gunther, C., et al. “Fuzzy Mining—Adaptive Process Simplification Based on Multi-Perspective Metrics” Business Process Management 2007. Sep. 2007. pp. 328-343. |
Kahlon, V., et al. “Universal Causality Graphs: A Precise Happens-Before Model for Detecting Bugs in Concurrent Programs” Computer Aided Verification, 22nd International Conference, CAV 2010. Jul. 2010. pp. 1-17. |
Karegowda, A., et al. “Comparative Study of Attribute Selection Using Gain Ratio and Correlation Based Feature Selection” International Journal of Information Technology and Knowledge Management, vol. 2, No. 2. Jul. 2010. pp. 271-277. |
Lakshmanan, G., et al. “Leveraging Process Mining Techniques to Analyze Semi-Structured Processes” IT Professional, vol. 15, Issue 5. Sep. 2013. pp. 1-13. |
Lakshmanan, G., et al. “Predictive Analytics for Semi-Structured Case Oriented Business Processes” BPM 2010 Workshops. Sep. 2010. pp. 640-651. |
Rosstad, T., et al. “Development of a Patient-Centred Care Pathway Across Healthcare Providers: A Qualitative Study” BMC Health Services Research 2013. Apr. 2013. pp. 1-9. |
Song, M., et al. “Trace Clustering in Process Mining” Proceedings of the 4th Workshop on Business Process Intelligence (BPI'08), BPM Workshops 2008. Sep. 2008. pp. 1-12. |
Taylor, J. “Simplifying Over-Complex Processes” Decision Management Solutions. Dec. 2010. (51 Pages) Available at: http://www.slideshare.net/jamet123/simplifying-over-complex-processes-with-decision-management. |
Van Der Aalst, W., et al. “Process Mining” Communications of the ACM, vol. 55, No. 8. Aug. 2012. pp. 76-83. |
Van Dongen, B., et al. “The Prom Framework: A New Era in Process Mining Tool Support” 6th International Conference, ICATPN 2005. Jun. 2005. pp. 444-454. |
Yang, W., et al. “A Process-Mining Framework for the Detection of Healthcare Fraud and Abuse” Expert Systems with Applications, vol. 31, Issue 1. Jul. 2006. pp. 56-68. |
Duftler, M., et al. “Extracting Clinical Care Pathways Correlated With Outcomes” U.S. Appl. No. 13/851,755, filed Mar. 27, 2013. (32 Pages). |
Lakshmanan, G., et al. “Iterative Refinement of Pathways Correlated With Outcomes” U.S. Patent Application filed concurrently on Nov. 1, 2013. (26 Pages). |
Number | Date | Country | |
---|---|---|---|
20150127588 A1 | May 2015 | US |