The present disclosure relates generally to task mining and more particularly to visualizing a process map of a process using a step-by-step recording and flowcharts associated with the process.
A process is a series of actions that may be performed by a person or machine. A process may consist of any number of actions and could be simple or complex. For example, a person associated with a payroll function in a company may perform the process of generating paychecks for employees of that company on a weekly basis. This process requires a significant amount of the person's time throughout the year, particularly if there are a large number of employees and a large number of paychecks to process. Further, if there is only one person performing a particular process, then it may be difficult or impossible for others to perform the process when that person is unavailable.
Automation can improve the efficiency of processes thereby allowing the person normally assigned to the process to spend time on other tasks, such as those that cannot be automated. Process automation requires detailed information about the process and analysis of that information.
A method for visualizing a process map of a process is executed by a process map server. The method includes receiving a flowchart and a step-by-step recording related to a process. A process map is generated by combining the flowchart and the step-by-step recording, and the process map is then displayed. In one embodiment, the process map displays one or more tasks, steps, and actions related to the process. A detail window shows information associated with the process, and with portions of the process, in response to user input. In one embodiment, each action is based on information from the step-by-step recording.
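As a rough sketch of how the combining step might be modeled, the following assumes a simple nested data structure; the function name, dictionary fields, and classes below are hypothetical illustrations, not taken from the disclosure:

```python
# Illustrative sketch only: a minimal data model for combining a flowchart
# with a step-by-step recording into a process map. All names here are
# hypothetical assumptions.
from dataclasses import dataclass


@dataclass
class Action:
    name: str
    recording_info: dict  # e.g., screenshot reference, keystrokes, clicks


@dataclass
class Step:
    name: str
    actions: list


@dataclass
class Task:
    name: str
    steps: list


@dataclass
class ProcessMap:
    tasks: list


def generate_process_map(flowchart_tasks, recording_events):
    """Attach recorded events to the flowchart's task/step structure."""
    tasks = []
    for t in flowchart_tasks:
        steps = []
        for s in t["steps"]:
            # Each action carries information captured in the recording.
            actions = [Action(e["name"], e) for e in recording_events
                       if e.get("step") == s["name"]]
            steps.append(Step(s["name"], actions))
        tasks.append(Task(t["name"], steps))
    return ProcessMap(tasks)
```

In this sketch, the flowchart supplies the task/step skeleton and the recording supplies the actions attached to each step.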
Process 10 may be repeated multiple times with a different individual selected each time. For example, a person may generate multiple check requests in order to request paychecks for each employee of a company. Process 10 may be repeated for each pay period. For example, the person may perform process 10 weekly.
The automation of the paycheck process 10 of
Process map 100, as shown in
COMPILE TIME SHEETS task 102 comprises IMPORT INDIVIDUAL TIMESHEET step 200. Step 200 must be completed in order to complete task 102. IMPORT INDIVIDUAL TIMESHEET step 200 comprises SELECT INDIVIDUAL action 302 and SELECT TIME FRAME action 304. Each of actions 302 and 304 is required to be completed in order to complete step 200, which in turn is required to be completed in order to complete task 102. In one embodiment, steps are named based on one or more of the actions that are associated with that step (e.g., the action or actions most relevant to the step). In one embodiment, a natural language description of the steps is generated using one or more of natural language generation (NLG), word embedding, salience (placement and font size/color), and/or a custom dictionary of important words as well as stop words.
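The containment relationship described above — every action must complete for its step to complete, and every step must complete for its task to complete — can be sketched as follows (a hypothetical illustration; the field names are assumptions, not from the disclosure):

```python
def step_complete(step):
    """A step is complete only when every one of its actions is complete."""
    return all(action["done"] for action in step["actions"])


def task_complete(task):
    """A task is complete only when every one of its steps is complete."""
    return all(step_complete(step) for step in task["steps"])
```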
In one embodiment, process map 100 is generated by a process map server using a step-by-step recording of a process (e.g., process 10 shown in
In one embodiment, the process map is derived and inferred from step-by-step recordings of actual executions of a process and/or of tasks associated with the process. The process map is generated at different levels of granularity and abstraction. Longer tasks are generated with more levels to ensure an easily read overview at the most abstracted (highest) level, which can then be drilled into all the way down to the most detailed level. Depending on the amount of recorded data and the variance in execution, the most detailed level may correspond to one or more examples (traces). In the case of a single trace, all of the recorded (step-by-step) data can be shown immediately. If there are multiple traces, the traces can be aggregated with summary statistics and visualized as a list so that an analyst can select one trace, or a smaller subset, for more information. A one-to-many mapping between high-level process maps and detailed steps is achieved, in one embodiment, by clustering on the metadata of each step (wherein the metadata is generated via optical character recognition (OCR), computer vision, a document object model (DOM) tree, hardware events, causality scores, word vectors, etc.) and/or on the steps' positions in a sequence model. The clusters are then tiled and represent a change in level. The process can then be performed iteratively or using hierarchical clustering.
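A single pass of the clustering described above might look like the following sketch, in which steps whose metadata agree on some key (for example, an OCR'd window title) collapse into one higher-level node, and the pass can be repeated on the resulting nodes to build further levels. The function and field names are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch: one metadata-clustering pass used to build a more
# abstract level of the process map. All names are assumptions.
from collections import defaultdict


def cluster_steps(steps, key_fn):
    """Group steps by one metadata feature to form higher-level nodes."""
    clusters = defaultdict(list)
    for step in steps:
        clusters[key_fn(step)].append(step)
    # Each cluster becomes one node at the next (more abstract) level.
    return [{"label": label, "children": members}
            for label, members in clusters.items()]
```

Repeating this pass on the cluster nodes themselves, with progressively coarser keys, gives the iterative/hierarchical variant described in the text.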
The step-by-step recording, in one embodiment, comprises a recording of a user's input to a computer, and what the computer displays, while the user is performing the process associated with the flowchart. The flowchart, in one embodiment, comprises a plurality of tasks that are associated with the step-by-step recording. Each of the plurality of tasks comprises a plurality of steps and each of the plurality of steps comprises one or more actions. In one embodiment, each of the one or more actions is based on information from the step-by-step recording. For example, actions taken by the user while the user is performing the process (i.e., selection of icons and/or images displayed to the user, keyboard entries, mouse clicks, etc.) are included in the step-by-step recording.
In one embodiment, the step-by-step recording is generated using task mining techniques. Task mining, in one embodiment, is the obtaining of information regarding a task a user performs on a computer by analyzing one or more step-by-step recordings of user input to the computer and of what is displayed by the computer. The step-by-step recordings can be related to a single process or to multiple processes. The step-by-step recordings can be related to a single performance of a task or to multiple performances of a task by one or more users. The step-by-step recordings use various techniques for capturing user input and screenshots, such as hardware event intercepts and/or video intercepts, to generate OCR output, computer vision of controls, a DOM tree, hardware events, causality scores, word vectors, etc.
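The shape of such a recording can be sketched as a list of timestamped events, each pairing the captured user input with a placeholder for the screen capture and the metadata later derived from it. This is a hypothetical illustration; the class and field names are assumptions and omit the actual OS-level event and video intercepts:

```python
# Hypothetical sketch of a step-by-step recording's data shape.
import time


class StepRecorder:
    """Stores each captured user event with a timestamp, its target
    control, a screen-capture placeholder, and derived metadata
    (OCR text, DOM node, word vectors, etc.)."""

    def __init__(self):
        self.events = []

    def record(self, event_type, target, screenshot=None, metadata=None):
        self.events.append({
            "time": time.time(),
            "type": event_type,        # e.g., "click", "keypress"
            "target": target,          # control the user interacted with
            "screenshot": screenshot,  # raw capture, analyzed afterwards
            "metadata": metadata or {},
        })
```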
Process map 100 generated using the method shown in
UI 500, in one embodiment, includes icons analysis 506, graph 508, export 510, and details 512 available for selection by a user. Those icons are collectively referred to as the dashboard. A user selecting analysis 506 is presented with analysis popup window 600 as shown in
Graph 508 of the dashboard, when selected, provides a user with popup window 700 shown in
Export 510 of the dashboard, when selected, provides a user with popup window 700 shown in
Details 512 of the dashboard, when selected, displays various information about collapsed process map 502 depending on the portion of collapsed process map 502 that is currently selected by a user.
Computing system 1100 further includes a memory 1106 for storing information and instructions to be executed by processor(s) 1104. Memory 1106 can be comprised of any combination of Random Access Memory (RAM), Read Only Memory (ROM), flash memory, cache, static storage such as a magnetic or optical disk, or any other types of non-transitory computer-readable media or combinations thereof. Non-transitory computer-readable media may be any available media that can be accessed by processor(s) 1104 and may include volatile media, non-volatile media, or both. The media may also be removable, non-removable, or both.
Additionally, computing system 1100 includes a communication device 1108, such as a transceiver, to provide access to a communications network via a wireless and/or wired connection according to any currently existing or future-implemented communications standard and/or protocol.
Processor(s) 1104 are further coupled via bus 1102 to a display 1110 that is suitable for displaying information to a user. Display 1110 may also be configured as a touch display and/or any suitable haptic I/O device.
A keyboard 1112 and a cursor control device 1114, such as a computer mouse, a touchpad, etc., are further coupled to bus 1102 to enable a user to interface with computing system 1100. However, in certain embodiments, a physical keyboard and mouse may not be present, and the user may interact with the device solely through display 1110 and/or a touchpad (not shown). Any type and combination of input devices may be used as a matter of design choice. In certain embodiments, no physical input device and/or display is present. For instance, the user may interact with computing system 1100 remotely via another computing system in communication therewith, or computing system 1100 may operate autonomously.
Memory 1106 stores software modules that provide functionality when executed by processor(s) 1104. The modules include an operating system 1116 for computing system 1100 and one or more additional functional modules 1118 configured to perform all or part of the processes described herein or derivatives thereof.
One skilled in the art will appreciate that a “system” could be embodied as a server, an embedded computing system, a personal computer, a console, a personal digital assistant (PDA), a cell phone, a tablet computing device, a quantum computing system, or any other suitable computing device, or combination of devices without deviating from the scope of the invention. Presenting the above-described functions as being performed by a “system” is not intended to limit the scope of the present invention in any way, but is intended to provide one example of the many embodiments of the present invention. Indeed, methods, systems, and apparatuses disclosed herein may be implemented in localized and distributed forms consistent with computing technology, including cloud computing systems.
It should be noted that some of the system features described in this specification have been presented as modules, in order to more particularly emphasize their implementation independence. For example, a module may be implemented as a hardware circuit comprising custom very large scale integration (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, graphics processing units, or the like. A module may also be at least partially implemented in software for execution by various types of processors. An identified unit of executable code may, for instance, include one or more physical or logical blocks of computer instructions that may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may include disparate instructions stored in different locations that, when joined logically together, comprise the module and achieve the stated purpose for the module. Further, modules may be stored on a computer-readable medium, which may be, for instance, a hard disk drive, flash device, RAM, tape, and/or any other such non-transitory computer-readable medium used to store data without deviating from the scope of the invention. Indeed, a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. 
The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
The foregoing merely illustrates the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope. Furthermore, all examples and conditional language recited herein are principally intended to be only for pedagogical purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future.
This application is a continuation of prior-filed U.S. Utility patent application Ser. No. 16/917,861, filed Jun. 30, 2020; the disclosure of which is herein incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
6671874 | Passova | Dec 2003 | B1 |
7970240 | Chao | Jun 2011 | B1 |
8321251 | Opalach et al. | Nov 2012 | B2 |
8619084 | Curbera et al. | Dec 2013 | B2 |
8719775 | Cole et al. | May 2014 | B1 |
8782103 | Ahlborn | Jul 2014 | B2 |
9342272 | Tattrie et al. | May 2016 | B2 |
10339027 | Garcia et al. | Jul 2019 | B2 |
20060253490 | Krishna et al. | Nov 2006 | A1 |
20090018877 | Houck et al. | Jan 2009 | A1 |
20090287958 | Bhatt et al. | Nov 2009 | A1 |
20120253880 | Kumar et al. | Oct 2012 | A1 |
20130159047 | Mayerle et al. | Jun 2013 | A1 |
20130297528 | Stiehl | Nov 2013 | A1 |
20140019190 | Arend et al. | Jan 2014 | A1 |
20140282199 | Basu et al. | Sep 2014 | A1 |
20140324518 | Roitman et al. | Oct 2014 | A1 |
20150142587 | Salgar et al. | May 2015 | A1 |
20170213167 | Rinke et al. | Jul 2017 | A1 |
20200349486 | Moolman et al. | Nov 2020 | A1 |
Number | Date | Country |
---|---|---|
10-2012-0067726 | Jun 2012 | KR |
Entry |
---|
Blickle et al., “Automatic Process Discovery with ARIS Process Performance Manager (ARIS PPM),” Expert Paper, IDS Scheer, 2009, pp. 1-12. |
International Search Report and Written Opinion mailed Mar. 29, 2021, in connection with International Patent Application No. PCT/US2020/052219, 9 pgs. |
Number | Date | Country | |
---|---|---|---|
20240070127 A1 | Feb 2024 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16917861 | Jun 2020 | US |
Child | 18502546 | US |