A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/410,999 (the '999 application), filed on May 13, 2019, entitled Robotic Process Automation System with Hybrid Workflows, which is assigned to the assignee of the present application and which is hereby incorporated by reference in its entirety. The '999 application claims priority to U.S. provisional patent application No. 62/670,820, filed on May 13, 2018, entitled Computerized Workflow Generation with Integrated Bot Selection and Generation.
This disclosure relates generally to the field of data processing systems and more particularly to computerized task automation.
Robotic process automation (RPA) is the application of technology that allows workers in an organization to configure computer software, known as a “robot,” to capture and interpret existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems. Conventional RPA systems employ software robots to interpret the user interface of third-party applications and to execute steps identically to a human user. For example, many tasks within organizations require individuals to perform the same repetitive tasks, such as entering data from invoices into an enterprise accounts payable application or entering data from a loan application into a loan processing system. RPA permits the automation of such application-level repetitive tasks via software robots that are coded to repeatedly and accurately perform the repetitive task.
The software robots in conventional RPA systems execute on devices, physical or virtual, that are separate from an RPA server and that contain software to permit creation and/or execution of the software robot. When operating, the software robots and the accompanying software, such as a player that controls execution of the software robot, consume valuable system resources. This can be greatly magnified in larger organizations that may deploy thousands of software robots at any given time. Moreover, management of the deployment of software robots in such larger organizations can become quite complex. There is accordingly a need for improved computerized systems and methods by which software robots are deployed and managed.
A computerized task automation system is disclosed herein which comprises a computerized data storage containing one or more software robots. Each software robot is encoded with a set of instructions that cause the software robot to interact with one or more applications, as encoded by the set of instructions, to perform one or more tasks with the one or more applications to complete a task in a manner that a user would perform the task. A processor is programmed with instructions that when executed by the processor, cause the processor to respond to an execution request that specifies a first software robot by retrieving the first software robot and enabling execution of the first software robot. The first software robot has encoded therein a first instruction that requires the software robot to await occurrence of a first trigger that specifies occurrence of a first event in order to execute the first instruction, and a second trigger that specifies occurrence of a second event in order to execute the first instruction. The processor initiates execution of the first software robot only upon occurrence of the first and the second trigger. Other features include the ability to concurrently monitor triggers for multiple bots and the ability to automatically generate a summary file upon conclusion of execution of a bot where the summary file contains actions taken by the bot and a user of the bot during execution of the bot.
The ability to specify the conditions under which a bot's execution is to be initiated in a highly granular manner reduces power consumption and increases system performance. Additional aspects related to the invention will be set forth in part in the description which follows, and in part will be apparent to those skilled in the art from the description or may be learned by practice of the invention. Aspects of the invention may be realized and attained by means of the elements and combinations of various elements and aspects particularly pointed out in the following detailed description and the appended claims.
It is to be understood that both the foregoing and the following descriptions are exemplary and explanatory only and are not intended to limit the claimed invention or application thereof in any manner whatsoever.
The accompanying drawings, which are incorporated in and constitute a part of this specification, exemplify the embodiments of the present invention and, together with the description, serve to explain and illustrate principles of the inventive techniques disclosed herein. Specifically:
In the following detailed description, reference will be made to the accompanying drawings, in which identical functional elements are designated with like numerals. The aforementioned accompanying drawings show by way of illustration, and not by way of limitation, specific embodiments and implementations consistent with principles of the present invention. These implementations are described in sufficient detail to enable those skilled in the art to practice the invention and it is to be understood that other implementations may be utilized and that structural changes and/or substitutions of various elements may be made without departing from the scope and spirit of present invention. The following detailed description is, therefore, not to be construed in a limited sense.
The user 102 may advantageously employ the interface 104 to implement certain tasks of the business process being displayed by the interface 104 by way of software robots 116, stored in storage 106. Each software robot comprises a set of task processing instructions operable to interact at a user level with one or more designated user level application programs. As used herein, the term “bot” is generally synonymous with the term software robot. In certain contexts, as will be apparent to those skilled in the art in view of the present disclosure, the term “bot runner” refers to a device (virtual or physical), such as devices 118, 119, 120, having the necessary software capability on which a bot will execute or is executing.
The bots 116 execute on a player, via a computing device, to perform the functions encoded by the bot. A bot 116 is created by recording of user actions and/or by encoding of instructions to implement the bot 116. The recording of user actions may be performed by a recorder that detects and stores user actions while interacting with software applications, including operating system functions provided by a computer system. The resulting bot 116 is stored under control of an RPA system controller, such as control room 122 described herein. The bot 116 may be subsequently retrieved and executed by the same user or a different user and executed on the same machine on which it was created or a different machine to perform the user level commands encoded in the bot 116 to reproduce human actions in interacting with applications, including user level operating system functions, that are available to the bot 116. Additional aspects of operation of bots may be found in the following pending patent application, which refers to bots as automation profiles, System and Method for Compliance Based Automation, filed in the U.S. Patent Office on Jan. 6, 2016, and assigned application Ser. No. 14/988,877, which is hereby incorporated by reference in its entirety.
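The recording and replay workflow described above can be sketched as follows. This is a minimal illustrative sketch only, not the patented implementation; the class and field names (RecordedAction, Bot, and the example application and element identifiers) are assumptions introduced for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RecordedAction:
    """A single user-level action captured by a recorder."""
    app: str          # application the action targeted
    action: str       # e.g. "click", "type", "select_menu"
    target: str       # identifier of the UI element acted upon
    value: str = ""   # typed text, if any

@dataclass
class Bot:
    """A bot as an ordered list of replayable user-level actions."""
    name: str
    actions: List[RecordedAction] = field(default_factory=list)

    def record(self, action: RecordedAction) -> None:
        # The recorder appends each detected user action in order.
        self.actions.append(action)

    def replay(self) -> List[str]:
        # A player would dispatch each action to the target application;
        # here we only render the steps that would be executed.
        return [f"{a.action}({a.target!r}, {a.value!r}) in {a.app}"
                for a in self.actions]

bot = Bot("enter_invoice")
bot.record(RecordedAction("AP App", "click", "btn_new_invoice"))
bot.record(RecordedAction("AP App", "type", "field_amount", "1250.00"))
print(bot.replay())
```

Storing the bot as data, rather than compiled logic, is what lets a control room retrieve it later and execute it on a different machine or for a different user.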
Some or all of the bots 116 may in certain embodiments be located remotely from control room 122. Moreover, the devices 118-120 may also be located remotely from control room 122. The bots 116 and the tasks are shown stored in separate storage containers for purposes of illustration, but they may be stored in the same or separate device(s), or across multiple devices. The control room 122 performs user management functions and source control of the bots 116, provides a dashboard with analytics and results of the bots 116, performs license management of software required by the bots 116, and manages overall execution and management of scripts, clients, roles, credentials, security, etc. The major functions performed by the control room 122 include: (i) a dashboard that provides a summary of registered/active users, task status, repository details, number of clients connected, number of scripts that passed or failed recently, tasks that are scheduled to be executed and those that are in progress; (ii) user/role management—permits creation of different roles, such as bot creator, bot runner, admin, and custom roles, and activation, deactivation and modification of roles; (iii) repository management—manages all scripts, tasks, workflows, reports, etc.; (iv) operations management—permits checking the status of tasks in progress and the history of all tasks, and permits the administrator to stop/start execution of bots currently executing; (v) audit trail—logs all actions performed in the control room; (vi) task scheduler—permits scheduling of tasks that need to be executed on different clients at any particular time; (vii) credential management—permits password management; and (viii) security management—permits rights management for all user roles. The control room 122 is shown generally for simplicity of explanation.
Multiple instances of the control room 122 may be employed where large numbers of bots are deployed to provide for scalability of the system 10. Additional details of certain aspects of control room 122 may be found in U.S. patent application Ser. No. 16/146,485, filed on Sep. 28, 2018, entitled ROBOTIC PROCESS AUTOMATION SYSTEM WITH QUEUE ORCHESTRATION AND TASK PRIORITIZATION, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety.
The process modeler 124 allows a user to create, modify, import and export a business process. Business processes are modeled as a series of steps with logic flow between them. The process modeler 124 also enables creation of workflows by connecting existing bots with various types of logic. Data can be passed between bots. The Bot Mapper 126 allows a user to create bot(s) or assign existing bot(s) for any step in a business process. Once a bot is associated with a business process step, this information is available to all bots/services on the RPA platform. The Bot Recommender 128 recommends other similar/complementary bots developed by the user 102's organization or by other organizations and available in a shared repository (such as the Bot Store offered by Automation Anywhere, Inc.) based on the existing bots already mapped to business process steps. This enables bot discovery for maximum re-use of bots and existing automation ROI. The Automation Calculator 130 computes the amount of automation already done and the amount remaining (backlog). It does this by comparing the number of automated steps to the total number of steps in the process. The Automation Calculator 130 also computes the ROI of the automation already done for a business process by aggregating the calculated human labor and time for all automated business process steps. The Bot Sketch module 132 consists of visual screenshots of all key actions taken by the user that will be executed by the bot. A non-technical user can create a Bot Sketch by simply turning on a recorder and carrying out, in the correct sequence, the actions that the bot needs to execute. The Bot Sketch module 132 will show all/key connected visuals for the bot. A Bot Sketch is visible only to those users who have access to that view for that specific bot. The Bot Sketch is the first step in defining the bot that needs to be created.
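The Automation Calculator's two computations (backlog by step counting, ROI by aggregating saved labor) can be sketched as below. This is an illustrative sketch under stated assumptions, not the module's actual code; the step schema (keys `automated`, `minutes`, `rate`) is a hypothetical representation chosen for the example.

```python
def automation_metrics(steps):
    """steps: list of dicts with 'automated' (bool) and, for automated
    steps, the estimated human 'minutes' saved per run and an hourly 'rate'."""
    total = len(steps)
    automated = [s for s in steps if s["automated"]]
    done = len(automated)
    # Backlog: steps not yet automated.
    backlog = total - done
    pct = 100.0 * done / total if total else 0.0
    # ROI per run: aggregate saved labor cost across automated steps.
    roi = sum(s["minutes"] / 60.0 * s["rate"] for s in automated)
    return {"automated": done, "backlog": backlog,
            "percent": pct, "roi_per_run": roi}

process = [
    {"automated": True,  "minutes": 12, "rate": 30.0},  # generate purchase order
    {"automated": True,  "minutes": 20, "rate": 30.0},  # invoice reconciliation
    {"automated": False},                               # make payment (backlog)
]
m = automation_metrics(process)
print(m)
```

With the sample numbers above, two of three steps are automated (one step of backlog) and the aggregated saved labor is 12/60 × $30 + 20/60 × $30 = $16 per run.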
The Bot Design 134 is a visual bot modeling interface that allows a user to create a bot by defining building blocks, using various recorders found in the RPA platform, defining steps to manipulate data, and dragging and dropping various automation commands. The bot is represented in a visual, workflow-style interface geared towards non-technical users. The Bot Code 136 is an integrated development environment (IDE) where a developer can directly write code for a bot. The Bot Model Engine 138 stores the bot design, the underlying command structure and all the metadata associated with the bot. It enables the Bot View Translator 140 to translate the Bot Design to Bot Code. The Bot View Translator 140 enables users to switch between the Bot Design and Bot Code views. It contains the viewing logic to enable these conversions at a granular, automation-command level. The Privilege Administrator 142 stores and enforces view-level privileges so users can view the Bot Design view, the Bot Code view, or both.
The bots 116 may take one of a variety of forms. Unattended bots, seen as uBot 1, uBot 2, . . . , uBot n, are encoded to operate automatically without human user involvement. These bots may be deployed by a human user or may be deployed, without human involvement, programmatically by another bot or other software. uBots are particularly useful in batch processing environments where a large number of documents, for example, need to be processed, and such bots may be scheduled to run at particular times or upon occurrence of particular events. Attended bots, seen as aBot 1, aBot 2, . . . , aBot n, are encoded to automatically perform certain tasks but with human user involvement, which may include, for example, entry of certain data and making of subjective judgments when presented with certain data. An aBot performs certain tasks automatically and accepts user input, such as, for example, in a call center, as needed. Cognitive bots, seen as cBot 1, cBot 2, . . . , cBot n, are encoded to automatically interact with one or more application programs without any user input and are further encoded to automatically alter their interactions with the one or more application programs by way of a machine learning engine. The cognitive bots permit automation of tasks involving unstructured data, permitting use of technologies such as computer vision, natural language processing, fuzzy logic, and machine learning without the help of data scientists or highly trained experts. When employed with computer vision, a cBot can identify and categorize unstructured content, allowing the cBot to intelligently extract decision-making data. For natural language processing, a cBot can comprehend the meaning and intent of content to improve decision making. By employing fuzzy logic, a cBot can conduct phonetic algorithm and fuzzy string matching against enterprise applications to validate and enrich extracted data.
By employing machine learning, a cBot can learn by observing human behavior and developing domain expertise, increasing accuracy and reducing exceptions. Additional details of certain aspects of cBots may be found in U.S. patent application Ser. No. 16/023,786, filed on Jun. 29, 2018, entitled ROBOTIC PROCESS AUTOMATION SYSTEM AND METHOD WITH CROSS-DOMAIN LEARNING, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety.
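The three bot forms (unattended, attended, cognitive) can be modeled as variations on a common interface, as in the minimal sketch below. This is an illustrative sketch only; the class names and the shape of the returned status dictionaries are assumptions, and the cognitive bot's "model" is a stand-in callable rather than a real machine learning engine.

```python
from abc import ABC, abstractmethod

class SoftwareRobot(ABC):
    @abstractmethod
    def run(self, task: dict) -> dict: ...

class UnattendedBot(SoftwareRobot):
    """uBot: executes fully automatically, e.g. scheduled batch work."""
    def run(self, task):
        return {"status": "done", "human_input": False}

class AttendedBot(SoftwareRobot):
    """aBot: automates steps but pauses for human input or judgment."""
    def __init__(self, prompt_user):
        self.prompt_user = prompt_user  # callback supplying the user's decision
    def run(self, task):
        decision = self.prompt_user(task)
        return {"status": "done", "human_input": True, "decision": decision}

class CognitiveBot(SoftwareRobot):
    """cBot: alters its behavior via a learned model (stubbed here)."""
    def __init__(self, model):
        self.model = model  # e.g. a classifier over unstructured content
    def run(self, task):
        label = self.model(task["document"])
        return {"status": "done", "human_input": False, "category": label}

abot = AttendedBot(prompt_user=lambda task: "approve")
print(abot.run({"invoice": 42}))
```

The distinction the sketch captures is where input comes from at run time: nowhere (uBot), a human callback (aBot), or a learned model (cBot).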
The user 102 may employ the system 10 by way of the interface 104 to define a process 105 and to specify which tasks should be performed by a bot, which may be a uBot, an aBot and/or a cBot. As seen by the dotted line from uBot 1 to process task 108, the user 102 has designated an unattended bot, uBot 1, to perform process task 108, generate purchase order. Task 110, invoice reconciliation, has been assigned aBot 1 by the user 102, and task 112, make payment, has been assigned cBot 1.
Preferably, the bots 116 that are available to the user 102 have associated metadata characterizing each bot's capabilities and functions to permit searching and identification of appropriate bot(s). As further described herein, the bots 116 may in certain embodiments be assigned by the user 102 to a particular task, once identified, by conventional drag and drop actions. Each bot 116 that is assigned for processing of a process 105 executes on a device, which may be a physical device or a virtual device (such as implemented by a virtual machine), when invoked within the process 105, via control room 122. As seen, uBot 1 executes on device 118, cBot 1 executes on device 119 and aBot 1 executes on device 120.
Operation of the creation and editing of a process 105 may be better understood in connection with
As seen from the foregoing description, the workflows for hybrid RPA provide for a number of benefits including: (i) an easy way to stitch bots together with conditional logic, (ii) parallel/serial execution of attended bots, unattended bots and cognitive bots, transactional control and scope definition for task bots, (iii) an interface that permits easy design, visualization and execution of bots, (iv) run time visibility into work flow state, and (v) configurability to permit use of same triggers as bots.
The bot aBot 2 is encoded to initiate operation upon occurrence of events corresponding to both of two triggers, Trigger 1 and Trigger 2. A listener 924 operates in conjunction with player 922 to initiate operation of aBot 2. The listener 924 monitors events corresponding to the triggers encoded into aBot 2, Trigger 1 and Trigger 2, and initiates execution of aBot 2 upon occurrence of both triggers, as encoded by aBot 2. By way of example, Trigger 1 corresponds to a file/folder trigger, which has been encoded into aBot 2 as the creation or storage, in a file system of a data storage system 926, of an identified file (as identified, for example, by a specified file name). Further by way of example, Trigger 2 corresponds to occurrence of a menu item event in an application (App 1) being used by user 921. For example, if user 921 is processing an invoice, then the menu item may be a command to “process invoice” in an invoice processing application (App 1) and Document 1 may correspond to a particular invoice. Upon detection of occurrence of both events corresponding to Trigger 1 and Trigger 2, listener 924 initiates execution of aBot 2 to cause processing of the invoice by user 921. In one embodiment, the listener 924 runs as a background process. Once the trigger conditions are satisfied, the bot in question begins execution.
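The listener's conjunction of triggers can be sketched as below: the bot starts only once every required trigger has fired. This is a minimal sketch, not the listener 924's actual implementation; the event-name strings and callback interface are assumptions introduced for illustration.

```python
import threading

class Listener:
    """Background listener that starts a bot only when ALL of the bot's
    encoded triggers have fired."""
    def __init__(self, bot_name, required_triggers, start_bot):
        self.bot_name = bot_name
        self.pending = set(required_triggers)  # triggers not yet fired
        self.start_bot = start_bot             # callback to launch the bot
        self._lock = threading.Lock()          # events may arrive concurrently

    def on_event(self, trigger_name):
        with self._lock:
            if not self.pending:
                return  # bot already started; ignore further events
            self.pending.discard(trigger_name)
            if not self.pending:
                self.start_bot(self.bot_name)

started = []
listener = Listener(
    "aBot2",
    required_triggers={"file_created:invoice.pdf",
                       "menu_selected:process_invoice"},
    start_bot=started.append,
)
listener.on_event("file_created:invoice.pdf")       # Trigger 1 alone: not started
listener.on_event("menu_selected:process_invoice")  # Trigger 2: both satisfied
print(started)
```

Note that the lock matters: a file-system watcher and a UI-event hook typically deliver events from different threads, and the bot must be started exactly once.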
The control room 122 provides for decoupling of triggers from bots to facilitate reusability. Centralized storage of the triggers created by users of the RPA system 900 is provided by storage mechanism 932. A bot that is created by a user of the system 900 is uploaded to the control room 122, which causes storage into storage 932. Upon upload of a bot, the control room scans the uploaded bot and automatically identifies any triggers encoded in the bot. The identified triggers are stored to trigger repository 930. Each of the stored triggers is advantageously available to the same user that created the trigger and also to other users. The control room 122 effectively maintains a public space of triggers and, for each user, a private space of triggers. This can simplify the creation of triggers for bots by permitting copying of an existing trigger instead of having to create one from scratch. Existing triggers may be modified to create new triggers, which are themselves stored to the trigger repository 930. Although the same trigger may be employed with multiple bots, when encoded into a bot, the trigger operates as a trigger only for the bot into which it is encoded and does not operate system wide within the system 900. In other words, in such an embodiment, a trigger once encoded into a bot is a local trigger and typically does not affect any other bot. Therefore, the same trigger may be encoded into different bots executing on separate devices, and the occurrence of an event on one device that causes a bot on that device to execute will not have the same effect on the other device. In another embodiment, it is possible for a single trigger to affect more than one bot, although in practice this is likely to occur infrequently.
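The public/private trigger spaces and the copy-instead-of-recreate workflow can be sketched as follows. This is an illustrative sketch, not the trigger repository 930's actual design; the dictionary-based storage and method names are assumptions made for the example.

```python
class TriggerRepository:
    """Central store with a shared public space and per-user private spaces."""
    def __init__(self):
        self.public = {}   # trigger name -> definition, visible to everyone
        self.private = {}  # user -> {trigger name -> definition}

    def register(self, user, name, definition, public=False):
        space = self.public if public else self.private.setdefault(user, {})
        space[name] = definition

    def visible_to(self, user):
        # A user sees all public triggers plus their own private ones.
        merged = dict(self.public)
        merged.update(self.private.get(user, {}))
        return merged

    def copy_as(self, user, source_name, new_name, **overrides):
        # Derive a new trigger from an existing one instead of starting
        # from scratch; the copy lands in the user's private space.
        base = dict(self.visible_to(user)[source_name])
        base.update(overrides)
        self.register(user, new_name, base)
        return base

repo = TriggerRepository()
repo.register("alice", "invoice_arrival",
              {"type": "file", "pattern": "*.pdf"}, public=True)
t = repo.copy_as("bob", "invoice_arrival", "po_arrival", pattern="PO-*.pdf")
print(sorted(repo.visible_to("bob")))
```

Here bob sees alice's public trigger plus his derived private one, while alice never sees bob's copy, mirroring the public-space/private-space split described above.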
The RPA system 900 employs a configuration file 934 for each user to associate information specific to that user's use of, and interaction with, the system 900. For example, upon logging into the system 900 to create a bot or to add triggers to a bot, the user will be able to retrieve (or may automatically be shown) bots previously created and/or used by the user, along with triggers used and/or created by the user, to facilitate bot and trigger creation. In practice, it is likely that any given user will reuse triggers they previously created, with some modifications. It should be noted that the storage 932 for trigger repository 930 and configuration files 934 and the storage 926 for the documents are shown as separate from each other, and from storage 116 for bots, for simplicity of illustration. The bots, triggers, configuration files and documents may be stored in separate storage mechanisms, in the same mechanism, or distributed across multiple storage mechanisms by known techniques.
One of the benefits of permitting the encoding of multiple conditions that must occur before initiation of execution of a bot is the savings in system resources. A bot that has been initiated but that is not being used consumes valuable system resources. A user who is using a physical device to execute a bot may experience degraded performance while the bot is executing and the user is attempting to use the device to perform other tasks. Additionally, in the event that the user is using another device, such as described in the aforementioned patent application, ZERO FOOTPRINT ROBOTIC PROCESS AUTOMATION SYSTEM, the system 900 will be required to allocate resources, such as a virtual machine or a physical device, to execute the bot. This can increase power consumption and/or limit the ability of the system to perform other processing, such as allocating virtual devices to other users, while valuable resources are being consumed by an executing bot that is not being used. Consequently, the ability to specify the conditions under which a bot's execution is to be initiated in a highly granular manner reduces power consumption and increases system performance. In this respect, it should be noted that long periods of time can elapse between occurrence of one trigger and another trigger. For example, in the example of aBot 1 in
The system 900 advantageously supports a variety of different types of triggers including: (i) file/folder triggers that indicate the creation, modification or deletion of a file, multiple files, a folder or multiple folders, including wild cards to enable identification of multiple files/folders named with a specified string of characters; (ii) email triggers that specify a particular field of an email (e.g. to, from, cc, bcc, subject, date) and include receipt and, in certain embodiments, sending of an email; (iii) application or operating system service triggers, for a service designated by name or other unique identifier, such as Windows® services, and the stopping, starting or pausing of the service; (iv) process triggers, for a process designated by name or other unique identifier, such as a process ID, and the stopping, starting, termination or pausing of the process; (v) performance specification triggers, such as minimum processor capability such as CPU clock speed, processor specification by name, or storage capability such as minimum disk availability; (vi) window triggers—a window of a certain name and any action associated with the window, such as create, open, close, kill; (vii) user interface (UI) object triggers, such as a button or textbox and any action taken therewith; (viii) image triggers, for a specified region on a screen and any event associated with the image, such as appearance (image load), disappearance or selection of an image located within the specified region on the screen; (ix) menu item triggers—any menu item rendered by any application; and (x) interactive voice response (IVR) system triggers—any specific output provided by an IVR system. In addition to the foregoing, a bot can provide an event, including termination of the bot, that can be used as a trigger for another bot.
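A few of the trigger types above can be sketched as small matchers with a common `matches` interface. This is an illustrative sketch under stated assumptions, not the system 900's trigger encoding; the dataclass fields and event vocabularies are hypothetical, and wild-card matching is shown with Python's standard `fnmatch`.

```python
from dataclasses import dataclass
from fnmatch import fnmatch

@dataclass
class FileTrigger:
    """File/folder trigger; pattern may contain wild cards."""
    pattern: str
    event: str  # "create" | "modify" | "delete"
    def matches(self, path, event):
        return event == self.event and fnmatch(path, self.pattern)

@dataclass
class EmailTrigger:
    """Email trigger keyed on a particular field of the email."""
    field: str  # "to", "from", "cc", "bcc", "subject", "date"
    value: str
    def matches(self, email):
        return email.get(self.field) == self.value

@dataclass
class ProcessTrigger:
    """Process trigger keyed on process name and lifecycle action."""
    name: str
    action: str  # "start" | "stop" | "pause" | "terminate"
    def matches(self, name, action):
        return (name, action) == (self.name, self.action)

ft = FileTrigger(pattern="invoice-*.pdf", event="create")
print(ft.matches("invoice-0042.pdf", "create"))  # True
```

Giving every trigger type the same `matches` interface is what lets a single listener evaluate heterogeneous trigger lists uniformly, including the wild-card file/folder case called out in item (i).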
In the example of
The embodiments herein can be implemented in the general context of computer-executable instructions, such as those included in program modules, being executed in a computing system on a target real or virtual processor. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Computer-executable instructions for program modules may be executed within a local or distributed computing system. The computer-executable instructions, which may include data, instructions, and configuration parameters, may be provided via an article of manufacture including a computer readable medium, which provides content that represents instructions that can be executed. A computer readable medium may also include a storage or database from which content can be downloaded. A computer readable medium may also include a device or product having content stored thereon at a time of sale or delivery. Thus, delivering a device with stored content, or offering content for download over a communication medium may be understood as providing an article of manufacture with such content described herein.
The terms “computer system,” “system” and “computing device” are used interchangeably herein. Unless the context clearly indicates otherwise, none of these terms implies any limitation on a type of computing system or computing device. In general, a computing system or computing device can be local or distributed and can include any combination of special-purpose hardware and/or general-purpose hardware with software implementing the functionality described herein.
Computing system 1500 may have additional features such as, for example, storage 1510, one or more input devices 1514, one or more output devices 1512, and one or more communication connections 1516. An interconnection mechanism (not shown) such as a bus, controller, or network interconnects the components of the computing system 1500. Typically, operating system software (not shown) provides an operating system for other software executing in the computing system 1500, and coordinates activities of the components of the computing system 1500.
The tangible storage 1510 may be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way, and which can be accessed within the computing system 1500. The storage 1510 stores instructions for the software implementing one or more innovations described herein.
The input device(s) 1514 may be a touch input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, or another device that provides input to the computing system 1500. For video encoding, the input device(s) 1514 may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing system 1500. The output device(s) 1512 may be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1500.
The communication connection(s) 1516 enable communication over a communication medium to another computing entity. The communication medium conveys information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.
It should be understood that functions/operations shown in this disclosure are provided for purposes of explanation of operations of certain embodiments. The implementation of the functions/operations performed by any particular module may be distributed across one or more systems and computer programs and are not necessarily contained within a particular computer program and/or computer system.
While the invention has been described in connection with a preferred embodiment, it is not intended to limit the scope of the invention to the particular form set forth, but on the contrary, it is intended to cover such alternatives, modifications, and equivalents as may be within the spirit and scope of the invention as defined by the appended claims.
20130290318 | Shapira et al. | Oct 2013 | A1 |
20140181705 | Hey et al. | Jun 2014 | A1 |
20150082280 | Betak et al. | Mar 2015 | A1 |
20150347284 | Hey et al. | Dec 2015 | A1 |
20160019049 | Kakhandiki et al. | Jan 2016 | A1 |
20160078368 | Kakhandiki et al. | Mar 2016 | A1 |
20180321989 | Shetty | Nov 2018 | A1 |
20190155225 | Kothandaraman | May 2019 | A1 |
20190303779 | Van Briggle | Oct 2019 | A1 |
Entry |
---|
Wroblewska et al., “Robotic Process Automation of Unstructured Data with Machine Learning” 2018. (Year: 2018). |
Kopec et al., “Hybrid Approach to Automation, RPA, and Machine Learning: a Method for the Human-centered Design of Software Robots” Nov. 6, 2018. (Year: 2018). |
Isaac et al., “Delineated Analysis of Robotic Process Automation Tools” Dec. 24, 2017. (Year: 2017). |
Workato “The modern approach to RPA: Integration-powered RPA for Intelligent Automation” 2022, https://www.workato.com/modern-rpa. (Year: 2022). |
Workato “What are Recipes and Triggers?” May 12, 2017, https://www.youtube.com/watch?v=cPBVTpmA8z0. (Year: 2017). |
Workato “Event Based Automation with Triggers” Oct. 28, 2017, https://www.youtube.com/watch?v=-KfYHRqLBgs. (Year: 2017). |
Al Sallami, Load Balancing in Green Cloud Computation, Proceedings of the World Congress on Engineering 2013 vol. II, WCE 2013, 2013, pp. 1-5 (Year: 2013). |
B. P. Kasper “Remote: a Means of Remotely Controlling and Storing Data from a HAL Quadrupole Gas Analyzer Using an IBM-PC Compatible Computer”, Nov. 15, 1995, Space and Environment Technology Center. |
Bergen et al., RPC automation: making legacy code relevant, May 2013, 6 pages. |
Hu et al., Automating GUI testing for Android applications, May 2011, 7 pages. |
Konstantinou et al., An architecture for virtual solution composition and deployment in infrastructure clouds, 9 pages (Year: 2009). |
Nyulas et al., An Ontology-Driven Framework for Deploying JADE Agent Systems, 5 pages (Year: 2008). |
Tom Yeh, Tsung-Hsiang Chang, and Robert C. Miller, Sikuli: Using GUI Screenshots for Search and Automation, Oct. 4-7, 2009, 10 pages. |
Yu et al., Deploying and managing Web services: issues, solutions, and directions, 36 pages (Year: 2008). |
Zhifang et al., Test automation on mobile device, May 2010, 7 pages. |
Number | Date | Country |
---|---|---|
62670820 | May 2018 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 16410999 | May 2019 | US |
Child | 16458138 | US |