Robotic process automation system with separate code loading

Information

  • Patent Grant
  • 11954514
  • Patent Number
    11,954,514
  • Date Filed
    Tuesday, August 31, 2021
  • Date Issued
    Tuesday, April 9, 2024
Abstract
A robotic process automation system includes a server processor that performs an automation task to process a work item by initiating a java virtual machine on a second device. A first user session that employs credentials of a first user for managing execution of the automation task is also initiated on the second device. The server processor loads into the java virtual machine, with a platform class loader, one or more modules, such as logging and security, that perform functions common to the sets of task processing instructions. With a first class loader, a first set of task processing instructions is also loaded. Then each instruction in the first set of task processing instructions is loaded with a separate class loader. The server processor causes execution, under control of the first user session, on the second device, of the task processing instructions that correspond to the work item.
Description
FIELD OF THE DISCLOSURE

This disclosure relates generally to the field of data processing systems and more particularly to robotic process automation systems.


BACKGROUND

Robotic process automation (RPA) is the application of technology that allows workers in an organization to configure computer software, known as a “robot,” to capture and interpret existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems. Conventional RPA systems employ software robots to interpret the user interface of third-party applications and to execute steps identically to a human user. For example, many tasks within organizations require individuals to perform the same repetitive tasks, such as entering data from invoices into an enterprise accounts payable application or entering data from a loan application into a loan processing system. RPA permits the automation of such application level repetitive tasks via software robots that are coded to repeatedly and accurately perform the repetitive task.


The software robots in conventional RPA systems execute on devices, physical or virtual, that are separate from an RPA server and which contain software to permit creation and/or execution of the software robot. While this has proven to be highly beneficial in facilitating data processing, the requirement for bot creation/execution software to be loaded onto different devices increases administrative complexity and can limit the processing capability of the RPA system. Moreover, because the software robots operate at an application level, as a human user would engage with such applications, conventional RPA systems are operating system dependent. A software robot encoded to perform tasks on, for example, a Windows® operating system, will need to be executed to perform the tasks for which it has been encoded on the Windows® operating system. This dependency can limit the scalability and increase the cost of deployment of an RPA system.


SUMMARY

Computerized RPA methods and systems that increase the flexibility, lower the cost and increase the reliability with which RPA systems may be deployed are disclosed herein. A robotic process automation system includes data storage which stores a plurality of sets of task processing instructions. Each set of task processing instructions is operable to interact at a user level with one or more designated user level application programs. The data storage also stores a plurality of work items, where each work item is stored for subsequent processing by executing a corresponding set of task processing instructions. A server processor is operatively coupled to the data storage and is configured to execute instructions that cause the server processor to respond to a request to perform an automation task to process a work item from the plurality of work items, by initiating a java virtual machine on a second device. Also initiated on the second device is a first user session that employs credentials of a first user, for managing execution of the automation task. The server processor permits retrieval of the set of task processing instructions that correspond to the work item. The server processor loads into the java virtual machine, with a platform class loader, one or more modules, such as logging and security, that perform functions common to the sets of task processing instructions. With a first class loader, a first set of task processing instructions is also loaded. Then each instruction in the first set of task processing instructions is loaded with a separate class loader. The server processor causes execution, under control of the first user session, on the second device, of the task processing instructions that correspond to the work item.


The platform class loader's control of certain common functions (such as security and logging) of its child class loaders, such as the first class loader and the class loaders for each command, permits centralized control of key functions and ensures that each set of task processing instructions is sandboxed, i.e., limited to its own credential authorizations and unable to affect the activities of other sets of task processing instructions, just as a human user is limited to the resources and activities permitted by their own credentials. Moreover, this approach is extended to each command, where each command's class loader is a child of the platform class loader, thus permitting direct application of the platform class loader's common services and preventing override of such services by way of the task processing instructions' class loader. Furthermore, employing a separate class loader for each command limits the impact of implementation of each command. A class loaded for one command will not inadvertently collide with an identically named class loaded for another command, as could occur if a common class loader were employed for all commands in a set of task processing instructions.
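The parent-first hierarchy described above can be sketched in Java. This is a minimal illustration only; the names (LoaderHierarchy, loaderForCommand) are assumptions for explanatory purposes and are not part of the disclosed system.

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderHierarchy {
    // Shared parent loader standing in for the platform class loader that
    // serves common modules such as logging and security.
    static final ClassLoader PLATFORM_LOADER = LoaderHierarchy.class.getClassLoader();

    // One child loader per command. URLClassLoader delegates parent-first by
    // default, so classes for common services always resolve through the
    // shared platform loader and cannot be overridden by a command's jars.
    static URLClassLoader loaderForCommand(URL[] commandJars) {
        return new URLClassLoader(commandJars, PLATFORM_LOADER);
    }

    public static void main(String[] args) {
        URLClassLoader cmd1 = loaderForCommand(new URL[0]);
        URLClassLoader cmd2 = loaderForCommand(new URL[0]);
        // Both command loaders share one parent, so common classes load once;
        // command-local classes stay isolated per loader and cannot collide.
        System.out.println(cmd1.getParent() == cmd2.getParent()); // prints true
    }
}
```

The parent-first delegation is what prevents a command's own jars from overriding the shared security and logging classes.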


These and additional aspects related to the invention will be set forth in part in the description which follows, and in part will be apparent to those skilled in the art from the description or may be learned by practice of the invention. Aspects of the invention may be realized and attained by means of the elements and combinations of various elements and aspects particularly pointed out in the following detailed description and the appended claims.


It is to be understood that both the foregoing and the following descriptions are exemplary and explanatory only and are not intended to limit the claimed invention or application thereof in any manner whatsoever.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, exemplify the embodiments of the present invention and, together with the description, serve to explain and illustrate principles of the inventive techniques disclosed herein. Specifically:



FIG. 1 is a high-level block diagram of an embodiment of an RPA system with server-based bot creation and execution.



FIG. 2 illustrates commands exchanged between a client device and a server in the RPA system of FIG. 1.



FIGS. 3A, 3B, 3C, 3D, 3E and 3F illustrate operation of various modules of the RPA system of FIG. 1.



FIG. 4 illustrates a bot farm service that may be used in connection with the RPA system of FIG. 1.



FIG. 5 illustrates a second embodiment of the RPA system of FIG. 1.



FIGS. 6A and 6B illustrate embodiments of virtual machine configurations.



FIG. 7 illustrates an embodiment of code translation that may be employed by the embodiment of the RPA system in FIG. 5.



FIG. 8 illustrates an embodiment for providing bots in a platform independent manner.



FIG. 9 illustrates details of class loading in an embodiment employing a java virtual machine.



FIG. 10 illustrates a block diagram of hardware that may be employed in an implementation of the RPA systems disclosed herein.





DETAILED DESCRIPTION

In the following detailed description, reference will be made to the accompanying drawings, in which identical functional elements are designated with like numerals. Elements designated with reference numbers ending in a suffix such as 0.1, 0.2, 0.3 are referred to collectively by employing the main reference number without the suffix. For example, 100 refers to topics 100.1, 100.2, 100.3 generally and collectively. The aforementioned accompanying drawings show by way of illustration, and not by way of limitation, specific embodiments and implementations consistent with principles of the present invention. These implementations are described in sufficient detail to enable those skilled in the art to practice the invention and it is to be understood that other implementations may be utilized and that structural changes and/or substitutions of various elements may be made without departing from the scope and spirit of the present invention. The following detailed description is, therefore, not to be construed in a limited sense.


In FIG. 1, the embodiments disclosed herein implement a robotic process automation system 10 that includes data storage, seen generally at 102, which stores a plurality of sets of task processing instructions 104. Each set of task processing instructions 104 implements a software robot, also referred to as a bot (seen as Bot 1, Bot 2, . . . , Bot n), which is operable to interact at a user level with one or more designated user level application programs (not shown). As used herein, the term “bot” is generally synonymous with the term software robot. In certain contexts, as will be apparent to those skilled in the art in view of the present disclosure, the term “bot runner” refers to a device (virtual or physical), having the necessary software capability (such as bot player 126), on which a bot will execute or is executing. The data storage 102 also stores a plurality of work items 106, where each work item 106 is stored for subsequent processing by executing a corresponding set of task processing instructions 104. A control room, seen generally at 108, is operatively coupled to the data storage 102 and is configured to execute instructions that, when executed, cause the RPA system 10 to respond to a request from a client device 110, issued by a user 112.1, to act as a server to provide to the client device 110 the capability to perform an automation task to process a work item from the plurality of work items 106. For simplicity of illustration and explanation, a single client device 110 is shown in detail. The RPA system 10 preferably is able to support multiple client devices 110 concurrently, each of which will have one or more corresponding user session(s) 118, which provides a context. The context includes security, permissions, audit trails, etc., to define the permissions and roles for bots operating under the user session 118.
For example, a bot executing under a session cannot access any files or use any applications for which the user, under whose credentials the bot is operating, lacks permission. This prevents inadvertent or malicious acts by a bot 104.


The control room 108 provides to the client device 110, software code to implement a node manager 114 that executes on the client device 110 and which provides to a user 112 a visual interface via browser 113 to view progress of and to control execution of the automation task. It should be noted here that the node manager 114 is provided to the client device 110 on demand, when required by the client device 110 to execute a desired automation task. In one embodiment, the node manager 114 may remain on the client device 110 after completion of the requested automation task to avoid the need to download it again. In another embodiment, the node manager 114 may be deleted from the client device 110 after completion of the requested automation task. The node manager 114 also maintains a connection to the control room 108 to inform the control room 108 that device 110 is available for service by the control room 108, irrespective of whether a live user session 118 exists. When executing a bot 104, the node manager 114 impersonates the user 112 by employing credentials associated with the user 112. In certain embodiments, the system 10 employs user impersonation as described in U.S. patent application entitled ROBOTIC PROCESS AUTOMATION SYSTEM WITH DEVICE USER IMPERSONATION filed on Mar. 31, 2019, assigned application Ser. No. 16/371,046, which application is assigned to the assignee of the present application and which is hereby incorporated by reference in its entirety. In application Ser. No. 16/371,046 the term “bot runner” is used in the manner that the term “bot” is used in the present application.


The control room 108 initiates on the client device 110, a user session 118 (seen as a specific instantiation 118.1) to perform the automation task. The control room 108 retrieves the set of task processing instructions 104 that correspond to the work item 106. The task processing instructions 104 that correspond to the work item 106 execute under control of the user session 118.1, on the device 110. The node manager 114 provides update data indicative of status of processing of the work item to the control room 108. The control room 108 terminates the user session 118.1 upon completion of processing of the work item 106. User session 118.1 is shown in further detail at 119, where an instance 124.1 of user session manager 124 is seen along with a bot player 126, proxy service 128 and one or more virtual machine(s) 130, such as a virtual machine that runs Java® or Python®. The user session manager 124 provides a generic user session context within which a bot 104 executes.


The bots 104 execute on a player, via a computing device, to perform the functions encoded by the bot. Additional aspects of operation of bots may be found in the following pending patent application, which refers to bots as automation profiles: SYSTEM AND METHOD FOR COMPLIANCE BASED AUTOMATION, filed in the U.S. Patent Office on Jan. 6, 2016, and assigned application Ser. No. 14/988,877, which is hereby incorporated by reference in its entirety.


Some or all of the bots 104 may in certain embodiments be located remotely from the control room 108. Moreover, the devices 110 and 111 may also be located remotely from the control room 108. The bots 104 and the tasks 106 are shown in separate containers for purposes of illustration but they may be stored in separate or the same device(s), or across multiple devices. The control room 108 performs user management functions, source control of the bots 104, along with providing a dashboard that provides analytics and results of the bots 104, performs license management of software required by the bots 104 and manages overall execution and management of scripts, clients, roles, credentials, security, etc. The major functions performed by the control room 108 include: (i) a dashboard that provides a summary of registered/active users, task status, repository details, number of clients connected, number of scripts passed or failed recently, tasks that are scheduled to be executed and those that are in progress; (ii) user/role management—permits creation of different roles, such as bot creator, bot runner, admin, and custom roles, and activation, deactivation and modification of roles; (iii) repository management—to manage all scripts, tasks, workflows, reports, etc.; (iv) operations management—permits checking status of tasks in progress and history of all tasks, and permits the administrator to stop/start execution of bots currently executing; (v) audit trail—logs all actions performed in the control room; (vi) task scheduler—permits scheduling tasks which need to be executed on different clients at any particular time; (vii) credential management—permits password management; and (viii) security management—permits rights management for all user roles. The control room 108 is shown generally for simplicity of explanation.
Multiple instances of the control room 108 may be employed where large numbers of bots are deployed to provide for scalability of the RPA system 10.


In the event that a device, such as device 111 (seen operated by user 112.2), does not satisfy the minimum processing capability to run the node manager 114, the control room 108 provides, on another device that has the requisite capability, such as device 115, within a Virtual Machine (VM), seen as VM 116 that is resident on the device 115, a node manager 114 that is in communication with browser 113 on device 111. This permits the RPA system 10 to operate with devices that may have lower processing capability, such as older laptops, desktops, and portable/mobile devices such as tablets and mobile phones. In certain embodiments browser 113 may take the form of a mobile application stored on the device 111. The control room 108 establishes a user session 118.2 for the user 112.2 while interacting with the control room 108 and the corresponding user session 118.2 operates as described above for user session 118.1 with user session manager 124 as described above in connection with device 110.


In certain embodiments, the user session manager 124 provides five functions. First is a health service 138 that maintains and provides a detailed logging of bot execution including monitoring memory and CPU usage by the bot and other parameters such as number of file handles employed. The bots 104 employ the health service 138 as a resource to pass logging information to the control room 108. Execution of the bot is separately monitored by the user session manager 124 to track memory, CPU and other system information. The second function provided by the user session manager 124 is a message queue 140 for exchange of data between bots executed within the same user session 118. Third is a deployment service 142 that connects to the control room 108 to request execution of a requested bot 104. The deployment service 142 also ensures that the environment is ready for bot execution such as by making available dependent libraries. Fourth is a bot launcher 144 which reads metadata associated with a requested bot 104 and launches an appropriate container and begins execution of the requested bot. Fifth is a debugger service 146 that can be used to debug bot code.


The bot player 126 executes, or plays back, the sequence of instructions encoded in a bot. The sequence of instructions is captured by way of a recorder when a human performs those actions, or alternatively the instructions are explicitly coded into the bot. These instructions enable the bot player 126 to perform the same actions as a human would in their absence. The instructions are composed of a command (action) followed by a set of parameters. For example, Open Browser is a command, and a URL would be the parameter for it to launch the site. Proxy service 128 enables the integration of external software or applications with the bot to provide specialized services. For example, an externally hosted artificial intelligence system could enable the bot to understand the meaning of a “sentence.”
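The command-plus-parameters structure described above can be sketched as follows. This is a hypothetical illustration; the Instruction record, the play method, and the command names are assumptions, not the disclosed bot format.

```java
import java.util.List;
import java.util.Map;

public class BotPlayerSketch {
    // Hypothetical instruction: a command name (the action) plus its
    // parameters, e.g. "OpenBrowser" with a "url" parameter.
    record Instruction(String command, Map<String, String> params) {}

    // Play back the sequence of instructions, dispatching on the command
    // name and returning a log of the actions taken.
    static String play(List<Instruction> script) {
        StringBuilder log = new StringBuilder();
        for (Instruction i : script) {
            switch (i.command()) {
                case "OpenBrowser" -> log.append("open ").append(i.params().get("url"));
                case "Click"       -> log.append("click ").append(i.params().get("target"));
                default            -> log.append("unknown ").append(i.command());
            }
            log.append('\n');
        }
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.print(play(List.of(
            new Instruction("OpenBrowser", Map.of("url", "https://example.com")))));
    }
}
```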


The user 112 interacts with node manager 114 via a conventional browser 113 which employs the node manager 114 to communicate with the control room 108. When the user 112 logs onto the control room 108 from the client device 110 for the first time, they are prompted to download and install the node manager 114 on the device 110, if one is not already present. The node manager 114 establishes a web socket connection to the user session manager 124, deployed by the control room 108, that lets the user 112 subsequently create, edit and deploy the bots 104.


The node manager 114, which is provided to the device 110 by the control room 108, in certain embodiments provides three functions, as illustrated in FIG. 2. First is a discovery service 132 that establishes and maintains a connection to the control room 108 and acts as a resource to the control room 108 for the device 110. Second, the node manager 114 provides an autologin service 134 that provides a vehicle to allow the control room 108 to login or to create a user session 118 by launching user session manager 124 which works with the control room 108 to serve control room requests. Third, the node manager 114 provides a logging function 136 to provide a single, centralized point for streaming of all logging data back to the control room 108, via the health service 138, which stores the received log data to a data log 214.


Operation of the message queue 140 is illustrated in FIG. 3A. The basic exchange of data between bots, Bot 1 and Bot 2, that are executed within the same user session is performed using the message queue 140. Furthermore, the message queue 140 can be used as the mechanism to synchronize between different code blocks or between parallel execution of bots in the same user session. In one embodiment, there is no persistence of queue data; once the user session is killed, the queue is lost. In such an embodiment, for longer-term exchange of data across different user sessions or between bots across different client devices 110, alternative messaging may be employed, such as by use of JavaScript Object Notation (JSON) objects.
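A session-scoped message queue of the kind described above might be sketched as follows. The class name and methods are illustrative assumptions; the key property from the text is that nothing is persisted, so the queue's contents vanish with the user session.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class SessionMessageQueue {
    // In-memory queue scoped to a single user session: nothing is persisted,
    // so when the session is killed the queue contents are lost.
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // One bot publishes a message for other bots in the same session.
    public void send(String message) { queue.offer(message); }

    // Another bot consumes the next message, or null if none is pending.
    // A real implementation might block with take() instead of poll().
    public String receive() { return queue.poll(); }

    public static void main(String[] args) {
        SessionMessageQueue q = new SessionMessageQueue();
        q.send("Bot 1 -> Bot 2: record processed"); // Bot 1 publishes
        System.out.println(q.receive());            // Bot 2 consumes
    }
}
```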


Initiation of execution of a bot 104 is illustrated in FIG. 3B which shows two user sessions (118.1, 118.2) created on two devices. User session managers 124.1 and 124.2 at 301 initiate, in devices 110 and 115 respectively, user sessions 118.1 and 118.2, under control of deployment module 142, for bot execution. The deployment module 142 at 302 prepares the user session 118 for execution by setting up the environment needed for the bot execution. This includes setting up appropriate path variables that the bot may call upon while executing. This ensures that all dependencies, like external libraries, are available for the bot to execute. At 304 the bot deployment module 142 issues bot deployment requests to the control room 108. The control room 108 responds by retrieving the requested bot, Bot 1, and providing it to user session manager 124.1 which is executing on device 110. In the case of device 111, which does not have the capability to execute the node manager 114, another device is selected, device 115 in this case, upon which the node manager will execute to permit the user session manager 124 to initiate user session 118.2 to execute Bot 2. At 306, the bot launcher 144 in user session manager 124.1 reads the metadata for Bot 1 and launches a container 308.1 within which Bot 1 will execute, and then initiates execution of Bot 1. Similar actions are performed by a bot launcher executing within user session 118.2 on device 115 to initiate execution of Bot 2.


Operation of the debugger 146 is seen in FIG. 3C. If the user 112 is logged into the control room 108 as a bot creator employing a bot creator 320, they may debug with debugger 146 the code of a selected bot 104. The debugger 146 enables the bot creator to step-through the instructions in the bot and ensure that it is working as designed or created. The debugger 146 interactively provides the state of various variables, input and output parameters, allowing the creator to fix any errors discovered during the bot testing.



FIGS. 3D, 3E and 3F are flow diagrams illustrating operation of certain aspects of three embodiments of bot launcher 144. In FIG. 3D, the bot launcher 144, upon receiving an identifier for a bot 104 requested by user 112 (such as for example Bot 1) and an identifier for a device requested by user 112, accesses at 330 the requested bot to identify at 332 requirements encoded within the requested bot that specify capabilities and resources required for the requested bot to execute its programmed instructions. The capabilities and resources may be explicitly identified within the requested bot and/or the bot launcher 144 may scan the coding in the requested bot to automatically determine some or all of the required capabilities and resources. Capabilities and resources required by the bot 104 may include minimum processing, storage, communications capabilities, access to required services, such as hosted applications (e.g. various enterprise resource planning or customer relationship management applications), various files that may be required, and application programs that may be required to be installed such as for example, Microsoft Office® applications (Word®, Excel®, Outlook®, Powerpoint®). Capabilities and resources, as just described, of the requested device are determined at 334. If the capabilities/resources of the requested device are determined at 336 to be sufficient to execute the requested bot the bot launcher 144 continues with other required operations to launch the requested bot. Otherwise, the user 112 is notified at 340 so that another device may be requested.
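The sufficiency check at step 336 can be sketched as a simple comparison between a bot's declared requirements and a device's capabilities. The records and field names below are hypothetical, chosen only to mirror the memory, storage, and installed-application requirements the paragraph describes.

```java
import java.util.Set;

public class CapabilityCheck {
    // Hypothetical requirement model: minimum memory plus the application
    // programs the requested bot needs (e.g. Excel for a spreadsheet bot).
    record Requirements(long minMemoryMb, Set<String> apps) {}

    // Hypothetical device model: what the requested device advertises.
    record Device(String id, long memoryMb, Set<String> installedApps) {}

    // Step 336: the device is sufficient if its memory meets the minimum
    // and its installed applications cover everything the bot declares.
    static boolean sufficient(Device d, Requirements r) {
        return d.memoryMb() >= r.minMemoryMb()
            && d.installedApps().containsAll(r.apps());
    }

    public static void main(String[] args) {
        Requirements r = new Requirements(4096, Set.of("Excel"));
        Device capable = new Device("dev-1", 8192, Set.of("Excel", "Word"));
        Device limited = new Device("dev-2", 2048, Set.of("Word"));
        // Launch proceeds for dev-1; the user is notified for dev-2.
        System.out.println(sufficient(capable, r)); // prints true
        System.out.println(sufficient(limited, r)); // prints false
    }
}
```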



FIG. 3E illustrates operation of another embodiment of bot launcher 144 where the bot launcher 144 automates the process of identifying an available device with the capabilities/resources required by a requested bot. At 336, if the requested device does not have the required capabilities/resources then at 342, the bot launcher performs a scan of available devices as maintained by control room 108. If no device with sufficient capabilities/resources is currently available, the user 112 is informed at 346. If at 344 it is determined that one or more devices with sufficient capabilities/resources is/are currently available, the bot launcher 144 selects one of such devices at 348 and the bot launcher 144 continues with other required operations to launch the requested bot.



FIG. 3F illustrates operation of another embodiment of bot launcher 144 where the bot launcher 144, fully automates the process of identifying an available device with the capabilities/resources required by a requested bot. In FIG. 3F, the bot launcher receives at 330 only the identification of the requested bot and identifies, at operations 342, 344 and 348, an available device with sufficient capabilities/resources. In the embodiments of FIGS. 3D, 3E and 3F the devices scanned and selected may be physical devices and/or virtual devices such as described below in connection with FIG. 4.



FIG. 4 illustrates a bot farm service that may be used in connection with the RPA system of FIG. 1 to employ virtualization to provide larger scale bot processing capability. The scheduler service 402 provides for virtual machine (VM) creation 404 and VM deployment 410. VM creation 404 permits selection of configuration settings 406 where a time can be specified when the scheduler service 402 creates a VM image (i.e. virtual device 415). VM creation 404 also permits selection of a template or blueprint that contains specifications for the VM, such as processing capability, and memory and storage size. A user may employ the VM deployment module 410 to schedule a particular bot to run on n VMs (for example n=100). Embodiments disclosed herein support a category of VM termed herein an “ephemeral device,” which is a device that exists only for the duration of bot execution. To deploy devices, the scheduler at 412 determines if one or more of the devices requested to be deployed is an ephemeral device. If not, then deployment service 414 deploys the requested device(s). If a requested device is determined at 412 to be an ephemeral device, then predeployment service 416 is employed to create the requested ephemeral device(s) in accordance with criteria specified by way of a blueprint that specifies required processing capabilities, storage capabilities and software requirements, such as application programs required to be installed on the ephemeral device. These ephemeral devices will then show up as devices connected and available, and are then associated with bot deployment metadata. Deployment service 414 is then employed to deploy the ephemeral device(s). The bot farm engine 418 is a service that enables creating virtual machines on demand using a native Application Program Interface (API) provided by a cloud provider. It instantiates VMs that can then be used to run/play the bots.
The bot farm engine 418 uses pre-generated templates or blueprints that define the configuration of the VM that needs to be created. These VMs are virtual devices for playing the bots. On completion of the execution of the bots, the user session managers 124 from the respective devices indicate the completion, and control room 108 can then reclaim the virtual machines by spinning them down and closing them.
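The routing decision at step 412 can be sketched as follows. The DeviceRequest record and route method are illustrative assumptions; the point is simply that ephemeral requests go through pre-deployment (creation from a blueprint) while ordinary devices deploy directly.

```java
import java.util.List;

public class SchedulerSketch {
    // Hypothetical device request: an ephemeral device exists only for the
    // duration of bot execution and must be created from a blueprint first.
    record DeviceRequest(String name, boolean ephemeral) {}

    // Step 412: ephemeral requests are routed to predeployment service 416;
    // everything else goes straight to deployment service 414.
    static String route(DeviceRequest req) {
        return req.ephemeral() ? "predeploy:" + req.name()
                               : "deploy:" + req.name();
    }

    public static void main(String[] args) {
        for (DeviceRequest r : List.of(new DeviceRequest("vm-1", false),
                                       new DeviceRequest("vm-2", true))) {
            System.out.println(route(r));
        }
    }
}
```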



FIG. 5 illustrates a second embodiment of the RPA system of FIG. 1 which operates to provide a generalized runtime environment for digital workers. This flexible runtime environment advantageously permits extensibility of the platform to enable use of various languages in encoding bots. In the embodiment of FIG. 5, RPA system 10 operates in the manner described in connection with FIG. 1 and its accompanying figures, except that in the embodiment of FIG. 5, some or all of the user sessions 118 execute within a virtual machine 116. This permits the bots 104 to operate on an RPA system 10 that runs on an operating system different from an operating system on which a bot 104 may have been developed. For example, if a bot 104 is developed on the Windows® operating system, the platform agnostic embodiment of FIG. 5 permits bot 104 to be executed on a device 502 or 504 executing an operating system, 503/505 different than Windows®, such as for example, Linux. In one embodiment the VM 116 takes the form of a Java Virtual Machine (JVM) such as provided by the Oracle Corporation. As will be understood by those skilled in the art in view of the present disclosure, a JVM enables a computer to run Java® programs as well as programs written in other languages that are also compiled to Java® bytecode.


In the embodiment of FIG. 5, multiple devices 502 execute operating system 1, 503, which may for example be a Windows® operating system. Multiple devices 504 execute operating system 2, 505, which may for example be a Linux® operating system. For simplicity of explanation, two different operating systems are shown by way of example, and additional operating systems, such as macOS® or others, may also be employed on devices 502, 504 or other devices. Each device 502, 504 has installed therein one or more VMs 116, each of which executes its own operating system (not shown), which may be the same as or different than the host operating system 503/505. Each VM 116 has installed upon it, either in advance or on demand from control room 108, a node manager 114. Except as specifically noted herein, the embodiment of FIG. 5 operates as described above in connection with FIGS. 1, 2, 3A, 3B, 3C, 3D, 3E, 3F and 4 and reference is made to those figures and accompanying description for the detailed operation of control room 108, node manager 114, user sessions 118 and user session manager 124. The embodiment of FIG. 5 differs from that in FIG. 1 in that the devices 502 and 504 have installed thereon one or more VMs 116 as described above, with each VM 116 having installed thereon an operating system that may or may not be compatible with an operating system required by an automation task. Moreover, each VM has installed thereon a runtime environment 506, each of which has installed thereon one or more interpreters (shown as interpreter 1, interpreter 2, interpreter 3). Three interpreters are shown by way of example but any runtime environment 506 may at any given time have installed thereupon fewer than or more than three different interpreters. Each interpreter in runtime environment 506 is specifically encoded to interpret instructions encoded in a particular programming language.
For example, interpreter 1 may be encoded to interpret software programs encoded in the Java® programming language, seen as language 1 in Bot 1 and Bot 2. Interpreter 2 may be encoded to interpret software programs encoded in the Python® programming language, seen as language 2 in Bot 1 and Bot 2, and interpreter 3 may be encoded to interpret software programs encoded in the R programming language, seen as language 3 in Bot 1 and Bot 2.
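The per-language dispatch performed by runtime environment 506 can be sketched as a simple registry mapping a language name to its interpreter. The `Interpreter` interface, the language keys, and the registration method below are illustrative assumptions, not the platform's actual API:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: a runtime environment that dispatches a bot's source
// to the interpreter installed for that language, as described for runtime
// environment 506. All names here are assumptions made for illustration.
interface Interpreter {
    String run(String source);
}

public class RuntimeEnvironment {
    private final Map<String, Interpreter> interpreters = new HashMap<>();

    // Install an interpreter for a given language (e.g. "java", "python", "r").
    public void register(String language, Interpreter interpreter) {
        interpreters.put(language, interpreter);
    }

    // Select the interpreter matching the language a bot section is encoded in.
    public String execute(String language, String source) {
        Interpreter interpreter = interpreters.get(language);
        if (interpreter == null) {
            throw new IllegalArgumentException("No interpreter installed for " + language);
        }
        return interpreter.run(source);
    }

    public static void main(String[] args) {
        RuntimeEnvironment env = new RuntimeEnvironment();
        env.register("java", src -> "interpreted as Java: " + src);
        env.register("python", src -> "interpreted as Python: " + src);
        System.out.println(env.execute("python", "print('hi')"));
    }
}
```

A request for a language with no installed interpreter fails fast, mirroring how a runtime environment 506 can only execute bot sections in languages for which an interpreter is present.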


Turning to the bots Bot 1 and Bot 2, each bot may contain instructions encoded in one or more programming languages. In the example shown in FIG. 5, each bot contains instructions in three different programming languages, for example, Java®, Python® and R. This is for purposes of explanation and the embodiment of FIG. 5 may be able to create and execute bots encoded in more or fewer than three programming languages. The VMs 116 and the runtime environments 506 permit execution of bots encoded in multiple languages, thereby permitting greater flexibility in encoding bots. Moreover, the VMs 116 permit greater flexibility in bot execution. For example, a bot that is encoded with commands that are specific to an operating system (for example, opening a file) or that requires an application that runs on a particular operating system (for example, Excel® on Windows®) can be deployed with much greater flexibility. In such a situation, the control room 108 will select a device with a VM 116 that has the Windows® operating system and the Excel® application installed thereon. Licensing fees can also be reduced by serially using a particular device with the required licensed operating system and application(s), instead of having multiple devices with such an operating system and applications, which may be unused for long periods of time.


In one embodiment, seen in FIG. 6A, the VM 116 may be pre-created with all dependencies, such as application 1, application 2, and two files, file 1 and file 2, that a bot 104 may need. In another embodiment, seen in FIG. 6B, the bot 104 may have all dependencies clearly defined as metadata in the bot definition to enable access to and/or retrieval of required resources such as applications (application 1, application 2), files (file 1, file 2), and access information (e.g. login credentials) to various services. Deployment service 142 can use this metadata to set up the environment, which permits the bot 104 to be more compact in size. The dependencies define resources or information needed for a bot to execute. For example, the bot may need third-party libraries, or certain configuration settings that are encoded in a separate file and that need to be present at a known location for the bot to consume and execute successfully. In certain embodiments, to manage and authorize bot execution within the confines of the node managers 114, the system 10 needs the ability to disallow bot execution via any other means. In such embodiments, a ClassLoader, as employed in the Java® programming language, within the generated code (as a preamble) is used to ping the local agent to dynamically load a class to execute. If the bot is executed elsewhere, the call to the ClassLoader will fail, preventing the bot from executing. This prevents the generated byte code from being executed independently, external to the bot runner/player. Given that the bot is encoded in Java byte code, it is desirable to prevent any external Java® runtime virtual machine from directly executing the byte code.
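The execution-gating preamble described above might be sketched as follows. The local-agent port, the socket ping, and the command class name are hypothetical stand-ins for the platform's internal mechanism; the point is only that the generated code refuses to proceed to dynamic class loading unless the local agent answers:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hedged sketch of the preamble in the generated bot code: before any
// command class is dynamically loaded, the code verifies it is running under
// the node manager's bot runner by pinging a local agent. The port number
// and class name below are assumptions made purely for illustration.
public class BotPreamble {
    // Hypothetical local-agent port; not the platform's real endpoint.
    static final int AGENT_PORT = 8832;

    // Returns true only when the local agent answers, i.e. the bot is
    // executing under the bot runner rather than in a bare external JVM.
    static boolean agentReachable(int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress("127.0.0.1", port), 250);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        if (!agentReachable(AGENT_PORT)) {
            // Outside the bot runner, the dynamic load is refused, so the
            // generated byte code cannot run independently of the player.
            System.out.println("bot runner agent not found; execution refused");
            return;
        }
        // Inside the runner, the preamble dynamically loads the first command
        // class and hands control to it ("com.example.FirstCommand" is
        // a hypothetical name).
        Class.forName("com.example.FirstCommand").getMethod("execute").invoke(null);
    }
}
```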


The code in a bot 104 that is encoded in a language other than Java® may be converted by the control room 108 to Java®, or another language, in the manner shown in FIG. 7. For example, if a bot 104 is encoded with commands suitable for the Windows® operating system, the operations shown in FIG. 7 can be employed by the RPA system 10 to convert the bot to Java®, or another language, to enable the bot 104 to execute on an operating system other than Windows®. In FIG. 7, a test is performed at 704 to determine if a bot 104 selected for execution should be executed by a native execution engine, in other words, if the bot 104 can be executed without translation of its encoded instructions. In one embodiment, the control room automatically determines whether to use a native execution engine 706. In such an embodiment, if the control room 108 has the capability to execute the bot 104 natively, it employs the native execution capability. If the control room 108 does not have the capability to execute the bot 104 natively, then the instructions in the bot 104 may be converted in two different ways. One conversion technique is shown at 708, where an in-place replacement of native commands with Java® code snippets is performed. This involves a straightforward replacement of a native command for a first platform, e.g. Windows®, with a code snippet for a second platform, e.g. Java®. In some embodiments, the control room 108 may have the capability to perform translation by way of an alternative technique seen at 712, 714, 716, 718 and 720, which permits translation into a language other than Java® if needed. In such an embodiment, such a translation will be the default unless overridden by an administrator or user 102.
The instructions in the bot 104 are deconstructed at 712 and mapped at 714 to an abstract syntax tree; target code is then generated at 716 and 718, into Java® 710 or some other code 720. The abstract syntax tree is a machine-readable data structure for representing bot instructions in a language-neutral form. This allows bot creation to be independent, or agnostic, of the language in which the bot needs to be executed. In the event that new commands are added, the corresponding commands and the associated snippets can be obtained by the control room 108 on demand from a centralized repository that distributes new commands, such as, for example, from GitHub.com hosted by Automation Anywhere, Inc.
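The in-place replacement technique at 708 can be sketched as a lookup table that maps each native command to an equivalent Java snippet. The command names and snippet strings below are invented for illustration; a real control room would draw snippets from its command repository:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of the in-place replacement step (708): each native command
// is looked up in a snippet table and replaced by an equivalent Java code
// snippet. The table contents are hypothetical examples.
public class CommandTranslator {
    private static final Map<String, String> SNIPPETS = new LinkedHashMap<>();
    static {
        // Hypothetical native command -> Java snippet mappings.
        SNIPPETS.put("OpenFile", "new java.io.FileInputStream(path)");
        SNIPPETS.put("Delay", "Thread.sleep(millis)");
    }

    // Replace one native command with its Java snippet; an unknown command
    // would instead fall through to the AST-based path (712-720).
    public static String translate(String nativeCommand) {
        String snippet = SNIPPETS.get(nativeCommand);
        if (snippet == null) {
            throw new IllegalArgumentException("No Java snippet for " + nativeCommand);
        }
        return snippet;
    }
}
```

The AST-based alternative at 712-720 differs in that commands are first mapped to a language-neutral tree before code generation, so the same tree can emit Java or another target language.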



FIGS. 8 and 9 illustrate embodiments of the system 10 in which bots 104 are converted to and executed in a portable, operating system independent format such as shown in the embodiments of FIGS. 6A and 6B. In the embodiments of FIGS. 8 and 9, the bots 104 are converted to a Java ARchive (JAR) format for execution in a Java Runtime Environment by way of a Java Virtual Machine (JVM). As will be understood by those skilled in the art, a JAR is a file format used for aggregating many files into one, which permits Java applications and their requisite components (.class files, images and sounds) to be downloaded in a single HTTP transaction, instead of requiring a new connection for each piece. This improves the speed with which an application can be loaded into the JVM and begin functioning. The JAR format also supports compression, which reduces the size of the file and further improves download time. Moreover, individual entries in a JAR file may be digitally signed by the application author to authenticate their origin.
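The aggregation property described above can be shown with the standard `java.util.jar` API: several pieces are packed into one archive and travel as a single file. The entry names and placeholder contents below are illustrative:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

// Sketch of JAR aggregation: two (placeholder) component files are written
// into one archive, then read back to confirm both travel inside one file.
public class JarDemo {
    public static int packAndCount() throws Exception {
        File jar = File.createTempFile("bot", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            for (String name : new String[] {"Command1.class", "Command2.class"}) {
                out.putNextEntry(new JarEntry(name));
                out.write(name.getBytes("UTF-8")); // placeholder bytes, not real bytecode
                out.closeEntry();
            }
        }
        // Reading the archive back shows both entries in the single file.
        try (JarFile jf = new JarFile(jar)) {
            return (int) jf.stream().count();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(packAndCount()); // prints 2
    }
}
```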


Turning to FIG. 8, control room 108 automatically converts a preexisting bot 104 to a JAR format employing conventional techniques upon a bot deployment request, which may be triggered upon a user action or by a schedule provided by a user or administrator. In the embodiment of FIG. 8, each command 802 supported by the system 10 for use in a bot 104 is stored in JAR format. Upon receipt of a bot deployment request, the control room 108 retrieves the requested bot 104 from repository 102 and checks its format at 806. If the requested bot 104 is in a JAR format, then the requested bot 104 is provided at 808. If the requested bot 104 has not been converted to a JAR format, or if the requested bot has changed, then the requested bot 104 is converted at 810 to a JAR format by processing the bot 104 to replace each command in the bot 104 with an equivalent command stored in a JAR format. The requested bot 104 is then provided at 808. As seen, at any given time, the repository 102 may contain some bots 104 that have been converted to a JAR format (Bot 2-jar, Bot 3-jar) and other bots that have not yet been converted to a JAR format (Bot 1, Bot 4). In the embodiment of FIG. 8, newly created bots, such as by way of bot creator 320, are created and stored in a JAR format by use of commands 802 that exist in a JAR format. The node manager 114 inspects the bot jar and provides information regarding the bot jar to the bot launcher 144, which launches the bot 104 in a user session 118. The user session 118 provides isolation for execution of the bot 104, which executes with the credentials of the user, as if a human user were logged into the system 10 under their credentials and accessing system resources under those credentials. The bot 104 has the permissions of the user in performing its programmed tasks.
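The check-and-convert decision at 806/810 might be sketched as follows. The in-memory repository, the "-jar" naming convention, and the "jar:" prefix standing in for conversion are assumptions made purely for this sketch, not the actual storage format:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the deployment flow of FIG. 8: a requested bot is
// served as-is if already converted to JAR form (808); otherwise it is
// converted first (810) and the converted form is cached so later requests
// skip conversion. All naming conventions here are hypothetical.
public class BotDeployment {
    private final Map<String, String> repository = new HashMap<>();

    public void store(String name, String content) {
        repository.put(name, content);
    }

    public String deploy(String name) {
        String jarName = name + "-jar";
        if (repository.containsKey(jarName)) {
            return repository.get(jarName);  // already in JAR format (808)
        }
        // Convert (810): stand-in for replacing each command in the bot with
        // its equivalent command stored in JAR format.
        String converted = "jar:" + repository.get(name);
        repository.put(jarName, converted);  // cache the converted bot
        return converted;                    // then provide it (808)
    }
}
```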



FIG. 9 illustrates details of class loading in an embodiment employing a java virtual machine. In the embodiment of FIG. 9, execution of bots, and of the instructions within bots, which as described in connection with FIG. 8 are stored in a JAR format, is advantageously contained by separate java class loaders to increase security and isolation, thereby increasing predictability of execution of the bots. In the embodiment of FIG. 9, three types of class loaders are employed: platform class loader 902, bot class loader 904 and command class loader 906. The class loaders are arranged in a hierarchy with each bot class loader 904 and each command class loader 906 being a child of the platform class loader 902. Each bot 104 is assigned its own class loader and each command in each bot is assigned its own class loader. As will be appreciated by those skilled in the art, a Java ClassLoader is a part of the Java Runtime Environment that dynamically loads Java classes into the Java virtual machine. The bot launcher 144 advantageously creates a platform class loader 902 and creates an association between the platform class loader 902 and each bot class loader 904 and each command class loader 906. The platform class loader 902 advantageously spawns a separate bot class loader 904 for each bot 104 that is loaded and spawns a separate command class loader 906 for each command in each bot. As seen in FIG. 9, a bot class loader 904.1 is employed to load Bot 1-JAR and separate class loaders 904.x are employed for additional bots, Bot 2-JAR and so forth. In the embodiment of FIG. 9, the bot launcher 144 advantageously spawns a separate class loader 906 for each command in each bot. As in the embodiment of FIG. 8, each command 802 is stored in a JAR format. As seen in FIG. 9, a command class loader 906.1 is employed to load Command 1 and separate class loaders 906.x are employed for the additional commands in Bot 1-JAR. Similarly, for the other bots (e.g. Bot 2-JAR, . . . ) separate command class loaders 906 are used for each command in each of the bots. Commands are loaded via their own class loader 906 and then injected into the bot class loader 904 that owns the commands. Additionally, as seen in FIG. 9, a bot class loader 904.1 for a bot (Bot 1-JAR) that incorporates another (child) bot (Bot 4-JAR) has associated therewith a child class loader 904.1.1, which is spawned by the platform class loader 902.
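The hierarchy of FIG. 9 can be sketched with standard `URLClassLoader` parent/child wiring: one platform loader parents every bot loader and every command loader. The URL arrays are empty because this sketch shows only the wiring, not actual JAR loading:

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of the loader hierarchy in FIG. 9: a single platform class loader
// is the direct parent of each per-bot loader and each per-command loader,
// so common classes resolve through the platform loader while bots and
// commands remain isolated from one another.
public class LoaderHierarchy {
    public static void main(String[] args) {
        ClassLoader platform = new URLClassLoader(new URL[0],
                ClassLoader.getSystemClassLoader());
        // One loader per bot, each a direct child of the platform loader.
        ClassLoader bot1 = new URLClassLoader(new URL[0], platform);
        ClassLoader bot2 = new URLClassLoader(new URL[0], platform);
        // One loader per command, also a direct child of the platform loader;
        // loaded commands are then injected into the owning bot loader.
        ClassLoader command1 = new URLClassLoader(new URL[0], platform);

        System.out.println(bot1.getParent() == platform);      // true
        System.out.println(command1.getParent() == platform);  // true
        System.out.println(bot1 == bot2);                      // false: one per bot
    }
}
```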


The class loaders employ the following rules for delegation. The platform class loader 902 has a hardcoded list of which packages should be shared with the bot and command packages, from either the bot launcher 144 or the bot-runtime. The bot class loader 904 attempts to load all command and bot-related contracts from the bot JAR first, but delegates all other classes to the parent first. As will be appreciated by those skilled in the art, a contract employed by a bot is an agreement that a class will expose certain methods, certain properties, and certain behaviors. All commands associated with a bot are fed from a local map that is populated with command classes loaded by their own class loaders. All classes other than the bot-related classes check the parent first, and all JARs in the local class path of this loader are checked. The command class loader 906 delegates all classes to load from the parent first. If a requested class is in a package that must be shared, the request is passed on to the bot launcher 144 class loader, which may be the default class loader provided by Java to run the bot launcher. Otherwise, the request is passed directly to the bootstrap class loader, which is provided by the JVM, is typically part of the core JVM, and serves as the parent for all class loaders in the system. This means that no classes loaded by the bootstrap class loader, the launcher, or any transitive dependencies will be made available to the bot or command package unless explicitly added to the shared package list maintained by the platform class loader 902. Requests to load bot runtime classes are satisfied from a JAR file that must be supplied when the class loader is instantiated.
As will be appreciated by those skilled in the art, a bootstrap class loader is a portion of machine code that loads the system class loader upon startup of the JVM. The bootstrap class loader also takes care of loading all of the code needed to support the basic Java Runtime Environment (JRE), including classes in the java.util and the java.lang packages. Other than the bootstrap class loader, all class loaders in one embodiment are implemented as Java classes.
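The delegation rules above can be sketched as a custom class loader that sends shared packages to its parent first and treats everything else as local. The shared-package prefixes are a stand-in for the hardcoded list maintained by the platform class loader 902:

```java
import java.util.Set;

// Hedged sketch of the command class loader's delegation: classes in shared
// packages go to the parent chain; everything else would be served from the
// command's own JAR. This sketch has no local JAR, so non-shared requests
// fail, mirroring how unshared platform classes stay invisible to commands.
public class CommandClassLoader extends ClassLoader {
    // Illustrative stand-in for the platform loader's shared-package list.
    private static final Set<String> SHARED_PREFIXES =
            Set.of("java.", "com.example.shared.");

    public CommandClassLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        for (String prefix : SHARED_PREFIXES) {
            if (name.startsWith(prefix)) {
                // Shared package: delegate to the parent chain first.
                return super.loadClass(name, resolve);
            }
        }
        // Non-shared: a real command loader would look in its own JAR here.
        throw new ClassNotFoundException(name + " is not in a shared package");
    }
}
```

Loading `java.lang.String` through this loader succeeds via the parent chain, while a class outside the shared list is refused, which is the isolation property the delegation rules are designed to enforce.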


The hierarchical arrangement of class loaders shown in FIG. 9 provides a number of advantages. Centralization of certain common functions in the platform class loader 902 ensures that each bot class loader 904 and command class loader 906 inherits directly, without any intervening class loader, the common functions provided by the platform class loader. Thus, every action in the system 10 performed by a bot is tracked by way of the logging function provided by the platform class loader 902, and every action in the system 10 is performed in accordance with the security rules put in place by the platform class loader 902. Direct delegation of these properties by the platform class loader 902 to each command class loader 906 avoids inadvertent or deliberate intervention and override by a bot via the bot class loader 904. The delegation of these properties is enforced and cannot be overridden by a bot or by a command. Moreover, allocation of a separate class loader 906 for each command ensures that use of the same class name in two commands will result in loading of the desired class for each command. An individual developing a command 802 may include a dependency on a class that is employed by the bot executing the command 802, and an individual in another organization may develop a command 802 with a reference to a different class having the same class name. A Java class loader, when encountering a class name, checks whether that class name has already been loaded and, if it has not, loads the class corresponding to that name. When it encounters another class with a name that has already been loaded, it skips loading that class. Thus, if multiple commands are loaded by the same class loader, a first command that references a class foo will have that class foo loaded, but a subsequent command that references a class foo with a slightly different implementation will not get its own foo class loaded. When executing, the subsequent command will be provided with the foo class of the first command and may fail because the first foo class does not provide the expected behavior (i.e. functionality) of the foo class implemented by the subsequent command. The foregoing isolation permits use in the system 10 of bots and commands that are developed by numerous individuals/entities. Entity X can employ a bot developed by entity Y with commands developed by entities W, Y and Z. This increases the pace with which application automation can be achieved, while maintaining the security and application isolation that would occur with humans performing the tasks using credentials provided to them by their respective system administrators.
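The name-collision scenario above can be demonstrated directly: two classes both named `Foo`, compiled with different behavior and loaded by separate class loaders, coexist without conflict, whereas a single loader would only ever see the first. This demo requires a JDK (it compiles at runtime via `javax.tools`) and is a generic JVM illustration, not the platform's own code:

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

// Demonstration of the collision solved by per-command loaders: each version
// of Foo is compiled into its own directory and loaded by its own loader,
// so both implementations are available simultaneously.
public class IsolationDemo {
    static ClassLoader compile(String answer) throws Exception {
        Path dir = Files.createTempDirectory("foo");
        Path src = dir.resolve("Foo.java");
        Files.write(src, ("public class Foo { public static String answer() "
                + "{ return \"" + answer + "\"; } }").getBytes("UTF-8"));
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        javac.run(null, null, null, src.toString());
        // A null parent keeps each loader from seeing the other's Foo.
        return new URLClassLoader(new URL[] { dir.toUri().toURL() }, null);
    }

    public static void main(String[] args) throws Exception {
        ClassLoader a = compile("v1");
        ClassLoader b = compile("v2");
        Class<?> fooA = Class.forName("Foo", true, a);
        Class<?> fooB = Class.forName("Foo", true, b);
        System.out.println(fooA == fooB); // false: separate loaders, separate classes
        System.out.println(fooA.getMethod("answer").invoke(null)); // v1
        System.out.println(fooB.getMethod("answer").invoke(null)); // v2
    }
}
```

Had both versions been loaded by one class loader, the second load of `Foo` would have been skipped and both commands would have received the first implementation.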


The common functions implemented by the platform class loader 902 include the bot and command contract, security functions, logging, the Java Native Interface (JNI), Java Native Access (JNA), JavaFX, metrics, and the message interface. A contract in a Java class is an agreement that the class will expose certain methods, certain properties, and certain behaviors. The Java Native Interface (JNI) is a foreign function interface programming framework that enables Java code running in a JVM to call, and be called by, native applications (those programs specific to a hardware and operating system platform) and libraries written in other languages such as C, C++ and assembly. Java Native Access (JNA) is a community-developed library that provides Java programs easy access to native shared libraries without using the JNI. JNA's design aims to provide native access in a natural way with a minimum of effort; no boilerplate or generated glue code is required. JavaFX is a set of libraries provided by Java to render user interface components. Metrics provide information on performance, such as how fast a command is processed and how often a command is used. The message interface provides a common messaging interface that is independent of any particular programming language.


The embodiments herein can be implemented in the general context of computer-executable instructions, such as those included in program modules, being executed in a computing system on a target real or virtual processor. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The program modules may be obtained from another computer system, such as via the Internet, by downloading the program modules from the other computer system for execution on one or more different computer systems. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Computer-executable instructions for program modules may be executed within a local or distributed computing system. The computer-executable instructions, which may include data, instructions, and configuration parameters, may be provided via an article of manufacture including a computer readable medium, which provides content that represents instructions that can be executed. A computer readable medium may also include a storage or database from which content can be downloaded. A computer readable medium may also include a device or product having content stored thereon at a time of sale or delivery. Thus, delivering a device with stored content, or offering content for download over a communication medium may be understood as providing an article of manufacture with such content described herein.



FIG. 10 illustrates a block diagram of hardware that may be employed in an implementation of the RPA system as disclosed herein. FIG. 10 depicts a generalized example of a suitable general-purpose computing system 1000 in which the described innovations may be implemented in order to improve the processing speed and efficiency with which the computing system 1000 operates to perform the functions disclosed herein. With reference to FIG. 10 the computing system 1000 includes one or more processing units 1002, 1004 and memory 1006, 1008. The processing units 1002, 1004 execute computer-executable instructions. A processing unit can be a general-purpose central processing unit (CPU), a processor in an application-specific integrated circuit (ASIC) or any other type of processor. The tangible memory 1006, 1008 may be volatile memory (e.g., registers, cache, RAM), non-volatile memory (e.g., ROM, EEPROM, flash memory, etc.), or some combination of the two, accessible by the processing unit(s). The hardware components in FIG. 10 may be standard hardware components, or alternatively, some embodiments may employ specialized hardware components to further increase the operating efficiency and speed with which the system 10 operates. The various components of computing system 1000 may be rearranged in various embodiments, and some embodiments may not include all of the above components, while other embodiments may include additional components, such as specialized processors and additional memory.


Computing system 1000 may have additional features such as for example, storage 1010, one or more input devices 1014, one or more output devices 1012, and one or more communication connections 1016. An interconnection mechanism (not shown) such as a bus, controller, or network interconnects the components of the computing system 1000. Typically, operating system software (not shown) provides an operating system for other software executing in the computing system 1000, and coordinates activities of the components of the computing system 1000.


The tangible storage 1010 may be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way, and which can be accessed within the computing system 1000. The storage 1010 stores instructions for the software implementing one or more innovations described herein.


The input device(s) 1014 may be a touch input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, or another device that provides input to the computing system 1000. For video encoding, the input device(s) 1014 may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing system 1000. The output device(s) 1012 may be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1000.


The communication connection(s) 1016 enable communication over a communication medium to another computing entity. The communication medium conveys information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.


The terms “system” and “computing device” are used interchangeably herein. Unless the context clearly indicates otherwise, neither term implies any limitation on a type of computing system or computing device. In general, a computing system or computing device can be local or distributed and can include any combination of special-purpose hardware and/or general-purpose hardware with software implementing the functionality described herein.


While the invention has been described in connection with a preferred embodiment, it is not intended to limit the scope of the invention to the particular form set forth, but on the contrary, it is intended to cover such alternatives, modifications, and equivalents as may be within the spirit and scope of the invention as defined by the appended claims.

Claims
  • 1. A robotic process automation system comprising: data storage for storing a set of task processing instructions operable to interact at a user level with one or more designated user level application programs; anda processor, of a user computing device, configured to execute instructions to process at least one work item, by: accessing the set of task processing instructions that are to be carried out;loading, at the user computing device using a first class loader, the set of task processing instructions to be carried out by the user computing device, the set of task processing instructions identifying an instruction package referenced by or used by the set of task processing instructions, the instruction package including one or more package processing instructions;loading the instruction package referenced by or used in the first set of task processing instructions with a second class loader; andcausing execution, at the user computing device, instructions that have been loaded by the first class loader or the second class loader,wherein the processor is configured to execute instructions for initiating on the user computing device a user session for managing execution of an automation task, andwherein the causing execution of the instructions loaded by the first class loader and the second class loader is performed under control of the user session on the user computing device.
  • 2. The robotic process automation system of claim 1, wherein the processor is configured to perform the act of: loading, with a platform class loader, shared instruction modules.
  • 3. The robotic process automation system of claim 2, wherein the platform class loader spawns the first class loader and the second class loader.
  • 4. The robotic process automation system of claim 3, wherein the first class loader is a bot class loader, and wherein the second class loader is a command class loader.
  • 5. The robotic process automation system of claim 2, wherein the first class loader and the second class loader are JAVA class loaders.
  • 6. The robotic process automation system of claim 1, wherein, during the execution, the instructions loaded by the first class loader are sandboxed from the instructions loaded by the second class loader.
  • 7. The robotic process automation system of claim 6, wherein the first class loader is a bot class loader, and wherein the second class loader is a command class loader.
  • 8. The robotic process automation system of claim 1, wherein the first class loader is a bot class loader, and wherein the second class loader is a command class loader.
  • 9. The robotic process automation system of claim 1, wherein at least one of the first class loader and the second class loader is a JAVA class loader.
  • 10. A robotic process automation system comprising: data storage for storing a plurality of sets of task processing instructions, each set of task processing instructions operable to interact at a user level with one or more designated user level application programs; anda processor configured to execute instructions that when executed cause the processor to perform an automation task to process at least one work item, by performing at least the acts of: initiating on a user computing device a user session for managing execution of the automation task;accessing the set of task processing instructions that are to be carried out, the set of task processing instructions to be carried out by the user computing device, the set of task processing instructions identifying a plurality of instruction packages referenced or used by the set of task processing instructions, each of the instruction packages including one or more processing instructions, the instruction packages including at least a first instruction package and a second instruction package;loading, for the user session, the first instruction package referenced or used in the first set of task processing instructions using a first package class loader;loading, for the user session, the second instruction package referenced or used in the first set of task processing instructions using a second package class loader; andcausing execution, under control of the user session on the user computing device, instructions that have been loaded by the first package class loader and the second package class loader, wherein, during the execution, the instructions loaded by the first package class loader are sandboxed from the instructions loaded by the second package class loader.
  • 11. The robotic process automation system of claim 10, wherein the processor is configured to perform the act of: loading, at the user computing device using a first class loader, the set of task processing instructions to be carried out by the user computing device.
  • 12. The robotic process automation system of claim 11, wherein the processor is configured to perform the act of: loading, for the user session, with a platform class loader, shared instruction modules.
  • 13. The robotic process automation system of claim 12, wherein the platform class loader spawns the first package class loader and the second package class loader.
  • 14. The robotic process automation system of claim 12, wherein the platform class loader spawns the first class loader, and wherein the first class loader is a bot class loader.
  • 15. The robotic process automation system of claim 14, wherein the platform class loader spawns the first package class loader and the second package class loader.
  • 16. A non-transitory computer readable storage medium including stored thereupon one or more program modules comprising computer-executable instructions for execution on a computer system, the computer-executable instructions causing the computer system to implement a robotic process automation system that employs a set of task processing instructions operable to interact at a user level with one or more designated user level application programs, the computer readable medium comprising: instructions for accessing the set of task processing instructions that are to be carried out, the set of task processing instructions to be carried out by the user computing device, the set of task processing instructions identifying a plurality of instruction packages referenced or used by the set of task processing instructions, each of the instruction packages including one or more processing instructions, the instruction packages including at least a first instruction package and a second instruction package;instructions for loading the first instruction package referenced or used in the first set of task processing instructions using a first package code loader;instructions for loading the second instruction package referenced or used in the first set of task processing instructions using a second package code loader; andinstructions for causing execution instructions that have been loaded by the first package code loader and the second package code loader, wherein, during the execution, the instructions loaded by the first package code loader are sandboxed from the instructions loaded by the second package code loader.
  • 17. A non-transitory computer readable storage medium as recited in claim 16, wherein the computer system is a user computing device.
  • 18. A non-transitory computer readable storage medium as recited in claim 17, wherein the computer readable medium comprises: instructions for initiating on the user computing device a user session for managing execution of an automation task.
  • 19. A non-transitory computer readable storage medium as recited in claim 18, wherein the loading of the first instruction package using the first package code loader is for the user session, wherein the loading of the second instruction package using the second package code loader is for the user session, and wherein the causing execution of the instructions loaded by the first package code loader and the second package code loader is performed under control of the user session on the user computing device.
  • 20. A non-transitory computer readable storage medium as recited in claim 16, wherein the first package code loader and the second package code loader are class loaders.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application entitled ROBOTIC PROCESS AUTOMATION SYSTEM WITH SEPARATE PLATFORM, BOT AND COMMAND CLASS LOADERS, application Ser. No. 16/731,044, filed on Dec. 31, 2019, which is hereby incorporated by reference in its entirety. The prior application Ser. No. 16/731,044 is a continuation-in-part of U.S. patent application entitled PLATFORM AGNOSTIC ROBOTIC PROCESS AUTOMATION, application Ser. No. 16/398,600, filed on Apr. 30, 2019, and U.S. patent application entitled ZERO FOOTPRINT ROBOTIC PROCESS AUTOMATION SYSTEM, application Ser. No. 16/398,532, filed on Apr. 30, 2019. Each of the aforementioned applications is hereby incorporated by reference in its entirety.

US Referenced Citations (258)
Number Name Date Kind
5389592 Weissman Feb 1995 A
5427234 Upchurch Jun 1995 A
5473794 Kobayashi Dec 1995 A
5496979 Behr Mar 1996 A
5640244 Hellstrom Jun 1997 A
5931544 Dietrich Aug 1999 A
5949999 Song et al. Sep 1999 A
5983001 Boughner et al. Nov 1999 A
6133917 Feigner et al. Oct 2000 A
6226407 Zabih et al. May 2001 B1
6389592 Ayres et al. May 2002 B1
6427234 Chambers et al. May 2002 B1
6473794 Guheen et al. Oct 2002 B1
6496979 Chen et al. Dec 2002 B1
6640244 Bowman-Amuah Oct 2003 B1
6704873 Underwood Mar 2004 B1
6898764 Kemp May 2005 B2
6954747 Wang et al. Oct 2005 B1
6957186 Guheen et al. Oct 2005 B1
7091898 Arling et al. Aug 2006 B2
7246128 Jordahl Jul 2007 B2
7398469 Kisamore et al. Jul 2008 B2
7441007 Kirkpatrick et al. Oct 2008 B1
7533096 Rice et al. May 2009 B2
7568109 Powell et al. Jul 2009 B2
7571427 Wang et al. Aug 2009 B2
7765525 Davidson et al. Jul 2010 B1
7783135 Gokturk Aug 2010 B2
7805317 Khan et al. Sep 2010 B2
7805710 North Sep 2010 B2
7810070 Nasuti et al. Oct 2010 B2
7846023 Evans et al. Dec 2010 B2
8028269 Bhatia et al. Sep 2011 B2
8056092 Allen et al. Nov 2011 B2
8095910 Nathan et al. Jan 2012 B2
8132156 Malcolm Mar 2012 B2
8209738 Nicol et al. Jun 2012 B2
8234622 Meijer et al. Jul 2012 B2
8245215 Extra Aug 2012 B2
8352464 Fotev Jan 2013 B2
8365147 Grechanik Jan 2013 B2
8396890 Lim Mar 2013 B2
8438558 Adams May 2013 B1
8443291 Ku et al. May 2013 B2
8464240 Fritsch et al. Jun 2013 B2
8498473 Chong et al. Jul 2013 B2
8504803 Shukla Aug 2013 B2
8631458 Banerjee Jan 2014 B1
8682083 Kumar et al. Mar 2014 B2
8713003 Fotev Apr 2014 B2
8724907 Sampson et al. May 2014 B1
8769482 Batey et al. Jul 2014 B2
8819241 Washburn Aug 2014 B1
8832048 Lim Sep 2014 B2
8874685 Hollis et al. Oct 2014 B1
8943493 Schneider Jan 2015 B2
8965905 Ashmore et al. Feb 2015 B2
8966458 Asai Feb 2015 B2
9032314 Mital et al. May 2015 B2
9104294 Forstall et al. Aug 2015 B2
9171359 Lund Oct 2015 B1
9213625 Schrage Dec 2015 B1
9251413 Meler Feb 2016 B2
9278284 Ruppert et al. Mar 2016 B2
9444844 Edery et al. Sep 2016 B2
9462042 Shukla et al. Oct 2016 B2
9571332 Subramaniam et al. Feb 2017 B2
9600519 Schoning et al. Mar 2017 B2
9621584 Schmidt et al. Apr 2017 B1
9934129 Budurean Apr 2018 B1
9946233 Brun et al. Apr 2018 B2
9965139 Nychis May 2018 B2
9990347 Raskovic et al. Jun 2018 B2
10015503 Ahammad Jul 2018 B1
10043255 Pathapati et al. Aug 2018 B1
10282280 Gouskova May 2019 B1
10489682 Kumar et al. Nov 2019 B1
10592738 Northrup Mar 2020 B2
10654166 Hall May 2020 B1
10706218 Milward et al. Jul 2020 B2
10706228 Buisson Jul 2020 B2
10713068 Zohar Jul 2020 B1
10936807 Walters Mar 2021 B1
11099972 Puszkiewicz Aug 2021 B2
11176443 Selva Nov 2021 B1
11182178 Singh et al. Nov 2021 B1
11182604 Methaniya Nov 2021 B1
11243803 Anand et al. Feb 2022 B2
11263391 Potts Mar 2022 B2
11348353 Sundell et al. May 2022 B2
11614731 Anand et al. Mar 2023 B2
11775321 Singh et al. Oct 2023 B2
11775339 Anand et al. Oct 2023 B2
11775814 Anand et al. Oct 2023 B1
11782734 Ginoya et al. Oct 2023 B2
20020029232 Bobrow et al. Mar 2002 A1
20020057678 Jiang May 2002 A1
20030033590 Leherbauer Feb 2003 A1
20030088604 Kuck et al. May 2003 A1
20030101245 Srinivasan et al. May 2003 A1
20030110382 Leporini Jun 2003 A1
20030114959 Sakamoto Jun 2003 A1
20030159089 DiJoseph Aug 2003 A1
20040019897 Taylor Jan 2004 A1
20040034448 Siegers Feb 2004 A1
20040083472 Rao et al. Apr 2004 A1
20040153649 Rhoads Aug 2004 A1
20040172526 Tann et al. Sep 2004 A1
20040210885 Wang et al. Oct 2004 A1
20040243994 Nasu Dec 2004 A1
20050021713 Dugan et al. Jan 2005 A1
20050188357 Derks et al. Aug 2005 A1
20050204343 Kisamore et al. Sep 2005 A1
20050257214 Moshir et al. Nov 2005 A1
20060074994 Smits Apr 2006 A1
20060095276 Axelrod et al. May 2006 A1
20060150188 Roman et al. Jul 2006 A1
20060161896 Hicks Jul 2006 A1
20060218110 Simske et al. Sep 2006 A1
20060282509 Kilian Dec 2006 A1
20070030528 Quaeler et al. Feb 2007 A1
20070089101 Romanovskiy Apr 2007 A1
20070101291 Forstall et al. May 2007 A1
20070112574 Greene May 2007 A1
20070156677 Szabo Jul 2007 A1
20070169025 Moore et al. Jul 2007 A1
20070169110 Gupta et al. Jul 2007 A1
20070233741 Shen Oct 2007 A1
20070261124 Centonze Nov 2007 A1
20080005086 Moore Jan 2008 A1
20080027769 Eder Jan 2008 A1
20080028392 Chen et al. Jan 2008 A1
20080133052 Jones Jun 2008 A1
20080209392 Able et al. Aug 2008 A1
20080222454 Kelso Sep 2008 A1
20080263024 Landschaft et al. Oct 2008 A1
20080310625 Vanstone et al. Dec 2008 A1
20090037509 Parekh et al. Feb 2009 A1
20090103769 Milov et al. Apr 2009 A1
20090116071 Mantell May 2009 A1
20090172814 Khosravi et al. Jul 2009 A1
20090199160 Vaitheeswaran et al. Aug 2009 A1
20090217309 Grechanik et al. Aug 2009 A1
20090222798 Iguchi et al. Sep 2009 A1
20090249297 Doshi et al. Oct 2009 A1
20090313229 Fellenstein et al. Dec 2009 A1
20090320002 Peri-Glass et al. Dec 2009 A1
20100023602 Marlone Jan 2010 A1
20100023933 Bryant et al. Jan 2010 A1
20100100605 Allen et al. Apr 2010 A1
20100106671 Li et al. Apr 2010 A1
20100138015 Colombo et al. Jun 2010 A1
20100161399 Posner Jun 2010 A1
20100235433 Ansari et al. Sep 2010 A1
20100251163 Keable Sep 2010 A1
20110022578 Fotev Jan 2011 A1
20110106284 Catoen May 2011 A1
20110138363 Schmelter Jun 2011 A1
20110145807 Molinie et al. Jun 2011 A1
20110173239 Sayed et al. Jul 2011 A1
20110197121 Kletter Aug 2011 A1
20110258550 Dinh-Trong Oct 2011 A1
20110267490 Goktekin Nov 2011 A1
20110276568 Fotev Nov 2011 A1
20110276946 Pletter Nov 2011 A1
20110302570 Kurimilla et al. Dec 2011 A1
20120011458 Xia et al. Jan 2012 A1
20120042281 Green Feb 2012 A1
20120124062 Macbeth et al. May 2012 A1
20120131456 Lin et al. May 2012 A1
20120143941 Kim Jun 2012 A1
20120265976 Spiers Oct 2012 A1
20120266149 Lebert Oct 2012 A1
20120324333 Lehavi Dec 2012 A1
20120330940 Caire et al. Dec 2012 A1
20130145006 Tammam Jun 2013 A1
20130173648 Tan et al. Jul 2013 A1
20130227535 Kannan Aug 2013 A1
20130236111 Pintsov Sep 2013 A1
20130290318 Shapira et al. Oct 2013 A1
20130332511 Hala Dec 2013 A1
20130332524 Fiala Dec 2013 A1
20140036290 Miyagawa Feb 2014 A1
20140045484 Kim et al. Feb 2014 A1
20140046645 White Feb 2014 A1
20140075371 Carmi Mar 2014 A1
20140181705 Hey et al. Jun 2014 A1
20140189576 Carmi Jul 2014 A1
20140379666 Bryon Dec 2014 A1
20150082280 Betak et al. Mar 2015 A1
20150088982 Johnson Mar 2015 A1
20150113528 Kim et al. Apr 2015 A1
20150310268 He Oct 2015 A1
20150347284 Hey et al. Dec 2015 A1
20150350048 Sampat Dec 2015 A1
20150363224 Argenti et al. Dec 2015 A1
20150365349 Verma Dec 2015 A1
20160019049 Kakhandiki et al. Jan 2016 A1
20160034441 Nguyen et al. Feb 2016 A1
20160055376 Koduru Feb 2016 A1
20160063269 Liden Mar 2016 A1
20160078368 Kakhandiki et al. Mar 2016 A1
20160259654 Nychis et al. Sep 2016 A1
20160379010 Farkash et al. Dec 2016 A1
20170048170 Smullen Feb 2017 A1
20170270431 Hosabettu Sep 2017 A1
20180113781 Kim Apr 2018 A1
20180210824 Kochura Jul 2018 A1
20180218429 Guo et al. Aug 2018 A1
20180275835 Prag Sep 2018 A1
20180276462 Davis Sep 2018 A1
20180311815 Shaw et al. Nov 2018 A1
20180321955 Liu Nov 2018 A1
20180349730 Dixon Dec 2018 A1
20180370029 Hall Dec 2018 A1
20190005050 Proux Jan 2019 A1
20190026215 Agarwal Jan 2019 A1
20190028587 Unitt Jan 2019 A1
20190034041 Nychis Jan 2019 A1
20190042286 Bailey Feb 2019 A1
20190095440 Chakra Mar 2019 A1
20190114370 Cerino Apr 2019 A1
20190126463 Purushothaman May 2019 A1
20190141596 Gay May 2019 A1
20190188462 Nishida Jun 2019 A1
20190213822 Jain Jul 2019 A1
20190250891 Kumar Aug 2019 A1
20190266692 Stach et al. Aug 2019 A1
20190303779 Briggle et al. Oct 2019 A1
20190317803 Maheshwari Oct 2019 A1
20190324781 Ramamurthy Oct 2019 A1
20190340240 Duta Nov 2019 A1
20190377987 Price et al. Dec 2019 A1
20200019767 Porter et al. Jan 2020 A1
20200034976 Stone et al. Jan 2020 A1
20200057946 Singaraju Feb 2020 A1
20200059441 Viet Feb 2020 A1
20200097742 Kumar et al. Mar 2020 A1
20200104350 Allen Apr 2020 A1
20200147791 Safary May 2020 A1
20200151444 Price et al. May 2020 A1
20200151591 Li May 2020 A1
20200159647 Puszkiewicz May 2020 A1
20200159648 Ghare May 2020 A1
20200249964 Fernandes Aug 2020 A1
20200285353 Rezazadeh Sereshkeh Sep 2020 A1
20200311210 Nama Oct 2020 A1
20200334249 Canim Oct 2020 A1
20210049128 Kernick Feb 2021 A1
20210107140 Singh Apr 2021 A1
20210141497 Magureanu May 2021 A1
20210216334 Barrett Jul 2021 A1
20210279166 Peng Sep 2021 A1
20220245936 Valk Aug 2022 A1
20220405094 Farquhar Dec 2022 A1
20230052190 Goyal et al. Feb 2023 A1
20230053260 Goyal et al. Feb 2023 A1
Foreign Referenced Citations (3)
Number Date Country
2016163901 Oct 2016 WO
2019092672 May 2019 WO
2022076488 Apr 2022 WO
Non-Patent Literature Citations (64)
Entry
U.S. Appl. No. 16/925,956, filed Jul. 10, 2020, Dabhi.
International Search Report for PCT/US2020/030496.
International Search Report for PCT/US2020/030506.
Robert Nystrom, Game Programming Patterns, 2009, gameprogrammingpatterns.com/bytecode.html, pp. 1-26 (Year: 2009).
Written Opinion of the International Searching Authority for PCT/US2020/030496.
Written Opinion of the International Searching Authority for PCT/US2020/030506.
Al Sallami, Load Balancing in Green Cloud Computation, Proceedings of the World Congress on Engineering 2013 vol. II, WCE 2013, 2013, pp. 1-5 (Year: 2013).
B.P. Kasper “Remote: A Means of Remotely Controlling and Storing Data from a HAL Quadrupole Gas Analyzer Using an IBM-PC Compatible Computer”, Nov. 15, 1995, Space and Environment Technology Center.
Bergen et al., RPC automation: making legacy code relevant, May 2013, 6 pages.
Konstantinou et al., An architecture for virtual solution composition and deployment in infrastructure clouds, 9 pages (Year: 2009).
Nyulas et al., An Ontology-Driven Framework for Deploying JADE Agent Systems, 5 pages (Year: 2008).
Tom Yeh, Tsung-Hsiang Chang, and Robert C. Miller, Sikuli: Using GUI Screenshots for Search and Automation, Oct. 4-7, 2009, 10 pages.
Yu et al., Deploying and managing Web services: issues, solutions, and directions, 36 pages (Year: 2008).
Hu et al., Automating GUI testing for Android applications, May 2011, 7 pages.
Zhifang et al., Test automation on mobile device, May 2010, 7 pages.
Non-Final Office Action for U.S. Appl. No. 17/230,492, dated Oct. 14, 2022.
Notice of Allowance for U.S. Appl. No. 16/398,532, dated Oct. 23, 2022.
Non-Final Office Action for U.S. Appl. No. 16/876,530, dated Sep. 29, 2020.
Final Office Action for U.S. Appl. No. 16/876,530, dated Apr. 13, 2021.
Notice of Allowance for U.S. Appl. No. 16/876,530, dated Jul. 22, 2021.
Dai, Jifeng et al., "R-fcn: Object detection via region-based fully convolutional networks", Advances in neural information processing systems 29 (2016). (Year: 2016).
Ren, Shaoqing et al., "Faster r-cnn: Towards real-time object detection with region proposal networks." Advances in neural information processing systems 28 (2015). (Year: 2015).
International Search Report for PCT/US2021/053669, dated May 11, 2022.
Embley et al., “Table-processing paradigms: a research survey”, International Journal on Document Analysis and Recognition, vol. 8, No. 2-3, May 9, 2006, pp. 66-86.
Non-Final Office Action for U.S. Appl. No. 16/925,956, dated Sep. 16, 2021.
Notice of Allowance for U.S. Appl. No. 16/925,956, dated Jan. 7, 2022.
Pre-Interview Office Action for U.S. Appl. No. 16/398,532, dated Jul. 8, 2022.
Notice of Allowance for U.S. Appl. No. 16/398,532, dated Jul. 8, 2022.
Non-Final Office Action for U.S. Appl. No. 17/139,838, dated Feb. 22, 2022.
Final Office Action for U.S. Appl. No. 17/139,838, dated Nov. 15, 2023.
Notice of Allowance for U.S. Appl. No. 17/139,838, dated Apr. 5, 2023.
International Search Report and Written Opinion for PCT/US2021/015691, dated May 11, 2021.
A density-based algorithm for discovering clusters in large spatial databases with noise, Ester, Martin; Kriegel, Hans-Peter; Sander, Jorg; Xu, Xiaowei; Simoudis, Evangelos; Han, Jiawei; Fayyad, Usama M., eds., Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD-96). AAAI Press. pp. 226-231 (1996).
Deep Residual Learning for Image Recognition, by K. He, X. Zhang, S. Ren, and J. Sun, arXiv:1512.03385 (2015).
FaceNet: A Unified Embedding for Face Recognition and Clustering, by F. Schroff, D. Kalenichenko, J. Philbin, arXiv:1503.03832 (2015).
Muhammad et al. “Fuzzy multilevel graph embedding”, copyright 2012 Elsevier Ltd.
Sharma et al. Determining similarity in histological images using graph-theoretic description and matching methods for content-based image retrieval in medical diagnostics, Biomed Center, copyright 2012.
First Action Interview Pilot Program Pre-Interview communication for U.S. Appl. No. 16/779,462, dated Dec. 3, 2021.
Reply under 37 CFR 1.111 to Pre-Interview Communication for U.S. Appl. No. 16/779,462, filed Jan. 25, 2022.
Notice of Allowance for U.S. Appl. No. 16/779,462 dated Feb. 9, 2022.
Notice of Allowance for U.S. Appl. No. 17/131,674, dated Jun. 22, 2023.
Non-Final Office Action for U.S. Appl. No. 16/731,044, dated Jan. 25, 2021.
Notice of Allowance for U.S. Appl. No. 16/731,044, dated May 5, 2021.
Final Office Action for U.S. Appl. No. 16/930,247 dated Oct. 12, 2023.
Notice of Allowance for U.S. Appl. No. 17/534,443 dated Oct. 24, 2023.
International Search Report and Written Opinion for PCT/US2022/013026, dated Sep. 21, 2022.
Non-Final Office Action for U.S. Appl. No. 18/126,935, dated Jul. 13, 2023.
Non-Final Office Action for U.S. Appl. No. 17/139,842, dated Jul. 18, 2023.
Notice of Allowance for U.S. Appl. No. 17/588,588, dated Aug. 2, 2023.
Pre-Interview Office Action for U.S. Appl. No. 16/859,488, dated Jan. 25, 2021.
First Action Interview for U.S. Appl. No. 16/859,488, dated Mar. 22, 2021.
Final Office Action for U.S. Appl. No. 16/859,488, dated Jul. 8, 2021.
Notice of Allowance for U.S. Appl. No. 16/859,488, dated Mar. 30, 2022.
Final Office Action for U.S. Appl. No. 17/463,494, dated Sep. 6, 2023.
Final Office Action for U.S. Appl. No. 17/160,080, dated Sep. 11, 2023.
Final Office Action for U.S. Appl. No. 17/534,443, dated Sep. 11, 2023.
Related Publications (1)
Number Date Country
20210389971 A1 Dec 2021 US
Continuations (1)
Number Date Country
Parent 16731044 Dec 2019 US
Child 17463494 US
Continuation in Parts (2)
Number Date Country
Parent 16398600 Apr 2019 US
Child 16731044 US
Parent 16398532 Apr 2019 US
Child 16398600 US