This disclosure relates generally to the field of data processing systems and more particularly to computerized task automation.
Organizations are increasingly seeking to automate repetitive business processes. One effective approach is by way of Robotic Process Automation (RPA), which is the application of technology that allows workers in an organization to configure a computer software program (also referred to as a “robot” or “bot”) to capture and interpret existing applications for processing a transaction, manipulating data, triggering responses, and communicating with other digital systems. Conventional RPA systems employ software robots to interpret the user interface of third-party applications and to execute steps identically to a human user. While this has proven very useful in automating repetitive business tasks, identifying tasks for automation and then generating the software robot(s) to automate those tasks can be an involved process, often requiring individuals with expertise in a variety of fields, including RPA. That effort can be justified when automating significant processes within an organization. It is often difficult, though, to justify such resources when identifying and automating processes that are less important and/or less well known within an organization.
A suggested solution (see, e.g., https://www.mimica.ai and https://www.fortressiq.com) is to record the activities of a user on their computer over a period of time. The recorded data can then be processed to generate a type of process map that may be used to guide and help develop automated tasks to replace the manual operations performed by the user whose actions have been recorded. While this can simplify the identification and automation of computer-implemented tasks, there remains a need for improved computerized techniques to identify and automate computer-implemented tasks that are currently performed manually or are only partially automated.
The benefits of RPA may be readily extended to many manually performed tasks by way of the disclosed embodiments. A robotic process automation system is disclosed herein that includes a plurality of automated software robots, each of which, when deployed, performs a sequence of actions with respect to one or more computer applications on a designated computer system with an identity and credentials of a designated human user to process one or more work items. The system simplifies bot identification and creation by recording, over a period of time, inputs of a computing device user to generate a log of inputs by the user in connection with one or more task applications, where the task applications comprise one or more applications with which the user interacts to perform a first task. The system stores the inputs of the user along with information pertaining to the one or more task applications. The log is processed to identify the one or more task applications and to generate a user action file. The log is further processed to identify fields in the task applications with which the user entered inputs, and the identified fields are stored to the user action file. The user action file is processed to identify one or more actions performed by the user with respect to the identified task applications. The one or more actions performed by the user with respect to the identified task applications are compared to actions performed by a set of automated software robots, and if a match is identified, a user, such as an administrator, is notified of the match. If a match is not identified, a new automated software robot is generated and is encoded with instructions to perform automatically, when invoked, the actions performed by the user with respect to the identified task applications.
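By way of illustration only, the following minimal Python sketch outlines the comparison-and-generation flow summarized above. The action records, bot names, and helper functions shown are hypothetical placeholders and do not describe the actual implementation of the disclosed system.

```python
# Minimal, self-contained sketch of the disclosed flow: compare a recorded
# user's action sequence against the action sequences of existing bots and
# either report the matching bot or "generate" a new one.

# A recorded action: (application, field, action type).
recorded_actions = [
    ("AP Portal", "vendor_name", "enter"),
    ("AP Portal", "vendor_tax_id", "enter"),
    ("AP Portal", "save", "click"),
]

# Actions already automated by deployed bots, keyed by (hypothetical) bot name.
existing_bots = {
    "InvoiceEntryBot": [
        ("AP Portal", "invoice_number", "enter"),
        ("AP Portal", "amount", "enter"),
        ("AP Portal", "submit", "click"),
    ],
}

def match_existing_bot(actions, bots):
    """Return the name of a bot whose action sequence matches, else None."""
    for name, bot_actions in bots.items():
        if bot_actions == actions:
            return name
    return None

def discover(actions, bots):
    match = match_existing_bot(actions, bots)
    if match is not None:
        # A bot already automates this task; notify an administrator.
        print(f"Notify administrator: task already automated by {match}")
        return match
    # No match: encode a new bot from the recorded actions (placeholder).
    new_bot = {"name": "GeneratedBot", "steps": actions}
    print(f"Generated new bot with {len(new_bot['steps'])} steps")
    return new_bot

discover(recorded_actions, existing_bots)
```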
Additional aspects related to the invention will be set forth in part in the description which follows, and in part will be apparent to those skilled in the art from the description or may be learned by practice of the invention. Aspects of the invention may be realized and attained by means of the elements and combinations of various elements and aspects particularly pointed out in the following detailed description and the appended claims.
It is to be understood that both the foregoing and the following descriptions are exemplary and explanatory only and are not intended to limit the claimed invention or application thereof in any manner whatsoever.
The accompanying drawings, which are incorporated in and constitute a part of this specification, exemplify the embodiments of the present invention and, together with the description, serve to explain and illustrate principles of the inventive techniques disclosed herein. Specifically:
In the following detailed description, reference will be made to the accompanying drawings, in which identical functional elements are designated with like numerals. The aforementioned accompanying drawings show, by way of illustration and not by way of limitation, specific embodiments and implementations consistent with principles of the present invention. These implementations are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other implementations may be utilized and that structural changes and/or substitutions of various elements may be made without departing from the scope and spirit of the present invention. The following detailed description is, therefore, not to be construed in a limiting sense.
Certain tasks performed by certain users may be automated by way of bots 125. Users employing a bot 125 to automate a task interact with the RPA system 10 by way of control room module 122, which operates to control processing of tasks within work queues 126. Examples of such tasks, as noted above, are processing of invoices, new hire onboarding documents, and expense reports. These are simple examples, and many other tasks may be processed with RPA system 10. Each work queue (Q1, Q2, . . . , Qn) preferably includes a set of data (work items) that maps to a specific data type. For example, Q1 may contain invoices, Q2 may contain new hire onboarding documents, etc. The records in each queue 126 are processed by a bot 125 stored in storage 108. The various storage mechanisms, such as 124 and 120, are shown separately for simplicity of explanation but may constitute any of a number of forms of storage, such as separate storage systems or a single integrated storage system. Embodiments disclosed herein provide the capability to identify tasks that are manually performed by a user and to generate a bot 125 that automates the manual task.
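For purposes of illustration only, the following sketch shows one way the work-queue arrangement described above might be represented, assuming a simple in-memory mapping of data types to queues and bots. The queue contents, bot names, and dispatch loop are hypothetical and are not the actual implementation of RPA system 10.

```python
# Sketch: each queue holds work items of one data type (e.g. invoices in Q1,
# onboarding documents in Q2) and is drained by the bot registered for that type.
from collections import deque

# Work queues keyed by data type; each item is a dict of fields to process.
work_queues = {
    "invoice": deque([{"invoice_number": "INV-001", "amount": 125.00}]),
    "onboarding": deque([{"employee": "J. Doe", "start_date": "2020-01-06"}]),
}

# Bots keyed by the data type they are encoded to process (illustrative stubs).
def invoice_bot(item):
    print(f"Processing invoice {item['invoice_number']} for {item['amount']}")

def onboarding_bot(item):
    print(f"Onboarding {item['employee']} starting {item['start_date']}")

bots = {"invoice": invoice_bot, "onboarding": onboarding_bot}

# Dispatch each queued work item to the bot for its data type.
for data_type, queue in work_queues.items():
    while queue:
        bots[data_type](queue.popleft())
```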
The recorders 106 operate under control of control room 122, which controls deployment of the recorders 106. Control room 122 permits management of the system 10 by providing (i) user management to control and authorize users of the system, (ii) source control to manage development and deployment of bots 125, (iii) a dashboard to provide analytics and results of bots 125, and (iv) license management to permit usage by licensed users. The control room 122 is configured to execute instructions that, when executed, cause the RPA system 10 to respond to a request issued by a user 112, 114 from a client device 102, 104 by acting as a server to provide to the client device the capability to perform an automation task to process a work item from the plurality of work items 126. The user interacts with the control room 122 to schedule automation tasks to be performed on one or more devices as if the user were manually interacting with the necessary application program(s) and the operating system of the devices to perform the tasks directly. The control room 122 holds all software application license and user information. The control room 122 also tracks all bots that have been deployed and knows the status of all deployed bots.
The action logs 118 are each processed by data cleansing module 128 to remove extraneous user actions and to transform the data as required, such as by normalizing it for identification of the contents of the screens viewed by the user and the actions taken by the user. Examples of extraneous user actions are mouse movements that result in no action and actions that are undone by the user. Each cleaned and normalized action log 118′ is then processed at 130 to identify the user actions. This is performed by recognizing the application(s) with which the user interacts and the field(s) within each application with which the user interacts, such as by taking an action as permitted by a particular user interface object or by entering data. In some instances, the recorder 106 may have access to the metadata of an application with which the user is interacting. This would occur, for example, with a web-based application, where the fields displayed to the user are discernable from the HTML file being rendered by the user's browser. In such an instance, identification of the field with which the user is interacting, and of any action taken or value entered by the user, is readily performed by processing the code provided by the web-based application to the user's browser to render the application screen in the user's browser. Desktop-based applications may not necessarily expose as much information as a web-based application. Also, information regarding application fields, objects, controls, etc., will not be as easily discernable if, for example, the user is remotely located and is interacting with the RPA system 10 via a compliance boundary such as described in System and Method for Compliance Based Automation, filed in the U.S. Patent Office on Jan. 6, 2016, and assigned application Ser. No. 14/988,877, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety. In such instances, processing will need to be performed to automatically identify application fields from the screen image being displayed to the user. Such processing can include the use of a fingerprint generator such as described in DETECTION AND DEFINITION OF VIRTUAL OBJECTS IN REMOTE SCREENS, filed in the U.S. Patent Office on Apr. 19, 2018 and assigned application Ser. No. 15/957,030, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety. As described in the aforementioned '030 application, a fingerprint generator analyzes an image file for various objects, such as automation controls (markers) and their locations. The combination of the various objects, their metadata, properties and types, and their locations on the screen is used to generate a unique set of keys that can together represent a “fingerprint” or signature of that screen, which assists in recognizing that specific screen among a database of any other possible screens. Additional aspects of such an operation are disclosed in the patent application entitled System And Method For Resilient Automation Upgrade, filed in the U.S. Patent Office on Aug. 25, 2015 and assigned application Ser. No. 14/834,773, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety.
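As an illustrative, non-limiting sketch, the following Python code shows one possible form of the cleansing step and of a screen fingerprint computed from detected objects and their locations. The event format, the hashing scheme, and the object descriptors are assumptions made for illustration and are not those of the '030 application.

```python
import hashlib

# Hypothetical raw action log: a mouse movement with no action, an entry that
# the user subsequently undoes, and a button click.
raw_log = [
    {"type": "mouse_move", "target": None},
    {"type": "enter", "app": "AP Portal", "field": "vendor_name", "value": "Acme"},
    {"type": "undo", "app": "AP Portal", "field": "vendor_name"},
    {"type": "click", "app": "AP Portal", "field": "save"},
]

def cleanse(log):
    """Drop no-op mouse movements and any action that is subsequently undone."""
    cleaned = []
    for event in log:
        if event["type"] == "mouse_move" and event.get("target") is None:
            continue  # mouse movement that resulted in no action
        if event["type"] == "undo" and cleaned:
            cleaned.pop()  # discard the action the user undid
            continue
        cleaned.append(event)
    return cleaned

def screen_fingerprint(objects):
    """Combine object types, labels, and locations into a stable screen key."""
    parts = sorted(f"{o['type']}:{o['label']}:{o['x']},{o['y']}" for o in objects)
    return hashlib.sha256("|".join(parts).encode()).hexdigest()[:16]

print(cleanse(raw_log))
print(screen_fingerprint([
    {"type": "textbox", "label": "Vendor Name", "x": 120, "y": 80},
    {"type": "button", "label": "Save", "x": 400, "y": 300},
]))
```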
The recognition of application fields and the data contained within a recognized field in an image file may also be performed as described in AUTOMATIC KEY/VALUE PAIR EXTRACTION FROM DOCUMENT IMAGES USING DEEP LEARNING, filed in the U.S. Patent Office on Dec. 29, 2017, and assigned application Ser. No. 15/858,976, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety. The recognition of visual application objects with which a user interacts may be performed as described in AUTOMATED DETECTION OF CONTROLS IN COMPUTER APPLICATIONS WITH REGION BASED DETECTORS, filed in the U.S. Patent Office on Jul. 31, 2019, and assigned application Ser. No. 16/527,048, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety. The recognition of text in a screen image of an application may be performed as described in OPTICAL CHARACTER RECOGNITION EMPLOYING DEEP LEARNING WITH MACHINE GENERATED TRAINING DATA, filed in the U.S. Patent Office on Dec. 21, 2017, and assigned application Ser. No. 15/851,617, which application is assigned to the assignee of the present application and which application is hereby incorporated by reference in its entirety. A commercially available implementation of the ability to recognize objects and application controls in applications, including web-based applications, desktop applications, and those accessed via a compliance interface, such as a Citrix server, may be obtained from Automation Anywhere, Inc. in the form of its Object Cloning functionality.
The output of screen and field-value pair identification module 130 is a user action file 119 containing a sequential listing of actions taken by the user, with an identification of the application with which the user interacts and the fields within each application with which the user interacts. The user action file 119 is then processed by task identification module 131, in a manner described in connection with
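By way of example only, a user action file such as user action file 119 might be represented as a sequential listing of action records of the following form. The JSON structure and field names below are hypothetical and are provided solely to illustrate the kind of information described above.

```python
import json

# Hypothetical user action file: a sequential listing of actions, each
# identifying the application, screen, and field involved.
user_action_file = {
    "user": "user1",
    "actions": [
        {"seq": 1, "application": "AP Portal", "screen": "New Vendor",
         "field": "vendor_name", "action": "enter", "value": "Acme Corp"},
        {"seq": 2, "application": "AP Portal", "screen": "New Vendor",
         "field": "vendor_tax_id", "action": "enter", "value": "12-3456789"},
        {"seq": 3, "application": "AP Portal", "screen": "New Vendor",
         "field": "save", "action": "click", "value": None},
    ],
}

print(json.dumps(user_action_file, indent=2))
```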
In another embodiment, the availability of information may be further restricted by use of techniques disclosed in pending patent application entitled ROBOTIC PROCESS AUTOMATION WITH SECURE RECORDING, filed on Jan. 29, 2018 and assigned application Ser. No. 15/883,020, which application is assigned to the assignee of the current application and which application is hereby incorporated by reference in its entirety. This application describes an RPA system that provides a secure recording mode that is responsive to an operator accessible setting, that prevents presentation of full screen images, and that permits presentation of one or more selected data fields and associated labels within one or more of the full screen images. In such an embodiment, the information desired to be protected is not displayed to the user 112.
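As an illustration of the selective-capture behavior described above, the following sketch redacts the values of all fields other than those on an operator-approved list before they reach the action log. The allow-list, event format, and redaction marker are hypothetical and are not drawn from the '020 application.

```python
# Sketch: keep values only for operator-approved fields; blank out all others.
ALLOWED_FIELDS = {"invoice_number", "vendor_name"}

def redact(event):
    """Return a copy of the event with non-approved field values removed."""
    if event.get("field") not in ALLOWED_FIELDS:
        event = dict(event, value="[REDACTED]")
    return event

events = [
    {"field": "invoice_number", "value": "INV-001"},
    {"field": "ssn", "value": "123-45-6789"},
]
print([redact(e) for e in events])
```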
The embodiments in
The unsupervised task identification engine 401 may be implemented by way of a trained Deep Neural Network (DNN). The initial training of such a DNN may be performed by generating user action files from existing bots, which are encoded to perform user-level actions in conjunction with application programs. This training can then be refined with the results of the supervised task identification 403. The continued generation of data as more tasks are automated by bots will further improve the identification ability of the DNN.
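The following sketch is a simplified stand-in for such a classifier, using scikit-learn's MLPClassifier trained on action sequences derived from existing bots and then applied to a newly recorded user sequence. The training data, feature encoding, and network size are illustrative assumptions and do not describe the actual DNN or its training pipeline.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

# Each training example is an action sequence rendered as a token string,
# labeled with the task the originating bot performs (hypothetical data).
bot_sequences = [
    "open_csv paste vendor_name paste vendor_tax_id click_save",
    "open_invoice enter invoice_number enter amount click_submit",
    "open_excel paste vendor_name paste vendor_tax_id click_save",
]
bot_labels = ["new_vendor_setup", "invoice_entry", "new_vendor_setup"]

# Bag-of-actions encoding of each sequence.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(bot_sequences)

# Small neural network trained on the bot-derived examples.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X, bot_labels)

# A newly recorded user action sequence is mapped to a known task label.
new_sequence = ["open_legacy_app copy vendor_name copy vendor_tax_id click_save"]
print(clf.predict(vectorizer.transform(new_sequence)))
```

In practice, sequence models or learned embeddings of screens and fields could replace the bag-of-actions encoding used here; the sketch is only meant to show the data flow from bot-derived training examples to the labeling of newly recorded sequences.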
An example of the results of task identification is shown in the table in
Examples of different patterns that may arise are shown in
Existing bots 125 may be recognized by way of the operations shown in
Generation of a new bot (seen in
An example of the various alternative ways in which a task may be executed is as follows, for a task called New Vendor Setup. Suppose User1 opens a structured data file, such as a CSV file, and pastes selected data from the file into a screen provided by an enterprise application, such as an accounts payable application, to create a first supplier (vendor). User2 opens a different type of structured data file, such as one created by default by a spreadsheet program such as Microsoft Excel, and pastes data into the same screen of the enterprise application as done by User1 to create a second supplier. User3 opens another application, for example an older legacy system, and transfers data from various fields of the legacy system into the enterprise application used by User1 and User2 to create a third supplier. These slightly different tasks are all variations of the same “New Vendor Setup” business process.
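The following sketch illustrates, under simplifying assumptions, why such variants can be treated as one task: if each recorded sequence is reduced to a signature built from the destination application, screen, and fields written (ignoring where the data was copied from), the three users' sequences collapse to the same signature. The action records and signature function are hypothetical.

```python
def task_signature(actions):
    """Signature = destination app/screen plus the set of fields written."""
    writes = {(a["app"], a["screen"], a["field"])
              for a in actions if a["action"] == "enter"}
    return frozenset(writes)

user1 = [  # copies from a CSV file
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "name", "action": "enter"},
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "tax_id", "action": "enter"},
]
user2 = [  # copies from a spreadsheet
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "tax_id", "action": "enter"},
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "name", "action": "enter"},
]
user3 = [  # copies from a legacy system
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "name", "action": "enter"},
    {"app": "Enterprise AP", "screen": "New Vendor", "field": "tax_id", "action": "enter"},
]

# All three variants reduce to the same "New Vendor Setup" signature.
print(task_signature(user1) == task_signature(user2) == task_signature(user3))
```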
Creation of the bot includes generation of an execution file having one or more tasks for deployment. The tasks have command line arguments executable as variables by one or more remote computers. The command line arguments are assembled into a single execution file. The tasks are validated, and the nested tasks are organized by collecting nested task information for each task and accounting for all dependencies to ensure that files, tasks, and environments for running on one or more remote computers are present in the execution file. Creation of the execution file also includes scanning the tasks for event dependencies and embedding files and links needed for remote execution of the execution file. Dependencies are stored in a dependency file. The execution file and dependency file are scanned for security and verified for proper formatting.
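As a simplified illustration of assembling an execution file and its dependency file, the following sketch collects tasks, recursively gathers nested-task dependencies, checks that every referenced task is present, and writes both files. The task structure, file names, and validation shown are assumptions made for illustration only.

```python
import json

# Hypothetical tasks with command-line style arguments and nested dependencies.
tasks = {
    "create_vendor": {"args": ["--vendor-file", "$INPUT"], "depends_on": ["validate_vendor"]},
    "validate_vendor": {"args": ["--schema", "vendor.json"], "depends_on": []},
}

def collect_dependencies(name, tasks, seen=None):
    """Recursively collect nested-task dependencies, verifying each is present."""
    seen = set() if seen is None else seen
    for dep in tasks[name]["depends_on"]:
        if dep not in tasks:
            raise ValueError(f"missing dependency: {dep}")
        if dep not in seen:
            seen.add(dep)
            collect_dependencies(dep, tasks, seen)
    return seen

# Assemble a single execution file and a companion dependency file.
execution_file = {"tasks": [{"name": n, "args": t["args"]} for n, t in tasks.items()]}
dependency_file = {n: sorted(collect_dependencies(n, tasks)) for n in tasks}

with open("execution.json", "w") as f:
    json.dump(execution_file, f, indent=2)
with open("dependencies.json", "w") as f:
    json.dump(dependency_file, f, indent=2)
```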
The embodiments herein can be implemented in the general context of computer-executable instructions, such as those included in program modules, being executed in a computing system on a target real or virtual processor. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The program modules may be obtained from another computer system, such as via the Internet, by downloading the program modules from the other computer system for execution on one or more different computer systems. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Computer-executable instructions for program modules may be executed within a local or distributed computing system. The computer-executable instructions, which may include data, instructions, and configuration parameters, may be provided via an article of manufacture including a computer readable medium, which provides content that represents instructions that can be executed. A computer readable medium may also include a storage or database from which content can be downloaded. A computer readable medium may also include a device or product having content stored thereon at a time of sale or delivery. Thus, delivering a device with stored content, or offering content for download over a communication medium may be understood as providing an article of manufacture with such content described herein.
The terms “computer system,” “system,” and “computing device” are used interchangeably herein. Unless the context clearly indicates otherwise, none of these terms implies any limitation on a type of computing system or computing device. In general, a computing system or computing device can be local or distributed and can include any combination of special-purpose hardware and/or general-purpose hardware with software implementing the functionality described herein.
Computing system 1100 may have additional features such as, for example, storage 1110, one or more input devices 1114, one or more output devices 1112, and one or more communication connections 1116. An interconnection mechanism (not shown) such as a bus, controller, or network interconnects the components of the computing system 1100. Typically, operating system software (not shown) provides an operating system for other software executing in the computing system 1100, and coordinates activities of the components of the computing system 1100.
The tangible storage 1110 may be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way, and which can be accessed within the computing system 1100. The storage 1110 stores instructions for the software implementing one or more innovations described herein.
The input device(s) 1114 may be a touch input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, or another device that provides input to the computing system 1100. For video encoding, the input device(s) 1114 may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing system 1100. The output device(s) 1112 may be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1100.
The communication connection(s) 1116 enable communication over a communication medium to another computing entity. The communication medium conveys information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.
It should be understood that functions/operations shown in this disclosure are provided for purposes of explanation of operations of certain embodiments. The implementation of the functions/operations performed by any particular module may be distributed across one or more systems and computer programs and are not necessarily contained within a particular computer program and/or computer system.
While the invention has been described in connection with a preferred embodiment, it is not intended to limit the scope of the invention to the particular form set forth, but on the contrary, it is intended to cover such alternatives, modifications, and equivalents as may be within the spirit and scope of the invention as defined by the appended claims.
This application is a continuation of U.S. patent application Ser. No. 16/724,308, filed Dec. 22, 2019, and entitled “USER ACTION GENERATED PROCESS DISCOVERY,” the content of which is hereby incorporated by reference.
Number | Date | Country
---|---|---
20230072084 A1 | Mar 2023 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 16724308 | Dec 2019 | US
Child | 17967011 | | US