This application claims the benefit of and priority to U.S. Non-Provisional application Ser. No. 17/704,900, filed Mar. 25, 2022, titled “SMART ACTIONS IN APPLICATION ECOSYSTEM,” which is incorporated herein by reference in its entirety.
The present disclosure relates generally to application ecosystems and, more particularly, to navigational and executional operations in an application ecosystem.
Current application platforms promote rapid integration of many applications into an entire ecosystem of applications used by their customers and partners. Previously, the workflow of many tasks was performed within a single monolithic application or suite of applications. The advent of prebuilt integrations and new tools from application providers, along with the development of low-code application platforms, has significantly simplified and accelerated application integration for ecosystems. Many enterprise workflows extend across multiple applications, some of which are owned by the enterprise and others of which belong to customers and partners.
For example, onboarding a new employee into a company involves interactions between dozens of applications, including applicant tracking, background checking, and a human capital system. In addition to these human resources applications and systems, other applications may be used to add the new employee to the payroll system, connect the new employee with a benefits provider, enroll the new employee in learning management, and so on. Some of these applications may be developed by or belong to the enterprise, while others belong to partners and customers.
In an aspect of the present disclosure, a method includes: receiving, by a computing device, a natural language request input by a user to perform a task in an application ecosystem of a plurality of applications; determining, by the computing device, an actionable task from the natural language request to perform in the application ecosystem; generating, by the computing device, a user interface screen to perform the task with input parameters required to perform the task populated in elements of the user interface screen derived from the natural language request; and performing the task with the input parameters required in the application ecosystem of the plurality of applications.
In another aspect of the present disclosure, there is a computer program product including one or more computer readable storage media having program instructions collectively stored on the one or more computer readable storage media. The program instructions are executable to: receive, by a computing device, a natural language request input by a user to perform a task in an application ecosystem of a plurality of applications; send, by the computing device, the natural language request to perform the task to a cloud-based service providing the application ecosystem; display, by the computing device, a user interface screen to perform the task with input parameters required to perform the task populated in elements of the user interface screen derived from the natural language request; and send, by the computing device, an indication from the user to perform the task to the cloud-based service providing the application ecosystem.
In a further aspect of the present disclosure, there is a computer system including a processor, a computer readable memory, one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media. The program instructions are executable to: receive, by a computing device, a natural language request input by a user to perform a task in an application ecosystem of a plurality of applications; determine, by the computing device, an actionable task from the natural language request to perform in the application ecosystem; generate, by the computing device, a user interface screen to perform the task with input parameters required to perform the task populated in elements of the user interface screen derived from the natural language request; and perform the task with the input parameters required in the application ecosystem of the plurality of applications.
Aspects of the present disclosure are described in the detailed description which follows, in reference to the noted plurality of drawings by way of non-limiting examples of exemplary embodiments of the present disclosure.
The present disclosure relates generally to application ecosystems and, more particularly, to smart actions in an application ecosystem and methods of operation. In more specific embodiments, the present disclosure provides navigational and executional operations in an application ecosystem. For example, an application ecosystem for human capital management such as ADP® NextGen HCM can now provide smart actions in an evolutionary user interface system for users to interact with the application ecosystem. Advantageously, aspects of the present disclosure provide a novel mechanism in which users can access the navigational and executional actions they want to perform in an application ecosystem in a manner that is faster than traditional menu-driven access and that does not require in-depth familiarity with the application ecosystem or its user interfaces. For example, a user can be navigated easily to a deeply embedded page of an ecosystem to perform a desired action, and the user can specify in the query any parameters necessary to perform the desired action, which are used to prefill input parameters on the page for that action.
In more specific embodiments, a user may access navigational and executional operations within an application ecosystem, such as ADP® NextGen HCM, by entering a natural language request in the application ecosystem, for instance, as a search query in a global search bar of the home page of the application ecosystem. Accordingly, the user may simply describe their intention in a query typed into a search bar. The system, in embodiments, applies machine learning classifiers to determine an actionable navigational or executional operation of the application ecosystem in the request. As part of understanding what action the user is requesting, the system, in embodiments, may also apply machine learning classifiers to determine required parameters of an actionable task of the application ecosystem in the request. Thus, the machine learning classifiers can learn how a user expresses their intent and understand the actions the user desires to perform in the ecosystem.
The system presents the action to the user in an executable format as a search result in the global search bar, which the user may select, in embodiments. For a navigational operation presented as a search result, the system generates a hyperlink (“link”) to a page in the application ecosystem that can be loaded on the user's device if selected. Thus, the user can click this link and be taken to the page the user desires to visit. For a selected executional operation presented as a search result, the system may generate a display page, in embodiments, for performing the task with input parameters populated in elements of the display page derived from the request, and sends the display page to the user for presentation. For example, the system can show a confirmation box of the intended outcome if the user typed, for instance, “take leave from Monday to Thursday.” The confirmation box would confirm the dates understood and possibly allow the user to enter any additional information needed to take the leave (such as notes for a manager). Once the user confirms this information, changing it on the confirmation box if needed, the user can submit the leave request simply by clicking a displayed confirmation button.
Upon receiving confirmation by the user to perform the task with the input parameters, the system, in embodiments, executes the task in the application ecosystem. Such interface access to the ecosystem is faster than traditional menu-driven access and does not require in-depth familiarity with the application ecosystem or its user interfaces: the user can navigate to a deeply embedded page or perform an action simply by specifying the details of the action in a natural language request.
Moreover, the system responds to the user's description of the navigational and executional actions they want to perform and helps the user execute the action, instead of the user adapting to the system and learning the system's menu-driven navigational and executional user interface and application ecosystem workflow to perform desired tasks. By adapting to the user, aspects of the present disclosure help a user become acclimated to the ecosystem of applications.
Embodiments of the present disclosure may be implemented in many different application ecosystems, such as an application ecosystem of productivity tools, an application ecosystem for enterprise operations, an application ecosystem for workflow automation, and so forth. For instance, embodiments of the present disclosure may be implemented in an application ecosystem for human capital management that may include such applications as applicant tracking, performance management, payroll, benefits administration, and onboarding, among other applications.
The smart actions of the evolutionary user interface system for users to interact with an application ecosystem provide a technical solution to a problem arising from the integration of many applications into an entire ecosystem of applications used by an enterprise, its customers, and partners. This is done by facilitating users' access to the navigational and executional actions they want to perform in an application ecosystem without the need for in-depth familiarity with the application ecosystem or its menu-driven user interfaces. In doing so, the technical solution changes the way computers operate in interacting with users through the technological improvements of deploying machine learning classifiers to understand the natural language intentions of a user interacting with the computer system, translating those requests into actions performable by the computer system, and executing those actions, among other features as described herein. Unlike conventional systems, in which a user learns and adapts to a menu-driven user interface and application ecosystem workflow to perform desired tasks, a computer system implementing the technological improvements of the present disclosure adapts to the user, learning how the user describes the actions they want to perform and helping the user execute those actions.
Implementations of the present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
As shown in
The bus 110 permits communication among the components of computing device 105. For example, bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures to provide one or more wired or wireless communication links or paths for transferring data and/or power to, from, or between various other components of computing device 105.
The processor 115 may be one or more processors or microprocessors that include any processing circuitry operative to interpret and execute computer readable program instructions, such as program instructions for controlling the operation and performance of one or more of the various other components of computing device 105. In embodiments, processor 115 interprets and executes the processes, steps, functions, and/or operations of the present disclosure, which may be operatively implemented by the computer readable program instructions. For example, processor 115 enables the computing device 105 to provide services for users on client devices to request navigational and executional operations within an application ecosystem, deploy machine learning classifiers to understand the natural language intentions of users, translate their requests to performable actions, and execute those actions, as described in more detail herein.
In embodiments, processor 115 may receive input signals from one or more input devices 130 and/or drive output signals through one or more output devices 135. The input devices 130 may be, for example, a keyboard, touch sensitive user interface (UI), etc., as is known to those of skill in the art such that no further description is required for a complete understanding of the present disclosure. The output devices 135 can be, for example, any display device, printer, etc., as is known to those of skill in the art such that no further description is required for a complete understanding of the present disclosure.
The storage device 120 may include removable/non-removable, volatile/non-volatile computer readable media, such as, but not limited to, non-transitory media such as magnetic and/or optical recording media and their corresponding drives. The drives and their associated computer readable media provide for storage of computer readable program instructions, data structures, program modules and other data for operation of computing device 105 in accordance with the different aspects of the present disclosure. In embodiments, storage device 120 may store operating system 145, application programs 150, and program data 155 in accordance with aspects of the present disclosure.
The system memory 125 may include one or more storage mediums, including for example, non-transitory media such as flash memory, permanent memory such as read-only memory (“ROM”), semi-permanent memory such as random access memory (“RAM”), any other suitable type of storage component, or any combination thereof. In some embodiments, a basic input/output system (BIOS) 160, including the basic routines that help to transfer information between the various other components of computing device 105, such as during start-up, may be stored in the ROM. Additionally, data and/or program modules 165, such as at least a portion of operating system 145, application programs 150, and/or program data 155, that are accessible to and/or presently being operated on by processor 115 may be contained in the RAM.
The communication interface 140 may include any transceiver-like mechanism (e.g., a network interface, a network adapter, a modem, or combinations thereof) that enables computing device 105 to communicate with remote devices or systems, such as a mobile device or other computing devices such as, for example, a server in a networked environment, e.g., cloud environment. For example, computing device 105 may be connected to remote devices or systems via one or more local area networks (LAN) and/or one or more wide area networks (WAN) using communication interface 140.
As discussed herein, computing system 100 may be configured as a special-purpose computing device providing navigational and executional operations for tasks in an application ecosystem requested in a natural language query from a user. In particular, computing device 105 may perform tasks (e.g., process, steps, methods and/or functionality) in response to processor 115 executing program instructions contained in a computer readable medium, such as system memory 125. The program instructions may be read into system memory 125 from another computer readable medium, such as data storage device 120, or from another device via the communication interface 140 or server within or outside of a cloud environment. In embodiments, an operator may interact with computing device 105 via the one or more input devices 130 and/or the one or more output devices 135 to facilitate performance of the tasks and/or realize the end results of such tasks in accordance with aspects of the present disclosure. In additional or alternative embodiments, hardwired circuitry may be used in place of or in combination with the program instructions to implement the tasks, e.g., steps, methods and/or functionality, consistent with the different aspects of the present disclosure. Thus, the steps, methods and/or functionality disclosed herein can be implemented in any combination of hardware circuitry and software.
As depicted in
Cloud computing environment 200 may be configured such that cloud resources 205 provide computing resources to client devices 210 through a variety of service models, such as Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS), and/or any other cloud service models. Cloud resources 205 may be configured, in some cases, to provide multiple service models to a client device 210. For example, cloud resources 205 can provide both SaaS and IaaS to a client device 210. Cloud resources 205 may be configured, in some cases, to provide different service models to different client devices 210. For example, cloud resources 205 can provide SaaS to a first client device 210 and PaaS to a second client device 210.
Cloud computing environment 200 may be configured such that cloud resources 205 provide computing resources to client devices 210 through a variety of deployment models, such as public, private, community, hybrid, and/or any other cloud deployment model. Cloud resources 205 may be configured, in some cases, to support multiple deployment models. For example, cloud resources 205 can provide one set of computing resources through a public deployment model and another set of computing resources through a private deployment model.
In embodiments, software and/or hardware that performs one or more of the aspects, functions and/or processes described herein may be accessed and/or utilized by a client (e.g., an enterprise or an end user) under one or more of the SaaS, PaaS, and IaaS models in one or more of a private, community, public, or hybrid cloud. Moreover, although this disclosure includes a description of cloud computing, the systems and methods described herein are not limited to cloud computing and instead can be implemented on any suitable computing environment.
Cloud resources 205 may be configured to provide a variety of functionality that involves user interaction. Accordingly, a user interface (UI) can be provided for communicating with cloud resources 205 and/or performing tasks associated with cloud resources 205. The UI can be accessed via a client device 210 in communication with cloud resources 205. The UI can be configured to operate in a variety of client modes, including a fat client mode, a thin client mode, or a hybrid client mode, depending on the storage and processing capabilities of cloud resources 205 and/or client device 210. Therefore, a UI can be implemented as a standalone application operating at the client device in some embodiments. In other embodiments, a web browser-based portal can be used to provide the UI. Any other configuration to access cloud resources 205 can also be used in various implementations.
Server 302 includes, in a server memory 304, such as memory 125 described with respect to
The server 302 also includes, in the server memory 304, a query results processing module 310 having functionality to receive the actionable navigational and executional operations of the application ecosystem determined by the machine learning classifiers 308 from a user's request and to generate a search results page. For a navigational operation, the search results page displays a link to a page in the application ecosystem that can be loaded on the user's device if selected by the user. For an executional operation, in embodiments, the search results page displays a link to a display page that can be loaded to perform the task, with input parameters derived from the request populated in graphical user interface elements of the display page.
The server 302 further includes, in the server memory 304, a user interface builder module 312 having functionality, in embodiments, to generate graphical user interface elements for a display page used to perform the task, with the input parameters populated in elements of the display page. In an implementation, the graphical user interface elements may be provided for the application in the ecosystem that performs the executional operation.
Additionally, the server 302 includes, in the server memory 304, an application execution module 314 having functionality to invoke execution of the navigational and executional operations in the application ecosystem. In an implementation, the application execution module 314 may invoke application programming interfaces of the applications 316 to perform the navigational and executional operations.
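By way of non-limiting illustration, the following minimal sketch shows how an application execution module could dispatch an executional operation to an application programming interface of an application in the ecosystem. The function names and the dispatch table are hypothetical stand-ins and are not the actual interfaces of the applications 316.

```python
# Hypothetical API clients for two ecosystem applications (illustrative only).
def time_off_api_create_request(time_off_type, start_date, end_date):
    print(f"Creating {time_off_type} request from {start_date} to {end_date}")

def org_api_view_info(associate):
    print(f"Looking up organization info for {associate}")

# Map each executional operation to the application API call that performs it.
ACTION_DISPATCH = {
    "Request Time off": lambda p: time_off_api_create_request(
        p["timeOffType"], p["startDate"], p["endDate"]),
    "View Org Info": lambda p: org_api_view_info(p["associate"]),
}

def execute_action(action_name, parameters):
    """Invoke the application API registered for the requested operation."""
    return ACTION_DISPATCH[action_name](parameters)
```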
Finally, the server 302 includes, in the server memory 304, the applications 316 of the ecosystem that are available to client devices. For example, the applications of an ecosystem for human resources provided as a service to client devices in the cloud computing environment 300 may include applicant tracking, performance management, payroll, benefits administration, and onboarding, among other applications.
The action request handler module 306, the machine learning classifiers 308, the query results processing module 310, the user interface builder module 312, the application execution module 314, and the applications 316 may each comprise one or more program modules such as program modules 165 described with respect to
In accordance with aspects of the disclosure,
The executional operation, “Request Time off,” in the declaration 404 may include three parameters declared for the executional operation: “timeOffType” 408 assigned the value of “casual leave,” “startDate” 410 assigned the value of “2021-4-15T00:00+000,” and “endDate” 412 assigned the value of “2021-4-19T00:00+000.” Each of these values assigned to the parameters in the declaration 404 may be derived from the natural language request by a machine learning classifier, in embodiments. For an actionable execution request determined by a machine learning classifier, the system may generate a display page, such as display page 414, for performing the task with input parameters populated in graphical user interface elements of the display page derived from the request. In embodiments, a user can change the values of the parameters of the executional operation, such as “casual leave” as an example, and can select a graphical user interface control to submit the request to perform the executional operation.
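By way of non-limiting illustration, the structured result of classifying such a request might resemble the following Python dictionary, which mirrors the “Request Time off” declaration 404 and its three parameters; the dictionary layout itself is hypothetical and is used here only to show how a display page could be prefilled.

```python
# Sketch of a declaration derived from a request for casual leave (cf. declaration 404).
declaration = {
    "action": "Request Time off",
    "parameters": {
        "timeOffType": "casual leave",        # parameter 408
        "startDate": "2021-4-15T00:00+000",   # parameter 410
        "endDate": "2021-4-19T00:00+000",     # parameter 412
    },
}

# The user interface builder can prefill form elements of the display page from
# the declaration; the user may edit any value before submitting the request.
for name, value in declaration["parameters"].items():
    print(f"{name}: {value}")
```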
Moreover, the exemplary flowcharts and/or block diagrams can be illustrative of a system, a process, and/or a computer program product and related functionality implemented on the computing system of
At step 602, the system receives a natural language request from a user to perform a task in an application ecosystem. For example, a user may access navigational and executional operations within an application ecosystem, in embodiments, by entering a natural language request in the application ecosystem, for instance, as a search query in a global search bar of the home page of the application ecosystem displayed on a client device. In embodiments, an action request handler 306 implemented on a server 302, each described with respect to
At step 604, the system determines an actionable task from the natural language request to perform in the application ecosystem. In embodiments, the system applies machine learning classifiers to determine an actionable navigational or executional operation of the application ecosystem in the request. For example, the natural language request entered by a user may be an executional operation requesting time off. In an implementation, there may be a machine learning classifier for each navigational and executional task in the application ecosystem as an example. In embodiments, machine learning classifiers 308 as described with respect to
At step 606, the system may determine required input parameters from the natural language request to perform the actionable task. As part of understanding what action the user is requesting, the system, in embodiments, applies machine learning classifiers to determine required parameters of an actionable task of the application ecosystem in the request. For example, a natural language request for time off entered by a user may include parameters such as the type of time off, a start date and an end date for the period of time off. The machine learning classifiers 308 as described with respect to
At step 608, the system displays to the user a user interface screen to perform the task, with the required input parameters, derived from the natural language request, populated in elements of the user interface screen. For an actionable execution request determined by a machine learning classifier, a UI builder module 312 described with respect to
At step 610, the system receives confirmation from the user to perform the task with the populated parameters. In embodiments, a user on a client device can select a graphical user interface control to confirm the request to perform the executional operation, such as the graphical user interface control labeled “Submit Request” on display page 414 shown in
At step 612, the system executes the task with the populated parameters in the application ecosystem. For example, the application execution module 314 described with respect to
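By way of non-limiting illustration, the following condensed sketch traces steps 602-612 with stub functions standing in for the machine learning classifiers 308, the user interface builder module 312, and the application execution module 314; the function names, return values, and prefilled parameters are hypothetical.

```python
def classify_request(query):
    # Stand-in for the machine learning classifiers 308 (steps 604-606).
    return "Request Time off", {"timeOffType": "casual leave",
                                "startDate": "2021-04-15", "endDate": "2021-04-19"}

def build_prefilled_screen(task, params):
    # Stand-in for the user interface builder module 312 (step 608).
    return {"task": task, "fields": dict(params), "control": "Submit Request"}

def confirm_with_user(screen):
    # Stand-in for step 610: a real system waits for the user's confirmation,
    # possibly after the user edits the prefilled values.
    return screen["fields"]

def execute_task(task, params):
    # Stand-in for the application execution module 314 (step 612).
    print(f"Executing '{task}' with {params}")

def handle_request(query):                         # step 602
    task, params = classify_request(query)         # steps 604-606
    screen = build_prefilled_screen(task, params)  # step 608
    confirmed = confirm_with_user(screen)          # step 610
    execute_task(task, confirmed)                  # step 612

handle_request("take casual leave from April 15 to April 19")
```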
At step 702, a client device receives a natural language request from a user to perform a task in an application ecosystem. For example, a user may enter a natural language request in the application ecosystem as a search query in a global search bar of the home page of the application ecosystem displayed on a client device such as client device 210 described with respect to
At step 704, the client device sends the natural language request to perform the task to a cloud-based service providing the application ecosystem. For example, the client device, such as client device 210 described with respect to
At step 706, the client device can receive instructions to display a user interface screen for performing the task with required input parameters populated in elements of the user interface screen derived from the natural language request. For example, the client device, such as client device 210 described with respect to
At step 708, the client device can display the user interface screen for performing the task with required input parameters populated in elements of the user interface screen derived from the natural language request. For example, a display page, such as display page 414 shown in
At step 710, the client device can receive an indication from the user to perform the task with the input parameters populated in elements of the user interface screen. In embodiments, a user on a client device, such as client device 210 described with respect to
At step 712, the client device can send the indication from the user to perform the task to the cloud-based service providing the application ecosystem. Upon a user selecting a graphical user interface control to confirm the request to perform the executional operation, such as the graphical user interface control labeled “Submit Request” on display page 414 shown in
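By way of non-limiting illustration, the following client-side sketch corresponds to steps 702-712. The REST endpoints and payload fields are assumptions made only for the sketch; the actual interface between a client device and the cloud-based service is not specified here.

```python
import requests

SERVICE_URL = "https://ecosystem.example.com/api"  # hypothetical service endpoint

def submit_query(query):
    # Steps 702-704: send the user's natural language request to the service.
    resp = requests.post(f"{SERVICE_URL}/actions/search", json={"query": query})
    resp.raise_for_status()
    return resp.json()  # step 706: a description of the prefilled screen

def render_screen(screen):
    # Step 708: display the screen with input parameters already populated.
    for field, value in screen.get("fields", {}).items():
        print(f"{field}: {value}")

def confirm_task(screen):
    # Steps 710-712: send the user's confirmation (and any edited values) back.
    resp = requests.post(f"{SERVICE_URL}/actions/execute",
                         json={"task": screen["task"], "fields": screen["fields"]})
    resp.raise_for_status()
    return resp.json()
```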
At step 802, a server receives a natural language request input by a user to perform a task in an application ecosystem; the request may include input parameters. For example, a server in a cloud computing environment, such as server 302 described with respect to
At step 804, the server determines an actionable task in the application ecosystem from the natural language request. In embodiments, the server applies machine learning classifiers to determine an actionable navigational or executional operation of the application ecosystem in the request. For example, the natural language request entered by a user may be an executional operation requesting time off. In an implementation, there may be a machine learning classifier for each navigational and executional task in the application ecosystem, as an example. In embodiments, machine learning classifiers 308 implemented on a server 302 described with respect to
At step 806, the server generates a user interface screen for performing the task with required input parameters populated in elements of the user interface screen. For an actionable execution request determined by a machine learning classifier, a UI builder module 312 implemented on a server 302 described with respect to
At step 808, the server sends to the user device the user interface screen for performing the task with required input parameters populated in elements of the user interface screen. In embodiments, an action request handler 306 implemented on a server 302 described with respect to
At step 810, the server receives an indication from the user device to perform the task with the input parameters populated in elements of the user interface screen. Upon a user selecting a graphical user interface control to confirm the request to perform the executional operation, such as the graphical user interface control labeled “Submit Request” on display page 414 shown in
At step 812, the server performs the task with the input parameters populated in elements of the user interface screen. For example, the application execution module 314 implemented on server 302 described with respect to
At step 902, the server inputs the natural language request to each machine learning classifier for a parameter of an actionable task for each application in the application ecosystem. In embodiments, the action request handler 306 implemented on a server 302 described with respect to
At step 904, each machine learning classifier for a parameter of an actionable task determines probabilities that the request includes required parameters of an actionable task for a respective application in the application ecosystem. In embodiments, each machine learning classifier 318 for a parameter of an actionable task implemented on a server 302 described with respect to
In embodiments, a machine learning classifier may be built and trained as a global supervised text classification model for each navigational and executional operation or action and for each parameter of each navigational and executional operation or action. For example, the executional operation, “Request Time off,” has three parameters: “timeOffType,” “startDate,” and “endDate.” Four machine learning classifiers can be built for this executional operation: one for “Request Time off,” one for “timeOffType,” one for “startDate,” and one for “endDate.” A single text classification model may be built and trained across all users for each navigational and executional operation or action. In embodiments, a text classification model can be built for each parameter of each navigational and executional operation or action and may be trained across all users or for specific users. For example, the model for the parameter “timeOffType” of the executional operation “Request Time off” may be trained for a specific user. The model for the parameter “associate” of an executional operation “View Org Info,” used to view the name of an associate in an organization, may be trained across all users, for instance, in embodiments.
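By way of non-limiting illustration, the following sketch keeps one supervised text classification model for the “Request Time off” operation and one for each of its parameters, using scikit-learn as an assumed implementation; the training examples and labels are hypothetical.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def new_text_classifier():
    # A simple supervised text classifier; the full ensemble is described below.
    return make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())

# One model for the operation itself and one model for each of its parameters.
models = {
    "Request Time off": new_text_classifier(),
    "Request Time off/timeOffType": new_text_classifier(),
    "Request Time off/startDate": new_text_classifier(),
    "Request Time off/endDate": new_text_classifier(),
}

# Hypothetical training data: queries labeled with the operation they request.
models["Request Time off"].fit(
    ["take leave from monday to thursday", "request casual leave next week",
     "show my org chart", "view my payslip"],
    ["Request Time off", "Request Time off", "other", "other"])

# Hypothetical training data: text spans labeled with a normalized time-off type.
models["Request Time off/timeOffType"].fit(
    ["casual leave", "take casual", "sick leave", "vacation days"],
    ["casual leave", "casual leave", "sick leave", "vacation"])
```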
Each of these text classification models may be built as an ensemble machine learning text classifier using three types of estimators: a naïve Bayes classifier using full-word features such as unigrams and bigrams, a naïve Bayes classifier using sub-string and prefix features such as sub-strings of 1-5 characters and prefix strings of up to 10 characters, and an estimator pipeline that combines a text embedding with a kNN classifier using an approximate nearest-neighbor search algorithm. The text embedding can be a pre-trained multi-lingual text embedding, in embodiments.
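By way of non-limiting illustration, the following sketch assembles such an ensemble with scikit-learn components used as stand-ins: a character n-gram TF-IDF vectorizer replaces the pre-trained multi-lingual text embedding, exact k-nearest neighbors replaces the approximate nearest-neighbor search, and the prefix features are omitted for brevity.

```python
from sklearn.ensemble import VotingClassifier
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

word_nb = make_pipeline(  # naïve Bayes on full-word features: unigrams and bigrams
    CountVectorizer(analyzer="word", ngram_range=(1, 2)), MultinomialNB())
char_nb = make_pipeline(  # naïve Bayes on sub-string features: 1-5 character n-grams
    CountVectorizer(analyzer="char_wb", ngram_range=(1, 5)), MultinomialNB())
knn = make_pipeline(      # stand-in for text embedding + approximate nearest neighbors
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    KNeighborsClassifier(n_neighbors=5))

ensemble = VotingClassifier(
    estimators=[("word_nb", word_nb), ("char_nb", char_nb), ("knn", knn)],
    voting="soft")  # soft voting averages the per-estimator class probabilities

# Usage: ensemble.fit(training_texts, labels) and then
# ensemble.predict_proba(["take casual leave from monday to thursday"]).
```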
For a rolling window of 1-3 words in the natural language request, the machine learning classifiers can predict probabilities for up to 5 candidate parameters that the request includes for an actionable task of a respective application in the application ecosystem at step 904, and can predict probabilities for up to 5 candidate actionable tasks that the request includes at step 906.
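By way of non-limiting illustration, the following sketch generates the rolling 1-3 word windows over a request and keeps up to five candidates ranked by a classifier's score; the scoring function here is a stub standing in for a parameter classifier's predicted probability.

```python
def word_windows(text, max_len=3):
    """Yield every span of 1 to max_len consecutive words in the request."""
    words = text.split()
    for size in range(1, max_len + 1):
        for i in range(len(words) - size + 1):
            yield " ".join(words[i:i + size])

def top_candidates(text, score_span, k=5):
    """Return up to k spans ranked by the classifier's probability for them."""
    scored = [(span, score_span(span)) for span in word_windows(text)]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Stub scorer: in practice, score_span would call predict_proba on the model
# built for a specific parameter (e.g., the "timeOffType" classifier).
print(top_candidates("take casual leave from monday to thursday",
                     lambda span: 0.9 if "leave" in span else 0.1))
```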
At step 906, each machine learning classifier for each navigational and executional operation determines probabilities that the request includes an actionable task for a respective application in the application ecosystem for parameters identified in the request. In embodiments, each machine learning classifier 318 for each navigational and executional operation implemented on a server 302 described with respect to
At step 908, each machine learning classifier for each navigational and executional operation determines probabilities that the request includes both an actionable task and its required parameters for a respective application in the application ecosystem. In embodiments, each machine learning classifier 318 for each navigational and executional operation implemented on a server 302 described with respect to
At step 910, the system ranks actionable task candidates by their probabilities that the request includes both an actionable task and its required parameters for an application in the application ecosystem. In embodiments, a query results processing module 310 implemented on a server 302 described with respect to
At step 912, the system selects the actionable task with the highest probability that the request includes both an actionable task and its required parameters for an application in the application ecosystem. In embodiments, a query results processing module 310 implemented on a server 302 described with respect to
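By way of non-limiting illustration, the following sketch corresponds to steps 908-912: each candidate task's probability is combined with the probabilities of its required parameters, the candidates are ranked, and the highest-scoring candidate is selected. The probability values are illustrative only, and multiplication is an assumed way to combine them; the disclosure does not prescribe a particular combination rule.

```python
# Hypothetical candidate tasks with classifier-assigned probabilities.
candidates = [
    {"task": "Request Time off", "task_prob": 0.92,
     "param_probs": {"timeOffType": 0.88, "startDate": 0.95, "endDate": 0.94}},
    {"task": "View Org Info", "task_prob": 0.31,
     "param_probs": {"associate": 0.40}},
]

def combined_probability(candidate):
    # Combine the task probability with every required parameter's probability.
    score = candidate["task_prob"]
    for p in candidate["param_probs"].values():
        score *= p
    return score

ranked = sorted(candidates, key=combined_probability, reverse=True)  # step 910
best = ranked[0]                                                     # step 912
print(best["task"], round(combined_probability(best), 3))
```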
The foregoing examples have been provided merely for the purpose of explanation and are in no way to be construed as limiting of the present disclosure. While aspects of the present disclosure have been described with reference to an exemplary embodiment, it is understood that the words which have been used herein are words of description and illustration, rather than words of limitation. Changes may be made, within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of the present disclosure in its aspects. Although aspects of the present disclosure have been described herein with reference to particular means, materials and embodiments, the present disclosure is not intended to be limited to the particulars disclosed herein; rather, the present disclosure extends to all functionally equivalent structures, methods and uses, such as are within the scope of the appended claims.
| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 17704900 | Mar. 25, 2022 | US |
| Child | 18737776 | | US |