This application claims the benefit of Indian Appl. No. 201941006158, filed Feb. 15, 2019, which is incorporated herein by reference in its entirety to the extent consistent with the present application.
Organizations that provide resources to individuals and other organizations usually have some sort of resource management system to control the flow of resources to requestors. For example, organizations that sell goods or services oftentimes utilize some sort of order management system or mechanism. In general, order management is a collection of processes and actions to successfully deliver a product or service to a customer. These processes and actions typically involve receiving order requests from customers, organizing the order requests, tracking the order requests, and then fulfilling the orders. Organizations may allocate resources, including computing resources, software, and manpower to their order management systems, in part, because streamlined and efficient order management may support increased revenue along with satisfied and repeat customers as a result of customer orders being timely fulfilled. Similarly, some organizations that provide computing resources to individuals and other organizations may have some sort of computing resource management system or mechanism to facilitate timely provisioning of requested resources to requestors.
A part of managing the provisioning of resources includes risk mitigation as it relates to the requestor. In the example of the organization providing goods or services, the organization may allow certain customers to order and receive a certain amount of goods or services on “credit.” In this context, “credit” refers to fulfillment prior to payment for the goods or services. However, as part of the risk mitigation, the organization may limit the amount of credit extended to a particular customer. Credit may be limited based on factors such as the customer's revenue or income, prior credit history, prior payment history with the organization, etc. In the computer resource management example, certain constraints may likewise be placed on a requestor's access to computing resources. In this computer resource management context, a requestor may represent a customer (e.g., of cloud-based resources) or another computer process requesting additional resources automatically.
A problem arises, however, when a customer or requestor of resources places an order or request that causes the customer or requestor to exceed an allotted credit or resource limit. In such a case, the customer's order or requestor's request may be placed on hold pending further processing. This further processing may involve additional computing and manpower resources to resolve the hold. Additionally, further processing may delay or may even inhibit revenue flow, for example, if the order or request is denied. Moreover, these holds often lead to decreased customer or requestor satisfaction due to a delay or denial in fulfilling the order or request. These reactive responses to resolving holds are, thus, undesirable for both the organization and the customer or requestor.
The present disclosure may be better understood from the following detailed description when read with the accompanying Figures. It is emphasized that, in accordance with standard practice in the industry, various features are not drawn to scale. In fact, the dimensions or locations of functional attributes may be relocated or combined based on design, security, performance, or other factors known in the art of computer systems. Further, order of processing may be altered for some functions, both internally and with respect to each other. That is, some functions may not require serial processing and therefore may be performed in an order different than shown or possibly in parallel with each other. For a detailed description of various examples, reference will now be made to the accompanying drawings, in which:
Examples of the subject matter claimed below will now be disclosed. In the interest of clarity, not all features of an actual implementation are described in each example of this specification. It will be appreciated that in the development of any such actual example, numerous implementation-specific decisions may be made to achieve the developer's specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort, even if complex and time-consuming, would be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
Automated systems are provided to predict, using artificial intelligence and machine learning, how to address an order hold scenario for a request for resources, also referred to herein as a resource request. The resource request may be an order for tangible goods or an order for computer resources. For example, a buyer in a business-to-business (“B2B”) transaction may place an order apparently in violation of a credit relationship (e.g., exceeding a credit limit). Systems as disclosed may predict whether the order may be placed on hold due to the apparent violation or whether a temporary waiver of the violation may be acceptable. Similar techniques may be applied to a computer management system where resource constraints are predicted to be violated for an acceptable amount of time (e.g., without system failure). Recommendations may be determined and automatically or manually applied so that the order hold (or resource request denial in the context of a computer management system) may be avoided.
Referring now to
Flow continues to block 110A where predictive analysis begins by retrieving a requestor's data to provide as input to a predictive analysis algorithm. The requestor's data may include historical data of previous requests for a resource by the requestor and additional data for the requestor. The additional data may include a resource limit assigned to the requestor (also referred to herein as a threshold limit for the resource), balance history for the requestor (which may include an outstanding payment balance for the requestor), payment history for the requestor, etc. Flow continues to block 115A where the predictive analysis algorithm is invoked using the previously retrieved data for the requestor. Output of the predictive analysis is collected in block 120A. Flow continues to decision 125A where an evaluation of the output of the prediction analysis algorithm is used to determine if a request is predicted to be placed on hold. The predictive algorithm of block 115A may also be used to predict a future request for a resource by the requestor and to predict a hold on fulfilling the future request. If there is no order hold predicted, the NO prong of decision 125A, flow continues to decision 130A. Alternatively, if an order hold is predicted, the YES prong of decision 125A, flow continues to block 140A where a recommendation algorithm uses the prediction analysis algorithm results to determine one or more possible preventative actions to minimize actualization (i.e., realization or materialization) of the hold on fulfilling the future request. For instance, the recommendation algorithm may determine one or more recommendations to prevent a request from being placed on hold.
In an example, at least a part of the prediction analysis algorithm may include predicting, based on the historical data of previous requests, a future request for the resource from the requestor and predicting, based on the future request for the resource and the additional data for the requestor, a hold on fulfilling the future request. At least part of the prediction algorithm may also include predicting, based on the payment history, a future payment for the resource and predicting the hold on fulfilling the future request based on the predicted future payment for the resource. Additionally, at least part of the prediction algorithm may also include predicting a date and amount of the future request based on the historical data of previous requests, predicting a date and amount of the future payment based on the payment history, and predicting the hold on fulfilling the future request based on the date and amount of the future request, the date and amount of the future payment, a resource limit, and an outstanding payment balance.
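Purely as an illustration of this combination of inputs, the following sketch (written in Python; the names, values, and the simple exposure arithmetic are assumptions rather than the specific algorithm of this disclosure) shows one way a predicted future request, a predicted future payment, a resource limit, and an outstanding payment balance might be combined to predict a hold:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class PredictedEvent:
        when: date      # predicted date of the future request or payment
        amount: float   # predicted amount of the future request or payment

    def predict_hold(next_request: PredictedEvent,
                     next_payment: PredictedEvent,
                     resource_limit: float,
                     outstanding_balance: float) -> bool:
        """Return True if the predicted future request is expected to be held."""
        exposure = outstanding_balance + next_request.amount
        # If the predicted payment is expected to arrive on or before the
        # predicted request date, assume it reduces the outstanding balance first.
        if next_payment.when <= next_request.when:
            exposure -= next_payment.amount
        return exposure > resource_limit

    # Hypothetical usage with illustrative values.
    hold_predicted = predict_hold(
        next_request=PredictedEvent(date(2019, 3, 10), 12000.0),
        next_payment=PredictedEvent(date(2019, 3, 20), 5000.0),
        resource_limit=15000.0,
        outstanding_balance=8000.0)   # evaluates to True in this example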
Flow continues to block 145A to evaluate if one or more of the recommendations produced by the recommendation algorithm meets criteria for automatically applying the recommendation without user input. The flow then continues to decision 150A. If decision 150A determines to automatically apply the recommendation, the YES prong of decision 150A, the flow proceeds to block 160A. In block 160A the requestor's account is automatically updated based on the one or more recommendations that met the criteria. Alternatively, the NO prong of decision 150A branches the flow to block 155A. At block 155A, the recommendation can be saved for later presenting to a user, for instance through a user interface. This allows the user to determine whether or not to apply one or more of the recommendations, and then update the requestor's account based on the one or more recommendations. Accordingly, a preventative action can be applied in response to input through the user interface. Decision 125A and blocks 155A and 160A flow to decision 130A after completion. At decision 130A a determination is made to evaluate if more requestors are waiting to have the prediction and recommendation algorithms applied. If there are more requestors to evaluate, the YES prong of decision 130A, flow continues back to block 110A where the next requestor is chosen. Alternatively, if there are no more requestors to evaluate (e.g., apply the prediction and recommendation algorithms), the NO prong of decision 130A, process flow continues to block 135A where evaluation stops.
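As one hypothetical illustration of the criteria check of blocks 145A and 150A (the threshold values, field names, and cap below are assumptions, not requirements of the disclosure), a recommendation might be applied automatically only when its prediction certainty is high enough and its proposed change stays within a configured cap; otherwise it is saved for user review:

    from dataclasses import dataclass

    @dataclass
    class Recommendation:
        action: str                     # e.g., "increase limit", "request payment"
        confidence: float               # certainty assigned by the prediction algorithm
        proposed_limit_increase: float

    def meets_auto_apply_criteria(rec: Recommendation,
                                  min_confidence: float = 0.9,
                                  max_increase: float = 10000.0) -> bool:
        """Hypothetical criteria for applying a recommendation without user input."""
        return (rec.confidence >= min_confidence
                and rec.proposed_limit_increase <= max_increase)

    def process(recommendations, apply_automatically, save_for_user):
        # Blocks 150A/155A/160A: apply qualifying recommendations automatically,
        # queue the rest for later presentation through a user interface.
        for rec in recommendations:
            if meets_auto_apply_criteria(rec):
                apply_automatically(rec)
            else:
                save_for_user(rec)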
Referring now to
Flow continues to block 110B where predictive analysis begins by retrieving the buyer's data to provide as input to a predictive analysis algorithm. Flow continues to block 115B where the predictive analysis algorithm is invoked using the previously retrieved data for the requestor. Output of the predictive analysis is collected in block 120B. For example, the predictive analysis algorithm may predict an amount and a date of a next order and an amount and date of a next payment for the selected buyer. The predictive analysis algorithm may use the predicted next order and predicted next payment to predict whether the anticipated order may be placed on hold. For instance, order history, earlier order holds history, and the buyer's credit profile may be used to predict the amount and date of the next order. Past payment history, payment behavior, and invoice aging may be used to predict the amount and date of the next payment.
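A minimal sketch of the kind of estimate block 115B might produce is shown below; averaging historical inter-order gaps and amounts is used here only for illustration, whereas an actual implementation may rely on the machine learning techniques described elsewhere in this disclosure, and all names and values are hypothetical:

    from datetime import date, timedelta
    from statistics import mean

    def predict_next_event(history):
        """Estimate the (date, amount) of the next event from a list of
        (date, amount) tuples, e.g., past orders or past payments."""
        history = sorted(history)
        gaps = [(later[0] - earlier[0]).days
                for earlier, later in zip(history, history[1:])]
        next_date = history[-1][0] + timedelta(days=round(mean(gaps)))
        next_amount = mean(amount for _, amount in history)
        return next_date, next_amount

    # Hypothetical order and payment histories for a selected buyer.
    orders = [(date(2019, 1, 5), 9000.0), (date(2019, 2, 4), 11000.0),
              (date(2019, 3, 6), 10000.0)]
    payments = [(date(2019, 1, 20), 9000.0), (date(2019, 2, 19), 11000.0)]
    next_order_date, next_order_amount = predict_next_event(orders)
    next_payment_date, next_payment_amount = predict_next_event(payments)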
Flow continues to decision 125B where an evaluation of the output of the prediction analysis algorithm is used to determine if a predicted order is further predicted to be placed on hold. If there is no order hold predicted, the NO prong of decision 125B, flow continues to decision 130B. Alternatively, if an order hold is predicted, the YES prong of decision 125B, flow continues to block 140B where a recommendation algorithm uses the prediction analysis algorithm results to determine one or more possible recommendations to prevent a predicted order from being placed on hold.
Flow continues to block 145B to evaluate if one or more of the recommendations produced by the recommendation algorithm meets criteria for automatically increasing a credit limit for the buyer. The flow then continues to decision 150B. If decision 150B determines to automatically increase the buyer's credit limit, the YES prong of decision 150B, the flow proceeds to block 160B. In block 160B the buyer's credit limit is temporarily increased. For example, the credit limit may be increased until a predicted payment of the buyer is received. Alternatively, the NO prong of decision 150B branches the flow to block 155B, where other recommendations may be saved for later presenting to a user. This allows the user to determine whether or not to apply one or more of the recommendations, and then update the buyer's account based on the one or more recommendations. Example recommendations may include requesting an advance payment for the next order or following up with the buyer for payment.
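One possible, purely illustrative realization of the temporary increase of block 160B (the data structure and the choice to expire the increase on the predicted payment date are assumptions) is:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class BuyerAccount:
        credit_limit: float
        temporary_limit: Optional[float] = None
        temporary_limit_expires: Optional[date] = None

    def apply_temporary_increase(account: BuyerAccount,
                                 required_exposure: float,
                                 predicted_payment_date: date) -> None:
        """Raise the limit just enough to cover the predicted order, reverting
        once the buyer's predicted payment is expected to have been received."""
        if required_exposure > account.credit_limit:
            account.temporary_limit = required_exposure
            account.temporary_limit_expires = predicted_payment_date

    def effective_limit(account: BuyerAccount, today: date) -> float:
        """Credit limit in force on a given day."""
        if (account.temporary_limit is not None
                and account.temporary_limit_expires is not None
                and today <= account.temporary_limit_expires):
            return account.temporary_limit
        return account.credit_limit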
Decision 125B and blocks 155B and 160B flow to decision 130B after completion. At decision 130B a determination is made to evaluate if more buyers are waiting to have the prediction and recommendation algorithms applied. If there are more buyers to evaluate, the YES prong of decision 130B, flow continues back to block 110B where the next buyer is chosen. Alternatively, if there are no more buyers to evaluate (e.g., apply the prediction and recommendation algorithms), the NO prong of decision 130B, process flow continues to block 135B where evaluation stops.
Referring to
A machine-readable storage medium, such as 202 of
Referring now to
Block 305 indicates that prediction analysis may be performed for one or more requests on hold. In one example, prediction analysis may be performed for each request that is on hold. Each request may represent one of potentially many requests for a single requestor that may be on hold. While the process diagram of
Still referring to
Referring to
A machine-readable storage medium, such as 402 of
Having an understanding of the above overview with respect to predicting and resolving request holds for a purchasing/procurement system or a system for provisioning computer resources, this disclosure will now explain a non-limiting example that may implement all or parts of the methods 100A, 100B, and/or 300. Accordingly, the above description and the concepts and specifics therein apply to the specific example discussed below. Likewise, the following description of the specific example and the concepts therein apply within the context of the above description. Accordingly, the description and concepts and specifics therein with respect to
This example implementation is explained with reference to the figures and includes: an example workflow that an analyst can follow that implements methods for predicting if an order may be placed on hold (
In accordance with the examples, one or more methods are described that may allow a seller to avoid placing a buyer's orders on hold, for instance methods 100A of
To prevent orders from going on hold, a predictive algorithm may be employed. The predictive algorithm may utilize data such as the buyer's order history, history of previous order holds, and other data from the buyer's credit history to predict the date and amount of the next order that will be placed by the buyer. The predictive algorithm, using techniques such as machine learning and artificial intelligence, can produce a prediction that then may initiate an assessment of the buyer's current state with regard to the credit limit. The assessment may also utilize past data such as the buyer's payment frequency to predict whether an order of the predicted amount is likely to exceed the credit limit. The data used in any part of the predictive algorithm's assessment may also include any public source of data. This additional data about the buyer may include news articles, social media reputation, stock quotes, or any other public or private sources of data that may contribute to increasing the accuracy of the prediction.
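For illustration only, the sketch below fits a simple linear regression (using the scikit-learn library) over a hypothetical order history to project the amount of the next order; the feature choice, the model, and every value shown are assumptions rather than the particular machine learning technique of this disclosure:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical order history: elapsed days since the first order, and amounts.
    order_days = np.array([[0], [30], [59], [91]], dtype=float)
    order_amounts = np.array([9000.0, 9500.0, 10200.0, 10800.0])

    # Fit a trend for the order amount and reuse the average spacing between
    # past orders to estimate when the next order is likely to be placed.
    amount_model = LinearRegression().fit(order_days, order_amounts)
    mean_gap = float(np.diff(order_days[:, 0]).mean())
    next_order_day = order_days[-1, 0] + mean_gap
    predicted_next_amount = float(amount_model.predict([[next_order_day]])[0])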
As a result of the prediction, a recommended course of action may be formulated based on a level of certainty that may be assigned to the predicted order date and order amount. These recommendations, for example, may include, but are not limited to, instructions such as to proactively ask the buyer for a payment, to recommend a temporary increase in the buyer's credit limit for a period of time or until an event such as the buyer placing an order has occurred, or even to perform no action in the case where the prediction is that the buyer will not place an order that exceeds the credit limit. The recommendations may be pre-programmed instructions that are configured to be presented based, for example, on the certainty of the prediction, a time window for which the prediction is valid, or even a sliding window scale where the certainty is reduced gradually over time. The possible recommended actions, and the manner in which they are assigned and presented, are not limited. This predictive analysis may be applied to a single buyer or multiple buyers concurrently so that the recommendations may be executed before an order is placed that may be put on hold. Use of artificial intelligence and machine learning leads to improved efficiency in computing systems that employ the disclosed methods. Moreover, the predictive nature of the algorithms removes them from the realm of pen-and-paper or manual analysis by a user.
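As one hypothetical way to encode pre-programmed recommendations with a sliding certainty window (the thresholds, the linear decay, and the recommendation strings below are illustrative assumptions only):

    from datetime import date

    def select_recommendation(certainty: float,
                              prediction_date: date,
                              today: date,
                              validity_days: int = 30) -> str:
        """Pick a pre-programmed course of action from the prediction certainty,
        gradually discounting the certainty as the prediction ages."""
        age_days = (today - prediction_date).days
        effective = certainty * max(0.0, 1.0 - age_days / validity_days)
        if effective >= 0.8:
            return "recommend a temporary increase in the buyer's credit limit"
        if effective >= 0.5:
            return "proactively ask the buyer for a payment"
        return "perform no action"

    # Hypothetical usage: a 0.9-certainty prediction made ten days ago.
    action = select_recommendation(0.9, date(2019, 3, 1), date(2019, 3, 11))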
A method of preventing orders from being placed on hold, e.g., method 100A, may allow for prediction and for quickly determining one or more actions that may be taken by the seller organization, but there are some cases where an order being placed on hold is unavoidable. Some recommendations may require that the buyer participate in executing the recommendation, such as when the recommendation involves contacting the buyer and requesting a payment. There are many reasons an order hold may occur despite the best efforts to prevent it. When an order is placed on hold, it may be beneficial to reduce the time required to decide a course of action that may be taken to remove the order hold.
Once an order is on hold, a Credit Department, for instance, if unaided by an automated algorithm, may be forced to spend a significant amount of time working to determine the best action to take to remove the hold on the order. Using techniques such as machine learning and artificial intelligence, the same data that is used to predict an order hold as previously described may be utilized to predict the likelihood of an order being released from hold. As a prediction is calculated about the likelihood of an order being released from hold, the system may automatically increase the buyer's credit limit thereby releasing the order from hold based on the certainty of the prediction. For example, the system may determine whether to release the hold based on whether the likelihood of the hold being released exceeds a release threshold. The system may then automatically increase a threshold or resource limit (such as a credit limit) to release or prevent the hold when the likelihood of the hold being released exceeds the release threshold. The release threshold may be based on previous or current holds for the requestor, payment history for the requestor, a credit profile for the requestor, etc. The release threshold may be expressed as an integer, a percentage, etc.
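As a minimal sketch of the release-threshold comparison described above (the parameter names and the specific way the limit is raised are assumptions):

    def maybe_release_hold(release_likelihood: float,
                           release_threshold: float,
                           current_limit: float,
                           required_exposure: float):
        """If the predicted likelihood of the hold being released exceeds the
        release threshold, raise the limit enough to release the held order.
        Returns the limit to apply and whether the hold was released."""
        if release_likelihood > release_threshold:
            return max(current_limit, required_exposure), True
        return current_limit, False

    # Hypothetical usage: an 85% predicted likelihood against a 75% threshold.
    new_limit, released = maybe_release_hold(0.85, 0.75, 15000.0, 20000.0)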
This automated increase of the credit limit and release of orders on hold may significantly decrease the amount of time a person spends researching the order holds. In some examples, the system may be configured with rules that determine if a prediction should cause the order hold to be released automatically without user intervention. In other examples, the system may present a decision to a user to confirm that an order should be released and present the results of the prediction to aid in choosing to release the order from hold.
Referring now to
Referring now to
Referring now to
The user interface 605 may present a summary 610 at the top of the screen to enable a quick assessment of the number of predicted orders that may be placed on hold. This summary may include additional information such as recommendations for courses of action instead of or in addition to any summary information available. Data 615-1 through 615-4, which may be actioned by a user, may be presented in any order and format. In this example, the data 615-1 through 615-4 is presented as a summary of a buyer's predicted order that may be placed on hold. The application may allow the user to perform an action such as clicking on an area that logically represents a single data point to be actioned. When the user clicks on a logical area to perform actions on a selected data point, the application may present views that guide the user in indicating the actions for the program to take.
Referring now to
Referring now to
The prediction for an order hold 735 may include information such as the credit exposure, which may allow the user to understand the total amount of money extended as credit to the customer. This credit exposure may be the result of a calculation over all open invoices that are not yet paid and all orders in the process of fulfillment that have not yet been billed. The prediction for an order hold 735 may additionally include the credit limit assigned to the buyer, which allows the seller to limit the credit exposure to the buyer. Recommendations 730 may indicate the recommended course of action that the user may take by selecting button 715. An input to the application from the button 715 being selected may apply the recommendation to a buyer's account. Alternatively, an input to the application from the user selecting button 720 may display a more detailed user interface that may allow the user to implement all or part of the recommendation 730.
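Illustratively, and under the assumption that open invoice and unbilled order amounts are available as simple lists, the credit exposure described above might be computed as:

    def credit_exposure(open_invoice_amounts, unbilled_order_amounts):
        """Total exposure: unpaid open invoices plus orders in fulfillment
        that have not yet been billed."""
        return sum(open_invoice_amounts) + sum(unbilled_order_amounts)

    # Hypothetical values.
    exposure = credit_exposure([4000.0, 2500.0], [6000.0])   # 12500.0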
Referring now to
Referring now to
Each of these networks can contain wired or wireless programmable devices and operate using any number of network protocols (e.g., TCP/IP) and connection technologies (e.g., WiFi® networks or Bluetooth®). In another embodiment, customer network 802 represents an enterprise network that could include or be communicatively coupled to one or more local area networks (LANs), virtual networks, data centers and/or other remote networks (e.g., 808, 810). In the context of the present disclosure, customer network 802 may include multiple devices configured with software that executes the disclosed order hold prevention and held order release prediction algorithms such as those described above. Also, one of the many computer storage resources in customer network 802 (or other networks shown) may be configured to store any customer or order data utilized by any algorithm described in the disclosed examples.
As shown in
Network infrastructure 800 may also include other types of devices generally referred to as Internet of Things (IoT) devices (e.g., edge IoT device 805) that may be configured to send and receive information via a network to access cloud computing services or interact with a remote web browser application (e.g., to receive configuration information).
Network infrastructure 800 also includes cellular network 803 for use with mobile communication devices. Mobile cellular networks support mobile phones and many other types of mobile devices such as laptops and tablets. Mobile devices in network infrastructure 800 are illustrated as mobile phone 804D, laptop computer 804E, and tablet computer 804C. A mobile device such as mobile phone 804D may interact with one or more mobile provider networks as the mobile device moves, typically interacting with a plurality of mobile network towers 820, 830, and 840 for connecting to the cellular network 803. In the context of the currently disclosed order hold prediction and held order release prediction algorithms, operations to access and process data may be facilitated by systems communicating through network infrastructure 800.
Although referred to as a cellular network in
In
As also shown in
Computing device 900 may also include communications interfaces 925, such as a network communication unit that could include a wired communication component and/or a wireless communications component, which may be communicatively coupled to processor 905. The network communication unit may utilize any of a variety of proprietary or standardized network protocols, such as Ethernet or TCP/IP, to name a few of many protocols, to effect communications between devices. Network communication units may also comprise one or more transceivers that utilize Ethernet, power line communication (PLC), WiFi, cellular, and/or other communication methods.
As illustrated in
Persons of ordinary skill in the art are aware that software programs may be developed, encoded, and compiled in a variety of computing languages for a variety of software platforms and/or operating systems and subsequently loaded and executed by processor 905. In one embodiment, the compiling process of the software program may transform program code written in a programming language to another computer language such that the processor 905 is able to execute the programming code. For example, the compiling process of the software program may generate an executable program that provides encoded instructions (e.g., machine code instructions) for processor 905 to accomplish specific, non-generic, particular computing functions.
After the compiling process, the encoded instructions may then be loaded as computer executable instructions or process steps to processor 905 from storage device 920, from memory 910, and/or embedded within processor 905 (e.g., via a cache or on-board ROM). Processor 905 may be configured to execute the stored instructions or process steps in order to perform instructions or process steps to transform the computing device into a non-generic, particular, specially programmed machine or apparatus. Stored data, e.g., data stored by a storage device 920, may be accessed by processor 905 during the execution of computer executable instructions or process steps to instruct one or more components within the computing device 900.
A user interface (e.g., output devices 915 and input devices 930) can include a display, positional input device (such as a mouse, touchpad, touchscreen, or the like), keyboard, or other forms of user input and output devices. The user interface components may be communicatively coupled to processor 905. When the output device is or includes a display, the display can be implemented in various ways, including by a liquid crystal display (LCD), a cathode-ray tube (CRT), or a light emitting diode (LED) display, such as an organic light emitting diode (OLED) display. Persons of ordinary skill in the art are aware that the computing device 900 may comprise other components well known in the art, such as sensors, power sources, and/or analog-to-digital converters, not explicitly shown in
Certain terms have been used throughout this description and claims to refer to particular system components. As one skilled in the art will appreciate, different parties may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not function. In this disclosure and claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple” or “couples” is intended to mean either an indirect or direct wired or wireless connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections. The recitation “based on” is intended to mean “based at least in part on.” Therefore, if X is based on Y, X may be a function of Y and any number of other factors.
The above discussion is meant to be illustrative of the principles and various implementations of the present disclosure. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
Number | Date | Country | Kind
---|---|---|---
201941006158 | Feb. 15, 2019 | IN | national