Heuristic credit risk assessment engine

Information

  • Patent Grant
  • Patent Number
    11,544,783
  • Date Filed
    Wednesday, August 5, 2020
  • Date Issued
    Tuesday, January 3, 2023
Abstract
A heuristic engine includes capabilities to collect an unstructured data set and a current business context to calculate a credit worthiness score. Providing a heuristic algorithm, executing within the engine, with the data set and the context may allow the engine to determine predicted future contexts and recommend subsequent actions, such as assessing the credit risk of a customer transaction and reducing that risk by processing the available data. Such heuristic algorithms may learn from past transaction data and appropriate correlations with events and available data.
Description
FIELD OF THE INVENTION

The disclosure generally relates to systems, methods, apparatus, and non-transitory computer readable media for using heuristic algorithms to assess a credit risk of a customer transaction and, more particularly, to processing natural language inputs and unstructured data sets to reduce the credit risk of customer transactions by processing available past transaction data.


BACKGROUND

Organizations involved in customer service activities often process large amounts of unstructured data to make decisions while interacting with a customer in real time. For example, when a customer service representative speaks on the telephone with a customer experiencing an issue with a product or service, an appropriate solution may require both a timely response and accurate content.


Such unstructured data may include voluminous transaction records spanning decades, unstructured customer service data, or real-time transcripts of customer service interactions with scattered contextual indicators. Expecting a customer service representative to effectively leverage such large data sets in real time places an unreasonable burden on that representative. However, failing to do so deprives the representative of vital context that is not readily apparent, as well as the wealth of knowledge gained throughout the history of an organization, which would otherwise need to be distilled into briefing materials and expensively trained over time. Thus, organizations may value tools that rapidly process large data sets, infer context, and suggest lessons learned from transaction data, while learning through successive process iterations. Appropriate application of such tools may also provide an advantage in a crowded and competitive customer service industry.


In an effort to automate and provide better predictability of customer service experiences, many organizations develop customer relationship management (CRM) software packages. Organizations that develop these software packages often develop custom solutions, at great expense, to best meet the needs of their customers in unique industries. Such tools, while providing a great level of detail for the customer service representative, lack the flexibility to react to changing business conditions or to fully exploit the underlying technology, driving additional cost into an already expensive solution.


Some organizations able to make concessions on customized solutions turn to off-the-shelf or commercially available software solutions that reduce the overall cost of implementation. Such solutions may provide customer service representatives with prompting tools in question-and-answer formats that allow for a consistent customer experience, but at the expense of the more personalized experience required in many industries. While more flexible than fully custom solutions, the impersonal question-and-answer format of customer interaction may not improve without costly software revisions, which are rarely performed by original equipment manufacturers (OEMs) of off-the-shelf solutions.


The ability of a customer service experience to learn and improve over successive iterations remains paramount for organizations seeking to offer a differentiated customer service experience. Often the burden of continual improvement falls to the customer service representative, who as a human being can adapt to and learn from changing conditions more rapidly, even within the confines of a rigid customer service software application. However, with outsourcing now prevalent in the customer service industry, the customer service representative may lack much of the context required to provide highly relevant customer service. In an interconnected company, this lack of context is less an issue of distance and more an issue of data access and the ability to contextually process data to present relevant solutions in a timely manner.


SUMMARY

One exemplary embodiment includes a computer-implemented method, executed with a computer processor, that generates a credit score. This method may include retrieving an unstructured data set including an aggregated transaction set that includes a plurality of users and at least one correlation of a user to a credit score. This method may include receiving a plurality of financial transactions, and accessing and executing a heuristic algorithm to generate a credit score using the aggregated transaction set, the correlation, and/or the plurality of financial transactions. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


Yet another alternative embodiment includes a computer-implemented method, executed with a computer processor, that generates cross-selling recommendations using an aggregated customer transaction list. The method may include retrieving an aggregated transaction list from a plurality of customers and receiving a natural language input in a customer service environment. The method may also include accessing and executing a heuristic algorithm to generate at least one product recommendation using the language input and the transaction list. A product category of the recommendation, for example, may correlate with a predicted need. The method may include receiving an indication of interest in the recommendation and/or updating the algorithm using a calculated correlation between the recommendation and the indication. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


Still another embodiment includes a computer-implemented method, executed with a computer processor, that predicts an impact on a book of business from a change in an offered credit interest rate. The method may include retrieving an aggregated behavior list from a plurality of customers including offered credit interest rate data, and/or receiving the offered credit interest rate. The method may also include accessing and executing a heuristic algorithm to generate a predicted impact on a book of business, including a number of customers, using the offered credit interest rate and the behavior list. Further, the method may include receiving an actual behavior with a human machine interface and/or updating the algorithm using a calculated correlation between the offered rate and the behavior. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In yet another embodiment, a computer-implemented method, executed in a computer processor, includes targeting a portion of a business process for modification. The method may include retrieving an unstructured transaction set from a plurality of customers including a time associated with a plurality of portions of the business process. Furthermore, the process may include accessing and executing a heuristic algorithm to generate, using the unstructured transaction set, an indication associated with the portion of the business process that exceeds a threshold required for modification. Still further, the method may include receiving a quantified impact on the portion of the business process and/or updating the algorithm using the quantified impact. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


An exemplary embodiment includes a computer-implemented method, executed with a computer processor, that generates a financial literacy suggestion using a transaction history. The method may include retrieving an unstructured transaction set associated with a customer, and accessing and executing a heuristic algorithm to generate the financial literacy suggestion using the transaction history. Furthermore, the method may include receiving an indication of relevance from the customer and updating the algorithm using a calculated correlation between the suggestion and the relevance. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


Exemplary embodiments may include computer-implemented methods, apparatus configured to implement those methods, and/or non-transitory computer readable media comprising computer-executable instructions that cause a processor to perform those methods.


Advantages will become more apparent to those skilled in the art from the following description of the preferred embodiments which have been shown and described by way of illustration. As will be realized, the present embodiments may be capable of other and different embodiments, and their details are capable of modification in various respects. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS

The Figures described below depict various aspects of the system and methods disclosed herein. It should be understood that each figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the Figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following Figures, in which features depicted in multiple Figures are designated with consistent reference numerals.


There are shown in the Figures arrangements which are presently discussed, it being understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown, wherein:



FIG. 1 illustrates an exemplary computer system to assess credit in accordance with one aspect of the present disclosure;



FIG. 2 illustrates an exemplary computer-implemented method to assess credit in accordance with one aspect of the present disclosure;



FIG. 3 illustrates an exemplary computer system to enable cross-selling in accordance with one aspect of the present disclosure;



FIG. 4 illustrates an exemplary computer-implemented method to enable cross-selling in accordance with one aspect of the present disclosure;



FIG. 5 illustrates an exemplary computer system to assess impact to a book of business in accordance with one aspect of the present disclosure;



FIG. 6 illustrates an exemplary computer-implemented method to assess impact to a book of business in accordance with one aspect of the present disclosure;



FIG. 7 illustrates an exemplary computer system to assess a business process impact in accordance with one aspect of the present disclosure;



FIG. 8 illustrates an exemplary computer-implemented method to assess a business process impact in accordance with one aspect of the present disclosure;



FIG. 9 illustrates an exemplary computer system to generate financial literacy recommendations in accordance with one aspect of the present disclosure;



FIG. 10 illustrates an exemplary computer-implemented method to generate financial literacy recommendations in accordance with one aspect of the present disclosure;



FIG. 11 illustrates an exemplary computing system in accordance with one aspect of the present disclosure; and



FIG. 12 illustrates an exemplary article of manufacture in accordance with one aspect of the present disclosure.





The Figures depict preferred embodiments for purposes of illustration only. Alternative embodiments of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.


DETAILED DESCRIPTION

Various embodiments of the present disclosure include the collection of unstructured data sets together with a current context. Heuristic algorithms processing these unstructured data sets together with the context may allow calculation of a future context and the presentation of context-relevant data that improves over time. By subsequently training the heuristic algorithm with the outcome of the current and predicted future contexts, and the relevance of the presented data, the heuristic algorithm may improve its efficiency as the unstructured data set grows.
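
By way of a non-limiting illustration, this collect-predict-present-update loop might be sketched in Python as follows. The class name, method names, and the simple token-weighting heuristic are assumptions chosen for brevity, not the disclosed implementation.

```python
from dataclasses import dataclass, field


@dataclass
class HeuristicEngine:
    """Illustrative collect-predict-present-update loop; all names are hypothetical."""

    weights: dict = field(default_factory=dict)  # learned feature-to-outcome associations
    learning_rate: float = 0.1

    def predict_context(self, unstructured_data, current_context):
        """Score how strongly the collected data and current context suggest a future event."""
        tokens = [t.lower() for doc in unstructured_data for t in doc.split()]
        score = sum(self.weights.get(t, 0.0) for t in tokens)
        score += self.weights.get(current_context.get("channel", ""), 0.0)
        return score

    def update(self, unstructured_data, outcome, relevance):
        """Train on the observed outcome and the relevance of the data that was presented."""
        tokens = {t.lower() for doc in unstructured_data for t in doc.split()}
        for t in tokens:
            self.weights[t] = self.weights.get(t, 0.0) + self.learning_rate * outcome * relevance


# One iteration: predict a future context, present data, then learn from what actually happened.
engine = HeuristicEngine()
docs = ["customer asked about late payment fees", "recent overdraft on checking account"]
predicted = engine.predict_context(docs, {"channel": "phone"})
engine.update(docs, outcome=1.0, relevance=0.8)
```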


Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and equivalents. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. Numerous alternative embodiments may be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.


Credit Risk Assessment


FIG. 1 illustrates a block diagram of an exemplary computer system 100 to determine a credit risk assessment. The exemplary system 100 enables a customer 150 to interface with a terminal 155 to perform a transaction and allows a customer service representative 140, using a service terminal 145, to assess a credit risk associated with the transaction. The customer 150 and service representative 140 may communicate through a human-machine interface 135 that, together with a processor 120 and a network interface 110, comprises a heuristic engine 115. In other embodiments, the heuristic engine 115 may include a variety of memory devices, interface devices, and processing devices to execute required functions. In one embodiment, the processor 120 may interface to a heuristic server 125, a transaction server 130, and a remote server 105.


In accordance with one aspect of the present disclosure, the system 100 may perform the computer-implemented method 200, as illustrated in FIG. 2. However, in one embodiment, the method 200 does not, or may not, specifically require the system 100, nor do the elements included therein require a particular arrangement, to perform the method steps illustrated in the process 200.


Method 200 may include a customer, such as the customer 150 of FIG. 1, who initiates a transaction requiring a credit score or credit risk assessment (block 205). In one embodiment, the processor 120 may retrieve an aggregated user data set and corresponding credit scores, for example from the remote server 105 or the transaction server 130 (block 210). In one embodiment (block 220), the customer may provide background data related to a transaction set. The processor 120 may encode a customer background based upon, in one embodiment, natural language inputs (block 215), for example the background data related to the transaction set, or otherwise. In one embodiment (block 225), the processor 120 may retrieve a heuristic algorithm from the heuristic server 125.


The processor 120 may execute the algorithm with the data set (block 230) and the background data. In one exemplary embodiment, the processor may generate a correlation between a credit score and a subset of the background data (block 235). The service representative 140 may provide a credit score to the user, using the service terminal 145 and the user terminal 155 (block 240). In one embodiment (block 245), the processor 120 may evaluate the proposed transaction using the calculated credit score, and (block 250) execute the transaction if the credit score exceeds a predetermined threshold.
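
For concreteness, blocks 210 through 250 might be approximated by the following Python sketch. The names score_credit and CREDIT_THRESHOLD, and the scoring rules themselves, are illustrative assumptions rather than the claimed heuristic algorithm.

```python
CREDIT_THRESHOLD = 650.0  # hypothetical predetermined threshold used at block 250


def score_credit(aggregated_users, encoded_background):
    """Toy heuristic: start from the mean of known credit scores (block 210),
    then adjust using encoded background signals (blocks 215-235)."""
    known = [u["credit_score"] for u in aggregated_users if "credit_score" in u]
    base = sum(known) / len(known) if known else 600.0
    adjustment = 25.0 if encoded_background.get("stable_income") else -25.0
    adjustment -= 10.0 * encoded_background.get("recent_delinquencies", 0)
    return base + adjustment


def process_transaction(aggregated_users, encoded_background, transaction):
    """Blocks 245-250: execute the transaction only when the score clears the threshold."""
    score = score_credit(aggregated_users, encoded_background)
    approved = score >= CREDIT_THRESHOLD
    return approved, score


users = [{"user": "a", "credit_score": 700}, {"user": "b", "credit_score": 640}]
background = {"stable_income": True, "recent_delinquencies": 1}
print(process_transaction(users, background, {"amount": 2500}))  # (True, 685.0)
```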


Cross-Selling Recommendation


FIG. 3 illustrates a block diagram of an exemplary computer system 300 to generate cross selling recommendations based upon a transaction set. The exemplary system 300 enables a user 340, using a computer terminal 345, to perform a transaction, and receive a recommendation. The terminal 345 may communicatively couple through a human-machine interface 335 to a projector screen 350 and telephone 354 interfacing with a customer service representative 352. The human-machine interface 335 interfaces to a computer processor 320 that likewise interfaces to a heuristic server 325, a transaction server 330, and a network interface 310. The human-machine interface 335, processor 320, and network interface 310 together comprise, in one embodiment, a heuristic engine 315. In other embodiments, the heuristic engine 315 may include a variety of memory devices, interface devices, and processing devices, to execute required functions. In one embodiment, the network interface 310 may communicatively couple to a remote server 305.


In accordance with one aspect of the present disclosure, the system 300 may perform the computer-implemented method 400, as illustrated in FIG. 4. However, in one embodiment, the method 400 does not, or may not, specifically require the system 300, nor do the elements included therein require a particular arrangement, to perform the method steps illustrated in the process 400.


The method 400, in one embodiment, includes a user, such as the customer 340 of FIG. 3, who initiates a transaction (block 405). The processor 320 may retrieve (block 410) an aggregated transaction list from, for example, the remote server 305 or the transaction server 330. In one embodiment (block 415), the user or customer may speak in natural language, which the terminal 345 encodes. The processor 320 may retrieve a heuristic algorithm, for example from the heuristic server 325 (block 420).


In one embodiment, the processor 320 may execute the algorithm with the transaction list and the natural language input. The processor 320 may calculate a correlation between the natural language input and a product category (block 430). According to one embodiment, the service representative 352 may communicate the product category to the customer 340. The processor 320 may calculate a correlation between an actual need and the category (block 440). In one embodiment, the processor 320 may update the heuristic algorithm with the calculated correlation.
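
A rough sketch of the correlation steps at blocks 430 and 440 follows. The keyword-to-category mapping and the feedback update are assumed stand-ins for whatever correlation the heuristic algorithm actually computes.

```python
from collections import Counter

# Hypothetical mapping from spoken keywords to product categories (block 430).
CATEGORY_KEYWORDS = {
    "mortgage": ["house", "home", "moving"],
    "auto loan": ["car", "vehicle", "commute"],
    "savings": ["college", "retirement", "saving"],
}


def recommend_category(natural_language, transactions):
    """Correlate the utterance and the aggregated transaction list with a product category."""
    tokens = Counter(natural_language.lower().split())
    for item in transactions:
        tokens.update(item.lower().split())
    scores = {cat: sum(tokens[w] for w in words) for cat, words in CATEGORY_KEYWORDS.items()}
    return max(scores, key=scores.get)


def update_algorithm(weights, category, interested, rate=0.1):
    """Block 440: reinforce or dampen the category based on the indicated actual need."""
    weights[category] = weights.get(category, 0.0) + (rate if interested else -rate)


weights = {}
category = recommend_category("we are moving to a bigger home next year",
                              ["payment: daycare", "deposit: payroll"])
update_algorithm(weights, category, interested=True)
print(category, weights)  # mortgage {'mortgage': 0.1}
```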


Business Portfolio Impact Calculation


FIG. 5 illustrates a block diagram of an exemplary computer system 500 to indicate an impact on a book of business based upon a change in an offered credit interest rate. The exemplary system 500 enables a user 555 to interface, for example, with a cellular telephone 560, to initiate a transaction that may result in an interest rate change to an existing or future product. The cellular telephone 560 may communicate over a wireless protocol 565 through a wireless access point 550 that communicatively couples to a human-machine interface 545. A customer service representative 505 may, in one embodiment, interface with a tablet computer 515 and a telephone 510, communicatively coupled to a network interface 520. The network interface 520 may interface to a remote server 530, and a computer processor 540. The computer processor 540 may interface to a heuristic server 535, a transaction server 547, and the human-machine interface 545.


In accordance with one aspect of the present disclosure, the system 500 may perform the computer-implemented method 600, as illustrated in FIG. 6. However, in one embodiment, the method 600 does not, or may not, specifically require the system 500, nor do the elements included therein require a particular arrangement, to perform the method steps illustrated in the process 600.


The method 600 includes, in one embodiment, a user, for example the user 555 of FIG. 5, who initiates a transaction that may require a change to an interest rate on a current or future product (block 605). The processor 540 may retrieve a transaction set correlating transactions to a change in a book of business, for example from the remote server 530 or the transaction server 547 (block 610). In one embodiment, the processor 540 may retrieve a heuristic algorithm from the heuristic server 535. The processor may execute the algorithm with the transaction set to calculate an impact of the interest rate change (block 620). In one exemplary embodiment, the service representative 505 communicates an offer of a modified interest rate to the customer 555 (block 625). The processor 540 determines a change in the book of business resulting from the modified interest rate (block 630). In one embodiment, the processor 540 updates the heuristic algorithm, for example in the heuristic server 535, using the change in the book of business.
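
As an illustration of the impact calculation at blocks 610 through 630, the sketch below estimates retention under an offered rate and then nudges an elasticity parameter toward the observed change in the book of business. The linear elasticity model and all names are assumptions, not the disclosed algorithm.

```python
def predict_book_impact(behavior_list, offered_rate, elasticity=0.2):
    """Block 620: estimate how many customers remain in the book at the offered rate.

    Each record holds the rate a customer previously accepted; retention is assumed to
    drop linearly as the offered rate rises above that reference rate.
    """
    retained = 0.0
    for record in behavior_list:
        rate_gap = max(0.0, offered_rate - record["accepted_rate"])
        retained += max(0.0, 1.0 - elasticity * rate_gap)
    return retained


def update_elasticity(elasticity, predicted, actual, step=0.01):
    """Final update step: move the elasticity parameter toward the observed book change."""
    return elasticity + step * (predicted - actual)


history = [{"accepted_rate": 4.5}, {"accepted_rate": 5.0}, {"accepted_rate": 6.0}]
predicted = predict_book_impact(history, offered_rate=5.5)
elasticity = update_elasticity(0.2, predicted, actual=2.5)
print(predicted, elasticity)  # 2.7 customers retained, elasticity nudged to 0.202
```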


Business Process Assessment and Modification


FIG. 7 illustrates a block diagram of an exemplary computer system 700 to determine candidate portions of a business process for re-design using aggregate transaction data. The exemplary system 700 enables a user 760 to interface with a user terminal 765 to, for example, execute a business process. The user terminal 765, and a local service terminal 755 servicing a local service representative 750, may, in one embodiment, communicatively couple to a network interface 745. The network interface 745, a heuristic server 735, and a transaction server 740 may each communicatively couple to a computer processor 730. In one embodiment, the computer processor 730 and the network interface 745 together comprise a heuristic engine 725. In other embodiments, the heuristic engine 725 may include a variety of memory devices, interface devices, and processing devices to execute required functions. A computer network 715 may interconnect a remote server 720, a remote service terminal 710 servicing a remote customer service representative 705, and the computer processor 730.


In accordance with one aspect of the present disclosure, the system 700 may perform the computer-implemented method 800, as illustrated in FIG. 8. However, in one embodiment, the method 800 does not, or may not, specifically require the system 700, nor do the elements included therein require a particular arrangement, to perform the method steps illustrated in the process 800.


Process 800 includes, in one embodiment, a user, such as the user 760 of FIG. 7, who initiates a business process transaction (block 805) as part of an existing business. The processor 730 may execute the business process (block 810), including a plurality of steps, and record the time associated with at least one of the steps, storing the data, for example, in the remote server 720 or the transaction server 740. However, in some embodiments, the transaction data may reside within the processor for an indeterminate time. The processor 730 may retrieve a heuristic algorithm from the heuristic server 735 (block 815). In one embodiment, the processor 730 may execute the algorithm with the times recorded for at least one business process step. In an alternative embodiment (block 825), at least one of the local representative 750 or the remote representative 705 may offer a business process modification to the user 760.


The processor 730 may identify the business process steps with the maximum time impact on the overall business process (block 830). In one embodiment (block 835), the processor may update the heuristic algorithm in the heuristic server 735 with modification data on the identified steps.
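
Under these assumptions, the step-targeting logic around block 830 might reduce to comparing accumulated step times against a modification threshold, as in the following hypothetical sketch; the function name, threshold, and sample data are illustrative only.

```python
def target_step_for_redesign(step_times, threshold_seconds):
    """Block 830: flag the business process step whose recorded time dominates,
    but only if it exceeds the threshold required for modification."""
    totals = {step: sum(times) for step, times in step_times.items()}
    worst_step = max(totals, key=totals.get)
    return worst_step if totals[worst_step] > threshold_seconds else None


recorded = {
    "identity check": [35.0, 40.2, 38.7],
    "document upload": [120.5, 98.0, 143.2],  # dominates the overall elapsed time
    "approval": [12.0, 9.5, 11.1],
}
print(target_step_for_redesign(recorded, threshold_seconds=180.0))  # document upload
```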


Financial Literacy Training


FIG. 9 illustrates a block diagram of an exemplary computer system 900 to provide targeted financial literacy recommendations to a customer based upon a transaction history. The exemplary system 900 enables a user 960 to interface with a customer tablet computer 970 and/or a cellular telephone 965 to initiate and perform a financial transaction. The tablet 970 and/or cellular telephone 965 may communicate over a wireless protocol 975 to a wireless access point 955 that communicatively couples to a human-machine interface 945. A customer service representative 905 may, in another embodiment, interface with a service tablet computer 910 and a telephone 915. The tablet 910, telephone 915, and a remote server 925 each interface to a network interface 920. The network interface 920, a heuristic server 935, a transaction server 930, and the human-machine interface 945 each interface to a computer processor 940.


In one exemplary embodiment, a heuristic engine 950 comprises the network interface 920, the computer processor 940, and the human-machine interface 945. In other embodiments, the heuristic engine 950 may include a variety of memory devices, interface devices, and processing devices to execute required functions.


In accordance with one aspect of the present disclosure, the system 900 may perform the computer-implemented method 1000, as illustrated in FIG. 10. However, in one embodiment, the method 1000 does not, or may not, specifically require the system 900, nor do the elements included therein require a particular arrangement, to perform the method steps illustrated in the process 1000.


The method 1000 includes, in one embodiment, a user, for example the user 960 of FIG. 9, who initiates a transaction (block 1005). The processor 940 may retrieve financial literacy suggestions with correlations to transactions (block 1010). In one embodiment, the user 960 may speak in a natural language (block 1015), for example using the cellular telephone 965. The processor 940 may retrieve a heuristic algorithm from the heuristic server 935 (block 1020). In accordance with one embodiment, the processor 940 may execute the algorithm with the literacy suggestions and the natural language data.


The processor 940 may calculate correlation data between the natural language and positive financial literacy outcomes (block 1030), and the representative 905 may communicate the financial literacy suggestion to the user 960 (block 1035). In one embodiment, the processor may calculate a correlation (block 1040) between the detected relevance of the suggestion (block 1035) and an expected financial outcome. The processor 940 may update the heuristic algorithm (block 1045), for example in the heuristic server 935, with the correlation data.
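
To make the correlation at blocks 1030 through 1045 concrete, a hypothetical sketch follows. The suggestion library, keyword matching, and relevance update are illustrative assumptions, not the disclosed heuristic.

```python
# Hypothetical library of financial literacy suggestions keyed by transaction pattern.
SUGGESTIONS = {
    "overdraft": "Set a low-balance alert and keep an automatic transfer buffer.",
    "high_interest": "Paying more than the minimum reduces total interest on revolving credit.",
    "no_savings": "Routing a fixed share of each deposit to savings builds an emergency fund.",
}


def suggest(transactions, utterance):
    """Blocks 1010-1035: pick the suggestion most relevant to the history and the utterance."""
    text = " ".join(list(transactions) + [utterance]).lower()
    if "overdraft" in text:
        return "overdraft", SUGGESTIONS["overdraft"]
    if "interest" in text or "credit card" in text:
        return "high_interest", SUGGESTIONS["high_interest"]
    return "no_savings", SUGGESTIONS["no_savings"]


def update_relevance(weights, key, relevant, rate=0.1):
    """Blocks 1040-1045: strengthen suggestions the customer indicated were relevant."""
    weights[key] = weights.get(key, 0.0) + (rate if relevant else -rate)


weights = {}
key, text = suggest(["overdraft fee 03/02", "coffee 4.50"], "why was I charged twice?")
update_relevance(weights, key, relevant=True)
print(text, weights)
```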



FIG. 11 illustrates an exemplary computing system 1100 in accordance with the embodiments disclosed in FIGS. 1-10 and 12. The exemplary computing system 1100 and components disclosed therein may comprise part, all, or none of the disclosed embodiments of FIGS. 1-10 and 12. The system 1100 includes one or more microprocessors 1105, coupled to supporting devices through multi-access busses 1125 and 1140. Dynamic random access memory 1130 and 1135 may interface to data bus 1125, and store data used by the one or more microprocessors 1105. The system 1100 includes instruction registers 1120 that store executable instructions for the one or more microprocessors 1105, and data registers 1115 that store data for execution. In some embodiments, the system 1100 includes one or more arithmetic co-processors 1110, to assist or supplement the one or more microprocessors 1105.


Data bus 1140 includes an interface to a graphics interface 1145 that may, in some embodiments, process and transmit graphical data for a user on a display or similar device. Likewise, data bus 1140 includes an interface to a digital I/O interface that processes and transmits, for example, keyboard, pointing device, and other digital and analog signals produced and consumed by users or other machines. A network interface 1155 processes and transmits encoded information over wired and wireless networks to connect the system 1100 to other machines and users. Data bus 1140 also includes at least one interface to non-volatile memory that may process and transmit data residing on non-volatile memory devices.



FIG. 12 illustrates a non-transitory computer readable medium 1205, that comprises processor executable instructions 1210. Such processor executable instructions may include instructions executed by the one or more processors 1105 of FIG. 11.


Machine Learning and Other Matters

In certain embodiments, the heuristic engine and algorithms discussed herein may include machine learning, cognitive learning, deep learning, combined learning, and/or pattern recognition techniques. For instance, a processor or a processing element may be trained using supervised or unsupervised machine learning, and the machine learning program may employ a neural network, which may be a convolutional neural network, a deep learning neural network, or a combined learning module or program that learns in two or more fields or areas of interest. Machine learning may involve identifying and recognizing patterns in existing data in order to facilitate making predictions for subsequent data. Models may be created based upon example inputs in order to make valid and reliable predictions for novel inputs.


Additionally or alternatively, the machine learning programs may be trained by inputting sample data sets or certain data into the programs, such as image, mobile device, insurer database, and/or third-party database data. The machine learning programs may utilize deep learning algorithms that may be primarily focused on pattern recognition and may be trained after processing multiple examples. The machine learning programs may include Bayesian program learning (BPL), voice recognition and synthesis, image or object recognition, optical character recognition, and/or natural language processing—either individually or in combination. The machine learning programs may also include semantic analysis, automatic reasoning, and/or other machine learning techniques.


In supervised machine learning, a processing element may be provided with example inputs and their associated outputs, and may seek to discover a general rule that maps inputs to outputs, so that when subsequent novel inputs are provided the processing element may, based upon the discovered rule, accurately predict the correct output. In unsupervised machine learning, the processing element may be required to find its own structure in unlabeled example inputs. In one embodiment, machine learning techniques may be used to extract the relevant data for one or more tokenized icons from user device details, user request or login details, user device sensors, geolocation information, image data, the insurer database, a third-party database, and/or other data.
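
For instance, a supervised formulation of the credit-risk task and an unsupervised grouping of the same features might be sketched with scikit-learn as follows. The feature columns, labels, and choice of estimators are assumptions, since the disclosure does not prescribe a particular library or model.

```python
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingClassifier

# Supervised learning: example inputs paired with known outputs (1 = repaid, 0 = defaulted).
X = [[0.72, 3, 0], [0.31, 0, 1], [0.85, 4, 0], [0.12, 0, 1]]  # utilization, delinquencies, stable income
y = [0, 1, 0, 1]
clf = GradientBoostingClassifier().fit(X, y)
print(clf.predict([[0.40, 1, 1]]))  # predicted output for a novel input

# Unsupervised learning: discover structure in the same inputs without labels.
print(KMeans(n_clusters=2, n_init=10).fit_predict(X))
```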


In one embodiment, a processing element (and/or heuristic engine or algorithm discussed herein) may be trained by providing it with a large sample of images and/or user data with known characteristics or features. Based upon these analyses, the processing element may learn how to identify characteristics and patterns that may then be applied to analyzing user device details, user request or login details, user device sensors, geolocation information, image data, the insurer database, a third-party database, and/or other data. For example, the processing element may learn, with the user's permission or affirmative consent, to identify the user and/or the asset that is to be the subject of a transaction, such as generating an insurance quote or claim, opening a financial account, handling a loan or credit application, processing a financial (such as a credit card) transaction or the like.


ADDITIONAL CONSIDERATIONS

All of the foregoing computer systems may include additional, less, or alternate functionality, including that discussed herein. All of the computer-implemented methods may include additional, less, or alternate actions, including those discussed herein, and may be implemented via one or more local or remote processors and/or transceivers, and/or via computer-executable instructions stored on computer-readable media or medium.


The processors, transceivers, mobile devices, service terminals, servers, remote servers, database servers, heuristic servers, transaction servers, and/or other computing devices discussed herein may communicate with each other via wireless communication networks or electronic communication networks. For instance, the communication between computing devices may be wireless communication or data transmission over one or more radio links, or over wireless or digital communication channels.


Customers may opt into a program that allows them to share mobile device and/or customer data, with their permission or affirmative consent, with a service provider remote server. In return, the service provider remote server may provide the functionality discussed herein, including security, fraud, or other monitoring, and may generate recommendations to the customer and/or generate alerts for the customer in response to abnormal activity being detected.


The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.


As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, the words “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description, and the claims that follow, should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.


The patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).


The systems and methods described herein are directed to improvements to computer functionality, and improve the functioning of conventional computers.


This detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. Numerous alternate embodiments may be implemented, using either current technology or technology developed after the filing date of this application.

Claims
  • 1. A computer-implemented method, executed by one or more computer processors, to generate a credit score, comprising:
    receiving, by the one or more computer processors and from a network interface device, unstructured background data related to an aggregated transaction set and a plurality of pending transactions, the aggregated transaction set including at least a first correlation between a user of a plurality of users and a first credit score;
    encoding, by the one or more computer processors, the unstructured background data to create encoded background data;
    accessing, by the one or more computer processors, a heuristic algorithm stored in a second memory;
    generating, by the one or more computer processors, a second credit score by executing the heuristic algorithm using the aggregated transaction set, the first correlation, and the encoded background data;
    generating, by the one or more computer processors, a second correlation between the second credit score and a subset of the encoded background data associated with a pending transaction of the plurality of pending transactions;
    determining, by the one or more computer processors, that the second correlation exceeds a predefined credit score threshold; and
    executing, by the one or more computer processors and based at least in part on the determining, the pending transaction.
  • 2. The computer-implemented method of claim 1, wherein the second credit score complies with a credit reporting standard.
  • 3. The computer-implemented method of claim 1, wherein the aggregated transaction set comprises a plurality of recent transactions.
  • 4. The computer-implemented method of claim 1, wherein the aggregated transaction set comprises transactions related to at least one account.
  • 5. The computer-implemented method of claim 1, wherein the first memory comprises an external transaction server.
  • 6. The computer-implemented method of claim 1, wherein the second memory comprises an external heuristic server.
  • 7. The computer-implemented method of claim 1, wherein the second credit score complies with a standard measure of credit worthiness issued by a regulatory authority.
  • 8. A computer system configured to generate a credit score, the computer system comprising at least one of one or more processors or one or more transceivers, the computer system being configured to:
    receive, by the at least one of the one or more processors or the one or more transceivers and from a network interface device, unstructured background data related to an aggregated transaction set and a plurality of pending transactions, the aggregated transaction set including at least a first correlation between a user of a plurality of users and a first credit score;
    encode, by the at least one of the one or more processors or the one or more transceivers, the unstructured background data to create encoded background data;
    access, by the at least one of the one or more processors or the one or more transceivers, a heuristic algorithm stored in a second memory;
    generate, by the at least one of the one or more processors or the one or more transceivers, a second credit score by executing the heuristic algorithm using the aggregated transaction set, the first correlation, and the encoded background data;
    generate, by the at least one of the one or more processors or the one or more transceivers, a second correlation between the second credit score and a subset of the encoded background data, the subset including at least one pending transaction of the plurality of pending transactions;
    determine, by the at least one of the one or more processors or the one or more transceivers, that the second correlation exceeds a predefined credit score threshold; and
    execute, by the at least one of the one or more processors or the one or more transceivers and based at least in part on the determining, the at least one pending transaction.
  • 9. The computer system of claim 8, wherein the second credit score complies with a credit reporting standard.
  • 10. The computer system of claim 8, wherein the aggregated transaction set comprises a plurality of recent transactions.
  • 11. The computer system of claim 8, wherein the aggregated transaction set comprises transactions related to at least one account.
  • 12. The computer system of claim 8, wherein the first memory comprises an external transaction server.
  • 13. The computer system of claim 8, wherein the second memory comprises an external heuristic server.
  • 14. A non-transitory computer readable medium, comprising computer readable instructions that, when executed, cause one or more computer processors to:
    receive, by the one or more computer processors and from a network interface device, unstructured background data related to an aggregated transaction set and a plurality of pending transactions, the aggregated transaction set including at least a first correlation between a user of a plurality of users and a first credit score;
    encode, by the one or more computer processors, the unstructured background data to create encoded background data;
    access, by the one or more computer processors, a heuristic algorithm stored in a second memory;
    generate, by the one or more computer processors, a second credit score by executing the heuristic algorithm using the aggregated transaction set, the first correlation, and the encoded background data;
    generate, by the one or more computer processors, a second correlation between the second credit score and a subset of the encoded background data;
    determine, by the one or more computer processors, that the second correlation exceeds a predefined credit score threshold; and
    execute, by the one or more computer processors and based at least in part on the determining, a pending transaction in the subset of the encoded background data.
  • 15. The non-transitory computer readable medium of claim 14, wherein the second credit score complies with a credit reporting standard.
  • 16. The non-transitory computer readable medium of claim 14, wherein the aggregated transaction set comprises a plurality of recent transactions.
  • 17. The non-transitory computer readable medium of claim 14, wherein the aggregated transaction set comprises transactions related to at least one account.
  • 18. The non-transitory computer readable medium of claim 14, wherein the first memory comprises an external transaction server.
  • 19. The non-transitory computer readable medium of claim 14, wherein the second memory comprises an external heuristic server.
  • 20. The non-transitory computer readable medium of claim 14, wherein the second credit score complies with a standard measure of credit worthiness issued by a regulatory authority.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/495,621, filed on Apr. 24, 2017, now U.S. Pat. No. 10,769,722 issued on Sep. 8, 2020, which claims the benefit of U.S. Provisional Patent Application Nos. 62/337,711 and 62/335,374, filed respectively on May 12, 2016 and May 17, 2016, and U.S. Provisional Application Nos. 62/368,448, 62/368,406, 62/368,359, 62/368,588, 62/368,572, 62/368,548, 62/368,536, 62/368,525, 62/368,512, 62/368,503, 62/368,332, 62/368,298, 62/368,271, filed on Jul. 29, 2016, the disclosures of which are hereby incorporated herein by reference.

US Referenced Citations (142)
Number Name Date Kind
5434933 Karnin et al. Jul 1995 A
6658393 Basch et al. Dec 2003 B1
7266537 Jacobsen et al. Sep 2007 B2
7765557 Young Jul 2010 B2
7870431 Cirne et al. Jan 2011 B2
8019678 Wright et al. Sep 2011 B2
8117097 Abbott Feb 2012 B2
8135633 LeBaron et al. Mar 2012 B1
8156132 Kaminski, Jr. Apr 2012 B1
8217396 Yamazaki et al. Jul 2012 B2
8271396 Ronning et al. Sep 2012 B2
8463000 Kaminski, Jr. Jun 2013 B1
8549022 Kaminski, Jr. Oct 2013 B1
8560390 Higgins et al. Oct 2013 B2
8600873 Fisher Dec 2013 B2
8615520 Fallah Dec 2013 B2
8635117 Acuna-Rohter Jan 2014 B1
8649499 Koster et al. Feb 2014 B1
8688579 Ethington et al. Apr 2014 B1
8732023 Mikurak May 2014 B2
8775299 Achanta Jul 2014 B2
8781962 Vasten Jul 2014 B2
8811711 Calman et al. Aug 2014 B2
8826155 Dixon et al. Sep 2014 B2
8838474 Macy et al. Sep 2014 B2
8838491 Erbey et al. Sep 2014 B2
8842331 Enge Sep 2014 B1
8885963 Coleman Nov 2014 B2
8892461 Lau et al. Nov 2014 B2
8996411 Thomas Mar 2015 B2
9117117 Macciola et al. Aug 2015 B2
9349145 Rozman et al. May 2016 B2
9473634 Ouimette et al. Oct 2016 B1
9626453 Koerner et al. Apr 2017 B2
9942250 Stiansen et al. Apr 2018 B2
10069971 Shaw et al. Sep 2018 B1
10341365 Ha Jul 2019 B1
10380685 Phillips Aug 2019 B1
10489861 Ross et al. Nov 2019 B1
10504029 Edelen et al. Dec 2019 B2
10607228 Gai et al. Mar 2020 B1
11270311 Jass Mar 2022 B1
20020194117 Nabe et al. Dec 2002 A1
20030074328 Schiff et al. Apr 2003 A1
20040036961 McGuire, Jr. Feb 2004 A1
20040039691 Barratt et al. Feb 2004 A1
20040190759 Caldwell Sep 2004 A1
20050091524 Abe et al. Apr 2005 A1
20060050932 Tumey et al. Mar 2006 A1
20060212386 Willey et al. Sep 2006 A1
20060253468 Ramsey et al. Nov 2006 A1
20060253578 Dixon et al. Nov 2006 A1
20060253579 Dixon et al. Nov 2006 A1
20060256953 Pulaski et al. Nov 2006 A1
20060265090 Conway et al. Nov 2006 A1
20070168285 Girtakovskis et al. Jul 2007 A1
20070255649 Vagim, III Nov 2007 A1
20080255891 Stone Oct 2008 A1
20080294621 Kanigsberg et al. Nov 2008 A1
20090049550 Shevchenko Feb 2009 A1
20090064323 Lin Mar 2009 A1
20090204530 Hanson Aug 2009 A1
20090254971 Herz et al. Oct 2009 A1
20100017263 Zernik et al. Jan 2010 A1
20100076994 Soroca et al. Mar 2010 A1
20100222053 GiriSrinivasaRao et al. Sep 2010 A1
20100332287 Gates et al. Dec 2010 A1
20100332500 Pan et al. Dec 2010 A1
20110047071 Choudhuri et al. Feb 2011 A1
20110071857 Malov et al. Mar 2011 A1
20110131125 Lawrence et al. Jun 2011 A1
20110137789 Kortina Jun 2011 A1
20110178901 Imrey et al. Jul 2011 A1
20110191250 Bishop et al. Aug 2011 A1
20110225138 Johnston Sep 2011 A1
20110258049 Ramer et al. Oct 2011 A1
20110262536 Jordan et al. Oct 2011 A1
20110295722 Reisman Dec 2011 A1
20110306028 Galimore Dec 2011 A1
20110307258 Liberman et al. Dec 2011 A1
20120116972 Walker May 2012 A1
20120158541 Ganti et al. Jun 2012 A1
20120158572 Dorai et al. Jun 2012 A1
20120158573 Crocker Jun 2012 A1
20120158574 Brunzell et al. Jun 2012 A1
20120259722 Mikurak Oct 2012 A1
20120262461 Fisher et al. Oct 2012 A1
20130018796 Kolhatkar et al. Jan 2013 A1
20130080316 Pawlusiak Mar 2013 A1
20130151325 Poidomani et al. Jun 2013 A1
20130218752 Pawlusiak Aug 2013 A1
20130282430 Kannan et al. Oct 2013 A1
20130297412 Batra et al. Nov 2013 A1
20140006166 Chiang et al. Jan 2014 A1
20140081832 Merrill et al. Mar 2014 A1
20140189829 McLachlan et al. Jul 2014 A1
20140214654 Greenbaum et al. Jul 2014 A1
20140222631 Love et al. Aug 2014 A1
20140236829 Ganti Aug 2014 A1
20140249872 Stephan et al. Sep 2014 A1
20140270145 Erhart et al. Sep 2014 A1
20140278839 Lynam et al. Sep 2014 A1
20140324564 Walker et al. Oct 2014 A1
20140337083 Goyal et al. Nov 2014 A1
20150106265 Stubblefield et al. Apr 2015 A1
20150117747 Smith et al. Apr 2015 A1
20150142595 Acuna-Rohter May 2015 A1
20150142713 Gopinathan et al. May 2015 A1
20150178371 Seth et al. Jun 2015 A1
20150201077 Konig et al. Jul 2015 A1
20150235240 Chang et al. Aug 2015 A1
20150254719 Barfield, Jr. et al. Sep 2015 A1
20150278944 Searson et al. Oct 2015 A1
20150287026 Yang et al. Oct 2015 A1
20160042359 Singh Feb 2016 A1
20160044054 Stiansen et al. Feb 2016 A1
20160055184 Fokoue-Nkoutche et al. Feb 2016 A1
20160080485 Hamedi Mar 2016 A1
20160098705 Kurapati Apr 2016 A1
20160132886 Burke et al. May 2016 A1
20160179877 Koerner et al. Jun 2016 A1
20160180726 Ahuja et al. Jun 2016 A1
20160203485 Subramanian Jul 2016 A1
20160225076 Merrill et al. Aug 2016 A1
20160247068 Lin Aug 2016 A1
20160373891 Ramer et al. Dec 2016 A1
20170004408 Edelen et al. Jan 2017 A1
20170032466 Feldman et al. Feb 2017 A1
20170041464 Sharma Feb 2017 A1
20170236125 Guise et al. Aug 2017 A1
20170262852 Florimond et al. Sep 2017 A1
20170289168 Bar et al. Oct 2017 A1
20170364918 Malhotra et al. Dec 2017 A1
20180075527 Nagla Mar 2018 A1
20180181962 Barnhardt et al. Jun 2018 A1
20200226284 Yin Jul 2020 A1
20210233166 Coulter Jul 2021 A1
20210264437 Flowers et al. Aug 2021 A1
20210264511 Flowers et al. Aug 2021 A1
20210357771 Flowers et al. Nov 2021 A1
20210357839 Flowers et al. Nov 2021 A1
20210374749 Vukich Dec 2021 A1
Foreign Referenced Citations (2)
Number Date Country
WO2002015454 Feb 2002 WO
WO-2005086636 Sep 2005 WO
Non-Patent Literature Citations (43)
Entry
Kim, “Empirical Evidence of Faulty Credit Scoring and Business Failure in P2P Lending”, KB Financial Group, Seoul, Republic of Korea, Global Business & Finance Review, vol. 26, Issue 2, Summer (Year: 2021).
Final Office Action dated Dec. 23, 2020 for U.S. Appl. No. 15/495,678, “Natural Language Troubleshooting Engine”, Flowers, 15 pages.
Final Office Action dated Jan. 13, 2021 for U.S. Appl. No. 15/495,743, “Heuristic Sales Agent Training Assistant”, Flower, 11 pages.
Alzoubi, et al, “The Impact of Business Process Management on Business Performance Superiority”, 4 International Journal of Business and Management Review, vol. 3, No. 2, Feb. 2015, 19 pages.
Office Action for U.S. Appl. No. 15/495,594, dated May 27, 2021, Flowers, “Natural Language Virtual Assistant”, 24 pages.
Office Action for U.S. Appl. No. 15/495,716, dated May 27, 2021, Flowers, “Process Re-Design Targeting Engine”, 37 pages.
Non Final Office Action dated Oct. 30, 2020 for U.S. Appl. No. 15/495,594, “Natural Language Virtual Assistant”, Flowers, 28 pages.
Advisory Action and AFCP Decision dated Mar. 22, 2021 for U.S. Appl. No. 15/495,678, “Natural Language Troubleshooting Engine”, Flowers, 6 pages.
Office Action dated Mar. 25, 2021 for U.S. Appl. No. 15/495,699, “Book of Business Impact Assessment Engine”, Flowers, 26 pages.
Office Action for U.S. Appl. No. 16/897,088, dated Jan. 20, 2022, Flowers, “Heuristic Document Verification and Real Time Deposit Engine”, 6 Pages.
Office Action for U.S. Appl. No. 16/874,417, dated Jul. 12, 2021, Flowers, “Cross Selling Recommendation Engine”, 7 pages.
Final Office Action dated Nov. 5, 2020 for U.S. Appl. No. 15/495,699, “Book of Business Impact Assessment Engine”, Flowers, 22 pages.
Cooper, et al., “An Evaluation of Machine-Learning Methods for Predicting Pneumonia Mortality” Artificial Intelligence in Medicine, vol. 9, Oct. 14, 1997, pp. 107-138.
Kingston, et al., “Towards a Financial Fraud Ontology: a Legal Modelling Approach”, Artificial Intelligence and Law, (2004) 12 : DOI 10.1007/s10506-0005-4163-0, Springer 2006, pp. 419-446.
Kotsiantis, et al., “Efficiency of Machine Learning Techniques in Bankruptcy Prediction”, 2nd International Conference on Enterprise Systems and Accounting, Jul. 11-12, 2005, Thessaloniki, Greece, pp. 39-49.
Final Office Action dated Aug. 24, 2020 for U.S. Appl. No. 15/495,549, “Heuristic Identity Authentication Engine”, Flowers, 17 pages.
Final Office Action dated Aug. 24, 2020 for U.S. Appl. No. 15/495,724, “Financial Literacy Mentoring Assistant Engine”, Flowers, 13 pages.
Non Final Office Action dated Sep. 4, 2020 for U.S. Appl. No. 15/495,743, “Heuristic Sales Agent Training Assistant”, Flowers, 8 pages.
Final Office Action dated Sep. 24, 2020 for U.S. Appl. No. 15/495,716, “Process Re-Design Targeting Engine”, Flowers, 32 pages.
Non Final Office Action dated Oct. 9, 2020 for U.S. Appl. No. 15/495,579, “Heuristic Context Prediction Engine”, Flowers, 11 pages.
Zask, “Finding Financial Fraudsters: Quantitative and Behavioural Finance Approaches”, Journal of Securities Operations & Custody, vol. 6, No. 4, 10, Mar. 2014, pp. 308-324.
Office Action for U.S. Appl. No. 15/495,549, dated Apr. 30, 2021, Flowers, “Heuristic Identity Authentication Engine”, 12 Pages.
Bishop, “Pattern Recognition and Machine Learning”, Springer Science, 2006, 758 pages.
Boldi, et al., “Query Suggestions Using Query-Flow Graphs”, Proc. of the 2009 Workshop on Web Search Click Data, 2009, pp. 56-63.
Jaroslav, et al., “Internal Model of Commercial Bank as an Instrument for Measuring Credit Risk of the Borrower in Relation to Financial Performance (Credit Scoring and Bankruptcy Models)”, Journal of Competitiveness, Iss. 4, 2011, pp. 104-120.
Jeffords, et al., “New Technologies to Combat Check Fraud”, The CPA Journal, Mar. 1999, pp. 30-35.
Madaan, et al., “A Data Mining Approach to Predict Users' Next Question in QA System”, 2015 2nd International Conference on Computing for Sustainable Global Development, pp. 211-215.
Ng'Ambi, “Pre-empting User Questions through Anticipation—Data Mining FAQ Lists”, Proc. of the 2002 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on Enablement through Technology, 2002, pp. 101-109.
Final Office Action dated Jan. 22, 2020 for U.S. Appl. No. 15/495,743 “Heuristic Sales Agent Training Assistant” Flowers, 9 pages.
Final Office Action dated Jan. 7, 2020 for U.S. Appl. No. 15/495,699 “Book of Business Impact Assessment Engine” Flowers, 26 pages.
Office Action for U.S. Appl. No. 15/495,603, dated Dec. 9, 2019, Flowers,“Heuristic Money Laundering Detection Engine”, 20 Pages.
Non Final Office Action dated Apr. 15, 2020 for U.S. Appl. No. 15/495,716 “Process Re-Design Targeting Engine” Flowers, 31 pages.
Non Final Office Action dated Apr. 29, 2020 for U.S. Appl. No. 15/495,724 “Financial Literacy Mentoring Assistant Engine” Flowers, 9 pages.
Non Final Office Action dated May 1, 2020 for U.S. Appl. No. 15/495,549 “Heuristic Identity Authentication Engine” Flowers, 13 pages.
Final Office Action dated May 26, 2020 for U.S. Appl. No. 15/495,603 “Heuristic Money Laundering Detection Engine” Flowers, 12 pages.
Non Final Office Action dated May 29, 2020 for U.S. Appl. No. 15/495,699 “Book of Business Impact Assessment Engine” Flowers, 14 pages.
Non Final Office Action date Jun. 11, 2020 for U.S. Appl. No. 15/495,678 “Natural Language Troubleshooting Engine” Flowers, 15 pages.
Sordoni, “A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion”, Proc. of the 24th ACM International Conference on Information and Knowledge Management, 2015, pp. 553-562.
Tyler, et al., “Large Scale Query Log Analysis of Re-Finding”, Proc. of the 3rd ACM International Conference on Web Search and Data Mining, 2010, pp. 191-200.
Fraud Detection in Fintech: How to detect and prevent frauds in the lending industry, Financial Express [New Delhi], Proquest Document ID: 2535744028, Jun. 2021, 3 pgs.
Office Action for U.S. Appl. No. 17/066,319, dated Mar. 15, 2022, Flowers, “Heuristic Money Laundering Detection Engine”, 13 pages.
Office Action for U.S. Appl. No. 16/997,741, dated 0/24/2022, Flowers, “Heuristic Account Fraud Detection Engine”, 7 pages.
Ryman-Tubb, Nicholas Francis. Understanding payment card fraud through knowledge extraction from neural networks usinglarge-scale datasets. University of Surrey (United Kingdom). ProQuest Dissertations Publishing, 2016. (Year: 2016), 610 pages.
Provisional Applications (15)
Number Date Country
62368548 Jul 2016 US
62368271 Jul 2016 US
62368298 Jul 2016 US
62368332 Jul 2016 US
62368359 Jul 2016 US
62368406 Jul 2016 US
62368448 Jul 2016 US
62368503 Jul 2016 US
62368512 Jul 2016 US
62368525 Jul 2016 US
62368536 Jul 2016 US
62368572 Jul 2016 US
62368588 Jul 2016 US
62337711 May 2016 US
62335374 May 2016 US
Continuations (1)
Number Date Country
Parent 15495621 Apr 2017 US
Child 16986132 US