Machine learning archive mechanism using immutable storage

Information

  • Patent Grant
  • Patent Number
    11,409,990
  • Date Filed
    Friday, March 1, 2019
  • Date Issued
    Tuesday, August 9, 2022
Abstract
An apparatus and method for providing an immutable audit trail for machine learning applications is described herein. The audit trail is preserved by recording the machine learning models and data in a data structure in immutable storage such as a WORM device, a cloud storage facility, or a blockchain. The immutable audit trail is important for providing bank auditors with the reasons behind lending or account opening decisions, for example. A graphical user interface is described to allow the archive of machine learning models to be viewed.
Description
BACKGROUND
Prior Application

This application is a priority application.


Technical Field

The system, apparatuses and methods described herein generally relate to machine learning techniques, and, in particular, to a mechanism for creating immutable archives.


Description of the Related Art

The banking industry has been criticized in the past for making decisions on lending, account opening and check cashing procedures based on the neighborhoods in which customers resided. This was seen as a form of racism because certain neighborhoods were predominantly filled with one race or another. Since these criticisms, banks and other lending organizations have instituted procedures to assure that their banking activities avoid any type of prejudices. The requirements to fairly conduct banking activities are codified in the Equal Credit Opportunity Act and the Fair Housing Act in the United States.


Up until the last few years, decisions on whether to loan money or open accounts were determined by bank employees who were trained in compliance with Fair Lending expectations. Bank Examiners then audited the banks for compliance.


In recent years, banks have turned to machine learning techniques to combat fraud in lending and in bank account opening. These machine learning algorithms are trained on data sets of customers along with known fraud activity. The machine develops its own algorithms based on the patterns in the data. Since the machine learning focuses solely on the data in the training sets, there is no way to enforce public policy constraints to assure compliance with the Fair Lending rules.


For instance, if a certain neighborhood has a high incidence of fraud, the computer, as part of its algorithm to detect fraud clusters and route banking activities away from these areas of fraud, may determine that certain neighborhoods are high fraud risks, and the machine may restrict banking activities in those neighborhoods. There is a potential that this machine learning behavior will open up a bank to accusations of violations of the Fair Lending rules.


To avoid or limit liability, the reasons why an adverse action is taken should be saved so that the evidence of the reasoning behind the decision is retained. The saving of the reasoning must be in a form that prevents modification.


There are a number of techniques used in computing to avoid data loss and to assure the integrity of the data. Write-once-read-many devices such as write-once compact disc recordable (CD-R) devices and write-once Digital Versatile Disc Recordable (DVD-R) devices provide a hardware solution that creates an immutable record of the data. Microsoft, in their Azure system, has created a Blob storage functionality that includes an immutable parameter that provides a software cloud immutable storage.


Blockchain technology also provides an immutable ability to store the reasoning concerning a decision so that a bank examiner or auditor can double check the decision. Blockchain technology is unmodifiable yet viewable, and can be added to without impacting the previously stored data. This allows new archive information to be added to the chain without impacting the integrity of the previously stored information.


There is a need in the banking industry to provide an immutable archive of the reasoning behind a decision to provide defensible parameters around the decision. The present inventions address these issues.


BRIEF SUMMARY OF THE INVENTION

An apparatus for archiving machine learning models is described herein. The apparatus is made up of a special purpose server with an immutable storage facility connected. The apparatus further includes an application that executes code on the special purpose server, sending data to a machine learning engine and receiving a result from the machine learning engine. A machine learning model, which is updated from time to time, is integrated with the machine learning engine. Every time the machine learning model is updated, it is stored by the special purpose server in the immutable storage facility.


In some embodiments, the immutable storage facility is a blockchain, or a write-once-read-many storage product, or a software cloud immutable storage. Furthermore, customer data (which could relate to banking) could also be stored in the immutable storage facility by the special purpose server. This data could also include the machine learning result. The customer data could be stored each time the machine learning engine is called by the application. The machine learning model could be updated periodically or whenever the customer data is used to train the machine learning model.


A method for archiving machine learning models is also described here. The method is made up of the steps of (1) receiving data from an application at a machine learning engine running on a special purpose server, (2) calling a machine learning model by the machine learning engine, (3) executing the machine learning model using the data to determine a result, (4) returning the result to the machine learning engine and to the application, (5) updating the machine learning model by the special purpose server, and (6) storing the machine learning model in an immutable storage facility when the machine learning model is updated.
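The six steps above can be sketched as a minimal Python flow. This is an illustrative sketch only; the `ImmutableStore` and `MachineLearningEngine` class names and the dictionary model layout are assumptions, not from the specification:

```python
class ImmutableStore:
    """Write-once archive: records can be appended but never changed."""
    def __init__(self):
        self._records = []

    def append(self, record):
        self._records.append(dict(record))   # copy so callers cannot mutate it

    def history(self):
        return [dict(r) for r in self._records]   # read-only copies


class MachineLearningEngine:
    def __init__(self, store):
        self.store = store
        self.model = {"version": 0, "A": 1.0, "B": 0.0}

    def call(self, x, y):
        # Steps 1-4: receive data, invoke the model, return the result.
        m = self.model
        return y > (m["A"] * x - m["B"])

    def update_model(self, new_constants):
        # Steps 5-6: update the model, then archive the new version immutably.
        self.model = {"version": self.model["version"] + 1, **new_constants}
        self.store.append(self.model)
```

Every call to `update_model` leaves one more version in the archive, giving the auditable trail of model changes the method describes.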


The immutable storage facility could be a blockchain, a write-once-read-many storage product, or a software cloud immutable storage in some embodiments of the method. The steps of the method could also include (7) storing customer data in the immutable storage facility by the special purpose server. In some embodiments, the customer data relates to banking. The customer data could be stored each time the machine learning engine is called by the application and could also include the machine learning result. In some embodiments, the machine learning model is updated periodically or when the customer data is used to train the machine learning model.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a machine learning system.



FIG. 2 shows a small data set and a simple model.



FIG. 3 is one embodiment of a data structure representing a machine learning model.



FIG. 4 is an electrical architecture of one embodiment.



FIG. 5 illustrates a possible data structure for the input to a machine learning model.





DETAILED DESCRIPTION

There is a strong need in the machine learning industry to provide an audit trail for machine learning models, so that bank regulators, legal teams, and internal auditors can verify that the decisions by the machine learning software were made without bias.



FIG. 1 shows a block diagram of a machine learning architecture with the storage of the machine learning model 103 archived in immutable storage 106. The training dataset 102 is first run through a training module 101 to develop the machine learning model 103. As the training module 101 processes the training dataset 102, the training module 101 plots the training data on a multi-dimensional graph and then attempts to create a mathematical model around the dataset. A sample training data set can be seen in the customer information in FIG. 5, items 504, 505, 506, 507, 508 and the result 510.


A sample model is seen in FIG. 2, although displayed in a very simple, two-dimensional form. Positive data 205 and negative data 204 are plotted on an x-axis 202 and a y-axis 201. The data is run through a curve matching algorithm, and the result is a linear equation y=Ax−B, which is shown on FIG. 2 as line 203. For example, the x value could be the customer's assets 504 and the y value could be the customer's income 506. This is a very simple model. Using real world data, the model could be a seven or ten dimensional quadratic equation with scores of constants. To use this model, the machine learning engine function 104 is called. The machine learning engine function 104 returns a Boolean to the application 105 that specifies whether the input parameters x and y predict positive 205 or negative 204 data. In this case, the machine learning engine function is a one line function: return (y>((A*x)−B)).
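The FIG. 2 example can be sketched in Python: a least-squares fit produces the constants A and B of y = Ax − B, and the engine function is the one-liner from the text. The `fit_linear_model` and `ml_engine` names are illustrative assumptions:

```python
def fit_linear_model(points):
    """Least-squares fit of y = A*x - B to (x, y) training points.
    Illustrative only: a real model would have many more dimensions."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    n = len(points)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Standard least-squares slope, then solve for B in y = A*x - B.
    A = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    B = A * mean_x - mean_y
    return A, B


def ml_engine(x, y, A, B):
    """The one-line engine function from the text: True for positive data."""
    return y > ((A * x) - B)
```

A point above line 203 returns True (positive data 205); a point below returns False (negative data 204).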


For a banking model 103, the input parameters may include such factors as total assets 504, total debt 505, total income 506, total expenses 507, zip code 508, and other criteria, and the model may decide whether to allow or deny a loan. In some uses, the model recommends and a human reviews the model's recommendation (called supervised machine learning), either approving or disapproving the model's result. If the human overrules the machine, the results are fed back into the training set so that the machine/model learns from the mistake.


Every time the model 103 is updated, the model is stored in the immutable storage 106 so that there is an auditable trail of why decisions were made by the machine. The immutable storage 106 could be a form of write-once-read-many hardware device such as a compact disc recordable (CD-R) or Digital Versatile Disc Recordable (DVD-R) device. Read only memory chips could also be used in some embodiments. A version of the Secure Digital flash memory card exists in which the internal microprocessor does not allow rewrites of any block of the memory. There are multiple vendors providing magnetic storage technologies, including (but not limited to) NetApp, EMC Centera, KOM Networks, and others. Prevention of rewrite is done at the physical disk level and cannot be modified or overridden by the attached computer.


Microsoft, in their Azure system, has created a Blob storage functionality that includes an immutable parameter that provides a software cloud immutable storage. Iron Mountain offers a WORM (Write-once-read-many) storage product. Both of these cloud services offer the ability to time limit items in the immutable storage, so that document retention policies can be enforced. Thus, while the stored records are immutable, they also expire after a predetermined time.


Blockchain technology also provides an immutable ability to store the reasoning concerning a decision so that a bank examiner or auditor can double check the decision. Blockchain technology is unmodifiable yet viewable, and can be added to without impacting the previously stored data. This allows new archive information to be added to the chain without impacting the integrity of the previously stored information.


A simple blockchain is nothing more than a list of transactions that is chained or linked together using novel data structures, and some basic cryptographic principles. Anyone can make a blockchain, and create blocks. To create secure, immutable blockchains, the blockchain must be distributed, and the ability to create blocks must be more effort and cost than the value of what is being protected. The rules need to provide mathematical certainty from how transactions are “signed” through to how much “proof of work” needs to accompany a block.
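The hash linking and proof of work described above can be illustrated with a toy, single-machine sketch; a real chain is distributed and uses a far higher difficulty, and the function names here are hypothetical:

```python
import hashlib
import json

def make_block(transactions, prev_hash, difficulty=2):
    """Mine a block: find a nonce whose SHA-256 hash starts with
    `difficulty` leading zeros (a toy 'proof of work')."""
    nonce = 0
    while True:
        body = json.dumps({"tx": transactions, "prev": prev_hash,
                           "nonce": nonce}, sort_keys=True)
        digest = hashlib.sha256(body.encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return {"tx": transactions, "prev": prev_hash,
                    "nonce": nonce, "hash": digest}
        nonce += 1

def chain_is_valid(chain):
    """Altering any block changes its hash and breaks every later
    block's `prev` link, which is what makes the chain immutable."""
    for i, block in enumerate(chain):
        body = json.dumps({"tx": block["tx"], "prev": block["prev"],
                           "nonce": block["nonce"]}, sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

New archive records are appended as new blocks; tampering with a stored model record invalidates the chain from that block forward.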


There are many parts to the secure blockchain's rulebook, and each and every part is essential to the scheme's overall security; no single element (including the blockchain) is enough to secure the transactions. The rules ensure that the computing power required to tear out a recent block, alter a transaction, then re-create all subsequent blocks is more than any attacker and even most governments could amass.


Any of these technologies could be used to provide the immutable storage 106. In many embodiments, the training module 101 is run periodically, so the model 103 is updated periodically. This means that a reasonably small number of model 103 versions need to be stored. However, in some embodiments, particularly in supervised machine learning models, the model is updated in real time (or close to real time) as a user corrects the machine's decisions. In these cases, the model 103 would need to be stored with high frequency in the immutable storage 106. Special hardware may need to be implemented to store the volume of models 103 at a performance that can handle the rapidity with which the model 103 changes.


In some embodiments, the data (FIG. 5) from the application 105 is also stored in the immutable storage 106 to allow a complete audit trail of inputs and model changes. This is useful in quality assurance applications to assure that the model behaves the same way when provided a set of test data. This data storage also provides a bank examiner or auditor with all of the inputs, models, and outputs for a banking decision, allowing for verification of the reasoning behind the decision.


A model audit graphical user interface 107 could also be provided to review the history of the model changes in the immutable storage 106. This audit GUI 107 could allow search capabilities as well as an ability to list when and how often the model changed. Graphical capabilities to visually present the model are features of some embodiments (e.g., showing the curve on an x-, y-, z-axis, similar to FIG. 2).


In addition, the audit GUI 107 could be used to reset the model 103 back to a certain time frame to allow selective running of the model with a certain set of data. Another feature could set the model 103 to a certain time or to certain values, allowing the machine learning engine 104 to resume using the GUI 107 selected model 103. This feature is useful if the machine has been incorrectly trained through improper supervised learning, allowing the impact to be reversed by rewinding the model.
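Rewinding to an archived model could be as simple as searching the archive by timestamp. A minimal sketch, assuming each archived record carries begin-use and end-use timestamps (the `begin` and `end` field names are hypothetical):

```python
def model_at(archive, when):
    """Return the archived model record that was in use at time `when`.
    `archive` is a list of records ordered newest-to-oldest, each
    holding a begin/end timestamp pair for its period of use."""
    for record in archive:
        if record["begin"] <= when <= record["end"]:
            return record
    return None   # no model covered that moment
```

The GUI would hand the selected record back to the machine learning engine 104 to resume execution with the rewound model.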


The model audit GUI 107 is not permitted to delete or alter the data in the immutable storage 106, although the GUI 107 may be allowed to add to the immutable storage 106 to annotate the model records. This allows comments on why a historical or a new model was forced by the GUI 107 into the machine learning model 103. The GUI 107 could also have facilities for printing or exporting one or more models to another program.


In some embodiments, the data input 504, 505, 506, 507, 508 to the model 103 is also stored in the immutable storage 106. In this embodiment, the GUI 107 could allow a data set or a range of data sets to be run through the current model 103 and the results returned to the GUI 107.


Looking to FIG. 3, we see one possible embodiment of the model stored in a record 300 in the immutable storage 106. The first data field in the record 300 is a link to the next record 301 in the linked list of records 300. In most embodiments, the next record in the linked list contains the previous version of the model 103 (newest to oldest). In other embodiments, the record sizes could be fixed and the next record 301 would be unnecessary. The next record 301 could run the linked list from the oldest to the newest in other embodiments without detracting from the invention, although buffering techniques would need to be employed to hold the newest record 300 until the next model is to be saved so that the next record link 301 is known before writing. In other embodiments, the location is allocated before use so that it is known when the previous record is stored. In some embodiments, this link 301 is a two element array with one link pointing to the next record and the other link pointing to the previous record. In still another embodiment, a separate table of links to records is stored separate from the data to point to the records.


The next data field in the record 300 is the model version 302. This is an optional field that specifies the version of the model 103 and could simply be an incremented counter.


A timestamp 303 is the next field in the record 300, and records both a time and a date value. The timestamp 303 could be a two element array of the timestamp when the model began use and the timestamp when the model ended use. In other embodiments, either the beginning or ending timestamp is stored, and the other time implied by the previous or next record.


A machine learning algorithm 304 is next stored in the record 300 in this example embodiment. The algorithm 304 could be a string describing the type of curve used or a constant specifying the type of curve or algorithm used. In the example used in FIG. 2, this would be a linear algorithm. Other models could specify a range, a power, a logarithmic, a polynomial, an exponential, a moving average, or a linear forecast algorithm, for example. Combinations or more complicated algorithms could be specified as well.


The next field specifies the order 305 of a polynomial or other formula. This field is not used by all algorithms. In the FIG. 2 example, this field is set to one, because a linear equation is a first order polynomial.


The next four or more fields contain constants 306, 307, 308, 309 for the algorithm. In the FIG. 2 example, constants A and B are used, so field Constant A 306 will have the value of A and field Constant B 307 will have the value of B. The other constant fields 308, 309 are of no concern and could be set to zero or left as random values.
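The record 300 layout just described can be sketched as a Python data structure; the field names and types are illustrative assumptions keyed to the reference numerals, and the A=2, B=1 values are made up for the example:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ModelRecord:                       # record 300
    next_record: Optional[int]           # 301: link to the next (older) record
    model_version: int                   # 302: incremented version counter
    timestamp: Tuple[int, int]           # 303: (begin-use, end-use) pair
    algorithm: str                       # 304: curve type, e.g. "linear"
    order: int                           # 305: polynomial order (1 in FIG. 2)
    constants: List[float]               # 306-309: A, B, plus unused slots

# The FIG. 2 linear model y = A*x - B archived as one record:
fig2_record = ModelRecord(next_record=None, model_version=1,
                          timestamp=(0, 0), algorithm="linear",
                          order=1, constants=[2.0, 1.0, 0.0, 0.0])
```

The unused constant slots 308, 309 are simply zero here, matching the text's note that their values are of no concern for a first order model.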


Because of the complexities of machine learning algorithms, special purpose computing may be needed to build and execute the machine learning model described herein. FIG. 4 shows one such embodiment. The user views the user interface 107 described here on a personal computing device such as a personal computer, laptop, tablet, smart phone, monitor, or similar device 401. The personal computing device 401 communicates through a network 402 such as the Internet, a local area network, or perhaps through a direct interface to the server 403. The server 403 is a high performance, multi-core computing device with significant storage facilities 404 in order to store the training data 102 for the model 103. Since this training data 102 is continuously updated in some embodiments, this data must be kept online and accessible so that it can be updated. In addition, the real-time editing of the model 103 as the user provides feedback to the model 103 requires significant processing power to rebuild the model as feedback is received.


The server 403 is a high performance computing machine electrically connected to the network 402 and to the storage facilities 404. Furthermore, the server 403 is electrically connected to the immutable storage 405. Immutable storage facilities 405 are not common to most servers 403, but are used in the inventions described herein. In some embodiments, the immutable storage 405 is located in the cloud and indirectly connected to the server 403 through a network.


In order to preserve a complete archive of the machine learning transactions, some embodiments will also store all data that is run through the model 103 in the immutable storage 106. This data could look similar to that illustrated in FIG. 5.


The first data field in the record 500 is a link to the next record 501 in the linked list of records 500. In most embodiments, the next record in the linked list contains the previous data record (newest to oldest). In other embodiments, the record sizes could be fixed and the next record 501 would be unnecessary. The next record 501 could run the linked list from the oldest to the newest in other embodiments without detracting from the invention, although buffering techniques would need to be employed to hold the newest record 500 until the next record is to be saved so that the next record link 501 is known before writing. In other embodiments, the location is allocated before use so that it is known when the previous record is stored. In some embodiments, this link 501 is a two element array with one link pointing to the next record and the other link pointing to the previous record. In still another embodiment, a separate table of links to records is stored separate from the data to point to the records.


The next data field in the record 500 is the model version 502. This field specifies the version of the model 103 that processed the data 500. In some embodiments, rather than a version number, a link to the immutable storage 106 record 300 for the model is stored in field 502.


A timestamp 303 is the next field in the record 500. The timestamp 303 represents the date and time when the record is processed by the model 103.


After the timestamp field 303, the input parameters for the model are stored: customer assets 504, customer debt 505, customer income 506, customer expenses 507, and customer zip code 508. Any of these customer input fields could be deleted or additional fields added without detracting from the inventions herein. These fields, in most implementations, would be the same fields in the training dataset 102, with the addition of the correct result 510.


The next field that needs to be stored in the data record 500 is the machine learning result 509. This preserves the outcome of the model based on the above customer data set 504, 505, 506, 507, 508. In most embodiments, this value is a Boolean.


The final field is the human result 510. This is the result that a trainer or supervisor of the model 103 gives to the given user dataset 504, 505, 506, 507, 508. If this value does not agree with the machine learning result 509, then the machine model may need to be re-trained using the data. In some embodiments, a comment may be attached to the record 500 to memorialize the trainer's thinking in overriding the model 103. In some machine learning embodiments, the result 510 is not based on a human trainer but on another machine or real-world result. For instance, if the loan approved by the machine learning model defaults, then the result 510 may change to a negative value, and the data used to retrain the model.
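Selecting the records where the machine learning result 509 and the human or real-world result 510 disagree yields the data to feed back into training. A minimal sketch, with hypothetical dictionary field names standing in for fields 509 and 510:

```python
def retraining_set(records):
    """Collect archived data records whose human or real-world result
    (field 510) disagrees with the machine learning result (field 509);
    these rows are fed back into the training set so the model learns
    from its mistakes."""
    return [r for r in records
            if r["machine_result"] != r["human_result"]]
```

A loan approved by the model that later defaults would flip `human_result` to False and land the record in this set, as the text describes.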


The foregoing devices and operations, including their implementation, will be familiar to, and understood by, those having ordinary skill in the art.


The above description of the embodiments, alternative embodiments, and specific examples, are given by way of illustration and should not be viewed as limiting. Further, many changes and modifications within the scope of the present embodiments may be made without departing from the spirit thereof, and the present invention includes such changes and modifications.

Claims
  • 1. An apparatus for archiving machine learning models, the apparatus comprising: a special purpose multi-core server electrically connected to a storage facility; a blockchain connected to the special purpose multi-core server; an application that executes code on the special purpose multi-core server; a machine learning engine that receives data from the application and returns a result to the application; and a machine learning model integrated with the machine learning engine, wherein the machine learning model updates periodically; wherein the machine learning model is stored in the blockchain by the special purpose multi-core server when the machine learning model is updated.
  • 2. The apparatus of claim 1 further comprising customer data that is stored in the blockchain by the special purpose multi-core server.
  • 3. The apparatus of claim 2 wherein the customer data relates to banking.
  • 4. The apparatus of claim 2 wherein the customer data is stored each time the machine learning engine is called by the application.
  • 5. The apparatus of claim 2 wherein the result is stored in the blockchain by the special purpose multi-core server.
  • 6. The apparatus of claim 2 wherein the machine learning model is updated when the customer data is used to train the machine learning model.
  • 7. An apparatus for archiving machine learning models, the apparatus comprising: a special purpose multi-core server; a blockchain stored on a write-once-read-many hardware storage facility electrically connected to the special purpose multi-core server; an application that executes code on the special purpose multi-core server; a machine learning engine that receives data from the application and returns a result to the application; and a machine learning model integrated with the machine learning engine, wherein the machine learning model updates periodically; wherein the machine learning model is stored in the blockchain stored on the write-once-read-many hardware storage facility by the special purpose multi-core server when the machine learning model is updated.
  • 8. The apparatus of claim 7 further comprising customer data that is stored in the blockchain stored on the write-once-read-many hardware storage facility by the special purpose multi-core server.
  • 9. The apparatus of claim 8 wherein the customer data relates to banking.
  • 10. The apparatus of claim 8 wherein the customer data is stored each time the machine learning engine is called by the application.
  • 11. The apparatus of claim 8 wherein the result is stored in the blockchain of the write-once-read-many hardware storage facility by the special purpose multi-core server.
  • 12. The apparatus of claim 8 wherein the machine learning model is updated when the customer data is used to train the machine learning model.
US Referenced Citations (152)
Number Name Date Kind
5262942 Earle Nov 1993 A
5729594 Klingman Mar 1998 A
5809483 Broka et al. Sep 1998 A
5815657 Williams et al. Sep 1998 A
5875108 Hoffberg et al. Feb 1999 A
5890140 Clark et al. Mar 1999 A
5913202 Motoyama Jun 1999 A
5970482 Pham et al. Oct 1999 A
6023684 Pearson Feb 2000 A
6105012 Chang et al. Aug 2000 A
6141699 Luzzi et al. Oct 2000 A
6151588 Tozzoli et al. Nov 2000 A
6400996 Hoffberg et al. Jun 2002 B1
6505175 Silverman et al. Jan 2003 B1
6523016 Michalski Feb 2003 B1
6675164 Kamath et al. Jan 2004 B2
6687693 Cereghini et al. Feb 2004 B2
6708163 Kargupta et al. Mar 2004 B1
6856970 Campbell et al. Feb 2005 B1
7092941 Campos Aug 2006 B1
7308436 Bala et al. Dec 2007 B2
D593579 Thomas Jun 2009 S
7617283 Aaron et al. Nov 2009 B2
7716590 Nathan May 2010 B1
7720763 Campbell et al. May 2010 B2
7725419 Lee et al. May 2010 B2
7805370 Campbell et al. Sep 2010 B2
8229875 Roychowdhury Jul 2012 B2
8229876 Roychowdhury Jul 2012 B2
8429103 Aradhye et al. Apr 2013 B1
8776213 Mclaughlin et al. Jul 2014 B2
8990688 Lee et al. Mar 2015 B2
9003312 Ewe et al. Apr 2015 B1
9405427 Curtis et al. Aug 2016 B2
D766292 Rubio Sep 2016 S
D768666 Anzures et al. Oct 2016 S
9489627 Bala Nov 2016 B2
9537848 Mclaughlin et al. Jan 2017 B2
D780188 Xiao et al. Feb 2017 S
D783642 Capela et al. Apr 2017 S
D784379 Pigg et al. Apr 2017 S
D784381 Mcconnell et al. Apr 2017 S
D785657 Mcconnell et al. May 2017 S
9667609 Mclaughlin et al. May 2017 B2
D790580 Hatzikostas Jun 2017 S
D791161 Hatzikostas Jul 2017 S
D803238 Anzures et al. Nov 2017 S
D814490 Bell Apr 2018 S
9946995 Dwyer et al. Apr 2018 B2
10262235 Chen et al. Apr 2019 B1
D853424 Maier et al. Jul 2019 S
10423948 Wilson et al. Sep 2019 B1
D871444 Christiana et al. Dec 2019 S
D872114 Schuster Jan 2020 S
D873277 Anzures et al. Jan 2020 S
D878385 Medrano et al. Mar 2020 S
D884003 Son et al. May 2020 S
D900847 Saltik Nov 2020 S
D901528 Maier et al. Nov 2020 S
D901530 Maier et al. Nov 2020 S
D902957 Cook et al. Nov 2020 S
10924514 Altman et al. Feb 2021 B1
10949825 Brosamer Mar 2021 B1
11003999 Gil et al. May 2021 B1
20020016769 Barbara et al. Feb 2002 A1
20020099638 Coffman et al. Jul 2002 A1
20020118223 Steichen et al. Aug 2002 A1
20020135614 Bennett Sep 2002 A1
20020138431 Antonin et al. Sep 2002 A1
20020188619 Low Dec 2002 A1
20020194159 Kamath et al. Dec 2002 A1
20030041042 Cohen et al. Feb 2003 A1
20030093366 Halper et al. May 2003 A1
20030184590 Will Oct 2003 A1
20030212629 King Nov 2003 A1
20030220844 Marnellos et al. Nov 2003 A1
20030233305 Solomon Dec 2003 A1
20040034558 Eskandari Feb 2004 A1
20040034666 Chen Feb 2004 A1
20050010575 Pennington Jan 2005 A1
20050154692 Jacobsen et al. Jul 2005 A1
20050171811 Campbell et al. Aug 2005 A1
20050177495 Crosson Smith Aug 2005 A1
20050177504 Crosson Smith Aug 2005 A1
20050177521 Crosson Smith Aug 2005 A1
20060015822 Baig et al. Jan 2006 A1
20060080245 Bahl et al. Apr 2006 A1
20060101048 Mazzagatti et al. May 2006 A1
20060190310 Gudla et al. Aug 2006 A1
20060200767 Glaeske et al. Sep 2006 A1
20060265662 Gertzen Nov 2006 A1
20070156673 Maga et al. Jul 2007 A1
20070266176 Wu Nov 2007 A1
20080091600 Egnatios et al. Apr 2008 A1
20080104007 Bala May 2008 A1
20080228751 Kenedy et al. Sep 2008 A1
20090150814 Eyer et al. Jun 2009 A1
20090192809 Chakraborty et al. Jul 2009 A1
20090240647 Green et al. Sep 2009 A1
20090307176 Jeong et al. Dec 2009 A1
20100066540 Theobald et al. Mar 2010 A1
20110302485 D'Angelo et al. Dec 2011 A1
20120041683 Vaske et al. Feb 2012 A1
20120054095 Lesandro et al. Mar 2012 A1
20120072925 Jenkins et al. Mar 2012 A1
20120197795 Campbell et al. Aug 2012 A1
20120290379 Hoke et al. Nov 2012 A1
20120290382 Martin et al. Nov 2012 A1
20120290474 Hoke Nov 2012 A1
20120290479 Hoke et al. Nov 2012 A1
20130054306 Bhalla et al. Feb 2013 A1
20130071816 Singh et al. Mar 2013 A1
20130110750 Newnham et al. May 2013 A1
20130211937 Elbirt et al. Aug 2013 A1
20130219277 Wang et al. Aug 2013 A1
20130231974 Harris et al. Sep 2013 A1
20130339187 Carter Dec 2013 A1
20140129457 Peeler May 2014 A1
20140143186 Bala May 2014 A1
20140241609 Vigue et al. Aug 2014 A1
20140244491 Eberle et al. Aug 2014 A1
20140258104 Harnisch Sep 2014 A1
20140279484 Dwyer et al. Sep 2014 A1
20140317502 Brown et al. Oct 2014 A1
20150310195 Bailor Oct 2015 A1
20150363801 Ramberg et al. Dec 2015 A1
20160011755 Douek et al. Jan 2016 A1
20160086185 Adjaoute Mar 2016 A1
20160164757 Pape Jun 2016 A1
20160232546 Ranft et al. Aug 2016 A1
20160292170 Mishra et al. Oct 2016 A1
20170019490 Poon et al. Jan 2017 A1
20170178245 Rodkey Jun 2017 A1
20170213131 Hammond Jul 2017 A1
20180005323 Grassadonia Jan 2018 A1
20180025140 Edelman Jan 2018 A1
20180075527 Nagla et al. Mar 2018 A1
20180253464 Kohli Sep 2018 A1
20180285839 Yang Oct 2018 A1
20180337770 Angel et al. Nov 2018 A1
20180349446 Triolo et al. Dec 2018 A1
20190122149 Caldera et al. Apr 2019 A1
20190205977 Way et al. Jul 2019 A1
20190213660 Astrada et al. Jul 2019 A1
20190236598 Padmanabhan Aug 2019 A1
20190258818 Yu et al. Aug 2019 A1
20190340847 Hendrickson et al. Nov 2019 A1
20190379615 Karp Dec 2019 A1
20200119905 Revankar et al. Apr 2020 A1
20200151812 Gil et al. May 2020 A1
20200184017 Batra Jun 2020 A1
20210125272 Sinharoy Apr 2021 A1
Foreign Referenced Citations (6)
Number Date Country
2003288790 Mar 2005 AU
2756619 Sep 2012 CA
3376361 Sep 2018 EP
57-101980 Jun 1982 JP
2019021312 Jan 2019 WO
2020068784 Apr 2020 WO
Non-Patent Literature Citations (22)
Entry
AcceptEasy, “In-Chat Payments”, webpage downloaded from https://www.accepteasy.com/en/in-chat-payments on Mar. 21, 2019.
AcceptEasy > Public, webpage downloaded from https://plus.google.com/photos/photo/104149983798357617844/6478597596568340146 on Mar. 21, 2019.
Bansal, Nikhil, Avrim Blum, and Shuchi Chawla. “Correlation clustering.” Machine Learning 56.1-3 (2004): 89-113.
Bloomberg Terminal, Wikipedia, Feb. 20, 2021, webpage downloaded from https://en.wikipedia.org/wiki/Bloomberg_Terminal on Apr. 27, 2021.
Data mining to reduce churn. Berger, Charles. Target Marketing; Philadelphia 22.8 (Aug. 1999): 26-28.
Distributed Mining of Classification Rules, By Cho and Wuthrich, 2002 http://www.springerlink.com/21nnasudlakyzciv54i5kxz0)/app/home/contribution.asp?referrer=parent&backto=issue, 1,6;journal,2,3,31;linkingpublicationresults, 1:105441, 1.
Eckoh, “ChatGuard, The only PCI DSS compliant Chat Payment solution”, webpage downloaded from https://www.eckoh.com/pci-compliance/agent-assisted-payments/ChatGuard on Mar. 25, 2019.
Finley, Thomas, and Thorsten Joachims. “Supervised clustering with support vector machines.” Proceedings of the 22nd international conference on Machine learning, ACM, 2005.
James Wilson, “How to take your UX to a new level with personalization”, Medium blog, Oct. 24, 2018, found at https://medium.com/nyc-design/how-to-take-your-ux-to-a-new-level-with-personalization-1579b3ca414c on Jul. 29, 2019.
Jhingran, Anant, “Moving toward outcome-based intelligence”, IDG Contributor Network, Mar. 19, 2018.
Khalid, “Machine Learning Algorithms 101”, Feb. 17, 2018 https://towardsml.com/2018/02/17/machine-learning-algorithms-101/ (Year: 2018).
Live chat Screenshots, mylivechat, webpage downloaded May 17, 2021 from https://mylivechat.com/screenshots.aspx.
LiveChat, “Pagato”, webpage downloaded from https://www.livechatinc.com/marketplace/apps/pagato/ on Mar. 21, 2019.
ManyChat, “[Updated] Accept Paayments Directly in Facebook Messenger Using ManyChat!”, webpage downloaded from https://blog.manychat.com/manychat-payments-using-facebook-messenger/ on Mar. 21, 2019.
Meila et al., Comparing clusterings—an information based distance, Journal of Multivariate Analysis 98 (2007) 873-895.
Mining for gold. Edelstein, Herb. InformationWeek; Manhasset 627 (Apr. 21, 1997): 53-70.
Pagato, “Accept Payments via Chat”, webpage downloaded from https://pagato.com/ on Mar. 21, 2019.
PCIpal, “SMS & Web Chat”, webpage downloaded from https://www.pcipal.com/us/solutions/sms-web-chat/ on Mar. 21, 2019.
The Clearing House, “U.S. Real-Time Payments Technology Playbook”, Version 1.0, Nov. 2016.
Traffic Analysis of a Web Proxy Caching Hierarchy. Anirban Mahanti et al., University of Saskatchewan, IEEE Network, May/Jun. 2000.
Hampton, Nikolai, Understanding the blockchain hype: Why much of it is nothing more than snake oil and spin, Computerworld, Sep. 5, 2016.
McConaghy, Trent, Blockchains for Artificial Intelligence, Medium, Jan. 3, 2017.