The disclosure relates generally to fraud detection and, more specifically, to electronically identifying fraudulent retail transactions.
Some transactions, such as some in-store or online retail transactions, are fraudulent. For example, a fraudster may attempt to purchase an item using a payment form, such as a credit card, belonging to another person. The fraudster may have stolen or found the payment form, and is now attempting to use the payment form for the purchase without permission from the payment form's rightful owner. In some cases, such as with in-store purchases, a fraudster may present another's identification (ID) card (e.g., driver's license), in addition to the payment form, when attempting to purchase the item, thereby facilitating the in-store fraudulent purchase.
Conveniences associated with online retail purchases also may facilitate fraudulent online transactions. For example, at least some retail websites allow a customer to make purchases without “signing in.” Instead of logging into an account of the customer on the website, the customer may choose to proceed under a “guest” option that does not require the customer to sign in to a particular account. As a result, a fraudster may use the “guest” option to make a purchase with an unauthorized payment form. In addition, at least some retail websites allow a customer to ship purchased products to any address, such as a store location (e.g., ship-to-store) or a home location (e.g., ship-to-home). Although some retailers may require the showing of an ID when a customer arrives to pick up a purchased item at a store, as noted above, a fraudster may have an ID card of a victimized person. Thus, these online purchase conveniences may facilitate fraudulent online retail transactions.
In each of these examples, the fraudster is involved in a fraudulent activity. Fraudulent activities may cost victimized persons time and, in some examples, cause financial losses. For example, a victimized person may need to contact a financial institution and/or retailer to be credited for a fraudulent activity. In some examples, the victimized person may not be able to recover the financial losses. Fraudulent activities may also cause financial harm to a company, such as a retailer. For example, the true owner of the payment form may identify the fraudulent transaction and have the transaction cancelled. As such, the retailer may not receive payment for the purchased items. Thus, customers and retailers may benefit from the identification of fraudulent transactions before those transactions are completed.
The embodiments described herein are directed to automatically identifying fraudulent transactions. The embodiments may identify a fraudulent activity as it is taking place, allowing a retailer, for example, to stop or disallow the transaction. In some examples, the embodiments may allow a retailer to identify a suspected fraudulent in-store or online purchase. The transaction may be disallowed if fraud is identified. As a result, the embodiments may allow customers to avoid being defrauded. The embodiments may also allow a retailer to decrease expenses related to fraudulent transactions.
In accordance with various embodiments, exemplary systems may be implemented in any suitable hardware or hardware and software, such as in any suitable computing device. For example, in some embodiments, a computing device is configured to receive purchase data identifying a purchase attempt (e.g., a current purchase attempt, such as at a store or on a website) using a first device and a first payment form. The computing device may also be configured to determine whether the first device is trusted to the first payment form based on first trust data obtained, for example, from a database. If the first device is trusted to the first payment form, the computing device is configured to generate a first trust value. If, however, the first device is not trusted to the first payment form, the computing device executes a machine learning process based on the purchase data, and generates a second trust value based on execution of the machine learning process. The computing device may further be configured to generate trust score data based on at least one of the first trust value or the second trust value. The computing device may be configured to transmit the trust score data to another computing device.
In some embodiments, a method is provided that includes receiving purchase data identifying a purchase attempt using a first device and a first payment form. The method may also include determining whether the first device is trusted to the first payment form based on first trust data obtained from a database. If the first device is trusted to the first payment form, the method further includes generating a first trust value. If the first device is not trusted to the first payment form, the method further includes executing a machine learning process based on the purchase data, and generating a second trust value based on execution of the machine learning process. The method may also include generating trust score data based on at least one of the first trust value or the second trust value. The method may further include transmitting the trust score data to another computing device.
In yet other embodiments, a non-transitory computer readable medium has instructions stored thereon, where the instructions, when executed by at least one processor, cause a computing device to perform operations that include receiving purchase data identifying a purchase attempt using a first device and a first payment form. The operations may also include determining whether the first device is trusted to the first payment form based on first trust data obtained from a database. If the first device is trusted to the first payment form, the operations further include generating a first trust value. If the first device is not trusted to the first payment form, the operations further include executing a machine learning process based on the purchase data, and generating a second trust value based on execution of the machine learning process. The operations may also include generating trust score data based on at least one of the first trust value or the second trust value. The operations may further include transmitting the trust score data to another computing device.
The features and advantages of the present disclosures will be more fully disclosed in, or rendered obvious by, the following detailed descriptions of example embodiments. The detailed descriptions of the example embodiments are to be considered together with the accompanying drawings, wherein like numbers refer to like parts, and further wherein:
The description of the preferred embodiments is intended to be read in connection with the accompanying drawings, which are to be considered part of the entire written description of these disclosures. While the present disclosure is susceptible to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and will be described in detail herein. The objectives and advantages of the claimed subject matter will become more apparent from the following detailed description of these exemplary embodiments in connection with the accompanying drawings.
It should be understood, however, that the present disclosure is not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives that fall within the spirit and scope of these exemplary embodiments. The terms “couple,” “coupled,” “operatively coupled,” “operatively connected,” and the like should be broadly understood to refer to connecting devices or components together either mechanically, electrically, wired, wirelessly, or otherwise, such that the connection allows the pertinent devices or components to operate (e.g., communicate) with each other as intended by virtue of that relationship.
Turning to the drawings,
For example, fraud detection computing device 102 can be a computer, a workstation, a laptop, a server such as a cloud-based server, or any other suitable device. Each of multiple customer computing devices 110, 112, 114 can be a mobile device such as a cellular phone, a laptop, a computer, a tablet, a personal assistant device, a voice assistant device, a digital assistant, or any other suitable device.
Additionally, each of fraud detection computing device 102, web server 104, workstations 106, and multiple customer computing devices 110, 112, 114 can include one or more processors, one or more field-programmable gate arrays (FPGAs), one or more application-specific integrated circuits (ASICs), one or more state machines, digital circuitry, or any other suitable circuitry.
Although
Workstation(s) 106 are operably coupled to communication network 118 via router (or switch) 108. Workstation(s) 106 and/or router 108 may be located at a store 109, for example. Workstation(s) 106 can communicate with fraud detection computing device 102 over communication network 118. The workstation(s) 106 may send data to, and receive data from, fraud detection computing device 102. For example, the workstation(s) 106 may transmit data related to a transaction, such as a purchase transaction, to fraud detection computing device 102. In response, fraud detection computing device 102 may transmit an indication of whether the transaction is fraudulent. Workstation(s) 106 may also communicate with web server 104. For example, web server 104 may host one or more web pages, such as a retailer's website. Workstation(s) 106 may be operable to access and program (e.g., configure) the webpages hosted by web server 104.
Database 116 can be a remote storage device, such as a cloud-based server, a memory device on another application server, a networked computer, or any other suitable remote storage. Fraud detection computing device 102 is operable to communicate with database 116 over communication network 118. For example, fraud detection computing device 102 can store data to, and read data from, database 116. Although shown remote to fraud detection computing device 102, in some examples, database 116 can be a local storage device, such as a hard drive, a non-volatile memory, or a USB stick.
Communication network 118 can be a WiFi® network, a cellular network such as a 3GPP® network, a Bluetooth® network, a satellite network, a wireless local area network (LAN), a network utilizing radio-frequency (RF) communication protocols, a Near Field Communication (NFC) network, a wireless Metropolitan Area Network (MAN) connecting multiple wireless LANs, a wide area network (WAN), or any other suitable network. Communication network 118 can provide access to, for example, the Internet.
First customer computing device 110, second customer computing device 112, and Nth customer computing device 114 may communicate with web server 104 over communication network 118. For example, web server 104 may host one or more webpages of a website. Each of multiple computing devices 110, 112, 114 may be operable to view, access, and interact with the webpages hosted by web server 104. In some examples, web server 104 hosts a web page for a retailer that allows for the purchase of items. For example, an operator of one of multiple computing devices 110, 112, 114 may access the web page hosted by web server 104, add one or more items to an online shopping cart of the web page, and perform an online checkout of the shopping cart to purchase the items. In some examples, web server 104 may transmit data that identifies the attempted purchase transaction to fraud detection computing device 102. In response, fraud detection computing device 102 may transmit an indication of whether the transaction is fraudulent to web server 104.
Fraud detection computing device 102 may determine whether a transaction is to be trusted. If the transaction is trusted (e.g., a trusted transaction), the transaction is allowed. For example, fraud detection computing device 102 may determine that an in-store or online purchase is to be trusted. Fraud detection computing device 102 may transmit a message to store 109 or web server 104, for example, indicating that the in-store or online transaction, respectively, is trusted. Store 109 or web server 104, respectively, may then allow the in-store or online transaction.
If fraud detection system 100 determines that the transaction is not trusted, the transaction may not be allowed. For example, fraud detection computing device 102 may determine that an in-store or online purchase is not to be trusted. Fraud detection computing device 102 may transmit a message to store 109 or web server 104, for example, indicating that the in-store or online transaction, respectively, is not trusted. Store 109 or web server 104, respectively, may then reject (e.g., not allow) the in-store or online transaction. In some examples, untrusted transactions may be allowed if one or more requirements are met. For example, store 109 may allow an untrusted in-store transaction if a customer shows an identification (ID), such as a driver's license or passport. Web server 104 may allow an untrusted online transaction if a customer answers security questions, or uses a different form of payment (e.g., a debit card instead of a credit card, a different credit card, etc.), for example.
To determine transactions that are not trusted (and thus potentially fraudulent), fraud detection computing device 102 executes one or more machine learning processes to generate a “trust score” (e.g., a value indicating whether a transaction should be trusted). In some examples, the machine learning processes may include logistic regression based models (e.g., algorithms) and/or decision tree based models (e.g., XGBoost models). In some examples, the machine learning processes may include deep learning algorithms or neural networks.
The machine learning processes may be trained with supervised data. For example, the machine learning processes may be trained with features generated from data identifying previous transactions that are labelled trusted or not trusted. In some examples, the machine learning processes are trained with unsupervised data. For example, the machine learning processes may be trained with features generated from data identifying previous transactions including whether payments were rejected or charged back (e.g., payment returned to paying source).
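For illustration only, the following Python sketch shows one way such a decision tree based trust model could be trained on labelled historical transactions. The xgboost library is used here as an example, and the file name, feature columns, and "trusted" label are hypothetical assumptions rather than details taken from the disclosure.

    import pandas as pd
    import xgboost as xgb

    # Hypothetical export of labelled historical transactions.
    history = pd.read_csv("historical_transactions.csv")

    # Illustrative feature columns; real features would be generated as described herein.
    feature_columns = ["purchase_amount", "days_since_password_reset",
                       "prior_purchases_with_device", "prior_chargebacks"]
    X = history[feature_columns]
    y = history["trusted"]  # 1 = no chargeback or complaint, 0 = charged back / complaint filed

    model = xgb.XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
    model.fit(X, y)
    model.save_model("trust_model.json")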
In some examples, to generate a trust score indicating that the transaction is trustworthy, the machine learning process determines whether a transaction was conducted with a device (e.g., computer, mobile phone) that has been determined (e.g., by fraud detection computing device 102) to be “connected to” a payment form (e.g., credit card, debit card) via a “trusted edge.” For example, a device and payment form may be connected via a trusted edge if they were previously used together to make a previous purchase, and the previous purchase was made at least a threshold amount of time ago (e.g., at least 3 months ago). If, for example, the same device and payment form were used to make a previous purchase transaction on a website at least the threshold amount of time ago, the machine learning process generates a trust score indicating that the current transaction is trusted (and thus should be allowed). Assuming a scale of 0 to 1, where 0 indicates no trust and 1 indicates full trust, the machine learning process may generate a trust score of 1 in this example. Here, the machine learning process executes more quickly than, for example, if the device were not connected to the payment form via a trusted edge, because it identifies the transaction as a trusted transaction based on the device and payment form alone. Otherwise, the machine learning process may need to operate on additional features to generate a trust score, as is described further below.
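As a minimal, non-limiting sketch of this fast path, the following Python code checks a hypothetical edge store for a trusted edge between a device and a payment form and returns full trust when one exists. The edge_store mapping, the 90-day TRUST_WINDOW value, and the function names are assumptions introduced only for illustration.

    from datetime import datetime, timedelta

    TRUST_WINDOW = timedelta(days=90)  # hypothetical 3-month threshold

    def trusted_edge_exists(device_id, payment_id, edge_store, now=None):
        # edge_store is assumed to map (device_id, payment_id) to the time of the
        # most recent clean joint purchase (no chargeback, no complaint).
        now = now or datetime.utcnow()
        last_joint_use = edge_store.get((device_id, payment_id))
        return last_joint_use is not None and (now - last_joint_use) >= TRUST_WINDOW

    def quick_trust_score(device_id, payment_id, edge_store):
        # Fast path: a trusted edge short-circuits the full model with full trust (1.0);
        # otherwise return None so the feature-based machine learning process runs.
        return 1.0 if trusted_edge_exists(device_id, payment_id, edge_store) else None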
Data indicating devices connected to payment forms via trusted edges may be generated by fraud detection computing device 102. For example, fraud detection computing device 102 may generate trusted device data and trusted payment form data for each of a plurality of customers based on historical purchase transactions for each customer. Fraud detection computing device 102 may determine, for each customer, devices and payment forms used in transactions (e.g., purchase transactions) that took place at least the threshold amount of time ago. For each transaction, fraud detection computing device 102 may identify a device and a payment form. If there was no chargeback on the transaction, and no complaint filed (e.g., a customer called to say they did not make a transaction), fraud detection computing device 102 may generate trusted device data and trusted payment form data connecting the device to the payment form via a trusted edge.
Each trusted device identified by the trusted device data is connected to at least one trusted payment form identified by the trusted payment form data via a trusted edge. Fraud detection computing device 102 may generate and/or update trusted device data and trusted payment form data, for example, on a periodic basis (e.g., nightly, monthly, etc.). In some examples, fraud detection computing device 102 may generate and/or update trusted device data and trusted payment form data in real time (e.g., as each in-store or online transaction is received).
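The following Python sketch illustrates, under stated assumptions, how such a periodic (e.g., nightly) pass over historical transactions could connect devices to payment forms via trusted edges. The transaction field names and the 90-day window are hypothetical.

    from datetime import datetime, timedelta

    TRUST_WINDOW = timedelta(days=90)  # hypothetical threshold

    def build_trusted_edges(transactions, now=None):
        # transactions: hypothetical dicts with device_id, payment_id, timestamp,
        # chargeback, and complaint fields, one per historical purchase.
        now = now or datetime.utcnow()
        edges = {}
        for tx in transactions:
            old_enough = (now - tx["timestamp"]) >= TRUST_WINDOW
            clean = not tx["chargeback"] and not tx["complaint"]
            if old_enough and clean:
                key = (tx["device_id"], tx["payment_id"])
                # Keep the most recent qualifying joint use for each edge.
                if key not in edges or tx["timestamp"] > edges[key]:
                    edges[key] = tx["timestamp"]
        return edges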
In some examples, fraud detection computing device 102 may connect a second device to a trusted payment form via a trusted edge. For example, assume a first device and a payment form are connected via a trusted edge. Also assume that the customer attempts to make a purchase with a second device using the payment form (i.e., the payment form connected via a trusted edge to the first device). For this transaction using the second device, fraud detection computing device 102 may execute the machine learning process to determine a trust score. Because the second device is not connected to the payment form via a trusted edge, the machine learning process may operate on additional features. The additional features may be generated from, for example, user profile change data (e.g., password reset, address change), customer data, device data, payment data, product risk data, network data (e.g., number of nodes or edges in a graph, e.g., see
In some examples, features can also be generated based on a date or time associated with each of these forms of data. For example, password resets that occurred more than a threshold amount of time ago (e.g., more than 3 months ago) may be ignored, while password resets that occurred within the threshold amount of time (e.g., during the last 3 months) are relevant.
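A minimal sketch of such recency-windowed feature generation appears below, assuming hypothetical event records with "type" and "timestamp" fields and a 3-month relevance window; the feature names are illustrative only.

    from datetime import datetime, timedelta

    RECENT_WINDOW = timedelta(days=90)  # hypothetical 3-month relevance window

    def profile_change_features(profile_events, now=None):
        # profile_events: hypothetical dicts with "type" and "timestamp" fields.
        now = now or datetime.utcnow()
        recent = [e for e in profile_events if now - e["timestamp"] <= RECENT_WINDOW]
        return {
            "recent_password_resets": sum(e["type"] == "password_reset" for e in recent),
            "recent_address_changes": sum(e["type"] == "address_change" for e in recent),
        }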
Referring back to the example from above, fraud detection computing device 102 may connect the second device to the payment form via a trusted edge after a threshold amount of time has passed, assuming no chargeback and no complaint becomes associated with the transaction during the threshold amount of time.
In some examples, the attempted purchase with the second device is declined (e.g., based on a trust score generated by fraud detection computing device 102). Assuming the transaction was at store 109 (e.g., the customer attempted to make the purchase with a payment form linked to an application on the second device), store 109 may allow the customer to complete the purchase using the payment form by, for example, scanning the payment form (e.g., credit card, debit card) on a card reader (e.g., credit card or debit card reader). The customer may also need to show a valid ID. If the customer successfully scans the payment form and presents the ID, and the purchase is made, fraud detection computing device 102 may then connect the second device with the payment form via a trusted edge.
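A minimal, non-limiting sketch of this recovery path, reusing the hypothetical edge store from the earlier sketches, might look like the following; the function and field names are assumptions.

    from datetime import datetime

    def complete_declined_purchase(purchase, card_scanned_ok, id_verified, edge_store, now=None):
        # After a declined in-app attempt, a successful scan of the physical payment
        # form plus a valid ID completes the purchase; the device/payment pairing is
        # recorded in the hypothetical edge store so a trusted edge can be connected.
        if card_scanned_ok and id_verified:
            edge_store[(purchase["device_id"], purchase["payment_id"])] = now or datetime.utcnow()
            return True
        return False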
Although in the above examples trusted edges are described between devices and payment forms, fraud detection computing device 102 can generate trusted edges between other items as well. For example, fraud detection computing device 102 can determine trusted edges between a payment form and a store based on the last time the customer used the payment form at the store. In other examples, fraud detection computing device 102 can generate trusted edges between a customer (e.g., a customer ID) and a device (e.g., based on when the customer last used the device to make a purchase), a customer and a payment form (e.g., based on when the customer last used the payment form to make a purchase), a customer and a home address (e.g., based on when the customer last changed their home address in a user profile), a customer and a store location (e.g., based on when the customer last visited the store to make a purchase), and a customer and a phone number (e.g., based on when the customer last updated their phone number in a user profile), for example.
In some examples, fraud detection computing device 102 assigns values (e.g., weights) to trusted edges. For example, fraud detection computing device 102 may assign a trusted edge between a customer and a device a higher weight than to a trusted edge between a customer and a store. The machine learning process may apply the weights to the trusted edges in generating trust scores.
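For illustration, the following sketch assigns hypothetical weights to different trusted-edge types and collapses them into a single feature the machine learning process could consume; the weight values and edge-type names are assumptions, not values taken from the disclosure.

    # Hypothetical per-edge-type weights; a customer-to-device edge counts more
    # than a customer-to-store edge, consistent with the example above.
    EDGE_WEIGHTS = {
        "device_payment": 1.0,
        "customer_device": 0.9,
        "customer_payment": 0.9,
        "customer_address": 0.5,
        "customer_phone": 0.4,
        "customer_store": 0.3,
    }

    def weighted_edge_feature(present_edge_types):
        # Collapse the trusted-edge types present for a transaction into one weighted value.
        return sum(EDGE_WEIGHTS.get(edge_type, 0.0) for edge_type in present_edge_types)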
In some examples, store 109 and web server 104 determine whether a transaction is allowed based on the trust score. For example, transactions with a trust score above a threshold (e.g., 0.8 on a 0 to 1 scale) may be allowed, while transactions with a trust score below the threshold are denied. In some examples, denied transactions may be subsequently allowed if one or more requirements are satisfied, such as scanning a payment form on a card reader, presenting one or more IDs, or any other suitable requirement.
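A minimal sketch of this thresholding decision, assuming the example 0.8 threshold on a 0-to-1 scale, could look like the following; the names are hypothetical.

    ALLOW_THRESHOLD = 0.8  # hypothetical threshold on a 0-to-1 trust scale

    def allowance_decision(trust_score, threshold=ALLOW_THRESHOLD):
        # Map a trust score to the allow / deny indication acted on by the store or
        # web server; denied transactions may still be rescued by extra requirements.
        return {"trust_score": trust_score, "allowed": trust_score >= threshold}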
Processors 201 can include one or more distinct processors, each having one or more cores. Each of the distinct processors can have the same or different structure. Processors 201 can include one or more central processing units (CPUs), one or more graphics processing units (GPUs), application specific integrated circuits (ASICs), digital signal processors (DSPs), and the like.
Processors 201 can be configured to perform a certain function or operation by executing code, stored on instruction memory 207, embodying the function or operation. For example, processors 201 can be configured to perform one or more of any function, method, or operation disclosed herein.
Instruction memory 207 can store instructions that can be accessed (e.g., read) and executed by processors 201. For example, instruction memory 207 can be a non-transitory, computer-readable storage medium such as a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), flash memory, a removable disk, CD-ROM, any non-volatile memory, or any other suitable memory.
Processors 201 can store data to, and read data from, working memory 202. For example, processors 201 can store a working set of instructions to working memory 202, such as instructions loaded from instruction memory 207. Processors 201 can also use working memory 202 to store dynamic data created during the operation of fraud detection computing device 102. Working memory 202 can be a random access memory (RAM) such as a static random access memory (SRAM) or dynamic random access memory (DRAM), or any other suitable memory.
Input-output devices 203 can include any suitable device that allows for data input or output. For example, input-output devices 203 can include one or more of a keyboard, a touchpad, a mouse, a stylus, a touchscreen, a physical button, a speaker, a microphone, or any other suitable input or output device.
Communication port(s) 209 can include, for example, a serial port such as a universal asynchronous receiver/transmitter (UART) connection, a Universal Serial Bus (USB) connection, or any other suitable communication port or connection. In some examples, communication port(s) 209 allows for the programming of executable instructions in instruction memory 207. In some examples, communication port(s) 209 allow for the transfer (e.g., uploading or downloading) of data, such as transaction data.
Display 206 can display user interface 205. User interface 205 can enable user interaction with fraud detection computing device 102. For example, user interface 205 can be a user interface for an application of a retailer that allows a customer to purchase one or more items from the retailer. In some examples, a user can interact with user interface 205 by engaging input-output devices 203. In some examples, display 206 can be a touchscreen, where user interface 205 is displayed on the touchscreen.
Transceiver 204 allows for communication with a network, such as the communication network 118 of
Fraud detection computing device 102 may execute a machine learning process (e.g., model, algorithm) based on store purchase data 302 to generate a trust score. For example, machine learning algorithm data 370, stored in database 116, may identify and characterize the machine learning process. The machine learning process may be based on decision trees, such as XGBoost, for example. Fraud detection computing device 102 may obtain machine learning algorithm data 370 from database 116, and execute the machine learning process to generate a trust score for the transaction. Fraud detection computing device 102 may then generate store trust score data 304 identifying the trust score. Store trust score data 304 may be transmitted to store 109, for example.
To generate store trust score data 304, fraud detection computing device 102 may determine trusted device data 357 and trusted payment form data 358 for a customer based on store purchase data 302. Trusted device data 357 and trusted payment form data 358 may be linked to a customer via a customer ID, for example. Fraud detection computing device 102 may identify the customer based on a customer ID identified by store purchase data 302, and obtain trusted device data 357 and trusted payment form data 358 for the customer from database 116.
Fraud detection computing device 102 may then execute the machine learning process to determine whether the device and the payment form being used for the purchase identified by store purchase data 302 are trusted to the customer. For example, fraud detection computing device 102 can determine whether trusted device data 357 for the customer includes the device, and whether trusted payment form data 358 for the customer includes the payment form. Further, fraud detection computing device 102 may determine if trusted device data 357 and trusted payment form data 358 indicate a trusted edge linking the device and the payment form.
If fraud detection computing device 102 determines the device and the payment form are trusted to the customer, fraud detection computing device 102 generates store trust score data 304 indicating that the transaction is trusted. For example, on a scale of 0 to 1, inclusive (where 0 indicates no trust and 1 indicates full trust), fraud detection computing device 102 may generate a score of 1.
If, however, fraud detection computing device 102 determines that the device and the payment form are not trusted to the customer, the machine learning process may further execute to generate store trust score data 304. For example, fraud detection computing device 102 may generate features based on customer data 350 for the customer identified by store purchase data 302. Customer data 350 may include, for example, a customer ID 352 (e.g., a customer name, an ID number, online ID, etc.), store history data 354 identifying historical in-store purchase transactions for the customer, and online history data 356 identifying online purchase transactions for the customer. Store history data 354 and online history data 356 may also include labelled data, such as previously identified trusted transactions for the customer, and chargebacks associated with previous transactions, for example. In some examples, customer data 350 includes one or more of user profile change data, device data, payment data, product risk data, network data, and geospatial data (e.g., physical location of a store the customer has visited, billing address, etc.). In some examples, fraud detection computing device 102 further generates features based on store purchase data 302.
Based on the generated features, fraud detection computing device 102 may execute the machine learning process to generate store trust score data 304 for the transaction. Upon receiving store trust score data 304, store 109 may determine whether to allow the transaction. For example, store 109 may allow the transaction if the trust score identified by store trust score data 304 is at or above a threshold (e.g., 0.8 on a 0 to 1 scale, inclusive). If, however, the trust score is below the threshold, store 109 may deny the transaction. In some examples, store 109 may allow the transaction if one or more requirements are met. For example, store 109 may allow the transaction if a customer ID is presented, or if the customer uses a different form of payment.
In some examples, if the customer attempted to pay with a payment form (e.g., a credit card) via an application executing on a computing device, such as first customer computing device 110, and the transaction was denied, store 109 may allow the transaction if the customer instead swipes the payment form on a card reader.
In some examples, store trust score data 304 identifies whether the transaction is to be allowed. For example, fraud detection computing device 102 may determine if the generated trust score is above the threshold. If the generated trust score is at or above the threshold, fraud detection computing device 102 generates store trust score data 304 identifying that the transaction is to be allowed. If, however, the generated trust score is below the threshold, fraud detection computing device 102 generates store trust score data 304 identifying that the transaction is not to be allowed. Store 109 may then allow or disallow the transaction based on store trust score data 304.
Similarly, fraud detection computing device 102 can receive from a web server 104, such as a web server hosting a retailer's website, online purchase data 310 identifying the purchase attempt of one or more items from the website. For example, web server 104 may receive purchase request data 306 from customer computing device 112, where purchase request data 306 identifies an attempt to purchase one or more items from a website, such as a retailer's website. Web server 104 may generate online purchase data 310 based on purchase request data 306. For example, online purchase data 310 may include one or more of the following: an identification of one or more items being purchased; an identification of the customer (e.g., customer ID, a user name, a driver's license number, etc.); an identification of a device (e.g., a computer, mobile phone, etc.) being used for the purchase (e.g., a device ID, a user name for an application running on the device, a MAC address, etc.); a monetary amount (e.g., price) of each item being purchased; the method of payment (i.e., payment form) used to purchase the items (e.g., credit card, cash, check); a Universal Product Code (UPC) number for each item; a time and/or date; and/or any other data related to the attempted purchase transaction.
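By way of illustration only, the following sketch gives one possible shape for such online purchase data; the class name and field names are hypothetical and not part of the disclosure.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class OnlinePurchaseData:
        # Hypothetical shape of the online purchase data assembled by the web
        # server from a purchase request; field names are illustrative only.
        customer_id: str
        device_id: str
        payment_form: str                        # e.g., "credit_card"
        item_upcs: List[str] = field(default_factory=list)
        item_prices: List[float] = field(default_factory=list)
        timestamp: Optional[str] = None          # ISO-8601 time/date of the attempt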
Fraud detection computing device 102 may execute the machine learning process based on online purchase data 310 to generate a trust score. For example, fraud detection computing device 102 may obtain machine learning algorithm data 370 from database 116, and execute the machine learning process to generate a trust score for the transaction. Fraud detection computing device 102 may then generate online trust score data 312 identifying the trust score. Online trust score data 312 may be transmitted to web server 104, for example. Web server 104 may generate purchase response data 308 identifying the trust score, and may transmit purchase response data 308 to customer computing device 112 in response to receiving purchase request data 306.
To generate online trust score data 312, fraud detection computing device 102 may determine trusted device data 357 and trusted payment form data 358 for the customer based on online purchase data 310. Trusted device data 357 and trusted payment form data 358 may be linked to a customer via a customer ID or user name, for example. Fraud detection computing device 102 may identify the customer based on a customer ID identified by online purchase data 310, and obtain trusted device data 357 and trusted payment form data 358 for the customer from database 116.
Fraud detection computing device 102 may then execute the machine learning process to determine whether the device and the payment form being used for the purchase identified by online purchase data 310 are trusted to the customer. If fraud detection computing device 102 determines the device and the payment form are trusted to the customer, fraud detection computing device 102 generates online trust score data 312 indicating that the transaction is trusted.
If, however, fraud detection computing device 102 determines that the device and the payment form are not trusted to the customer, the machine learning process may further execute to generate online trust score data 312. For example, fraud detection computing device 102 may generate features based on customer data 350 for the customer identified by online purchase data 310. Based on the generated features, fraud detection computing device 102 may execute the machine learning process to generate online trust score data 312 for the transaction. Upon receiving online trust score data 312, web server 104 may determine whether to allow the transaction. For example, web server 104 may allow the transaction if the trust score identified by online trust score data 312 is at or above a threshold. If, however, the trust score is below the threshold, web server 104 may deny the transaction.
In some examples, web server 104 may allow the transaction if one or more requirements are met. For example, web server 104 may allow the transaction if the customer provides additional information, such as a driver's license number, or uses a different form of payment. In some examples, the customer may complete the payment at a store, such as store 109, where the customer may be required to present a customer ID, or swipe the payment form on a card reader.
In some examples, online trust score data 312 identifies whether the transaction is to be allowed. For example, fraud detection computing device 102 may determine if the generated trust score is above the threshold. If the generated trust score is at or above the threshold, fraud detection computing device 102 generates online trust score data 312 identifying that the transaction is to be allowed. If, however, the generated trust score is below the threshold, fraud detection computing device 102 generates online trust score data 312 identifying that the transaction is not to be allowed. Web server 104 may then allow or disallow the transaction based on online trust score data 312.
Customer determination engine 410 may receive a request to determine whether a transaction, such as a purchase transaction, is to be trusted. For example, customer determination engine 410 can receive store purchase data 302 from store 109. Customer determination engine 410 can also receive online purchase data 310 from web server 104. Customer determination engine 410 may identify and obtain, from database 116, one or more of trusted device data 357, trusted payment form data 358, and customer data 350 for a customer associated with store purchase data 302 or online purchase data 310.
Machine learning engine 406 can receive request data (e.g., store purchase data 302 and online purchase data 310), as well as trusted device data 357 and trusted payment form data 358, from customer determination engine 410. Machine learning engine 406 may then execute one or more machine learning processes to generate a trust score for the transaction. For example, machine learning engine 406 may determine whether trusted device data 357 and trusted payment form data 358 identify a trusted edge between a device and a payment form used for the transaction. If machine learning engine 406 determines that trusted device data 357 and trusted payment form data 358 identify a trusted edge between the device and the payment form, machine learning engine 406 generates trust score data 407 identifying a trust score that indicates that the transaction is to be trusted (e.g., a 1 on a 0 to 1 scale). Trust score data 407 is provided to allowance determination engine 408.
If, however, machine learning engine 406 determines that trusted device data 357 and trusted payment form data 358 do not identify a trusted edge between the device and the payment form, machine learning engine 406 may transmit a feature data request 405 to feature determination engine 402.
To generate features, feature determination engine 402 may obtain the data from customer determination engine 410, and generate one or more features. Feature determination engine 402 may execute, for example, a feature extraction algorithm based on the obtained data, and generate feature data 403 identifying the extracted features.
Machine learning engine 406 may obtain feature data 403 from feature determination engine 402, and execute a machine learning process to generate trust score data 407. For example, machine learning engine 406 may provide the feature data as input to a machine learning algorithm, and may execute the machine learning algorithm. The machine learning algorithm may be based on decision trees, such as one based on XGBoost. Execution of the machine learning algorithm can result in generation of a trust score. Machine learning engine 406 may transmit trust score data 407, identifying the trust score, to allowance determination engine 408.
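A minimal sketch of this scoring step, assuming an XGBoost model saved as in the hypothetical training sketch above and features supplied as a dictionary, might look like the following; the model path and column handling are assumptions.

    import pandas as pd
    import xgboost as xgb

    def score_features(feature_data, model_path="trust_model.json"):
        # Load the previously trained decision-tree model (hypothetical path) and
        # return a probability-like trust score for one transaction's feature dict.
        model = xgb.XGBClassifier()
        model.load_model(model_path)
        frame = pd.DataFrame([feature_data])   # single-row frame; training column names assumed
        return float(model.predict_proba(frame)[0][1])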
Allowance determination engine 408 may receive trust score data 407, and provide a response to store purchase data 302 or online purchase data 310 based on trust score data 407. For example, assuming store purchase data 302 was received by customer determination engine 410, allowance determination engine 408 may generate store trust score data 304 identifying the trust score received in trust score data 407. Store trust score data 304 may be a message that includes the trust score, where the message is formatted for transmission through a particular communication channel.
In some examples, allowance determination engine 408 determines whether the trust score is beyond a threshold. For example, allowance determination engine 408 may determine if the trust score is at or above the threshold. If the trust score is at or above the threshold, allowance determination engine 408 generates store trust score data 304 identifying that the transaction is to be allowed. If, however, the generated trust score is below the threshold, allowance determination engine 408 generates store trust score data 304 identifying that the transaction is not to be allowed. Store 109 may then allow or disallow the transaction based on store trust score data 304.
Similarly, and assuming online purchase data 310 was received by customer determination engine 410, allowance determination engine 408 may generate online trust score data 312 identifying the trust score received in trust score data 407. Online trust score data 312 may be a message that includes the trust score, where the message is formatted for transmission through a particular communication channel, such as over the internet.
In some examples, allowance determination engine 408 determines whether the trust score is beyond a threshold. For example, allowance determination engine 408 may determine if the trust score is at or above the threshold. If the trust score is at or above the threshold, allowance determination engine 408 generates online trust score data 312 identifying that the transaction is to be allowed. If, however, the generated trust score is below the threshold, allowance determination engine 408 generates online trust score data 312 identifying that the transaction is not to be allowed. Web server 104 may then allow or disallow the transaction based on online trust score data 312.
In
In addition, to determine if a transaction is to be trusted, the machine learning process may consider one or more trust associations to a customer, such as customer 502. For example, the machine learning process may generate a trust score for a transaction based on a device and payment form being used for the transaction, as well as other trust associations for the customer. For example, the machine learning process may generate a trust score of 0.8 for a customer using a trusted device and trusted payment form in a current transaction, but a trust score of 0.9 for a customer that, in addition to using a trusted device and trusted payment form in a current transaction, also has an additional trusted payment form (e.g., one that is not being used in the current transaction).
At step 1006, a determination is made as to whether the device and the payment form identified in the purchase data are trusted based on the obtained trusted device data and trusted payment data. For example, fraud detection computing device 102 may determine whether the obtained trusted device data 357 and trusted payment form data 358 identify a trusted edge between the device and the payment form. If the device and the payment form are trusted, the method proceeds to step 1008, where a relatively high trust score is generated. For example, fraud detection computing device 102 may generate a trust score that indicates the purchase is to be allowed.
Otherwise, if at step 1006, the device and the payment form are not trusted, the method proceeds to step 1010. At step 1010, a trained machine learning process is executed. The trained machine learning process may be based on decision trees, for example, and may be trained with labelled historical purchase transaction data. The trained machine learning process operates on the purchase data to generate a trust score.
From steps 1008 and 1010, the method proceeds to step 1012, where the generated trust score is transmitted to the computing device. The method then ends.
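Tying these steps together, the following sketch (reusing the hypothetical helpers from the earlier sketches) illustrates steps 1006 through 1010 under the same assumptions; transmission of the score (step 1012) is omitted.

    def generate_trust_score(purchase, edge_store, extract_features, score_features):
        # Step 1006: a trusted device/payment edge yields a relatively high trust
        # score directly (step 1008); otherwise features are extracted and the
        # trained machine learning process is executed (step 1010).
        if trusted_edge_exists(purchase["device_id"], purchase["payment_id"], edge_store):
            return 1.0
        features = extract_features(purchase)
        return score_features(features)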
The method proceeds to step 1106, where the trained machine learning process is executed. The trained machine learning process operates on the purchase data to generate a trust score.
Proceeding to step 1108, a determination is made as to whether the trust score is beyond a threshold. For example, a determination may be made as to whether the trust score is at or above the threshold. If the trust score is beyond the threshold, the method proceeds to step 1110, where trust score data is generated indicating that the transaction is to be allowed. If at step 1108, however, the trust score is not beyond the threshold (e.g., below the threshold), the method proceeds to step 1112. At step 1112, trust score data is generated indicating that the transaction is not to be allowed.
From each of steps 1110 and 1112, the method proceeds to step 1114. At step 1114, the trust score is transmitted to the computing device. The method then ends.
Although the methods described above are with reference to the illustrated flowcharts, it will be appreciated that many other ways of performing the acts associated with the methods can be used. For example, the order of some operations may be changed, and some of the operations described may be optional.
In addition, the methods and system described herein can be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes. The disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine-readable storage media encoded with computer program code. For example, the steps of the methods can be embodied in hardware, in executable instructions executed by a processor (e.g., software), or a combination of the two. The media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD-ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium. When the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method. The methods may also be at least partially embodied in the form of a computer into which computer program code is loaded or executed, such that, the computer becomes a special purpose computer for practicing the methods. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits. The methods may alternatively be at least partially embodied in application specific integrated circuits for performing the methods.
The foregoing is provided for purposes of illustrating, explaining, and describing embodiments of these disclosures. Modifications and adaptations to these embodiments will be apparent to those skilled in the art and may be made without departing from the scope or spirit of these disclosures.