Behavior analysis using distributed representations of event data

Information

  • Patent Grant
  • Patent Number
    11,151,468
  • Date Filed
    Thursday, June 30, 2016
  • Date Issued
    Tuesday, October 19, 2021
Abstract
The features relate to artificial intelligence directed detection of user behavior based on complex analysis of user event data including language modeling to generate distributed representations of user behavior. Further features are described for reducing the amount of data needed to represent relationships between events such as transaction events received from card readers or point of sale systems. Machine learning features for dynamically determining an optimal set of attributes to use as the language model as well as for comparing current event data to historical event data are also included.
Description
BACKGROUND
Field

The present developments relate to artificial intelligence systems and methods, specifically to predictive systems and methods for fraud risk and behavior-based marketing using distributed representations of event data.


Description of Related Art

With the advent of modern computing devices, the ways in which users use electronic devices to interact with various entities have dramatically increased. Every event a user performs, whether making a small purchase at a grocery store, logging into a website, checking a book out of a library, driving a car, making a phone call, or exercising at the gym, leaves a digital footprint of the user's interactions that can be tracked. The quantity of event data collected for just one user can be immense. The enormity of the data may be compounded by the number of users connected and the increasing number of event types that are made possible through an increasing number of event sources and entities.


Accordingly, improved systems, devices, and methods for accurately and efficiently identifying fraud risk based on event data are desirable.


SUMMARY

The features relate to artificial intelligence directed detection of user behavior based on complex analysis of user event data including language modeling to generate distributed representations of user behavior. Further features are described for reducing the amount of data needed to represent relationships between events such as transaction events received from card readers or point of sale systems. Machine learning features for dynamically determining an optimal set of attributes to use as the language model as well as for comparing current event data to historical event data are also included.


The systems, methods, and devices of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.


In one innovative aspect, a computer-implemented method of artificial intelligence guided monitoring of event data is provided. The method includes several steps performed under control of one or more computing devices configured with specific computer-executable instructions. The method includes accessing, from a data store, a sequence of event records associated with a user, the sequence of event records indicating a history of events for the user. The method includes identifying a set of attributes of an event record to represent the event records. The method includes generating a model to provide a vector representing an event included in the history of events using values for the set of attributes of the sequence of event records. A first vector representing a first event at a first time indicates a higher degree of similarity to a second vector representing a second event at a second time than to a third vector representing a third event at a third time. A first difference between the first time and the second time is less than a second difference between the first time and the third time. The method includes receiving, from an event processing device, a candidate event for the user. The method includes generating a candidate event vector using the model and the candidate event. The method includes identifying a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector representing a prior event. The method includes providing an indication of the behavior anomaly for the candidate event for the user.


In some implementations of the method, identifying the behavior anomaly may further include identifying a set of prior events representing past behavior of the user, the set of prior events including the prior event. The method may include generating the degree of similarity between the candidate event vector and vectors representing the set of prior events. For example, the degree of similarity may be generated using a mean of similarity values between the candidate event vector and the vectors representing the set of prior events. In some implementations, the degree of similarity may be generated using a maximum or a minimum similarity value between the candidate event vector and the vectors representing the set of prior events.


In some implementations, the method includes generating the vectors representing each of the set of prior events. The method may include generating a composite vector for the user based on the vectors representing the set of prior events, wherein the degree of similarity is generated using an exponentially weighted moving average.


In some implementations, the method includes receiving anomaly indicators for the set of prior events and generating an anomaly model that combines similarity metrics of the set of prior events to generate an output determination for a prior event corresponding to an anomaly indicator for the prior event. The method may further include generating similarity metrics for the candidate event, the similarity metrics indicating degrees of similarity between the candidate event and at least one of the prior events included in the set of prior events, the similarity metrics including the degree of similarity between the candidate event vector and a vector representation of one of the prior events included in the set of prior events. The method may also include generating the indication of the behavior anomaly using the similarity metrics and the anomaly model.


In some implementations, the event processing device comprises a card reading device, and receiving the candidate event for the user includes receiving, from the card reading device, an authorization request including the candidate event, and wherein providing the indication of the behavior anomaly comprises providing an authorization response indicating the candidate event is unauthorized.


Some implementations of the method include receiving a third-party behavior score for the user from a third-party behavior scoring system, wherein identifying the behavior anomaly is further based at least in part on the third-party behavior score.


In another innovative aspect, a computer-implemented method of artificial intelligence guided monitoring of event data is provided. The method may be performed under control of one or more computing devices configured with specific computer-executable instructions. The instructions may cause the one or more computing devices to perform the method including receiving, from an event processing device, a candidate event for a user, generating a candidate event vector using a model and the candidate event, identifying a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector for a prior event, and providing an indication of the behavior anomaly for the candidate event for the user.


Some implementations of the method include accessing, from a data store, a sequence of event records associated with the user, the sequence of event records indicating a history of events for the user, identifying a set of attributes of an event record to represent the event records, and generating a model to provide the vector of the prior event included in the history of events using values for the set of attributes of the sequence of event records. Some implementations of the method include receiving a third-party behavior score for the user from a third-party behavior scoring system, wherein identifying the behavior anomaly is further based at least in part on the third-party behavior score.


Providing the indication of the behavior anomaly may include providing an authorization response indicating a transaction associated with the candidate event is unauthorized, wherein the authorization response causes configuration of the event processing device to acquire additional event information to authorize the candidate event. In some implementations, the authorization response may cause configuration of the event processing device to acquire additional event information to authorize the transaction associated with the candidate event.


In a further innovative aspect, an artificial intelligence event monitoring system is provided. The system includes an electronic data processing device comprising instructions stored on a computer readable medium that, when executed by the electronic data processing device, cause the electronic data processing device to receive, from an event processing device, a candidate event for a user, generate a candidate event vector using a model and the candidate event, identify a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector for a prior event, and provide an indication of the behavior anomaly for the candidate event for the user.


The computer readable medium may store additional instructions that cause the electronic data processing device to access, from a data store, a sequence of event records associated with the user, the sequence of event records indicating a history of events for the user, identify a set of attributes of an event record to represent the event records, and generate a model to provide the vector of the prior event included in the history of events using values for the set of attributes of the sequence of event records.


In a further innovative aspect, a computer-implemented method of artificial intelligence guided content provisioning is provided. The method includes, under control of one or more computing devices configured with specific computer-executable instructions, accessing, from a data store, a sequence of event records associated with a user, the sequence of event records indicating a history of events for the user. The method includes identifying a set of attributes of an event record to represent the event records. The method includes generating a model to provide a vector representation of an event included in the history of events using values for the set of attributes of the sequence of event records. A first vector representation of a first event at a first time indicates a higher degree of similarity to a second vector representation of a second event at a second time than to a third vector representation of a third event at a third time. The difference between the first time and the second time is less than the difference between the first time and the third time. The method includes receiving a desired event related to a content item to be provided. The method further includes generating a candidate event vector representation for the desired event using the model and the desired event. The method includes identifying an event record having at least a predetermined degree of similarity with the desired event and providing the content item to a user associated with the event record.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates high dimensional space modeling using examples of words in the English language.



FIG. 2 shows a process flow diagram of a method for generating distributed representations of transactions.



FIGS. 3 and 4 illustrate configurations of neural networks which may be used to generate the distributed representations as used in some embodiments.



FIG. 5 shows an example recurrent neural network.



FIG. 6 illustrates a method of comparing new transactions to previous user transactions.



FIG. 7 illustrates a plot of experimental detection performance of an attribute similarity score.



FIG. 8 illustrates a plot of experimental detection performance for three different methods of detection.



FIG. 9A shows a geospatial fraud map.



FIG. 9B shows an alternative geospatial fraud map.



FIG. 9C shows yet another geospatial fraud map.



FIG. 10 illustrates a plot of experimental detection performance for nine different methods of detecting behavior abnormalities.



FIG. 11 illustrates a plot of experimental detection performance for four different methods of detecting behavior abnormalities using unsupervised learning.



FIG. 12 illustrates a plot of experimental detection performance for a combination of variables based on supervised learning.



FIG. 13A shows a functional block diagram of an example behavior scoring system.



FIG. 13B shows a functional block diagram of another example behavior scoring system.



FIG. 14 shows a message flow diagram of an example transaction with behavior detection.



FIG. 15 shows a message flow diagram of an example batch transaction processing with model retraining and regeneration of user distributed representations.



FIG. 16 shows a schematic perspective view of an example card reader.



FIG. 17 shows a functional block diagram of the exemplary card reader of FIG. 16.



FIG. 18 shows a plot of merchant clusters.



FIG. 19 shows a block diagram of example components of a transaction analysis computing system 1900.





DETAILED DESCRIPTION

Disclosed herein are systems and methods of analyzing, processing, and manipulating large sets of transaction data of users in order to provide various visualizations, alerts, and other actionable intelligence to control event processing devices, user electronic communication devices, and the like, as well as to users, merchants, and others. Transaction data may include, for example, data associated with any interaction by a user device with a server, website, database, and/or other online data owned by or under control of a requesting entity, such as a server controlled by a third party. Such events may include access of webpages, submission of information via webpages, accessing a server via a standalone application (e.g., an application on a mobile device or desktop computer), login activity, Internet search history, Internet browsing history, posts to a social media platform, or other interactions between communication devices. In some implementations, the users may be machines interacting with each other (e.g., machine to machine communications). In some embodiments transaction data may include, for example, specific transactions on one or more credit cards of a user, such as the detailed transaction data that is available on credit card statements. Transaction data may also include transaction-level debit information, such as regarding debit card or checking account transactions. The transaction data may be obtained from various sources, such as from credit issuers (e.g., financial institutions that issue credit cards), transaction processors (e.g., entities that process credit card swipes at points of sale), transaction aggregators, merchant retailers, and/or any other source.


Each of the processes described herein may be performed by a transaction analysis processing system (also referred to as simply “the system,” “the transaction analysis system,” or “the processing system” herein), such as the example transaction analysis system illustrated in FIG. 19 and discussed below. In other embodiments, other processing systems, such as systems including additional or fewer components than are illustrated in FIG. 19, may be used to perform the processes. In other embodiments, certain processes are performed by multiple processing systems, such as one or more servers performing certain processes in communication with a user computing device (e.g., mobile device) that performs other processes.


As noted above, in one embodiment the transaction analysis processing system accesses transaction data associated with a plurality of users in order to generate machine learning models that can provide efficient and accurate behavior detection and predictions based on users' transaction data. It may be desirable to detect abnormal behavior (e.g., fraudulent behavior) during a transaction. Such “real-time” data allows transaction participants to receive relevant information at a specific point in time when a potentially abnormal transaction may be further verified or stopped.


Exemplary Definitions

To facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.


Transaction data (also referred to as event data) generally refers to data associated with any event, such as an interaction by a user device with a server, website, database, and/or other online data owned by or under control of a requesting entity, such as a server controlled by a third party, such as a merchant. Transaction data may include merchant name, merchant location, merchant category, transaction dollar amount, transaction date, transaction channel (e.g., physical point of sale, Internet, etc.) and/or an indicator as to whether or not the physical payment card (e.g., credit card or debit card) was present for a transaction. Transaction data structures may include, for example, specific transactions on one or more credit cards of a user, such as the detailed transaction data that is available on credit card statements. Transaction data may also include transaction-level debit information, such as regarding debit card or checking account transactions. The transaction data may be obtained from various sources, such as from credit issuers (e.g., financial institutions that issue credit cards), transaction processors (e.g., entities that process credit card swipes at points-of-sale), transaction aggregators, merchant retailers, and/or any other source. Transaction data may also include non-financial exchanges, such as login activity, Internet search history, Internet browsing history, posts to a social media platform, or other interactions between communication devices. In some implementations, the users may be machines interacting with each other (e.g., machine-to-machine communications). Transaction data may be presented in raw form. Raw transaction data generally refers to transaction data as received by the transaction processing system from a third party transaction data provider. Transaction data may be compressed. Compressed transaction data may refer to transaction data that may be stored and/or transmitted using fewer resources than when in raw form. Compressed transaction data need not be “uncompressible.” Compressed transaction data preferably retains certain identifying characteristics of the user associated with the transaction data such as behavior patterns (e.g., spend patterns), data cluster affinity, or the like.


An entity generally refers to one party involved in a transaction. In some implementations, an entity may be a merchant or other provider of goods or services to one or more users.


A model generally refers to a machine learning construct which may be used by the transaction processing system to automatically generate distributed representations of behavior data and/or similarity metrics between distributed representations. A model may be trained. Training a model generally refers to an automated machine learning process to generate the model that accepts transaction data as an input and provides a distributed representation (e.g., vector) as an output. When comparing distributed representations, the model may identify comparisons between two vectors for generating a similarity score indicating how similar a given vector is to another. A model may be represented as a data structure that identifies, for a given value, one or more correlated values.


A vector encompasses a data structure that can be expressed as an array of values where each value has an assigned position that is associated with another predetermined value. For example, an entity vector will be discussed below. A single entity vector may be used to represent the number of transactions for a number of users within a given merchant. Each entry in the entity vector represents the count, while the position within the entity vector may be used to identify the user with whom the count is associated. In some implementations, a vector may be a useful way to hide the identity of a user but still provide meaningful analysis of their transaction data. In the case of entity vectors, as long as the system maintains a consistent position for information related to a user within the vectors that include user data, analysis can be performed without identifying the user by using positional information within the vectors. Other vectors may be implemented wherein the entries are associated with transaction categories or other classes of transaction data.
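As an illustrative sketch of this positional encoding (the user-to-position mapping and counts below are hypothetical, not taken from the patent):

```python
import numpy as np

# Position i in the entity vector holds the transaction count for user i at
# one merchant; the user-to-position mapping must stay consistent across all
# entity vectors so positional analysis works without identifying users.
user_positions = {"user_a": 0, "user_b": 1, "user_c": 2}

merchant_vector = np.zeros(len(user_positions))
merchant_vector[user_positions["user_b"]] += 1  # record one transaction
print(merchant_vector)  # [0. 1. 0.]
```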


The term machine learning generally refers to automated processes by which received data is analyzed to generate and/or update one or more models. Machine learning may include artificial intelligence such as neural networks, genetic algorithms, clustering, or the like. Machine learning may be performed using a training set of data. The training data may be used to generate the model that best characterizes a feature of interest using the training data. In some implementations, the class of features may be identified before training. In such instances, the model may be trained to provide outputs most closely resembling the target class of features. In some implementations, no prior knowledge may be available for training the data. In such instances, the model may discover new relationships for the provided training data. Such relationships may include similarities between data elements such as entities, transactions, or transaction categories as will be described in further detail below. Such relationships may include recommendations of entities for a user based on past entities the user has transacted with.


A recommendation encompasses information identified that may be of interest to a user having a particular set of features. For example, a recommendation may be developed for a user based on a collection of transaction data associated with the user and through application of a machine learning process comparing that transaction data with third-party transaction data (e.g., transaction data of a plurality of other users). A recommendation may be based on a determined entity and may include other merchants related to the determined merchant. In some implementations, the recommendation may include recommendation content. The recommendation content may be text, pictures, multimedia, sound, or some combination thereof. The recommendation content may include information related to merchants or categories of merchants identified for a given user. In some implementations, the recommendation may include a recommendation strength. The strength may indicate how closely the recommendation matches user preferences as indicated by the provided transaction data features (e.g., transaction category, number of transactions within a category, date of transaction, etc.). For example, a user may have a very obscure set of features for which there are few recommendations, and of the recommendations that can be generated using the models, the strength is lower than for another user who has more readily ascertainable features. As such, the strength may be included to allow systems receiving the recommendation to decide how much credence to give the recommendation.


A message encompasses a wide variety of formats for communicating (e.g., transmitting or receiving) information. A message may include a machine readable aggregation of information such as an XML document, fixed field message, comma separated message, or the like. A message may, in some implementations, include a signal utilized to transmit one or more representations of the information. While recited in the singular, a message may be composed, transmitted, stored, received, etc. in multiple parts.


The terms determine or determining encompass a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” may include resolving, selecting, choosing, establishing, and the like.


The term selectively or selective may encompass a wide variety of actions. For example, a “selective” process may include determining one option from multiple options. A “selective” process may include one or more of: dynamically determined inputs, preconfigured inputs, or user-initiated inputs for making the determination. In some implementations, an n-input switch may be included to provide selective functionality where n is the number of inputs used to make the selection.


The terms provide or providing encompass a wide variety of actions. For example, “providing” may include storing a value in a location for subsequent retrieval, transmitting a value directly to a recipient, transmitting or storing a reference to a value, and the like. “Providing” may also include encoding, decoding, encrypting, decrypting, validating, verifying, and the like.


A user interface (also referred to as an interactive user interface, a graphical user interface or a UI) may refer to a web-based interface including data fields for receiving input signals or providing electronic information and/or for providing information to the user in response to any received input signals. A UI may be implemented in whole or in part using technologies such as HTML, Flash, Java, .net, web services, and RSS. In some implementations, a UI may be included in a stand-alone client (for example, thick client, fat client) configured to communicate (e.g., send or receive data) in accordance with one or more of the aspects described.


Introduction


This document provides a description of novel systems and methods for detecting abnormal behavior based on transactional data. The application areas of such methodology include fraud detection (users, consumers, merchants, personnel, etc.), targeted marketing (user, consumer, and business), and credit/attrition risk prediction. The transactions can be any type of records describing the activity of users or businesses, for example, specific transactions on one or more plastic (credit or debit) cards of a user, such as the detailed transaction data that is available on card statements. Other examples include, but are not limited to, online web click streams and mobile phone location/activity. While the example embodiments discussed herein are generally directed toward the use of credit card transactions made by users, the systems and methods disclosed herein are not limited to such embodiments, and may be implemented using a variety of data sources.


Abnormal behavior in credit card transaction usage may indicate that the credit card is being used by someone who is not an authorized user, and thus can point to fraudulent usage of the card. In addition, for marketing types of applications, detection of abnormal behavior can indicate that there is either a short-term behavior change, such as travel or vacationing, or a long-term life-stage change, such as marriage, graduation, or a family with a newborn, that causes the shift in behavior. In some embodiments, marketers may use the information about changed behaviors to offer different types of products to the user that better suit his/her new needs. Furthermore, for credit risk types of applications, a shift in behavior can be associated with a higher level of risk so that a strategy can be devised to mitigate the new risk. The same technique can also be used to identify users that are similar to each other, or that have preferences for or dislikes of combinations of certain types of merchants, so that such information can also be used to perform targeted marketing.


In some embodiments, the systems and methods disclosed herein may use concepts from computational linguistics and neural networks to represent the transactions in a distributed sense. For example, the transactions may be represented as high-dimensional vectors (such as 200-300 dimensions). Distributed representation of the transactions may encode the transactions as well as their relations with other transactions. Such encoding of the transactions and relationships may provide the following non-limiting advantages:


(1) It allows the models built on the representation to generalize to unseen but similar patterns;


(2) It provides a natural way of calculating the similarity among transactions; and


(3) It requires significantly less storage (by several orders of magnitude) to encode the similarity relation as compared to alternative methods, thus enabling near-real-time look-up of similar transactions.


In some embodiments of the systems and methods disclosed herein, the unsupervised nature of the machine learning techniques employed allows for fraud detection, target marketing, and credit/attrition risk prediction without requiring prior knowledge of the ‘tag’ or ‘label’ for each of the transactions used. This provides the benefit of removing the collection of such ‘tag’ data, which can be costly and time-consuming. Thus, the systems and methods disclosed herein provide a solution to jump-start the prediction without needing to wait for the collection to complete.


Distributed Representation of Transactions and their Similarity


Abnormal behaviors may be defined as activity that is not normally seen in the user's transaction patterns. For example, systems may identify abnormal activities as those that are considered to be dissimilar to the user's normal activities. These dissimilar activities may not be identified by direct comparison to the user's previous activities (as typically done in other systems). Instead, activity ‘similar’ to the user's past activity may be considered to be normal. Furthermore, the ‘similarity’ may be learned from the behavior of pools of users, by observing historically how various activities are associated with each other. In some embodiments, the systems disclosed herein may define the similarity between transactions as how likely these transactions are to be conducted by the same individual, potentially within a pre-defined timeframe.


In some embodiments, the similarities of transactions are generated using concepts similar to those used in computational linguistics. In particular, some embodiments use the language model, which aims to learn statistically how words appear in sentences. The language model utilizes the fact that words do not appear together randomly in real life; rather, words are put together according to grammatical rules. Analogizing this concept to transactions, users tend to shop at similar stores and purchase goods per their preferences and tastes. Therefore, many of the techniques in the language model can be applied in this area.


In some embodiments, systems and methods disclosed herein use a novel representation for the calculation and storage of the ‘transaction activity’, specifically the attributes of the transactions. A transaction is usually described by several attributes. For credit card transactions, transaction attributes may include: transaction date/time, transaction amount, merchant's method of accepting the card (e.g., swiped, keyed, or internet), merchant location, merchant identification (name and ID), merchant category code (MCC, SIC, etc.), other ‘derived’ attributes that provide refined or composite information of these attributes, and/or the like. Instead of representing the activity by its names/tokens, such as “San Diego, Walmart,” by its numerical values, such as a dollar amount of $18.50, a time of 10:35 am, or a date of Apr. 4, 2009, or based on other attributes detected during the activity, the activity can be projected into a high dimension vector of values. One example of this high dimension vector of values may be a series of numbers. For example, in some embodiments, transactions are represented as vectors whereby a vector includes an ordered series of values between −1.0 and 1.0. Each value within the vector may be used to indicate a value summarizing one or more transaction attributes. This may provide the benefit that any type of feature included in transaction data for an activity may be incorporated into the distributed representation (e.g., a vector). Furthermore, composite features can also be represented as such a vector. For example, a composite feature may indicate co-occurrence of more than one feature in the transaction data. For example, a vector representation can be provided to indicate a transaction for shopping during lunch hour at a department store.


The vectors representing similar transactions, based on any transaction attributes obtained by the system, are generated to be close to each other in the high dimensional space. FIG. 1 illustrates high dimensional space modeling using examples of words in the English language. In implementations using transaction data, each transaction would be analogized to a word. For simplicity, the modeled entities (e.g., the words in FIG. 1) are shown on three-dimensional axes, although the vectors are in much higher dimensions. The closeness of the vectors can be measured by their cosine distance. One expression of cosine distance for two vectors (A) and (B) is shown in Equation 1.









similarity = cos(θ) = (A · B) / (‖A‖ · ‖B‖)  Equation 1
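For illustration, Equation 1 can be computed directly; a minimal sketch using NumPy follows (the vectors and their dimensionality are assumptions for demonstration):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Equation 1: similarity = (A . B) / (||A|| * ||B||)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two hypothetical 200-dimensional transaction vectors.
rng = np.random.default_rng(0)
txn_a = rng.normal(size=200)
txn_b = rng.normal(size=200)
print(cosine_similarity(txn_a, txn_b))  # value in [-1.0, 1.0]
```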








Generating Distributed Representations


In some embodiments, the distributed vector representation and the similarity among transaction activity can be learned by many different approaches, including matrix factorization and the like. One embodiment uses a neural network that learns to map a transaction's attributes to the vector representation, and simultaneously learns to embed the ‘similarity’ among the transactions in the representation of such vectors.



FIG. 2 shows a process flow diagram of a method for generating distributed representations of transactions. The method 200 may be implemented in whole or in part by one or more electronic devices such as the devices described in FIG. 19.


At block 202, a sequence of event records associated with a user is accessed. The sequence of event records may be stored in a data store, such as a relational database. Using a standards-based communication (e.g., structured query language messages over TCP/IP), the data store may be queried for event records relating to a user. For example, the data store may associate event records with an identifier for a user. This identifier may be used to index the event records, thereby providing an efficient way to retrieve event records for a specific user. The event records accessed from the data store indicate a historical record of events for the specified user. The event records may include time and/or date information indicating when the event occurred. This can be used to order the event records chronologically.
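A minimal sketch of how such a query might look (the `events` table schema and column names here are illustrative assumptions, not defined by the patent):

```python
import sqlite3

def load_event_records(conn: sqlite3.Connection, user_id: str) -> list:
    """Fetch a user's event records ordered chronologically.

    Assumes a hypothetical `events` table indexed on `user_id`, with an
    `event_time` column used to order the history of events.
    """
    cur = conn.execute(
        "SELECT event_time, merchant_id, mcc, amount "
        "FROM events WHERE user_id = ? ORDER BY event_time",
        (user_id,),
    )
    return cur.fetchall()
```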


At block 204, a set of attributes of an event record are identified to use for representing the event record. Returning to the analogy with linguistic analysis, at block 204 the ‘words’ are selected for the transactions such that linguistic models may be applied. There may be several fields of data in each event record. As discussed above, transactions may have various attributes. In various embodiments, the ‘word’ representing a transaction may be the merchant's name, the merchant's classification or category (for example, the MCC, SIC, and/or the like), the time of the transaction, the place of the transaction, other attributes, a combination of these attributes, or derivative/composite attributes. For example, in some embodiments, each transaction may be treated as a word indicating the MCC of the merchant, may be treated as the MCC and amount of a transaction, or may be treated as the MCC, amount, and location of a transaction. In some embodiments, various attributes may be discretized for improved analysis. For example, the amount of a transaction may be represented as ranges, such as $100 increments, and the time of day may be represented as only the hour a transaction was made, as opposed to using the minute and/or second a transaction occurred. The ‘word’ used to categorize a transaction may be set such that an appropriate number of words is used by the system for generating the representations and for later use of the representations, as well as based on the similarities that are desired in the analysis.
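A sketch of one possible ‘word’ derivation with discretized amount and hour (the attribute choices and token format are illustrative assumptions):

```python
def transaction_word(mcc: str, amount: float, hour: int) -> str:
    """Map a transaction record to a discrete 'word' for language modeling.

    The amount is bucketed into $100 increments and the time of day is
    reduced to the hour, as described above.
    """
    bucket = int(amount // 100) * 100  # e.g., $18.50 falls in the 0-100 bucket
    return f"MCC{mcc}|AMT{bucket}-{bucket + 100}|HR{hour:02d}"

print(transaction_word("5411", 18.50, 10))  # MCC5411|AMT0-100|HR10
```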


The set of attributes may be selected such that relationships between individual events may be represented in a quantity of memory that is greater than a quantity of memory used to represent the distributed representation of the individual event. For example, for transactions that have multiple attributes, determining relationships between transactions may require a multidimensional database to represent the links between common transaction attributes. Such a database can be resource intensive to create, maintain, and search. This can be particularly acute in real-time implementations such as fraud detection during credit card authorization.


Once a dictionary of words to represent the transactions is selected, at block 206, the method 200 may proceed to generate a model that provides a numerical representation of an event included in the history of events. The numerical representation can be used to identify similarities between transactions. In some embodiments, generating the model may include initializing each of the words representing one or more transactions to a random vector in a high dimensional space. For example, the vectors may be in the range of 200-300 dimensions, or in higher or lower dimensional space. In some embodiments the method 200 may include normalizing the vectors, such as to a common magnitude. In some implementations this allows each vector to be represented as a unit vector. After an initial set of vectors is generated, the co-occurrence of the transactions is used to move related transactions closer to each other and unrelated transactions apart from each other. In some embodiments, this is achieved by finding the best vector representation of the transactions that maximizes the likelihood measurement of the neighboring transactions appearing together. Equation 2 below shows one expression of how the best vector representation of the transactions (w_j) can be generated.










(1/T) Σ_{t=1..T} Σ_{−c ≤ j ≤ c, j ≠ 0} log p(w_{t+j} | w_t)  Equation 2







where T is the number of event records to be processed;

    • w_t is the transaction record at point t in the set of event records;
    • c is a window of analysis defining which event records will be compared to the transaction record w_t; and
    • p(A|B) is the probability of A given B.


The output of the machine learning process may include a model comprising a set of vectors each representing a ‘word’ that represents one or more transactions in a data set.
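One way to realize this kind of training in practice is with an off-the-shelf skip-gram implementation such as gensim's Word2Vec; the sketch below treats each user's chronologically ordered transaction ‘words’ as a sentence (the toy data and parameter values are illustrative assumptions, not the patent's configuration):

```python
from gensim.models import Word2Vec

# Each inner list is one user's chronologically ordered transaction 'words'.
user_histories = [
    ["MCC5411|HR10", "MCC5812|HR12", "MCC5411|HR18"],
    ["MCC5812|HR12", "MCC4121|HR23", "MCC5812|HR13"],
]

model = Word2Vec(
    sentences=user_histories,
    vector_size=200,  # dimensionality of the distributed representation
    window=5,         # analogous to the window of analysis c in Equation 2
    min_count=1,
    sg=1,             # skip-gram: maximize log p(w_{t+j} | w_t)
)

vector = model.wv["MCC5411|HR10"]  # a 200-dimensional vector
score = model.wv.similarity("MCC5411|HR10", "MCC5812|HR12")
```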



FIGS. 3 and 4 illustrate configurations of neural networks which may be used to generate the distributed representations as used in some embodiments. In FIG. 3, information for a current transaction (W(t)) is provided as an input to the neural network. The information for the current transaction may be a distributed representation of the current transaction. Based on the information for the current transaction, information for a number of previous (e.g., W(t−n)) and subsequent (e.g., W(t+n)) transactions is generated as an output from the neural network.


In FIG. 4, information for a number of previous (e.g., W(t−n)) and subsequent (e.g., W(t+n)) transactions is provided as input to the neural network. As an output, the neural network provides information for an expected transaction at a current time (W(t)).


In some embodiments of FIG. 3 or 4, such as when the temporal order of the events may not be a prominent feature, the neural network can be trained to generate distributed representations of transactions via simple gradient descent with no hidden layer; alternatively, a more sophisticated neural network with one or more hidden layers can be used to capture the non-linear relations between the transactions. The historical data may be based on a time period during which the transactions were conducted for all users of a client (e.g., a party who wishes to detect fraud, such as a bank or payment card issuer).


In some implementations, such as when the temporal order of the events may be considered, the neural network may be a recurrent neural network, such as that depicted in FIG. 5.



FIG. 5 shows an example recurrent neural network. A recurrent neural network may be used to include consideration of the order of the transactions (e.g., temporally) when generating predicted outcomes. In FIG. 5, w(t) is a word representing the current transaction at time t, and y(t) is the next word. The next word (y(t)) may be W(t+1) as shown in FIG. 3 or W(t) as shown in FIG. 4.


In FIG. 5, s(t) is the context for the next word and s(t−1) is the context for the previous word. The neural network may learn how to combine the current word (w(t)) with the previous context (s(t−1)). U and W are functions that are trained to weight the current word (w(t)) and the previous context (s(t−1)), respectively, prior to generating the current context. Equation 3 is one expression of how the current context may be generated.

s(t)=f(U·w(t)+W·s(t−1))  Equation 3


where f(z) is an activation function for z. Equation 4 shows one example of a sigmoid activation function that may be used.










f(z) = 1 / (1 + e^{−z})  Equation 4







Having established the context for the current word (e.g., s(t)), the neural network then generates the next word y(t). As shown in FIG. 5, V is a function that is trained to weight the context to generate the next word y(t). Equation 5 is one expression of how the next word y(t) may be generated.

y(t)=g(V·s(t))  Equation 5


where g(z) is a softmax function for z, an example of which is shown in Equation 6.










g(z_k) = e^{z_k} / Σ_i e^{z_i}  Equation 6









    • where k is the index of the word.





The neural network model may include the softmax function to allow the output of the neural network model to be used as posterior probabilities for a given variable. The softmax function generally reduces a set of outputs to a series of values between 0 and 1 that sum to 1.
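A minimal NumPy sketch of one recurrent step per Equations 3 through 6 (the dimensions and random initialization are illustrative assumptions; a real model would train U, W, and V):

```python
import numpy as np

def rnn_step(w_t, s_prev, U, W, V):
    """One step of the recurrent network of FIG. 5.

    Equation 3: s(t) = f(U*w(t) + W*s(t-1)), with sigmoid f (Equation 4).
    Equation 5: y(t) = g(V*s(t)), with softmax g (Equation 6).
    """
    s_t = 1.0 / (1.0 + np.exp(-(U @ w_t + W @ s_prev)))   # Equations 3 and 4
    z = V @ s_t
    y_t = np.exp(z - z.max()) / np.exp(z - z.max()).sum()  # Equation 6
    return s_t, y_t

vocab, hidden = 1000, 100
rng = np.random.default_rng(1)
U = rng.normal(scale=0.01, size=(hidden, vocab))
W = rng.normal(scale=0.01, size=(hidden, hidden))
V = rng.normal(scale=0.01, size=(vocab, hidden))

w_t = np.zeros(vocab)
w_t[42] = 1.0  # one-hot encoding of the current word
s_t, y_t = rnn_step(w_t, np.zeros(hidden), U, W, V)  # y_t sums to 1
```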


Further details on training neural networks, such as recurrent neural networks, can be found in Herbert Jaeger's “A Tutorial on Training Recurrent Neural Networks,” GMD Report 159, German Nat'l Research Center for Info. Tech. (October 2002), the entirety of which is hereby incorporated by reference.


Fraud Detection Using Abnormality Calculations


In some embodiments, after a model is generated with transactions modeled as distributed representations, the system may use the model to determine abnormalities in a user's transactions. For example, to determine whether a user's behavior has shifted, the system may compare new transactions to previous transactions of the user based on the vector representations of the transactions. For example, to detect whether a credit card has been compromised by fraud, the system compares the new transaction to the user's previous transactions.



FIG. 6 illustrates a method of comparing new transactions to previous user transactions. Similarities may be generated between a current transaction (Txn(k+1)) and one or more previous transactions (e.g., Txn 1 through Txn k as shown in FIG. 6).


There may be various metrics which can be generated to identify whether a new transaction is outside of an expected transaction based on the similarity to previous transactions. For example, the ‘distance’ between the new transaction and previous transactions may be calculated, and the system may make decisions based on the distances. In some embodiments the distance may be measured as a cosine distance between the transaction vectors. In some embodiments, the system may analyze the similarities of previous transactions using a mean or percentile of the similarities between the current transaction (txn k+1) and all the transactions in an evaluation window. The evaluation window may specify which transactions to compare with the current transaction. For example, the evaluation window may comprise a number of transactions immediately preceding the current transaction. In some implementations, the evaluation window may identify transactions in a previous time window (e.g., a range of time).
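A sketch of such window-based metrics, assuming the transaction vectors are already unit-normalized so that a dot product equals the cosine similarity (the names and the percentile choice are illustrative):

```python
import numpy as np

def window_similarity_stats(candidate: np.ndarray, window: np.ndarray) -> dict:
    """Compare a candidate transaction vector to an evaluation window.

    `window` is a (k, d) array holding the k prior transaction vectors;
    all vectors are assumed unit length.
    """
    sims = window @ candidate  # cosine similarity to each prior transaction
    return {
        "mean": float(sims.mean()),
        "p10": float(np.percentile(sims, 10)),
        "max": float(sims.max()),
        "min": float(sims.min()),
    }
```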


In some embodiments, the system may analyze the similarities of previous transactions using a maximum or minimum of the similarities between the current transaction (txn k+1) and all the transactions in the evaluation window. In some embodiments, the system may analyze the similarities of previous transactions using a geometric mean of the similarities (scaled from (−1, 1) to (0, 1)) between the current transaction (txn k+1) and all or a portion of the transactions in the evaluation window. In some embodiments, the system may analyze the similarities of previous transactions using a similarity of the newest transaction to a vector representing the exponentially weighted moving average of the user. For example, the similarity of the vector representing the current transaction (txn k+1) to the vector representing the user's behavior may be computed, and the vector representing the user's behavior may be updated to consider the current transaction. One expression of how the vector representing the user's behavior may be updated is provided in Equation 7 below.











C_{k+1} = α · C_k + (1 − α) · T_{k+1}                if k > M
C_{k+1} = (k/(k+1)) · C_k + (1/(k+1)) · T_{k+1}      if k ≤ M       Equation 7









    • where C_{k+1} is the vector representing the client after the (k+1)-th transaction;
    • T_{k+1} is the vector representing the (k+1)-th transaction;
    • α is an exponential decay factor; and
    • M is called a maturation window size, which prevents the earlier transactions (e.g., those further in time from the current transaction) from getting a much higher weight than other transactions.





In some embodiments, depending on how the distributed representation of the transactions is generated, the vectors may not be normalized. All the vectors appearing in the equation above (e.g., C_k, T_{k+1}), however, can be normalized. If the transaction vectors are normalized, C_{k+1} should also be normalized after it is updated. In some embodiments, vectors may be normalized so that each transaction has an equal contribution in the evaluation, rather than the evaluation being dominated by transactions represented by high-magnitude vectors. In some embodiments, whether or not vectors are normalized may not substantially affect the system's performance.
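A minimal sketch of the Equation 7 update with the normalization just described (the decay factor and maturation window values are illustrative assumptions):

```python
import numpy as np

def update_user_vector(c_k: np.ndarray, t_next: np.ndarray, k: int,
                       alpha: float = 0.95, m: int = 20) -> np.ndarray:
    """Equation 7: exponentially weighted moving average of user behavior.

    A running mean is used inside the maturation window (k <= M) so early
    transactions are not over-weighted; exponential decay applies afterwards.
    The result is re-normalized to unit length, per the discussion above.
    """
    if k > m:
        c_next = alpha * c_k + (1.0 - alpha) * t_next
    else:
        c_next = (k / (k + 1)) * c_k + (1.0 / (k + 1)) * t_next
    return c_next / np.linalg.norm(c_next)
```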


As an example, in the area of plastic card fraud, the elements in the transactions that can be used to generate the distributed representation and to calculate the similarity between transactions can include, but are not limited to, one or more of:

    • merchant location (ZIP3, ZIP5, etc.)
    • merchant ID
    • merchant name
    • merchant category code (MCC)
    • standard industrial classification code (SIC)
    • transaction category code (TCC)
    • merchant category group code (MCG code)
    • transaction amount
    • point-of-sale acceptance method
    • transaction date/time, day of week, time of day, etc.
    • derivation of the aforementioned fields
    • combination of the aforementioned fields and/or their derivations


Higher-order features of the aforementioned fields can also be used, such as the differences in days, differences in amounts, percentage changes, etc. between two or more neighboring transactions. For non-card-based transactions, behavior may be detected using one or more of IP address, geo-location, network location (e.g., webpage visited), items in an electronic shopping cart, SKU numbers, or the like.


In some embodiments, composite variables may also be created by combining two or more such variables. One example combination may be a comparison of the closest similarity for a current word in the evaluation window to an average similarity for the evaluation window. An expression of this combination is shown in Equation 8 below.










max_sim(w(t)) / mean_sim(w(t))  Equation 8









    • where max_sim(w(t)) is the maximum similarity value of the set of similarity values between the current word (w(t)) and words in the evaluation window; and

    • mean_sim(w(t)) is the mean similarity of the set of similarity values between the current word (w(t)) and words in the evaluation window.





Another example combination may be a comparison of recent similarities to longer-term similarities. An expression of this combination is shown in Equation 9 below.











max_sim_x(w(t)) / mean_sim_{x+n}(w(t))  Equation 9









    • where max_sim_x(w(t)) is the maximum similarity value of the set of similarity values between the current word (w(t)) and words in a first evaluation window x (e.g., x = 30 days); and

    • mean_sim_{x+n}(w(t)) is the mean similarity of the set of similarity values between the current word (w(t)) and words in a second evaluation window that is larger than x by n (e.g., n = 60 days).
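A sketch of these two composite variables, reusing per-window similarity values (the window sizes and array inputs are illustrative assumptions):

```python
import numpy as np

def composite_variables(sims_recent: np.ndarray, sims_window: np.ndarray):
    """Composite variables per Equations 8 and 9.

    `sims_window` holds similarities between the current word w(t) and words
    in the full evaluation window (e.g., 90 days); `sims_recent` holds the
    similarities over a shorter, more recent window (e.g., 30 days).
    """
    eq8 = sims_window.max() / sims_window.mean()  # closest vs. average similarity
    eq9 = sims_recent.max() / sims_window.mean()  # recent vs. longer-term similarity
    return float(eq8), float(eq9)
```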





It is observed that, with the aforementioned measurements, a user whose card is compromised tends to have a much higher chance of low similarity (or a higher risk score) between the fraudulent transactions and the user's historical transactions, as compared to users with normal behavior.



FIG. 7 illustrates a plot of experimental detection performance of an attribute similarity score. The x-axis of the plot shown in FIG. 7 represents the score of the attribute and the y-axis represents the probability of fraud. As shown in FIG. 7, the fraud curve indicates scores for the attribute that are likely to be associated with fraudulent behavior while the non-fraud curve indicates scores that are associated with non-fraudulent behavior. The plot may be generated using historical data for one or more users and attribute scores generated from the historical data.



FIG. 8 illustrates a plot of experimental detection performance for three different methods of detection. As shown in FIG. 8, the plot represents a receiver operating characteristic curve for three behavior detection models. Specifically, FIG. 8 compares the behavior abnormality detection performance, for fraud purposes, of the word vector method described in this application with a variable commonly used in fraud modeling, namely variability in the amount of the transactions. A curve showing the performance of a random detection model is also shown as a baseline of performance for a model that randomly guesses whether a transaction is fraudulent.


As shown in FIG. 8, the x-axis represents the ratio of truly fraudulent transactions that are positively detected. The y-axis represents the ratio of truly legitimate transactions that are positively detected as fraudulent. The performance of the word vector detection model exceeds that of the variability model. The systems and methods disclosed herein, which incorporate such word vector detection methods, provide greater fraud detection with fewer false positives than the commonly used method relying on variability in the amount of the transactions.


Furthermore, the systems and methods disclosed herein generate additional insight when compared to other behavior detection techniques. For example, the systems may generate insight beyond what traditional fraud variables can detect by using the distance of the transaction locations.



FIG. 9A shows a geospatial fraud map. As shown in the map 900 of FIG. 9A, the user may be associated with a home area 902. One way to detect fraud events is based on the distance from the home area 902. For example, a transaction at a second area 904 may be more likely to be a fraud event than a transaction within the home area 902. Similarly, a transaction occurring at a third area 906 may be more likely to be fraudulent than a transaction within the second area 904. The third area 906 may be associated with yet a higher probability of fraud than the home area 902 or the second area 904. A fourth area 908 may be associated with the highest probability of fraud, and any activity beyond the fourth area 908 may be automatically flagged as suspect or associated with a default probability.


One limitation of the geospatial fraud map shown in FIG. 9A is the linear nature of the probabilities. The probability of a given transaction being fraudulent is a function of how far away from the home area 902 the transaction is conducted. This treats the areas in a consistent fashion without considering the behavior of actual actors in these locations.



FIG. 9B shows an alternative geospatial fraud map. As shown in the map 910 of FIG. 9B, a home area 912 is identified. For the users within the home area 912, the map 910 identifies other locations where users from the home area 912 tend to conduct transactions. The locations may be identified based on the transaction similarities. For example, users from the home area 912 may be urban-dwelling people reliant on public transportation. As such, the areas where these users tend to conduct transactions may be concentrated in a first area 914. The first area 914 may be associated with a lower probability of fraud than a more remote area 916 where users from the home area 912 do not tend to conduct transactions.



FIG. 9C shows yet another geospatial fraud map. It will be appreciated that the map 920, like the map 900 and the map 910, is of New York. In FIG. 9C, a home area 922 is shown. This home area 922 is generally located in Manhattan, which tends to be an affluent area of New York. The users who live in this home area 922 may have farther-reaching areas for conducting transactions, such as a vacation home in a second area 924 or a resort located in a third area 926. FIG. 9C shows how the linear approach of FIG. 9A would fail to accurately represent the behavior risk for the users from the home area 922.

FIG. 10 illustrates a plot of experimental detection performance for nine different methods of detecting behavior abnormalities. As shown in FIG. 10, the plot represents a receiver operating characteristic curve for nine different composite detection methods that may be implemented or included in the systems disclosed herein.


The detection methods shown in FIG. 10 are summarized in Table 1 below.










TABLE 1

Acronym         Description
max_w60         Detection based on maximum difference between a vector for a transaction and a vector of transactions within the last 60 days.
online0.95_100  Detection based on the comparison of a vector for a transaction and a vector of transactions smoothed using a parameter of 0.95 within the last 100 days.
max_w100        Detection based on maximum difference between a vector for a transaction and a vector of transactions within the last 100 days.
mean_w100       Detection based on mean difference between a vector for a transaction and a vector of transactions within the last 100 days.
geomean_w100    Detection based on geometric mean between a vector for a transaction and a vector of transactions within the last 100 days.
mean_w60        Detection based on mean difference between a vector for a transaction and a vector of transactions within the last 60 days.
online0.95_20   Detection based on the comparison of a vector for a transaction and a vector of transactions smoothed using a parameter of 0.95 within the last 20 days.
geomean_w60     Detection based on geometric mean between a vector for a transaction and a vector of transactions within the last 60 days.
online0.99_100  Detection based on the comparison of a vector for a transaction and a vector of transactions smoothed using a parameter of 0.99 within the last 100 days.









In some embodiments, the variables can also be combined to take advantage of the different information embedded in them. For example, the maximum difference may be combined with a geometric mean.


How the combination is performed can also be identified by the system using automated learning.


One way the combination can be generated is through unsupervised learning. In the unsupervised scenario, variables can be combined by, for example, generating an average of the variables, or generating an average weighted by confidence in how predictive each variable is considered to be of fraudulent behavior. The variables can also be combined by many other unsupervised learning algorithms such as principal component analysis (PCA), independent component analysis (ICA), or higher-order methodologies such as non-linear PCA, compression neural networks, and the like. One non-limiting benefit of using unsupervised learning is that it does not require 'tags' or other annotations to be added or included in the transactions to aid the learning.
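
The following is a minimal sketch, in Python with NumPy, of two of the unsupervised combinations mentioned above: an average weighted by per-variable confidence, and a projection onto the first principal component. The score matrix and confidence weights are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

def combine_scores_weighted(scores, weights=None):
    """Combine per-variable anomaly scores (rows = transactions,
    columns = variables) into one score per transaction."""
    scores = np.asarray(scores, dtype=float)
    if weights is None:                       # plain average
        weights = np.ones(scores.shape[1])
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()         # normalize the confidences
    return scores @ weights

def combine_scores_pca(scores):
    """Alternative: project onto the first principal component, which
    captures the direction of greatest shared variance across variables."""
    scores = np.asarray(scores, dtype=float)
    centered = scores - scores.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]
```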



FIG. 11 illustrates a plot of experimental detection performance for four different methods of detecting behavior abnormalities using unsupervised learning. As shown in FIG. 11, the plot represents a receiver operating characteristic curve for four different unsupervised learning methods that may be implemented or included in the systems disclosed herein.


The highest performing method, overall average, is labeled in FIG. 11. A curve showing the performance of a random detection model is also shown and labeled as a baseline of performance for a model that randomly guesses whether a transaction is fraudulent. The results shown in the plot of FIG. 11 clearly provide an improvement over random guessing when detecting behavior anomalies. Table 2 below summarizes the four methods for which curves are plotted in FIG. 11.










TABLE 2

Short Description  Long Description
Overall Average    Detection based on unsupervised machine learning combination of a weighted average of the detection results from all variables.
Geomean Average    Detection based on unsupervised machine learning combination of a weighted average of the geometric mean difference detection results.
Mean Average       Detection based on unsupervised machine learning combination of a weighted average of the mean difference detection results.
Max Average        Detection based on unsupervised machine learning combination of a weighted average of the maximum difference detection results.









Another way the combination of variables can be defined is through supervised learning. When training targets are available (e.g., historical data with known behaviors detected), the performance of the variables and how they are combined can be further improved by learning a linear or non-linear model of the combination of the variables. One example of supervised learning is neural network modeling. In a neural network model, the model is adjusted using feedback. The feedback is generated by processing an input and comparing the result from the model with an expected result, such as one included in the training targets.
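
A minimal sketch of the supervised approach, assuming a small feed-forward neural network (here scikit-learn's MLPClassifier) trained on tagged historical data. The detection variables and training targets below are random placeholders standing in for real data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# X: one row per historical transaction, one column per detection
# variable (max_w60, geomean_w100, ...); y: 1 = confirmed fraud tag.
rng = np.random.default_rng(0)
X = rng.random((1000, 9))                # placeholder detection variables
y = (X.mean(axis=1) > 0.6).astype(int)   # placeholder training targets

# The network's weights are adjusted by feedback: each prediction is
# compared with the expected result from the training targets.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
model.fit(X, y)

fraud_probability = model.predict_proba(X[:5])[:, 1]
```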



FIG. 12 illustrates a plot of experimental detection performance for a combination of variables based on supervised learning. As shown in FIG. 12, the plot represents a receiver operating characteristic curve for a neural network supervised learning method that may be implemented or included in the systems disclosed herein as compared to a random model. As in FIG. 11, the learned model significantly outperforms the random model.


Scoring System



FIG. 13A shows a functional block diagram of an example behavior scoring system. In some embodiments, a fraud detection system may include a behavior scoring system 1300 or communicate with the behavior scoring system (e.g., via a network) to obtain one or more similarity scores for a behavior (e.g., a transaction). As shown in FIG. 13A, the behavior for which a score will be generated is a user transaction 1302. The behavior scoring system 1300 may receive the user transaction 1302. The user transaction 1302 may be received from an event processing device such as a card reader.


A behavior vector generator 1320 may be included to generate a distributed representation of the user transaction 1302. The behavior vector generator 1320 may be configured to generate the distributed representation based on a model identified in a scoring configuration, such as generated by the method 200 in FIG. 2. The model may be identified by the behavior vector generator 1320 based on the user, the client (e.g., entity for whom the score is being generated such as a card issuer or a merchant), or operational characteristics of the behavior scoring system 1300. For example, computationally complex models may provide more accurate scores, but the processing to generate the score may be resource intensive. Accordingly, one or more operational characteristics of or available to the behavior scoring system 1300 may be used to select the model which consumes a level of resources (e.g., power, processor time, bandwidth, memory, etc.) available to the behavior scoring system 1300.


The current behavior vector may be provided to an in-memory vector storage 1322. The in-memory vector storage 1322 is a specially architected storage device to efficiently maintain distributed representations such as vectors. The in-memory vector storage 1322 may also store one or more historical vectors that can be used for generating the behavior score. The historical vectors may be received from a historical transaction data storage 1360. In some implementations, the user transaction 1302 may be stored in the historical transaction data storage 1360, such as after processing the user transaction 1302. The historical vectors for the historical transactions may be provided in response to a request from the behavior scoring system 1300. In some implementations, the user transaction 1302 may be provided in a message. The message may also include information to obtain the historical transaction data. Such information may include a user identifier, an authorization token indicating permission to release at least a portion of the user's historical data, and the like. The behavior scoring system 1300 may, in turn, transmit a request including such information to the historical transaction data storage 1360. In response, the historical transaction data storage 1360 may provide historical transaction information. As shown in FIG. 13A, the distributed representation is provided from the historical transaction data storage 1360. In some implementations, where raw transaction data is provided, the raw transaction data may be processed by the behavior vector generator 1320 to obtain distributed representations of the historical transaction data.


In the implementation shown in FIG. 13A, two memory components may be used to generate the distributed representation. The historical transaction data storage 1360 may be included to store the historical transactions. The historical transaction data storage 1360 may be a specially architected memory indexed by information identifying users. This can expedite retrieval of the transaction data and/or distributed representations for a user using the identifying information. The in-memory vector storage 1322 may be included as a second memory component. The in-memory vector storage 1322 may be implemented as a storage (preferably using main memory such as RAM) to store the distributed representations (or vectors) of the entities (e.g., merchant ID, ZIP5, etc.) to be compared. The behavior scoring system 1300 shown in FIG. 13A is an embodiment of a real-time fraud scoring system that can generate a score in real-time, such as during the authorization of a transaction.
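
A minimal sketch of what an in-memory vector storage keyed by entity might look like; the class name and interface are illustrative assumptions, not the actual architecture of the in-memory vector storage 1322.

```python
import numpy as np

class InMemoryVectorStorage:
    """Sketch of a RAM-resident store of distributed representations,
    keyed by an entity identifier such as a merchant ID or ZIP5."""

    def __init__(self):
        self._vectors = {}

    def put(self, entity_id, vector):
        # float32 halves memory use relative to float64
        self._vectors[entity_id] = np.asarray(vector, dtype=np.float32)

    def get(self, entity_id):
        return self._vectors.get(entity_id)

    def similarity(self, entity_a, entity_b):
        """Cosine similarity between two stored vectors."""
        a, b = self._vectors[entity_a], self._vectors[entity_b]
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```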


In the implementation shown in FIG. 13A, the user transaction 1302 is also processed by an optional transaction scoring system 1310. The transaction scoring system 1310 is a secondary behavior scoring system that may provide supplemental behavior scoring information. The transaction scoring system 1310 may be installed on the client's systems (e.g., merchant server, issuer server, authorization server, etc.). This configuration allows clients to affect the behavior scoring according to their individual needs. For example, a bank may wish to include one or more inputs to the behavior scoring generated by their own proprietary systems.


Having generated the corresponding distributed representations for the data attributes to be compared from the user transaction 1302, and having obtained the corresponding distributed representations for the historical transactions of the same user, a behavior scoring module 1324 may be included in the behavior scoring system 1300. The behavior scoring module 1324 may be configured to generate the behavior score for the user transaction 1302. The behavior score may include a value indicating the likelihood that the behavior is consistent with the historical behavior. When available, the behavior scoring module 1324 may also include the supplemental behavior score to generate a final behavior score for the user transaction 1302. The user transaction 1302 may be stored in the historical transaction data storage 1360.


In some implementations, the behavior scoring module 1324 may identify a behavior anomaly using a degree of similarity between the current and historical behavior vectors. In such implementations, a single output (e.g., indicating fraud or not fraud) may be provided rather than a similarity score. The single output may be generated by comparing a similarity score to a similarity threshold. If the similarity score indicates a degree of similarity that corresponds to or exceeds the similarity threshold, a no-fraud result may be provided. As discussed above with reference to FIGS. 10-12, the score may be generated using a composite of different comparison metrics (e.g., mean, geometric mean, varying historical transaction windows, etc.).
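
A minimal sketch of the threshold comparison, assuming cosine similarity as the degree-of-similarity measure and a hypothetical threshold value:

```python
import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify_transaction(candidate_vec, historical_vec, threshold=0.5):
    """Return a single fraud / no-fraud output instead of a raw score.
    The 0.5 threshold is a placeholder; in practice it would be tuned."""
    score = cosine_similarity(candidate_vec, historical_vec)
    return "no fraud" if score >= threshold else "fraud"
```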



FIG. 13B shows a functional block diagram of another example behavior scoring system. In some embodiments, a fraud detection system may include the behavior scoring system 1300 or communicate with the behavior scoring system (e.g., via a network) to obtain one or more similarity scores for a behavior (e.g., a transaction). The behavior scoring system 1300 may be implemented as an alternative or in combination with the behavior scoring system 1300 shown in FIG. 13A. The implementation of FIG. 13B illustrates the behavior scoring system 1300 providing one or more comparison metrics (e.g., mean, geometric mean, varying historical transaction windows, etc.) to a behavior detection system 1380. For example, the behavior detection system 1380 may wish to maintain a proprietary behavior detection methodology but may base the detection on the comparison metrics generated by the behavior scoring system 1300. As such, rather than receiving a behavior score from the behavior scoring system 1300 (as shown in FIG. 13A), a behavior comparator 1370 generates the comparison metrics based at least in part on the historical and current behavior vectors. The behavior comparator 1370 may provide the comparison metrics to the behavior detection system 1380. The behavior detection system 1380, based at least in part on the comparison metric(s), generates the final behavior score (e.g., fraud, cross-marketing, etc.). FIGS. 10-12 illustrate examples of the comparison metrics that can be generated.


Example Transaction Processing with Behavior Detection



FIG. 14 shows a message flow diagram of an example transaction with behavior detection. A merchant system 1410 may include a card reader 1402 and a point of sale system 1404. The card reader 1402 may be configured to receive card or other payment information from a user. The card reader 1402 may be in data communication with the point of sale system 1404, which may receive information collected by the card reader 1402 and other equipment at the merchant site such as a cash register, product scanner, receipt printer, display terminal, and the like. The merchant system 1410 may communicate with an acquisition server 1412. The acquisition server 1412 may be configured to determine, for payment tendered for a transaction, which issuer is responsible for the payment presented. In the case of credit cards, the issuer may be a card issuer 1414. A card holder device 1416 is also shown in FIG. 14. The card holder device 1416 is an electronic communication device associated with a user who has been issued a payment card or otherwise accesses the system to perform a transaction.


The message flow 1400 shown in FIG. 14 provides a simplified view of messages that may be exchanged between the entities shown for gathering and processing transaction data as well as analyzing behavior based on the transaction data for a user. It will be understood that additional entities may mediate one or more of the messages shown in FIG. 14.


The message flow 1400 may begin with detection of a card swipe by the card reader 1402, represented as a message 1420. The message 1420 may include the payment information read from the card, such as from a magnetic strip, an embedded memory chip, a near-field communication element, or other information source included on the card. Via message 1422, the card information may be transmitted from the card reader 1402 to the point of sale system 1404. The point of sale system 1404 may determine that the card is a credit card and identify the acquisition server 1412 as a source for determining whether the payment is authorized.


The point of sale system 1404 may transmit a message 1424 to the acquisition server 1412 including the card information and merchant information. The merchant information may include a merchant identifier, merchant transaction information (e.g., desired payment amount), or other information available to the merchant for the transaction. The acquisition server 1412 may identify the card issuer based on the card information received and transmit a message 1426 to the card issuer 1414. The message 1426 may include the card information and merchant information received via message 1424. The message 1426 may also include information about the acquisition server 1412.


Via message 1428, the card issuer 1414 may generate a behavior score for the current transaction. This may be generated using a behavior scoring system such as the behavior scoring system 1300 shown in FIG. 13A.


The card issuer 1414 may then authorize the requested payment amount via message 1430. The authorization process determines whether or not the requested payment for the transaction is to be honored. Unlike conventional authorization, which may seek to authorize based on credit limit, PIN, or other discrete transaction information for the current transaction, the authorization via message 1430 is enhanced to further consider the behavior score generated via message 1428.


Via message 1432, the authorization decision and, in some implementations, behavior score may be transmitted back to the merchant system 1410 via the acquisition server 1412. Because the behavior score may be represented using a relatively small quantity of resources, this data may be efficiently included in the current messaging used to authorize transactions. The point of sale system 1404 may use the authorization information and/or behavior score to determine whether or not to allow the transaction to proceed. If the authorization is negative, then the point of sale system 1404 may request alternate payment from the user. In some implementations, the authorization information may include an intermediate authorization indicating that the transaction is provisionally authorized but may be fraudulent. In such instances, the merchant system 1410 may collect information for secondary verification such as a photo identification, PIN request, or other information to more accurately determine whether the current behavior is consistent with the purported user's behavior.


As shown in FIG. 14, the authorization determination may be transmitted to the card holder device 1416 via message 1436. This can provide timely and important information to a card holder regarding potentially fraudulent activity related to their account.


In some implementations, a client system (not shown) may receive the message 1436. In response to receiving the alert 1436, the client system may generate one or more user communications to be transmitted to the card holder device 1416 of the user whose transactional behavior has changed. The user communications may be generated by the client system or another messaging system. The user communications may include email messages directed to the user's e-mail account(s), text messages (e.g., SMS or MMS) directed to the user's mobile device, and printed messages directed by postal or other delivery services to the user's home, place of business, or other physical location.


In certain implementations, the alert 1436 is operable to automatically activate a user communication service program on the client system. The activated user communication service program automatically generates one or more communications directed to the user about whom the alert 1436 was transmitted. Generation of the user communications can be informed by the informational content of the alert 1436. The user communications are then automatically transmitted to the card holder device 1416 in one or more modes of communication, such as, for example, electronic mail, text messaging, and regular postal mail, to name a few. In certain modes of communication to the user, the user communication may be configured to automatically operate on the card holder device 1416. For example, the user's mobile device may, upon receipt of the transmitted user communication, activate a software application installed on the user's mobile device to deliver the user communication to the user. Alternatively, the user communication may activate a web browser and access a web site to present the user communication to the user. In another example, a user communication may be transmitted to a user's email account and, when received, automatically cause the user's device, such as a computer, tablet, or the like, to display the transmitted user communication. The user communication may include information about the potentially fraudulent transaction such as the time, place, amount, etc. of the questionable transaction. In some implementations, the user communication may include questions about the user's behavior that can be used to verify the transaction. For example, if a transaction in Milwaukee was flagged as potentially fraudulent for a user who lives in Muskegon, the user communication may ask "Have you ridden on a ferry recently?" The user's response would assist in determining whether the user visited Milwaukee recently. In some implementations, the verification may be performed in real-time (e.g., prior to consummation of the questionable transaction). In some implementations, the user communication may not be presented to the user. For example, the user communication may contact the card holder device 1416 to inquire about the location of the card holder device 1416. If the location of the device is consistent with the transaction location, it may be determined that the user is in fact conducting the transaction. The inquiry for location may cause activation of the location services on the card holder device 1416 and transmission of a currently detected location for the card holder device 1416.



FIG. 15 shows a message flow diagram of an example batch transaction processing with model retraining and regeneration of user distributed representations. Some of the entities shown in FIG. 15 overlap with those shown in FIG. 14. Added to FIG. 15 is a card network server 1502. The card network server 1502 is an electronic device operated by the credit card network to which a given card belongs such as VISA.


The merchant system 1410 may transmit a batch of transaction data for multiple transactions with multiple users via message 1520. The batch of transaction data from a single merchant may include hundreds, thousands, or millions of individual transaction records. The merchant system 1410 may transmit the message 1520 to the acquisition server 1412 to initiate payment for the transactions. The acquisition server 1412, via message 1522, may transmit those transactions from a specific card network to the card network server 1502 to request payment from the specific network. Because a single card network may have multiple card issuers (e.g., banks, credit unions, etc.), the card network server 1502 may split the batch of transactions by card issuer. The transactions for a specific issuer are transmitted via message 1524 to, as shown in FIG. 15, the card issuer 1414. The card issuer 1414 stores the transaction data via message 1526.


The new transaction data may also provide new information that can be used to train or retrain, via message 1526, one or more machine learning behavior detection models. The behavior detection models may include the model for generating distributed representations and/or the model for scoring the behavior of a transaction. For example, if the new transactions represent a significant percentage of the overall transaction data stored by the system 100, the retraining of the models may be desirable to ensure accurate and current detection for the users. The retraining may also be needed to account for new transaction attributes that were previously not included in the training process. The retraining may also be needed to account for new transactions for new users who were previously not included in the training process.


Having retrained the models, via message 1528, the distributed representations for the users of the card issuer 1414 may be generated. This may include generating a vector representing the historical transactions for each user or a portion of users (e.g., users with new transactions since the last time their historical vector was generated).


Example Point of Sale Card Reader



FIG. 16 shows a schematic perspective view of an example card reader. As seen in FIG. 16, there is provided a point-of-sale card reader 1600 including a housing 10. The housing 10 may enclose transaction/event circuitry (not shown) and other electronic components to implement one or more of the behavior modeling and/or detection features described.


The reader 1600 includes a keypad 16, which interfaces with the point-of-sale transaction/event circuitry to provide input signals indicative of transaction or other events at or near the point-of-sale card reader 1600. The point-of-sale card reader 1600 also includes a magnetic card reader 18 and a smart card reader 20, which may be adapted to receive a smart card 22.


The point-of-sale card reader 1600 also includes a display 24 and a printer 26 configured to provide output information prior to, during, or after a transaction. The point-of-sale card reader 1600 may receive an authorization decision and/or behavior score for a transaction. In some implementations, the point-of-sale card reader 1600 may present a message requesting additional authorization information. For example, the behavior score may indicate a questionable transaction. In response, rather than canceling the transaction, the point-of-sale card reader 1600 may request a PIN number or other identifying information. This additional information may be used to further authorize the transaction as described above.



FIG. 17 shows a functional block diagram of the exemplary card reader of FIG. 16. A controller 40, which interfaces with the keypad 16, the display 24, the printer 26, and a behavior score decoder 60, is shown. The controller 40, which may include card reader and/or point-of-sale terminal functionality, may interface with the magnetic card reader 18 and, when available, the smart card reader 20. The controller 40 also interfaces with a mobile computing communication device 41 and may interface with an optional modem 42. The mobile computing communication device 41 and the modem 42 may be used by the reader 1700 to communicate messages, such as with a point-of-sale system or other merchant transaction processing equipment.


The card reader 1700 shown in FIG. 17 includes a wireless modem 43 and various types of communication ports such as an RF port 44, an IR port 46, a serial port 48, and a USB port 50. The communication ports may also be used by the reader 1700 to communicate messages as described in this application. A removable media adapter 52 may also interface with the transaction circuitry 12. Removable media may be employed for storage, archiving, and processing of data relevant to the reader 1700 functionality. For example, transaction data may be stored on removable media for transfer, at a later time, to merchant transaction processing equipment.


The behavior score decoder 60 may be configured to receive behavior scores and configure the reader 1700 in response to a received behavior score. For example, if the behavior score indicates a questionable transaction, the behavior score decoder 60 may cause the reader 1700 to obtain requested additional verification information such as via the keypad 16. In some implementations, the card reader 1700 may use one or more of the communication ports to obtain one or more score decoding configurations. For example, a look up table may be provided to the card reader 1700 and stored in a memory of or accessible by the reader 1700 (not shown). The look up table may include a list of behavior scores or behavior score ranges and associated configurations for the reader 1700.
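
A minimal sketch of such a look up table, with hypothetical score ranges and configuration names (the disclosure does not specify either):

```python
# Hypothetical score-decoding lookup table: each row maps a behavior
# score range [low, high) to a reader configuration.
SCORE_DECODING_TABLE = [
    ((0.00, 0.40), "decline_and_request_alternate_payment"),
    ((0.40, 0.75), "request_pin_verification"),
    ((0.75, 1.01), "approve"),
]

def decode_behavior_score(score):
    """Return the reader configuration associated with a behavior score."""
    for (low, high), configuration in SCORE_DECODING_TABLE:
        if low <= score < high:
            return configuration
    raise ValueError(f"score out of range: {score}")
```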


Segmentation for Improving Model Performance for Fraud/Credit Models


In some implementations, the users may be segmented. Segmentation (e.g., dividing the population into several groups and building separate models for each group) can improve the model performance. For example, segmentation may improve performance of fraud models by concentrating the model on specific behavior of the segment the model is built on (e.g., VIP segment, international segment, card not present segment, high-risk segment, or the like).


One or more of the following techniques for segmentation may be included to divide the users in the systems and methods described above:


1. Portfolio based: Separate models are created for different portfolios (e.g., VIP, prime credit user, subprime credit user, etc.);


2. Transaction type based: separate models are created for different transaction types (e.g., international transactions, domestic transactions, Card Not Present transactions, Card Present transactions, etc.);


3. Decision Tree/Rule Based: decision trees/rules are used to partition the data into segments (leaf nodes) by using different features; and


4. Clustering based: clustering algorithms are used to create different segments/clusters based on some features. For example, transactional attributes or user demographic attributes may be used to define groups of users for segmentation. Examples of clustering are described in U.S. patent application Ser. No. 14/975,654, filed on Dec. 18, 2015, entitled "USER BEHAVIOR SEGMENTATION USING LATENT TOPIC DETECTION," commonly owned and assigned by applicant and hereby incorporated by reference in its entirety.


However, relying only on these segmentation techniques can produce inaccurate groupings, particularly when attempting to identify behavior anomalies. For example, in transaction type based segmentation, users may switch from one segment to another too frequently (possibly with every transaction). In portfolio based approaches, if the bank creates a new portfolio, it may be difficult to decide which segment the new portfolio's users will be scored by.


In some embodiments, a user's past behavior may be used to perform ‘behavior-based’ segmentation on the users to solve these issues. A user is dynamically assigned to one of the behavior segments based on which segment the user's past behavior fits the best.


In some embodiments, distributed representations of transactions can be used to learn about the user's past behavior. A transaction may be described by several attributes. In some embodiments, the attributes (including composite ones) in the transactions can be represented as vectors. A user may then be represented as a composite vector based on the user's previous transactions.


In some embodiments, the distributed representations of a user's historical transaction data (e.g., a learned vector representation of the user's various behavior in transactions) can be grouped/clustered together. This will create groups of users with similar behavior based on their transactions. Then, by looking at the past behavior of a certain user, the group most similar to this behavior can be used as the segment for this user. Grouping/clustering can be done by any clustering technique, such as K-Means.


Separate transaction models can be built for each segment/cluster and these can be used for scoring transactions as abnormal or potentially fraudulent. Furthermore, companies can target specific users in specific groups using these segmentation techniques and can use different marketing strategies for each segment, such as those discussed below.


Marketing Applications


Whether based on individual distributed representations of users or segmented representations, the behavior detection features may be used to provide content or other information to users based on detected behaviors.


In some embodiments, the abnormality detection techniques described can be used to identify changes in life-stage such as moving, graduation, marriage, having babies, etc. When there is a change in the life-stage, the locations of the transactions, stores visited, and timing of the purchase can be different from the user's normal patterns. Therefore, identifying a reduction of the similarity between the current transactions and the historical transactions may be an indication of such events. The identification of such a change may cause selection and transmission of specific content to the user. The specific content may be selected based on the new transactions. For example, if the change detected indicates a new home, the content may be related to home ownership (e.g., home improvement stores, insurance, home service providers, appliances, etc.).


In some embodiments, the behavior information may be used to identify similar entities (e.g., merchants, users, consumers, etc.) for marketing. For example, marketers may wish to determine the best strategy for marketing to users in order to increase response rates and spending. To do so, it is often assumed that a user will respond similarly to other users that have similar tastes or have made similar purchases.


The distributed representations of the merchants can be used for this purpose. A distributed representation of merchants may be generated in a similar manner as described above. For example, a model may be generated based on a set of transactions received by the system for the merchant. The model may use the specific merchant as a 'word' when applying the linguistic model. Merchants that have high similarity after training using this approach are then more likely to be shopped at by the same consumers. For example, TABLE 3 includes similarity scores comparing a distributed representation of Big Box Store to those of five other stores. As shown in TABLE 3, Big Box Store (BBS) is closer to New Army, Darshalls, Cool's, and Crevass, which may be stores that emphasize style at a reasonable price, than to Lmart, which may be a store that focuses on discounts and value shopping.












TABLE 3

Store      Similarity Score of Distributed Representation to Big Box Store
New Army   0.79
Darshalls  0.76
Cool's     0.74
Crevass    0.72
Lmart      0.21
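
A minimal sketch of how similarity scores like those in TABLE 3 might be computed, assuming cosine similarity over learned merchant vectors; the vectors below are random placeholders standing in for trained representations.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical learned merchant vectors; a trained model would supply
# these from transaction sequences rather than random data.
rng = np.random.default_rng(0)
merchants = {name: rng.random(50)
             for name in ["Big Box Store", "New Army", "Darshalls",
                          "Cool's", "Crevass", "Lmart"]}

bbs = merchants["Big Box Store"]
for name, vec in merchants.items():
    if name != "Big Box Store":
        print(f"{name}: {cosine(bbs, vec):.2f}")
```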










TABLE 4 shows the similarity scores comparing another distributed representation, for Discount Big Box Store (DBBS), to those of five other stores. As shown in TABLE 4, DBBS is closer to Lmart, euroBush, and the like, which may be retailers that focus more on discounts and value shopping, than to New Army.












TABLE 4

Store           Similarity Score of Distributed Representation to Discount Big Box Store
Lmart           0.79
euroBush        0.76
Bingo Burger    0.74
Hunter's Haven  0.72
New Army        0.11










Furthermore, because the merchants may be represented using distributed representations such as vectors, the merchants can be combined, such as by performing addition and subtraction on their respective vectors. Empirically, such arithmetic operations, for example subtraction, on the distributed representations generate a vector that gives preference to the minuend and disfavors the subtrahend. For example, a vector representing (BBS−DBBS) may be closer to merchants associated with stylish and higher-end goods, while a vector representing (DBBS−BBS) is much closer to budget-oriented goods. As another example, when a vector for Crazy Kids Clothing Place (CKCP) is added to the vector representation of (BBS−DBBS) to emphasize children's wear, a vector that is closer to high-end children's clothing stores may result. In contrast, CKCP added to the vector representation of (DBBS−BBS) can result in a vector that is closer to merchants offering discount children's clothing.
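
A minimal sketch of this vector arithmetic, using hypothetical merchant names and random placeholder vectors; with trained representations, the ranking would surface the high-end children's stores described above.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical merchant vectors standing in for the stores above.
rng = np.random.default_rng(7)
vec = {name: rng.random(50)
       for name in ["BBS", "DBBS", "CKCP", "HighEndKidsCo", "BudgetKidsCo"]}

# Emphasize children's wear with a stylish skew: CKCP + (BBS - DBBS).
query = vec["CKCP"] + (vec["BBS"] - vec["DBBS"])
ranked = sorted(vec, key=lambda name: cosine(query, vec[name]), reverse=True)
print(ranked)  # with trained vectors, high-end children's stores rank first
```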


These properties can be used to identify and select groups of consumers that have a specific preference for certain types of merchants (whose vectors are V+) and a dislike of certain other types of merchants (whose vectors are V−). This can be achieved by defining a composite vector using these merchants as a seed vector Vs,








$$V_s = \frac{1}{Z}\left(\sum_{V_j \in V^{+}} V_j - \sum_{V_j \in V^{-}} V_j\right),$$

where Z is a normalizing term, and comparing Vs to the distributed representations of the merchants the consumers visited.
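
A minimal sketch of the seed vector computation; the disclosure leaves the normalizing term Z unspecified, so the L2 norm of the difference is assumed here.

```python
import numpy as np

def seed_vector(preferred, disliked):
    """Composite seed vector V_s = (1/Z) * (sum(V+) - sum(V-)).
    preferred and disliked are lists of merchant vectors; Z is assumed
    to be the L2 norm of the difference (i.e., unit-length output)."""
    v_plus = np.sum(np.asarray(preferred, dtype=float), axis=0)
    v_minus = np.sum(np.asarray(disliked, dtype=float), axis=0)
    diff = v_plus - v_minus
    z = np.linalg.norm(diff)  # normalizing term (assumption)
    return diff / z
```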


These properties can be used to identify consumers that have similar tastes by comparing the distributed representations of the merchants the consumers visited. They can also be used to segment consumers into groups that have similar transaction behaviors. To segment consumers, a composite vector may be generated by combining the distributed representations of the consumer's historical transactions. A clustering may then be performed using the composite vectors of the users, such as described above. In some embodiments, the clustering can be performed using k-means clustering to find k groups S={S1, S2, S3, . . . , Sk} represented by their respective 'centroid' vectors so that the within-cluster sum of squares is minimized. Equation 10 below illustrates one example expression for clustering based on sum of squares minimization.











$$\underset{S}{\arg\min} \sum_{i=1}^{k} \sum_{x \in S_i} \left\lVert x - \mu_i \right\rVert^{2} \qquad \text{(Equation 10)}$$

where k is the number of groups;

    • Si is a group;
    • x is a member of the group Si; and
    • μi is the mean for the group Si.
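
A minimal sketch of the clustering of Equation 10, using scikit-learn's KMeans (which minimizes exactly this within-cluster sum of squares); the composite user vectors here are random placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

# user_vectors: one composite row per user, built from that user's
# historical transaction representations (random placeholders here).
user_vectors = np.random.default_rng(1).random((500, 50))

kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
segments = kmeans.fit_predict(user_vectors)  # group index S_i per user
centroids = kmeans.cluster_centers_          # the mu_i 'centroid' vectors
```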


One example method of providing the recommendations may include identifying users to receive particular content. The method includes accessing, from a data store, a sequence of event records associated with a user. In some implementations, the sequence of event records indicates a history of events for the user. The method includes identifying a set of attributes of an event record to represent the event records. As discussed above, the attributes serve as the "words" which will be used to generate distributed representations of the transactions. The method includes generating a model to provide a numerical representation of an event included in the history of events using values for the set of attributes of the sequence of event records. A first numerical representation of a first event at a first time indicates a higher degree of similarity to a second numerical representation of a second event at a second time than to a third numerical representation of a third event at a third time. The difference between the first time and the second time is less than the difference between the first time and the third time. The distributed representation provides resource efficiency gains. For example, the set of attributes for an individual event may be represented in a quantity of memory that is greater than the quantity of memory used to represent the distributed representation of the individual event. The method includes receiving a desired event related to a content item to be provided. The receiving may be via a network as described above and below. The method includes generating a candidate event numerical representation for the desired event using the model and the desired event. The candidate event may be a merchant or item that is to be offered, and the method will identify those users having behaviors indicating a preference for the merchant or item. As such, the method identifies an event record having at least a predetermined degree of similarity with the desired event. The identification may be based on a similarity metric between a distributed representation of a user's behavior and the distributed representation of the candidate event. Once the users with the desired degree of similarity are identified, the method includes providing the content item to the identified users.
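
A minimal sketch of the identification step, assuming cosine similarity as the similarity metric and a hypothetical minimum-similarity threshold:

```python
import numpy as np

def users_for_content(event_vector, user_vectors, min_similarity=0.7):
    """Return IDs of users whose behavior vectors are at least
    min_similarity (cosine) close to the desired event's vector.
    user_vectors maps user ID -> composite behavior vector; the 0.7
    threshold is a placeholder for the predetermined degree of similarity."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return [uid for uid, v in user_vectors.items()
            if cos(event_vector, v) >= min_similarity]
```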


Providing the content may include transmitting a message to the user. The message may include email messages directed to the user's e-mail account(s), text messages (e.g., SMS or MMS) directed to the user's mobile device, or printed messages directed by postal or other delivery services to the user's home, place of business, or other physical location.


In certain implementations, the message is operable to automatically activate a user communication service program on the client system. The activated user communication service program automatically generates one or more communications directed to the user including all or a portion of the content. The user communications may then be automatically transmitted to a card holder device, such as the card holder device 1416. The transmission may be via one or more modes of communication, such as, for example, electronic mail, text messaging, and regular postal mail, to name a few. In certain modes of communication to the user, the user communication may be configured to automatically operate on the card holder device receiving the communication.


Better Grouping of Merchants


There are tens of millions of merchants in the US. For applications that analyze the types of merchants that consumers visit, it may be desirable to create a categorization of the merchants so they are easier to analyze. The Merchant Category Code (MCC) used by card associations such as VISA and MASTERCARD, the Standard Industrial Code (SIC), and the North American Industry Classification System (NAICS) are a few attempts to do so. However, these classification systems only classify merchants based on their industry and sub-industry groups.


In some embodiments, by using the distributed representation and the similarity definition described herein, groups of merchants may be created from diverse industries that serve similar purposes. By performing a clustering using techniques such as k-means clustering (discussed above), merchants that are related may be grouped together. For example, TABLE 5 shows a group of merchants that are related to international travel. Note, however, that none of MCC, SIC, or NAICS has an international travel group. Furthermore, this group contains merchants from all aspects of international traveling, including airlines, hotels, international travel insurance, SIM card sales for international travel, visa services, etc. These can be identified using behavior as described above.











TABLE 5

MCC   MCC Description                               Merchant Name
3005  BRITISH AIRWAYS                               BRITISH A
3007  AIR FRANCE                                    AIR FRANCE 0571963678061
3010  KLM (ROYAL DUTCH AIRLINES)                    KLM BELGIUM 0742469054336
3012  QUANTAS                                       QANTAS AIR 08173363730
3056  QUEBECAIRE                                    JET AIR 5894149559583
3077  THAI AIRWAYS                                  WWW.THAIAIRW1234567890
3078  CHINA AIRLINES                                CHINA AIR2970836417640
3079  Airlines                                      JETSTAR AIR B7JLYP
3161  ALL NIPPON AIRWAYS                            ANAAIR
3389  AVIS RENT-A-CAR                               AVIS RENT A CAR
3503  SHERATON HOTELS                               SHERATON GRANDE SUKHUMVIT
3545  SHANGRI-LA INTERNATIONAL                      MAKATI SHANGRI LA HOTE
3572  MIYAKO HOTELS                                 SHERATON MIYAKO TOKYO H
3577  MANDARIN ORIENTAL HOTEL                       MANDARIN ORIENTAL, BANGKOK
3710  THE RITZ CARLTON HOTELS                       THE RITZ-CARLTON, HK16501
4011  Railroads                                     JR EAST
4111  Local/Suburban Commuter . . .                 MASS TRANSIT RAILWAY
4111  Local/Suburban Commuter . . .                 XISHIJI CRUISES
4112  Passenger Railways                            Taiwan High Speed Rail
4121  Taxicabs and Limousines                       AIZUNORIAIJIDOSHIYA KA
4131  Bus Lines, Including Charters . . .           CE/CRUZ DEL SUR
4215  Courier Services Air or Ground . . .          MYUS.COM
4511  Airlines, Air Carriers (not listed . . .      CAMBODIA ANGKOR AIR-TH
4511  Airlines, Air Carriers (not listed . . .      JETSTAR PAC
4722  Travel Agencies and Tour Operations           CHU KONG PASSENGER 28902
4722  Travel Agencies and Tour Operations           HOSTEL WORLD
4814  Fax services, Telecommunication               ONESIMCARD.COM
4814  Fax services, Telecommunication               PREPAID ONLINE TOP UP
5192  Books, Periodicals, and Newspapers            RELAY
5200  Home Supply Warehouse Stores                  FUJI DOLL CHUOU
5251  Hardware Stores                               TRUE VALUE AYALA CEBU
5300  Wholesale Clubs                               COSTCO GUADALAJARA
5309  Duty Free Store                               BEIRUT DUTY FREE ARRIVAL
5309  Duty Free Store                               DFS INDIA PRIVATE LIMI
5411  Grocery Stores, Supermarkets                  RUSTAN S SUPERMARKET
5411  Grocery Stores, Supermarkets                  VILLA MARKET-NICHADA
5499  Misc. Food Stores Convenience . . .           HAKATAFUBIAN
5499  Misc. Food Stores Convenience . . .           I-MEI FOODS CO., LTD
5719  Miscellaneous Home Furnishing . . .           KULTURA FILIPINO SM C
5812  Eating places and Restaurants                 RI YI CAN YIN
5813  Drinking Places (Alcoholic . . .              SHANGHAI PRESIDENT COFFEE
5814  Fast Food Restaurants                         AJISEN RAMEN 02800
5814  Fast Food Restaurants                         MCDONALD'S AIRPORT(290
5947  Card Shops, Gift, Novelty, and . . .          SENTOSA LEISURE MANAGE
5949  Sewing, Needle, Fabric, and Price . . .       SHANG HAI YUGUI INDUS
5962  Direct Marketing Travel Related . . .         CTRIP SH HUACHENG TRAVEL
5964  Direct Marketing Catalog . . .                AMAZON.CO.JP
5994  News Dealers and Newsstands                   LS TRAVEL RETAIL DEUTSCHL
5999  Miscellaneous and Specialty Retail            MAXVALUKURASHIKAN ICHIHAM
6010  Financial Institutions Manual Cash            012BANCO DE CHILE VISA
6300  Insurance Sales, Underwriting, and . . .      WORLD NOMADS
6513  Real Estate Agents and Managers - . . .       PAY*KOKO RESORTS INC
7299  Miscellaneous Personal Services ( . . .       CHINA VISA SERVICE CENTER
7299  Miscellaneous Personal Services ( . . .       PASSPORTS & VISA.COM
7311  Advertising Services                          AMOMA
7399  Business Services, Not Elsewhere . . .        MAILBOX FORWARDING, IN
7941  Commercial Sports, Athletic Fields            SINGAPORE FLYER PL
7991  Tourist Attractions and Exhibits              AT THE TOP LLC
7996  Amusement Parks, Carnivals, Circuses          Hong Kong Disneyland Lant
8062  Hospitals                                     BUMRUNGRAD HOSPITAL
8641  Civic, Fraternal, and Social Associations     INTERNATIONS GMBH
8699  Membership Organizations (Not . . .           PRIORITY PASS, INC
8999  Professional Services (Not . . .              U.S. VISA PLICATIONFEE
9399  Government Services (Not . . .                CN E US CONSULAT SHA









In some embodiments, by using the fact that consumers tend to shop at merchants at a similar pricing level, the system can also use the same technique to classify hotels into low-end vs. high-end ones, as they tend to have different distributed representations.



FIG. 18 shows a plot of merchant clusters. The vectors for transactions at particular merchants can be projected from many dimensions down to the two dimensions shown in FIG. 18. The x and y axes illustrate two dimensions representing the 'similarity' relation among the merchants. The plot 1800 may include clusters of merchants. For example, a cluster 1810 and a cluster 1812 may represent collections of merchants associated with a particular price point. Other merchants associated with a second price point may be grouped in a cluster 1820 and a cluster 1822. The projection may be taken from the vector representations of the transactions without analyzing the textual names of the merchants or the amounts of the transactions. This shows that, without knowledge of the identities or prices of the hotels, similarities between groups of merchants may be automatically discovered through clustering.
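
A minimal sketch of such a projection, assuming PCA as the dimensionality-reduction technique (the disclosure does not specify one); the merchant vectors are random placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA

# merchant_vectors: hypothetical high-dimensional transaction vectors,
# one row per merchant.
merchant_vectors = np.random.default_rng(2).random((200, 50))

# Project to two dimensions; the columns of xy play the role of the
# x and y 'similarity' axes of a plot like FIG. 18, in which clusters
# can then be found or drawn.
xy = PCA(n_components=2).fit_transform(merchant_vectors)
```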


Credit Risk/Attrition Applications


In account management, banks attempt to predict the users' potential to default or attrite in order to reduce the banks' exposure to their debt and ensure their users continue to spend. Such business problems can benefit significantly from analyzing the users' behavior. The consumers' credit risk can increase when they: encounter a life-stage change that may incur significant debt or leave them unable to afford the expense; begin to exhibit abnormal spending behavior; transact in stores that tend to attract consumers with financial difficulty; or exhibit similar behavior to other consumers that are more likely to default with the same behavior profile.


Similarly, the consumers' likelihood of stopping use of the bank's credit card can increase when they: encounter a life-stage change that requires different types of product features in the credit cards; begin to exhibit abnormal spending behavior; and/or exhibit similar behavior to other consumers that are more likely to attrite with the same behavior profile.


Therefore, the features described above, e.g. behavior abnormality detection, life-stage change detection, improved grouping of merchants, user segmentation, etc., can be applied to detect credit risk and attrition risk.


As an example, in some embodiments, the system may apply the behavior abnormality determination and life-stage change detection mentioned previously in the models to identify a behavior shift in the user's spending. When detected, if the shift indicates that the user is engaging in riskier spending at merchants that are indicative of financial distress, a credit-risk related treatment can be applied to the consumer. One benefit of these features is that merchants may be grouped and analyzed without necessarily identifying a merchant by name. For example, the targeting may be based on the similarity between a merchant and others within a similar category (e.g., casinos, coffee shops, etc.). The categories need not be known a priori either. As transaction vectors are compared, similarities may be discovered in the way certain merchants cluster. A cluster may then be associated with a level of riskiness or other measure of risk assessment. On the other hand, if the behavior shift indicates a life-stage change such that the existing features of the credit product are not likely to fit the consumer's new needs, an attrition treatment with product cross-sell/up-sell or targeted incentives can be offered to the consumer.


Example System Implementation and Architecture



FIG. 19 shows a block diagram showing example components of a transaction analysis computing system 1900. The computing system 1900 includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible, or a server or workstation. In one embodiment, the computing system 1900 comprises a server, a laptop computer, a smart phone, a personal digital assistant, a kiosk, or a media player, for example. In one embodiment, the exemplary computing system 1900 includes one or more central processing units ("CPU") 1905, which may each include a conventional or proprietary microprocessor. The computing system 1900 further includes one or more memories 1932, such as random access memory ("RAM") for temporary storage of information, one or more read only memories ("ROM") for permanent storage of information, and one or more mass storage devices 1922, such as a hard drive, diskette, solid state drive, or optical media storage device. Typically, the components of the computing system 1900 are connected to the computer using a standards-based bus system 1980. In different embodiments, the standards-based bus system could be implemented in Peripheral Component Interconnect ("PCI"), Microchannel, Small Computer System Interface ("SCSI"), Industrial Standard Architecture ("ISA"), and Extended ISA ("EISA") architectures, for example. In addition, the functionality provided for in the components and modules of computing system 1900 may be combined into fewer components and modules or further separated into additional components and modules.


The computing system 1900 is generally controlled and coordinated by operating system software, such as Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Unix, Linux, SunOS, Solaris, iOS, Blackberry OS, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 1900 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The exemplary computing system 1900 may include one or more commonly available input/output (I/O) devices and interfaces 1912, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 1912 include one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. The computing system 1900 may also include one or more multimedia devices 1942, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 19, the I/O devices and interfaces 1912 provide a communication interface to various external devices. The computing system 1900 is electronically coupled to one or more networks, which comprise one or more of a LAN, WAN, and/or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link. The networks communicate with various computing devices and/or other electronic devices, such as the credit bureau data source and financial information data sources, via wired or wireless communication links.


In some embodiments, information may be provided to the computing system 1900 over a network from one or more data sources. The data sources may include one or more internal and/or external data sources that provide transaction data, such as credit issuers (e.g., financial institutions that issue credit cards), transaction processors (e.g., entities that process credit card swipes at points of sale), and/or transaction aggregators. The data sources may include internal and external data sources which store, for example, credit bureau data (for example, credit bureau data from File Ones℠) and/or other user data. In some embodiments, one or more of the databases or data sources may be implemented using a relational database, such as Sybase, Oracle, CodeBase, and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.


In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device, such as the computing system 1900, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


In the example of FIG. 19, the modules 1910 may be configured for execution by the CPU 1905 to perform any or all of the processes or generate any or all of the interfaces discussed above with reference to FIGS. 1-18. For example, the modules 1910 may be implemented as instructions residing in a memory (such as the memory 1932) that, when executed, cause the transaction analysis system 1900 to perform all or some of the functions described. In some implementations, one or more of the modules 1910 may be implemented as a hardware device (e.g., circuit) configured to perform the functions described.


Additional Embodiments


Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


All of the methods and processes described above may be embodied in, and partially or fully automated via, software code modules executed by one or more general-purpose computers. For example, the methods described herein may be performed by the computing system and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer-readable medium. A tangible computer-readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer-readable media include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems and methods can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the systems and methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the systems and methods with which that terminology is associated.

Claims
  • 1. A computer-implemented method of artificial intelligence guided monitoring of event data, the method comprising: under control of one or more computing devices configured with specific computer-executable instructions, accessing, from a data store, a sequence of event records associated with a user, the sequence of event records indicating a history of events for the user; identifying a set of attributes of an event record to represent the event records; generating a model to provide a vector representing an event included in the history of events using values for the set of attributes of the sequence of event records, wherein a first vector representing a first event at a first time indicates a higher degree of similarity to a second vector representing a second event at a second time than to a third vector representing a third event at a third time, wherein a first difference between the first time and the second time is less than a second difference between the first time and the third time; receiving, from an event processing device, a candidate event for the user; generating a candidate event vector using the model and the candidate event; identifying a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector representing a prior event, wherein the degree of similarity is generated using an exponentially weighted moving average Ck+1 that is determined by the equation:
  • 2. The computer-implemented method of claim 1, wherein identifying the behavior anomaly further comprises: identifying the set of prior events representing past behavior of the user, the set of prior events including the prior event; and generating the degree of similarity between the candidate event vector and vectors representing the set of prior events.
  • 3. The computer-implemented method of claim 2, further comprising generating the vectors representing each of the set of prior events.
  • 4. The computer-implemented method of claim 2, wherein the degree of similarity is generated using a mean of similarity values between the candidate event vector and the vectors representing the set of prior events.
  • 5. The computer-implemented method of claim 2, wherein the degree of similarity is generated using a maximum or a minimum similarity value between the candidate event vector and the vectors representing the set of prior events.
  • 6. The computer-implemented method of claim 2, further comprising generating the composite vector for the user based on the vectors representing the set of prior events.
  • 7. The computer-implemented method of claim 2, further comprising: receiving anomaly indicators for the set of prior events; generating an anomaly model that combines similarity metrics of the set of prior events to generate an output determination for a prior event corresponding to an anomaly indicator for the prior event; generating similarity metrics for the candidate event, the similarity metrics indicating degrees of similarity between the candidate event and at least one of the prior events included in the set of prior events, the similarity metrics including the degree of similarity between the candidate event vector and a vector representing one of the prior events included in the set of prior events; and generating the indication of the behavior anomaly using the similarity metrics and the anomaly model.
  • 8. The computer-implemented method of claim 1, wherein one of the maturation window size or the exponential decay factor is selected using the candidate event.
  • 9. The computer-implemented method of claim 1, wherein the event processing device comprises a card reading device, and wherein receiving the candidate event for the user comprises receiving, from the card reading device, an authorization request including the candidate event, and wherein providing the indication of the behavior anomaly comprises providing an authorization response indicating the candidate event is unauthorized.
  • 10. The computer-implemented method of claim 1, further comprising receiving a third-party behavior score for the user from a third-party behavior scoring system, wherein identifying the behavior anomaly is further based at least in part on the third-party behavior score.
  • 11. A computer-implemented method of artificial intelligence guided monitoring of event data, the method comprising: under control of one or more computing devices configured with specific computer-executable instructions, receiving, from an event processing device, a candidate event for a user; generating a candidate event vector using a model and the candidate event; identifying a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector for a prior event, wherein the degree of similarity is generated using an exponentially weighted moving average Ck+1 that is determined by the equation:
  • 12. The computer-implemented method of claim 11, further comprising: accessing, from a data store, a sequence of event records associated with the user, the sequence of event records indicating a history of events for the user; identifying a set of attributes of an event record to represent the event records; and generating a model to provide the vector of the prior event included in the history of events using values for the set of attributes of the sequence of event records.
  • 13. The computer-implemented method of claim 12, further comprising receiving a third-party behavior score for the user from a third-party behavior scoring system, wherein identifying the behavior anomaly is further based at least in part on the third-party behavior score.
  • 14. The computer-implemented method of claim 11, wherein a vector representation of a first event at a first time indicates a higher degree of similarity to a second vector representation of a second event at a second time than to a third vector representation of a third event at a third time, wherein a first difference between the first time and the second time is less than a second difference between the first time and the third time.
  • 15. The computer-implemented method of claim 11, wherein providing the indication of the behavior anomaly comprises providing an authorization response indicating a transaction associated with the candidate event is unauthorized, wherein the authorization response causes configuration of the event processing device to acquire additional event information to authorize the transaction associated with the candidate event.
  • 16. Non-transitory computer-readable media comprising instructions for artificial intelligence guided monitoring of event data, wherein the instructions, when executed by one or more computing devices associated with an electronic data processing device, cause the electronic data processing device to: receive, from an event processing device, a candidate event for a user; generate a candidate event vector using a model and the candidate event; identify a behavior anomaly using a degree of similarity between the candidate event vector and a prior event vector for a prior event, wherein the degree of similarity is generated using an exponentially weighted moving average Ck+1 that is determined by the equation:
  • 17. The non-transitory computer-readable media of claim 16, further comprising instructions that cause the electronic data processing device to: access, from a data store, a sequence of event records associated with the user, the sequence of event records indicating a history of events for the user; identify a set of attributes of an event record to represent the event records; and generate a model to provide the vector of the prior event included in the history of events using values for the set of attributes of the sequence of event records.
  • 18. The non-transitory computer-readable media of claim 17, wherein a first vector of a first event at a first time indicates a higher degree of similarity to a second vector of a second event at a second time than to a third vector of a third event at a third time, wherein a first difference between the first time and the second time is less than a second difference between the first time and the third time.
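Claims 1, 11, and 16 each recite an exponentially weighted moving average Ck+1; the equation itself is rendered as an image in the issued patent and does not survive in this text. The following minimal Python sketch is a non-authoritative illustration only: it assumes one common EWMA formulation in which a maturation window M controls the effective decay factor, and the cosine-similarity choice, function names, and threshold below are hypothetical rather than the patent's actual definitions.

    import numpy as np

    def cosine_similarity(a, b):
        # Degree of similarity between two event vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def ewma_update(c_k, s_next, k, maturation_window):
        # Assumed update: C_{k+1} = C_k + (s_{k+1} - C_k) / min(k + 1, M).
        # This behaves as a simple running mean until k + 1 reaches the
        # maturation window M, then decays older observations with an
        # effective exponential decay factor of 1 - 1/M.
        return c_k + (s_next - c_k) / min(k + 1, maturation_window)

    # Hypothetical usage: maintain a running similarity score between a
    # candidate event vector and vectors representing a user's prior events.
    rng = np.random.default_rng(0)
    prior_vectors = [rng.standard_normal(32) for _ in range(10)]
    candidate_vector = rng.standard_normal(32)

    c = 0.0
    for k, prior in enumerate(prior_vectors):
        c = ewma_update(c, cosine_similarity(candidate_vector, prior), k,
                        maturation_window=5)

    # Dependent claims 4 and 5 describe alternatives: a mean, a maximum, or a
    # minimum of the per-event similarity values in place of the EWMA.
    is_anomaly = c < 0.2  # hypothetical anomaly threshold

Under this assumed form, selecting the maturation window or the exponential decay factor using the candidate event, as recited in claim 8, would amount to passing a different maturation_window value on each update.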
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 62/188,252, filed Jul. 2, 2015, which is incorporated by reference in its entirety. Any and all priority claims identified in the Application Data Sheet, or any correction thereto, are hereby incorporated by reference under 37 C.F.R. § 1.57.

US Referenced Citations (1063)
Number Name Date Kind
2074513 Mills Mar 1937 A
3316395 Lavin et al. Apr 1967 A
3752904 Waterbury Aug 1973 A
4163290 Sutherlin et al. Jul 1979 A
5274547 Zoffel et al. Dec 1993 A
5323315 Highbloom Jun 1994 A
5386104 Sime Jan 1995 A
5414833 Hershey et al. May 1995 A
5454030 de Oliveira et al. Sep 1995 A
5504675 Cragun et al. Apr 1996 A
5563783 Stolfo et al. Oct 1996 A
5627886 Bowman May 1997 A
5679940 Templeton et al. Oct 1997 A
5696907 Tom Dec 1997 A
5696965 Dedrick Dec 1997 A
5739512 Tognazzini Apr 1998 A
5742775 King Apr 1998 A
5745654 Titan Apr 1998 A
5752242 Havens May 1998 A
5754632 Smith May 1998 A
5774868 Cragun et al. Jun 1998 A
5793497 Funk Aug 1998 A
5809478 Greco et al. Sep 1998 A
5819226 Gopinathan et al. Oct 1998 A
5819260 Lu et al. Oct 1998 A
5822741 Fischthal Oct 1998 A
5832068 Smith Nov 1998 A
5842178 Giovannoli Nov 1998 A
5870721 Norris Feb 1999 A
5872921 Zahariev et al. Feb 1999 A
5878403 DeFrancesco Mar 1999 A
5879297 Haynor et al. Mar 1999 A
5884289 Anderson et al. Mar 1999 A
5912839 Ovshinsky et al. Jun 1999 A
5913196 Talmor et al. Jun 1999 A
5943666 Kleewein et al. Aug 1999 A
5950179 Buchanan et al. Sep 1999 A
5987440 O'Neil et al. Nov 1999 A
5999907 Donner Dec 1999 A
5999940 Ranger Dec 1999 A
6023694 Kouchi et al. Feb 2000 A
6029139 Cunningham et al. Feb 2000 A
6029149 Dykstra et al. Feb 2000 A
6029154 Pettitt Feb 2000 A
6029194 Tilt Feb 2000 A
6044357 Garg Mar 2000 A
6055570 Nielsen Apr 2000 A
6094643 Anderson et al. Jul 2000 A
6119103 Basch et al. Sep 2000 A
6125985 Amdahl et al. Oct 2000 A
6142283 Amdahl et al. Nov 2000 A
6144988 Kappel Nov 2000 A
6157707 Baulier et al. Dec 2000 A
6182219 Feldbau et al. Jan 2001 B1
6208720 Curtis et al. Mar 2001 B1
6249228 Shirk et al. Jun 2001 B1
6253203 O'Flaherty et al. Jun 2001 B1
6254000 Degen et al. Jul 2001 B1
6263447 French et al. Jul 2001 B1
6269349 Aieta et al. Jul 2001 B1
6282658 French et al. Aug 2001 B2
6285983 Jenkins Sep 2001 B1
6285987 Roth et al. Sep 2001 B1
6292795 Peters et al. Sep 2001 B1
6311169 Duhon Oct 2001 B2
6317783 Freishtat et al. Nov 2001 B1
6321339 French et al. Nov 2001 B1
6330546 Gopinathan et al. Dec 2001 B1
6397197 Gindlesperger May 2002 B1
6418436 Degen et al. Jul 2002 B1
6424956 Werbos Jul 2002 B1
6448889 Hudson Sep 2002 B1
6456984 Demoff et al. Sep 2002 B1
6496936 French et al. Dec 2002 B1
5870721 Norris Jan 2003 C1
6505193 Musgrave et al. Jan 2003 B1
6510415 Talmor et al. Jan 2003 B1
6513018 Culhane Jan 2003 B1
6532459 Berson Mar 2003 B1
6542894 Lee et al. Apr 2003 B1
6543683 Hoffman Apr 2003 B2
6553495 Johansson et al. Apr 2003 B1
6571334 Feldbau et al. May 2003 B1
6597775 Lawyer et al. Jul 2003 B2
6612488 Suzuki Sep 2003 B2
6615193 Kingdon et al. Sep 2003 B1
6658393 Basch et al. Dec 2003 B1
6662023 Helle Dec 2003 B1
6696941 Baker Feb 2004 B2
6700220 Bayeur et al. Mar 2004 B2
6714918 Hillmer et al. Mar 2004 B2
6735572 Landesmann May 2004 B2
6740875 Ishikawa et al. May 2004 B1
6748426 Shaffer et al. Jun 2004 B1
6751626 Brown et al. Jun 2004 B2
6796497 Benkert et al. Sep 2004 B2
6811082 Wong Nov 2004 B2
6829711 Kwok et al. Dec 2004 B1
6850606 Lawyer et al. Feb 2005 B2
6857073 French et al. Feb 2005 B2
6866586 Oberberger et al. Mar 2005 B2
6871287 Ellingson Mar 2005 B1
6873979 Fishman et al. Mar 2005 B2
6898574 Regan May 2005 B1
6907408 Angel Jun 2005 B2
6908030 Rajasekaran et al. Jun 2005 B2
6913194 Suzuki Jul 2005 B2
6918038 Smith et al. Jul 2005 B1
6920435 Hoffman et al. Jul 2005 B2
6928546 Nanavati et al. Aug 2005 B1
6930707 Bates et al. Aug 2005 B2
6934849 Kramer et al. Aug 2005 B2
6934858 Woodhill Aug 2005 B2
6965881 Brickell et al. Nov 2005 B1
6965997 Dutta Nov 2005 B2
6973462 Dattero et al. Dec 2005 B2
6973575 Arnold Dec 2005 B2
6983381 Jerdonek Jan 2006 B2
6983882 Cassone Jan 2006 B2
6991174 Zuili Jan 2006 B2
6993659 Milgramm et al. Jan 2006 B2
7007174 Wheeler et al. Feb 2006 B2
7028052 Chapman et al. Apr 2006 B2
7035855 Kilger et al. Apr 2006 B1
7069240 Spero et al. Jun 2006 B2
7083090 Zuili Aug 2006 B2
7089592 Adjaoute et al. Aug 2006 B2
7092891 Maus et al. Aug 2006 B2
7104444 Suzuki Sep 2006 B2
7158622 Lawyer et al. Jan 2007 B2
7162640 Heath et al. Jan 2007 B2
7174335 Kameda Feb 2007 B2
7188078 Arnett et al. Mar 2007 B2
7203653 McIntosh Apr 2007 B1
7212995 Schulkins May 2007 B2
7222779 Pineda-Sanchez et al. May 2007 B1
7225977 Davis Jun 2007 B2
7234156 French et al. Jun 2007 B2
7240059 Bayliss et al. Jul 2007 B2
7240363 Ellingson Jul 2007 B1
7246067 Austin et al. Jul 2007 B2
7246740 Swift et al. Jul 2007 B2
7254560 Singhal Aug 2007 B2
7263506 Lee et al. Aug 2007 B2
7272728 Pierson et al. Sep 2007 B2
7272857 Everhart Sep 2007 B1
7277869 Starkman Oct 2007 B2
7277875 Serrano-Morales et al. Oct 2007 B2
7283974 Katz et al. Oct 2007 B2
7289607 Bhargava et al. Oct 2007 B2
7290704 Ball et al. Nov 2007 B1
7298873 Miller, Jr. et al. Nov 2007 B2
7310743 Gagne et al. Dec 2007 B1
7314162 Carr et al. Jan 2008 B2
7314167 Kiliccote Jan 2008 B1
7330871 Barber Feb 2008 B2
7333635 Tsantes et al. Feb 2008 B2
7340042 Cluff et al. Mar 2008 B2
7343149 Benco Mar 2008 B2
7356516 Richey et al. Apr 2008 B2
7370044 Mulhern et al. May 2008 B2
7370351 Ramachandran et al. May 2008 B1
7376618 Anderson et al. May 2008 B1
7383227 Weinflash et al. Jun 2008 B2
7386448 Poss et al. Jun 2008 B1
7386506 Aoki et al. Jun 2008 B2
7392534 Lu et al. Jun 2008 B2
7395273 Khan et al. Jul 2008 B2
7398915 Pineda-Sanchez et al. Jul 2008 B1
7406715 Clapper Jul 2008 B2
7412228 Barclay et al. Aug 2008 B2
7418431 Nies et al. Aug 2008 B1
7428509 Klebanoff Sep 2008 B2
7433855 Gavan et al. Oct 2008 B2
7433864 Malik Oct 2008 B2
7438226 Helsper et al. Oct 2008 B2
7444518 Dharmarajan et al. Oct 2008 B1
7457401 Lawyer et al. Nov 2008 B2
7458508 Shao et al. Dec 2008 B1
7466235 Kolb et al. Dec 2008 B1
7467401 Cicchitto Dec 2008 B2
7480631 Merced et al. Jan 2009 B1
7481363 Zuili Jan 2009 B2
7490052 Kilger et al. Feb 2009 B2
7490356 Lieblich et al. Feb 2009 B2
7497374 Helsper et al. Mar 2009 B2
7509117 Yum Mar 2009 B2
7512221 Toms Mar 2009 B2
7519558 Ballard et al. Apr 2009 B2
7522060 Tumperi et al. Apr 2009 B1
7533808 Song et al. May 2009 B2
7536346 Aliffi et al. May 2009 B2
7540021 Page May 2009 B2
7542993 Satterfield et al. Jun 2009 B2
7543739 Brown et al. Jun 2009 B2
7543740 Greene et al. Jun 2009 B2
7546271 Chmielewski et al. Jun 2009 B1
7548886 Kirkland et al. Jun 2009 B2
7552467 Lindsay Jun 2009 B2
7562184 Henmi et al. Jul 2009 B2
7562814 Shao et al. Jul 2009 B1
7568616 Zuili Aug 2009 B2
7575157 Barnhardt et al. Aug 2009 B2
7580884 Cook Aug 2009 B2
7581112 Brown et al. Aug 2009 B2
7584146 Duhon Sep 2009 B1
7587368 Felsher Sep 2009 B2
7591425 Zuili et al. Sep 2009 B1
7593891 Kornegay et al. Sep 2009 B2
7606401 Hoffman et al. Oct 2009 B2
7606790 Levy Oct 2009 B2
7610216 May et al. Oct 2009 B1
7610229 Kornegay Oct 2009 B1
7610243 Haggerty et al. Oct 2009 B2
7620596 Knudson et al. Nov 2009 B2
7623844 Herrmann et al. Nov 2009 B2
7630924 Collins et al. Dec 2009 B1
7630932 Danaher et al. Dec 2009 B2
7636853 Cluts et al. Dec 2009 B2
7644868 Hare Jan 2010 B2
7647344 Skurtovich, Jr. et al. Jan 2010 B2
7647645 Edeki et al. Jan 2010 B2
7653593 Zarikian et al. Jan 2010 B2
7657431 Hayakawa Feb 2010 B2
7668769 Baker et al. Feb 2010 B2
7668840 Bayliss et al. Feb 2010 B2
7668921 Proux et al. Feb 2010 B2
7672865 Kumar et al. Mar 2010 B2
7673793 Greene et al. Mar 2010 B2
7676418 Chung et al. Mar 2010 B1
7676433 Ross et al. Mar 2010 B1
7685096 Margolus et al. Mar 2010 B2
7686214 Shao et al. Mar 2010 B1
7689007 Bous et al. Mar 2010 B2
7689505 Kasower Mar 2010 B2
7689506 Fei et al. Mar 2010 B2
7690032 Peirce Mar 2010 B1
7701364 Zilberman Apr 2010 B1
7702550 Perg et al. Apr 2010 B2
7707163 Anzalone et al. Apr 2010 B2
7708190 Brandt et al. May 2010 B2
7708200 Helsper et al. May 2010 B2
7711635 Steele et al. May 2010 B2
7711636 Robida et al. May 2010 B2
7720750 Brody May 2010 B2
7725300 Pinto et al. May 2010 B2
7734523 Cui et al. Jun 2010 B1
7735125 Alvarez et al. Jun 2010 B1
7742982 Chaudhuri et al. Jun 2010 B2
7747520 Livermore et al. Jun 2010 B2
7747521 Serio Jun 2010 B2
7747559 Leitner et al. Jun 2010 B2
7752084 Pettitt Jul 2010 B2
7752236 Williams et al. Jul 2010 B2
7752554 Biggs et al. Jul 2010 B2
7756783 Crooks Jul 2010 B2
7761379 Zoldi et al. Jul 2010 B2
7761384 Madhogarhia Jul 2010 B2
7774270 MacCloskey Aug 2010 B1
7778885 Semprevivo et al. Aug 2010 B1
7779456 Dennis et al. Aug 2010 B2
7779457 Taylor Aug 2010 B2
7783281 Cook et al. Aug 2010 B1
7783515 Kumar et al. Aug 2010 B1
7788184 Kane Aug 2010 B2
7792715 Kasower Sep 2010 B1
7792864 Rice et al. Sep 2010 B1
7793835 Coggeshall et al. Sep 2010 B1
7801811 Merrell et al. Sep 2010 B1
7801828 Candella et al. Sep 2010 B2
7802104 Dickinson Sep 2010 B2
7805362 Merrell et al. Sep 2010 B1
7805391 Friedlander et al. Sep 2010 B2
7809797 Cooley et al. Oct 2010 B2
7813944 Luk et al. Oct 2010 B1
7827115 Weller et al. Nov 2010 B2
7832006 Chen et al. Nov 2010 B2
7835983 Lefner et al. Nov 2010 B2
7840459 Loftesness et al. Nov 2010 B1
7841004 Balducci et al. Nov 2010 B1
7844520 Franklin Nov 2010 B1
7848987 Haig Dec 2010 B2
7849029 Crooks et al. Dec 2010 B2
7853518 Cagan Dec 2010 B2
7853526 Milana Dec 2010 B2
7853533 Eisen Dec 2010 B2
7853998 Blaisdell et al. Dec 2010 B2
7856397 Whipple et al. Dec 2010 B2
7856494 Kulkarni Dec 2010 B2
7860769 Benson Dec 2010 B2
7860783 Yang et al. Dec 2010 B2
7865427 Wright et al. Jan 2011 B2
7865439 Seifert et al. Jan 2011 B2
7865937 White et al. Jan 2011 B1
7870078 Clark et al. Jan 2011 B2
7870599 Pemmaraju Jan 2011 B2
7873382 Rydgren et al. Jan 2011 B2
7874488 Parkinson Jan 2011 B2
7877304 Coulter Jan 2011 B1
7877784 Chow et al. Jan 2011 B2
7882548 Heath et al. Feb 2011 B2
7890433 Singhal Feb 2011 B2
7904360 Evans Mar 2011 B2
7904367 Chung et al. Mar 2011 B2
7908242 Achanta Mar 2011 B1
7909246 Hogg et al. Mar 2011 B2
7912865 Akerman et al. Mar 2011 B2
7917715 Tallman, Jr. Mar 2011 B2
7925582 Kornegay et al. Apr 2011 B1
7929951 Stevens et al. Apr 2011 B2
7933835 Keane et al. Apr 2011 B2
7941363 Tanaka et al. May 2011 B2
7945515 Zoldi et al. May 2011 B2
7950577 Daniel May 2011 B1
7958046 Doerner et al. Jun 2011 B2
7961857 Zoldi et al. Jun 2011 B2
7962404 Metzger, II et al. Jun 2011 B1
7962467 Howard et al. Jun 2011 B2
7970679 Kasower Jun 2011 B2
7970698 Gupta et al. Jun 2011 B2
7970701 Lewis et al. Jun 2011 B2
7971246 Emigh et al. Jun 2011 B1
7975299 Balducci et al. Jul 2011 B1
7983976 Nafeh et al. Jul 2011 B2
7983979 Holland, IV Jul 2011 B2
7984849 Berghel et al. Jul 2011 B2
7988043 Davis Aug 2011 B2
7991201 Bous et al. Aug 2011 B2
7991689 Brunzell et al. Aug 2011 B1
7991716 Crooks et al. Aug 2011 B2
7991751 Peled et al. Aug 2011 B2
7995994 Khetawat et al. Aug 2011 B2
7996521 Chamberlain et al. Aug 2011 B2
8001034 Chung et al. Aug 2011 B2
8001042 Brunzell et al. Aug 2011 B1
8001153 Skurtovich, Jr. et al. Aug 2011 B2
8001597 Crooks Aug 2011 B2
8005749 Ginsberg Aug 2011 B2
8006291 Headley et al. Aug 2011 B2
8009873 Chapman Aug 2011 B2
8019678 Wright et al. Sep 2011 B2
8020763 Kowalchyk et al. Sep 2011 B1
8024263 Zarikian et al. Sep 2011 B2
8024271 Grant Sep 2011 B2
8027439 Zoldi et al. Sep 2011 B2
8027518 Baker et al. Sep 2011 B2
8027947 Hinsz et al. Sep 2011 B2
8028168 Smithies et al. Sep 2011 B2
8028326 Palmer et al. Sep 2011 B2
8028329 Whitcomb Sep 2011 B2
8028896 Carter et al. Oct 2011 B2
8032448 Anderson et al. Oct 2011 B2
8032449 Hu et al. Oct 2011 B2
8032927 Ross Oct 2011 B2
8037097 Guo et al. Oct 2011 B2
8037512 Wright et al. Oct 2011 B2
8041597 Li et al. Oct 2011 B2
8042159 Basner et al. Oct 2011 B2
8042193 Piliouras Oct 2011 B1
8049596 Sato Nov 2011 B2
8055667 Levy Nov 2011 B2
8056128 Dingle et al. Nov 2011 B1
8058972 Mohanty Nov 2011 B2
8060424 Kasower Nov 2011 B2
8060915 Voice et al. Nov 2011 B2
8060916 Bajaj et al. Nov 2011 B2
8065233 Lee et al. Nov 2011 B2
8065525 Zilberman Nov 2011 B2
8069053 Gervais et al. Nov 2011 B2
8069084 Mackouse Nov 2011 B2
8069256 Rasti Nov 2011 B2
8069485 Carter Nov 2011 B2
8073785 Candella et al. Dec 2011 B1
8078569 Kennel Dec 2011 B2
8090648 Zoldi et al. Jan 2012 B2
8104679 Brown Jan 2012 B2
8116731 Buhrmann et al. Feb 2012 B2
8121962 Vaiciulis et al. Feb 2012 B2
8131615 Diev et al. Mar 2012 B2
8151327 Eisen Apr 2012 B2
8195549 Kasower Jun 2012 B2
8201257 Andres et al. Jun 2012 B1
8204774 Chwast et al. Jun 2012 B2
8214262 Semprevivo et al. Jul 2012 B1
8214285 Hu et al. Jul 2012 B2
8224723 Bosch et al. Jul 2012 B2
8225395 Atwood et al. Jul 2012 B2
8239677 Colson Aug 2012 B2
8244629 Lewis et al. Aug 2012 B2
8260914 Ranjan Sep 2012 B1
8280805 Abrahams et al. Oct 2012 B1
8280833 Miltonberger Oct 2012 B2
8285613 Coulter Oct 2012 B1
8285636 Curry et al. Oct 2012 B2
8296225 Maddipati et al. Oct 2012 B2
8296229 Yellin et al. Oct 2012 B1
8296250 Crooks et al. Oct 2012 B2
8332338 Vaiciulis et al. Dec 2012 B2
8346593 Fanelli Jan 2013 B2
8355896 Kumar et al. Jan 2013 B2
8359278 Domenikos et al. Jan 2013 B2
8364588 Celka et al. Jan 2013 B2
8374973 Herbrich et al. Feb 2013 B2
8386377 Xiong et al. Feb 2013 B1
8429070 Hu et al. Apr 2013 B2
8468090 Lesandro et al. Jun 2013 B2
8489479 Slater et al. Jul 2013 B2
8510329 Balkir et al. Aug 2013 B2
8515844 Kasower Aug 2013 B2
8516439 Brass et al. Aug 2013 B2
8543499 Haggerty et al. Sep 2013 B2
8548137 Zoldi et al. Oct 2013 B2
8548903 Becker Oct 2013 B2
8549590 de Villiers Prichard et al. Oct 2013 B1
8559607 Zoldi et al. Oct 2013 B2
8567669 Griegel et al. Oct 2013 B2
8578496 Krishnappa Nov 2013 B1
8630938 Cheng et al. Jan 2014 B2
8639920 Stack et al. Jan 2014 B2
8645301 Vaiciulis et al. Feb 2014 B2
8671115 Skurtovich, Jr. et al. Mar 2014 B2
8676684 Newman et al. Mar 2014 B2
8676726 Hore et al. Mar 2014 B2
8682755 Bucholz et al. Mar 2014 B2
8683586 Crooks Mar 2014 B2
8694427 Maddipati et al. Apr 2014 B2
8707445 Sher-Jan et al. Apr 2014 B2
8725613 Celka et al. May 2014 B1
8763133 Sher-Jan et al. Jun 2014 B2
8776225 Pierson et al. Jul 2014 B2
8781953 Kasower Jul 2014 B2
8781975 Bennett et al. Jul 2014 B2
8793777 Colson Jul 2014 B2
8805836 Hore et al. Aug 2014 B2
8812387 Samler et al. Aug 2014 B1
8819793 Gottschalk, Jr. Aug 2014 B2
8824648 Zoldi et al. Sep 2014 B2
8826393 Eisen Sep 2014 B2
8862514 Eisen Oct 2014 B2
8862526 Miltonberger Oct 2014 B2
8909664 Hopkins Dec 2014 B2
8918891 Coggeshall et al. Dec 2014 B2
8949981 Trollope et al. Feb 2015 B1
9118646 Pierson et al. Aug 2015 B2
9191403 Zoldi et al. Nov 2015 B2
9194899 Zoldi et al. Nov 2015 B2
9196004 Eisen Nov 2015 B2
9210156 Little et al. Dec 2015 B1
9235728 Gottschalk, Jr. et al. Jan 2016 B2
9251541 Celka et al. Feb 2016 B2
9256624 Skurtovich, Jr. et al. Feb 2016 B2
9280658 Coggeshall et al. Mar 2016 B2
9361597 Britton et al. Jun 2016 B2
9367520 Zhao et al. Jun 2016 B2
9390384 Eisen Jul 2016 B2
9412141 Prichard et al. Aug 2016 B2
9483650 Sher-Jan et al. Nov 2016 B2
9489497 MaGill et al. Nov 2016 B2
9531738 Zoldi et al. Dec 2016 B2
9558368 Gottschalk, Jr. et al. Jan 2017 B2
9595066 Samler et al. Mar 2017 B2
9652802 Kasower May 2017 B1
9704195 Zoldi Jul 2017 B2
9710523 Skurtovich, Jr. et al. Jul 2017 B2
9710868 Gottschalk, Jr. et al. Jul 2017 B2
9754256 Britton et al. Sep 2017 B2
9754311 Eisen Sep 2017 B2
9773227 Zoldi et al. Sep 2017 B2
9781147 Sher-Jan et al. Oct 2017 B2
9953321 Zoldi et al. Apr 2018 B2
10043213 Straub et al. Aug 2018 B2
10089679 Eisen Oct 2018 B2
10089686 Straub et al. Oct 2018 B2
10102530 Zoldi et al. Oct 2018 B2
10115153 Zoldi et al. Oct 2018 B2
10152736 Yang et al. Dec 2018 B2
10217163 Straub et al. Feb 2019 B2
10242540 Chen et al. Mar 2019 B2
10339527 Coleman et al. Jul 2019 B1
10373061 Kennel et al. Aug 2019 B2
10430604 Spinelli et al. Oct 2019 B2
10438308 Prichard et al. Oct 2019 B2
10497034 Yang et al. Dec 2019 B2
10510025 Zoldi et al. Dec 2019 B2
10528948 Zoldi et al. Jan 2020 B2
10579938 Zoldi et al. Mar 2020 B2
10592982 Samler et al. Mar 2020 B2
10593004 Gottschalk, Jr. et al. Mar 2020 B2
10692058 Zoldi et al. Jun 2020 B2
10699028 Kennedy et al. Jun 2020 B1
10713711 Zoldi Jul 2020 B2
10769290 Crawford et al. Sep 2020 B2
10791136 Zoldi et al. Sep 2020 B2
10896381 Zoldi et al. Jan 2021 B2
10902426 Zoldi et al. Jan 2021 B2
20010014868 Herz et al. Aug 2001 A1
20010014878 Mitra et al. Aug 2001 A1
20010027413 Bhutta Oct 2001 A1
20010029470 Schultz et al. Oct 2001 A1
20010034631 Kiselik Oct 2001 A1
20010039523 Iwamoto Nov 2001 A1
20020010684 Moskowitz Jan 2002 A1
20020013899 Faul Jan 2002 A1
20020019804 Sutton Feb 2002 A1
20020019938 Aarons Feb 2002 A1
20020032635 Harris et al. Mar 2002 A1
20020040344 Preiser et al. Apr 2002 A1
20020042879 Gould et al. Apr 2002 A1
20020052841 Guthrie et al. May 2002 A1
20020059521 Tasler May 2002 A1
20020062185 Runge et al. May 2002 A1
20020062281 Singhal May 2002 A1
20020073044 Singhal Jun 2002 A1
20020077178 Oberberger et al. Jun 2002 A1
20020077964 Brody et al. Jun 2002 A1
20020080256 Bates et al. Jun 2002 A1
20020087460 Hornung Jul 2002 A1
20020099649 Lee et al. Jul 2002 A1
20020119824 Allen Aug 2002 A1
20020130176 Suzuki Sep 2002 A1
20020138417 Lawrence Sep 2002 A1
20020138751 Dutta Sep 2002 A1
20020147695 Khedkar et al. Oct 2002 A1
20020156676 Ahrens et al. Oct 2002 A1
20020161664 Shaya et al. Oct 2002 A1
20020161711 Sartor et al. Oct 2002 A1
20020173994 Ferguson, III Nov 2002 A1
20020178112 Goeller et al. Nov 2002 A1
20020184509 Scheldt et al. Dec 2002 A1
20020188544 Wizon et al. Dec 2002 A1
20030004879 Demoff et al. Jan 2003 A1
20030009426 Ruiz-Sanchez Jan 2003 A1
20030018549 Fei et al. Jan 2003 A1
20030033261 Knegendorf Feb 2003 A1
20030046554 Leydier et al. Mar 2003 A1
20030048904 Wang et al. Mar 2003 A1
20030050882 Degen et al. Mar 2003 A1
20030057278 Wong Mar 2003 A1
20030061163 Durfield Mar 2003 A1
20030065563 Elliott et al. Apr 2003 A1
20030070101 Buscemi Apr 2003 A1
20030078877 Beirne et al. Apr 2003 A1
20030093366 Halper et al. May 2003 A1
20030097320 Gordon May 2003 A1
20030105696 Kalotay et al. Jun 2003 A1
20030115133 Bian Jun 2003 A1
20030143980 Choi et al. Jul 2003 A1
20030149744 Bierre et al. Aug 2003 A1
20030153299 Perfit et al. Aug 2003 A1
20030158751 Suresh et al. Aug 2003 A1
20030158960 Engberg Aug 2003 A1
20030182214 Taylor Sep 2003 A1
20030195859 Lawrence Oct 2003 A1
20030200447 Sjoblom Oct 2003 A1
20030208428 Raynes et al. Nov 2003 A1
20030222500 Bayeur et al. Dec 2003 A1
20030225656 Aberman et al. Dec 2003 A1
20030225692 Bosch et al. Dec 2003 A1
20030225742 Tenner et al. Dec 2003 A1
20030233278 Marshall Dec 2003 A1
20040004117 Suzuki Jan 2004 A1
20040005912 Hubbe et al. Jan 2004 A1
20040010698 Rolfe Jan 2004 A1
20040024709 Yu et al. Feb 2004 A1
20040026496 Zuili Feb 2004 A1
20040030649 Nelson et al. Feb 2004 A1
20040039586 Garvey et al. Feb 2004 A1
20040054619 Watson et al. Mar 2004 A1
20040059653 Verkuylen et al. Mar 2004 A1
20040064401 Palaghita et al. Apr 2004 A1
20040078324 Lonnberg et al. Apr 2004 A1
20040103147 Flesher et al. May 2004 A1
20040107363 Monteverde Jun 2004 A1
20040110119 Riconda et al. Jun 2004 A1
20040111305 Gavan et al. Jun 2004 A1
20040111335 Black et al. Jun 2004 A1
20040117235 Shacham Jun 2004 A1
20040128227 Whipple et al. Jul 2004 A1
20040128232 Descloux Jul 2004 A1
20040133440 Carolan et al. Jul 2004 A1
20040143526 Monasterio et al. Jul 2004 A1
20040149820 Zuili Aug 2004 A1
20040149827 Zuili Aug 2004 A1
20040153330 Miller et al. Aug 2004 A1
20040153656 Cluts et al. Aug 2004 A1
20040158520 Noh Aug 2004 A1
20040158523 Dort Aug 2004 A1
20040158723 Root Aug 2004 A1
20040167793 Masuoka et al. Aug 2004 A1
20040177046 Ogram Sep 2004 A1
20040193538 Raines Sep 2004 A1
20040199456 Flint et al. Oct 2004 A1
20040199462 Starrs Oct 2004 A1
20040204948 Singletary et al. Oct 2004 A1
20040225594 Nolan, III et al. Nov 2004 A1
20040230448 Schaich Nov 2004 A1
20040230527 Hansen et al. Nov 2004 A1
20040230538 Clifton et al. Nov 2004 A1
20040234117 Tibor Nov 2004 A1
20040243514 Wankmueller Dec 2004 A1
20040243518 Clifton et al. Dec 2004 A1
20040243567 Levy Dec 2004 A1
20040250085 Tattan et al. Dec 2004 A1
20040255127 Arnouse Dec 2004 A1
20040260922 Goodman et al. Dec 2004 A1
20050001028 Zuili Jan 2005 A1
20050005168 Dick Jan 2005 A1
20050010513 Duckworth et al. Jan 2005 A1
20050010780 Kane et al. Jan 2005 A1
20050021476 Candella et al. Jan 2005 A1
20050021519 Ghouri Jan 2005 A1
20050027983 Klawon Feb 2005 A1
20050038726 Salomon et al. Feb 2005 A1
20050038737 Norris Feb 2005 A1
20050039086 Krishnamurthy Feb 2005 A1
20050050577 Westbrook et al. Mar 2005 A1
20050058262 Timmins et al. Mar 2005 A1
20050065950 Chaganti et al. Mar 2005 A1
20050071282 Lu et al. Mar 2005 A1
20050075985 Cartmell Apr 2005 A1
20050081052 Washington Apr 2005 A1
20050086161 Gallant Apr 2005 A1
20050091164 Varble Apr 2005 A1
20050097039 Kulcsar et al. May 2005 A1
20050097051 Madill, Jr. et al. May 2005 A1
20050097364 Edeki et al. May 2005 A1
20050102206 Savasoglu et al. May 2005 A1
20050105719 Huda May 2005 A1
20050125226 Magee Jun 2005 A1
20050125686 Brandt Jun 2005 A1
20050138391 Mandalia et al. Jun 2005 A1
20050144143 Freiberg Jun 2005 A1
20050154664 Guy et al. Jul 2005 A1
20050154665 Kerr Jul 2005 A1
20050154671 Doan et al. Jul 2005 A1
20050165667 Cox Jul 2005 A1
20050197953 Broadbent et al. Sep 2005 A1
20050203885 Chenevich et al. Sep 2005 A1
20050216953 Ellingson Sep 2005 A1
20050229007 Bolle et al. Oct 2005 A1
20050240578 Biederman et al. Oct 2005 A1
20050242173 Suzuki Nov 2005 A1
20050251474 Shinn et al. Nov 2005 A1
20050256809 Sadri Nov 2005 A1
20050262014 Fickes Nov 2005 A1
20050273333 Morin et al. Dec 2005 A1
20050273442 Bennett et al. Dec 2005 A1
20050278542 Pierson et al. Dec 2005 A1
20050279827 Mascavage et al. Dec 2005 A1
20050279869 Barklage Dec 2005 A1
20060004663 Singhal Jan 2006 A1
20060014129 Coleman et al. Jan 2006 A1
20060032909 Seegar Feb 2006 A1
20060041464 Powers et al. Feb 2006 A1
20060045105 Dobosz et al. Mar 2006 A1
20060047605 Ahmad Mar 2006 A1
20060059073 Walzak Mar 2006 A1
20060059110 Madhok et al. Mar 2006 A1
20060064374 Helsper et al. Mar 2006 A1
20060074798 Din et al. Apr 2006 A1
20060074986 Mallalieu et al. Apr 2006 A1
20060080230 Freiberg Apr 2006 A1
20060080263 Willis et al. Apr 2006 A1
20060089905 Song et al. Apr 2006 A1
20060101508 Taylor May 2006 A1
20060106605 Saunders et al. May 2006 A1
20060112279 Cohen et al. May 2006 A1
20060112280 Cohen et al. May 2006 A1
20060129428 Wennberg Jun 2006 A1
20060129481 Bhatt et al. Jun 2006 A1
20060129840 Milgramm et al. Jun 2006 A1
20060131390 Kim Jun 2006 A1
20060136332 Ziegler Jun 2006 A1
20060140460 Coutts Jun 2006 A1
20060143073 Engel et al. Jun 2006 A1
20060144924 Stover Jul 2006 A1
20060149580 Helsper et al. Jul 2006 A1
20060149674 Cook et al. Jul 2006 A1
20060161435 Atef et al. Jul 2006 A1
20060161592 Ertoz Jul 2006 A1
20060173776 Shalley et al. Aug 2006 A1
20060173792 Glass Aug 2006 A1
20060177226 Ellis, III Aug 2006 A1
20060178971 Owen et al. Aug 2006 A1
20060179004 Fuchs Aug 2006 A1
20060195351 Bayburtian Aug 2006 A1
20060200855 Willis Sep 2006 A1
20060202012 Grano et al. Sep 2006 A1
20060204051 Holland, IV Sep 2006 A1
20060206725 Milgramm et al. Sep 2006 A1
20060212386 Willey et al. Sep 2006 A1
20060218069 Aberman et al. Sep 2006 A1
20060229961 Lyftogt et al. Oct 2006 A1
20060239512 Petrillo Oct 2006 A1
20060239513 Song et al. Oct 2006 A1
20060242046 Haggerty et al. Oct 2006 A1
20060242047 Haggerty et al. Oct 2006 A1
20060253358 Delgrosso et al. Nov 2006 A1
20060253583 Dixon et al. Nov 2006 A1
20060255914 Westman Nov 2006 A1
20060262929 Vatanen et al. Nov 2006 A1
20060265243 Racho et al. Nov 2006 A1
20060271456 Romain et al. Nov 2006 A1
20060271457 Romain et al. Nov 2006 A1
20060271633 Adler Nov 2006 A1
20060273158 Suzuki Dec 2006 A1
20060277043 Tomes et al. Dec 2006 A1
20060282285 Helsper et al. Dec 2006 A1
20060282372 Endres et al. Dec 2006 A1
20060282395 Leibowitz Dec 2006 A1
20060287765 Kraft Dec 2006 A1
20060287902 Helsper et al. Dec 2006 A1
20060288090 Kraft Dec 2006 A1
20060294023 Lu Dec 2006 A1
20070005508 Chiang Jan 2007 A1
20070011100 Libin et al. Jan 2007 A1
20070016500 Chatterji et al. Jan 2007 A1
20070016521 Wang Jan 2007 A1
20070016522 Wang Jan 2007 A1
20070022141 Singleton et al. Jan 2007 A1
20070038483 Wood Feb 2007 A1
20070038568 Greene et al. Feb 2007 A1
20070040017 Kozlay Feb 2007 A1
20070040019 Berghel et al. Feb 2007 A1
20070043577 Sower Feb 2007 A1
20070047770 Swope et al. Mar 2007 A1
20070048765 Abramson Mar 2007 A1
20070050638 Rasti Mar 2007 A1
20070059442 Sabeta Mar 2007 A1
20070061273 Greene et al. Mar 2007 A1
20070067207 Haggerty et al. Mar 2007 A1
20070067297 Kublickis Mar 2007 A1
20070072190 Aggarwal Mar 2007 A1
20070073622 Kane Mar 2007 A1
20070073630 Greene et al. Mar 2007 A1
20070078786 Bous et al. Apr 2007 A1
20070078908 Rohatgi et al. Apr 2007 A1
20070078985 Shao et al. Apr 2007 A1
20070083460 Bachenheimer Apr 2007 A1
20070087795 Aletto et al. Apr 2007 A1
20070093234 Willis et al. Apr 2007 A1
20070094137 Phillips et al. Apr 2007 A1
20070094264 Nair Apr 2007 A1
20070100774 Abdon May 2007 A1
20070106582 Baker et al. May 2007 A1
20070106611 Larsen May 2007 A1
20070107050 Selvarajan May 2007 A1
20070109103 Jedrey et al. May 2007 A1
20070110282 Millsapp May 2007 A1
20070112667 Rucker May 2007 A1
20070112668 Celano et al. May 2007 A1
20070118393 Rosen et al. May 2007 A1
20070155411 Morrison Jul 2007 A1
20070157299 Hare Jul 2007 A1
20070168246 Haggerty et al. Jul 2007 A1
20070168480 Biggs et al. Jul 2007 A1
20070174208 Black et al. Jul 2007 A1
20070179903 Seinfeld et al. Aug 2007 A1
20070180209 Tallman Aug 2007 A1
20070180263 Delgrosso et al. Aug 2007 A1
20070186276 McRae et al. Aug 2007 A1
20070192248 West Aug 2007 A1
20070192853 Shraim et al. Aug 2007 A1
20070198410 Labgold et al. Aug 2007 A1
20070205266 Carr et al. Sep 2007 A1
20070208669 Rivette et al. Sep 2007 A1
20070214037 Shubert et al. Sep 2007 A1
20070214365 Cornett et al. Sep 2007 A1
20070219928 Madhogarhia Sep 2007 A1
20070220594 Tulsyan Sep 2007 A1
20070226093 Chan et al. Sep 2007 A1
20070226129 Liao et al. Sep 2007 A1
20070233614 McNelley et al. Oct 2007 A1
20070234427 Gardner et al. Oct 2007 A1
20070244782 Chimento Oct 2007 A1
20070244807 Andringa et al. Oct 2007 A1
20070250704 Hallam-Baker Oct 2007 A1
20070250920 Lindsay Oct 2007 A1
20070266439 Kraft Nov 2007 A1
20070282730 Carpenter et al. Dec 2007 A1
20070288355 Roland et al. Dec 2007 A1
20070288360 Seeklus Dec 2007 A1
20070288559 Parsadayan Dec 2007 A1
20070291995 Rivera Dec 2007 A1
20070292006 Johnson Dec 2007 A1
20070294104 Boaz et al. Dec 2007 A1
20070299759 Kelly Dec 2007 A1
20080010203 Grant Jan 2008 A1
20080010683 Baddour et al. Jan 2008 A1
20080010687 Gonen et al. Jan 2008 A1
20080015887 Drabek et al. Jan 2008 A1
20080021804 Deckoff Jan 2008 A1
20080027857 Benson Jan 2008 A1
20080027858 Benson Jan 2008 A1
20080052182 Marshall Feb 2008 A1
20080059236 Cartier Mar 2008 A1
20080059352 Chandran Mar 2008 A1
20080059364 Tidwell et al. Mar 2008 A1
20080059366 Fou Mar 2008 A1
20080063172 Ahuja et al. Mar 2008 A1
20080066188 Kwak Mar 2008 A1
20080071882 Hering et al. Mar 2008 A1
20080076386 Khetawat et al. Mar 2008 A1
20080077526 Arumugam Mar 2008 A1
20080086759 Colson Apr 2008 A1
20080098222 Zilberman Apr 2008 A1
20080103798 Domenikos et al. May 2008 A1
20080103799 Domenikos et al. May 2008 A1
20080103800 Domenikos et al. May 2008 A1
20080103811 Sosa May 2008 A1
20080103972 Lane May 2008 A1
20080104021 Cai et al. May 2008 A1
20080104672 Lunde May 2008 A1
20080114837 Biggs et al. May 2008 A1
20080120237 Lin May 2008 A1
20080126116 Singhai May 2008 A1
20080126233 Hogan May 2008 A1
20080140576 Lewis et al. Jun 2008 A1
20080147454 Walker et al. Jun 2008 A1
20080154758 Schattmaier et al. Jun 2008 A1
20080162202 Khanna et al. Jul 2008 A1
20080162383 Kraft Jul 2008 A1
20080167883 Khazaneh Jul 2008 A1
20080175360 Schwarz et al. Jul 2008 A1
20080177655 Zalik Jul 2008 A1
20080177841 Sinn et al. Jul 2008 A1
20080189789 Lamontagne Aug 2008 A1
20080208548 Metzger et al. Aug 2008 A1
20080208610 Thomas et al. Aug 2008 A1
20080208726 Tsantes et al. Aug 2008 A1
20080217400 Portano Sep 2008 A1
20080228635 Megdal et al. Sep 2008 A1
20080243680 Megdal et al. Oct 2008 A1
20080244717 Jelatis et al. Oct 2008 A1
20080255922 Feldman et al. Oct 2008 A1
20080255992 Lin Oct 2008 A1
20080256613 Grover Oct 2008 A1
20080281737 Fajardo Nov 2008 A1
20080281743 Pettit Nov 2008 A1
20080288382 Smith et al. Nov 2008 A1
20080288430 Friedlander et al. Nov 2008 A1
20080288790 Wilson Nov 2008 A1
20080294540 Celka et al. Nov 2008 A1
20080294689 Metzger et al. Nov 2008 A1
20080296367 Parkinson Dec 2008 A1
20080296382 Connell, II et al. Dec 2008 A1
20080300877 Gilbert et al. Dec 2008 A1
20080319889 Hammad Dec 2008 A1
20090007220 Ormazabal et al. Jan 2009 A1
20090018934 Peng et al. Jan 2009 A1
20090021349 Errico et al. Jan 2009 A1
20090024417 Marks et al. Jan 2009 A1
20090024505 Patel et al. Jan 2009 A1
20090024636 Shiloh Jan 2009 A1
20090024663 McGovern Jan 2009 A1
20090026270 Connell, II et al. Jan 2009 A1
20090043637 Eder Feb 2009 A1
20090044279 Crawford et al. Feb 2009 A1
20090048957 Celano Feb 2009 A1
20090079539 Johnson Mar 2009 A1
20090094311 Awadallah et al. Apr 2009 A1
20090099960 Robida et al. Apr 2009 A1
20090106150 Pelegero et al. Apr 2009 A1
20090106153 Ezra Apr 2009 A1
20090106846 Dupray et al. Apr 2009 A1
20090112650 Iwane Apr 2009 A1
20090119106 Rajakumar et al. May 2009 A1
20090119299 Rhodes May 2009 A1
20090125369 Kloostra et al. May 2009 A1
20090125439 Zarikian et al. May 2009 A1
20090125463 Hido May 2009 A1
20090138391 Dudley et al. May 2009 A1
20090141318 Hughes Jun 2009 A1
20090151005 Bell et al. Jun 2009 A1
20090158404 Hahn et al. Jun 2009 A1
20090164380 Brown Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090182653 Zimiles Jul 2009 A1
20090199264 Lang Aug 2009 A1
20090205032 Hinton et al. Aug 2009 A1
20090206993 Di Mambro et al. Aug 2009 A1
20090216560 Siegel Aug 2009 A1
20090222308 Zoldi et al. Sep 2009 A1
20090222362 Stood et al. Sep 2009 A1
20090222373 Choudhuri et al. Sep 2009 A1
20090222374 Choudhuri et al. Sep 2009 A1
20090222375 Choudhuri et al. Sep 2009 A1
20090222376 Choudhuri et al. Sep 2009 A1
20090222377 Choudhuri et al. Sep 2009 A1
20090222378 Choudhuri et al. Sep 2009 A1
20090222379 Choudhuri et al. Sep 2009 A1
20090222380 Choudhuri et al. Sep 2009 A1
20090222897 Carow et al. Sep 2009 A1
20090224875 Rabinowitz et al. Sep 2009 A1
20090224889 Aggarwal et al. Sep 2009 A1
20090226056 Vlachos et al. Sep 2009 A1
20090234738 Britton et al. Sep 2009 A1
20090240609 Cho et al. Sep 2009 A1
20090241168 Readshaw Sep 2009 A1
20090241173 Troyansky Sep 2009 A1
20090248198 Siegel et al. Oct 2009 A1
20090248497 Hueter Oct 2009 A1
20090248567 Haggerty et al. Oct 2009 A1
20090248568 Haggerty et al. Oct 2009 A1
20090248569 Haggerty et al. Oct 2009 A1
20090248570 Haggerty et al. Oct 2009 A1
20090248571 Haggerty et al. Oct 2009 A1
20090248572 Haggerty et al. Oct 2009 A1
20090248573 Haggerty et al. Oct 2009 A1
20090254476 Sharma et al. Oct 2009 A1
20090254484 Forero et al. Oct 2009 A1
20090257595 de Cesare et al. Oct 2009 A1
20090259470 Chang Oct 2009 A1
20090259560 Bachenheimer Oct 2009 A1
20090259588 Lindsay Oct 2009 A1
20090259855 de Cesare et al. Oct 2009 A1
20090261189 Ellis, Jr. Oct 2009 A1
20090270126 Liu Oct 2009 A1
20090271265 Lay et al. Oct 2009 A1
20090271617 Song et al. Oct 2009 A1
20090272801 Connell, II et al. Nov 2009 A1
20090276244 Baldwin, Jr. et al. Nov 2009 A1
20090281945 Shakkarwar Nov 2009 A1
20090281951 Shakkarwar Nov 2009 A1
20090289110 Regen et al. Nov 2009 A1
20090300066 Guo et al. Dec 2009 A1
20090307778 Mardikar Dec 2009 A1
20090326972 Washington Dec 2009 A1
20090328173 Jakobson et al. Dec 2009 A1
20100024037 Grzymala-Busse et al. Jan 2010 A1
20100030677 Melik-Aslanian et al. Feb 2010 A1
20100031030 Kao et al. Feb 2010 A1
20100037147 Champion et al. Feb 2010 A1
20100037308 Lin et al. Feb 2010 A1
20100042526 Martinov Feb 2010 A1
20100043055 Baumgart Feb 2010 A1
20100070620 Awadallah et al. Mar 2010 A1
20100077006 El Emam et al. Mar 2010 A1
20100085146 Johnson Apr 2010 A1
20100088233 Tattan et al. Apr 2010 A1
20100088338 Pavoni, Jr. et al. Apr 2010 A1
20100094664 Bush et al. Apr 2010 A1
20100094767 Miltonberger Apr 2010 A1
20100094768 Miltonberger Apr 2010 A1
20100094910 Bayliss Apr 2010 A1
20100095357 Willis et al. Apr 2010 A1
20100100406 Lim Apr 2010 A1
20100100945 Ozzie et al. Apr 2010 A1
20100107225 Spencer et al. Apr 2010 A1
20100114724 Ghosh et al. May 2010 A1
20100114744 Gonen May 2010 A1
20100121767 Coulter et al. May 2010 A1
20100130172 Vendrow et al. May 2010 A1
20100131273 Aley-Raz et al. May 2010 A1
20100132043 Bjorn et al. May 2010 A1
20100145836 Baker et al. Jun 2010 A1
20100158207 Dhawan et al. Jun 2010 A1
20100169210 Bous et al. Jul 2010 A1
20100169947 Sarmah et al. Jul 2010 A1
20100188684 Kumara Jul 2010 A1
20100205662 Ibrahim et al. Aug 2010 A1
20100217837 Ansari et al. Aug 2010 A1
20100218255 Ritman et al. Aug 2010 A1
20100228649 Pettitt Sep 2010 A1
20100228657 Kagarlis Sep 2010 A1
20100229225 Sarmah et al. Sep 2010 A1
20100229230 Edeki et al. Sep 2010 A1
20100229245 Singhal Sep 2010 A1
20100241501 Marshall Sep 2010 A1
20100250364 Song et al. Sep 2010 A1
20100250411 Ogrodski Sep 2010 A1
20100250509 Andersen Sep 2010 A1
20100250955 Trevithick et al. Sep 2010 A1
20100268557 Faith et al. Oct 2010 A1
20100274679 Hammad Oct 2010 A1
20100275265 Fiske et al. Oct 2010 A1
20100280882 Faith Nov 2010 A1
20100293090 Domenikos et al. Nov 2010 A1
20100293114 Khan et al. Nov 2010 A1
20100302157 Zilberman Dec 2010 A1
20100306101 Lefner et al. Dec 2010 A1
20100313273 Freas Dec 2010 A1
20100325035 Hilgers et al. Dec 2010 A1
20100325442 Petrone et al. Dec 2010 A1
20100332292 Anderson Dec 2010 A1
20100332362 Ramsey et al. Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110016042 Cho et al. Jan 2011 A1
20110040983 Grzymala-Busse et al. Feb 2011 A1
20110047071 Choudhuri et al. Feb 2011 A1
20110066547 Clark et al. Mar 2011 A1
20110082768 Eisen Apr 2011 A1
20110093383 Haggerty et al. Apr 2011 A1
20110112958 Haggerty et al. May 2011 A1
20110119291 Rice May 2011 A1
20110126024 Beatson et al. May 2011 A1
20110126275 Anderson et al. May 2011 A1
20110131123 Griffin et al. Jun 2011 A1
20110145899 Cao et al. Jun 2011 A1
20110166988 Coulter Jul 2011 A1
20110184838 Winters et al. Jul 2011 A1
20110184851 Megdal et al. Jul 2011 A1
20110196791 Dominguez Aug 2011 A1
20110238566 Santos Sep 2011 A1
20110260832 Ross et al. Oct 2011 A1
20110276496 Neville et al. Nov 2011 A1
20110282778 Wright et al. Nov 2011 A1
20110289032 Crooks et al. Nov 2011 A1
20110289322 Rasti Nov 2011 A1
20110295721 MacDonald Dec 2011 A1
20110295750 Rammal Dec 2011 A1
20110296529 Bhanoo et al. Dec 2011 A1
20110302412 Deng et al. Dec 2011 A1
20110302641 Hald et al. Dec 2011 A1
20120030080 Slater et al. Feb 2012 A1
20120030083 Newman et al. Feb 2012 A1
20120030771 Pierson et al. Feb 2012 A1
20120066073 Dilip et al. Mar 2012 A1
20120101939 Kasower Apr 2012 A1
20120158574 Brunzell et al. Jun 2012 A1
20120158654 Behren et al. Jun 2012 A1
20120198556 Patel et al. Aug 2012 A1
20120215682 Lent et al. Aug 2012 A1
20120278227 Kolo et al. Nov 2012 A1
20120290660 Rao et al. Nov 2012 A1
20130004033 Trugenberger et al. Jan 2013 A1
20130132060 Badhe et al. May 2013 A1
20130185293 Boback Jul 2013 A1
20130218797 Prichard et al. Aug 2013 A1
20140007238 Magee et al. Jan 2014 A1
20140058910 Abeles Feb 2014 A1
20140149304 Bucholz et al. May 2014 A1
20140214636 Rajsky Jul 2014 A1
20140304822 Sher-Jan et al. Oct 2014 A1
20150161529 Kondaji Jun 2015 A1
20150186901 Miltonberger Jul 2015 A1
20150199784 Straub et al. Jul 2015 A1
20150205692 Seto Jul 2015 A1
20150295924 Gottschalk, Jr. Oct 2015 A1
20160012561 Lappenbusch et al. Jan 2016 A1
20160063645 Houseworth et al. Mar 2016 A1
20160071208 Straub et al. Mar 2016 A1
20160086262 Straub et al. Mar 2016 A1
20160210450 Su Jul 2016 A1
20160328814 Prichard et al. Nov 2016 A1
20160344758 Cohen et al. Nov 2016 A1
20170099314 Klatt et al. Apr 2017 A1
20170206376 Sher-Jan Jul 2017 A1
20170270629 Fitzgerald Sep 2017 A1
20170278182 Kasower Sep 2017 A1
20170287065 Samler et al. Oct 2017 A1
20170374076 Pierson et al. Dec 2017 A1
20180130157 Gottschalk, Jr. et al. May 2018 A1
20180322572 Straub et al. Nov 2018 A1
20190311366 Zoldi et al. Oct 2019 A1
20190377896 Spinelli et al. Dec 2019 A1
20200134629 Zoldi et al. Apr 2020 A1
20200143465 Chilaka et al. May 2020 A1
20200145436 Brown et al. May 2020 A1
20200151628 Zoldi et al. May 2020 A1
20200219181 Kasower Jul 2020 A1
20200242615 Chandra et al. Jul 2020 A1
20200396246 Zoldi et al. Dec 2020 A1
Foreign Referenced Citations (41)
Number Date Country
3 058 653 Apr 2020 CA
104877993 Sep 2015 CN
0 554 083 Aug 1993 EP
2 939 361 Oct 2019 EP
2 392 748 Mar 2004 GB
2 518 099 Mar 2015 GB
2011-134252 Jul 2011 JP
5191376 May 2013 JP
10-2004-0034063 Apr 2004 KR
256569 Jun 2006 TW
WO 94006103 Mar 1994 WO
WO 96041488 Dec 1996 WO
WO 00055778 Sep 2000 WO
WO 00055789 Sep 2000 WO
WO 00055790 Sep 2000 WO
WO 01011522 Feb 2001 WO
WO 02027610 Apr 2002 WO
WO 02097563 Dec 2002 WO
WO 03071388 Aug 2003 WO
WO 02037219 May 2004 WO
WO 2004046882 Jun 2004 WO
WO 2006069199 Jun 2006 WO
WO 2007001394 Jan 2007 WO
WO 2007106393 Sep 2007 WO
WO 2008054403 May 2008 WO
WO 2008054849 May 2008 WO
WO 2008147918 Dec 2008 WO
WO 2009062111 May 2009 WO
WO 2009117518 Sep 2009 WO
WO 2011044036 Apr 2011 WO
WO 2012054646 Apr 2012 WO
WO 2012112781 Aug 2012 WO
WO 2013026343 Feb 2013 WO
WO 2013126281 Aug 2013 WO
WO 2014008079 Jan 2014 WO
WO 2014008247 Jan 2014 WO
WO 2014150987 Sep 2014 WO
WO 2018175440 Sep 2018 WO
WO 2018208770 Nov 2018 WO
WO 2019006272 Jan 2019 WO
WO 2019050864 Mar 2019 WO
Non-Patent Literature Citations (105)
Entry
University of Newcastle upon Tyne, Dealing with Measurement Noise, Apr. 18, 2000, retrieved from: https://web.archive.org/web/20000418021742/http://lorien.ncl.ac.uk/ming/filter/filewma.htm.
Official Communication in Australian Patent Application No. 2019279982, dated Dec. 19, 2019.
“The Return Review: Program Increases Fraud Detection; However, Full Retirement of the Electronic Fraud Detection System Will be Delayed”, Treasury Inspector General for Tax Administration, Sep. 25, 2017, Reference No. 2017-20-080, pp. 27.
U.S. Appl. No. 12/705,489, filed Feb. 12, 2010, Bargoli et al.
U.S. Appl. No. 12/705,511, filed Feb. 12, 2010, Bargoli et al.
“A New Approach to Fraud Solutions”, BasePoint Science Solving Fraud, pp. 8, 2006.
“Arizona Company Has Found Key in Stopping ID Theft,” PR Newswire, New York, Aug. 10, 2005 http://proquest.umi.com/pqdweb?did=880104711&sid=1&Fmt=3&clientId=19649&ROT=309&Vname=PQD.
ABC News Now:Money Matters, as broadcasted Nov. 15, 2005 with guest Todd Davis (CEO of Lifelock), pp. 6.
Anonymous, “Feedback”, Credit Management, ABI/INFORM Global, Sep. 2006, pp. 6.
“Beverly Hills Man Convicted of Operating ‘Bust-Out’ Schemes that Caused More than $8 Million in Losses”, Department of Justice, Jul. 25, 2006, 2 Pgs.
Bielski, Lauren, “Will you Spend to Thwart ID Theft?” ABA Banking Journal, Apr. 2005, pp. 54, 56-57, 60.
Bluecava, “What We Do”, http://www.bluecava.com/what-we-do/, printed Nov. 5, 2012 in 3 pages.
“Bust-Out Schemes”, Visual Analytics Inc. Technical Product Support, Newsletter vol. 4, Issue 1, Jan. 2005, pp. 7.
Chores & Allowances, “Do Kids Have Credit Reports?” Oct. 15, 2007, http://choresandallowances.blogspot.com/2007/10/do-kids-have-credit-reports.html, pp. 5.
Cowie, Norman, “Warning Bells & ‘The Bust-Out’”, Business Credit, Jul. 1, 2000, pp. 5.
Cullen, Terri; “The Wall Street Journal Complete Identity Theft Guidebook:How to Protect Yourself from the Most Pervasive Crime in America”; Chapter 3, pp. 59-79; Jul. 10, 2007.
“Data Loss Prevention (DLP) Software”, http://www.symantec.com/data-loss-prevention/ printed Apr. 8, 2013 in 8 pages.
“Data Protection”, http://compliantprocessing.com/data-protection/ printed Apr. 8, 2013 in 4 pages.
Day, Jo and Kevin; “ID-ology: A Planner's Guide to Identity Theft”; Journal of Financial Planning:Tech Talk; pp. 36-38; Sep. 2004.
EFunds Corporation, “Data & Decisioning: Debit Report” printed Apr. 1, 2007, http://www.efunds.com/web/industry-solutions/financial-services/frm-debit-report/htm in 1 page.
Equifax; “Equifax Credit Watch”; https://www.econsumer.equifax.co.uk/consumer/uk/sitepage.ehtml, dated Jun. 27, 2007 on www.archive.org.
“Fair Isaac Introduces Falcon One System to Combat Fraud at Every Customer Interaction”, Business Wire, May 5, 2005, pp. 3.
“Fair Isaac Offers New Fraud Tool”, National Mortgage News & Source Media, Inc., Jun. 13, 2005, pp. 2.
FamilySecure.com, “Frequently Asked Questions”, http://www.familysecure.com/FAQ.aspx as archived Jul. 15, 2007 in 3 pages.
FamilySecure.com; “Identity Theft Protection for the Whole Family | FamilySecure.com” http://www.familysecure.com/, as retrieved on Nov. 5, 2009.
“Fighting the New Face of Fraud”, FinanceTech, http://www.financetech.com/showArticle.ihtml?articleID=167100405, Aug. 2, 2005.
“FinExtra, Basepoint Analytics Introduces Predictive Technology for Mortgage Fraud”, Oct. 5, 2005, pp. 3.
Gibbs, Adrienne; “Protecting Your Children from Identity Theft,” Nov. 25, 2008, http://www.creditcards.com/credit-card-news/identity-ID-theft-and-kids-children-1282.php, pp. 4.
“GLBA Compliance and FFIEC Compliance” http://www.trustwave.com/financial-services.php printed Apr. 8, 2013 in 1 page.
Gordon et al., “Identity Fraud: A Critical National and Global Threat,” LexisNexis, Oct. 28, 2003, pp. 1-48.
Herzberg, Amir, “Payments and Banking with Mobile Personal Devices,” Communications of the ACM, May 2003, vol. 46, No. 5, pp. 53-58.
ID Theft Assist, “Do You Know Where Your Child's Credit Is?”, Nov. 26, 2007, http://www.idtheftassist.com/pages/story14, pp. 3.
“ID Thieves These Days Want Your Number, Not Your Name”, The Columbus Dispatch, Columbus, Ohio, http://www.dispatch.com/content/stories/business/2014/08/03/id-thieves-these-days-want-your-number-not-your-name.html, Aug. 3, 2014 in 2 pages.
Identity Theft Resource Center; Fact Sheet 120 A—To Order a Credit Report for a Child; Fact Sheets, Victim Resources; Apr. 30, 2007.
“Identity Thieves Beware: Lifelock Introduces Nation's First Guaranteed Proactive Solution to Identity Theft Protection,” PR Newswire, New York, Jun. 13, 2005 http://proquest.umi.com/pgdweb?did=852869731&sid=1&Fmt=3&clientId=19649&RQT=309&Vname=PQD.
“Industry News, New Technology Identifies Mortgage Fraud: Basepoint Analytics Launches FraudMark”, Inman News, American Land Title Association, Oct. 5, 2005, pp. 1.
Information Brokers of America, “Information Brokers of America Child Identity Theft Protection” http://web.archive.org/web/20080706135451/http://iboainfo.com/child-order.html as archived Jul. 6, 2008 in 1 page.
Information Brokers of America, “Safeguard Your Child's Credit”, http://web.archive.org/web/20071215210406/http://www.iboainfo.com/chiid-id-protect.html as archived Dec. 15, 2007 in 1 page.
Iovation, Device Identification & Device Fingerprinting, http://www.iovation.com/risk-management/device-identification printed Nov. 5, 2012 in 6 pages.
Jacob et al., A Case Study of Checking Account Inquiries and Closures in Chicago, The Center for Financial Services Innovation, Nov. 2006.
Karlan et al., “Observing Unobservables:Identifying Information Asymmetries with a Consumer Credit Field Experiment”, Jun. 17, 2006, pp. 58, http://aida.econ.yale.edu/karlan/papers/ObservingUnobservables.KarianZinman.pdf.
Lamons, Bob, “Be Smart: Offer Inquiry Qualification Services,” Marketing News, ABI/Inform Global, Nov. 6, 1995, vol. 29, No. 23, pp. 13.
Lee, Timothy B., “How America's Broken Tax System Makes Identity Theft Easy”, http://www.vox.com/2014/4/14/5608022/how-americas-broken-tax-system-makes-identity-theft-easy, Apr. 14, 2014, pp. 10.
Lee, W.A.; “Experian, on Deal Hunt, Nets Identity Theft Insurer”, American Banker: The Financial Services Daily, Jun. 4, 2003, New York, NY, 1 page.
Lifelock, “How LifeLock Works.” http://www.lifelock.com/lifelock-for-people printed Mar. 14, 2008 in 1 page.
Lifelock, “LifeLock Launches First ID Theft Prevention Program for the Protection of Children,” Press Release, Oct. 14, 2005, http://www.lifelock.com/about-us/press-room/2005-press-releases/lifelock-protection-for-children.
LifeLock; “How Can LifeLock Protect My Kids and Family?” http://www.lifelock.com/lifelock-for-people/how-we-do-it/how-can-lifelock-protet-my-kids-and-family printed Mar. 14, 2008 in 1 page.
Lifelock, Various Pages, www.lifelock.com/, 2007.
My Call Credit http://www.mycallcredit.com/products.asp?product=ALR dated Dec. 10, 2005 on www.archive.org.
My Call Credit http://www.mycallcredit.com/rewrite.asp?display-faq dated Dec. 10, 2005 on www.archive.org.
MyReceipts, http://www.myreceipts.com/, printed Oct. 16, 2012 in 1 page.
MyReceipts—How it Works, http://www.myreceipts.com/howItWorks.do, printed Oct. 16, 2012 in 1 page.
National Alert Registry Launches RegisteredOffendersList.org to Provide Information on Registered Sex Offenders, May 16, 2005, pp. 2, http://www.prweb.com/printer/240437.htm accessed on Oct. 18, 2011.
National Alert Registry Offers Free Child Safety “Safe From Harm” DVD and Child Identification Kit, Oct. 24, 2006, pp. 2, http://www.prleap.com/or/53170 accessed on Oct. 18, 2011.
National Alert Registry website titled, “Does a sexual offender live in your neighborhood”, Oct. 22, 2006, pp. 2, http://web.archive.org/wb/20061022204835/http://www.nationallertregistry.com/ accessed on Oct. 13, 2011.
Ogg, Erica, “Apple Cracks Down on UDID Use”, http://gigaom.com/apple/apple-cracks-down-on-udid-use/ Printed Nov. 5, 2012 in 5 Pages.
Organizing Maniac's Blog—Online Receipts Provided by MyQuickReceipts.com, http://organizinamaniacs.wordpress.com/2011/01/12/online-receipts-provided-by-myquickreceipts-com/ dated Jan. 12, 2011 printed Oct. 16, 2012 in 3 pages.
Planet Receipt—Home, http://www.planetreceipt.com/home printed Oct. 16, 2012 in 1 page.
Planet Receipt—Solutions & Features, http://www.planetreceipt.com/solutions-features printed Oct. 16, 2012 in 2 pages.
Press Release—“Helping Families Protect Against Identity Theft—Experian Announces FamilySecure.com; Parents and guardians are alerted for signs of potential identity theft for them and their children; product features an industry-leading $2 million guarantee”; PR Newswire; Irvine, CA; Oct. 1, 2007.
Privacy Rights Clearinghouse, “Identity Theft: What to do if it Happens to You,” http://web.archive.org/web/19990218180542/http://privacyrights.org/fs/fs17a.htm printed Feb. 18, 1999.
Rivera, Barbara, “New Tools for Combating Income Tax Refund Fraud”, https://qcn.com/Articles/2014/05/08/Insight-tax-fraud-tools.aspx?Page=1, May 8, 2014, pp. 3.
Scholastic Inc.: Parent's Request for Information http://web.archive.org/wen/20070210091055/http://www.scholastic.com/inforequest/index.htm as archived Feb. 10, 2007 in 1 page.
Scholastic Inc.: Privacy Policy http://web.archive.org/web/20070127214753/http://www.scholastic.com/privacy.htm as archived Jan. 27, 2007 in 3 pages.
Shoeboxed, https://www.shoeboxed.com/sbx-home/ printed Oct. 16, 2012 in 4 pages.
Singletary, Michelle, “The Littlest Victims of ID Theft”, The Washington Post, The Color Of Money, Oct. 4, 2007.
Sumner, Anthony, “Tackling The Issue of Bust-Out Fraud”, Retail Banker International, Jul. 24, 2007, pp. 4.
Sumner, Anthony, “Tackling The Issue of Bust-Out Fraud”, Experian: Decision Analytics, Dec. 18, 2007, pp. 24.
Sumner, Anthony, “Tackling The Issue of Bust-Out Fraud”, e-News, Experian: Decision Analytics, pp. 4, [Originally Published in Retail Banker International Magazine Jul. 24, 2007].
“TransUnion—Child Identity Theft Inquiry”, TransUnion, http://www.transunion.com/corporate/personal/fraudIdentityTheft/fraudPrevention/childIDInquiry.page printed Nov. 5, 2009 in 4 pages.
Truston, “Checking if your Child is an ID Theft Victim can be Stressful,” as posted by Michelle Pastor on Jan. 22, 2007 at http://www.mytruston.com/blog/credit/checking_if_your_child_is_an_id_theft_vi.html.
Vamosi, Robert, “How to Handle ID Fraud's Youngest Victims,” Nov. 21, 2008, http://news.cnet.com/8031-10789_3-10105303-57.html.
Webpage printed from http://www.jpmorgan.com/cm/ContontServer?c:TS_Content&pagename=jpmorgan%2Fts%2FTS_Content%2FGeneral&cid=1139403950394 on Mar. 20, 2008; page dated Feb. 13, 2006, New York, NY.
Wilson, Andrea, “Escaping the Alcatraz of Collections and Charge-Offs”, http://www.transactionworld.net/articles/2003/october/riskMgmt1.asp, Oct. 2003.
International Search Report and Written Opinion for Application No. PCT/US2007/06070, dated Nov. 10, 2008.
International Search Report and Written Opinion for Application No. PCT/US2008/064594, dated Oct. 30, 2008.
International Preliminary Report and Written Opinion in PCT/US2008/064594, dated Dec. 10, 2009.
International Search Report and Written Opinion for Application No. PCT/US09/37565, dated May 12, 2009.
U.S. Appl. No. 09/557,252, filed Apr. 24, 2000, Page.
Aad et al., “NRC Data Collection and the Privacy by Design Principles”, IEEE, Nov. 2010, pp. 5.
Experian Team, “Impact on Credit Scores of Inquiries for an Auto Loan,” Ask Experian, Mar. 1, 2009, pp. 5.
Fisher, Joseph, “Access to Fair Credit Reports: Current Practices and Proposed Legislation,” American Business Law Journal, Fall 1981, vol. 19, No. 3, p. 319.
“Fraud Alert | Learn How”. Fight Identity Theft, http://www.fightidentitytheft.com/flag.html, accessed on Nov. 5, 2009.
Haglund, Christoffer, “Two-Factor Authentication With a Mobile Phone”, Fox Technologies, Uppsala, Department of Information Technology, Nov. 2, 2007, pp. 62.
“ID Analytics ID Network”, from www.idanalytics.com, as retrieved from www.archive.org, dated Nov. 20, 2005 or earlier; attached as “ID Network (IDNb)”, pp. 8.
ID Cops, www.idcops.com; retrieved from www.archive.org, dated Feb. 16, 2007.
“Intersections, Inc. Identity Guard”, from www.intersections.com and www.identityguard.com, as retrieved from Internet Archive, dated Nov. 25, 2005 or earlier; attached as “Identity Guard (IDG)”, pp. 7.
Khan, Muhammad Khurram, PhD., “An Efficient and Secure Remote Mutual Authentication Scheme with Smart Cards” IEEE International Symposium on Biometrics & Security Technologies (ISBAST), Apr. 23-24, 2008, pp. 1-6.
Lefebvre et al., “A Robust Soft Hash Algorithm for Digital Image Signature”, International Conference on Image Processing 2:11 (ICIP), vol. 3, Oct. 2003, pp. 495-498.
Lifelock, “Personal Identity Theft Protection & Identity Theft Products,” http://www.lifelock.com/lifelock-for-people, accessed Nov. 5, 2007.
Pagano, et al., “Information Sharing in Credit Markets,” Dec. 1993, The Journal of Finance, vol. 48, No. 5, pp. 1693-1718.
Partnoy, Frank, Rethinking Regulation of Credit Rating Agencies: An Institutional Investor Perspective, Council of Institutional Investors, Apr. 2009, pp. 21.
Quinn, Tom, “Low Credit Inquiries Affect Your Credit Score”, Credit.com, May 2, 2011, pp. 2.
TheMorningCall.Com, “Cheap Ways to Foil Identity Theft,” www.mcall.com/business/columnists/all-karp.5920748jul01,0 . . . published Jul. 1, 2007.
Extended European Search Report for Application No. EP12747205, dated Sep. 25, 2014.
Extended European Search Report for Application No. EP18207755, dated Dec. 13, 2018.
International Preliminary Report on Patentability in Application No. PCT/US2012/025456, dated Aug. 21, 2013.
International Search Report and Written Opinion for Application No. PCT/US2011/033940, dated Aug. 22, 2011.
International Search Report and Written Opinion for Application No. PCT/US2012/025456, dated May 21, 2012.
Official Communication in Australian Patent Application No. 2012217565, dated May 12, 2017.
Official Communication in Australian Patent Application No. 2017203586, dated Jun. 18, 2019.
Official Communication in Canadian Patent Application No. 2,827,478, dated Jun. 29, 2017.
Official Communication in Canadian Patent Application No. 2,827,478, dated Mar. 27, 2019.
Official Communication in Canadian Patent Application No. 2,827,478, dated May 31, 2018.
Supplementary European Search Report for Application No. EP12747205, dated Jun. 19, 2015.
Provisional Applications (1)
Number        Date        Country
62/188,252    Jul. 2015   US