System and architecture for electronic fraud detection

Information

  • Patent Grant
  • Patent Number
    11,941,635
  • Date Filed
    Monday, July 25, 2022
  • Date Issued
    Tuesday, March 26, 2024
  • Examiners
    • Hamilton; Lalita M
  • Agents
    • Knobbe Martens Olson & Bear LLP
Abstract
Embodiments of an electronic fraud analysis platform system are disclosed which may be used to analyze tax returns for potential fraud. Analysis of tax return data using the tax return analysis platform computing systems and methods discussed herein may provide insight into whether a tax return may be fraudulent based on, for example, an initial screening component configured to filter tax returns which appear fraudulent due to missing or inaccurate information provided with the return; a device activity analysis component configured to identify whether a device used to submit a tax return or to provide further authentication information needed to complete processing of the return may have been used in other fraudulent activities; and a knowledge-based authentication component configured to identify potential fraudsters using dynamically generated questions for which fraudsters typically do not know the answers.
Description
BACKGROUND

Billions of dollars of fraudulent tax refunds are paid out every tax year. This not only puts a strain on the government's ability to provide services, but it also erodes public trust in the country's tax system. With the increased reliance on electronic filing of tax returns comes an increase in the efficiency of tax operations and overall convenience. However, this has also contributed to a rise in identity theft and unwarranted or fraudulent tax refunds. Stealing identities and filing for tax refunds has become one of the fastest-growing non-violent criminal activities in the country, often resulting in significant returns for the fraudster.


SUMMARY OF CERTAIN EMBODIMENTS

In one embodiment, an electronic fraud detection system is disclosed. The system may comprise: an electronic data interface module configured to electronically communicate with a first electronic data store configured to at least store tax return filing data associated with a plurality of consumers and at least one tax agency, wherein access to the first electronic data store is provided by a tax agency computing system, a second electronic data store configured to at least store consumer data associated with the plurality of consumers, and a third electronic data store configured to at least store consumer device activity data associated with a plurality of consumer devices associated with the plurality of consumers; an initial screening module configured to apply filters to tax return filing data, including at least one or more consumer attributes associated with each respective consumer and received from the electronic data interface module, to generate a set of electronic tax fraud indications that represent whether consumer records within the tax return filing data are likely fraudulent due to missing or inaccurate information; a knowledge-based authentication module configured to dynamically generate authentication questions associated with a consumer associated with one of the consumer records identified as likely fraudulent, the generated questions based on consumer credit data corresponding to the consumer that is received from the electronic data interface module, for which the answers are confidential and based on credit data, to provide the authentication questions, receive authentication response information corresponding to the authentication questions, and generate an electronic authentication indication representing an accuracy level of the authentication response information; a device authentication module configured to dynamically analyze whether a computing device used to provide the authentication response information may have been used in fraudulent activities or is related to other devices that have been used in fraudulent activities using a unique device identifier associated with the computing device, the unique device identifier generated using information collected from the computing device, and further configured to generate an electronic device indication representing a risk level that the device is associated with fraud; and an accuracy reporting module configured to make the electronic authentication indication and the electronic device indication available to the tax agency computing system.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram which illustrates an exemplary data flow between a consumer device, tax agency computing system(s), and a tax return analysis platform system, according to one embodiment.



FIG. 2 schematically illustrates a logical flow diagram for one embodiment of an example process for performing an initial tax fraud screening of one or more tax returns which may be run by one embodiment of the tax return analysis platform computing system of FIG. 6.



FIG. 3 schematically illustrates a logical flow diagram for one embodiment of another example process for performing a device activity analysis and/or a knowledge-based authentication process which may be run by one embodiment of the tax return analysis platform computing system of FIG. 6.



FIG. 4 schematically illustrates a logical flow diagram for one embodiment of a process for performing a device activity analysis which may be run by one embodiment of the tax return analysis platform computing system of FIG. 6.



FIG. 5 schematically illustrates a logical flow diagram for one embodiment of an example knowledge-based authentication process which may be run by one embodiment of the tax return analysis platform computing system of FIG. 6.



FIG. 6 is a block diagram showing one embodiment in which a tax return analysis platform computing system is in communication with a network and various systems, such as websites and/or online services, are also in communication with the network.





DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS

Overview


One principal method of conducting income tax refund fraud is through identity theft, specifically, by filing a tax return deceptively as someone else. According to a 2011 report by the Federal Trade Commission, identity theft has been the number one consumer complaint since 2000. Unfortunately, identities can be stolen from anywhere and resourceful identity thieves use a variety of ways to obtain an individual's personal information. Identity thieves can retrieve personal information by rummaging through the trash of businesses. In some cases identity thieves work for legitimate companies, medical offices, clinics, pharmacies, or government agencies, and take advantage of their roles at these organizations to illicitly obtain or solicit personal information.


Identity thieves have two primary concerns when they are concocting an income tax refund fraud scheme. They need to devise a clever system for moving and using the fraudulently obtained funds, and they need to figure out how to obtain another individual's social security number (“SSN”) and other identifying information, which they will use to circumvent the existing tax return review process. Typically, the fraudster will file a false return electronically, early in the tax filing season before the legitimate tax filer has a chance to file their return. Fraudsters then use the stolen information and provide false information about wages earned, taxes withheld, and other data in order to appear as the legitimate taxpayer who is entitled to a tax refund. The fraudster arranges for the proceeds of the refund to be deposited into a bank account or transferred to a debit card, or other similar methods which are virtually untraceable once the payment has been released. According to the IRS's Identity Protection Incident Tracking Statistics Report, incidents of identity theft tied to taxpayers rose threefold from 2009 to 2011, growing from 456,453 incidents in 2009 to 1,125,634 in 2011.


Unfortunately, many individuals who are victims of identity theft may be unaware that their identity has been stolen to file fraudulent tax returns. It is not until the legitimate individual files a tax return resulting in a duplicate filing under the same name and SSN that many individuals realize they are a victim of identity theft. Everyone with a social security number is potentially vulnerable to having their identity stolen.


Anyone who has access to a computer can fill out an income tax form online and hit submit. Income tax returns are processed within days or weeks, and the proceeds are then deposited into accounts or provided on debit cards. Once released, these monies are virtually untraceable, and thus an improved method to detect fraudulent tax returns prior to releasing tax refund monies is needed. According to the Treasury Inspector General for Tax Administration (“TIGTA”), the number of identified fraudulent tax returns increased by 40% from 2011 to 2012, which equates to an increase of over $4 billion. While the number of fraudulent tax returns can be identified, the full scope of the fraud remains unknown. Additionally, in 2012, TIGTA reported that, using characteristics of identity theft confirmed by the IRS, it had identified approximately 1.5 million undetected tax returns with potentially fraudulent tax refunds totaling in excess of $5.2 billion. This number only takes into consideration income tax fraud on a federal level. TIGTA also found that contributing to the growth in tax fraud is an ongoing challenge in authenticating taxpayers. Even though some revenue agencies have adopted verification techniques such as use of a Personal Identification Number (“PIN”), or providing information from a previous year's return, these controls can be circumvented and have proven inadequate in stopping identity-based income tax fraud.


Income tax refund fraud schemes vary from those committed by individual perpetrators to those that are much larger in scale, with multiple players, filings numbering in the thousands, losses ranging into the millions of dollars, and activity spanning several years. With the average federal tax refund amounting to roughly $3,000 and the average state refund around $500, many taxpayers anxiously await the return of their funds and are justifiably upset when their refunds are delayed. In some embodiments, the systems used in detecting income tax refund fraud are effective and simultaneously efficient, such that they do not delay the release of legitimate refunds. Complicating the issue is that typical “red flags” which might trigger a fraud alert, such as having a refund sent to a new address or an unfamiliar name, happen millions of times each year for honest reasons, such as when a taxpayer gets married (and changes his/her name and/or address) or moves, thus making it even more difficult to distinguish the fraudulent returns from the legitimate ones.


Embodiments of an electronic fraud analysis platform system are disclosed which may be used to analyze tax returns for potential fraud. Analysis of tax return data using the tax return analysis platform computing systems and methods discussed herein may provide insight into whether a tax return may be fraudulent based on, for example, an initial screening component configured to filter tax returns which appear fraudulent due to missing or inaccurate information provided with the return; a device activity analysis component configured to identify whether a device used to submit a tax return or to provide further authentication information needed to complete processing of the return may have been used in other fraudulent activities; and a knowledge-based authentication component configured to identify potential fraudsters using dynamically generated questions for which fraudsters typically do not know the answers.


The terms “individual,” “consumer,” “customer,” “people,” “persons,” “party,” “entity,” and the like, whether singular or plural, should be interpreted to include either individuals or groups of individuals, such as, for example, married couples or domestic partners, joint tax filers, organizations, groups, business entities, non-profit entities, and other entities.


Embodiments of the disclosure will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.


For purposes of this disclosure, certain aspects, advantages, and novel features of various embodiments are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that one embodiment may be carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.


High Level Data Flow



FIG. 1 is a block diagram which illustrates an exemplary data flow between a consumer computing device (or devices) (for example, a smart phone, a tablet, a car console, or other electronic computing device) 162, a tax agency computing system (or systems) 168, and a tax return analysis platform (“TRAP”) system 100, according to one embodiment. The data flow of FIG. 1 illustrates at a high level how a consumer tax filing may be analyzed by the TRAP system according to associated processes described herein to determine and provide an indication of whether the tax filing may be fraudulent.


The exemplary data flow may begin at (1) when a consumer submits a tax return filing to a tax agency. The tax return may be submitted in any manner by which the tax agency accepts tax return filings, including traditional physical paper filings as well as electronic submissions. Traditional physical paper filings typically are digitally scanned or otherwise input to the tax agency computing system 168 to facilitate faster processing of the tax return.


In some instances, if a tax return is submitted to the tax agency electronically (for example, to the tax agency computing system 168), the tax agency may have the ability to detect or associate a user computing device used by the individual to submit the tax return. For example, in some cases an IP address, a device identifier, or other identifying information associated with the user computing device or the tax return may be automatically detected and gathered by the tax agency computing system, such as by the use of a client-side script downloaded to the user computing device, a cookie, or other methodology. Such device identifying information may be collected at various stages of an electronic tax return filing process, such as when the individual registers with the tax agency computing system 168 via a website provided by the tax agency, or when the individual submits the completed tax return to the tax agency, and so on. If device identifying information is gathered by the tax agency, the device identifying information may be provided to the TRAP system 100 and used or included in the tax refund fraud analysis processes described herein. Embodiments of various device identification systems and methods are disclosed in U.S. Pat. Nos. 7,853,533, 8,862,514, and U.S. Publication No. 2011/0082768, the entire contents of which are all hereby incorporated by reference herein. However, the TRAP system 100 may perform the tax refund fraud analysis processes even if such device identifying information is not provided by the tax agency at the outset.


Although the description with reference to (1) describes submission of a tax return for a single individual, the tax agency naturally receives tax returns numbering in the thousands or even millions depending on the size of the tax base being served. Thus it should be understood that the actions described at (1) may occur for thousands, millions, or any number of tax returns submitted to the tax agency, some in parallel and some over a period of time.


At (2), the tax agency provides tax return data for one or more consumers to be analyzed for potential fraud to the TRAP system 100. The TRAP system 100 is configured to support either analysis of a single tax return or analysis of multiple tax returns from multiple consumers via high-volume batch-mode processing. The TRAP system 100 may be configured to perform the fraud analysis in various ways and at different stages of the overall process flow as described further below.


At (3A), the TRAP system 100 performs an automated, initial data screening of the tax return data to identify tax returns which may be potentially fraudulent. The initial screening may be performed, for example, by the screening/precise ID module 128 of the TRAP system 100 as illustrated in FIG. 6. The automated initial screening process is described in more detail with reference to block 210 of FIG. 2 herein. At a high level, the initial screening process may involve accessing consumer data (such as consumer data that may be stored in one of the consumer data sources 172) that is generally known to be highly accurate and/or verified, generating consumer attributes (associated with each respective tax return provided by the tax agency), and performing matching, identification, verification, duplicate checking and other screening processes using the consumer attributes. A tax return may be flagged or identified as potentially fraudulent in response to determining, for example, that some consumer attributes associated with the tax return do not match the accessed consumer data.


If the tax agency provides device identifiers associated with the tax return data then at (3B) the TRAP system 100 may optionally perform a device activity analysis (which may also be referred to as device proofing) to further identify tax returns which may be potentially fraudulent. The device proofing process is described in more detail with reference to FIG. 4 herein. At a high level the device proofing process may involve accessing device activity data (such as device activity data that may be stored in one of the device activity data sources 174) using one or more of the provided device identifiers. The device activity data may indicate, for example, whether a particular device has been previously associated with other fraudulent activities. If a particular device associated with one or more of the device identifiers for a particular tax return has been previously associated with other fraudulent activities, the particular tax return may be flagged for potential fraud as well.


At (4), once the TRAP system 100 has completed the initial screening process (and optionally the device proofing process), the TRAP system 100 provides a list of flagged tax returns to the tax agency computing system 168. The list may be a one-to-one correspondence of the tax return data initially provided by the tax agency, with only those particular tax returns identified by the TRAP system 100 as potentially fraudulent being flagged as such. In another embodiment, the list may include only those particular tax returns flagged by the TRAP system 100. In one embodiment, the flags may comprise one or more indicators, scores, probabilities, risk levels, or other information that may indicate the degree to which a respective tax return may be fraudulent. For example, a first tax return may be flagged as “low probability” and further indicate that only one consumer attribute with a low risk rate was found to be non-matching during the initial screening process. In another example, a second tax return may be flagged as “medium probability” and further indicate that several consumer attributes were found to be non-matching or unverified. In another example, a third tax return may be flagged as “high probability” and further indicate that several consumer attributes were found to be non-matching or unverified, as well as indicate that the device used to submit the tax return has been previously associated with other fraudulent activities.
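The tiered flagging described above can be sketched as a simple mapping from screening results to a flag. The thresholds, flag labels, and function name below are illustrative assumptions, not values specified in this disclosure:

```python
def flag_return(non_matching_attrs, device_fraud_history=False):
    """Map initial-screening results to a hypothetical fraud-probability flag.

    non_matching_attrs: list of (attribute_name, risk_rate) tuples for
    attributes found non-matching or unverified during the initial screening.
    device_fraud_history: True if a device tied to the return was previously
    associated with fraudulent activity (from the optional device proofing).
    """
    if device_fraud_history:
        # Prior device fraud dominates any attribute-level signal.
        return "high probability"
    if len(non_matching_attrs) >= 2:
        return "medium probability"
    if len(non_matching_attrs) == 1:
        name, risk_rate = non_matching_attrs[0]
        return "low probability" if risk_rate == "low" else "medium probability"
    return "not flagged"
```

For instance, a return with only a low-risk phone-number mismatch would map to “low probability,” matching the first example above.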


In one embodiment, the tax agency can review the list and decide which returns to process, which to deny, and which require additional information. For those returns where the tax agency wants more information, it can utilize the TRAP system 100 to conduct additional analysis. At (5), the tax agency computing system 168 may direct a flagged consumer to access a website or application provided by the tax agency in order to provide further authentication information necessary to complete processing of the tax return. For example, the tax agency computing system 168 may send a letter, an electronic message, or a text message to the flagged consumer based on the list provided by the TRAP system 100. In some instances the letter or electronic message may be sent automatically once the list is received from the TRAP system 100.


At (6), the consumer accesses the website or the application provided by the tax agency to provide the requested authentication information. The website may be preconfigured to download a script (for example, a JavaScript code or similar) to a computing device used by the consumer to access the website via a web browser, an application, or other program. The script may be provided by the TRAP system 100 to facilitate device proofing with respect to the computing device being used by the consumer to access the website.


At (7), device-related information, which may include one or more device attributes and/or identifiers, associated with the computing device used by the consumer to access the website or the application may be detected, for example by execution of the script or program of the application on the consumer's computing device. In one embodiment, the script or application may be configured to provide the device-related information to the tax agency computing system 168, which may be configured to perform internal processing and/or to forward the device-related information to the TRAP system 100. In another embodiment, the script may be configured to provide the device-related information directly to the TRAP system 100. The device-related information may be used to generate a unique device identifier, which may be used, for example, as described herein to access device activity data associated with the unique device identifier. In some embodiments, generation of the unique device identifier may be performed by the TRAP system 100; by an application installed on the computing device used by the consumer (in which case the unique device identifier may be detected as part of the device-related information); or by a third party service that may offer device identity services via one or more application programmatic interfaces (“APIs”).
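One plausible way to derive a unique device identifier from collected device-related information is to hash a canonical encoding of the attributes, so the same attribute set always yields the same identifier. The attribute names and hashing scheme below are hypothetical illustrations, not the method claimed in this disclosure:

```python
import hashlib

def generate_device_identifier(attrs):
    """Derive a stable device identifier by hashing collected attributes.

    attrs: dict of device-related information (e.g. user agent string,
    screen resolution, timezone) gathered by a client-side script. Keys
    are sorted so insertion order does not change the resulting ID.
    """
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Because the encoding is canonical, two collections of the same attributes produce the same 64-character identifier, which can then serve as a key into device activity data.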


At (8A), once the device identifiers are received by the TRAP system 100, the system may then perform the device proofing described above and in more detail with reference to FIG. 4 herein. The device proofing may be performed at (8A) to determine whether the computing device used by the consumer to provide the requested authentication information has previously been associated with any fraudulent activity. In some instances, the device proofing at (8A) may be the first time such device proofing is executed, such as would be the case in which the tax agency did not provide device identifiers to the TRAP system 100 with the tax return data at (2) discussed above. In such a scenario, the device proofing may only be applied with respect to the computing device used by the consumer at (6) to access the tax agency website.


However, in another possible scenario, the device proofing at (8A) may be at least the second time such device proofing is executed. For example, an initial device proofing may be performed at (3B) with respect to a first device used by a consumer to submit the tax return electronically to the tax agency computing system 168; and a second device proofing may be performed at (8A) with respect to a second device used by the same consumer at (6) to access the tax agency website. The first device and the second device may or may not be the same device, and as such the initial device proofing and the second device proofing may produce different results. For example, the initial device proofing may provide an indication that the first device is not associated with previous fraudulent activity, whereas the second device proofing may provide an indication that the second device is associated with previous fraudulent activity. This additional round of device proofing, if available, may provide the tax agency with an additional layer of fraud detection, as a fraudster may utilize multiple devices in an attempt to avoid detection.


At (8B), the TRAP system 100 may also initiate a knowledge-based authentication (“KBA”) process in order to further authenticate the consumer and/or to provide further fraud detection back-end support to the tax agency computing system 168. For example, the consumer may be prompted to provide personal information (for example, full name, current and/or prior addresses, and other personally identifying information or “PII”) through the tax agency website. Some or all of this personal information may be gathered by the tax agency computing system 168, which may perform internal processing and/or forward the provided personal information to the TRAP system 100. In another embodiment, the personal information may be collected automatically and provided directly to the TRAP system 100, such as via a client-side script downloaded to the consumer's computing device when the tax agency website is accessed.


Once at least some personal information is received at the TRAP system 100, the TRAP system 100 can use the personal information to access consumer data, including credit data, associated with the consumer (for example, from the consumer data sources 172). The TRAP system 100 may then generate further authentication questions (for example, “out of wallet” questions) based on the accessed consumer data. For example, out of wallet questions may be generated in order to solicit responses that include information highly likely to only be known by the consumer (and/or unlikely to be known by a fraudster) which would not be found in the consumer's wallet, such as a monthly payment amount on an outstanding debt obligation which may appear in the consumer's credit data.
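A minimal sketch of out-of-wallet question generation follows, assuming a hypothetical credit record holding a single tradeline. The decoy-generation heuristic (scaling the true payment and rounding to the nearest $50) and all names are illustrative only, not the disclosed method:

```python
import random

def generate_out_of_wallet_question(credit_record, rng=None):
    """Build a multiple-choice "out of wallet" question from credit data.

    credit_record: hypothetical dict with 'monthly_payment' and
    'account_type' keys. The true monthly payment is mixed with plausible
    decoys, so only the legitimate consumer is likely to answer correctly.
    """
    rng = rng or random.Random()
    true_payment = credit_record["monthly_payment"]
    # Decoys: scaled variants of the true value, rounded to the nearest $50.
    decoys = {round(true_payment * factor / 50) * 50 for factor in (0.5, 1.5, 2.0)}
    decoys.discard(true_payment)  # never duplicate the correct answer
    choices = [true_payment] + sorted(decoys)
    rng.shuffle(choices)
    question = (f"What is your approximate monthly payment on your "
                f"{credit_record['account_type']} account?")
    return {"question": question, "choices": choices, "answer": true_payment}
```

For a $300 auto-loan payment, this sketch would present the choices $150, $300, $450, and $600 in random order.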


At (9), the TRAP system 100 provides the out-of-wallet or other authentication questions and receives and processes the responses. The questions may be provided directly to the consumer computing device, such as via a client side script downloaded to the consumer computing device when accessing the tax agency's authentication website. For example, a client side script may be provided by the TRAP system 100 to the tax agency computing system 168 for inclusion in the website. The client side script may be configured to retrieve personal information as it is entered by the consumer into a form on the website; send the personal information to the TRAP system 100; receive one or more authentication questions; and present the questions to the consumer for further authentication. The client side script may be further configured to collect responses to the presented questions and send the responses directly to the TRAP system 100. After the TRAP system 100 receives the responses, it processes them to determine whether they are accurate with respect to the accessed consumer data.
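The response-accuracy determination might be sketched as below. The pass/review/fail bands and thresholds are illustrative assumptions about how an "electronic authentication indication representing an accuracy level" could be expressed:

```python
def authentication_indication(expected_answers, responses):
    """Score KBA responses against expected answers (a hypothetical sketch).

    Returns the fraction of correct answers plus a coarse band that a tax
    agency system might consume as the electronic authentication indication.
    """
    correct = sum(1 for expected, given in zip(expected_answers, responses)
                  if expected == given)
    accuracy = correct / len(expected_answers) if expected_answers else 0.0
    # Illustrative thresholds, not values from this disclosure.
    if accuracy >= 0.75:
        band = "pass"
    elif accuracy >= 0.5:
        band = "review"
    else:
        band = "fail"
    return {"accuracy": accuracy, "indication": band}
```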


At (10), the TRAP system 100 provides one or more indicators of potential fraud (for example, scores and the like) to the tax agency computing system 168 based on any combination of the various fraud detection processes described throughout FIG. 1. For example, indicators may be provided for each of the initial screening, the first device proofing (if performed), the second device proofing, and the KBA process (including for example indicators of whether and/or how many questions were answered correctly). In one embodiment, a composite or aggregate tax return fraud score may be provided, wherein the fraud score may be generated based at least in part on any of the component fraud indicators described herein. The tax agency may then use the one or more indicators, and/or the aggregate tax return fraud score, to make a determination as to whether the tax return should be processed, denied, approved, or flagged for further follow-up.
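An aggregate tax return fraud score could, for instance, be a weighted combination of the component indicators. The weights, the 0-100 scale, and the renormalization over whichever components were actually performed are all illustrative assumptions, not taken from this disclosure:

```python
def composite_fraud_score(indicators, weights=None):
    """Combine component fraud indicators into one 0-100 composite score.

    indicators: dict of component name -> risk value scaled 0-1 (higher is
    riskier); keys must appear in weights. Missing components (e.g. no
    device proofing performed) simply drop out, and the remaining weights
    are renormalized.
    """
    weights = weights or {"screening": 0.4, "device": 0.3, "kba": 0.3}
    total_weight = sum(weights[name] for name in indicators)
    if total_weight == 0:
        return 0.0
    score = sum(indicators[name] * weights[name] for name in indicators) / total_weight
    return round(score * 100, 1)
```

The renormalization choice means a return analyzed only by the initial screening is still scored on the same 0-100 scale as one that also went through device proofing and KBA.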


Examples of Processes Performed by TRAP Systems



FIGS. 2, 3, 4, and 5 are flowcharts illustrating various embodiments of TRAP system processes. In some implementations, the processes are performed by embodiments of the TRAP system 100 described with reference to FIG. 6 and/or by one of its components, such as the authentication module 122, the data partition and security module 126, the screening/precise ID module 128, the device activity analysis module 132, and/or the fraud detection module 134. For ease of explanation, the following describes the services as performed by the TRAP system 100. The example scenarios are intended to illustrate, but not to limit, various aspects of the TRAP system 100. In one embodiment, the processes can be dynamic, with some procedures omitted and others added.


Initial Tax Fraud Screening



FIG. 2 is a flowchart illustrating one embodiment of a process 200 for performing an initial tax fraud screening of one or more tax returns, which may be run by one embodiment of the TRAP system 100 of FIG. 6. The process 200 may be performed by the TRAP system 100 separately or in conjunction with, for example, the process 300 of FIG. 3, the process 400 of FIG. 4, and/or the process 500 of FIG. 5. For ease of explanation, certain portions of the description below describe the process with respect to an individual consumer and an individual tax return. However, the process may also be applied similarly to a plurality of consumers and/or a plurality of tax returns, separately and/or in parallel, such as in batch processing of multiple thousands or millions of tax returns.


The process 200 begins at block 205, where the TRAP system 100 (for example, via the data partition and security module 126 of FIG. 6) accesses (or receives) a list of encrypted consumer tax return data. The tax return data may be provided by a tax agency to the TRAP system in order to perform an initial fraud screening of one or more consumer tax returns. In one embodiment the tax return data may be accessed from the tax return data source(s) 170 by the tax agency computing system 168 and provided to the TRAP system 100. In another embodiment, the TRAP system 100 may be granted permission to access the tax return data source 170 directly. As described with reference to FIG. 1 the tax return data may also include device identifiers that may be associated with respective tax returns.


At block 210, for each consumer identified in the tax return data, the TRAP system 100 (for example, via the screening/precise ID module 128) performs an initial screening (for example, data matching, data verification, identifying duplicates, and so forth) based on the attributes associated with each respective tax return. Various attributes may be screened including but not limited to name, address, date of birth, social security number (“SSN”), driver license, phone number (wireless or landline), bank account number(s), and/or IP address. Other attributes not expressly listed herein may also be used. To perform the initial screening, the TRAP system 100 may access consumer data from consumer data source(s) 172, wherein the consumer data may be accessed using at least some of the consumer attributes associated with respective tax returns. For example, one attribute of a tax return may include a social security number (or other unique consumer identifiers), which the TRAP system 100 may then use to access consumer data associated with the social security number (or other unique consumer identifiers). The screening process may generate, for example, a validation score which predicts the likelihood that the identification information supplied (for example, name, address, SSN, phone number, date-of-birth, and so forth) is a valid combination which has been seen previously within one or multiple data sources. The screening process may also generate, for example, an ID Theft Score that predicts the likelihood that the application is originating from the true consumer.


The screening process at block 210 may involve checking or addressing multiple attributes of each tax return, including for example: whether the SSN is valid; whether the name on the return matches the SSN provided; whether the tax filer exists in any data records at all; whether the address on the return matches the tax filer's current address; whether the SSN is associated with a deceased person; whether the address is valid or if it corresponds to an institution, a vacant lot, or some other invalid location; whether the return address on the tax return is in a state where the tax filer has never resided (for example, based on past address history which may be contained in the tax filer's credit data); whether there is any indication of fraud within the tax filer's credit data; whether multiple returns are identified as going to the same address; and/or whether joint filers as stated on the return are actually connected to each other (for example, spouses, domestic partners, and so forth).
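The checks above can be sketched as a small screening routine. This is an illustrative Python sketch, not the patented implementation: the field names (`ssn`, `name`, `address`, `current_address`) and the simple data shapes standing in for the consumer data sources are assumptions for demonstration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ScreeningResult:
    """Collects individual fraud indicators for one tax return."""
    flags: list = field(default_factory=list)

    def flag(self, reason: str) -> None:
        self.flags.append(reason)

def initial_screening(tax_return: dict, consumer_record: Optional[dict],
                      deceased_ssns: set, returns_by_address: dict) -> ScreeningResult:
    """Sketch of a few of the block-210 checks; field names are illustrative."""
    result = ScreeningResult()
    ssn = tax_return.get("ssn", "")
    # Basic SSN validity: nine digits and not an obviously invalid pattern.
    if not (ssn.isdigit() and len(ssn) == 9 and not ssn.startswith("000")):
        result.flag("invalid_ssn")
    # Does the filer exist in any data records at all?
    if consumer_record is None:
        result.flag("no_record_found")
    else:
        if tax_return.get("name", "").lower() != consumer_record.get("name", "").lower():
            result.flag("name_ssn_mismatch")
        if tax_return.get("address") != consumer_record.get("current_address"):
            result.flag("address_mismatch")
    # SSN associated with a deceased person.
    if ssn in deceased_ssns:
        result.flag("deceased_ssn")
    # Multiple returns directed to the same address.
    if len(returns_by_address.get(tax_return.get("address"), [])) > 1:
        result.flag("duplicate_address")
    return result
```

Each flag corresponds to one of the individually identified items that block 225 later aggregates; a production system would of course run these checks against the consumer data sources 172 rather than in-memory dictionaries.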


Next, at block 215, the TRAP system 100 determines whether any device identifiers associated with respective tax returns within the tax return data have been provided and/or are available in order to facilitate an initial device proofing. In response to determining that device identifiers are provided or available, the process 200 may proceed to block 220. In response to determining that no device identifiers are provided or available, the process 200 may proceed to block 225.


At block 220, the TRAP system 100 (for example, via the device activity analysis module 132) may optionally perform an initial device activity screening (for example, device proofing) using any device identifiers associated with respective tax returns which have been provided to or accessed by the TRAP system 100. The device proofing process is described in more detail with reference to FIG. 4 herein. At a high level the device proofing process performed at block 220 may involve accessing device activity data (such as device activity data that may be stored in one of the device activity data sources 174, including as a lookup table which may further include blacklist information for one or more devices) using one or more of the device identifiers. The device activity data may indicate, for example, whether a particular device has been previously associated with other fraudulent activities or is associated with other devices which may have been involved in past fraud. If a particular device associated with one or more of the device identifiers for a particular tax return has been previously associated with other fraudulent activities, the particular tax return may be flagged for potential fraud as well.


Once the initial device activity screening at block 220 has been completed, or in response to determining that no device activity screening need be performed at this stage of the tax fraud analysis, the process 200 proceeds to block 225. At block 225, the TRAP system 100 (for example, via the fraud detection module 134) identifies or flags consumers and/or tax returns for possible fraud, based at least in part on the initial screening performed at block 210 and/or the device activity screening performed at block 220. The flags or indicators may include, for example: a plurality of indicators for individually identified items from each of the items checked in the initial screening process at block 210 and/or the device activity screening performed at block 220; one or more indicators representing aggregate or overall fraud indicators for particular categories, such as an initial screening fraud score or a device fraud score; an overall fraud score; or any other variant or combination thereof.
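One way the block-225 aggregation might be sketched is a weighted combination of the per-check flags into category scores and an overall score. The weights and the linear scoring scheme below are illustrative assumptions; the description leaves the exact scoring model open.

```python
def aggregate_fraud_indicators(screening_flags: list, device_flags: list,
                               screening_weight: float = 1.0,
                               device_weight: float = 2.0) -> dict:
    """Combine per-check flags into category scores plus an overall score.

    The weights and the linear model are illustrative; a real system could
    use any scoring model over the individual indicators.
    """
    screening_score = screening_weight * len(screening_flags)
    device_score = device_weight * len(device_flags)
    return {
        "screening_fraud_score": screening_score,
        "device_fraud_score": device_score,
        "overall_fraud_score": screening_score + device_score,
        "flagged": bool(screening_flags or device_flags),
    }
```

The per-item flags are preserved alongside the aggregates, matching the description's mix of individual indicators and category-level scores.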


At block 230, the TRAP system 100 provides the list of flagged consumer tax returns and possible fraud indicators. This list may be provided, for example, to the particular tax agency which provided the tax return data for fraud analysis. The tax agency may then use the list of flagged tax returns in order to initiate further authentication of consumers who filed the flagged tax returns before completing the processing of those returns, and other purposes as described herein.


In one embodiment, the TRAP system 100 may store at least some identifying information related to flagged tax returns for possible retrieval and continued fraud analysis processes as will be described further below. Otherwise, the TRAP system 100 promptly and securely destroys or removes the tax return data once the process 200 has been completed in order to ensure privacy and maintain compliance with any regulatory requirements with respect to tax return data which may limit the purpose, use or duration under which such data may be held by non-tax agency entities.


Additional Tax Fraud Analysis



FIG. 3 is a flowchart illustrating one embodiment of a process 300 for performing a device activity analysis and/or a knowledge-based authentication process with respect to a consumer asked to provide further authentication information for a tax return flagged as potentially fraudulent, which may be run by one embodiment of the TRAP system of FIG. 6. The process 300 may be performed by TRAP system 100 separately or in conjunction with, for example, the processes 400 of FIG. 4 and/or the process 500 of FIG. 5.


At block 305, the TRAP system 100 (for example, via the device activity analysis module 132) accesses or receives the device identifiers associated with a device used by the consumer to provide identity authentication information for a flagged tax return. For example, in one embodiment, the process 300 may be performed in response to the consumer accessing a website (or using a software application or “app”) provided by the tax agency to provide requested further authentication information in order to complete processing of a tax return filed by the consumer. The website or app may be configured to download a client-side script to the computing device used by the consumer to access the website or app, wherein the client-side script is configured to execute automatically in order to gather device identifiers associated with the consumer's computing device. These device identifiers may be collected and sent either to the tax agency computing system 168, which may in turn provide them to the TRAP system 100 for further fraud analysis; or the device identifiers may be provided directly to the TRAP system 100.


Next at block 310, the TRAP system 100 accesses and analyzes device activity data to identify potentially fraudulent activity that may be associated with the device used by the consumer to provide the requested identity authentication information. The device proofing process is described in more detail with reference to FIG. 4 herein.


At block 315, the TRAP system 100 (for example, the authentication module 122) performs or initiates a knowledge-based authentication (“KBA”) process to further authenticate the consumer. The KBA process is described in more detail with reference to FIG. 5 herein.


In one embodiment, at least some identifying information usable to initially determine an identity of the consumer may be provided to the TRAP system 100. For example, some identifying information may be provided to the TRAP system 100 as follows: when the TRAP system performs the initial screening process described previously, a temporary encrypted identifier may be generated and associated with a flagged return and provided to the tax agency computing system 168. The tax agency computing system 168 may then include the temporary encrypted identifier along with the request to the consumer to access the website or app to provide further authentication information. The encrypted identifier may be provided, for example, as part of a unique access identifier the consumer may be prompted to enter at the website, or embedded in a unique URL or hyperlink the consumer may use to access the website. Once the consumer visits the website, the encrypted identifier may be detected and retrieved, for example as part of the client-side script configured to collect device identifier data, and eventually provided back to the TRAP system 100. The encrypted identifier may then be decrypted and used to determine, for example, either that the consumer is associated with a previously-flagged tax return or to obtain at least some initially identifying information such as a name or other non-sensitive data that may be used to initiate the KBA process.
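The temporary encrypted identifier flow above can be approximated with a small sketch. As a loudly labeled simplification, the example below substitutes an opaque random token mapped server-side for real encryption; a production system would use authenticated encryption with expiry, but the token shares the relevant property that it reveals nothing about the flagged return to the consumer or to an eavesdropper.

```python
import secrets

class TemporaryIdentifierStore:
    """Stand-in for the temporary encrypted identifier described above.

    An opaque random token is mapped to return details server-side; this is
    an illustrative simplification of the encrypt/decrypt round trip.
    """
    def __init__(self):
        self._tokens = {}

    def issue(self, return_id: str, consumer_name: str) -> str:
        # Generated when the initial screening flags a return; the token is
        # handed to the tax agency for inclusion in its request to the consumer.
        token = secrets.token_urlsafe(16)
        self._tokens[token] = {"return_id": return_id, "name": consumer_name}
        return token

    def resolve(self, token: str):
        # Called when the token comes back from the website or app; returns
        # the flagged-return info, or None if the token is unknown.
        return self._tokens.get(token)
```

The token could be embedded in a unique URL (for example, `https://agency.example/verify?t=<token>`, a hypothetical address) or presented as an access code the consumer types in, mirroring the two delivery options described above.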


In another embodiment, as the consumer provides personal information (for example, via the website or app), the personal information may be provided directly or indirectly (for example, via the tax agency computing system 168) to the TRAP system 100. When enough identifying information is received to at least initially determine an identity of the consumer, the TRAP system 100 may access verified consumer data associated with the determined identity, such as credit data, from the consumer data sources 172 in order to generate authentication questions.


At block 320, the TRAP system 100 may optionally access previously screened tax return data to determine whether the consumer and/or the tax return were previously flagged for potential fraud, and/or to what extent such potential fraud may have been previously determined. In some embodiments this data may not be available to the TRAP system 100 or available only in a limited fashion which protects the privacy and security of the underlying tax return data. One embodiment that may permit storage and retrieval of at least the fraud indicators generated by the TRAP system 100 during the initial screening process 200 may involve the use of encrypted identifiers as described above.


Finally, at block 325, the TRAP system 100 provides one or more indicators of the potential fraud for the flagged tax return, based at least in part on: the device activity analysis performed at block 310, the KBA process performed at block 315, and/or the initial screening flag accessed at block 320 (if applicable). For example, the provided indicators may include an indication of whether the computing device has been previously associated with other fraudulent activities; a degree or level of risk that may be associated with such other fraudulent activities; an indicator of whether and/or how many authentication questions were answered correctly by the consumer; an indicator of whether and/or to what extent the tax return may have previously been flagged for potential fraud during the initial screening described in reference to process 200; an overall fraud score, range, number, letter, and so forth that may be generated in the aggregate or for each individually flagged item; and so forth.


Device Activity Analysis



FIG. 4 is a flowchart illustrating one embodiment of a process 400 for performing a device activity analysis which may be run by one embodiment of the TRAP system of FIG. 6. The process 400 may be performed by TRAP system 100 separately or in conjunction with, for example, the process 300 of FIG. 3.


The process 400 begins at block 405, where the TRAP system 100 (for example, via the device activity analysis module 132) accesses device activity data associated with a device, for example using a unique device identifier. The unique device identifier may be generated or determined, for example, based on the one or more device identifiers accessed at block 305 of FIG. 3. The unique device identifier may be one of the accessed device identifiers, or it may be based on some combination of some or all of the accessed device identifiers.
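One plausible way to derive a single identifier from a combination of collected identifiers is a canonical hash. This is an illustrative sketch; the attribute names (user agent, time zone, and so forth) are assumptions, and any stable subset of the collected identifiers could be combined the same way.

```python
import hashlib

def unique_device_id(identifiers: dict) -> str:
    """Derive one stable identifier from several device attributes.

    Sorting the keys makes the hash independent of the order in which the
    identifiers were collected; the attribute names are illustrative.
    """
    canonical = "|".join(f"{k}={identifiers[k]}" for k in sorted(identifiers))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Because the hash is deterministic, the same device presenting the same attributes yields the same identifier across sessions, which is what makes the lookup against historical device activity data possible.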


At block 410, the TRAP system 100 determines whether one or more fraudulent or potentially fraudulent activities are associated with the device based on the accessed device activity data. The device activity analysis process performed at block 410 may involve accessing device activity data (such as device activity data that may be stored in one of the device activity data sources 174) using one or more of the device identifiers. The device activity data may indicate, for example, whether a particular device has been previously associated with other fraudulent activities or whether a device is in a blacklist. If a particular device associated with one or more of the device identifiers has been previously associated with other fraudulent activities, the particular device may be flagged for potential fraud as well.
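The block-410 lookup might be sketched as follows. The schema is an assumption for illustration: a device activity source is represented as a mapping from device identifiers to recorded activities, each with a boolean `fraudulent` field, plus a separate blacklist.

```python
def device_activity_check(device_id: str, activity_table: dict,
                          blacklist: set) -> dict:
    """Sketch of the block-410 device activity analysis.

    `activity_table` maps device ids to lists of activity records; the
    record schema (a "fraudulent" flag per activity) is illustrative.
    """
    fraud_events = [a for a in activity_table.get(device_id, [])
                    if a.get("fraudulent")]
    return {
        "device_id": device_id,
        "blacklisted": device_id in blacklist,
        "prior_fraud_count": len(fraud_events),
        # Flag the device if it has prior fraud or appears on the blacklist.
        "flag_device": bool(fraud_events) or device_id in blacklist,
    }
```

The returned dictionary corresponds to the indicators provided at blocks 420 and 425: an unflagged result plays the role of the block-420 "authenticated" indicator, while a flagged one carries the detail behind a block-425 score or flag.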


Next, at block 415, the TRAP system 100 determines whether any fraudulent activities are associated with the device. In response to determining that no fraudulent activities appear to be associated with the device, the process 400 may proceed to block 420. In response to determining that fraudulent activities are associated with the device, the process 400 may proceed to block 425.


At block 420, the TRAP system 100 provides an indicator that the device is authenticated or otherwise does not appear to be associated with prior fraudulent activities.


At block 425, the TRAP system 100 provides at least one indicator (for example, a score, a flag, or other indicator) to describe the possible involvement of the device in the fraudulent activities. In one embodiment, the TRAP system 100 may also provide or enable access to a dashboard user interface that allows users to fully research and link seemingly unrelated events. The capability provided by the dashboard user interface can have a multiplying effect on the ability to detect fraud because, for example, the residue left by fraudsters across different transactions or accounts can be linked together for more precise detection of fraud rings.


Knowledge-Based Authentication Process



FIG. 5 is a flowchart illustrating one embodiment of a process 500 for performing a knowledge-based authentication process which may be run by one embodiment of the TRAP system of FIG. 6. The process 500 may be performed by TRAP system 100 separately or in conjunction with, for example, the process 300 of FIG. 3.


The process 500 begins at block 505, where the TRAP system 100 (for example, via the authentication module 122) accesses consumer data, such as credit data or a credit report, associated with the consumer (for example, from the consumer data sources 172).


At block 510, the TRAP system 100 generates one or more authentication questions (for example, “out of wallet” questions) based on the accessed consumer data in order to further authenticate the user. For example, out of wallet questions may be generated in order to solicit responses that include information highly likely to only be known by the consumer (and/or unlikely to be known by a fraudster), such as a monthly payment amount on an outstanding debt obligation which may appear on the consumer's credit report, the name or address of a particular loan servicer, the date that the last payment was posted to a credit account, and so on.
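A minimal sketch of generating and grading such questions is shown below. The credit-data schema, the multiple-choice format, and the distractor strategy are all illustrative assumptions; real KBA systems draw on much richer data and question templates.

```python
import random

def out_of_wallet_questions(credit_data: dict, rng: random.Random) -> list:
    """Generate simple out-of-wallet questions from credit data (block 510).

    The schema (accounts with a type and a monthly payment) and the
    near-miss distractors are illustrative, not the patented method.
    """
    questions = []
    for account in credit_data.get("accounts", []):
        correct = account["monthly_payment"]
        # Distractors near, but not equal to, the true payment amount.
        choices = sorted({correct, correct + 50, max(0, correct - 50), correct + 125})
        questions.append({
            "text": f"What is your monthly payment on your {account['type']} account?",
            "choices": rng.sample(choices, k=len(choices)),
            "answer": correct,
        })
    return questions

def grade_responses(questions: list, responses: list) -> int:
    """Count correct answers, as at block 520."""
    return sum(1 for q, r in zip(questions, responses) if q["answer"] == r)
```

The count of correct responses is the kind of indicator provided at block 525, on which the tax agency can base its deny/approve/follow-up decision.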


Next, at block 515, the TRAP system 100 provides the out-of-wallet or other authentication questions. The questions may be provided by the TRAP system 100 directly to the consumer computing device, such as via a client-side script downloaded to the consumer computing device when accessing the tax agency's authentication website. For example, a client-side script may be provided by the TRAP system 100 to the tax agency computing system 168 for inclusion in the website. The client-side script may be configured to retrieve personal information as it is entered by the consumer into a form on the website; send the personal information to the TRAP system 100; receive one or more authentication questions from the TRAP system 100; and present the questions to the consumer for further authentication. The client-side script may be further configured to collect responses to the presented questions and send the responses directly to the TRAP system 100.


At block 520, the TRAP system 100 receives/processes responses to the authentication questions. The responses are processed to determine whether they are accurate with respect to the accessed consumer data.


At block 525, the TRAP system 100 provides an indicator of whether and/or how many responses were correct. This information may be provided to the tax agency computing system 168 which can then use the information to determine whether the tax return should be denied, approved, or flagged for further follow-up.


Example System Implementation and Architecture



FIG. 6 is a block diagram of one embodiment of a tax return analysis platform ("TRAP") system 100 in communication with a network 160 and various systems, such as consumer computing device(s) 162, tax agency computing systems(s) 168, tax return data source(s) 170, consumer data source(s) 172, and device activity data source(s) 174. The TRAP system 100 may be used to implement systems and methods described herein, including but not limited to the processes 200, 300, 400, and 500 of FIGS. 2, 3, 4, and 5, respectively.


TRAP System


In the embodiment of FIG. 6, the TRAP system 100 includes an authentication module 122, an interface module 124, a data partition and security module 126, a screening/precise ID module 128, a device activity analysis module 132, and a fraud detection module 134 that may be stored in the mass storage device 120 as executable software codes that are executed by the CPU 150. These and other modules in the TRAP system 100 may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. In the embodiment shown in FIG. 6, the TRAP system 100 is configured to execute the modules recited above to perform the various methods and/or processes for tax filing data analysis as described herein (such as the processes described with respect to FIGS. 2, 3, 4, and 5 herein).


The authentication module 122 provides capabilities related to the knowledge-based authentication processes described, for example, with reference to FIGS. 3 and 5 herein. For example, the authentication module 122 may be configured to access the consumer data sources 172; generate authentication questions to be presented to a consumer asked to provide further authentication information for a tax return flagged as potentially fraudulent; receive and process responses; and provide indications of the accuracy of responses.


The interface module 124 provides capabilities related to interfacing between the TRAP system 100, the tax agency computing systems 168, and various data sources 170 (if applicable), 172, and 174. For example, the interface module 124 may be configured to provide various client-side scripts to the tax agency which may in turn be installed as part of a web service provided by the tax agency for consumers to access in order to further authenticate for a tax return. The interface module 124 may further be configured to receive data via the client-side scripts or from the tax agency computing systems for further processing by the various other modules described herein.


The data partition and security module 126 provides capabilities related to ensuring that tax return data accessed or received from various tax agency systems 168 and/or tax return data sources 170 are strictly separated or partitioned to maintain data privacy for each respective tax agency. In some embodiments the data partition and security module 126 may also be configured to ensure that the tax return data is promptly and securely destroyed or removed from the memory 130 and/or mass storage 120 of TRAP system 100 once the tax return data fraud analysis process(es) have completed.


The screening/precise ID module 128 provides capabilities related to performing identity screening and related routines, for example on tax returns provided by tax agency computing systems to the TRAP system 100 for fraud analysis. Some of these processes are described with reference to FIG. 2 herein and may include, for example, matching and/or verifying consumer attributes associated with a tax return against verified consumer data accessed from the consumer data sources 172; identifying discrepancies in consumer attributes which may signal potential fraud, such as the use of a prior address rather than a current address; and similar types of screening.


The device activity analysis module 132 provides capabilities related to performing "device proofing" to determine whether a device used by a consumer during any part of the tax return process (for example, either filing/submitting the tax return or providing further information that may be required by the tax agency in order to complete processing of the tax return) may be associated with fraudulent activity. Some of these processes are described with reference to FIGS. 3 and 4 herein and may include, for example, accessing device activity data from the device activity data sources 174; determining whether fraudulent activities may be associated with the device; and providing indicators for the tax agency computing system regarding the likelihood that the device used by the consumer may have been previously used for other fraudulent activities.


The fraud detection module 134 provides capabilities related to those described with respect to the authentication module 122, the screening/precise ID module 128, and/or the device activity/analysis module 132. For example, the fraud detection module 134 may receive outputs from these various other modules and use the output to generate fraud indicator information (for example, a plurality of indicators for individually identified items from each of the modules involved in the fraud analysis process; one or more indicators representing aggregate or overall fraud indicators for particular categories, such as an initial screening fraud score, a device fraud score, and/or a KBA fraud score; an overall fraud score; or any other variant or combination thereof).


The TRAP system 100 includes, for example, a server, workstation, or other computing device. In one embodiment, the exemplary TRAP system 100 includes one or more central processing units ("CPU") 150, which may each include a conventional or proprietary microprocessor. The TRAP system 100 further includes one or more memories 130, such as random access memory ("RAM") for temporary storage of information, one or more read only memories ("ROM") for permanent storage of information, and one or more mass storage devices 120, such as a hard drive, diskette, solid state drive, or optical media storage device. Typically, the modules of the TRAP system 100 are connected to the computer using a standards-based bus system. In different embodiments, the standards-based bus system could be implemented in Peripheral Component Interconnect ("PCI"), Microchannel, Small Computer System Interface ("SCSI"), Industrial Standard Architecture ("ISA") and Extended ISA ("EISA") architectures, for example. In addition, the functionality provided for in the components and modules of TRAP system 100 may be combined into fewer components and modules or further separated into additional components and modules.


The TRAP system 100 is generally controlled and coordinated by operating system software, such as Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Unix, Linux, SunOS, Solaris, iOS, Blackberry OS, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the TRAP system 100 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The exemplary TRAP system 100 may include one or more commonly available input/output (I/O) devices and interfaces 110, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 110 include one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia analytics, for example. The TRAP system 100 may also include one or more multimedia devices 140, such as speakers, video cards, graphics accelerators, and microphones, for example.


Network


In the embodiment of FIG. 6, the I/O devices and interfaces 110 provide a communication interface to various external devices. In the embodiment of FIG. 6, the TRAP system 100 is electronically coupled to a network 160, which comprises one or more of a LAN, WAN, and/or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link. The network 160 communicates with various computing devices and/or other electronic devices via wired or wireless communication links.


According to FIG. 6, in some embodiments information may be provided to or accessed by the TRAP system 100 over the network 160 from one or more tax return data sources 170, consumer data source(s) 172, and/or device activity data source(s) 174. The tax return data source(s) 170, consumer data source(s) 172, and/or device activity data source(s) 174 may include one or more internal and/or external data sources. In some embodiments, one or more of the databases or data sources may be implemented using a relational database, such as Sybase, Oracle, CodeBase and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.


Tax Return Data Sources


The tax return data source(s) 170 may store, for example, tax return data including attributes, profiles, and other data descriptive of or related to tax return filings. The tax return data may include name, address, social security number, financial data related to the return, and other such information typically provided in a local or state tax return filing. In some embodiments, due to the sensitive nature of such tax return data, the TRAP system 100 may not have direct access to the tax return data source(s) 170. Rather, the tax agency computing system(s) 168 would have access to their own respective tax return data sources 170 and provide selected tax return data to the TRAP system 100. In some embodiments the TRAP system 100 may have at least a limited access permission (as indicated by the dashed line connecting network 160 to the tax return data sources 170) which may be allowed by the tax agency or under various laws and regulatory requirements which limit access to such data by non-tax agency or non-government entities.


Consumer Data Sources


The consumer data source(s) 172 may store, for example, credit bureau data (for example, credit bureau data from File Ones℠) and/or other consumer data. Consumer data source(s) 172 may also store geographic level demographics that include one or more models, such as models that identify lifestyle and/or socio-economic attributes associated with a geographic location (for example, MOSAIC® segmentation and/or codes) and/or behavioral/attitudinal/psychographic attributes associated with a geographic location (for example, TrueTouch℠ Touch Points segmentation).


Device Activity Data Sources


The device activity data source(s) 174 may store, for example, device activity data for respective computing devices. The device activity data may include among other things indications of fraudulent activities that may be associated with particular device identifiers. For example, a fraudster may use a device to engage in a fraudulent transaction online, and thus the transaction and a device identifier associated with the device may be collected and stored in a device activity data source 174. Such information may be extremely valuable to prevent future repeat fraud with the same device, such as if a potential tax fraudster attempts to use the device in relation to filing of a fraudulent tax return.


Additional Use Cases


In some embodiments, the systems and methods may be used to provide a variety of features, such as the features described below.


Risk Based Versus Traditional Rules Based Tax Return Analysis


One aspect of the identity authentication or screening processes described herein is that the processes may be based upon data and analytics used in the financial industry to approve millions of credit transactions daily. In some instances authentication tools may be certified under Federal Identity, Credential and Access Management (“FICAM”) at a National Institute of Standards and Technology (“NIST”) Level of Assurance (“LOA”) 3 for virtual identity authentication. Such authentication tools help organizations to mitigate risk in billions of dollars in credit transactions, the majority of which are done electronically and in virtual anonymity. One strength of these authentication tools is the ability to not only use traditional identity verification checks based upon public records review, but the addition of a risk based process providing an identity fraud score which significantly lowers the number of false positives. The most predictive authentication and fraud scores are those that incorporate multiple data assets spanning traditionally used customer information categories, such as public records and demographic data, but also utilize, when possible, credit history attributes and historical application and inquiry records. Scores that incorporate a breadth of varied data categories such as credit attributes and demographic data typically outperform models built on singular categories of data such as public record assets.


Knowledge-Based Authentication


In addition, to verify a tax filer's identity, further authentication of those returns that are identified or flagged as suspect or potentially fraudulent may be implemented to provide greater assurance that the tax refund is released to the legitimate taxpayer. In conjunction with a risk-based identity proofing process, the tax refund fraud detection process can be further strengthened by use of a knowledge based authentication ("KBA") process, which often includes "Out of Wallet" questions. The tax filer is required to answer a list of questions correctly in order to receive the requested refund. In certain embodiments, challenge-response question technology can be used to dynamically formulate questions only the true taxpayer would know. With an adjustable question configuration and the ability to change strategies for each inquiry, tax agencies may be well-suited to achieve their identity verification or fraud prevention and detection objectives with various levels of authentication. Configurable time limits can prevent fraudsters from researching answers during faceless interactions, and the use of both credit and non-credit related questions provides a more accurate picture of the consumer and further assures that the refund is being released to the legitimate taxpayer. The KBA processes described herein may be provided, for example, via a web site or app.


Device Identity Proofing


Many tax filings are now conducted electronically, which further preserves anonymity for the fraudster and allows for a quicker turnaround in receiving the fraudulent refund. Individuals committing tax fraud will typically use the same computer to submit tax returns, submit credit applications, open accounts, and so forth. Device proofing capabilities offered by embodiments of the systems and methods described herein can authenticate the device being used to provide additional assurance that the device is not currently involved in or tied to other fraudulent activity, nor has it been involved in or tied to any past fraudulent activity.


Fraud Detection in Other Financial Transactions


A stolen identity has a short shelf life, and fraudsters will frequently try to use it for multiple transactions before abandoning it. Thus, in some embodiments, an inquiry process that utilizes a complex set of algorithms may determine if the attributes of the identity used in the tax return have been involved in other fraudulent attempts to open accounts or secure lines of credit. This independent inquiry check, based on the same identity credentials being used to submit the fraudulent tax return, can help identify whether the fraudster has attempted to use these same credentials in other credit-related activities.
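The independent inquiry check above can be sketched as matching the identity attributes on a return against recent credit-related inquiries. The record fields (`ssn`, `name`) and the flat list of inquiries are assumptions for illustration; a production system would use a far richer attribute match.

```python
def identity_reuse_check(tax_identity, credit_inquiries):
    """Return the credit-related inquiries that reused the same identity
    credentials (here, SSN plus name) appearing on the tax return."""
    return [
        inquiry for inquiry in credit_inquiries
        if inquiry["ssn"] == tax_identity["ssn"]
        and inquiry["name"].lower() == tax_identity["name"].lower()
    ]
```

A non-empty result indicates the same credentials were also used in attempts to open accounts or secure credit, which can contribute to flagging the return.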


Returns Linked to Multiple Bank Accounts and Addresses


One of the weaknesses in the tax filing system which is exploited by income tax fraudsters is the government's need to quickly process returns and provide refunds. Tax returns are frequently processed and refunds released within a few days or weeks. This quick turnaround may not allow the government to fully authenticate all the elements submitted on returns. Current fraud detection processes do not detect addresses or bank accounts that are receiving multiple refunds. Most income tax refund fraudsters want easy access to their fraudulent refunds and therefore choose to have the refund placed on a debit card and sent to one or two of the same addresses, or deposited into one or two accounts. Having large numbers of debit cards sent to the same address, or refunds deposited into one account, is not normal behavior for legitimate tax filers, and thus evidence of such behavior can also be used as a fraudulent flag indicator in the return analysis process.


Thus, in one embodiment, the TRAP system 100 may be configured to analyze tax return filings to, for example, determine whether a same address is used multiple times across multiple tax returns. The TRAP system 100 also may be configured to analyze tax return filings to, for example, determine whether a same bank account is used multiple times across multiple tax returns. In another embodiment, the TRAP system 100 may be configured to combine both of these features to determine whether a same address is used in conjunction with a same bank account across multiple tax returns. Any one of these determinations, alone or in combination, may contribute or give rise to a fraudulent flag indicator. In another embodiment, as another fraud safeguard, the TRAP system 100 may be configured to access verified bank account data (for example, under permission from a participating bank service provider), or be configured to request verification of a bank account with respect to the individual tax filer. Thus, for example, if one or more tax returns appear potentially fraudulent based on repeated use of a same address or a same bank account, the TRAP system 100 may be configured to perform an additional bank account verification process to verify whether the tax filer(s) associated with the suspect return(s) are verified account holders with respect to the bank accounts used on the suspected return(s).
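The duplicate-destination analysis described above amounts to counting how often each address and bank account appears across a batch of returns and flagging those past a threshold. This is a minimal sketch; the threshold of 3 and the flat record layout are illustrative assumptions, not values from this disclosure.

```python
from collections import Counter

def flag_duplicate_destinations(returns, threshold=3):
    """Return the set of return IDs whose mailing address or deposit account
    appears on `threshold` or more returns in the batch."""
    address_counts = Counter(r["address"] for r in returns)
    account_counts = Counter(r["bank_account"] for r in returns)
    flagged = set()
    for r in returns:
        if (address_counts[r["address"]] >= threshold
                or account_counts[r["bank_account"]] >= threshold):
            flagged.add(r["return_id"])  # contributes a fraudulent flag indicator
    return flagged
```

Flagged returns could then be routed to the additional bank account verification step described above rather than being rejected outright, keeping false positives low for legitimate filers.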


Income Check Against Reported Adjusted Gross Income (“AGI”)


As described above, the income tax refund fraudster can use a variety of methods to obtain a consumer's name, address, and Social Security Number. However, it is not as easy for a fraudster to obtain information on an individual's income. According to TIGTA, access to third-party income and withholding information at the time tax returns are processed can be an important tool in identifying and preventing tax refund fraud. Unfortunately, this information is usually not available until well after tax filing season begins, since employers are not required to file W-2 information until after the consumer filing process begins. The amounts listed on fraudulent returns can thus be falsified by the fraudster in order to increase the ultimate number of deductions and extract the largest refund without arousing suspicion. In some instances, using income estimation models, the reported income can be checked against third-party data based not on previous years' returns but on independent financial information that can take into account a consumer's credit history and recent spending habits. While not a report of the actual income, it can provide a gauge that can be used to flag returns where reported income is well outside the expected norm for that tax filer.
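The income reasonableness gauge above can be sketched as a simple deviation test between reported AGI and a model-estimated income. The fractional tolerance of 0.5 is an illustrative assumption; any real deployment would tune this against false-positive rates.

```python
def income_outlier_flag(reported_agi, estimated_income, tolerance=0.5):
    """Flag a return whose reported AGI deviates from the income-model
    estimate by more than `tolerance` (as a fraction of the estimate).
    Not a report of actual income, only a gauge for routing to review."""
    if estimated_income <= 0:
        return False  # no usable estimate; cannot gauge this filer
    deviation = abs(reported_agi - estimated_income) / estimated_income
    return deviation > tolerance
```

Because the estimate is derived from credit history and spending habits rather than prior returns, this check works even early in the filing season, before W-2 data is available.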


The use of a risk-based identity authentication process, coupled with business rules based analysis and knowledge based authentication tools, can facilitate identification of fraudulent tax returns. In addition, the ability to perform non-traditional checks against device fraud activity; the use by fraudsters of the same identity credentials in independent financial transactions; detecting that multiple refunds are requested to be sent to the same address or deposited into the same bank account; and the ability to check the reported income against an individual consumer's estimated income, further strengthen the tax refund fraud detection processes and help close additional loopholes exploited by the tax fraudster while at the same time decreasing the number of false positives. Embodiments of the tax return analysis platform system and methods described herein may be easy to implement, integrate seamlessly into any existing tax return evaluation process, and/or add little to no additional time to the existing process, thereby assuring a continued quick turnaround for legitimate tax refund releases, while at the same time providing increased assurance that the refunds are being provided to the legitimate taxpayer.
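One hedged way to picture how these individual checks could feed a single decision is a weighted sum of fired indicators. The indicator names, weights, and 0-100 scale below are illustrative assumptions only; this disclosure does not specify a scoring formula.

```python
# Illustrative weights for the indicators discussed in this disclosure.
DEFAULT_WEIGHTS = {
    "screening_failed": 30,
    "device_high_risk": 25,
    "identity_reused_elsewhere": 20,
    "duplicate_address_or_account": 15,
    "income_outlier": 10,
}

def fraud_score(indicators, weights=DEFAULT_WEIGHTS):
    """Sum the weights of the indicators that fired, capped at 100.
    A higher score would route the return to further authentication
    (for example, KBA) rather than automatic refund release."""
    score = sum(weights[name] for name, fired in indicators.items() if fired)
    return min(score, 100)
```

Keeping each check independent and additive mirrors the goal stated above: returns with no fired indicators pass through with no added delay, preserving quick turnaround for legitimate filers.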


OTHER EMBODIMENTS

Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device, such as the TRAP system 100, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “for example,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.


While certain example embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosure. Thus, nothing in the foregoing description is intended to imply that any particular element, feature, characteristic, step, module, or block is necessary or indispensable. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions disclosed herein. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of certain of the inventions disclosed herein.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


All of the methods and processes described above may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers. For example, the methods described herein may be performed by the TRAP system 100 and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium. A tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated.

Claims
  • 1. A system comprising: a network interface configured to send and receive secure, encrypted electronic messages with a remote tax agency server, the remote tax agency server comprising a first electronic data store configured to store a plurality of tax return data associated with a plurality of consumers and at least one tax agency; a second electronic data store comprising device activity data records associated with the plurality of consumers; and a computing device configured to electronically communicate with the remote tax agency server and the second electronic data store, the computing device comprising one or more processors programmed to execute software instructions to cause the system to: access, from the remote tax agency server, a first tax return data associated with a first tax return associated with a first consumer of the plurality of consumers, the first tax return data comprising one or more consumer attributes associated with the first consumer; identify the first tax return as a flagged tax return based at least in part on the one or more consumer attributes associated with the first consumer; access, from the second electronic data store, a first device activity data associated with a consumer computing device associated with the first consumer, the first device activity data indicative of at least one of: whether the consumer computing device has been previously associated with fraudulent activities, a degree of risk associated with the fraudulent activities, or whether the consumer computing device is in a list of devices that are banned from conducting a tax return filing; analyze the first device activity data; and based at least in part on the analysis of the first device activity data, generate a fraud score for the first tax return data and request authentication of the consumer computing device associated with the first consumer.
  • 2. The system of claim 1, wherein the software instructions further cause the system to: access, from a third electronic data store, a first consumer data associated with the first consumer; determine one or more consumer attributes from the first consumer data; compare the one or more consumer attributes from the first consumer data with the one or more consumer attributes associated with the first tax return; and determine whether the one or more consumer attributes from the first consumer data match with the one or more consumer attributes associated with the first tax return.
  • 3. The system of claim 1, wherein the authentication of the consumer computing device associated with the first consumer comprises: generating a set of one or more authentication questions; transmitting, to the consumer computing device, the set of one or more authentication questions; receiving, from the consumer computing device, a set of one or more responses to the set of one or more authentication questions; and determining, based on the set of one or more responses, whether the set of one or more responses are correct.
  • 4. The system of claim 1, wherein the consumer computing device associated with the first consumer is the same as or different from a device used to electronically submit the first tax return of the first consumer.
  • 5. The system of claim 1, wherein the first tax return data further comprises one or more device identifiers associated with a device used to electronically submit the first tax return, and wherein a first tax fraud indicator is generated based in part on the one or more device identifiers of the first tax return data.
  • 6. The system of claim 1, wherein the first device activity data associated with the first consumer is accessed from the second electronic data store based at least in part on a unique device identifier associated with the consumer computing device associated with the first consumer.
  • 7. The system of claim 6, wherein the software instructions further cause the system to: send a request to the consumer computing device, the request comprising a link configured to automatically gather device identification information when accessed; receive device identification information associated with the consumer computing device from the consumer computing device; and determine the unique device identifier associated with the consumer computing device based at least in part on the device identification information.
  • 8. A computer-implemented method performed by one or more computer processors comprising: accessing, via a server system, a first tax return data from a remote tax agency server via a network interface, the network interface configured to send and receive secure, encrypted electronic messages with the remote tax agency server, the remote tax agency server comprising a first electronic data store configured to store a plurality of tax return data associated with a plurality of consumers and at least one tax agency, the first tax return data associated with a first tax return associated with a first consumer of the plurality of consumers and comprising one or more consumer attributes associated with the first consumer; identifying, via a server system, the first tax return as a flagged tax return based at least in part on the one or more consumer attributes associated with the first consumer; accessing, via a server system, a first device activity data from a second electronic data store, the first device activity data associated with a consumer computing device associated with the first consumer, the first device activity data indicative of at least one of: whether the consumer computing device has been previously associated with fraudulent activities, a degree of risk associated with the fraudulent activities, or whether the consumer computing device is in a list of devices that are banned from conducting a tax return filing; analyzing, via a server system, the first device activity data; and based at least in part on the analysis of the first device activity data, generating, via a server system, a fraud score for the first tax return data and generating an electronic authentication instruction to launch an authentication execution script to conduct authentication of the consumer computing device associated with the first consumer.
  • 9. The computer-implemented method of claim 8 further comprising: accessing, from a third data store, a first consumer data associated with the first consumer; determining one or more consumer attributes from the first consumer data; comparing the one or more consumer attributes from the first consumer data with the one or more consumer attributes associated with the first tax return; and determining whether the one or more consumer attributes from the first consumer data match with the one or more consumer attributes associated with the first tax return.
  • 10. The computer-implemented method of claim 8, wherein the requesting authentication of the consumer computing device associated with the first consumer comprises: generating a set of one or more authentication questions; transmitting, to the consumer computing device, the set of one or more authentication questions; receiving, from the consumer computing device, a set of one or more responses to the set of one or more authentication questions; and determining, based on the set of one or more responses, whether the set of one or more responses are correct.
  • 11. The computer-implemented method of claim 8, wherein the authentication of the consumer computing device is based at least in part on personal information associated with the first consumer.
  • 12. The computer-implemented method of claim 8, wherein the first tax return data further comprises one or more device identifiers associated with a device used to electronically submit the first tax return, and wherein a first tax fraud indicator is generated based in part on the one or more device identifiers of the first tax return data.
  • 13. The computer-implemented method of claim 8, wherein the first device activity data associated with the first consumer is accessed from the second electronic data store based at least in part on a unique device identifier associated with the consumer computing device associated with the first consumer.
  • 14. The computer-implemented method of claim 13, further comprising: sending a request to the consumer computing device, the request comprising a link configured to automatically gather device identification information when accessed; receiving device identification information associated with the consumer computing device from the consumer computing device; and determining the unique device identifier associated with the consumer computing device based at least in part on the device identification information.
  • 15. A non-transitory computer storage having stored thereon a computer program, the computer program including executable instructions that instruct a computer system to at least: access a first tax return data from a remote tax agency server via a network interface, the network interface configured to send and receive secure, encrypted electronic messages with the remote tax agency server, the remote tax agency server comprising a first electronic data store configured to store a plurality of tax return data associated with a plurality of consumers and at least one tax agency, the first tax return data associated with a first tax return associated with a first consumer of the plurality of consumers and comprising one or more consumer attributes associated with the first consumer; identify the first tax return as a flagged tax return based at least in part on the one or more consumer attributes associated with the first consumer; access a first device activity data from a second electronic data store, the first device activity data associated with a consumer computing device associated with the first consumer, the first device activity data indicative of at least one of: whether the consumer computing device has been previously associated with fraudulent activities, a degree of risk associated with the fraudulent activities, or whether the consumer computing device is in a list of devices that are banned from conducting a tax return filing; analyze the first device activity data; and based at least in part on the analysis of the first device activity data, generate a fraud score for the first tax return data and request authentication of the consumer computing device associated with the first consumer.
  • 16. The non-transitory computer storage of claim 15, wherein the executable instructions further instruct the computer system to: access, from a third data store, a first consumer data associated with the first consumer; determine one or more consumer attributes from the first consumer data; compare the one or more consumer attributes from the first consumer data with the one or more consumer attributes associated with the first tax return; and determine whether the one or more consumer attributes from the first consumer data match with the one or more consumer attributes associated with the first tax return.
  • 17. The non-transitory computer storage of claim 15, wherein the requesting authentication of the consumer computing device associated with the first consumer comprises: generating a set of one or more authentication questions; transmitting, to the consumer computing device, the set of one or more authentication questions; receiving, from the consumer computing device, a set of one or more responses to the set of one or more authentication questions; and determining, based on the set of one or more responses, whether the set of one or more responses are correct.
  • 18. The non-transitory computer storage of claim 15, wherein the first tax return data further comprises one or more device identifiers associated with a device used to electronically submit the first tax return, and wherein a first tax fraud indicator is generated based in part on the one or more device identifiers of the first tax return data.
  • 19. The non-transitory computer storage of claim 15, wherein the first device activity data associated with the first consumer is accessed from the second electronic data store based at least in part on a unique device identifier associated with the consumer computing device associated with the first consumer.
  • 20. The non-transitory computer storage of claim 19, wherein the executable instructions further instruct the computer system to: send a request to the consumer computing device, the request comprising a link configured to automatically gather device identification information when accessed; receive device identification information associated with the consumer computing device from the consumer computing device; and determine the unique device identifier associated with the consumer computing device based at least in part on the device identification information.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 17/208,327 filed on Mar. 22, 2021, entitled SYSTEM AND ARCHITECTURE FOR ELECTRONIC FRAUD DETECTION, which is a continuation of U.S. patent application Ser. No. 16/443,662 filed on Jun. 17, 2019, entitled SYSTEM AND ARCHITECTURE FOR ELECTRONIC FRAUD DETECTION, which is a continuation of U.S. patent application Ser. No. 14/928,770 filed Oct. 30, 2015, entitled SYSTEM AND ARCHITECTURE FOR ELECTRONIC FRAUD DETECTION, which claims the benefit of priority from U.S. Provisional Patent Application No. 62/073,714 filed on Oct. 31, 2014, entitled SYSTEM AND ARCHITECTURE FOR ELECTRONIC FRAUD DETECTION. All above-cited applications are hereby incorporated herein by reference in their entirety.

7519558 Ballard et al. Apr 2009 B2
7522060 Tumperi et al. Apr 2009 B1
7533808 Song et al. May 2009 B2
7536346 Aliffi et al. May 2009 B2
7540021 Page May 2009 B2
7542993 Satterfield et al. Jun 2009 B2
7543739 Brown et al. Jun 2009 B2
7543740 Greene et al. Jun 2009 B2
7546271 Chmielewski et al. Jun 2009 B1
7548886 Kirkland et al. Jun 2009 B2
7552467 Lindsay Jun 2009 B2
7562184 Henmi et al. Jul 2009 B2
7562814 Shao et al. Jul 2009 B1
7568616 Zuili Aug 2009 B2
7575157 Barnhardt et al. Aug 2009 B2
7580884 Cook Aug 2009 B2
7581112 Brown et al. Aug 2009 B2
7584146 Duhon Sep 2009 B1
7587368 Felsher Sep 2009 B2
7591425 Zuili et al. Sep 2009 B1
7593891 Kornegay et al. Sep 2009 B2
7606401 Hoffman et al. Oct 2009 B2
7606790 Levy Oct 2009 B2
7610216 May et al. Oct 2009 B1
7610229 Kornegay Oct 2009 B1
7610243 Haggerty et al. Oct 2009 B2
7620596 Knudson et al. Nov 2009 B2
7623844 Herrmann et al. Nov 2009 B2
7630924 Collins et al. Dec 2009 B1
7630932 Danaher et al. Dec 2009 B2
7636853 Cluts et al. Dec 2009 B2
7644868 Hare Jan 2010 B2
7647344 Skurtovich, Jr. et al. Jan 2010 B2
7647645 Edeki et al. Jan 2010 B2
7653593 Zarikian et al. Jan 2010 B2
7657431 Hayakawa Feb 2010 B2
7668769 Baker et al. Feb 2010 B2
7668840 Bayliss et al. Feb 2010 B2
7668921 Proux et al. Feb 2010 B2
7672865 Kumar et al. Mar 2010 B2
7673793 Greene et al. Mar 2010 B2
7676418 Chung et al. Mar 2010 B1
7676433 Ross et al. Mar 2010 B1
7685096 Margolus et al. Mar 2010 B2
7686214 Shao et al. Mar 2010 B1
7689007 Bous et al. Mar 2010 B2
7689505 Kasower Mar 2010 B2
7689506 Fei et al. Mar 2010 B2
7690032 Peirce Mar 2010 B1
7701364 Zilberman Apr 2010 B1
7702550 Perg et al. Apr 2010 B2
7707163 Anzalone et al. Apr 2010 B2
7708190 Brandt et al. May 2010 B2
7708200 Helsper et al. May 2010 B2
7711635 Steele et al. May 2010 B2
7711636 Robida et al. May 2010 B2
7720750 Brody May 2010 B2
7725300 Pinto et al. May 2010 B2
7734523 Cui et al. Jun 2010 B1
7735125 Alvarez et al. Jun 2010 B1
7742982 Chaudhuri et al. Jun 2010 B2
7747520 Livermore et al. Jun 2010 B2
7747521 Serio Jun 2010 B2
7747559 Leitner et al. Jun 2010 B2
7752084 Pettitt Jul 2010 B2
7752236 Williams et al. Jul 2010 B2
7752554 Biggs et al. Jul 2010 B2
7756783 Crooks Jul 2010 B2
7761379 Zoldi et al. Jul 2010 B2
7761384 Madhogarhia Jul 2010 B2
7774270 MacCloskey Aug 2010 B1
7778885 Semprevivo et al. Aug 2010 B1
7779456 Dennis et al. Aug 2010 B2
7779457 Taylor Aug 2010 B2
7783281 Cook et al. Aug 2010 B1
7783515 Kumar et al. Aug 2010 B1
7788184 Kane Aug 2010 B2
7792715 Kasower Sep 2010 B1
7792864 Rice et al. Sep 2010 B1
7793835 Coggeshall et al. Sep 2010 B1
7801811 Merrell et al. Sep 2010 B1
7801828 Candella et al. Sep 2010 B2
7802104 Dickinson Sep 2010 B2
7805362 Merrell et al. Sep 2010 B1
7805391 Friedlander et al. Sep 2010 B2
7809797 Cooley et al. Oct 2010 B2
7813944 Luk et al. Oct 2010 B1
7827115 Weller et al. Nov 2010 B2
7832006 Chen et al. Nov 2010 B2
7835983 Lefner et al. Nov 2010 B2
7840459 Loftesness et al. Nov 2010 B1
7841004 Balducci et al. Nov 2010 B1
7844520 Franklin Nov 2010 B1
7848987 Haig Dec 2010 B2
7849029 Crooks et al. Dec 2010 B2
7853518 Cagan Dec 2010 B2
7853526 Milana Dec 2010 B2
7853533 Eisen Dec 2010 B2
7853998 Blaisdell et al. Dec 2010 B2
7856397 Whipple et al. Dec 2010 B2
7856494 Kulkarni Dec 2010 B2
7860769 Benson Dec 2010 B2
7860783 Yang et al. Dec 2010 B2
7865427 Wright et al. Jan 2011 B2
7865439 Seifert et al. Jan 2011 B2
7865937 White et al. Jan 2011 B1
7870078 Clark et al. Jan 2011 B2
7870599 Pemmaraju Jan 2011 B2
7873382 Rydgren et al. Jan 2011 B2
7873566 Templeton et al. Jan 2011 B1
7874488 Parkinson Jan 2011 B2
7877304 Coulter Jan 2011 B1
7877784 Chow et al. Jan 2011 B2
7882548 Heath et al. Feb 2011 B2
7890433 Singhal Feb 2011 B2
7904360 Evans Mar 2011 B2
7904367 Chung et al. Mar 2011 B2
7908242 Achanta Mar 2011 B1
7909246 Hogg et al. Mar 2011 B2
7912865 Akerman et al. Mar 2011 B2
7917715 Tallman, Jr. Mar 2011 B2
7925582 Kornegay et al. Apr 2011 B1
7929951 Stevens et al. Apr 2011 B2
7933835 Keane et al. Apr 2011 B2
7941363 Tanaka et al. May 2011 B2
7945515 Zoldi et al. May 2011 B2
7950577 Daniel May 2011 B1
7958046 Doerner et al. Jun 2011 B2
7961857 Zoldi et al. Jun 2011 B2
7962404 Metzger, II et al. Jun 2011 B1
7962467 Howard et al. Jun 2011 B2
7970679 Kasower Jun 2011 B2
7970698 Gupta et al. Jun 2011 B2
7970701 Lewis et al. Jun 2011 B2
7971246 Emigh et al. Jun 2011 B1
7975299 Balducci et al. Jul 2011 B1
7983976 Nafeh et al. Jul 2011 B2
7983979 Holland, IV Jul 2011 B2
7984849 Berghel et al. Jul 2011 B2
7988043 Davis Aug 2011 B2
7991201 Bous et al. Aug 2011 B2
7991689 Brunzell et al. Aug 2011 B1
7991716 Crooks et al. Aug 2011 B2
7991751 Peled et al. Aug 2011 B2
7995994 Khetawat et al. Aug 2011 B2
7996521 Chamberlain et al. Aug 2011 B2
8001034 Chung et al. Aug 2011 B2
8001042 Brunzell et al. Aug 2011 B1
8001153 Skurtovich, Jr. et al. Aug 2011 B2
8001597 Crooks Aug 2011 B2
8005749 Ginsberg Aug 2011 B2
8006291 Headley et al. Aug 2011 B2
8009873 Chapman Aug 2011 B2
8019678 Wright et al. Sep 2011 B2
8020763 Kowalchyk et al. Sep 2011 B1
8024263 Zarikian et al. Sep 2011 B2
8024271 Grant Sep 2011 B2
8027439 Zoldi et al. Sep 2011 B2
8027518 Baker et al. Sep 2011 B2
8027947 Hinsz et al. Sep 2011 B2
8028168 Smithies et al. Sep 2011 B2
8028326 Palmer et al. Sep 2011 B2
8028329 Whitcomb Sep 2011 B2
8028896 Carter et al. Oct 2011 B2
8032448 Anderson et al. Oct 2011 B2
8032449 Hu et al. Oct 2011 B2
8032927 Ross Oct 2011 B2
8037097 Guo et al. Oct 2011 B2
8037512 Wright et al. Oct 2011 B2
8041597 Li et al. Oct 2011 B2
8042159 Basner et al. Oct 2011 B2
8042193 Piliouras Oct 2011 B1
8049596 Sato Nov 2011 B2
8055667 Levy Nov 2011 B2
8056128 Dingle et al. Nov 2011 B1
8058972 Mohanty Nov 2011 B2
8060424 Kasower Nov 2011 B2
8060915 Voice et al. Nov 2011 B2
8060916 Bajaj et al. Nov 2011 B2
8065233 Lee et al. Nov 2011 B2
8065525 Zilberman Nov 2011 B2
8069053 Gervais et al. Nov 2011 B2
8069084 Mackouse Nov 2011 B2
8069256 Rasti Nov 2011 B2
8069485 Carter Nov 2011 B2
8073785 Candella et al. Dec 2011 B1
8078569 Kennel Dec 2011 B2
8090648 Zoldi et al. Jan 2012 B2
8104679 Brown Jan 2012 B2
8116731 Buhrmann et al. Feb 2012 B2
8121962 Vaiciulis et al. Feb 2012 B2
8131615 Diev et al. Mar 2012 B2
8151327 Eisen Apr 2012 B2
8195549 Kasower Jun 2012 B2
8201257 Andres et al. Jun 2012 B1
8204774 Chwast et al. Jun 2012 B2
8204982 Casado et al. Jun 2012 B2
8214262 Semprevivo et al. Jul 2012 B1
8214285 Hu et al. Jul 2012 B2
8224723 Bosch et al. Jul 2012 B2
8225395 Atwood et al. Jul 2012 B2
8239677 Colson Aug 2012 B2
8244629 Lewis et al. Aug 2012 B2
8255978 Dick Aug 2012 B2
8260914 Ranjan Sep 2012 B1
8280805 Abrahams et al. Oct 2012 B1
8280833 Miltonberger Oct 2012 B2
8285613 Coulter Oct 2012 B1
8285636 Curry et al. Oct 2012 B2
8296225 Maddipati et al. Oct 2012 B2
8296229 Yellin et al. Oct 2012 B1
8296250 Crooks et al. Oct 2012 B2
8332338 Vaiciulis et al. Dec 2012 B2
8346593 Fanelli Jan 2013 B2
8355896 Kumar et al. Jan 2013 B2
8359278 Domenikos et al. Jan 2013 B2
8364588 Celka et al. Jan 2013 B2
8374973 Herbrich et al. Feb 2013 B2
8386377 Xiong et al. Feb 2013 B1
8429070 Hu et al. Apr 2013 B2
8463904 Casado et al. Jun 2013 B2
8468090 Lesandro et al. Jun 2013 B2
8489479 Slater et al. Jul 2013 B2
8510329 Balkir et al. Aug 2013 B2
8515844 Kasower Aug 2013 B2
8516439 Brass et al. Aug 2013 B2
8543499 Haggerty et al. Sep 2013 B2
8548137 Zoldi et al. Oct 2013 B2
8548903 Becker Oct 2013 B2
8549590 de Villiers Prichard et al. Oct 2013 B1
8559607 Zoldi et al. Oct 2013 B2
8567669 Griegel et al. Oct 2013 B2
8578496 Krishnappa Nov 2013 B1
8626671 Federgreen Jan 2014 B2
8630938 Cheng et al. Jan 2014 B2
8639920 Stack et al. Jan 2014 B2
8645301 Vaiciulis et al. Feb 2014 B2
8671115 Skurtovich, Jr. et al. Mar 2014 B2
8676684 Newman et al. Mar 2014 B2
8676726 Hore et al. Mar 2014 B2
8682755 Bucholz et al. Mar 2014 B2
8683586 Crooks Mar 2014 B2
8694427 Maddipati et al. Apr 2014 B2
8707445 Sher-Jan et al. Apr 2014 B2
8725613 Celka et al. May 2014 B1
8763133 Sher-Jan et al. Jun 2014 B2
8776225 Pierson et al. Jul 2014 B2
8781953 Kasower Jul 2014 B2
8781975 Bennett et al. Jul 2014 B2
8793777 Colson Jul 2014 B2
8805836 Hore et al. Aug 2014 B2
8812387 Samler et al. Aug 2014 B1
8819793 Gottschalk, Jr. Aug 2014 B2
8824648 Zoldi et al. Sep 2014 B2
8826393 Eisen Sep 2014 B2
8862514 Eisen Oct 2014 B2
8862526 Miltonberger Oct 2014 B2
8909664 Hopkins Dec 2014 B2
8918891 Coggeshall et al. Dec 2014 B2
8949981 Trollope et al. Feb 2015 B1
9118646 Pierson et al. Aug 2015 B2
9147117 Madhu et al. Sep 2015 B1
9191403 Zoldi et al. Nov 2015 B2
9194899 Zoldi et al. Nov 2015 B2
9196004 Eisen Nov 2015 B2
9210156 Little et al. Dec 2015 B1
9235728 Gottschalk, Jr. et al. Jan 2016 B2
9251541 Celka et al. Feb 2016 B2
9256624 Skurtovich, Jr. et al. Feb 2016 B2
9280658 Coggeshall et al. Mar 2016 B2
9361597 Britton et al. Jun 2016 B2
9367520 Zhao et al. Jun 2016 B2
9390384 Eisen Jul 2016 B2
9412141 Prichard et al. Aug 2016 B2
9483650 Sher-Jan et al. Nov 2016 B2
9489497 MaGill et al. Nov 2016 B2
9531738 Zoldi et al. Dec 2016 B2
9558368 Gottschalk, Jr. et al. Jan 2017 B2
9595066 Samler et al. Mar 2017 B2
9600845 Nordyke et al. Mar 2017 B2
9652802 Kasower May 2017 B1
9704195 Zoldi Jul 2017 B2
9710523 Skurtovich, Jr. et al. Jul 2017 B2
9710868 Gottschalk, Jr. et al. Jul 2017 B2
9754256 Britton et al. Sep 2017 B2
9754311 Eisen Sep 2017 B2
9760885 Ramalingam et al. Sep 2017 B1
9773227 Zoldi et al. Sep 2017 B2
9781147 Sher-Jan et al. Oct 2017 B2
9805216 Kraska et al. Oct 2017 B2
9953321 Zoldi et al. Apr 2018 B2
10043213 Straub et al. Aug 2018 B2
10089411 Kassa Oct 2018 B2
10089679 Eisen Oct 2018 B2
10089686 Straub et al. Oct 2018 B2
10102530 Zoldi et al. Oct 2018 B2
10115153 Zoldi et al. Oct 2018 B2
10152736 Yang et al. Dec 2018 B2
10217163 Straub et al. Feb 2019 B2
10242540 Chen et al. Mar 2019 B2
10339527 Coleman et al. Jul 2019 B1
10373061 Kennel et al. Aug 2019 B2
10404472 Knopf Sep 2019 B2
10430604 Spinelli et al. Oct 2019 B2
10438308 Prichard et al. Oct 2019 B2
10482542 Jain Nov 2019 B1
10497034 Yang et al. Dec 2019 B2
10510025 Zoldi et al. Dec 2019 B2
10521857 Shao et al. Dec 2019 B1
10528948 Zoldi et al. Jan 2020 B2
10579938 Zoldi et al. Mar 2020 B2
10592982 Samler et al. Mar 2020 B2
10593004 Gottschalk, Jr. et al. Mar 2020 B2
10616196 Khitrenovich et al. Apr 2020 B1
10692058 Zoldi et al. Jun 2020 B2
10699028 Kennedy et al. Jun 2020 B1
10713711 Zoldi Jul 2020 B2
10769290 Crawford et al. Sep 2020 B2
10791136 Zoldi et al. Sep 2020 B2
10896381 Zoldi et al. Jan 2021 B2
10896472 Stack et al. Jan 2021 B1
10902426 Zoldi et al. Jan 2021 B2
10909617 Kasower Feb 2021 B2
10958725 Knopf Mar 2021 B2
10977363 Leitner et al. Apr 2021 B2
10990979 Coleman et al. Apr 2021 B1
10999298 Eisen May 2021 B2
11023963 Zoldi et al. Jun 2021 B2
11025428 Knopf Jun 2021 B2
11030562 Dean et al. Jun 2021 B1
11037229 Zoldi et al. Jun 2021 B2
11080740 Billman et al. Aug 2021 B2
11087334 McEachern et al. Aug 2021 B1
11093845 Zoldi et al. Aug 2021 B2
11093988 Zoldi et al. Aug 2021 B2
11100506 Zoldi et al. Aug 2021 B2
11108562 Knopf et al. Aug 2021 B2
11151468 Chen et al. Oct 2021 B1
11157650 Kennedy et al. Oct 2021 B1
11256825 Spinelli et al. Feb 2022 B2
11354670 Phelan et al. Jun 2022 B2
11367074 Zoldi et al. Jun 2022 B2
11373190 Zoldi et al. Jun 2022 B2
11380171 Chen et al. Jul 2022 B2
11423414 Zoldi et al. Aug 2022 B2
11431736 Brown et al. Aug 2022 B2
11436606 Coleman et al. Sep 2022 B1
11552901 Hoover et al. Jan 2023 B2
11568348 Dean et al. Jan 2023 B1
11580259 Kennedy et al. Feb 2023 B1
11593476 Van Dyke Feb 2023 B2
11625730 Liu et al. Apr 2023 B2
11658994 Johnston et al. May 2023 B2
11665004 Knopf May 2023 B2
11669894 Nordyke et al. Jun 2023 B2
20010014868 Herz et al. Aug 2001 A1
20010014878 Mitra et al. Aug 2001 A1
20010027413 Bhutta Oct 2001 A1
20010029470 Schultz et al. Oct 2001 A1
20010034631 Kiselik Oct 2001 A1
20010039523 Iwamoto Nov 2001 A1
20020010684 Moskowitz Jan 2002 A1
20020013899 Faul Jan 2002 A1
20020019804 Sutton Feb 2002 A1
20020019938 Aarons Feb 2002 A1
20020032635 Harris et al. Mar 2002 A1
20020040344 Preiser et al. Apr 2002 A1
20020042879 Gould et al. Apr 2002 A1
20020052841 Guthrie et al. May 2002 A1
20020059521 Tasler May 2002 A1
20020062185 Runge et al. May 2002 A1
20020062281 Singhal May 2002 A1
20020073044 Singhal Jun 2002 A1
20020077178 Oberberger et al. Jun 2002 A1
20020077964 Brody et al. Jun 2002 A1
20020080256 Bates et al. Jun 2002 A1
20020087460 Hornung Jul 2002 A1
20020099649 Lee et al. Jul 2002 A1
20020119824 Allen Aug 2002 A1
20020130176 Suzuki Sep 2002 A1
20020138417 Lawrence Sep 2002 A1
20020138751 Dutta Sep 2002 A1
20020147695 Khedkar et al. Oct 2002 A1
20020156676 Ahrens et al. Oct 2002 A1
20020161664 Shaya et al. Oct 2002 A1
20020161711 Sartor et al. Oct 2002 A1
20020173994 Ferguson, III Nov 2002 A1
20020178112 Goeller et al. Nov 2002 A1
20020184509 Scheidt et al. Dec 2002 A1
20020188544 Wizon et al. Dec 2002 A1
20030004879 Demoff et al. Jan 2003 A1
20030009426 Ruiz-Sanchez Jan 2003 A1
20030018549 Fei et al. Jan 2003 A1
20030033261 Knegendorf Feb 2003 A1
20030046554 Leydier et al. Mar 2003 A1
20030048904 Wang et al. Mar 2003 A1
20030050882 Degen et al. Mar 2003 A1
20030057278 Wong Mar 2003 A1
20030061163 Durfield Mar 2003 A1
20030065563 Elliott et al. Apr 2003 A1
20030070101 Buscemi Apr 2003 A1
20030078877 Beirne et al. Apr 2003 A1
20030093366 Halper et al. May 2003 A1
20030097320 Gordon May 2003 A1
20030105696 Kalotay et al. Jun 2003 A1
20030115133 Bian Jun 2003 A1
20030143980 Choi et al. Jul 2003 A1
20030149744 Bierre et al. Aug 2003 A1
20030153299 Perfit et al. Aug 2003 A1
20030158751 Suresh et al. Aug 2003 A1
20030158960 Engberg Aug 2003 A1
20030182214 Taylor Sep 2003 A1
20030195859 Lawrence Oct 2003 A1
20030200447 Sjoblom Oct 2003 A1
20030208428 Raynes et al. Nov 2003 A1
20030222500 Bayeur et al. Dec 2003 A1
20030225656 Aberman et al. Dec 2003 A1
20030225692 Bosch et al. Dec 2003 A1
20030225742 Tenner et al. Dec 2003 A1
20030233278 Marshall Dec 2003 A1
20040004117 Suzuki Jan 2004 A1
20040005912 Hubbe et al. Jan 2004 A1
20040010698 Rolfe Jan 2004 A1
20040024709 Yu et al. Feb 2004 A1
20040026496 Zuili Feb 2004 A1
20040030649 Nelson et al. Feb 2004 A1
20040039586 Garvey et al. Feb 2004 A1
20040054619 Watson et al. Mar 2004 A1
20040059653 Verkuylen et al. Mar 2004 A1
20040064401 Palaghita et al. Apr 2004 A1
20040078324 Lonnberg et al. Apr 2004 A1
20040103147 Flesher et al. May 2004 A1
20040107363 Monteverde Jun 2004 A1
20040110119 Riconda et al. Jun 2004 A1
20040111305 Gavan et al. Jun 2004 A1
20040111335 Black et al. Jun 2004 A1
20040117235 Shacham Jun 2004 A1
20040128227 Whipple et al. Jul 2004 A1
20040128232 Descloux Jul 2004 A1
20040133440 Carolan et al. Jul 2004 A1
20040143526 Monasterio et al. Jul 2004 A1
20040149820 Zuili Aug 2004 A1
20040149827 Zuili Aug 2004 A1
20040153330 Miller et al. Aug 2004 A1
20040153656 Cluts et al. Aug 2004 A1
20040158520 Noh Aug 2004 A1
20040158523 Dort Aug 2004 A1
20040158723 Root Aug 2004 A1
20040167793 Masuoka et al. Aug 2004 A1
20040177046 Ogram Sep 2004 A1
20040193538 Raines Sep 2004 A1
20040199456 Flint et al. Oct 2004 A1
20040199462 Starrs Oct 2004 A1
20040204948 Singletary et al. Oct 2004 A1
20040205008 Haynie et al. Oct 2004 A1
20040225594 Nolan, III et al. Nov 2004 A1
20040230448 Schaich Nov 2004 A1
20040230527 Hansen et al. Nov 2004 A1
20040230538 Clifton et al. Nov 2004 A1
20040234117 Tibor Nov 2004 A1
20040243514 Wankmueller Dec 2004 A1
20040243518 Clifton et al. Dec 2004 A1
20040243567 Levy Dec 2004 A1
20040250085 Tattan et al. Dec 2004 A1
20040255127 Arnouse Dec 2004 A1
20040260922 Goodman et al. Dec 2004 A1
20050001028 Zuili Jan 2005 A1
20050005168 Dick Jan 2005 A1
20050010513 Duckworth et al. Jan 2005 A1
20050010780 Kane et al. Jan 2005 A1
20050021476 Candella et al. Jan 2005 A1
20050021519 Ghouri Jan 2005 A1
20050027983 Klawon Feb 2005 A1
20050038726 Salomon et al. Feb 2005 A1
20050038737 Norris Feb 2005 A1
20050039086 Krishnamurthy et al. Feb 2005 A1
20050050577 Westbrook et al. Mar 2005 A1
20050058262 Timmins et al. Mar 2005 A1
20050065874 Lefner et al. Mar 2005 A1
20050065950 Chaganti et al. Mar 2005 A1
20050071282 Lu et al. Mar 2005 A1
20050075985 Cartmell Apr 2005 A1
20050081052 Washington Apr 2005 A1
20050086161 Gallant Apr 2005 A1
20050091164 Varble Apr 2005 A1
20050097039 Kulcsar et al. May 2005 A1
20050097051 Madill, Jr. et al. May 2005 A1
20050097364 Edeki et al. May 2005 A1
20050102206 Savasoglu et al. May 2005 A1
20050105719 Huda May 2005 A1
20050125226 Magee Jun 2005 A1
20050125686 Brandt Jun 2005 A1
20050138391 Mandalia et al. Jun 2005 A1
20050144143 Freiberg Jun 2005 A1
20050154664 Guy et al. Jul 2005 A1
20050154665 Kerr Jul 2005 A1
20050154671 Doan et al. Jul 2005 A1
20050165667 Cox Jul 2005 A1
20050197953 Broadbent et al. Sep 2005 A1
20050203885 Chenevich et al. Sep 2005 A1
20050216953 Ellingson Sep 2005 A1
20050229007 Bolle et al. Oct 2005 A1
20050240578 Biederman et al. Oct 2005 A1
20050242173 Suzuki Nov 2005 A1
20050251474 Shinn et al. Nov 2005 A1
20050256809 Sadri Nov 2005 A1
20050262014 Fickes Nov 2005 A1
20050273333 Morin et al. Dec 2005 A1
20050273442 Bennett et al. Dec 2005 A1
20050278542 Pierson et al. Dec 2005 A1
20050279827 Mascavage et al. Dec 2005 A1
20050279869 Barklage Dec 2005 A1
20060004663 Singhal Jan 2006 A1
20060014129 Coleman et al. Jan 2006 A1
20060032909 Seegar Feb 2006 A1
20060041464 Powers et al. Feb 2006 A1
20060045105 Dobosz et al. Mar 2006 A1
20060047605 Ahmad Mar 2006 A1
20060059073 Walzak Mar 2006 A1
20060059110 Madhok et al. Mar 2006 A1
20060064374 Helsper et al. Mar 2006 A1
20060074798 Din et al. Apr 2006 A1
20060074986 Mallalieu et al. Apr 2006 A1
20060080230 Freiberg Apr 2006 A1
20060080263 Willis et al. Apr 2006 A1
20060089905 Song et al. Apr 2006 A1
20060101508 Taylor May 2006 A1
20060106605 Saunders et al. May 2006 A1
20060112279 Cohen et al. May 2006 A1
20060112280 Cohen et al. May 2006 A1
20060129428 Wennberg Jun 2006 A1
20060129481 Bhatt et al. Jun 2006 A1
20060129840 Milgramm et al. Jun 2006 A1
20060131390 Kim Jun 2006 A1
20060136332 Ziegler Jun 2006 A1
20060140460 Coutts Jun 2006 A1
20060143073 Engel et al. Jun 2006 A1
20060144924 Stover Jul 2006 A1
20060149580 Helsper et al. Jul 2006 A1
20060149674 Cook et al. Jul 2006 A1
20060161435 Atef et al. Jul 2006 A1
20060161592 Ertoz et al. Jul 2006 A1
20060173776 Shalley et al. Aug 2006 A1
20060173792 Glass Aug 2006 A1
20060177226 Ellis, III Aug 2006 A1
20060178971 Owen et al. Aug 2006 A1
20060179004 Fuchs Aug 2006 A1
20060195351 Bayburtian Aug 2006 A1
20060200855 Willis Sep 2006 A1
20060202012 Grano et al. Sep 2006 A1
20060204051 Holland, IV Sep 2006 A1
20060206725 Milgramm et al. Sep 2006 A1
20060212386 Willey et al. Sep 2006 A1
20060218069 Aberman et al. Sep 2006 A1
20060229961 Lyftogt et al. Oct 2006 A1
20060239512 Petrillo Oct 2006 A1
20060239513 Song et al. Oct 2006 A1
20060242046 Haggerty et al. Oct 2006 A1
20060242047 Haggerty et al. Oct 2006 A1
20060253358 Delgrosso et al. Nov 2006 A1
20060253583 Dixon et al. Nov 2006 A1
20060255914 Westman Nov 2006 A1
20060262929 Vatanen et al. Nov 2006 A1
20060265243 Racho et al. Nov 2006 A1
20060271456 Romain et al. Nov 2006 A1
20060271457 Romain et al. Nov 2006 A1
20060271633 Adler Nov 2006 A1
20060273158 Suzuki Dec 2006 A1
20060277043 Tomes et al. Dec 2006 A1
20060282285 Helsper et al. Dec 2006 A1
20060282372 Endres et al. Dec 2006 A1
20060282395 Leibowitz Dec 2006 A1
20060287765 Kraft Dec 2006 A1
20060288090 Kraft Dec 2006 A1
20060294023 Lu Dec 2006 A1
20070005508 Chiang Jan 2007 A1
20070011100 Libin et al. Jan 2007 A1
20070016500 Chatterji et al. Jan 2007 A1
20070016521 Wang Jan 2007 A1
20070016522 Wang Jan 2007 A1
20070022141 Singleton et al. Jan 2007 A1
20070038483 Wood Feb 2007 A1
20070038568 Greene et al. Feb 2007 A1
20070040017 Kozlay Feb 2007 A1
20070040019 Berghel et al. Feb 2007 A1
20070043577 Kasower Feb 2007 A1
20070047770 Swope et al. Mar 2007 A1
20070048765 Abramson Mar 2007 A1
20070050638 Rasti Mar 2007 A1
20070059442 Sabeta Mar 2007 A1
20070061273 Greene et al. Mar 2007 A1
20070067207 Haggerty et al. Mar 2007 A1
20070067297 Kublickis Mar 2007 A1
20070072190 Aggarwal Mar 2007 A1
20070073622 Kane Mar 2007 A1
20070073630 Greene et al. Mar 2007 A1
20070078786 Bous et al. Apr 2007 A1
20070078908 Rohatgi et al. Apr 2007 A1
20070078985 Shao et al. Apr 2007 A1
20070083460 Bachenheimer Apr 2007 A1
20070087795 Aletto et al. Apr 2007 A1
20070093234 Willis et al. Apr 2007 A1
20070094137 Phillips et al. Apr 2007 A1
20070094264 Nair Apr 2007 A1
20070100774 Abdon May 2007 A1
20070106582 Baker et al. May 2007 A1
20070106611 Larsen May 2007 A1
20070107050 Selvarajan May 2007 A1
20070109103 Jedrey et al. May 2007 A1
20070110282 Millsapp May 2007 A1
20070112667 Rucker May 2007 A1
20070112668 Celano et al. May 2007 A1
20070118393 Rosen et al. May 2007 A1
20070155411 Morrison Jul 2007 A1
20070157299 Hare Jul 2007 A1
20070168246 Haggerty et al. Jul 2007 A1
20070168480 Biggs et al. Jul 2007 A1
20070174208 Black et al. Jul 2007 A1
20070179903 Seinfeld et al. Aug 2007 A1
20070180209 Tallman Aug 2007 A1
20070180263 Delgrosso et al. Aug 2007 A1
20070186276 McRae et al. Aug 2007 A1
20070192248 West Aug 2007 A1
20070192853 Shraim et al. Aug 2007 A1
20070198410 Labgold et al. Aug 2007 A1
20070205266 Carr et al. Sep 2007 A1
20070208669 Rivette et al. Sep 2007 A1
20070214037 Shubert et al. Sep 2007 A1
20070214365 Cornett et al. Sep 2007 A1
20070219928 Madhogarhia Sep 2007 A1
20070220594 Tulsyan Sep 2007 A1
20070226093 Chan et al. Sep 2007 A1
20070226129 Liao et al. Sep 2007 A1
20070233614 McNelley et al. Oct 2007 A1
20070234427 Gardner et al. Oct 2007 A1
20070244782 Chimento Oct 2007 A1
20070244807 Andringa et al. Oct 2007 A1
20070250704 Hallam-Baker Oct 2007 A1
20070250920 Lindsay Oct 2007 A1
20070266439 Kraft Nov 2007 A1
20070282730 Carpenter et al. Dec 2007 A1
20070288355 Roland et al. Dec 2007 A1
20070288360 Seeklus Dec 2007 A1
20070288559 Parsadayan Dec 2007 A1
20070291995 Rivera Dec 2007 A1
20070292006 Johnson Dec 2007 A1
20070294104 Boaz et al. Dec 2007 A1
20070299759 Kelly Dec 2007 A1
20080010203 Grant Jan 2008 A1
20080010683 Baddour et al. Jan 2008 A1
20080010687 Gonen et al. Jan 2008 A1
20080015887 Drabek et al. Jan 2008 A1
20080021804 Deckoff Jan 2008 A1
20080027857 Benson Jan 2008 A1
20080027858 Benson Jan 2008 A1
20080052182 Marshall Feb 2008 A1
20080059236 Cartier Mar 2008 A1
20080059352 Chandran Mar 2008 A1
20080059364 Tidwell et al. Mar 2008 A1
20080059366 Fou Mar 2008 A1
20080063172 Ahuja et al. Mar 2008 A1
20080066188 Kwak Mar 2008 A1
20080071882 Hering et al. Mar 2008 A1
20080076386 Khetawat et al. Mar 2008 A1
20080077526 Arumugam Mar 2008 A1
20080098222 Zilberman Apr 2008 A1
20080103798 Domenikos et al. May 2008 A1
20080103799 Domenikos et al. May 2008 A1
20080103800 Domenikos et al. May 2008 A1
20080103811 Sosa May 2008 A1
20080103972 Lanc May 2008 A1
20080104021 Cai et al. May 2008 A1
20080104672 Lunde et al. May 2008 A1
20080114837 Biggs et al. May 2008 A1
20080120237 Lin May 2008 A1
20080126116 Singhai May 2008 A1
20080126233 Hogan May 2008 A1
20080140576 Lewis et al. Jun 2008 A1
20080147454 Walker et al. Jun 2008 A1
20080154758 Schattmaier et al. Jun 2008 A1
20080162202 Khanna et al. Jul 2008 A1
20080162259 Patil et al. Jul 2008 A1
20080162383 Kraft Jul 2008 A1
20080167883 Thavildar Khazaneh Jul 2008 A1
20080175360 Schwarz et al. Jul 2008 A1
20080177655 Zalik Jul 2008 A1
20080177841 Sinn et al. Jul 2008 A1
20080189789 Lamontagne Aug 2008 A1
20080208548 Metzger et al. Aug 2008 A1
20080208610 Thomas et al. Aug 2008 A1
20080208726 Tsantes et al. Aug 2008 A1
20080217400 Portano Sep 2008 A1
20080228635 Megdal et al. Sep 2008 A1
20080243680 Megdal et al. Oct 2008 A1
20080244717 Jelatis et al. Oct 2008 A1
20080255922 Feldman et al. Oct 2008 A1
20080255992 Lin Oct 2008 A1
20080256613 Grover Oct 2008 A1
20080281737 Fajardo Nov 2008 A1
20080281743 Pettit Nov 2008 A1
20080288382 Smith et al. Nov 2008 A1
20080288430 Friedlander et al. Nov 2008 A1
20080288790 Wilson Nov 2008 A1
20080294540 Celka et al. Nov 2008 A1
20080294689 Metzger et al. Nov 2008 A1
20080296367 Parkinson Dec 2008 A1
20080296382 Connell, II et al. Dec 2008 A1
20080300877 Gilbert et al. Dec 2008 A1
20080319889 Hammad Dec 2008 A1
20090007220 Ormazabal et al. Jan 2009 A1
20090018934 Peng et al. Jan 2009 A1
20090021349 Errico et al. Jan 2009 A1
20090024417 Marks et al. Jan 2009 A1
20090024505 Patel et al. Jan 2009 A1
20090024636 Shiloh Jan 2009 A1
20090024663 McGovern Jan 2009 A1
20090026270 Connell, II et al. Jan 2009 A1
20090043637 Eder Feb 2009 A1
20090044279 Crawford et al. Feb 2009 A1
20090048957 Celano Feb 2009 A1
20090079539 Johnson Mar 2009 A1
20090094311 Awadallah et al. Apr 2009 A1
20090099960 Robida et al. Apr 2009 A1
20090106150 Pelegero et al. Apr 2009 A1
20090106153 Ezra Apr 2009 A1
20090106846 Dupray et al. Apr 2009 A1
20090112650 Iwane Apr 2009 A1
20090119106 Rajakumar et al. May 2009 A1
20090119299 Rhodes May 2009 A1
20090125369 Kloostra et al. May 2009 A1
20090125439 Zarikian et al. May 2009 A1
20090125463 Hido May 2009 A1
20090126013 Atwood et al. May 2009 A1
20090138391 Dudley et al. May 2009 A1
20090141318 Hughes Jun 2009 A1
20090151005 Bell et al. Jun 2009 A1
20090158404 Hahn et al. Jun 2009 A1
20090164380 Brown Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090182653 Zimiles Jul 2009 A1
20090199264 Lang Aug 2009 A1
20090205032 Hinton et al. Aug 2009 A1
20090206993 Di Mambro et al. Aug 2009 A1
20090216560 Siegel Aug 2009 A1
20090216747 Li et al. Aug 2009 A1
20090222308 Zoldi et al. Sep 2009 A1
20090222362 Stood et al. Sep 2009 A1
20090222373 Choudhuri et al. Sep 2009 A1
20090222374 Choudhuri et al. Sep 2009 A1
20090222375 Choudhuri et al. Sep 2009 A1
20090222376 Choudhuri et al. Sep 2009 A1
20090222377 Choudhuri et al. Sep 2009 A1
20090222378 Choudhuri et al. Sep 2009 A1
20090222379 Choudhuri et al. Sep 2009 A1
20090222380 Choudhuri et al. Sep 2009 A1
20090222897 Carow et al. Sep 2009 A1
20090224875 Rabinowitz et al. Sep 2009 A1
20090224889 Aggarwal et al. Sep 2009 A1
20090226056 Vlachos et al. Sep 2009 A1
20090234738 Britton et al. Sep 2009 A1
20090240609 Cho et al. Sep 2009 A1
20090241168 Readshaw Sep 2009 A1
20090241173 Troyansky Sep 2009 A1
20090248198 Siegel et al. Oct 2009 A1
20090248497 Hueter Oct 2009 A1
20090248567 Haggerty et al. Oct 2009 A1
20090248568 Haggerty et al. Oct 2009 A1
20090248569 Haggerty et al. Oct 2009 A1
20090248570 Haggerty et al. Oct 2009 A1
20090248571 Haggerty et al. Oct 2009 A1
20090248572 Haggerty et al. Oct 2009 A1
20090248573 Haggerty et al. Oct 2009 A1
20090254476 Sharma et al. Oct 2009 A1
20090254484 Forero et al. Oct 2009 A1
20090257595 de Cesare et al. Oct 2009 A1
20090259470 Chang Oct 2009 A1
20090259560 Bachenheimer Oct 2009 A1
20090259588 Lindsay Oct 2009 A1
20090259855 de Cesare et al. Oct 2009 A1
20090261189 Ellis, Jr. Oct 2009 A1
20090270126 Liu Oct 2009 A1
20090271265 Lay et al. Oct 2009 A1
20090271617 Song et al. Oct 2009 A1
20090272801 Connell, II et al. Nov 2009 A1
20090276244 Baldwin, Jr. et al. Nov 2009 A1
20090281945 Shakkarwar Nov 2009 A1
20090281951 Shakkarwar Nov 2009 A1
20090289110 Regen et al. Nov 2009 A1
20090300066 Guo et al. Dec 2009 A1
20090307778 Mardikar Dec 2009 A1
20090326972 Washington Dec 2009 A1
20090328173 Jakobson et al. Dec 2009 A1
20100024037 Grzymala-Busse et al. Jan 2010 A1
20100030677 Melik-Aslanian et al. Feb 2010 A1
20100031030 Kao et al. Feb 2010 A1
20100037147 Champion et al. Feb 2010 A1
20100037308 Lin et al. Feb 2010 A1
20100042526 Martinov Feb 2010 A1
20100043055 Baumgart Feb 2010 A1
20100070620 Awadallah et al. Mar 2010 A1
20100077006 El Emam et al. Mar 2010 A1
20100085146 Johnson Apr 2010 A1
20100088233 Tattan et al. Apr 2010 A1
20100088338 Pavoni, Jr. et al. Apr 2010 A1
20100094664 Bush et al. Apr 2010 A1
20100094767 Miltonberger Apr 2010 A1
20100094768 Miltonberger Apr 2010 A1
20100094910 Bayliss Apr 2010 A1
20100095357 Willis et al. Apr 2010 A1
20100100406 Lim Apr 2010 A1
20100100945 Ozzie et al. Apr 2010 A1
20100107225 Spencer et al. Apr 2010 A1
20100114724 Ghosh et al. May 2010 A1
20100114744 Gonen May 2010 A1
20100121767 Coulter et al. May 2010 A1
20100130172 Vendrow et al. May 2010 A1
20100131273 Aley-Raz et al. May 2010 A1
20100132043 Bjorn et al. May 2010 A1
20100145836 Baker et al. Jun 2010 A1
20100158207 Dhawan et al. Jun 2010 A1
20100169210 Bous et al. Jul 2010 A1
20100169947 Sarmah et al. Jul 2010 A1
20100188684 Kumara Jul 2010 A1
20100205662 Ibrahim et al. Aug 2010 A1
20100217837 Ansari et al. Aug 2010 A1
20100218255 Ritman et al. Aug 2010 A1
20100228649 Pettitt Sep 2010 A1
20100228657 Kagarlis Sep 2010 A1
20100229225 Sarmah et al. Sep 2010 A1
20100229230 Edeki et al. Sep 2010 A1
20100229245 Singhal Sep 2010 A1
20100241501 Marshall Sep 2010 A1
20100250364 Song et al. Sep 2010 A1
20100250411 Ogrodski Sep 2010 A1
20100250509 Andersen Sep 2010 A1
20100250955 Trevithick et al. Sep 2010 A1
20100268557 Faith et al. Oct 2010 A1
20100274679 Hammad Oct 2010 A1
20100275265 Fiske et al. Oct 2010 A1
20100280882 Faith et al. Nov 2010 A1
20100293090 Domenikos et al. Nov 2010 A1
20100293114 Khan et al. Nov 2010 A1
20100302157 Zilberman Dec 2010 A1
20100306101 Lefner et al. Dec 2010 A1
20100313273 Freas Dec 2010 A1
20100325035 Hilgers et al. Dec 2010 A1
20100325442 Petrone et al. Dec 2010 A1
20100332292 Anderson Dec 2010 A1
20100332362 Ramsey et al. Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110016042 Cho et al. Jan 2011 A1
20110040983 Grzymala-Busse et al. Feb 2011 A1
20110047071 Choudhuri et al. Feb 2011 A1
20110066547 Clark et al. Mar 2011 A1
20110082768 Eisen Apr 2011 A1
20110093383 Haggerty et al. Apr 2011 A1
20110112958 Haggerty et al. May 2011 A1
20110119291 Rice May 2011 A1
20110126024 Beatson et al. May 2011 A1
20110126275 Anderson et al. May 2011 A1
20110131123 Griffin et al. Jun 2011 A1
20110145899 Cao et al. Jun 2011 A1
20110166988 Coulter Jul 2011 A1
20110184838 Winters et al. Jul 2011 A1
20110184851 Megdal et al. Jul 2011 A1
20110196791 Dominguez Aug 2011 A1
20110238566 Santos Sep 2011 A1
20110260832 Ross et al. Oct 2011 A1
20110276496 Neville et al. Nov 2011 A1
20110282778 Wright et al. Nov 2011 A1
20110289032 Crooks et al. Nov 2011 A1
20110289322 Rasti Nov 2011 A1
20110295721 MacDonald Dec 2011 A1
20110295750 Rammal Dec 2011 A1
20110296529 Bhanoo et al. Dec 2011 A1
20110302412 Deng et al. Dec 2011 A1
20110302641 Hald et al. Dec 2011 A1
20120030079 Slater et al. Feb 2012 A1
20120030080 Slater et al. Feb 2012 A1
20120030083 Newman et al. Feb 2012 A1
20120030771 Pierson et al. Feb 2012 A1
20120036352 Tovar et al. Feb 2012 A1
20120066073 Dilip et al. Mar 2012 A1
20120101939 Kasower Apr 2012 A1
20120158574 Brunzell et al. Jun 2012 A1
20120158654 Behren et al. Jun 2012 A1
20120198556 Patel et al. Aug 2012 A1
20120215682 Lent et al. Aug 2012 A1
20120278227 Kolo et al. Nov 2012 A1
20120278249 Duggal et al. Nov 2012 A1
20120290660 Rao et al. Nov 2012 A1
20130004033 Trugenberger et al. Jan 2013 A1
20130132060 Badhe et al. May 2013 A1
20130185293 Boback Jul 2013 A1
20130218797 Prichard et al. Aug 2013 A1
20140007238 Magee et al. Jan 2014 A1
20140012716 Bucholz Jan 2014 A1
20140058910 Abeles Feb 2014 A1
20140149304 Bucholz et al. May 2014 A1
20140214636 Rajsky Jul 2014 A1
20140283097 Allen et al. Sep 2014 A1
20140304822 Sher-Jan et al. Oct 2014 A1
20150106260 Andrews et al. Apr 2015 A1
20150142595 Acuña-Rohter May 2015 A1
20150161529 Kondaji et al. Jun 2015 A1
20150186901 Miltonberger Jul 2015 A1
20150199784 Straub et al. Jul 2015 A1
20150205692 Seto Jul 2015 A1
20150295924 Gottschalk, Jr. Oct 2015 A1
20150348036 Nordyke et al. Dec 2015 A1
20150348208 Nordyke et al. Dec 2015 A1
20160012561 Lappenbusch et al. Jan 2016 A1
20160063278 Kraska et al. Mar 2016 A1
20160063645 Houseworth et al. Mar 2016 A1
20160071208 Straub et al. Mar 2016 A1
20160086262 Straub et al. Mar 2016 A1
20160142532 Bostick May 2016 A1
20160210450 Su Jul 2016 A1
20160328814 Prichard et al. Nov 2016 A1
20160344758 Cohen et al. Nov 2016 A1
20160379011 Koike et al. Dec 2016 A1
20170053369 Gottschalk, Jr. et al. Feb 2017 A1
20170099314 Klatt et al. Apr 2017 A1
20170177683 Koike et al. Jun 2017 A1
20170206376 Sher-Jan Jul 2017 A1
20170270629 Fitzgerald Sep 2017 A1
20170278182 Kasower Sep 2017 A1
20170287065 Samler et al. Oct 2017 A1
20170357971 Pitz et al. Dec 2017 A1
20170374076 Pierson et al. Dec 2017 A1
20180004978 Hebert et al. Jan 2018 A1
20180013786 Knopf Jan 2018 A1
20180033009 Goldman et al. Feb 2018 A1
20180130157 Gottschalk, Jr. et al. May 2018 A1
20180184288 De Lorenzo et al. Jun 2018 A1
20180322572 Straub et al. Nov 2018 A1
20190073676 Wang Mar 2019 A1
20190164173 Liu et al. May 2019 A1
20190228178 Sharma et al. Jul 2019 A1
20190266609 Phelan et al. Aug 2019 A1
20190294786 Villavicencio et al. Sep 2019 A1
20190311366 Zoldi et al. Oct 2019 A1
20190333101 Sohum et al. Oct 2019 A1
20190349351 Verma et al. Nov 2019 A1
20190377896 Spinelli et al. Dec 2019 A1
20200134629 Zoldi et al. Apr 2020 A1
20200143465 Chilaka et al. May 2020 A1
20200145436 Brown et al. May 2020 A1
20200151628 Zoldi et al. May 2020 A1
20200193018 Van Dyke Jun 2020 A1
20200242615 Chandra et al. Jul 2020 A1
20200273097 Nordyke et al. Aug 2020 A1
20200293684 Harris et al. Sep 2020 A1
20200380112 Allen Dec 2020 A1
20200396246 Zoldi et al. Dec 2020 A1
20210021631 Okutan et al. Jan 2021 A1
20210150532 Zhang et al. May 2021 A1
20210209230 Leitner et al. Jul 2021 A1
20210326785 McBurnett et al. Oct 2021 A1
20210372314 Weigl et al. Dec 2021 A1
20220038481 Jones Feb 2022 A1
20220046088 Knopf Feb 2022 A1
20220084032 Koehler et al. Mar 2022 A1
20220103589 Shen et al. Mar 2022 A1
20220123946 Knopf Apr 2022 A1
20220147817 Boardman et al. May 2022 A1
20220207324 Hamilton et al. Jun 2022 A1
20220231859 Knopf et al. Jul 2022 A1
20220277308 Ardizzi et al. Sep 2022 A1
20220321394 Huang et al. Oct 2022 A1
20220327541 Seguritan Oct 2022 A1
20220358516 Zoldi et al. Nov 2022 A1
20220368704 Brown et al. Nov 2022 A1
20220377096 Johnston et al. Nov 2022 A1
20220391793 Latimer et al. Dec 2022 A1
20220400087 Hoover et al. Dec 2022 A1
20220417275 Jones Dec 2022 A1
20230035336 Knopf Feb 2023 A1
20230046601 Hamilton et al. Feb 2023 A1
20230082708 Hoover et al. Mar 2023 A1
20230113118 Guo et al. Apr 2023 A1
20230196147 McBurnett et al. Jun 2023 A1
20230196455 Huber et al. Jun 2023 A1
20230205893 Gjorvad et al. Jun 2023 A1
20230216866 Monnig et al. Jul 2023 A1
20230229767 Galli et al. Jul 2023 A1
20230245246 Stack et al. Aug 2023 A1
Foreign Referenced Citations (58)
Number Date Country
2022291564 Jul 2023 AU
3 058 653 Apr 2020 CA
104877993 Sep 2015 CN
113011973 Jun 2021 CN
113011973 Jun 2021 CN
91 08 341 Oct 1991 DE
0 554 083 Aug 1993 EP
2 939 361 Oct 2019 EP
2 392 748 Mar 2004 GB
2 518 099 Mar 2015 GB
2011-134252 Jul 2011 JP
5191376 May 2013 JP
10-2004-0034063 Apr 2004 KR
I256569 Jun 2006 TW
WO 94006103 Mar 1994 WO
WO 96041488 Dec 1996 WO
WO 00055778 Sep 2000 WO
WO 00055789 Sep 2000 WO
WO 00055790 Sep 2000 WO
WO 01011522 Feb 2001 WO
WO 02027610 Apr 2002 WO
WO 02097563 Dec 2002 WO
WO 03071388 Aug 2003 WO
WO 02037219 May 2004 WO
WO 2004046882 Jun 2004 WO
WO 2006069199 Jun 2006 WO
WO 2007001394 Jan 2007 WO
WO 2007106393 Sep 2007 WO
WO 2008054403 May 2008 WO
WO 2008054849 May 2008 WO
WO 2008147918 Dec 2008 WO
WO 2009062111 May 2009 WO
WO 2009117518 Sep 2009 WO
WO 2011044036 Apr 2011 WO
WO 2012054646 Apr 2012 WO
WO 2012112781 Aug 2012 WO
WO 2013026343 Feb 2013 WO
WO 2013126281 Aug 2013 WO
WO 2014008079 Jan 2014 WO
WO 2014008247 Jan 2014 WO
WO 2014150987 Sep 2014 WO
WO 2015184006 Dec 2015 WO
WO 2018175440 Sep 2018 WO
WO 2018208770 Nov 2018 WO
WO 2019006272 Jan 2019 WO
WO 2019040443 Feb 2019 WO
WO 2019050864 Mar 2019 WO
WO 2019079071 Apr 2019 WO
WO 2019125445 Jun 2019 WO
WO 2019169000 Sep 2019 WO
WO 2022020162 Jan 2022 WO
WO 2022026273 Feb 2022 WO
WO 2022031412 Feb 2022 WO
WO 2022032285 Feb 2022 WO
WO 2022072989 Apr 2022 WO
WO 2022221202 Oct 2022 WO
WO 2023060150 Apr 2023 WO
WO 2023129977 Jul 2023 WO
Non-Patent Literature Citations (127)
Lennox, C., Lisowsky, P., Pittman, J., "Tax Aggressiveness and Accounting Fraud", Journal of Accounting . . . , 2013, Wiley Online Library (Year: 2013).
U.S. Appl. No. 14/928,770, U.S. Pat. No. 10,339,527, System and Architecture for Electronic Fraud Detection, filed Oct. 30, 2015.
U.S. Appl. No. 16/443,662, U.S. Pat. No. 10,990,979, System and Architecture for Electronic Fraud Detection, filed Jun. 17, 2019.
U.S. Appl. No. 17/208,327, U.S. Pat. No. 11,436,606, System and Architecture for Electronic Fraud Detection, filed Mar. 22, 2021.
U.S. Appl. No. 09/557,252, filed Apr. 24, 2000, Page.
U.S. Appl. No. 12/705,489, filed Feb. 12, 2010, Bargoli et al.
U.S. Appl. No. 12/705,511, filed Feb. 12, 2010, Bargoli et al.
“A New Approach to Fraud Solutions”, BasePoint Science Solving Fraud, pp. 8, 2006.
Aad et al., “NRC Data Collection and the Privacy by Design Principles”, IEEE, Nov. 2010, pp. 5.
Allard et al., “Safe Realization of the Generalization Privacy Mechanism”, 2011 Ninth Annual International Conference on Privacy, Security and Trust, pp. 8.
“Arizona Company Has Found Key in Stopping ID Theft,” PR Newswire, New York, Aug. 10, 2005 http://proquest.umi.com/pqdweb?did=880104711&SID=1&Fmt=3&clientId=19649&RQT=309&Vname=PQD.
ABC News Now:Money Matters, as broadcasted Nov. 15, 2005 with guest Todd Davis (CEO of Lifelock), pp. 6.
AlSalamah et al., “Security Risk Management in Online System”, 2017 5th International Conference on Applied Computing and Information Technology/4th International Conference on Computational Science/Intelligence and Applied Informatics/2nd International Conference on Big Data, Cloud Computing, Data Science & Engineering, 2017, pp. 119-124.
Anonymous, “Feedback”, Credit Management, ABI/Inform Global, Sep. 2006, pp. 6.
“Beverly Hills Man Convicted of Operating ‘Bust-Out’ Schemes that Caused More than $8 Million in Losses”, Department of Justice, Jul. 25, 2006, 2 Pgs.
Bielski, Lauren, “Will you Spend to Thwart ID Theft?” ABA Banking Journal, Apr. 2005, pp. 54, 56-57, 60.
BlueCava, “What We Do”, http://www.bluecava.com/what-we-do/, printed Nov. 5, 2012 in 3 pages.
“Bust-Out Schemes”, Visual Analytics Inc. Technical Product Support, Newsletter vol. 4, Issue 1, Jan. 2005, pp. 7.
Chores & Allowances, “Do Kids Have Credit Reports?” Oct. 15, 2007, http://choresandallowances.blogspot.com/2007/10/do-kids-have-credit-reports.html, pp. 5.
Cowie, Norman, “Warning Bells & ‘The Bust-Out’”, Business Credit, Jul. 1, 2000, pp. 5.
Cullen, Terri; "The Wall Street Journal Complete Identity Theft Guidebook: How to Protect Yourself from the Most Pervasive Crime in America"; Chapter 3, pp. 59-79; Jul. 10, 2007.
“Data Loss Prevention (DLP) Software”, http://www.symantec.com/data-loss-prevention/ printed Apr. 8, 2013 in 8 pages.
“Data Protection”, http://compliantprocessing.com/data-protection/ printed Apr. 8, 2013 in 4 pages.
Day, Jo and Kevin; "ID-ology: A Planner's Guide to Identity Theft"; Journal of Financial Planning: Tech Talk; pp. 36-38; Sep. 2004.
“Dealing with Measurement Noise (A Gentle Introduction to Noise Filtering)”, Chemical and Process Engineering, University of Newcastle Upon Tyne, https://web.archive.org/web/20000418021742/http://lorien.ncl.ac.uk/ming/filter/filewma.htm, Archived Apr. 18, 2000, pp. 3.
EFunds Corporation, “Data & Decisioning: Debit Report” printed Apr. 1, 2007, http://www.efunds.com/web/industry-solutions/financial-services/frm-debit-report/htm in 1 page.
El Kalam et al., “Personal Data Anonymization for Security and Privacy in Collaborative Environments”, 2005 IEEE, pp. 56-61.
Equifax; “Equifax Credit Watch”; https://web.archive.org/web/20070627135447/https://www.econsumer.equifax.co.uk/consumer/uk/sitepage.ehtml?forward=gb_esn_detail, dated Jun. 27, 2007 on www.archive.org in 2 pages.
Experian Team, “Impact on Credit Scores of Inquiries for an Auto Loan,” Ask Experian, Mar. 1, 2009, pp. 5.
“Fair Isaac Introduces Falcon One System to Combat Fraud at Every Customer Interaction”, Business Wire, May 5, 2005, pp. 3.
“Fair Isaac Offers New Fraud Tool”, National Mortgage News & Source Media, Inc., Jun. 13, 2005, pp. 2.
FamilySecure.com, “Frequently Asked Questions”, http://www.familysecure.com/FAQ.aspx as archived Jul. 15, 2007 in 3 pages.
FamilySecure.com; “Identity Theft Protection for the Whole Family | FamilySecure.com” http://www.familysecure.com/, as retrieved on Nov. 5, 2009.
“Fighting the New Face of Fraud”, FinanceTech, http://www.financetech.com/showArticle.jhtml?articleID=167100405, Aug. 2, 2005.
“FinExtra, Basepoint Analytics Introduces Predictive Technology for Mortgage Fraud”, Oct. 5, 2005, pp. 3.
Fisher, Joseph, “Access to Fair Credit Reports: Current Practices and Proposed Legislation,” American Business Law Journal, Fall 1981, vol. 19, No. 3, p. 319.
“Fraud Alert | Learn How”. Fight Identity Theft. http://www.fightidentitytheft.com/flag.html, accessed on Nov. 5, 2009.
Gaudio, David, “Intelligent Adaptive Authentication: How 6 Workflow Steps Improve Customer Experience”, OneSpan, https://www.onespan.com/blog/intelligent-adaptive-authentication-how-6-workflow-steps-improve-customer-experience, Jun. 22, 2020, pp. 6.
Gibbs, Adrienne; “Protecting Your Children from Identity Theft,” Nov. 25, 2008, http://www.creditcards.com/credit-card-news/identity-ID-theft-and-kids-children-1282.php, pp. 4.
“GLBA Compliance and FFIEC Compliance” http://www.trustwave.com/financial-services.php printed Apr. 8, 2013 in 1 page.
Gordon et al., “Identity Fraud: A Critical National and Global Threat,” LexisNexis, Oct. 28, 2003, pp. 1-48.
Haglund, Christoffer, “Two-Factor Authentication with a Mobile Phone”, Fox Technologies, Uppsala, Department of Information Technology, Nov. 2, 2007, pp. 62.
Herzberg, Amir, “Payments and Banking with Mobile Personal Devices,” Communications of the ACM, May 2003, vol. 46, No. 5, pp. 53-58.
“ID Analytics ID Network”, from www.idanalytics.com, as retrieved from www.archive.org, dated Nov. 20, 2005 or earlier; attached as “ID Network (IDN)”, pp. 8.
ID Cops, www.idcops.com; retrieved from www.archive.org, dated Feb. 16, 2007.
ID Theft Assist, “Do You Know Where Your Child's Credit Is?”, Nov. 26, 2007, http://www.idtheftassist.com/pages/story14, pp. 3.
“ID Thieves These Days Want Your Number, Not Your Name”, The Columbus Dispatch, Columbus, Ohio, http://www.dispatch.com/content/stories/business/2014/08/03/id-thieves-these-days-want-your-numbre-not-your-name.html, Aug. 3, 2014 in 2 pages.
Identity Theft Resource Center; Fact Sheet 120 A—To Order a Credit Report for a Child; Fact Sheets, Victim Resources; Apr. 30, 2007.
“Identity Thieves Beware: Lifelock Introduces Nation's First Guaranteed Proactive Solution to Identity Theft Protection,” PR Newswire, New York, Jun. 13, 2005 http://proquest.umi.com/pqdweb?did=852869731&sid=1&Fmt=3&clientId=19649&RQT=309&Vname=PQD.
“Industry News, New Technology Identifies Mortgage Fraud: Basepoint Analytics Launches FraudMark”, Inman News, American Land Title Association, Oct. 5, 2005, pp. 1.
Information Brokers of America, “Information Brokers of America Child Identity Theft Protection” http://web.archive.org/web/20080706135451/http://iboainfo.com/child-order.html as archived Jul. 6, 2008 in 1 page.
Information Brokers of America, “Safeguard Your Child's Credit”, http://web.archive.org/web/20071215210406/http://www.iboainfo.com/child-id-protect.html as archived Dec. 15, 2007 in 1 page.
“Intersections, Inc. Identity Guard”, from www.intersections.com and www.identityguard.com, as retrieved from Internet Archive, dated Nov. 25, 2005 or earlier; attached as “Identity Guard (IDG)”, pp. 7.
Iovation, Device Identification & Device Fingerprinting, http://www.iovation.com/risk-management/device-identification printed Nov. 5, 2012 in 6 pages.
Jacob et al., A Case Study of Checking Account Inquiries and Closures in Chicago, The Center for Financial Services Innovation, Nov. 2006.
Jaeger, Herbert, “A Tutorial on Training Recurrent Neural Networks, Covering BPPT, RTRL, EKF and the ‘Echo State Network’ Approach”, Fraunhofer Institute for Autonomous Intelligent Systems (AIS), International University Bremen, Oct. 2002, pp. 46.
Jin et al., “Network Security Risks in Online Banking”, 2005 International Conference on Wireless Communications, Networking and Mobile Computing, Jan. 2005, vol. 2, pp. 1229-1234.
Karlan et al., "Observing Unobservables: Identifying Information Asymmetries with a Consumer Credit Field Experiment", Jun. 17, 2006, pp. 58, http://aida.econ.yale.edu/karlan/papers/ObservingUnobservables.KarlanZinman.pdf.
Khan, Muhammad Khurram, PhD., “An Efficient and Secure Remote Mutual Authentication Scheme with Smart Cards” IEEE International Symposium on Biometrics & Security Technologies (ISBAST), Apr. 23-24, 2008, pp. 1-6.
Lamons, Bob, “Be Smart: Offer Inquiry Qualification Services,” Marketing News, ABI/Inform Global, Nov. 6, 1995, vol. 29, No. 23, pp. 13.
Lee, Timothy B., “How America's Broken Tax System Makes Identity Theft Easy”, http://www.vox.com/2014/4/14/5608072/how-americas-broken-tax-system-makes-identity-theft-easy, Apr. 14, 2014, pp. 10.
Lee, W.A.; “Experian, on Deal Hunt, Nets Identity Theft Insurer”, American Banker: The Financial Services Daily, Jun. 4, 2003, New York, NY, 1 page.
Lefebvre et al., “A Robust Soft Hash Algorithm for Digital Image Signature”, International Conference on Image Processing 2:11 (ICIP), vol. 3, Oct. 2003, pp. 495-498.
Lennox et al., “Tax Aggressiveness and Accounting Fraud”, Journal of Accounting Research, 2013, pp. 40.
LifeLock, “How LifeLock Works,” http://www.lifelock.com/lifelock-for-people printed Mar. 14, 2008 in 1 page.
LifeLock, “LifeLock Launches First ID Theft Prevention Program for the Protection of Children,” Press Release, Oct. 14, 2005, http://www.lifelock.com/about-us/press-room/2005-press-releases/lifelock-protection-for-children.
LifeLock; “How Can LifeLock Protect My Kids and Family?” http://www.lifelock.com/lifelock-for-people/how-we-do-it/how-can-lifelock-protect-my-kids-and-family printed Mar. 14, 2008 in 1 page.
LifeLock, “Personal Identity Theft Protection & Identity Theft Products,” http://www.lifelock.com/lifelock-for-people, accessed Nov. 5, 2007.
LifeLock, Various Pages, www.lifelock.com/, Jan. 9, 2007, pp. 49.
My Call Credit http://www.mycallcredit.com/products.asp?product=ALR dated Dec. 10, 2005 on www.archive.org.
My Call Credit http://www.mycallcredit.com/rewrite.asp?display=faq dated Dec. 10, 2005 on www.archive.org.
MyReceipts, http://www.myreceipts.com/, printed Oct. 16, 2012 in 1 page.
MyReceipts—How it Works, http://www.myreceipts.com/howItWorks.do, printed Oct. 16, 2012 in 1 page.
National Alert Registry Launches RegisteredOffendersList.org to Provide Information on Registered Sex Offenders, May 16, 2005, pp. 2, http://www.prweb.com/printer/240437.htm accessed on Oct. 18, 2011.
National Alert Registry Offers Free Child Safety “Safe From Harm” DVD and Child Identification Kit, Oct. 24, 2006. pp. 2, http://www.prleap.com/pr/53170 accessed on Oct. 18, 2011.
National Alert Registry website titled, “Does a sexual offender live in your neighborhood”, Oct. 22, 2006, pp. 2, http://web.archive.org/wb/20061022204835/http://www.nationallertregistry.com/ accessed on Oct. 13, 2011.
Ogg, Erica, “Apple Cracks Down on UDID Use”, http://gigaom.com/apple/apple-cracks-down-on-udid-use/ printed Nov. 5, 2012 in 5 Pages.
Organizing Maniac's Blog—Online Receipts Provided by MyQuickReceipts.com, http://organizingmaniacs.wordpress.com/2011/01/12/online-receipts-provided-by-myquickreceipts.com/ dated Jan. 12, 2011 printed Oct. 16, 2012 in 3 pages.
Pagano, et al., “Information Sharing in Credit Markets,” Dec. 1993, The Journal of Finance, vol. 48, No. 5, pp. 1693-1718.
Partnoy, Frank, Rethinking Regulation of Credit Rating Agencies: An Institutional Investor Perspective, Council of Institutional Investors, Apr. 2009, pp. 21.
Planet Receipt—Home, http://www.planetreceipt.com/home printed Oct. 16, 2012 in 1 page.
Planet Receipt—Solutions & Features, http://www.planetreceipt.com/solutions-features printed Oct. 16, 2012 in 2 pages.
Press Release—“Helping Families Protect Against Identity Theft—Experian Announces FamilySecure.com; Parents and guardians are alerted for signs of potential identity theft for them and their children; product features an industry-leading $2 million guarantee”; PR Newswire; Irvine, CA; Oct. 1, 2007.
Privacy Rights Clearinghouse, “Identity Theft: What to do if it Happens to You,” http://web.archive.org/web/19990218180542/http://privacyrights.org/fs/fs17a.htm printed Feb. 18, 1999.
Quinn, Tom, “Low Credit Inquiries Affect Your Credit Score”, Credit.com, May 2, 2011, pp. 2.
“Recurrent Neural Network”, as downloaded from wikipedia.org <https://en.wikipedia.org/wiki/Recurrent_neural_network?oldid=717224329>, Apr. 2016, pp. 8.
Ribeiro et al., "Privacy Protection with Pseudonymization and Anonymization in a Health IoT System", Results from OCARIoT, 2019 IEEE, pp. 904-908.
Rivera, Barbara, “New Tools for Combating Income Tax Refund Fraud”, https://gcn.com/Articles/2014/05/08/Insight-tax-fraud-tools.aspx?Page=1, May 8, 2014, pp. 3.
Scholastic Inc.: Parent's Request for Information http://web.archive.org/web/20070210091055/http://www.scholastic.com/inforequest/index.htm as archived Feb. 10, 2007 in 1 page.
Scholastic Inc.: Privacy Policy http://web.archive.org/web/20070127214753/http://www.scholastic.com/privacy.htm as archived Jan. 27, 2007 in 3 pages.
ShoeBoxed, https://www.shoeboxed.com/sbx-home/ printed Oct. 16, 2012 in 4 pages.
Singletary, Michelle, “The Littlest Victims of ID Theft”, The Washington Post, The Color of Money, Oct. 4, 2007.
Sumner, Anthony, “Tackling the Issue of Bust-Out Fraud”, Retail Banker International, Jul. 24, 2007, pp. 4.
Sumner, Anthony, “Tackling the Issue of Bust-Out Fraud”, Experian: Decision Analytics, Dec. 18, 2007, pp. 24.
Sumner, Anthony, “Tackling the Issue of Bust-Out Fraud”, e-News, Experian: Decision Analytics, pp. 4, [Originally Published in Retail Banker International Magazine Jul. 24, 2007].
“The Return Review: Program Increases Fraud Detection; However, Full Retirement of the Electronic Fraud Detection System Will be Delayed”, Treasury Inspector General for Tax Administration, Sep. 25, 2017, Reference No. 2017-20-080, pp. 27.
TheMorningCall.Com, “Cheap Ways to Foil Identity Theft,” www.mcall.com/business/columnists/all-karp.5920748jul01,0 . . . , published Jul. 1, 2007.
Torgler, Benno, "What Do We Know about Tax Fraud?: An Overview of Recent Developments", Social Research: An International Quarterly, vol. 74, No. 4, Winter 2008, pp. 1239-1270.
“TransUnion—Child Identity Theft Inquiry”, TransUnion, http://www.transunion.com/corporate/personal/fraudIdentityTheft/fraudPrevention/childIDInquiry.page as printed Nov. 5, 2009 in 4 pages.
Truston, “Checking if your Child is an ID Theft Victim can be Stressful,” as posted by Michelle Pastor on Jan. 22, 2007 at http://www.mytruston.com/blog/credit/checking_if_your_child_is_an_id_theft_vi.html.
Vamosi, Robert, “How to Handle ID Fraud's Youngest Victims,” Nov. 21, 2008, http://news.cnet.com/8301-10789_3-10105303-57.html.
Webpage printed out from http://www.jpmorgan.com/cm/ContentServer?c=TS_Content&pagename=jpmorgan%2Fts%2FTS_Content%2FGeneral&cid=1139403950394 on Mar. 20, 2008, Feb. 13, 2006, New York, NY.
Wilson, Andrea, “Escaping the Alcatraz of Collections and Charge-Offs”, http://www.transactionworld.net/articles/2003/october/riskMgmt1.asp, Oct. 2003.
International Search Report and Written Opinion for Application No. PCT/US2007/06070, dated Nov. 10, 2008.
International Search Report and Written Opinion for Application No. PCT/US2008/064594, dated Oct. 30, 2008.
International Search Report and Written Opinion for Application No. PCT/US09/37565, dated May 12, 2009.
Official Communication in Australian Patent Application No. 2012217565, dated May 12, 2017.
Official Communication in Australian Patent Application No. 2017203586, dated Jun. 18, 2019.
Official Communication in Australian Patent Application No. 2019279982, dated Dec. 19, 2019.
Official Communication in Canadian Patent Application No. 2,827,478, dated Jun. 29, 2017.
Official Communication in Canadian Patent Application No. 2,827,478, dated May 31, 2018.
Official Communication in Canadian Patent Application No. 2,827,478, dated Mar. 27, 2019.
Extended European Search Report for Application No. EP12747205, dated Sep. 25, 2014.
Supplementary European Search Report for Application No. EP12747205, dated Jun. 19, 2015.
Extended European Search Report for Application No. EP18748000, dated Dec. 13, 2018.
International Search Report and Written Opinion for Application No. PCT/US2012/025456, dated May 21, 2012.
International Preliminary Report on Patentability in Application No. PCT/US2012/025456, dated Aug. 21, 2013.
International Search Report and Written Opinion for Application No. PCT/US2011/033940, dated Aug. 22, 2011.
Aïmeur et al., “The Scourge of Internet Personal Data Collection”, 2013 International Conference on Availability, Reliability and Security, pp. 821-828.
Dimopoulou et al., “Mobile Anonymization and Pseudonymization of Structured Health Data for Research,” 2022 Seventh International Conference on Mobile and Secure Services (MobiSecServ), 2022, pp. 1-6.
El Haddad et al., “Exploring User Behavior and Cybersecurity Knowledge—An Experimental Study in Online Shopping”, 2018 16th Annual Conference on Privacy, Security and Trust (PST), pp. 10.
Hu et al. “Robust Support Vector Machines for Anomaly Detection in Computer Security”, 2003, Proceedings of the 2003 International Conference on Machine Learning and Applications, pp. 7.
International Search Report and Written Opinion for Application No. PCT/US2022/024277, dated Jul. 18, 2022.
Kundu et al., “BLAST-SSAHA Hybridization for Credit Card Fraud Detection”, Oct.-Dec. 2009, IEEE Transactions on Dependable and Secure Computing, vol. 6, No. 4, pp. 309-315.
Pervushin et al., “Determination of loss of information during data anonymization procedure,” 2016 IEEE 10th International Conference on Application of Information and Communication Technologies (AICT), 2016, pp. 1-5.
Sun et al., “Enhancing Security Using Mobility-Based Anomaly Detection in Cellular Mobile Networks”, Jul. 2006, IEEE Transactions on Vehicular Technology, vol. 55, No. 4, pp. 1385-1396.
Trivedi et al., “Parallel Data Stream Anonymization Methods: A Review,” 2022 Second International Conference on Artificial Intelligence and Smart Energy (ICAIS), 2022, pp. 887-891.
Provisional Applications (1)
Number Date Country
62073714 Oct 2014 US
Continuations (3)
Number Date Country
Parent 17208327 Mar 2021 US
Child 17814789 US
Parent 16443662 Jun 2019 US
Child 17208327 US
Parent 14928770 Oct 2015 US
Child 16443662 US