System and methods for identifying compromised personally identifiable information on the internet

Information

  • Patent Grant
  • Patent Number
    9,710,868
  • Date Filed
    Wednesday, November 2, 2016
  • Date Issued
    Tuesday, July 18, 2017
Abstract
In one embodiment, a method includes generating, by a computer system, a search-engine query from stored identity-theft nomenclature. The method also includes querying, by the computer system, at least one search engine via the search-engine query. Further, the method includes crawling, by the computer system, at least one computer-network resource identified via the querying. In addition, the method includes collecting, by the computer system, identity-theft information from the at least one computer-network resource. Additionally, the method includes processing, by the computer system, the identity-theft information for compromised personally-identifying information (PII).
Description
BACKGROUND

Technical Field


The present invention relates generally to the field of identity theft and more specifically, but not by way of limitation, to data mining of personally-identifying information found on the Internet.


History of Related Art


Identity theft is a mounting concern in commercial transactions. This is particularly true in remote commercial transactions such as, for example, Internet-shopping transactions that involve little or no direct personal contact between a consumer and a goods or services provider (GSP). It is commonplace for personally-identifying information (PII) to be compromised and utilized for identity theft such as, for example, in a remote commercial transaction. PII, as used herein, refers to information that can be used to uniquely identify, contact, or locate an individual person or can be used with other sources to uniquely identify, contact, or locate an individual person. PII may include, but is not limited to, social security numbers (SSN), bank or credit card account numbers, passwords, birth dates, and addresses. PII that has been obtained by or made available to a third party without proper authorization is referred to herein as compromised PII.


PII can be compromised in a myriad of ways. For example, record keeping for entities such as, for example, healthcare, governmental, financial, and educational institutions, is increasingly and sometimes exclusively electronic. Electronic record keeping introduces new risks that the entities are frequently ill-equipped to handle. For example, PII is often compromised via stolen hardware, inadequate security procedures, security breaches, or employee carelessness or misconduct.


Another way that PII is frequently compromised is via “phishing.” Phishing is the process of attempting to acquire PII by masquerading as a trustworthy entity in an electronic communication. A common example of phishing is a fraudulent email that is made to appear as though it originates from a valid source such as, for example, a national bank. The fraudulent email may incorporate a uniform resource locator (URL) that re-directs its audience to a false website that appears to be a legitimate website for the valid source. In actuality, the false website may be a front for stealing PII as part of a spurious transaction. For example, the false website may request “confirmation” of PII such as, for example, a credit card number or a username and password. The PII may then be stored for later improper use such as, for example, identity theft in a remote commercial transaction.


At least 182,395 instances of phishing were recorded during 2009, as reported by antiphishing.org. This is a forty-two percent increase over the number recorded in 2008. More than 10,745 malicious domains were registered in 2009, an increase of fifty-two percent over 2008. Sometimes, a misleading link such as, for example, the URL for the false website described above, may actually originate from a legitimate website but cause traffic to be redirected to an illegitimate website. This type of scam is known as “pharming.”


Legislation to curb efforts to compromise PII is largely ineffective. For example, phishing and pharming activities originate from areas around the globe and are thus often protected from prosecution by any particular jurisdiction. Additionally, once PII is compromised, distribution of the compromised PII may be difficult or impossible to prevent. Web sites and forums dedicated to exchanging compromised PII are increasing rapidly in number. Some of these web sites and forums exchange compromised PII through email or secure direct uploads and downloads.


Identity theft resulting from compromised PII is costly to victims and companies alike. The Identity Fraud Survey Report created by Javelin Strategy & Research reported that in 2009 victims averaged a personal cost of $373 and 21 hours of time to resolve identity-theft issues. The annual cost of identity theft currently exceeds $200 billion worldwide. In addition, as a result of new legislation and litigation resulting from compromised PII, companies stand to suffer from lower profit margins, damaged credibility due to negative customer experiences, and eroded brand value. Identity theft also looms as a threat to the advancement of promising consumer-driven, self-service, and cost-savings technologies.


SUMMARY OF THE INVENTION

In one embodiment, a method includes generating, by a computer system, a search-engine query from stored identity-theft nomenclature. The method also includes querying, by the computer system, at least one search engine via the search-engine query. Further, the method includes crawling, by the computer system, at least one computer-network resource identified via the querying. In addition, the method includes collecting, by the computer system, identity-theft information from the at least one computer-network resource. Additionally, the method includes processing, by the computer system, the identity-theft information for compromised personally-identifying information (PII).


In one embodiment, a computer-program product includes a computer-usable medium having computer-readable program code embodied therein. The computer-readable program code is adapted to be executed to implement a method. The method includes generating, by a computer system, a search-engine query from stored identity-theft nomenclature. The method also includes querying, by the computer system, at least one search engine via the search-engine query. Further, the method includes crawling, by the computer system, at least one computer-network resource identified via the querying. In addition, the method includes collecting, by the computer system, identity-theft information from the at least one computer-network resource. Additionally, the method includes processing, by the computer system, the identity-theft information for compromised personally-identifying information (PII).





BRIEF DESCRIPTION OF THE DRAWINGS

A more complete understanding of the method and apparatus of the present invention may be obtained by reference to the following Detailed Description when taken in conjunction with the accompanying Drawings wherein:



FIG. 1 illustrates a process of identifying compromised PII on the Internet;



FIG. 2 illustrates a process of data mining for compromised PII using a PII Web Searcher;



FIG. 3 illustrates a process of data mining for compromised PII using an Internet Relay Chat Robot (IRC Bot);



FIG. 4 illustrates a process of chat-room, nomenclature, and website discovery;



FIG. 4A illustrates a process of nomenclature and website discovery;



FIG. 4B illustrates a process of chat-room discovery;



FIG. 5 illustrates a system that may be utilized to facilitate acquisition and utilization of identity-theft information; and



FIG. 6 illustrates an embodiment of a computer system on which various embodiments of the invention may be implemented.





DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS OF THE INVENTION

Although various embodiments of the method and apparatus of the present invention have been illustrated in the accompanying Drawings and described in the foregoing Detailed Description, it will be understood that the invention is not limited to the embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the spirit of the invention as set forth herein.



FIG. 1 depicts an illustrative flow 1000 for identifying, analyzing, and reporting compromised PII on a computer network such as, for example, the Internet. In a typical embodiment, the flow 1000 may be initiated by one or both of a PII Web Searcher (PWS) 100 and an Internet Relay Chat Robot (IRC bot) 101. One of ordinary skill in the art will appreciate that the PWS 100 and the IRC bot 101 are illustrative in nature and that, in various embodiments, the flow 1000 may be initiated via other types of components that are operable to collect identity-theft information.


As used herein, identity theft generally involves a use of PII that is not authorized by an owner of the PII. Identity theft may include, for example, an unauthorized change to PII or an unauthorized use of PII to access resources or to obtain credit or other benefits. Identity-theft information, as used herein, includes any information that may be used to facilitate discovery or prevention of identity theft. Identity-theft information may include, for example, compromised PII and information related to where or how compromised PII may be found. Identity-theft nomenclature, as used herein, refers to words, phrases, nicknames, numbers, and the like that are determined to be suggestive of identity-theft information or identity theft. In various embodiments, identity-theft nomenclature may include words from multiple languages (e.g., English and non-English words).


In various embodiments, the flow 1000 may be initiated via the PWS 100. The PWS 100 may utilize, for example, search engines, web spiders, and keyword-matching features. In a typical embodiment, the search engines and the web spiders may be utilized to collect identity-theft information such as, for example, potential sources of compromised PII. The potential sources of compromised PII may include, for example, websites and forums that facilitate exchange of compromised PII (e.g., by identity thieves). Further, keyword-matching features may be leveraged to analyze the potential sources of identity-theft information using, for example, identity-theft nomenclature. Additionally, the PWS 100 is generally operable to identify and collect other identity-theft information such as, for example, compromised PII, uniform resource locators (URLs), and references to IRC chat rooms (i.e., channels). An illustrative embodiment of the PWS 100 will be described with respect to FIG. 2.
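
By way of illustration only, the following Python sketch shows how stored nomenclature might be combined into search-engine queries and how a retrieved page might be scanned for that nomenclature. The terms and helper names are hypothetical assumptions; a deployed PWS would submit the queries through a search-engine interface rather than print them.

    import urllib.request

    NOMENCLATURE = ["cvv dump", "fullz", "track2"]  # illustrative terms only

    def build_queries(terms, per_query=2):
        """Combine stored nomenclature terms into quoted search-engine queries."""
        return [" ".join('"%s"' % t for t in terms[i:i + per_query])
                for i in range(0, len(terms), per_query)]

    def scan_page(url, terms):
        """Fetch one page and report which nomenclature terms appear in it."""
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                text = resp.read().decode("utf-8", errors="replace").lower()
        except OSError:
            return []
        return [t for t in terms if t in text]

    if __name__ == "__main__":
        for query in build_queries(NOMENCLATURE):
            print("query:", query)  # each query would go to one or more search engines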


In various embodiments, the flow 1000 may be initiated via the IRC bot 101. Oftentimes, compromised PII is exchanged via chat rooms (e.g., between identity thieves on IRC channels). In a typical embodiment, the IRC bot 101 is operable to crawl the Internet in search of chat rooms (e.g., IRC channels) that are frequented by identity thieves. In a typical embodiment, the IRC bot 101 is operable to monitor such chat rooms for identity-theft nomenclature. Furthermore, the IRC bot 101 is typically operable to identify and collect compromised PII, URLs, references to other IRC chat rooms, and other identity-theft information from such chat rooms. Illustrative embodiments of the IRC bot 101 will be described with respect to FIGS. 3, 4, 4A, and 4B.


Oftentimes, if a particular user in a chat room is inactive for a certain period of time, the particular user may be timed out either automatically or by an administrator. In a typical embodiment, the IRC bot 101 may invoke auto-banning features that are operable to maintain an active status and thereby prevent time-out. The auto-banning features may involve simulating a human chat. For example, the auto-banning features may initiate a chat via a generic greeting, reproduce a single word from a monitored conversation, and the like. In a typical embodiment, the simulation of human chat may additionally cause an identity thief to reveal additional identity-theft information such as, for example, compromised PII or a URL to a potential source for compromised PII.
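
The keep-alive behavior can be sketched against the raw IRC protocol, as below. The server name, channel, greeting list, and ten-minute interval are assumptions made for illustration, not details taken from the patent.

    import random
    import socket
    import time

    GREETINGS = ["hi all", "anyone around?", "back"]  # generic, human-seeming lines

    def monitor_channel(host="irc.example.net", port=6667, channel="#example"):
        sock = socket.create_connection((host, port))
        sock.sendall(b"NICK watcher\r\nUSER watcher 0 * :watcher\r\n")
        sock.sendall(("JOIN %s\r\n" % channel).encode())
        last_talk = time.monotonic()
        while True:
            data = sock.recv(4096).decode(errors="replace")
            if not data:
                break
            for line in data.splitlines():
                if line.startswith("PING"):  # protocol-level keep-alive
                    sock.sendall(line.replace("PING", "PONG", 1).encode() + b"\r\n")
                # a real bot would log each line here for later processing
            if time.monotonic() - last_talk > 600:  # simulate human chat periodically
                msg = random.choice(GREETINGS)
                sock.sendall(("PRIVMSG %s :%s\r\n" % (channel, msg)).encode())
                last_talk = time.monotonic()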


In various embodiments, the IRC bot 101 and the PWS 100 may operate collaboratively in the flow 1000. For example, the IRC bot 101 may provide identity-theft nomenclature such as email addresses, nicknames, and other information that may be used by an identity thief. The IRC bot 101 may further provide, for example, URLs to potential sources of compromised PII. In a typical embodiment, the PWS 100 may crawl the URLs provided by the IRC bot 101 and scan for identity-theft information. The PWS 100 may also search and crawl the Internet using the identity-theft nomenclature provided by the IRC bot 101. In a similar manner, the PWS 100 may discover and send identity-theft information such as, for example, chat rooms, to the IRC bot 101. In a typical embodiment, the IRC bot 101 may monitor the chat rooms provided by the PWS 100.


After identity-theft information is collected by the IRC bot 101 and the PWS 100, the collected identity-theft information may be processed at step 103. In a typical embodiment, the processing of the collected identity-theft information may include an extraction process, a validation process, and a normalization process. In various embodiments, the PWS 100 and the IRC bot 101 may yield extensive amounts of identity-theft information that includes, for example, webpage segments, IRC logs, text files, and the like. In a typical embodiment, the extraction process and the validation process operate to intelligently reduce an amount of the collected identity-theft information that is stored and utilized in subsequent steps of the flow 1000. In a typical embodiment, the normalization process ensures that the identity-theft information is stored efficiently and effectively.


In a typical embodiment, as part of the extraction process, the collected identity-theft information may be processed for compromised PII by one or more parsers that recognize common formats for PII. For example, a parser may identify token-separated data (e.g., tab-delimited data). Similarly, a parser may determine a column type for columns lacking a column header, for example, by analyzing data that is present in particular columns (e.g., recognizing a list of text strings as email addresses). Furthermore, a parser may identify multi-line labeled data such as, for example, “first name: John,” and various other labels that may be associated with compromised PII (e.g., recognizing “ccn,” “cc” or “credit card” as possible labels for credit-card information). Additionally, by way of further example, a parser may identify identity-theft information taken from encodings that may be present on cards such as, for example, credit cards, driver's licenses, and the like. The encodings may include, for example, track 1 and track 2 magnetic-stripe data.
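
For illustration, the sketch below implements simplified parsers for three of the formats named above: token-separated rows, multi-line labeled data, and magnetic-stripe tracks. The regular expressions are loose approximations (the track patterns only roughly follow ISO/IEC 7813) rather than the patent's actual grammars.

    import re

    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
    LABELED = re.compile(r"(?im)^(first name|ssn|ccn?|credit card)\s*[:=]\s*(.+)$")
    TRACK1 = re.compile(r"%B\d{12,19}\^[^^]{2,26}\^\d{4}")  # card number^name^expiry
    TRACK2 = re.compile(r";\d{12,19}=\d{4}")                # card number=expiry

    def parse_token_separated(line, sep="\t"):
        """Split a delimited row and guess each column's type from its data."""
        cells = line.rstrip("\n").split(sep)
        return [("email" if EMAIL.fullmatch(c) else "text", c) for c in cells]

    def parse_labeled(text):
        """Collect multi-line 'label: value' pairs such as 'first name: John'."""
        return {m.group(1).lower(): m.group(2).strip()
                for m in LABELED.finditer(text)}

    print(parse_token_separated("john@example.com\thunter2"))
    print(parse_labeled("first name: John\nccn: 4111111111111111"))
    print(bool(TRACK2.search(";4111111111111111=2512")))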


Additionally, as part of the extraction process, rules may be enforced that require groups of fields to be present in particular compromised PII before allowing the particular compromised PII to be recorded. In a typical embodiment, the requirement that groups of fields be present has the benefit of reducing “false positives” within compromised PII. False positives may be considered elements of compromised PII that are not deemed to be sufficiently private or sufficiently important to merit recordation. In a typical embodiment, false positives may be removed from the collected identity-theft information. For example, an email address that is not accompanied by a password may be considered a false positive and not recorded. In a typical embodiment, a rule may be established that requires, for example, a username or email address to be accompanied by a password in order to be recorded.
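
A minimal sketch of such a field-group rule follows; the particular groups shown are illustrative assumptions, not the patent's rule set.

    REQUIRED_GROUPS = [
        {"email", "password"},    # an email address alone is a false positive
        {"username", "password"},
        {"credit_card"},          # a card number is recordable on its own
    ]

    def is_recordable(record):
        present = {field for field, value in record.items() if value}
        return any(group <= present for group in REQUIRED_GROUPS)

    assert not is_recordable({"email": "a@b.com"})                 # dropped
    assert is_recordable({"email": "a@b.com", "password": "pw1"})  # recorded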


In a typical embodiment, the validation process involves analyzing a source of the collected identity-theft information such as, for example, compromised PII, and determining if any elements of the compromised PII are false positives. For example, in a typical embodiment, genealogy websites, phone/address lookup websites, and website log files are common sources of false positives. Compromised PII that is mined from such websites, in a typical embodiment, may be considered false positives and removed from the collected identity-theft information. Conversely, compromised PII mined, for example, from known hacker websites and websites replete with identity-theft nomenclature, in a typical embodiment, may be protected from identification as false positives.
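
One possible realization of this source-based validation is sketched below, assuming a hand-curated classification of source domains; the domain names are placeholders.

    FALSE_POSITIVE_SOURCES = {"genealogy.example.com", "phonebook.example.com"}
    PROTECTED_SOURCES = {"hackerforum.example.net"}  # known identity-theft sites

    def validate(records, source_domain):
        if source_domain in FALSE_POSITIVE_SOURCES:
            return []       # treat everything from such sources as false positives
        if source_domain in PROTECTED_SOURCES:
            return records  # protected from identification as false positives
        return [r for r in records if r.get("password") or r.get("credit_card")]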


In a typical embodiment, the normalization process ensures that the collected identity-theft information such as, for example, compromised PII, is stored according to a standardized format. For example, standardized data structures and attributes may be established for names, credit-card numbers, and the like. In a typical embodiment, the normalization process facilitates matching, for example, elements of compromised PII to particular individuals to whom the elements correspond. In that way, reports and alerts based on the compromised PII may be more efficiently and more accurately generated. In a typical embodiment, after the extraction process, the validation process, and the normalization process, the collected identity-theft information is recorded in a database at step 104.
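
The sketch below shows one way such normalization might look, assuming simple canonical forms (lowercased email addresses, digits-only card numbers); the record layout is an illustrative assumption.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PiiRecord:
        email: Optional[str] = None
        credit_card: Optional[str] = None
        source_url: Optional[str] = None

    def normalize(raw, source_url):
        """Map loosely labeled fields onto one standardized structure."""
        email = raw.get("email")
        card = raw.get("ccn") or raw.get("cc") or raw.get("credit card")
        return PiiRecord(
            email=email.strip().lower() if email else None,
            credit_card="".join(ch for ch in card if ch.isdigit()) if card else None,
            source_url=source_url,
        )

    print(normalize({"email": " John@Example.COM ", "ccn": "4111-1111-1111-1111"},
                    "http://example.net/dump.txt"))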


At step 105, in a typical embodiment, alerts and reports may be delivered based on, for example, compromised PII that is stored in the database at step 104. In some embodiments, the recordation of any elements of compromised PII at step 104 merits delivery of an alert to an individual to whom the elements correspond. In other embodiments, an individual may be delivered an alert only if, for example, certain elements or combinations of elements are discovered and recorded (e.g., credit-card information or a social security number). In a typical embodiment, a particular individual may be able to pre-specify an alert-delivery method (e.g., email, telephone, etc.). After step 105, the flow 1000 ends.
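
A minimal sketch of such an alerting rule, with an assumed trigger set and per-individual delivery preferences, follows.

    ALERT_ELEMENTS = {"credit_card", "ssn"}  # illustrative trigger combination

    def maybe_alert(owner, record, delivery_prefs):
        found = ALERT_ELEMENTS & {field for field, value in record.items() if value}
        if not found:
            return None  # recorded, but not alert-worthy under this rule
        method = delivery_prefs.get(owner, "email")  # pre-specified delivery method
        return "alert %s via %s: compromised %s" % (owner, method, sorted(found))

    print(maybe_alert("jdoe", {"ssn": "XXX-XX-XXXX"}, {"jdoe": "telephone"}))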



FIG. 2 illustrates a flow 2000 for mining compromised PII via a PWS 200. In a typical embodiment, the PWS 200 is similar to the PWS 100 of FIG. 1. The PWS 200 typically accesses a database 203 that includes identity-theft nomenclature and identity-theft websites. Identity-theft websites are websites that have been identified, via, for example, identity-theft nomenclature, as possible sources of compromised PII. The database 203 is typically populated with identity-theft websites and identity-theft nomenclature via a discovery process 204. Illustrative embodiments of the discovery process 204 will be described in further detail with respect to FIG. 4A.


In a typical embodiment, the PWS 200 receives identity-theft nomenclature and identity-theft websites as input from the database 203. The PWS 200 typically queries search engines 201 via keywords from the identity-theft nomenclature. Additionally, the PWS 200 typically crawls websites 202 and scans the websites 202 for the identity-theft nomenclature. In a typical embodiment, the websites 202 include the identity-theft websites received as input from the database 203 and websites identified via queries to the search engines 201. Compromised PII collected by the PWS 200 may be processed at a processing step 206 in a manner similar to that described with respect to step 103 of FIG. 1.


As new websites and identity-theft nomenclature are added to the database 203 via, for example, the discovery process 204, the database 203 may be optimized via a performance-analysis process 205. In the performance-analysis process 205, the identity-theft nomenclature is typically ranked according to a relative significance of compromised PII that is gleaned thereby. In a typical embodiment, the database 203 maintains, for each element of the identity-theft nomenclature, historical information related to compromised PII obtained via that element. In a typical embodiment, each element of the identity-theft nomenclature may be ranked, for example, according to an amount and/or a quality of the compromised PII obtained via that element.


The quality of the compromised PII may be determined, for example, by assigning weights based on a degree of sensitivity of particular elements of compromised PII. For example, in various embodiments, credit-card information and social security numbers may be assigned higher weights than, for example, website account information. In various embodiments, the amount of compromised PII may be, for example, an overall amount of compromised PII historically obtained via particular identity-theft nomenclature. Further, in various embodiments, the amount of compromised PII may be, for example, an amount of PII obtained via particular identity-theft nomenclature in a defined period of time. For example, in some embodiments, it may be advantageous to consider an amount of compromised PII obtained via particular identity-theft nomenclature within the last thirty days.


In a typical embodiment, a score may be computed for each element of identity-theft nomenclature based on, for example, an amount and/or a quality of the compromised PII that is gleaned thereby. In a typical embodiment, a scoring formula for generating the score is configurable. For example, weighting factors may be assigned to the amount and/or the quality of the compromised PII. In that way, greater or less weight may be assigned to the amount and/or the quality of the compromised PII, as may be desired for particular applications. Once scores are generated for each element of the identity-theft nomenclature, the identity-theft nomenclature may be ranked based on the scores.
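
For illustration, a configurable scoring formula of the kind described might look like the sketch below; the quality weights, field names, and thirty-day window are assumptions.

    QUALITY_WEIGHTS = {"ssn": 10, "credit_card": 8, "password": 4, "site_account": 1}

    def term_score(hits, amount_weight=1.0, quality_weight=2.0, window_days=30):
        """hits: (age_in_days, pii_type) pairs gleaned via one nomenclature term."""
        recent = [pii_type for age, pii_type in hits if age <= window_days]
        amount = len(recent)
        quality = sum(QUALITY_WEIGHTS.get(t, 1) for t in recent)
        return amount_weight * amount + quality_weight * quality

    history = {"fullz": [(3, "ssn"), (12, "credit_card")],
               "login": [(5, "site_account")]}
    ranked = sorted(history, key=lambda term: term_score(history[term]), reverse=True)
    print(ranked)  # higher-scoring nomenclature is queried first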


In a typical embodiment, the PWS 200 may query the search engines 201 via keywords from the ranked identity-theft nomenclature in order to yield, for example, URLs to additional websites. The additional websites may be stored in the database 203. In a typical embodiment, the PWS 200 may crawl and scan the additional websites in a manner similar to that described above with regard to the websites 202. Further, compromised PII collected by the PWS 200 may be processed at a processing step 206 in a manner similar to that described with respect to step 103 of FIG. 1. After the performance-analysis process 205, the flow 2000 ends.



FIG. 3 illustrates a flow 3000 for compiling databases of compromised PII via an IRC bot. The flow 3000 begins via a chat-room-discovery process 300. During the chat-room-discovery process 300, a database 301 is populated. The database 301, in a typical embodiment, includes URLs, for example, to IRC networks and channels likely to relate to identity theft. An illustrative embodiment of the chat-room-discovery process 300 will be described in more detail with respect to FIG. 4B.


In a typical embodiment, an IRC bot 302 receives URLs for IRC networks 303 as input from the database 301. The IRC bot 302 is generally similar to the IRC bot 101 of FIG. 1. The IRC bot 302 typically scans the IRC networks 303 for identity-theft information such as, for example, compromised PII. In a typical embodiment, the IRC bot 302 invokes one or more auto-banning features 304 in order to prevent being timed out on a particular IRC network due to inactivity. For example, the IRC bot 302 may simulate human interaction by interjecting text. In a typical embodiment, the IRC bot 302 is further operable to change Internet Protocol (IP) addresses in order to explore IRC networks and chat rooms efficiently.


Any compromised PII that is found by the IRC bot 302 is typically logged into an IRC log database 305. After being logged, in a typical embodiment, the compromised PII is processed at a processing step 306 in a manner similar to that described with respect to step 103 of FIG. 1. After the processing step 306, the flow 3000 ends.



FIG. 4 depicts an illustrative flow 4000 for chat room and website discovery. In particular, the flow 4000 illustrates interactions between an IRC bot 400, a chat-room-discovery process 405, a nomenclature-and-website discovery process 404, a dialog-extraction process 402, an IRC log database 401, and an IRC dialog database 403. The IRC bot 400 is generally operable to scan chat rooms on IRC networks for compromised PII. In a typical embodiment, the IRC bot 400 is similar to the IRC bot 101 of FIG. 1 and the IRC bot 302 of FIG. 3.


After the chat rooms are scanned by the IRC bot 400 as described with respect to FIGS. 1 and 3, identity-theft information such as, for example, compromised PII, is typically logged into the IRC log database 401 as an IRC log. In a typical embodiment, the dialog-extraction process 402 is applied to the IRC log. The dialog-extraction process 402 is typically similar to the extraction process described with respect to step 103 of FIG. 1. In a typical embodiment, compromised PII that is extracted as part of the dialog-extraction process is stored in the IRC dialog database 403. In a typical embodiment, automated spam postings can be distinguished and separated from other dialog.


In a typical embodiment, the IRC log stored in the IRC log database 401 and the extracted compromised PII stored in the IRC dialog database 403 may be provided as inputs to the nomenclature-and-website discovery process 404. In a typical embodiment, the nomenclature-and-website discovery process 404 discovers new websites and identity-theft nomenclature that may be utilized, for example, by the IRC bot 400, to acquire additional identity-theft information. An illustrative embodiment of the nomenclature-and-website discovery process 404 will be described in more detail with respect to FIG. 4A.


In a typical embodiment, the IRC log stored in the IRC log database 401 may be provided as input to the chat-room-discovery process 405. Although not illustrated, in various embodiments, the extracted compromised PII stored in the IRC dialog database 403 may also be provided as input to the chat-room-discovery process 405. In a typical embodiment, the chat-room-discovery process 405 analyzes the IRC log in order to identify, for example, references to new chat rooms on IRC networks that may be sources of compromised PII. An illustrative embodiment of the chat-room-discovery process 405 will be described with respect to FIG. 4B.



FIG. 4A is an illustrative flow 4000A for nomenclature and website discovery. The flow 4000A typically begins with an IRC bot 400A. In a typical embodiment, the IRC bot 400A is similar to the IRC bot 400 of FIG. 4, the IRC bot 302 of FIG. 3, and the IRC bot 101 of FIG. 1. At a discovery step 401A, an IRC log generated by the IRC bot 400A may be analyzed for new identity-theft nomenclature and new websites. The IRC log may be, for example, an IRC log from the IRC log database 401 of FIG. 4. The new identity-theft nomenclature may include, for example, nicknames and email addresses used by participants (e.g., identity thieves) in chat rooms. By way of further example, the new websites may include URLs to websites that are mentioned in chat rooms. In various embodiments, the new identity-theft nomenclature may be utilized by a PWS such as, for example, the PWS 200 of FIG. 2, to search for additional compromised PII as described with respect to FIG. 2.
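
The discovery step can be illustrated by mining an IRC log for nicknames and URLs, as in the sketch below; the “<nick> message” log format is a common convention assumed here.

    import re

    NICK = re.compile(r"^<([^>]+)>")
    URL = re.compile(r"https?://\S+")

    def discover(log_lines):
        """Pull candidate nomenclature (nicknames) and websites (URLs) from a log."""
        nicks, urls = set(), set()
        for line in log_lines:
            match = NICK.match(line)
            if match:
                nicks.add(match.group(1))
            urls.update(URL.findall(line))
        return nicks, urls

    log = ["<c4rd3r> fresh dump at http://example.org/x", "<buyer22> pm me"]
    print(discover(log))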


After the discovery step 401A, an analysis step 402A may occur. In a typical embodiment, the analysis step 402A includes ranking a relative significance of identity-theft websites and forums that are stored, for example, in a database 403A. The identity-theft websites and forums include, for example, the new websites and forums identified at the discovery step 401A. The identity-theft websites and forums may be ranked in a manner similar to that described with respect to the ranking of identity-theft nomenclature in the performance-analysis process 205 of FIG. 2. In a typical embodiment, the analysis step 402A results in storage of the rankings and the new websites in the database 403A. Subsequently, the flow 4000A ends.



FIG. 4B illustrates a flow 4000B for chat-room discovery. In a typical embodiment, the flow 4000B may begin via an IRC bot 400B. In a typical embodiment, the IRC bot 400B is similar to the IRC bot 400A of FIG. 4A, the IRC bot 400 of FIG. 4, the IRC bot 302 of FIG. 3, and the IRC bot 101 of FIG. 1. As described with respect to FIG. 4, the IRC bot 400B may yield IRC logs from monitoring of chat rooms. Additionally, as described with respect to the discovery step 401A of FIG. 4A, in various embodiments, the IRC bot 400B may yield identity-theft nomenclature and identity-theft websites after engaging in a discovery process. The identity-theft nomenclature and the identity-theft websites may be stored, for example, in a nomenclature database 403B and a chat-room database 404B.


In a typical embodiment, the IRC logs, the identity-theft nomenclature from the nomenclature database 403B, and the chat rooms from the chat-room database 404B may serve as inputs to an analysis step 401B. At the analysis step 401B, the flow 4000B is typically operable to analyze the IRC logs to discover new chat rooms. For example, for a given IRC log, the flow 4000B may analyze a frequency of identity-theft nomenclature. In addition, by way of further example, the flow 4000B may determine how often particular chat rooms are referenced in a given IRC log. In various embodiments, if references to a particular chat room exceed a configurable threshold, the particular chat room may be recorded in a database 405B at step 402B. In some embodiments, the configurable threshold for overall references may vary based on, for example, a frequency of identity-theft nomenclature in the given IRC log. For example, if the given IRC log has a high frequency of identity-theft nomenclature relative to a configurable value, a single reference may be sufficient for recordation in the database 405B.
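
A sketch of this adaptive threshold follows; the base threshold and nomenclature-density cutoff are illustrative constants.

    from collections import Counter

    def discover_channels(log_text, nomenclature, base_threshold=3, density_cut=0.05):
        words = log_text.lower().split()
        density = sum(w in nomenclature for w in words) / max(len(words), 1)
        # a log dense in identity-theft nomenclature needs only a single reference
        threshold = 1 if density >= density_cut else base_threshold
        refs = Counter(w for w in words if w.startswith("#"))
        return [channel for channel, count in refs.items() if count >= threshold]

    log = "join #cards for fullz fullz cvv see #cards and #dumps"
    print(discover_channels(log, {"fullz", "cvv"}))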


In various embodiments, the analysis step 401B may further involve monitoring particular chat rooms from the chat-room database 404B. For example, as described with respect to the analysis step 402A of FIG. 4A, chat rooms in the chat-room database 404B may be ranked. Therefore, in various embodiments, high-ranking chat rooms may be monitored for references to other chat rooms. In a typical embodiment, new chat rooms discovered via the analysis step 401B are stored in the database 405B at step 402B. Subsequently, the flow 4000B ends.



FIG. 5 illustrates a system 500 that may be utilized to facilitate acquisition and utilization of identity-theft information. The system 500 includes a server computer 502, a database 504, and a computer network 506. In a typical embodiment, the server computer 502 may have resident and operating thereon a PWS such as, for example, the PWS 200 of FIG. 2. In a typical embodiment, the server computer 502 may have resident and operating thereon an IRC bot such as, for example, the IRC bot 400B of FIG. 4B, the IRC bot 400A of FIG. 4A, the IRC bot 400 of FIG. 4, the IRC bot 302 of FIG. 3, and the IRC bot 101 of FIG. 1. In various embodiments, the server computer 502 may facilitate execution, for example, of the flow 1000 of FIG. 1, the flow 2000 of FIG. 2, the flow 3000 of FIG. 3, and/or the flow 4000 of FIG. 4. In that way, the server computer 502 may be operable to acquire identity-theft information such as, for example, compromised PII, via the computer network 506. The computer network 506 may be, for example, the Internet. The identity-theft information may be stored, for example, in the database 504.


One of ordinary skill in the art will appreciate that the server computer 502 may, in various embodiments, represent a plurality of server computers. For example, the PWS and the IRC bot may, in various embodiments, be resident and operating on distinct physical or virtual server computers. Likewise, in various embodiments, the PWS and the IRC bot may be resident and operating on one physical or virtual server computer. Furthermore, one of ordinary skill in the art will appreciate that the database 504 may, in various embodiments, represent either a single database or a plurality of databases.



FIG. 6 illustrates an embodiment of a computer system 600 on which various embodiments of the invention may be implemented such as, for example, the PWS 200 of FIG. 2, the IRC bot 400B of FIG. 4B, the IRC bot 400A of FIG. 4A, the IRC bot 400 of FIG. 4, the IRC bot 302 of FIG. 3, and the IRC bot 101 of FIG. 1. The computer system 600 may be, for example, similar to the server computer 502 of FIG. 5. The computer system 600 may be a physical system, a virtual system, or a combination of both physical and virtual systems. In one implementation, the computer system 600 may include a bus 618 or other communication mechanism for communicating information and a processor 602 coupled to the bus 618 for processing information. The computer system 600 also includes a main memory 604, such as random-access memory (RAM) or other dynamic storage device, coupled to the bus 618 for storing computer-readable instructions to be executed by the processor 602.


The main memory 604 also may be used for storing temporary variables or other intermediate information during execution of the instructions to be executed by the processor 602. The computer system 600 further includes a read-only memory (ROM) 606 or other static storage device coupled to the bus 618 for storing static information and instructions for the processor 602. A computer-readable storage device 608, such as a magnetic disk or optical disk, is coupled to the bus 618 for storing information and instructions for the processor 602. The computer system 600 may be coupled via the bus 618 to a display 610, such as a liquid crystal display (LCD) or a cathode ray tube (CRT), for displaying information to a user. An input device 612, including, for example, alphanumeric and other keys, is coupled to the bus 618 for communicating information and command selections to the processor 602. Another type of user input device is a cursor control 614, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 602 and for controlling cursor movement on the display 610. The cursor control 614 typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allow the device to specify positions in a plane.


The term “computer readable instructions” as used above refers to any instructions that may be performed by the processor 602 and/or other components of the computer system 600. Similarly, the term “computer readable medium” refers to any storage medium that may be used to store the computer readable instructions. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as the storage device 608. Volatile media include dynamic memory, such as the main memory 604. Transmission media include coaxial cables, copper wire, and fiber optics, including the wires of the bus 618. Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, a DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.


Various forms of the computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor 602 for execution. For example, the instructions may initially be borne on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the computer system 600 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to the bus 618 can receive the data carried in the infrared signal and place the data on the bus 618. The bus 618 carries the data to the main memory 604, from which the processor 602 retrieves and executes the instructions. The instructions received by the main memory 604 may optionally be stored on the storage device 608 either before or after execution by the processor 602.


The computer system 600 may also include a communication interface 616 coupled to the bus 618. The communication interface 616 provides a two-way data communication coupling between the computer system 600 and a network. For example, the communication interface 616 may be an integrated services digital network (ISDN) card or a modem used to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 616 may be a local area network (LAN) card used to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, the communication interface 616 sends and receives electrical, electromagnetic, optical, or other signals that carry digital data streams representing various types of information. The storage device 608 can further include instructions for carrying out various processes described herein when executed by the processor 602. The storage device 608 can further include a database for storing data related to the same.

Claims
  • 1. A method comprising: generating, by a computer system, a search-engine query from stored identity-theft nomenclature; querying, by the computer system, at least one search engine via the search-engine query; crawling, by the computer system, at least one computer-network resource identified via the querying; collecting, by the computer system, identity-theft information from the at least one computer-network resource; wherein the at least one computer-network resource comprises a chat room; wherein the collecting comprises logging chat dialog from the chat room into a chat log database; and processing, by the computer system, the identity-theft information for compromised personally-identifying information (PII), wherein the processing comprises discovering new chat rooms based, at least in part, on an analysis of chat dialogs in the chat log database and storing the compromised PII.
  • 2. The method of claim 1, wherein the collecting comprises scanning the at least one computer-network resource for at least a portion of the stored identity-theft nomenclature.
  • 3. The method of claim 1, wherein the processing comprises extracting the compromised PII from the identity-theft information, the extracting comprising recognizing at least one PII format.
  • 4. The method of claim 3, wherein the recognized at least one PII format is selected from the group consisting of: token-separated data, one or more columns of data lacking column headers, multi-line labeled data, and magnetic-stripe data.
  • 5. The method of claim 1, wherein the processing comprises validating the at least one computer-network resource, the validating comprising determining whether the at least one computer-network resource is a likely source of false positives for compromised PII.
  • 6. The method of claim 1, wherein the processing comprises normalizing the identity-theft information, the normalizing comprising storing the identity-theft information according to a standardized format.
  • 7. The method of claim 1, comprising creating and delivering at least one of an alert and a report in connection with the identity-theft information.
  • 8. The method of claim 1, wherein the identity-theft information comprises information related to new sources of compromised PII.
  • 9. The method of claim 1, wherein the stored identity-theft nomenclature comprises words that are determined to be suggestive of identity-theft information.
  • 10. The method of claim 1, wherein the processing comprises: analyzing the identity-theft information for new identity-theft nomenclature; and storing any new identity-theft nomenclature with the stored identity-theft nomenclature.
  • 11. The method of claim 1, comprising ranking entries within the stored identity-theft nomenclature according to a relative significance of compromised PII that is gleaned thereby.
  • 12. The method of claim 11, wherein the ranking comprises ranking the stored identity-theft nomenclature according to a quality of compromised PII that is gleaned thereby.
  • 13. The method of claim 11, wherein the ranking comprises ranking the stored identity-theft nomenclature according to a quantity of compromised PII that is gleaned thereby.
  • 14. The method of claim 11, wherein the generating comprises generating the search-engine query from highly-ranked entries from the stored identity-theft nomenclature.
  • 15. The method of claim 1, wherein the collecting comprises distinguishing spam postings from other dialog.
  • 16. The method of claim 1, wherein the discovering comprises analyzing a frequency of the stored identity-theft nomenclature in the chat dialogs.
  • 17. A computer-program product comprising a non-transitory computer-readable storage medium having computer-readable program code embodied therein, the computer-readable program code adapted to be executed to implement a method comprising: generating a search-engine query from stored identity-theft nomenclature; querying at least one search engine via the search-engine query; crawling at least one computer-network resource identified via the querying; collecting identity-theft information from the at least one computer-network resource; and wherein the at least one computer-network resource comprises a chat room; wherein the collecting comprises logging chat dialog from the chat room into a chat log database; and processing the identity-theft information for compromised personally-identifying information (PII), wherein the processing comprises discovering new chat rooms based, at least in part, on an analysis of chat dialogs in the chat log database and storing the compromised PII.
CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application is a continuation of U.S. patent application Ser. No. 14/929,835, filed on Nov. 2, 2015. U.S. patent application Ser. No. 14/929,835 is a continuation of U.S. patent application Ser. No. 13/398,471, filed on Feb. 16, 2012. U.S. patent application Ser. No. 13/398,471 claims priority from U.S. Provisional Application No. 61/444,433, filed on Feb. 18, 2011. U.S. patent application Ser. No. 14/929,835, U.S. patent application Ser. No. 13/398,471 and U.S. Provisional Application No. 61/444,433 are incorporated by reference.

Related Publications (1)
Number Date Country
20170053369 A1 Feb 2017 US
Provisional Applications (1)
Number Date Country
61444433 Feb 2011 US
Continuations (2)
Number Date Country
Parent 14929835 Nov 2015 US
Child 15341096 US
Parent 13398471 Feb 2012 US
Child 14929835 US