Systems and methods of identity protection and management

Information

  • Patent Grant
  • Patent Number
    9,106,691
  • Date Filed
    Friday, September 16, 2011
  • Date Issued
    Tuesday, August 11, 2015
Abstract
In an embodiment, a computing system, such as a monitoring computer, receives a request from a user to monitor an account of the user with an online service provider. The request may include personal information and user preferences for one or more protective actions. The system periodically monitors external data sources for indications of changes to personal information associated with the account, and detects changes or attempted changes to personal information associated with the account. The system may determine risk levels associated with detected changes or attempted changes, and transmit a notification to the user via a communication channel selected based on the determined risk level and/or the user preferences. The system may also initiate protective actions, so that further unauthorized access to the account may be prevented.
Description
BACKGROUND

This disclosure relates to personal information management, and particularly, systems and methods for management of identity and personal information on external services.


Communication systems and network systems such as the Internet enable users to access a multitude of services such as e-commerce services, banking services, credit services, social networking services, and the like. Users often maintain relationships with many of these services. They may have accounts with these services accessed by credentials such as user names and passwords. Furthermore, these services may store personal information of users, such as personal names, relationships with others, home and residence addresses, telephone numbers, credit card numbers, financial information, and so on. Such users often rely on these services to maintain this information, and any compromise to the security or accuracy of this information may impose substantial costs on those users. For example, if an unauthorized person manages to gain access to the user's account and change that user's password, login information, or personal information, then that user may become unable to access his or her account and may be forced to deal with the fallout of identity theft, which can be costly and time-consuming for the user.


In order to prevent such identity theft and unauthorized access, online services often send out notifications of changes to personal information on users' accounts. For example, when a user changes his or her password, online services often send an email confirmation to notify the user of the password change. Unfortunately, such notifications may become too numerous and burdensome for the user to review carefully to detect fraud and/or identity theft. Additionally, such notifications may be hidden among other communications, such as other emails, and thus not be noticed by the user in a timely manner. For example, notifications may be misclassified as junk mail or spam, possibly resulting in them going unnoticed by the user. As a result, such notifications may be ignored by users and become ineffective.


SUMMARY

Accordingly, disclosed herein are systems and methods of management of identity and personal information, such as account information stored by service providers. The systems and methods disclosed herein enable a user to effectively detect relevant events indicative of changes to identity and/or personal information, such as changes to passwords, login information, address information, and other personal information associated with the user's various accounts with service providers. Additionally, the systems and methods disclosed herein may enable the user to specify automatic actions to be taken in response to such events. Thus, the user may be relieved of the need to manually monitor and/or respond to such events and may be enabled to rapidly respond to those events.


In one embodiment, a method of monitoring and handling potential identity theft threats is performed by a monitoring computer having one or more computer processors. The monitoring computer receives a request from a user to monitor an account of the user with an online service provider. The request includes personal information associated with the user and user preferences for one or more protective actions to be taken in response to detection, by the monitoring computer, of a change or attempted change to personal information associated with the account. The monitoring computer periodically monitors one or more external data sources for indications of changes to personal information associated with the account. The monitoring computer detects a change or attempted change to personal information associated with the account. The monitoring computer determines a risk level associated with the detected change or attempted change to personal information associated with the account. The monitoring computer transmits a notification to the user via a communication channel selected based on the determined risk level and/or the user preferences. The monitoring computer initiates one or more protective actions selected based on one or more of the determined risk level or the user preferences. Further unauthorized access to the account may be prevented by the one or more actions.


In an embodiment, periodically monitoring one or more external data sources for indications of changes to personal information comprises periodically connecting to an external service, providing the external service with login credentials associated with the user, and determining whether the external service accepts the provided login credentials.


In an embodiment, periodically monitoring one or more external data sources for indications of changes to personal information comprises periodically retrieving electronic messages associated with the user and analyzing the content of the retrieved messages to determine whether any of the messages indicates a change to personal information.


In an embodiment, the risk level may be determined at least in part based on whether a preauthorization for the change or attempted change to personal information was received.


In an embodiment, the one or more protective actions are initiated subsequent to receiving user approval for initiating the one or more protective actions.


In an embodiment, the one or more protective actions are initiated without requiring user approval for initiating the one or more protective actions.


In one embodiment, a computing system is configured to monitor and handle potential identity theft threats. The computing system includes a computer-readable storage medium having stored thereon a plurality of executable software modules. The computing system includes one or more computer processors configured to execute the plurality of software modules stored on the computer-readable storage medium. The computing system includes a network interface. The computing system includes a message monitoring module configured to retrieve an electronic message and determine whether the electronic message indicates a change or a possible change to personal information. The computing system includes an event notification module configured to determine a risk level associated with the electronic message in response to the message monitoring module determining that the electronic message indicates a change or a possible change to personal information. The event notification module may be further configured to execute one or more user-customizable responsive actions based upon the risk level associated with the electronic message as determined by the event notification module.


In an embodiment, the message monitoring module may be configured to retrieve the electronic message by automatically logging into one or more email accounts and gathering messages from the one or more email accounts.


In an embodiment, the message monitoring module may be configured to retrieve the electronic message by receiving messages sent to the computing system.


In an embodiment, at least one of the user-customizable responsive actions may be sending an electronic notification identifying the possible change to personal information.


In an embodiment, the event notification module may be configured to execute at least a portion of the user-customizable responsive actions only in response to receiving a user confirmation message.


In an embodiment, the user-customizable responsive actions are selected based upon stored user preferences and the risk level associated with the electronic message.


In an embodiment, the event notification module may be further configured to determine whether the possible change to personal information was preauthorized, and further configured to execute different user-customizable responsive actions if the possible change to personal information was preauthorized.


In an embodiment, a non-transitory computer-readable medium comprises executable instructions configured to cause one or more computer processors to perform operations such as the following. The system determines, on a periodic basis, whether a network service is accessible based on a set of user credentials, by performing the following operations. The system transmits a login request to the network service. The login request comprises the user credentials formatted in accordance with a protocol used by the network service. The system receives a login response from the network service. The system determines whether the login response indicates that the network service did not accept the user credentials. The system, in response to a determination that the login response indicates that the network service did not accept the user credentials, performs one or more event responses, selected based at least upon user preferences relating to the network service.
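One periodic accessibility check of the kind described above can be sketched as follows. This is an illustrative sketch only: the function names, rejection markers, and injectable transport callable are assumptions for demonstration and are not fixed by the disclosure; a real deployment would store per-service parsing rules in a repository and send the request over the network.

```python
import hashlib
from typing import Callable, Dict, List

# Hypothetical rejection markers; an actual system would store
# per-service parsing rules in a repository (cf. repository 105).
REJECTION_MARKERS = ["invalid password", "login failed", "incorrect credentials"]

def build_login_request(username: str, password: str) -> Dict[str, str]:
    """Format credentials as a form-style payload; the password is
    hashed here purely as an illustrative obfuscation step."""
    return {
        "user": username,
        "pass_sha256": hashlib.sha256(password.encode("utf-8")).hexdigest(),
    }

def credentials_rejected(login_response: str) -> bool:
    """Apply simple substring parsing rules to the service's response."""
    body = login_response.lower()
    return any(marker in body for marker in REJECTION_MARKERS)

def check_service(username: str, password: str,
                  transport: Callable[[Dict[str, str]], str],
                  event_responses: List[str]) -> List[str]:
    """One monitoring cycle: send the login request via the supplied
    transport, parse the response, and return the event responses to
    perform (empty when the credentials were accepted)."""
    response = transport(build_login_request(username, password))
    if credentials_rejected(response):
        return event_responses
    return []
```

The transport is injected so the same cycle logic can run against an HTTP client, a test stub, or another protocol handler.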


In an embodiment, transmitting a login request to the network service comprises transmitting an HTTP request to the network service.


In an embodiment, determining whether the login response indicates that the network service did not accept the user credentials comprises comparing the login response to one or more predefined parsing rules associated with the network service.


In an embodiment, at least one of the event responses may be sending an electronic notification identifying the possible change to personal information.


In an embodiment, at least a portion of the event responses may be performed only subsequent to receiving a user confirmation message.


In an embodiment, the event responses may be selected based upon a risk level determined based on the login response.


In an embodiment, the operations may also include determining whether the possible change to personal information was preauthorized. The event responses are selected based at least in part on whether the possible change to personal information was preauthorized.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram representing a system of identity protection and management as used in an embodiment.



FIG. 2 is a flow chart of a process of handling a detected event as used in an embodiment.



FIG. 3 is a flow chart of a process of analyzing messages for personal information change data as used in an embodiment.



FIG. 4 is a flow chart of a process of verifying credentials with a service as used in an embodiment.



FIGS. 5A and 5B are sample user interfaces for specifying monitoring services as used in an embodiment.



FIG. 6 is a sample notification email that may be sent in response to an event as used in an embodiment.



FIG. 7 is a sample notification message user interface that may be displayed on a mobile device as used in an embodiment.



FIG. 8 is a block diagram of a computing system, as used in an embodiment.





DETAILED DESCRIPTION


FIG. 1 is a block diagram of a system of identity protection and management, as used in an embodiment. The system may comprise one or more computing devices, and various elements depicted in FIG. 1 may be included in a single computing device, in separate individual computing devices, or in any combination thereof. The computing device or devices implementing the system of FIG. 1 may be connected to one or more networks such as the Internet by which they may communicate with external entities and data sources.


In an embodiment, the system comprises a management system 101 which performs methods of identity protection and management, as described throughout this specification. Management system 101 may provide various interfaces by which users 102 may access data on the management system. For example, the management system may provide one or more user interfaces via module 103 that may be accessed by users 102. Such user interfaces may include, for example, HTML interfaces, mobile device or tablet computer application interfaces, RSS feeds, audiovisual interfaces, textual interfaces, application programming interfaces, and the like. Additionally, management system 101 may enable users 102 to access data via event notifications module 104. Such event notifications may be sent by any number of means, including, for example, email, text message, instant message, telephone communications, physical mail, and other forms of communication known to those of skill in the art. Management system 101 may provide further interfaces to users 102, other than those provided by modules 103 and 104, as may be known to those of skill in the art.


Management system 101 may have access to various data repositories, in an embodiment. The data repositories may be any of various forms of data storage that may be accessed by a computing system, such as hard drives, tape drives, flash memory, random-access memory, read-only memory, EEPROM storage, and so on, as well as any combination thereof. The data may be formatted within the repositories in one or more formats, referred to herein as “data structures,” such as flat text file storage, relational databases, non-relational databases, XML, comma-separated values, Microsoft Excel files, and so on, as well as any combination thereof. The data repositories may provide various forms of access to the stored data, such as by filesystem access, network access, a SQL protocol (e.g. ODBC), HTTP, FTP, NFS, CIFS, and so on, as well as any combination thereof. As used throughout this specification, the terms “data store,” “repository,” “storage device,” and the like may refer to any such data repository as described herein or otherwise known to those of skill in the art.


In an embodiment, management system 101 may be connected to online service data repository 105 which may include information on various online services such as website services, social networking services, online banking services, e-commerce services and the like. The data included in repository 105 may include data such as a URL and/or location for an online service, types of login credentials for the online service, methods of accessing and providing credentials for the online service, forms of communication used by the online service such as email notifications, data provided by the online service and so on.


In an embodiment, repository 105 receives information on an online service by manual entry performed by an operator or administrator of management system 101. In an embodiment, management system 101 includes automated software routines that gather appropriate information from online services, so that repository 105 may be populated and/or updated automatically. In an embodiment, management system 101 receives information descriptive of online services directly from those online services, and may use that descriptive information to populate repository 105.


Additionally, management system 101 may be in communication with credentials and user information repository 106. The repository may be physically stored on the same storage medium as repository 105 or on different storage media, and the two repositories may be implemented in a single repository in an embodiment. The credentials and user information repository 106 may include information about individual users and user accounts. Such information may include login credentials to access the management system so that users may establish accounts and utilize the services provided by the management system. Additionally, repository 106 may include information about users' online identities. Such information may include, for example, login credentials for various online services, types of identities of services to be monitored, types of services to be monitored, preferences for monitoring of online services, preferences for notifications, preferences for levels of urgency for notifications, and the like.


Management system 101 may include or be connected to identity monitoring service 107. The identity monitoring service may provide periodic or on-demand monitoring of online identity and personal information. For example, identity monitoring service 107 may execute an email monitoring module 108 configured to monitor user emails. Such monitoring may be performed either immediately upon receipt of emails for individual users or on a periodic basis by retrieving relevant emails from a user's account. Identity monitoring service 107 may also execute a credentials monitoring module 109 which may be configured to periodically attempt to access various online services on behalf of users and retrieve personal information associated with those users in order to detect changes or updates to identity and personal information associated with those users. Identity monitoring service 107 may further execute a direct notification module 110 which may be configured to directly receive information about identity and personal information changes from one or more online services. Such direct notifications may be received through standard network protocols such as HTTP or specialized communication protocols including secure communication protocols established with online services.


In an embodiment, monitoring service 107 performs various monitoring tasks, as described previously or as may be contemplated otherwise, to detect events. Events may be related to changes in identity and/or personal information maintained by a service. For example, if a user's login name, password, or other authentication credentials are changed on an online service, the change may be detected by monitoring service 107, thus triggering an event. Monitoring service 107 may also be configured to detect changes to personal information stored online, such as address information, as a type of event. An event indicating an address change could inform a user, for example, of an unauthorized attempt to cause goods or services to be delivered to a different location, which would be a form of identity theft. Other activities may also be considered events by the system, such as online orders or service requests. If the monitoring service 107 is able to trigger events in response to unauthorized online orders or service requests, then the system may be able to stop the orders from being shipped or the services being performed, thus again minimizing the impact of identity theft.



FIG. 2 is a flow chart of a process of handling and/or responding to a detected event as used in an embodiment. Such a process may be used by the event notification module 104 of management system 101 as shown in FIG. 1. In various embodiments, additional blocks may be included, some blocks may be removed, and/or blocks may be connected or arranged differently from what is shown.


At block 201, an event is identified. Such an event may be triggered by any of a number of modules, such as the email monitoring module 108, credentials monitoring module 109, or direct notification module 110 of the identity monitoring service 107 as shown in FIG. 1. The event identified at block 201 may, in various embodiments, include information about an associated user, an associated online service, personal information associated with the event, other relevant information, or any combination thereof.


At block 202, the system determines whether or not the event identified at block 201 was anticipated. An event may be anticipated, for example, because a user has intentionally caused a change to that user's personal information. For example, where a user decides to change a password on an online account, the password change may be anticipated because it was intended by the user.


The determination of whether an event is anticipated may be based on preauthorization data which includes information provided by users about which events to anticipate. Users may provide preauthorization for events by contacting the system and indicating that a particular event is to be anticipated, via a web interface, mobile application, or other means. Additionally or alternatively, algorithms including artificial intelligence algorithms may be used to determine whether an event is anticipated or how likely an event is to be non-anomalous, known, and/or authorized by the user. Such algorithms may be similar to, for example, algorithms used to detect credit card fraud, as will be known to those of skill in the art.
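A minimal preauthorization check of this kind might look like the following. The in-memory store, key shape, one-hour default window, and consume-on-use behavior are all illustrative assumptions, not details fixed by the disclosure.

```python
import time
from typing import Dict, Tuple

# Hypothetical preauthorization store: maps (user, service, event_type)
# to the expiry time of the preauthorization window.
_preauthorizations: Dict[Tuple[str, str, str], float] = {}

def preauthorize(user: str, service: str, event_type: str,
                 window_seconds: float = 3600.0) -> None:
    """Record that the user expects this event within the window."""
    _preauthorizations[(user, service, event_type)] = time.time() + window_seconds

def is_anticipated(user: str, service: str, event_type: str) -> bool:
    """An event is anticipated when a matching, unexpired
    preauthorization exists; it is consumed on first use so that a
    second occurrence of the same event is treated as unexpected."""
    key = (user, service, event_type)
    expiry = _preauthorizations.get(key)
    if expiry is not None and time.time() <= expiry:
        del _preauthorizations[key]
        return True
    return False
```

A fuller implementation might replace this lookup with a learned anomaly score, as the text suggests.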


If the event is anticipated, then at block 203, the system ignores the event or alternatively generates a low-priority notification and/or response. This provides the advantage that the user will only be notified of unexpected and/or important events, so that the user will not be inundated with unnecessary notifications.


If the event is not anticipated then at block 204 the system determines the nature, urgency, and/or other characteristics of the event. This determination may be based on any number of factors including, for example, the nature of the event identified, user preferences stored by the system, frequency of events identified with respect to this user or other users on the system, general information maintained by the system regarding trends in identity fraud, and other information that may be available to the system.


At block 205, the system retrieves user preferences for notifications. These user preferences may be retrieved from one or more data repositories such as repository 106 shown in FIG. 1. Then, based on the nature and urgency of the event determined at block 204, the user preferences retrieved at block 205, and/or other information available to the system, the system may determine at block 206 an appropriate event response, such as a method of notifying the user of the identified event. The system may then generate a notification 207 to be provided to the user by any number of forms of communication known to those of skill in the art, including telephone notifications, text messages, instant messages, email messages, physical mail messages, and/or other forms of communication, as well as any combination thereof. In an embodiment, the system may determine that no notification is required and thus send no notification to the user. In an embodiment, the system may use default preferences provided by an administrator of the system or built into the system in addition to, or instead of, user preferences.
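Channel selection at block 206 can be sketched as a lookup that prefers per-user preferences and falls back to administrator defaults. The urgency labels, channel names, and preference shape below are hypothetical placeholders, assuming preferences have already been loaded from a repository such as repository 106.

```python
from typing import Dict, List

# Illustrative administrator defaults, keyed by urgency level.
DEFAULT_CHANNELS: Dict[str, List[str]] = {
    "low": ["email"],
    "medium": ["email", "text"],
    "high": ["text", "phone"],
}

def select_channels(urgency: str, preferences: dict) -> List[str]:
    """Choose notification channels for the determined urgency,
    using the user's stored preferences when present and the
    system defaults otherwise."""
    per_user = preferences.get("channels_by_urgency", {})
    return per_user.get(urgency, DEFAULT_CHANNELS.get(urgency, []))
```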


At block 208, the system determines whether further event responses, such as protective actions, are to be taken in response to the event that has been identified, and what actions to take, if any. The determination of further protective actions at block 208 may occur immediately after the determination of the method to notify the user at block 206, or it may occur at a later time. In an embodiment, the system first sends out a notification 207 and then waits to receive a response from the user, at block 209. Such a system enables the user to choose not to perform the protective actions, for example because the triggering event was actually caused by the user, but possibly not preauthorized. In an embodiment, the system may determine whether to wait for a user response at block 209 based on user preferences determined at block 205 or based on other information available to the system. In an embodiment, the system may perform some actions automatically and other actions only after user response.


The protective actions determined at block 208 may include any number of protective actions 210. Such actions may include notifying a third party such as a credit bureau or the police, notifying the online service, temporarily locking the user's account on the service, temporarily or permanently disabling the user's account on the service, changing the user's password on the service, or other actions that may be described throughout this specification or known to those of skill in the art.
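The split between actions performed automatically and actions held for user approval (blocks 208-210) can be sketched as below. The action catalogue and the `auto_approved_actions` preference key are assumed names for illustration; the disclosure does not prescribe them.

```python
from typing import Dict, List

# Illustrative catalogue of protective actions keyed by risk level.
ACTIONS_BY_RISK: Dict[str, List[str]] = {
    "low": [],
    "medium": ["notify_service"],
    "high": ["notify_service", "lock_account", "notify_credit_bureau"],
}

def plan_protective_actions(risk: str, preferences: dict) -> Dict[str, List[str]]:
    """Split the planned actions into those run immediately and those
    held until the user responds, per stored preferences."""
    actions = ACTIONS_BY_RISK.get(risk, [])
    auto = set(preferences.get("auto_approved_actions", []))
    return {
        "automatic": [a for a in actions if a in auto],
        "pending_approval": [a for a in actions if a not in auto],
    }
```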



FIG. 3 shows a flow chart of a process of reviewing emails for identity or personal information changes, as used in an embodiment. In various embodiments, additional blocks may be included, some blocks may be removed, and/or blocks may be connected or arranged differently from what is shown.


Although the process of FIG. 3 is described with respect to email messages, the process may be applied to other forms of communication as will be known to those of skill in the art. For example, the system may be configured to receive and analyze text messages received on the user's cell phone. In another embodiment, the system may be configured to automatically review physical mail that may have been, for example, scanned in by the user.


The system may access emails in any number of ways. For example, at block 301, the system may directly access the user's email. This may be done, for example, by the system maintaining the user's email account login and password and periodically accessing the user's email account to retrieve messages. Such retrieval may be performed via an online interface such as a web interface, an IMAP interface, a POP interface, or the like. Alternatively or additionally, the system may receive emails for the user directly at block 302. For example, the user may configure one or more email accounts to automatically forward and/or copy all messages to a specialized email address operated by the system so that the system may receive messages immediately. In an embodiment, the user may maintain one or more email accounts on the system, in which case all messages may be delivered to the system or otherwise accessed so that they may be reviewed.


Upon accessing one or more messages from block 301, from block 302, and/or by other means, the system analyzes the content and/or headers of the email messages at block 303. The data analyzed by the system at block 303 may include any data associated with an email message such as the sender of the message, the recipient of the message, the time and date of the message, the subject line of the message, any Internet headers included in the message, digital signatures attached to the message, attachments to the message, images included in the message, the content of the message, MIME parts of the message, Internet addresses such as IP addresses associated with the message, and so on. For example, the system may identify messages containing the terms “password change,” “address change,” “email address change,” “account created,” “account modified,” “account removed,” and so on. In various embodiments, the parameters such as keywords to be identified may be manually configured, or they may be automatically determined by an automated process such as a machine learning process, Bayesian analysis, neural network processing, and so on.
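A simple keyword scan of the kind described for block 303 might be sketched as follows. The keyword list mirrors the examples given in the text; a deployed system might instead learn these parameters via machine learning, as noted above.

```python
from typing import List

# Keywords taken from the examples in the text; hand-configured here,
# though they could be learned automatically.
CHANGE_KEYWORDS: List[str] = [
    "password change", "address change", "email address change",
    "account created", "account modified", "account removed",
]

def message_indicates_change(subject: str, body: str) -> bool:
    """Scan the subject line and body, case-insensitively, for
    keywords suggesting a personal-information change."""
    text = (subject + " " + body).lower()
    return any(kw in text for kw in CHANGE_KEYWORDS)
```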


In an embodiment, the system may be configured to recognize one or more specialized headers in the message. Such a specialized header may be used, for example, by an online service to enable the automatic detection of messages relating to personal information changes. For example, when a user changes a login name or password on an online service account, the online service may be configured to send an email to that user with a specialized header indicating that the message relates to a login name or password change. In an embodiment, such a specialized header may include one or more digital signatures to provide verification that the message originated from the online service.
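One way to realize such a signed specialized header is an HMAC over the declared change type, sketched below. The header names and the shared-key scheme are assumptions for illustration; an actual online service would publish its own header format and key-distribution mechanism (e.g., public-key signatures rather than a shared secret).

```python
import hashlib
import hmac
from typing import Dict, Optional

# Hypothetical header names for a message relating to a change.
CHANGE_HEADER = "X-Identity-Change"
SIGNATURE_HEADER = "X-Identity-Change-Signature"

def sign_change_header(change_type: str, shared_key: bytes) -> str:
    """Produce the signature a service would attach to its header."""
    return hmac.new(shared_key, change_type.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def verify_change_header(headers: Dict[str, str],
                         shared_key: bytes) -> Optional[str]:
    """Return the declared change type when the specialized header is
    present and its signature verifies; otherwise return None."""
    change_type = headers.get(CHANGE_HEADER)
    signature = headers.get(SIGNATURE_HEADER, "")
    if change_type is None:
        return None
    expected = sign_change_header(change_type, shared_key)
    return change_type if hmac.compare_digest(expected, signature) else None
```

Using a constant-time comparison (`hmac.compare_digest`) avoids leaking signature bytes through timing.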


At block 304, the system determines whether the email indicates a change in identity information based on the analysis performed at block 303. Such identity information changes may include changes to the user's login name, password, personal name, account number, associated accounts, home address, mailing address, telephone number, email address, or the like. In an embodiment, the system detects attempted changes in addition to, or instead of, successful changes, in which case the system may provide notifications as to attempted changes.


If at block 304 the system determines that the email indicates a change (or attempted change) in identity information, then at block 305 the system triggers an event for processing. This triggering of an event may invoke an event notification process, such as that shown in FIG. 2, which determines whether a notification should be transmitted, and attributes of the notification. In an embodiment, at block 305, the event is processed (e.g. by the process outlined in FIG. 2) immediately upon the determination that the email indicates a change in identity information. In an alternate embodiment, the system may initiate event processing at block 305 on a regular or periodic basis such as once every hour, once every day, or once every week. Whether the event is processed immediately or at a later time may depend on user preferences and/or the nature of the event, including the urgency of the event.


If the email is determined not to indicate a change in identity information at block 304 or after the event is processed at block 305, the system waits for the next monitoring cycle at block 306. The system may be configured to perform the monitoring shown in blocks 301 or 302 on a periodic basis such as a daily, weekly, or monthly basis. In such a case, the system would, at block 306, wait for the appropriate period of time to elapse prior to again reviewing messages. In an additional embodiment, the system may wait at block 306 for further messages to be received prior to again performing either of block 301 or 302.



FIG. 4 is a flowchart of a process of verifying credentials with an online service, as used in an embodiment. Although this process is described with respect to an online service connected via a network such as the Internet, this process may equally be applied to services accessible by other forms of communication. For example, this process may be applied to telephone services by automatically dialing and providing information to such services. In various embodiments, additional blocks may be included, some blocks may be removed, and/or blocks may be connected or arranged differently from what is shown.


At block 401, the system maintains online credentials for a user. These online credentials may include a login name and a password. Other information that may be used to authenticate users to online services may also be stored at block 401.


At block 402, the system connects with an online service associated with the credentials maintained at block 401. The system may connect with the online service by any number of means. For example, it may attempt to access the main web page of the online service or it may attempt to access a login page of the online service. In another embodiment, the system may access a special application programming interface (API) provided by the online service. Such an API may be an HTTP-based API such as a SOAP API or a REST API. In an embodiment, the communications performed at block 402 are performed over a secure channel. In an embodiment, the system maintains instructions for how to connect with the online service at block 402 in one or more repositories such as the online service repository 105 of FIG. 1.


At block 403, the system provides the credentials maintained at block 401 to the online service to which the system has connected at block 402. The system may be configured to provide those credentials to the online service in a manner expected by the online service. The appropriate manner of providing those online credentials may be stored in a repository such as online service repository 105 in FIG. 1.


In an embodiment, the credentials are provided over a secure communications channel. In an embodiment, the credentials may be provided over HTTP, such as via an HTTP POST form submission. In an embodiment, the credentials may be transmitted using an HTTP Basic or Digest authentication protocol. In other embodiments, the credentials may be transmitted using a challenge/response protocol, a digital signature, or by other means. Additionally and/or alternatively, the credentials, or any part of the credentials such as a password, may be encrypted or may be obfuscated using a hash function, such as a cryptographic or one-way hash function.
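Two of the mechanisms named above can be shown concretely. The following is a minimal Python sketch, using only standard-library primitives; the login name and password are hypothetical test values:

```python
import base64
import hashlib

def basic_auth_header(login: str, password: str) -> str:
    """Encode credentials for the HTTP Basic authentication scheme
    (RFC 7617): base64 of "login:password", prefixed with "Basic "."""
    token = base64.b64encode(f"{login}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

def obfuscate_password(password: str) -> str:
    """One-way (cryptographic) hash of the password, so the stored
    credential cannot be trivially reversed if it is exposed."""
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

header = basic_auth_header("alice", "s3cret")
digest = obfuscate_password("s3cret")
```

Note that Basic authentication merely encodes, and does not encrypt, the credentials; this is why the passage requires a secure channel for transmission.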


At block 404, the system retrieves a response from the online service, subsequent to the system providing the credentials at block 403. The system may interpret and/or parse the response based on information about the online service, such as information stored in repository 105 of FIG. 1. At block 405, the system analyzes the response retrieved at block 404 to determine whether the response indicates that the credentials were accepted. Such a determination may be specific to particular online services, may depend on the nature of the content received, and/or may be based on parsing of the response data for inclusion of content indicative of whether the credentials were accepted. For example, if the response is a webpage indicating that the password was not correct, the system may determine that the credentials were not accepted.
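The response-parsing determination of block 405 can be illustrated with a short Python sketch. The rejection phrases below are hypothetical examples; an actual system would store such per-service patterns in a repository such as repository 105 of FIG. 1:

```python
# Phrases that, if present in a service's response page, suggest the
# provided credentials were rejected (illustrative examples only).
REJECTION_MARKERS = (
    "password was not correct",
    "invalid login",
    "incorrect username or password",
)

def credentials_accepted(response_body: str) -> bool:
    """Heuristically decide whether a login response indicates that the
    credentials were accepted, by scanning for known rejection phrases."""
    body = response_body.lower()
    return not any(marker in body for marker in REJECTION_MARKERS)
```

As the passage notes, the determination may be service-specific, so a deployed system would key the marker list by online service rather than use one global list.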


If the credentials are not accepted at block 405, then at block 406 the system triggers an event for processing. Such triggering of an event may invoke the performance of a process such as that shown in FIG. 2. As explained previously with respect to FIG. 3, the triggering of the event at block 406 may be performed immediately in response to the determination that the credentials were not accepted or it may be performed at a later time.


If the response indicates that the credentials were accepted at block 405 or after the event is processed at block 406, the system waits for the next monitoring cycle at block 407. The particular intervals at which the system performs the monitoring of online credentials may be specified by the user as a preference. Alternately, the system may include a default period for monitoring. In an embodiment, the system waits for a predefined action that indicates that the credentials should be tested, such as a user-initiated request or a notification from the online service being monitored.
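The wait at block 407 amounts to scheduling the next verification from the user's chosen frequency (or a default). A minimal sketch, assuming hypothetical frequency names and periods:

```python
from datetime import datetime, timedelta

# Default monitoring periods; a user preference may select among these,
# and the "monthly" approximation of 30 days is an assumption.
PERIODS = {
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
    "monthly": timedelta(days=30),
}

def next_check(last_check: datetime, frequency: str = "daily") -> datetime:
    """Compute when the credentials should next be verified."""
    return last_check + PERIODS[frequency]
```

A system honoring the predefined-action embodiment would additionally wake early on a user-initiated request or a notification from the monitored service, rather than waiting for the full interval.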



FIGS. 5A and 5B depict sample user interfaces for specifying monitoring services, as used in an embodiment. A user may use such interfaces to manage the performance of monitoring services such as those shown in FIGS. 3 and 4. Additionally, the user may use such interfaces to manage the handling of event notifications and other protective actions such as those shown in FIG. 2. In an embodiment, the system may provide these interfaces to a user computing device or other device by means of a website, a mobile phone application, a tablet device application, a telephone call system, an application programming interface or by other means of communication. In an embodiment, multiple interfaces may be provided.



FIG. 5A illustrates an embodiment of an interface for establishing email monitoring preferences. In this embodiment, the user is able to select an online service using interface element 501. The user may select an online service by typing in a name and/or URL of the online service. In an embodiment, the user may alternatively select the online service using a predefined list. In other embodiments, the user may select the online service by other means. In an embodiment, the user may be able to specify the methods of monitoring a service, thus possibly enabling the user to monitor services not already known to the system.


The user may provide options for online credentials verification using the interface elements shown in block 502. Such information may be used to control a process such as that shown in FIG. 4. The user may provide login credentials such as a user name and password using interface elements 503. Additionally, the user may provide information such as the frequency of monitoring using interface elements 504. The system may request additional information or less information depending on the particular requirements of the monitoring service provided by the system.


The user may configure email monitoring services using the interface elements included in block 505. The information provided in block 505 may be used to configure the performance of a method such as that shown in FIG. 3. For example, the user may provide an email address to be monitored using interface element 506. In an embodiment where the user wishes to have the system retrieve emails from the specified account, the user may provide login and password information or other login information to the system. In another embodiment, the system may be configured to have access to certain email services so that login credentials are not required for the email monitoring service to function.


Additionally, the user may choose to forward emails to the system and may indicate a desire to do so using interface element 507. Upon selection of this interface element, the system may provide instructions to the user as to how to forward email to the system. Additionally, the system may configure itself to receive emails and perform monitoring on those emails.


The user may configure direct monitoring of the selected online service using the interface elements shown in block 508. To enable direct monitoring, the user may select interface element 509. Selection of this interface element may cause the system to periodically query the online service for identity or personal information changes. The user may be provided with options for how frequently the monitoring is to be provided. Alternatively, selecting interface element 509 may cause the system to notify the online service of the user's interest in identity and personal information monitoring. Such a request may cause the online service, based on a prior agreement between the system and the online service, to send notifications to the system in response to the online service detecting changes (and/or attempted changes) to the user's identity or personal information. Such monitoring has the advantage that the system may only receive notifications about verified and actual information changes, rather than likely information changes detected through online monitoring, email monitoring, or other means.


In the embodiment shown in FIG. 5A, the various forms of monitoring are associated with a particular site provided using interface element 501. In other embodiments, some or all of the forms of monitoring need not be associated with a particular site. For example, a user may be able to use email monitoring, in which case the system may, upon detecting a message of interest, determine a site or online service associated with the message and perform actions based on that message. Similarly, the user may sign up for direct monitoring without specifying a particular site, and the system would process events based on any notifications relating to the user and received from online services. Such embodiments may thus relieve the user of having to manually specify every site or service to be monitored.


Where the system provides other monitoring means, the interface shown in FIG. 5A may include further sections and interface elements to receive configuration settings for those monitoring means. In embodiments that do not implement all the monitoring means shown in FIG. 5A, the interface may be adjusted accordingly.


Turning to FIG. 5B, the system may present a user interface for specifying responses to detected events. The information provided by the user in such an interface may be used to configure the performance of a method such as that shown in FIG. 2.


In an embodiment the interface includes options for various threat or urgency levels. For example, options for severe threats are shown in box 510. Options for moderate threats are shown in box 511. Other levels of threats may also be included on this interface and/or other interfaces. Additionally, in other embodiments the system may categorize threats using different terminology or different categorizations. For example, the system may categorize events as password change events, address change events, login name change events, and so on. In such a case, the interface of FIG. 5B may display boxes for types of event categories. In an embodiment, the categorizations may be account-dependent and/or user-defined.
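By way of a non-limiting illustration, the threat-level categorization and per-level preferences described above can be sketched in Python. The event category names, threat levels, and preference fields below are hypothetical, not drawn from the figures:

```python
# Hypothetical mapping of event categories to threat levels; an actual
# system might make this account-dependent or user-defined, as described.
THREAT_LEVELS = {
    "password_change": "severe",
    "login_name_change": "severe",
    "address_change": "moderate",
}

# Per-level user preferences, as might be captured by an interface
# like that of FIG. 5B (field names are assumptions).
PREFERENCES = {
    "severe": {"notify": ["email", "sms"], "confirm_first": False},
    "moderate": {"notify": ["email"], "confirm_first": True},
}

def preferences_for(event_category: str) -> dict:
    """Look up the responsive actions configured for an event's threat
    level, defaulting unknown categories to the moderate level."""
    level = THREAT_LEVELS.get(event_category, "moderate")
    return PREFERENCES[level]
```

The default-to-moderate fallback is one possible design choice; a system could equally treat unrecognized event categories as severe out of caution.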


In an embodiment, the system provides options for notification and/or actions to be taken in response to particular events. For example, interface elements 512 provide options for notification delivery options in response to a severe threat. Interface elements 513 provide options for actions to be taken in response to a severe threat. In an embodiment the system indicates that certain notifications and/or actions are recommended. In an embodiment the recommended notification and/or actions are selected by default.


Additionally, the interface provides options for when the action should be taken. For example, using element 514, the user may request that an action be taken automatically upon detection of an event. Using element 515, the user may request that the system ask the user before taking any further actions. In an embodiment, the recommended actions and/or notifications may change based on whether the user wishes to be asked before taking the action or taking the action automatically. In an embodiment, the interface may provide further controls for specifying that some actions are to be taken without user confirmation, and other actions are to be taken only after user confirmation.


The recommended actions may also depend upon the threat level of the events. As shown in boxes 510 and 511, the recommended notifications and actions may differ for moderate threats as opposed to severe threats. Thus, the system may recommend levels of notification and/or actions that are appropriate to particular threats, so that users need not select forms of notification and/or actions without prior suggestion.



FIG. 6 shows a sample notification email sent in response to an event, as used in an embodiment. The notification may be sent by any number of means such as by email, by text message, by voicemail, by telephone call, via a mobile phone application, via a portable computer application, or by other means.


The notification message includes pertinent information relating to the detected events. For example, the notification shows in subject line 601 that the event is a password change and that it is a severe threat. The notification message may also indicate the online service at which the event was detected 602 and it may provide information about responding to the event 603. Additionally, in an embodiment, the notification message may include a link to the content that triggered the event and/or a copy, snippet, summary, or other representation of the content that triggered the event, such as the email from the online service or the page returned by the attempted login.
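The assembly of such a notification can be sketched briefly in Python. The field names, subject format, and wording below are hypothetical illustrations of the message contents described for FIG. 6:

```python
def build_notification(event_type: str, threat_level: str, service: str,
                       snippet: str) -> dict:
    """Assemble the fields of a notification message like that of FIG. 6:
    a subject line naming the event and threat level (cf. subject line 601),
    the online service involved (cf. 602), and evidence plus response
    guidance in the body (cf. 603)."""
    return {
        "subject": f"{threat_level.title()} threat: {event_type} detected",
        "service": service,
        "body": (
            f"A {event_type} was detected on your {service} account.\n"
            f"Evidence: {snippet}\n"
            "If you did not initiate this change, select a protective action."
        ),
    }

msg = build_notification(
    "password change", "severe", "ExampleMail", "excerpt of triggering email"
)
```

A deployed system would render these fields into the chosen delivery channel (email, SMS, mobile application, and so on) rather than return a raw dictionary.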


In an embodiment, the notification message may provide options for the user to respond to the event. By providing these options in the notification message itself, the system provides users with the benefit of being able to respond quickly and informedly to the detected events. In an alternate embodiment, the notification message may provide a link or other mechanism by which the user may access a website, mobile phone application, or other interface for responding to the event. Such an interface may appear much like the interface described below with respect to FIG. 6.


In the embodiment shown in FIG. 6, the user may indicate using interface element 604 that the event was initiated by the user so that it may be ignored. If the user did not initiate the change, however, then the user may select interface element 605 and further select actions to be taken using interface elements 606. In an embodiment, interface elements 606 are only displayed if interface element 605 is selected to indicate that the event was not initiated by the user. In an embodiment, the elements 606 that are selected by default are based on preferences provided by the user, for example, using an interface such as that shown in FIG. 5B.


The notification email may also include an interface control 607 to enable the user to submit the information provided using elements 604, 605, and 606. Upon submission of this information to the system, the system may then undertake the appropriate actions, for example by applying block 208 as shown in FIG. 2.


Other embodiments of the notification message shown in FIG. 6 may be used, and may include different information from that shown. In an embodiment, the contents of the notification message are customized to the particular type of event detected. In an embodiment, where some actions have already been taken, the notification message may include information indicating the results of the actions taken. In an embodiment, upon the user requesting certain actions to be taken using the notification message, a further message confirming the results of those actions is sent to the user.



FIG. 7 is a sample user interface with a notification message as displayed on a mobile device, as used in an embodiment. The information provided by the interface may be similar to that shown in the notification email of FIG. 6, and in various embodiments the mobile interface may include less or additional information, or information organized in a different form, as is suitable for the particular mobile device. In other embodiments, notifications may be transmitted to a mobile device via other delivery mechanisms, such as SMS messages, browser-renderable content, standalone applications, etc., which may also allow the user to select protective actions through any of these mechanisms.


The sample interface of FIG. 7 includes information about the detected event 701, as well as options for displaying further information 702. The sample interface further includes options for actions to be taken 703, an interface element to initiate the performance of the actions 704, and an interface element to ignore the event 705. Additional controls and/or information may be included on the mobile interface. In an embodiment, the mobile interface is displayed as a series of screens, in order to reduce the amount of information shown on each screen to accommodate the smaller available display size on many mobile devices.


Example System Architecture



FIG. 8 is a block diagram illustrating one embodiment of a computing system that implements the systems and methods described herein. In the embodiment of FIG. 8, a computing device 801 is in communication with a user 802, as well as an optional third-party data source 803, via a network 804. In an embodiment, the computing device 801 receives data, such as credit data, from one or more data sources 803 and accesses the data to identify information regarding one or more entities. The computing device 801 may then perform analysis and prepare information for presentation to the user 802. The management system 101 may include the same or similar components as the computing device 801. Similarly, the computing device 801 may be used to implement any of the methods discussed herein.


The network 804 may include any communication network or combination of communication networks, such as one or more of the Internet, LANs, WANs, MANs, etc., for example. In the embodiment of FIG. 8, the computing device 801 includes a computing system having one or more computing devices (e.g., computers). The computing device 801 may include, for example, a single computing device, a computer server, a smart storage unit, or a combination of one or more computing devices and/or computer servers. Depending on the embodiment, the components illustrated in the computing device 801 may be distributed amongst multiple devices, such as via a local area or other network connection. In other embodiments the computing device 801 may include fewer and/or additional components than are illustrated in FIG. 8.


The exemplary computing device 801 may be a general purpose computer using one or more microprocessors, such as, for example, an Intel® Pentium® processor, an Intel® Pentium® II processor, an Intel® Pentium® Pro processor, an Intel® Pentium® IV processor, an Intel® Pentium® D processor, an Intel® Core™ processor, an xx86 processor, an 8051 processor, a MIPS processor, a Power PC processor, a SPARC processor, an Alpha processor, and so forth. The computer may run a variety of operating systems that perform standard operating system functions such as, for example, opening, reading, writing, and closing a file. It is recognized that other operating systems may be used, such as, for example, Microsoft® Windows® 3.X, Microsoft® Windows® 98, Microsoft® Windows® 2000, Microsoft® Windows® NT, Microsoft® Windows® CE, Microsoft® Windows® ME, Microsoft® Windows® XP, Windows® 7, Palm Pilot OS, Apple® MacOS®, Disk Operating System (DOS), UNIX, IRIX, Solaris, SunOS, FreeBSD, Linux®, or IBM® OS/2® operating systems. In other embodiments, the computing device 801 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The computing device 801 includes one or more central processing units (“CPU”) 805, which may each include one or more conventional or proprietary microprocessor(s). The computing device 801 may further include one or more memories 806, such as random access memory (“RAM”), for temporary storage of information, read only memory (“ROM”) for permanent storage of information, and/or a mass storage device 807, such as a hard drive, diskette, or optical media storage device. The memory 806 may store software code, or instructions, for execution by the processor 805 in order to cause the computing device to perform certain operations, such as gathering sensor-related data, processing the data with statistical and/or predictive models, formatting data for user devices or other presentation, transmitting data, or other operations described or used herein.


The methods described and claimed herein may be performed by any suitable computing device, such as the computing device 801. The methods may be executed on such suitable computing devices in response to execution of software instructions or other executable code read from a non-transitory tangible computer readable medium or computer storage device. A computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.


The exemplary computing device 801 may include one or more input/output (I/O) devices and interfaces 808, such as a keyboard, trackball, mouse, drawing tablet, joystick, game controller, touchscreen (e.g., capacitive or resistive touchscreen), touchpad, accelerometer, and/or printer, for example. The computing device 801 may also include one or more multimedia devices 809, such as a display device (also referred to herein as a display screen), which may also be one of the I/O devices 808 in the case of a touchscreen, for example. Display devices may include LCD, OLED, or other thin screen display surfaces, a monitor, television, projector, or any other device that visually depicts user interfaces and data to viewers. The computing device 801 may also include one or more multimedia devices, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 8, the I/O devices and interfaces 808 provide a communication interface to various external devices via the network 804. For example, the computing device 801 may be electronically coupled to the network 804 via a wired, wireless, or combination of wired and wireless, communication link(s). The network 804 may allow communication with various other computing devices and/or other electronic devices via wired or wireless communication links.


In the embodiment of FIG. 8, the computing device 801 may include an identity monitoring service module 107, an event notification module 104, and a user interface module 103. The computing device 801 may include fewer or additional modules, such as the email monitoring module 108, the credentials monitoring module 109, and/or the direct notification module 110, which are discussed above with reference to FIG. 1. Each of these modules is discussed in further detail below. In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in any programming language, such as, for example, Java, Python, Perl, Lua, C, C++, C#, Objective C, etc. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. Software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device, such as the computing device 801, for execution by the computing device. Hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are typically implemented as software modules, but may be implemented in hardware, firmware and/or software. 
Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Example Modules


In the embodiment of FIG. 8, the computing device 801 includes three modules, namely, an identity monitoring service module 107, an event notification module 104, and a user interface module 103. In this embodiment, each of the modules is shown as part of the computing device 801. However, in other embodiments, the modules may be distributed across multiple devices, and may be controlled and/or operated by multiple different entities. These modules are configured to perform methods as described throughout this specification. In various embodiments, fewer or additional modules may be included within a computing system.


The computing device 801 may be configured to acquire user data and other external data such as third-party data. The various modules may comprise software alone, hardware alone, or a combination of software and hardware. The device may be especially adapted to communicate using a variety of network or communications protocols in order to communicate with external data sources such as data repositories, network servers, online services, telecommunication services, distributed computing systems, and so on. Some of these protocols may include standard network protocols, such as HTTP, FTP, SNMP, or the like. The device may further include hardware drivers, such as USB, FireWire, Thunderbolt (Light Peak), or serial communications drivers, for example to communicate with devices in direct communication with the system.


The computing device 801 may be configured to transmit, or initiate transmission of, data such as user interfaces, data reports, application programming interface data, or the like, to requesting entities, such as external user 802, that have registered interest with the system. In one embodiment, the device provides the data in an unformatted data structure, such as in an XML, CSV, TXT, or other spreadsheet, text, or web accessible data structure. In other embodiments, the device provides information in user interfaces, such as user interfaces that are configured for rendering by a web browser, mobile device, tablet device, or other device or application, for display to users. A variety of different presentations may be provided. In some embodiments, the requesting entities may indicate presentation preferences or configurations (e.g., data formats and/or types of information), and the device may transmit data based on the indicated preferences or configurations. The presentation format may also be determined based on the type of device being used by the user.


In an embodiment, any or all of the modules 103, 104, and 107-110 are configured to act in real time. Thus, when data is received by the modules, the modules process that data as soon as practicable or necessary to provide users with timely information. In order to achieve this, specialized hardware may be used to gain efficiency, and executable code may be designed to minimize latency or computation time. In an embodiment, the modules, possibly with other modules of the system, are executed within a real-time operating system, to enhance the responsiveness of the system.


SUMMARY

Depending on the embodiment, the methods described with reference to the flowcharts and block diagrams such as FIGS. 1-4 and 8, as well as any other methods discussed herein, may include fewer or additional blocks and/or the blocks may be performed in a different order than is illustrated. Software code configured for execution on a computing device in order to perform the methods may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, hard drive, memory device or any other tangible medium. Such software code may be stored, partially or fully, on a memory of a computing device, such as the computing system 101 of FIG. 1 and/or other computing devices illustrated in the Figures, in order to perform the respective methods. For ease of explanation, the methods will be described herein as performed by the various modules, such as may be executed on the computing system 101, which should be interpreted to include any one or more of the computing devices noted above and/or any other suitable computing device.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


All of the methods and processes described above may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers. For example, the methods described herein may be performed by the computing devices described herein and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium. A tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A method of monitoring and handling potential identity theft threats, the method being performed by a monitoring computer having one or more computer processors, the method comprising: receiving a request, by the monitoring computer, from a user to monitor a third party account of the user with an online service provider, the request including personal information associated with the user and a plurality of user preferences, each user preference specifying one or more protective actions to be taken in response to detection, by the monitoring computer, of a change or attempted change to personal information associated with the account;periodically monitoring the third party account of the user for indications of changes or attempted changes to personal information associated with the account;detecting a change or attempted change to personal information associated with the account;determining a risk level associated with the detected change or attempted change to personal information associated with the account;identifying, from the user preferences, a user preference associated with the determined risk level;transmitting, via a communication channel, a notification to the user, wherein the communication channel is specified by the user preference; andinitiating one or more protective actions included in the identified user preference.
  • 2. The method of claim 1, wherein periodically monitoring the third party account of the user for indications of changes to personal information comprises periodically connecting to the online service provider associated with the third party account of the user, providing the online service provider with login credentials associated with the user, and determining whether the online service provider accepts the provided login credentials.
  • 3. The method of claim 1, wherein periodically monitoring the third party account of the user for indications of changes to personal information comprises periodically retrieving electronic messages associated with the user and analyzing content of the retrieved messages to determine whether any of the messages indicates a change to personal information.
  • 4. The method of claim 1, wherein the risk level is determined at least in part based on whether a preauthorization for the change or attempted change to personal information was received.
  • 5. The method of claim 1, wherein the one or more protective actions are initiated subsequent to receiving user approval for initiating the one or more protective actions.
  • 6. The method of claim 1, wherein the one or more protective actions are initiated without requiring user approval for initiating the one or more protective actions.
  • 7. A computing system configured to monitor and handle potential identity theft threats, comprising: a non-transitory computer-readable storage medium having stored thereon a plurality of executable software modules; one or more computer hardware processors configured to execute the plurality of software modules stored on the computer-readable storage medium; a network interface; a message monitoring module configured to access an electronic message received in an electronic mail account of a user and determine whether the electronic message indicates a change, a possible change, or an attempted change to personal information associated with an external account of the user; an event notification module configured to determine a risk level associated with the indicated change, possible change, or attempted change to personal information associated with the external account of the user and identify a user preference associated with the determined risk level in response to the message monitoring module determining that the electronic message indicates a change, a possible change, or an attempted change to personal information associated with the external account of the user, wherein the user preference specifies one or more user-customizable responsive actions to execute in response to determining that the electronic message indicates a change, a possible change, or an attempted change to personal information; and the event notification module further configured to execute at least one of the one or more user-customizable responsive actions based upon the user preference.
  • 8. The computing system of claim 7, wherein the message monitoring module is configured to retrieve the electronic message by automatically logging into one or more email accounts and gathering messages from the one or more email accounts.
  • 9. The computing system of claim 7, wherein the message monitoring module is configured to retrieve the electronic message by receiving messages sent to the computing system.
  • 10. The computing system of claim 7, wherein at least one of the user-customizable responsive actions is sending an electronic notification identifying the change or possible change to personal information.
  • 11. The computing system of claim 7, wherein the event notification module is configured to execute at least a portion of the user-customizable responsive actions only in response to receiving a user confirmation message.
  • 12. The computing system of claim 7, wherein the event notification module is further configured to determine whether the change, the possible change, or the attempted change to personal information was preauthorized, and further configured to execute different user-customizable responsive actions if the possible change to personal information was preauthorized.
  • 13. The computing system of claim 7, wherein the event notification module is configured to execute at least a portion of the user-customizable responsive actions without requiring user approval to execute the portion of the one or more user-customizable responsive actions.
  • 14. Non-transitory physical computer storage comprising computer-executable instructions stored thereon that, when executed by a hardware processor, are configured to perform operations comprising: receiving a request from a user to monitor a third party account of the user with an online service provider, the request including personal information associated with the user and a plurality of user preferences, each user preference specifying one or more protective actions to be taken in response to detection of a change or attempted change to personal information associated with the account; periodically monitoring the third party account of the user for indications of changes or attempted changes to personal information associated with the account; detecting a change or attempted change to personal information associated with the account; determining a risk level associated with the detected change or attempted change to personal information associated with the account; identifying, from the user preferences, a user preference associated with the determined risk level; transmitting, via a communication channel, a notification to the user, wherein the communication channel is specified by the user preference; and initiating one or more protective actions included in the identified user preference.
  • 15. The non-transitory physical computer storage of claim 14, wherein periodically monitoring the third party account of the user for indications of changes to personal information comprises periodically connecting to the online service provider associated with the third party account of the user, providing the online service provider with login credentials associated with the user, and determining whether the online service provider accepts the provided login credentials.
  • 16. The non-transitory physical computer storage of claim 14, wherein periodically monitoring the third party account of the user for indications of changes to personal information comprises periodically retrieving electronic messages associated with the user and analyzing content of the retrieved messages to determine whether any of the messages indicates a change to personal information.
  • 17. The non-transitory physical computer storage of claim 16, wherein the electronic messages are retrieved by automatically logging into one or more email accounts and gathering messages from the one or more email accounts.
  • 18. The non-transitory physical computer storage of claim 14, wherein the risk level is determined at least in part based on whether a preauthorization for the change or attempted change to personal information was received.
  • 19. The non-transitory physical computer storage of claim 14, wherein the one or more protective actions are initiated subsequent to receiving user approval for initiating the one or more protective actions.
  • 20. The non-transitory physical computer storage of claim 14, wherein the one or more protective actions are initiated without requiring user approval for initiating the one or more protective actions.
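For illustration only (not part of the claims), the method recited in claims 1 and 14 can be sketched as follows: determine a risk level for a detected change, identify the user preference associated with that risk level, transmit a notification over the channel the preference specifies, and initiate the preference's protective actions. All names, risk heuristics, and data shapes below are hypothetical; the claims do not prescribe any particular implementation.

```python
from dataclasses import dataclass, field
from enum import Enum


class RiskLevel(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class UserPreference:
    # Each preference pairs a risk level with a notification channel
    # and one or more protective actions (claims 1 and 14).
    risk_level: RiskLevel
    channel: str                      # e.g. "email", "sms", "phone"
    protective_actions: list = field(default_factory=list)


def handle_detected_change(change, preferences, preauthorized_ids):
    """Determine risk, identify the matching preference, notify, and act.

    `change` is a hypothetical dict like {"id": 1, "field": "password"};
    `preauthorized_ids` holds IDs of changes the user preauthorized.
    """
    # Claim 4 / claim 18: a preauthorized change lowers the assessed risk.
    if change["id"] in preauthorized_ids:
        risk = RiskLevel.LOW
    elif change["field"] in ("password", "email_address"):
        # Credential-related changes are treated as high risk (illustrative).
        risk = RiskLevel.HIGH
    else:
        risk = RiskLevel.MEDIUM

    # Identify, from the user preferences, the preference associated
    # with the determined risk level.
    pref = next(p for p in preferences if p.risk_level == risk)

    # Transmit a notification via the channel specified by the preference,
    # then initiate the protective actions the preference includes.
    notifications = [(pref.channel, f"Change detected: {change['field']}")]
    actions = list(pref.protective_actions)
    return risk, notifications, actions
```

Whether the protective actions run immediately or only after user approval (claims 5 and 6) would be a further per-preference flag in such a scheme.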
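Claims 3, 7, and 16 recite analyzing the content of retrieved electronic messages to determine whether any message indicates a change to personal information. A minimal sketch of one such analysis is shown below, using pattern matching on message bodies; the phrase list and function names are hypothetical, and the claims leave the form of the analysis open.

```python
import re

# Hypothetical phrases that commonly signal a change, possible change,
# or attempted change to personal information on an external account.
CHANGE_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (
        r"your password (?:has been|was) changed",
        r"your (?:e-?mail|mailing) address (?:has been|was) updated",
        r"security question.{0,20}changed",
    )
]


def indicates_personal_info_change(message_body: str) -> bool:
    """Return True if a retrieved message suggests a change to personal
    information associated with a monitored account (claims 3 and 16)."""
    return any(p.search(message_body) for p in CHANGE_PATTERNS)


def scan_mailbox(messages):
    """Analyze the content of retrieved messages and return those that
    indicate a change to personal information."""
    return [m for m in messages if indicates_personal_info_change(m["body"])]
```

The messages themselves could be gathered either by logging into the user's email accounts or by receiving messages sent to the monitoring system (claims 8, 9, and 17); this sketch assumes they have already been retrieved.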