Systems and methods for identifying potentially malicious messages

Abstract
Computer-implemented systems and methods for identifying illegitimate messaging activity on a system using a network of sensors.
Description
TECHNICAL FIELD

This document relates generally to electronic communications processing and more particularly to analyzing electronic communications for spoofing and other situations.


BACKGROUND AND SUMMARY

A significant number of Internet users and companies are subject to spoofing attacks wherein an attacker masquerades as another person or company. An example includes a spoofing attack known as phishing wherein an attacker tries to illegally obtain confidential information (e.g., the user's password) by sending phony e-mails or instant messages and making the user believe that the source of the communication is a legitimate company. The technique is often used to try to secure user passwords and other sensitive information such as credit card numbers, bank account information, brokerage information and generally anything that could yield a financial gain in line with fraud operations.


In accordance with the teachings provided herein, systems and methods for operation upon data processing devices are provided in order to overcome one or more of the aforementioned disadvantages or other disadvantages concerning the detection of spoofing type situations. For example, a system and method can include examining whether an electronic communication includes elements associated with a first entity's website and elements associated with a second entity's website. The examination is then used in determining whether a spoofing situation exists with respect to the received electronic communication.


As another example, a computer-implemented method and system can be provided for detecting a spoofing situation with respect to one or more electronic communications. A determination is performed as to whether an electronic communication includes a textual or graphical reference to a first entity, as well as a determination as to whether the textual or graphical reference to the first entity is associated with a link to a second entity. Spoofing is detected with respect to the received electronic communication based upon the determination of whether the textual or graphical reference is associated with the link to the second entity.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram depicting a computer-implemented system that includes a spoofed message detector to determine whether spoofing is evident with respect to one or more electronic communication messages.



FIG. 2 is a flowchart depicting operations that a message analysis system can utilize in determining the presence of spoofing.



FIG. 3 is a block diagram depicting a spoofed message detector configured to recognize a spoofed message.



FIG. 4 is a flowchart depicting an operational scenario for comparing two fingerprints of textual or graphical content.



FIG. 5 is a process flow diagram depicting application of a winnowing fingerprinting algorithm in order to detect spoofing.



FIG. 6 is a block diagram depicting a spoofed message detector configured to detect whether spoofing has occurred with respect to images.



FIG. 7 is a flowchart depicting an operational scenario for using fingerprint analysis in detecting spoofing.



FIG. 8 is a block diagram depicting a spoofed message detector configured to detect whether spoofing may have occurred with respect to communications that have direct links to a real website's images.



FIG. 9 is a flowchart depicting an operational scenario illustrating the analysis of direct links.



FIG. 10 is a block diagram depicting a spoofed message detector configured to be used with a reputation system.



FIGS. 11A and 11B are block diagrams illustrating actions that can be taken based upon the results of a spoofed message detector.



FIG. 12 is a block diagram depicting a server access architecture.



FIG. 13 is a block diagram depicting a message analysis system using an existing network of sensors.



FIG. 14 is a block diagram depicting the aggregation of threat data collected from existing sensors and external sources.





DETAILED DESCRIPTION


FIG. 1 depicts a computer-implemented system 30 that includes a spoofed message detector 32 to determine whether spoofing is evident with respect to one or more electronic messages (34, 36). As an example, the messages to be analyzed could be legitimate messages 34 from a company or could be spoofed messages 36 from an attacker posing as the company.


The legitimate messages 34 contain links to or elements from the company's website 38. The legitimate messages 34 can allow a recipient to access the company website 38 in order to perform a transaction or other activity through the company website 38. In contrast, spoofed messages 36 may contain links to or elements from the company's website 38 while also containing links to or elements from the attacker's website 40. This can result in the user being tricked into interacting with the attacker's website 40 instead of with the legitimate company's website 38.


The spoofed message detector 32 receives electronic communication (36, 34) over one or more networks 42. The spoofed message detector 32 analyzes the messages (36, 34) to determine whether spoofing may have occurred. If suspected spoofing has been detected with respect to an electronic message, then one or more actions 44 can take place with respect to the electronic communication. The actions 44 can be tailored based upon how likely the electronic communication is a spoofed message.



FIG. 2 represents operations that the message analysis system can utilize in determining the presence of spoofing. At step 100, a system can perform data collection to locate messages for analysis. For example, messages may be sent from devices that are located within one or more companies' networks. Such a device can include the IronMail message profiler device available from CipherTrust® (located in Alpharetta, Ga.).


From the data collected in step 100, step 102 determines which data is associated with which company. References to the company in the content, subject heading, and/or To/From/CC/BCC fields can be used to locate messages specific to a company. As an illustration, messages specific to Company A can be separated or otherwise indicated as being associated with Company A. Messages specific to Company B can be separated or otherwise indicated as being associated with Company B, and so forth. Other levels of granularity for separating the messages can be used, such as an organization level, an individual level, etc. In this manner, a user can direct that analysis be performed at different levels of granularity.
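The separation of collected messages by company at step 102 can be sketched as follows. This is a minimal illustration rather than any particular product's implementation; the message field names and the `company_terms` mapping of identifying terms are assumptions made for the example.

```python
import re

def group_by_company(messages, company_terms):
    """Associate each collected message with the companies it references.

    messages: list of dicts with 'subject', 'body', 'from', and 'to' fields.
    company_terms: mapping of company name -> list of identifying terms
                   found in content, subject, or address fields.
    """
    groups = {name: [] for name in company_terms}
    for msg in messages:
        # Search the content, subject heading, and address fields together.
        searchable = " ".join([
            msg.get("subject", ""),
            msg.get("body", ""),
            msg.get("from", ""),
            msg.get("to", ""),
        ]).lower()
        for name, terms in company_terms.items():
            if any(re.search(r"\b" + re.escape(t.lower()) + r"\b", searchable)
                   for t in terms):
                groups[name].append(msg)
    return groups
```

The same routine could be run with finer-grained term lists to separate messages at an organization or individual level.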


Any messages that can be determined to be legitimate at this stage can be removed from the corpus of messages to be analyzed at step 104. For example, messages can be determined to be legitimate if their senders' addresses are on an advance-authorized list of e-mail addresses held by an ISP, subscriber or other e-mail service provider. At step 104, the remaining messages are analyzed to determine whether any of them are spoofed messages; if so, one or more actions are performed at step 106 in order to address the spoofing situation.


A variety of different analysis techniques can be used to determine whether a spoofing situation has arisen at step 104, such as the approach depicted in FIG. 3. With reference to FIG. 3, the spoofed message detector 32 can be configured to recognize that a spoofed message 36 is a composite 200 of one or more elements 220 from a legitimate company website 38 as well as one or more elements 210 from a different entity's website (e.g., attacker's website 40). As an illustration of what website elements (210, 220) might be involved, the spoofed message detector 32 may detect that a message is a composite 200 because it includes content 222 from the legitimate company website 38 as well as content 212 from the attacker's website 40.


The spoofed message detector 32 can perform its composite analysis in many different ways. For example, the spoofed message detector 32 can utilize fingerprint analysis techniques 230 in order to determine whether the message is a composite 200 or not.


The spoofed message detector 32 can include or have access to a fingerprint analysis software routine or program 230 that will generate a fingerprint of the content 212 associated with a communication under analysis and generate a fingerprint of the actual content 220 used within the company website 38. A comparison of the two fingerprints generated by the fingerprint analysis program 230 is used to determine whether spoofing may have occurred. An operational scenario illustrating the use of fingerprinting analysis 230 is depicted in FIG. 4. It should also be understood that the fingerprinting analysis can be used to locate legitimate content. Such legitimate content can also be sorted for later analysis, such as, for example, trend analysis (e.g., how many times a legitimate usage is observed versus how many times a malicious usage is observed). Furthermore, it should be noted that instances of malicious usages can be stored for later use as evidence in a civil or criminal case, or used in an administrative proceeding to shut malicious sites down.


With reference to FIG. 4, a communication to be analyzed is received by the spoofed message detector at process block 250. It should be understood that in various examples, the spoofed message detector can reside within an enterprise network, or any other generic location where messaging traffic may be observed. Moreover, when the spoofed message detector resides within an enterprise network, it should be noted that the detector can examine messaging traffic regardless of the originator of the message. For example, outgoing messages from the enterprise network may be examined to ensure that employees are not misusing the company mark or attempting to commit fraud with outsiders using company machines. Similarly, incoming messages may be examined to protect employees from spoofing attacks by outsiders.


The spoofed message detector 32 identifies at process block 252 the different pieces of content referenced in the communication, such as what company-related content is being pointed to or hyperlinked in the communication. For example, a hyperlink in the communication might contain a textual description indicating that it is a link to company content but instead provide a link to content on another website (e.g., an attacker's website); this is an example of a communication faking an association with a company. The content is accessed and retrieved via the URL that is embodied in the hyperlink.


At process block 254, a fingerprint 256 is generated of the content that is actually pointed to or referenced in the communication that is under analysis. The fingerprint 256 is then made available to process block 262 which performs a comparison of fingerprint 256 with a fingerprint 260 that had been generated at process block 258. The comparison operation at process block 262 produces a matching result 264 indicative of how well the two compared fingerprints (256, 260) matched. A strong or complete match of the two fingerprints (256, 260) can provide evidence that spoofing has not occurred, while a partial match or a totally incomplete match can provide evidence that spoofing may be present.
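The comparison at process block 262 can be sketched as a set-overlap computation over the two fingerprints. This is one hypothetical way to produce a matching result 264; the thresholds separating a strong match, a partial match, and a largely incomplete match are illustrative assumptions, not values prescribed by this description.

```python
def match_score(fp_a, fp_b):
    """Jaccard overlap between two fingerprint collections (0.0 to 1.0)."""
    a, b = set(fp_a), set(fp_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def classify(score, strong=0.9, weak=0.5):
    """Map a matching result onto the categories described above.

    A strong/complete match suggests no spoofing; a partial or largely
    incomplete match provides evidence that spoofing may be present.
    The threshold values here are arbitrary examples.
    """
    if score >= strong:
        return "no spoofing evidence"
    if score >= weak:
        return "possible spoofing"
    return "likely spoofing"
```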


As described above, it should be understood that various actions can be taken responsive to detecting suspected spoofing. For example, among others, suspected spoofing attacks can be added to a brand-abuse database, whereby messaging data can be combined with existing brand protection techniques.


It should be understood that similar to the other processing flows described herein, the steps and the order of the steps in this flowchart may be altered, modified and/or augmented and still achieve the desired outcome. For example, the generation of specific company's content fingerprint at process block 258 may be done in real-time or in near-real-time, such as when it has been discovered that the communication under analysis is referencing the specific company. The company content fingerprint could also be generated before the communication has been received for analysis. Furthermore, the comparison can use one or more techniques to determine whether a link or web page matches a legitimate link or web page.


As another example of the variety of processing flows that can be performed, the analysis does not have to include fingerprinting, but different comparison techniques can be utilized, such as a character-by-character comparison of the content involved in the analysis. Moreover, in various environments, different weightings can be applied to the different comparison techniques. If fingerprinting is utilized, then it should also be understood that different types of fingerprinting algorithms can be employed, such as the winnowing fingerprint algorithm discussed in the following reference: S. Schleimer et al. “Winnowing: Local Algorithms for Document Fingerprinting” (SIGMOD 2003, Jun. 9-12, 2003, San Diego, Calif.). An example of an application of the winnowing fingerprinting algorithm is shown in FIG. 5.
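For illustration, a simplified version of the winnowing fingerprint algorithm from the Schleimer et al. reference can be sketched as follows. The k-gram length, window size, and MD5-based hashing are example choices made for this sketch, not parameters prescribed here.

```python
import hashlib

def winnow(text, k=5, w=4):
    """Winnowing document fingerprint (after Schleimer et al., SIGMOD 2003).

    Hashes every k-gram of the normalized text, then selects the minimum
    hash within each sliding window of w consecutive hashes; the selected
    hashes form the fingerprint.
    """
    # Normalize: drop whitespace and case so trivial edits do not matter.
    text = "".join(text.lower().split())
    if len(text) < k:
        return set()
    hashes = [
        int(hashlib.md5(text[i:i + k].encode()).hexdigest(), 16) % (1 << 32)
        for i in range(len(text) - k + 1)
    ]
    if len(hashes) <= w:
        return set(hashes)
    fingerprint = set()
    for i in range(len(hashes) - w + 1):
        fingerprint.add(min(hashes[i:i + w]))
    return fingerprint
```

Two fingerprints produced this way could then be compared to yield a matching result, with identical pages producing identical fingerprints and a partially copied page producing a partial overlap.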


With reference to FIG. 5, a fingerprint of a "real" (i.e., authentic) login page of a company website is shown at 300. A fingerprint of the actual content that was contained in a communication purporting to be from the company is shown at 302. While many of the prints in fingerprints 300 and 302 may match, there are a number of significant departures between the two fingerprints 300 and 302. Accordingly, comparison operation 304 will produce a matching result 306 indicating that there is evidence of spoofing.



FIG. 6 depicts a spoofed message detector 32 that has been configured to detect whether spoofing may have occurred with respect to images 310 that have been incorporated into or referenced by a message. Attackers may have downloaded images (e.g., company logos or other source-indicating images, etc.) from the company's website. Accordingly, a spoofing situation could involve a composite 200 of website elements (e.g., images) from a company's website 38 as well as from an attacker's website 40. The spoofed message detector 32 can include or have access to a fingerprint analysis software routine or program 350 that will generate a fingerprint of an image 300 associated with a communication under analysis and generate a fingerprint of the actual image 310 used within the company website 38. A comparison of the two fingerprints generated by the fingerprint analysis program 350 is used to determine whether spoofing may have occurred.


As an illustration of detecting this type of spoofing, a company's images can be fingerprinted (e.g., by applying an MD5 algorithm) and these fingerprints can be compared against those of the communication in question or of the destination phishing website. Any matches not coming from the company's IPs can be deemed strong evidence of phishing. This could force phishers to modify their images, which would result in more work for the phishers as well as increase the likelihood that people will not be fooled.
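The image check described above can be sketched as follows. The sketch assumes the exact bytes of each image are available for both the company's copy and the copy referenced by the message; the function and parameter names are illustrative.

```python
import hashlib

def image_fingerprint(image_bytes):
    """MD5 digest of raw image bytes; identical files yield identical prints."""
    return hashlib.md5(image_bytes).hexdigest()

def image_reuse_suspected(message_image, company_images, source_ip, company_ips):
    """Return True when a byte-identical company image is served from an IP
    that does not belong to the company -- strong evidence of phishing
    under the approach described above."""
    known = {image_fingerprint(img) for img in company_images}
    return (image_fingerprint(message_image) in known
            and source_ip not in company_ips)
```

Note that an exact-digest approach only catches unmodified copies; as stated above, this pressures phishers into altering the images, which itself raises the cost of an attack.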


An operational scenario illustrating the use of fingerprint analysis 350 is depicted in FIG. 7. With reference to FIG. 7, an image is received at 400 that is associated with a communication to be analyzed. At step 402, the image from the company website is obtained. This image could have been obtained before or after the communication to be analyzed has been received.


At step 404, a fingerprint 406 of the image to be analyzed is generated. Correspondingly, at step 408, a fingerprint 410 of the company's image is generated. It is noted that the fingerprint 410 of the company's image could be generated before or after the communication to be analyzed is received.


The fingerprints 406 and 410 are then made available to process block 412 which performs a comparison of the fingerprints 406 and 410. The comparison operation at process block 412 produces a matching result 414 indicative of how well the two fingerprints (406, 410) matched. A strong or complete match of the two fingerprints (406, 410) can provide evidence that spoofing has not occurred, while a partial match or a totally incomplete match can provide evidence that spoofing may be present.



FIG. 8 depicts a spoofed message detector 32 that has been configured to detect whether spoofing may have occurred with respect to communications that have direct links 450 to the real website's images 310. An inventory of all the URLs belonging to a company can be performed periodically to reflect changes to a company's URLs. This inventory could be cross-referenced with a list of URLs permitted for real company communications. The inventory could also be cross-referenced with a fraud database in case any of the URLs that appear are not listed as officially belonging to the company. Any message that uses a mixture of real company URLs and fake URLs could be detected. Not only could this detect phishing but also trademark and other violations. If a phisher stops using valid company URLs, then message filters will be able to identify illegitimate mail, which would push phishers out into the open.


An operational scenario illustrating the analysis of direct links 450 is depicted in FIG. 9. With reference to FIG. 9, a communication is received at 500 that is to be analyzed in order to determine whether a spoofing situation (e.g., phishing) is present. At step 502, a list is generated of which company URLs are present in the communication. Either before or after the communication to be analyzed was received, process block 504 receives which URLs are allowed to be used for a company communication. Process block 506 does a comparison between the corpus of URLs obtained in process block 502 with the corpus of URLs obtained in process block 504. The comparison result 508 is indicative of whether spoofing has occurred.
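The URL comparison of process blocks 502 through 506 can be sketched as follows. Matching on hostnames rather than full URLs is one illustrative design choice; the helper names are assumptions made for the example.

```python
from urllib.parse import urlparse

def extract_hosts(urls):
    """Collect the hostname component of each URL."""
    return {urlparse(u).hostname for u in urls}

def check_urls(message_urls, allowed_urls):
    """Compare URLs found in a message against the permitted company list.

    Returns the set of message URLs whose hosts are not on the permitted
    list; a non-empty result in a message that also uses real company URLs
    suggests the mixture of real and fake URLs described above.
    """
    allowed_hosts = extract_hosts(allowed_urls)
    return {u for u in message_urls
            if urlparse(u).hostname not in allowed_hosts}
```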



FIG. 10 depicts a spoofed message detector 32 that has been configured to be used with a reputation system 550. A reputation system 550 keeps track of whether a communication sender engages in good behavior (such as sending legitimate messages 34) or bad behavior (such as sending spam, malicious code, or spoofed messages 36). By tracking sender behavior over time, a database of sender reputation can grow and be refined.


Many different types of reputation systems can be used with the spoofed message detector 32. An example includes the reputation systems and methods disclosed in the commonly assigned U.S. patent application entitled "Systems and Methods for Classification of Messaging Entities" (Ser. No. 11/142,943; filed Jun. 2, 2005). As another example, the spoofed message detector 32 can be used with a system, such as the TrustedSource software system provided by the assignee of this application. The TrustedSource software system receives and analyzes billions of messages per month from CipherTrust's network of more than 4000 IronMail Gateway appliances deployed globally. TrustedSource assigns a reputation score and further classifies senders as good, bad or suspicious based on an in-depth analysis, processing more than a dozen behavior attributes to profile each sender. As an illustration, TrustedSource combines traffic data, whitelists, blacklists and network characteristics with data from CipherTrust's global customer base.


The results of whether a message is a spoofed message can be provided to such reputation systems as part of its determination of what reputation should be ascribed to a particular sender. As an illustration, the determination by the spoofed message detector 32 (through one or more of the techniques disclosed herein) that a sender is sending spoofed messages can be used by a reputation system 550 to adversely affect the reputation of the sender.


As other examples of how the results of a spoofed message detector 32 can be used, FIG. 11A illustrates that an action 44 that can be taken based upon the results of the spoofed message detector 32 is to shut down the attacker's website 40 as indicated at 600. The shutdown can be accomplished in a variety of ways, such as informing the Internet Service Provider (ISP) that the attacker's website 40 is associated with improper behavior (i.e., spoofing activities). Other ways could include a more automated approach to shutting down the attacker's website.



FIG. 11B illustrates that an action 44 could include notifications/alerts 650 being sent to the company 660 associated with the website 38. The company 660 is thereby made aware of the illegitimate use of its identity and can decide what additional actions need to be taken. Additional actions could include pursuing legal action against the attacker, notifying persons (e.g., customers) to be aware of the phishing activity, etc.


While examples have been used to disclose the invention, including the best mode, and also to enable any person skilled in the art to make and use the invention, the patentable scope of the invention is defined by claims, and may include other examples that occur to those skilled in the art. For example, in addition to or in place of the other spoof message detection approaches discussed herein, a spoof message detector can be configured to determine whether a target/href mismatch has occurred in a communication under analysis. For example, a communication may indicate as its target http://www.ebay.com when it is really linking to http://215.32.44.3-ebay.com. Such a mismatch indicates that spoofing has occurred. This could be used in place of or to supplement the spoofing determinations performed by the other approaches discussed herein.
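A target/href mismatch check of the kind described can be sketched with a small HTML parser. This is one possible heuristic (comparing the host shown in the anchor text against the host in the href attribute), not a definitive implementation; the class name is illustrative.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkMismatchDetector(HTMLParser):
    """Flag anchors whose visible text looks like a URL for one host
    while the href actually points at a different host."""

    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.mismatches = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            shown = "".join(self._text).strip()
            # Only compare when the anchor text itself resembles a URL.
            if shown.startswith(("http://", "https://", "www.")):
                shown_host = urlparse(
                    shown if "://" in shown else "http://" + shown).hostname
                real_host = urlparse(self._href).hostname
                if shown_host and real_host and shown_host != real_host:
                    self.mismatches.append((shown, self._href))
            self._href = None
```

Applied to the example above, an anchor displaying http://www.ebay.com while linking to http://215.32.44.3-ebay.com would be recorded as a mismatch.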


The systems and methods disclosed herein may be implemented on various types of computer architectures, such as for example on different types of networked environments. As an illustration, FIG. 12 depicts a server access architecture within which the disclosed systems and methods may be used (e.g., as shown at 30 in FIG. 12). The architecture in this example includes a corporation's local network 790 and a variety of computer systems residing within the local network 790. These systems can include application servers 720 such as Web servers and e-mail servers, user workstations running local clients 730 such as e-mail readers and Web browsers, and data storage devices 710 such as databases and network connected disks. These systems communicate with each other via a local communication network such as Ethernet 750. Firewall system 740 resides between the local communication network and Internet 760. Connected to the Internet 760 are a host of external servers 770 and external clients 780.


Local clients 730 can access application servers 720 and shared data storage 710 via the local communication network. External clients 780 can access external application servers 770 via the Internet 760. In instances where a local server 720 or a local client 730 requires access to an external server 770 or where an external client 780 or an external server 770 requires access to a local server 720, electronic communications in the appropriate protocol for a given application server flow through “always open” ports of firewall system 740.


A system 30 as disclosed herein may be located in a hardware device or on one or more servers connected to the local communication network, such as the Ethernet 750 and/or the Internet 760, and logically interposed between the firewall system 740 and the local servers 720 and clients 730. Application-related electronic communications attempting to enter or leave the local communications network through the firewall system 740 are routed to the system 30.


System 30 could be used to handle many different types of e-mail and the variety of protocols used for e-mail transmission, delivery and processing, including SMTP and POP3. These protocols refer, respectively, to standards for communicating e-mail messages between servers and for server-client communication related to e-mail messages. These protocols are defined in particular RFCs (Requests for Comments) promulgated by the IETF (Internet Engineering Task Force). The SMTP protocol is defined in RFC 821, and POP3 is defined in RFC 1939.


Since the inception of these standards, various needs have evolved in the field of e-mail leading to the development of further standards including enhancements or additional protocols. For instance, various enhancements have evolved to the SMTP standards leading to the evolution of extended SMTP. Examples of extensions may be seen in (1) RFC 1869 that defines a framework for extending the SMTP service by defining a means whereby a server SMTP can inform a client SMTP as to the service extensions it supports and in (2) RFC 1891 that defines an extension to the SMTP service, which allows an SMTP client to specify (a) that delivery status notifications (DSNs) should be generated under certain conditions, (b) whether such notifications should return the contents of the message, and (c) additional information, to be returned with a DSN, that allows the sender to identify both the recipient(s) for which the DSN was issued, and the transaction in which the original message was sent. In addition, the IMAP protocol has evolved as an alternative to POP3 that supports more advanced interactions between e-mail servers and clients. This protocol is described in RFC 2060.


Other communication mechanisms are also widely used over networks. These communication mechanisms include, but are not limited to, Voice Over IP (VoIP) and Instant Messaging. VoIP is used in IP telephony to provide a set of facilities for managing the delivery of voice information using the Internet Protocol (IP). Instant Messaging is a type of communication in which a client connects to an instant messaging service that delivers communications (e.g., conversations) in real time.



FIG. 13 illustrates that some systems 30 of this disclosure operate using an existing network of sensors 800. In this example the sensors 800 are IronMail servers, publicly available from CipherTrust®, of Alpharetta, Ga. These sensors review mail traveling through associated network elements, such as mail transfer agents. In the depicted scenario, a user 805 creates a message and passes the message to an electronic mail server 810. A network 815a passes the message to a mail transfer agent, which is associated with a sensor 800. The sensors 800 collect statistics related to the messages reviewed and store them in a database 820. The mail transfer agent forwards the mail to a recipient system 825 associated with a recipient of the message via a network 815b. It should be understood that the networks discussed herein can be the same network or different subparts of the same network, although this disclosure is not limited to such an environment.


System 30 can examine the data stored by the sensor(s) 800 as described above. The system 30 can also make the data available to a client 835 (e.g., a web browser, an e-mail client, an SMS message, etc.) via a network 815c. In various examples, the client 835 can receive and/or retrieve information about potential spoofing activity. In the web-based example, a user could enter an IP address or domain name to observe the traffic associated with a system. In other examples, the detection system can send a message to a user or domain administrator, for example, via an ISP. Information can also be gathered from off-network areas, purchased from other companies and used for comparison and alert purposes within the system.


It should be further noted that the sensors 800 can gather information that would be useful to a company in determining whether anyone inside the company is transmitting illegitimate messaging traffic. Similarly, traffic patterns collected by the sensors 800 can be used to determine whether there is concerted activity on the part of many computers associated with a domain or IP address. Such situations are evidence that a computer or network is infected with a virus, worm or spy-ware causing the computer or network to operate as a zombie client, thereby showing large increases in messaging traffic originating from a domain or IP address. Correlation of large amounts of messaging traffic thus indicates zombie activity and helps administrators identify compromised machines. Moreover, it can alert a reputation system to discount the messages sent by a domain or IP address during the period the system is influenced by a zombie, a worm, or a virus, except where the problem persists (e.g., where the problem is ignored). An example of such a system is RADAR™, publicly available from CipherTrust®, which includes a customizable interface enabling users to configure notifications. CipherTrust also makes this information available via the web at: www.trustedsource.org. RADAR also includes a customizable interface to view messages and instances (indicated by URLs embedded in spoofed messages as well as URLs obtained from sources outside of the network of sensors) that indicate brand abuse (name, domain, or website). Furthermore, the customizable interface can be configured in some examples to sort by one or more parameters such as, for example: sender, content, brand, time, or location (corporate or geographic), among many others. Moreover, in some examples, data can be displayed in graphs, charts, and/or tables, which enable the user to drill down to see different parts of the data (e.g., the e-mail header and/or the entire message and content).
Data from a graphical user interface (GUI) display can also be packaged for delivery (once or at regular intervals) in a file (which can be stored in any format including, for example, a text file, a CSV file, a binary file, etc.). In various examples, views can be customized by user type or vertical type (e.g., an ISP view, a Law Enforcement view, or a Banking view).
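The correlation of traffic spikes with possible zombie activity can be sketched as a simple baseline comparison. The spike factor and baseline window below are illustrative assumptions, and a deployed system would presumably weigh many more signals than raw daily volume.

```python
def flag_traffic_spikes(daily_counts, factor=10, baseline_days=7):
    """Flag senders whose latest daily message volume greatly exceeds
    their recent baseline, a pattern consistent with zombie activity.

    daily_counts: mapping of sender (domain or IP) -> list of daily message
    counts, oldest first; the last entry is the most recent day.
    """
    flagged = set()
    for sender, counts in daily_counts.items():
        if len(counts) < baseline_days + 1:
            continue  # not enough history to establish a baseline
        baseline = sum(counts[-(baseline_days + 1):-1]) / baseline_days
        if baseline > 0 and counts[-1] >= factor * baseline:
            flagged.add(sender)
    return flagged
```

A reputation system could then discount messages from flagged senders for the duration of the spike, as described above.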



FIG. 14 illustrates an architecture 900 for aggregating data from a plurality of sensors 800a-c and external data received from other types of data collection systems, such as data at rest. Data at rest can include, for example, among many others, the data stored on a domain name server or on a web server. It should be understood that each of the sensors 800a-c can include a local data store 820a-c, respectively, in which the sensor can store collected information. This data can be shared with system 30 via network(s) 815.


It should be understood that the stored data from the sensors 800a-c can be automatically sent to system 30, periodically, in times of low traffic or processor usage, or based upon some other triggering mechanism. Alternatively, the stored data from the sensors 800a-c can be automatically retrieved by the system 30, periodically, in times of low traffic or processor usage, or based upon some other triggering mechanism.


Additionally, system 30 can collect external data 905a-b, such as web data, domain name data, or other data at rest via the network(s) 815. The external data 905a-b can be collected by systems outside of the network of sensors. The external data 905a-b can be aggregated with the stored data received from the network of sensors 800a-c, as shown by aggregation block 910. The aggregated data can be sorted and/or analyzed as shown by block 920. The sorted and/or analyzed data can then be shared via the network(s) 815 using data server 930.


It should be understood that the data server can be used to provide the analyzed data to customers and other users via the world wide web, for example. Moreover, it should be noted that the sensors 800a-c can be configured to periodically retrieve the analyzed data from system 30, in order to operate on communication data using the latest threat and/or classification information.


It is further noted that the systems and methods may include data signals conveyed via networks (e.g., local area network, wide area network, internet, etc.), fiber optic medium, carrier waves, wireless networks, etc. for communication with one or more data processing devices. The data signals can carry any or all of the data disclosed herein that is provided to or from a device.


Additionally, the methods and systems described herein may be implemented on many different types of processing devices by program code comprising program instructions that are executable by the device processing subsystem. The software program instructions may include source code, object code, machine code, or any other stored data that is operable to cause a processing system to perform methods described herein. Other implementations may also be used, however, such as firmware or even appropriately designed hardware configured to carry out the methods and systems described herein.


The systems' and methods' data (e.g., associations, mappings, etc.) may be stored and implemented in one or more different types of computer-implemented ways, such as different types of storage devices and programming constructs (e.g., data stores, RAM, ROM, Flash memory, flat files, databases, programming data structures, programming variables, IF-THEN (or similar type) statement constructs, etc.). It is noted that data structures describe formats for use in organizing and storing data in databases, programs, memory, or other computer-readable media for use by a computer program.


The systems and methods may be provided on many different types of computer-readable media including computer storage mechanisms (e.g., CD-ROM, diskette, RAM, flash memory, computer's hard drive, etc.) that contain instructions for use in execution by a processor to perform the methods' operations and implement the systems described herein.


The computer components, software modules, functions, data stores and data structures described herein may be connected directly or indirectly to each other in order to allow the flow of data needed for their operations. It is also noted that a module or processor includes but is not limited to a unit of code that performs a software operation, and can be implemented for example as a subroutine unit of code, or as a software function unit of code, or as an object (as in an object-oriented paradigm), or as an applet, or in a computer script language, or as another type of computer code. The software components and/or functionality may be located on a single computer or distributed across multiple computers depending upon the situation at hand.


It should be understood that as used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise. Finally, as used in the description herein and throughout the claims that follow, the meanings of “and” and “or” include both the conjunctive and disjunctive and may be used interchangeably unless the context expressly dictates otherwise; the phrase “exclusive or” may be used to indicate a situation where only the disjunctive meaning may apply.

Claims
  • 1. A computer-implemented method for detecting a spoofing situation with respect to one or more electronic communications, the method comprising: receiving an electronic communication through a network interface addressed to a recipient; storing the electronic communication in computer memory; determining, by one or more processors and prior to the communication being provided to the recipient, that the electronic communication includes a link associated with a description of a first entity and that the link links to first content represented as particular content of the first entity, wherein the first content includes a first set of elements; identifying a legitimate version of the particular content including a second set of elements, wherein identifying the legitimate version includes identifying a first fingerprint of one or more elements from the legitimate version; generating a second fingerprint, wherein the second fingerprint is a fingerprint of one or more elements from the first content; determining a degree of match between the first and second sets of elements based at least in part on a comparison of the second fingerprint with the first fingerprint, wherein determining the degree of match includes determining whether one or more elements of the first set of elements originate from a second entity different from the first entity; detecting, by the one or more processors, prior to the communication being provided to the recipient that a spoofing situation exists with respect to the received electronic communication based upon the determined degree of match; and in response to detecting that a spoofing situation exists, blocking the communication from being provided to the recipient.
  • 2. The method of claim 1, wherein the spoofing situation is a phishing situation wherein the link to the linked entity is a hyperlink to a website operated by the linked entity.
  • 3. The method of claim 2, wherein the linked entity is an attacker whose website, to which the hyperlink links, is configured for feigning association with the first entity and for acquiring confidential information from a user for illegitimate gain.
  • 4. The method of claim 1, wherein the first and second sets of elements include graphical elements.
  • 5. The method of claim 1, wherein generating the fingerprints includes use of a winnowing fingerprint approach.
  • 6. The method of claim 1, wherein the first and second sets of elements include image elements.
  • 7. The method of claim 1, wherein the degree of match includes a match selected from the group consisting of a complete match, strong match, partial match, totally incomplete match, and combinations thereof.
  • 8. The method of claim 1, further comprising: storing a number of instances of legitimate and illegitimate usages based upon whether the fingerprint of the first content and the fingerprint of the legitimate version of the particular content match; and displaying statistics comparing the number of instances of legitimate usage versus the number of instances of illegitimate usages.
  • 9. The method of claim 1, wherein results of said detecting step are provided to a reputation system; wherein the reputation system uses the provided results as part of its determination of what reputation should be ascribed to a sender of the electronic communication.
  • 10. The method of claim 1, wherein results of said detecting step are provided to a fraud database for correlation and aggregation.
  • 11. The method of claim 1, wherein an action is performed in response to results of said detecting step; and wherein the action includes shutting down a website associated with the second entity.
  • 12. The method of claim 1, wherein a notification is provided to the first entity of results of said detecting step.
  • 13. The method of claim 1, further comprising: determining whether a mismatch has occurred with an href attribute in the received electronic communication; and detecting whether a spoofing situation exists with respect to the received electronic communication based upon the determination with respect to the href attribute mismatch.
  • 14. The method of claim 1, wherein the electronic communication is a communication selected from the group consisting of an e-mail message, an instant message, an SMS communication, a VOIP communication, a WAP communication, and combinations thereof.
  • 15. The method of claim 1, further comprising: responsive to detecting a spoofing situation exists, performing at least one of the steps comprising changing the reputation of the sender of the communication.
  • 16. The method of claim 1, wherein the step of detecting further comprises: determining a reputation associated with a URL included in the communication; determining whether the age of the domain used in the URL is greater than a threshold; determining whether the owner of the domain/IP hosting a URL included in the communication matches the owner of an IP address associated with the communication; and determining whether an owner of a phone number associated with the communication matches a database of known spoofing phone numbers.
  • 17. The method of claim 1, wherein the first and second sets of elements include at least one of anchor text or an image.
  • 18. A method of detecting illegitimate traffic originating from a domain, the method comprising: deploying a plurality of sensor devices at a plurality of associated nodes on the Internet; gathering messaging information from the plurality of sensor devices, wherein the messaging information describes messages originating from a set of domains including the domain; correlating a portion of the gathered messaging information for the domain; determining from the correlation whether a probable security condition exists with regard to the domain, wherein the determining the probable security condition comprises: comparing legitimate content of the domain with content contained in the gathered messaging information to identify that a volume of messages described in the received messaging information includes content that does not match the legitimate content, wherein the comparing includes: identifying a first fingerprint of one or more elements from the legitimate content; generating a second fingerprint, wherein the second fingerprint is a fingerprint of one or more elements from the content contained in the gathered messaging information; and comparing the second fingerprint with the first fingerprint; signaling the probable security condition based at least in part on identifying that the volume of messages includes content that does not match the legitimate content; and alerting an owner or an internet service provider associated with the domain of the probable security condition with regard to the domain.
  • 19. The method of claim 18, wherein the determining step comprises: comparing a list of company URLs contained in the gathered messaging information with an inventory of permitted URLs based upon one or more IP addresses associated with a particular entity associated with the domain; and if the list of company URLs contained in the received messaging information does not match the inventory of permitted URLs, signaling the probable security condition.
  • 20. The method of claim 18, wherein the determining step comprises: comparing message traffic levels of multiple machines associated with the same domain, the messaging traffic levels being based on a messaging traffic level of messages from the domain; and if the message traffic levels of multiple machines associated with the same domain display similar peak or similarly sporadic traffic levels during similar time periods, signaling the probable security condition.
  • 21. The method of claim 18, wherein the sensor devices collect information about messaging traffic which travels across the associated nodes without regard to the origin or destination of the messaging traffic.
  • 22. The method of claim 21, wherein the sensor devices collect information about all messaging traffic which travels across the associated nodes without regard to a protocol associated with the messaging traffic.
  • 23. A method of detecting illegitimate traffic originating from a domain, the method comprising: deploying a plurality of sensor devices at a plurality of associated nodes on the Internet; gathering messaging information from the plurality of sensor devices, wherein the messaging information describes messages originating from a set of IP addresses including a particular IP address; correlating a portion of the gathered messaging information for the particular IP address; identifying a particular entity associated with the particular IP address; determining from the correlation whether a probable security condition exists with regard to the particular IP address, wherein the determining step comprises: comparing legitimate content of the particular entity with content contained in the gathered messaging information to identify that a volume of messages described in the received messaging information includes content that does not match the legitimate content, wherein the comparing includes: identifying a first fingerprint of one or more elements from the legitimate content; generating a second fingerprint, wherein the second fingerprint is a fingerprint of one or more elements from the content contained in the gathered messaging information; and comparing the second fingerprint with the first fingerprint; signaling the probable security condition based at least in part on identifying that the volume of messages includes content that does not match the legitimate content; and alerting an owner associated with the particular IP address or an internet service provider associated with the IP address of the probable security condition with regard to the particular IP address.
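The fingerprint comparison recited in the claims can be illustrated with a minimal sketch of the winnowing fingerprint approach referenced in claim 5, paired with one possible way to derive a degree of match. This is not the patent's implementation; Python's built-in `hash` stands in for the rolling hash a production system would use (and is only stable within a single process).

```python
def winnow(text, k=5, w=4):
    """Winnowing sketch: hash every k-gram, then keep the minimum hash in
    each sliding window of w hashes (rightmost minimum on ties). The set
    of selected hashes is the document's fingerprint."""
    hashes = [hash(text[i:i + k]) for i in range(len(text) - k + 1)]
    fingerprint = set()
    for i in range(len(hashes) - w + 1):
        window = hashes[i:i + w]
        # Index of the rightmost minimal hash in the window.
        j = max(range(w), key=lambda n: (window[n] == min(window), n))
        fingerprint.add(window[j])
    return fingerprint

def degree_of_match(fp_a, fp_b):
    """Jaccard overlap between two fingerprints, as one illustrative way
    to compute the 'degree of match' the claims describe."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)
```

Comparing the fingerprint of linked content against the fingerprint of the legitimate version then yields a score that could be thresholded into the complete/strong/partial/incomplete categories of claim 7.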
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation in part and claims priority to and the benefit of U.S. application Ser. No. 11/173,941, entitled, “MESSAGE PROFILING SYSTEMS AND METHODS,” filed on Jul. 1, 2005, which is a continuation in part of, and claims priority to and benefit of U.S. application Ser. No. 11/142,943, entitled “SYSTEMS AND METHODS FOR CLASSIFICATION OF MESSAGING ENTITIES,” filed on Jun. 2, 2005, both of which claim priority to and the benefit of U.S. Provisional Application Ser. No. 60/625,507, entitled “Classification of Messaging Entities,” filed on Nov. 5, 2004, all of which are incorporated herein by reference. This application is also a continuation-in-part of and claims priority to and the benefit of commonly assigned U.S. patent application Ser. No. 11/383,347, filed May 15, 2006, entitled “CONTENT-BASED POLICY COMPLIANCE SYSTEMS AND METHODS,” which claims priority to U.S. Provisional Application Ser. No. 60/736,121, filed Nov. 10, 2005, both of which are incorporated herein by reference. This application is a continuation in part of and claims priority to and the benefit of commonly assigned U.S. patent application Ser. No. 11/218,689, entitled “SYSTEMS AND METHODS FOR ADAPTIVE MESSAGE INTERROGATION THROUGH MULTIPLE QUEUES,” filed Sep. 2, 2005 now U.S. Pat. No. 7,089,590, which is a continuation of U.S. patent application Ser. No. 10/093,553, entitled “SYSTEMS AND METHODS FOR ADAPTIVE MESSAGE INTERROGATION THROUGH MULTIPLE QUEUES,” filed on Mar. 8, 2002, now U.S. Pat. No. 6,941,467, both of which are incorporated herein by reference. This application is also a continuation in part and claims priority to and the benefit of commonly assigned U.S. patent application Ser. No. 10/094,211 now U.S. Pat. No. 
7,458,098, entitled “SYSTEMS AND METHODS FOR ENHANCING ELECTRONIC COMMUNICATION SECURITY,” and U.S. patent application Ser. No. 10/094,266 now U.S. Pat. No. 7,124,438, entitled “SYSTEMS AND METHODS FOR ANOMALY DETECTION IN PATTERNS OF MONITORED COMMUNICATIONS,” both of which were filed on Mar. 8, 2002 and are hereby incorporated by reference in their entirety. This application is also a continuation in part of and claims priority to and the benefit of commonly assigned U.S. patent application Ser. No. 10/361,091, filed Feb. 7, 2003 now U.S. Pat. No. 7,096,498, entitled “SYSTEMS AND METHODS FOR MESSAGE THREAT MANAGEMENT,” U.S. patent application Ser. No. 10/373,325, filed Feb. 24, 2003 now U.S. Pat. No. 7,213,260, entitled “SYSTEMS AND METHODS FOR UPSTREAM THREAT PUSHBACK,” U.S. patent application Ser. No. 10/361,067, filed Feb. 7, 2003 now abandoned, entitled “SYSTEMS AND METHODS FOR AUTOMATED WHITELISTING IN MONITORED COMMUNICATIONS,” and U.S. patent application Ser. No. 10/384,924, filed Mar. 6, 2003, entitled “SYSTEMS AND METHODS FOR SECURE COMMUNICATION DELIVERY.” The entire disclosure of all of these applications is incorporated herein by reference. This application is also related to co-pending U.S. patent application Ser. No. 11/423,329 entitled “METHODS AND SYSTEMS FOR EXPOSING MESSAGING REPUTATION TO AN END USER,” and U.S. patent application Ser. No. 11/423,308 entitled “SYSTEMS AND METHODS FOR GRAPHICALLY DISPLAYING MESSAGING TRAFFIC,” both filed on Jun. 9, 2006. The entire disclosure of each of these applications is incorporated herein by reference.

US Referenced Citations (646)
Number Name Date Kind
4289930 Connolly et al. Sep 1981 A
4384325 Slechta et al. May 1983 A
4386416 Giltner et al. May 1983 A
4532588 Foster Jul 1985 A
4713780 Schultz et al. Dec 1987 A
4754428 Schultz et al. Jun 1988 A
4837798 Cohen et al. Jun 1989 A
4853961 Pastor Aug 1989 A
4864573 Horsten Sep 1989 A
4951196 Jackson Aug 1990 A
4975950 Lentz Dec 1990 A
4979210 Nagata et al. Dec 1990 A
5008814 Mathur Apr 1991 A
5020059 Gorin et al. May 1991 A
5051886 Kawaguchi et al. Sep 1991 A
5054096 Beizer Oct 1991 A
5105184 Pirani et al. Apr 1992 A
5119465 Jack et al. Jun 1992 A
5136690 Becker et al. Aug 1992 A
5144557 Wang Sep 1992 A
5144659 Jones Sep 1992 A
5144660 Rose Sep 1992 A
5167011 Priest Nov 1992 A
5210824 Putz et al. May 1993 A
5210825 Kavaler May 1993 A
5235642 Wobber et al. Aug 1993 A
5239466 Morgan et al. Aug 1993 A
5247661 Hager et al. Sep 1993 A
5276869 Forrest et al. Jan 1994 A
5278901 Shieh et al. Jan 1994 A
5283887 Zachery Feb 1994 A
5293250 Okumura et al. Mar 1994 A
5313521 Torii et al. May 1994 A
5319776 Hile et al. Jun 1994 A
5355472 Lewis Oct 1994 A
5367621 Cohen et al. Nov 1994 A
5377354 Scannell et al. Dec 1994 A
5379340 Overend et al. Jan 1995 A
5379374 Ishizaki et al. Jan 1995 A
5384848 Kikuchi Jan 1995 A
5404231 Bloomfield Apr 1995 A
5406557 Baudoin Apr 1995 A
5414833 Hershey et al. May 1995 A
5416842 Aziz May 1995 A
5418908 Keller et al. May 1995 A
5424724 Williams et al. Jun 1995 A
5479411 Klein Dec 1995 A
5481312 Cash et al. Jan 1996 A
5483466 Kawahara et al. Jan 1996 A
5485409 Gupta et al. Jan 1996 A
5495610 Shing et al. Feb 1996 A
5509074 Choudhury et al. Apr 1996 A
5511122 Atkinson Apr 1996 A
5513126 Harkins et al. Apr 1996 A
5513323 Williams et al. Apr 1996 A
5530852 Meske, Jr. et al. Jun 1996 A
5535276 Ganesan Jul 1996 A
5541993 Fan et al. Jul 1996 A
5544320 Konrad Aug 1996 A
5550984 Gelb Aug 1996 A
5550994 Tashiro et al. Aug 1996 A
5557742 Smaha et al. Sep 1996 A
5572643 Judson Nov 1996 A
5577209 Boyle et al. Nov 1996 A
5586254 Kondo et al. Dec 1996 A
5602918 Chen et al. Feb 1997 A
5606668 Shwed Feb 1997 A
5608819 Ikeuchi Mar 1997 A
5608874 Ogawa et al. Mar 1997 A
5619648 Canale et al. Apr 1997 A
5621889 Lermuzeaux et al. Apr 1997 A
5632011 Landfield et al. May 1997 A
5638487 Chigier Jun 1997 A
5644404 Hashimoto et al. Jul 1997 A
5657461 Harkins et al. Aug 1997 A
5673322 Pepe et al. Sep 1997 A
5675507 Bobo, II Oct 1997 A
5675733 Williams Oct 1997 A
5677955 Doggett et al. Oct 1997 A
5694616 Johnson et al. Dec 1997 A
5696822 Nachenberg Dec 1997 A
5706442 Anderson et al. Jan 1998 A
5708780 Levergood et al. Jan 1998 A
5708826 Ikeda et al. Jan 1998 A
5710883 Hong et al. Jan 1998 A
5727156 Herr-Hoyman et al. Mar 1998 A
5740231 Cohn et al. Apr 1998 A
5742759 Nessett et al. Apr 1998 A
5742769 Lee et al. Apr 1998 A
5745574 Muftic Apr 1998 A
5751956 Kirsch May 1998 A
5758343 Vigil et al. May 1998 A
5764906 Edelstein et al. Jun 1998 A
5768528 Stumm Jun 1998 A
5768552 Jacoby Jun 1998 A
5771348 Kubatzki et al. Jun 1998 A
5778372 Cordell et al. Jul 1998 A
5781857 Hwang et al. Jul 1998 A
5781901 Kuzma Jul 1998 A
5790789 Suarez Aug 1998 A
5790790 Smith et al. Aug 1998 A
5790793 Higley Aug 1998 A
5793763 Mayes et al. Aug 1998 A
5793972 Shane Aug 1998 A
5796942 Esbensen Aug 1998 A
5796948 Cohen Aug 1998 A
5801700 Ferguson Sep 1998 A
5805719 Pare, Jr. et al. Sep 1998 A
5812398 Nielsen Sep 1998 A
5812776 Gifford Sep 1998 A
5822526 Waskiewicz Oct 1998 A
5822527 Post Oct 1998 A
5826013 Nachenberg Oct 1998 A
5826014 Coley et al. Oct 1998 A
5826022 Nielsen Oct 1998 A
5826029 Gore et al. Oct 1998 A
5835087 Herz et al. Nov 1998 A
5845084 Cordell et al. Dec 1998 A
5850442 Muftic Dec 1998 A
5855020 Kirsch Dec 1998 A
5860068 Cook Jan 1999 A
5862325 Reed et al. Jan 1999 A
5864852 Luotonen Jan 1999 A
5878230 Weber et al. Mar 1999 A
5884033 Duvall et al. Mar 1999 A
5892825 Mages et al. Apr 1999 A
5893114 Hashimoto et al. Apr 1999 A
5896499 McKelvey Apr 1999 A
5898830 Wesinger et al. Apr 1999 A
5898836 Freivald et al. Apr 1999 A
5903723 Becker et al. May 1999 A
5911776 Guck Jun 1999 A
5923846 Gage et al. Jul 1999 A
5930479 Hall Jul 1999 A
5933478 Ozaki et al. Aug 1999 A
5933498 Schneck et al. Aug 1999 A
5937164 Mages et al. Aug 1999 A
5940591 Boyle et al. Aug 1999 A
5948062 Tzelnic et al. Sep 1999 A
5958005 Thorne et al. Sep 1999 A
5963915 Kirsch Oct 1999 A
5978799 Hirsch Nov 1999 A
5987609 Hasebe Nov 1999 A
5987610 Franczek et al. Nov 1999 A
5991881 Conklin et al. Nov 1999 A
5999932 Paul Dec 1999 A
6003027 Prager Dec 1999 A
6006329 Chi Dec 1999 A
6012144 Pickett Jan 2000 A
6014651 Crawford Jan 2000 A
6023723 McCormick et al. Feb 2000 A
6029256 Kouznetsov Feb 2000 A
6035423 Hodges et al. Mar 2000 A
6052709 Paul Apr 2000 A
6052784 Day Apr 2000 A
6058381 Nelson May 2000 A
6058482 Liu May 2000 A
6061448 Smith et al. May 2000 A
6061722 Lipa et al. May 2000 A
6072942 Stockwell et al. Jun 2000 A
6073142 Geiger et al. Jun 2000 A
6092114 Shaffer et al. Jul 2000 A
6092194 Touboul Jul 2000 A
6094277 Toyoda Jul 2000 A
6094731 Waldin et al. Jul 2000 A
6104500 Alam et al. Aug 2000 A
6108688 Nielsen Aug 2000 A
6108691 Lee et al. Aug 2000 A
6108786 Knowlson Aug 2000 A
6118856 Paarsmarkt et al. Sep 2000 A
6118886 Baumgart et al. Sep 2000 A
6119137 Smith et al. Sep 2000 A
6119142 Kosaka Sep 2000 A
6119230 Carter Sep 2000 A
6119236 Shipley Sep 2000 A
6122661 Stedman et al. Sep 2000 A
6141695 Sekiguchi et al. Oct 2000 A
6141778 Kane et al. Oct 2000 A
6145083 Shaffer et al. Nov 2000 A
6151675 Smith Nov 2000 A
6161130 Horvitz et al. Dec 2000 A
6165314 Gardner et al. Dec 2000 A
6185314 Crabtree et al. Feb 2001 B1
6185680 Shimbo et al. Feb 2001 B1
6185689 Todd, Sr. et al. Feb 2001 B1
6192360 Dumais et al. Feb 2001 B1
6192407 Smith et al. Feb 2001 B1
6199102 Cobb Mar 2001 B1
6202157 Brownlie et al. Mar 2001 B1
6219714 Inhwan et al. Apr 2001 B1
6223213 Cleron et al. Apr 2001 B1
6247045 Shaw et al. Jun 2001 B1
6249575 Heilmann et al. Jun 2001 B1
6249807 Shaw et al. Jun 2001 B1
6260043 Puri et al. Jul 2001 B1
6266668 Vanderveldt et al. Jul 2001 B1
6269447 Maloney et al. Jul 2001 B1
6269456 Hodges et al. Jul 2001 B1
6272532 Feinleib Aug 2001 B1
6275942 Bernhard et al. Aug 2001 B1
6279113 Vaidya Aug 2001 B1
6279133 Vafai et al. Aug 2001 B1
6282565 Shaw et al. Aug 2001 B1
6285991 Powar Sep 2001 B1
6289214 Backstrom Sep 2001 B1
6298445 Shostack et al. Oct 2001 B1
6301668 Gleichauf et al. Oct 2001 B1
6304898 Shiigi Oct 2001 B1
6304973 Williams Oct 2001 B1
6311207 Mighdoll et al. Oct 2001 B1
6317829 Van Oorschot Nov 2001 B1
6320948 Heilmann et al. Nov 2001 B1
6321267 Donaldson Nov 2001 B1
6324569 Ogilvie et al. Nov 2001 B1
6324647 Bowman-Amuah Nov 2001 B1
6324656 Gleichauf et al. Nov 2001 B1
6330589 Kennedy Dec 2001 B1
6347374 Drake et al. Feb 2002 B1
6353886 Howard et al. Mar 2002 B1
6363489 Comay et al. Mar 2002 B1
6370648 Diep Apr 2002 B1
6373950 Rowney Apr 2002 B1
6385655 Smith et al. May 2002 B1
6393465 Leeds May 2002 B2
6393568 Ranger et al. May 2002 B1
6405318 Rowland Jun 2002 B1
6434624 Gai et al. Aug 2002 B1
6442588 Clark et al. Aug 2002 B1
6442686 McArdle et al. Aug 2002 B1
6453345 Trcka et al. Sep 2002 B2
6460050 Pace et al. Oct 2002 B1
6460141 Olden Oct 2002 B1
6470086 Smith Oct 2002 B1
6487599 Smith et al. Nov 2002 B1
6487666 Shanklin et al. Nov 2002 B1
6502191 Smith et al. Dec 2002 B1
6516411 Smith Feb 2003 B2
6519703 Joyce Feb 2003 B1
6539430 Humes Mar 2003 B1
6546416 Kirsch Apr 2003 B1
6546493 Magdych et al. Apr 2003 B1
6550012 Villa et al. Apr 2003 B1
6574737 Kingsford et al. Jun 2003 B1
6578025 Pollack et al. Jun 2003 B1
6609196 Dickinson, III et al. Aug 2003 B1
6636946 Jeddelch Oct 2003 B2
6650890 Iriam et al. Nov 2003 B1
6654787 Aronson et al. Nov 2003 B1
6661353 Gopen Dec 2003 B1
6662170 Dom et al. Dec 2003 B1
6675153 Cook et al. Jan 2004 B1
6681331 Munson et al. Jan 2004 B1
6687687 Smadja Feb 2004 B1
6697950 Ko Feb 2004 B1
6701440 Kim et al. Mar 2004 B1
6704874 Porras et al. Mar 2004 B1
6711127 Gorman et al. Mar 2004 B1
6711687 Sekiguchi Mar 2004 B1
6725377 Kouznetsov Apr 2004 B1
6732101 Cook May 2004 B1
6732157 Gordon et al. May 2004 B1
6735703 Kilpatrick et al. May 2004 B1
6738462 Brunson May 2004 B1
6742116 Matsui et al. May 2004 B1
6742124 Kilpatrick et al. May 2004 B1
6742128 Joiner May 2004 B1
6754705 Joiner et al. Jun 2004 B2
6757830 Tarbotton et al. Jun 2004 B1
6760309 Rochberger et al. Jul 2004 B1
6768991 Hearnden Jul 2004 B2
6769016 Rothwell et al. Jul 2004 B2
6772196 Kirsch et al. Aug 2004 B1
6775657 Baker Aug 2004 B1
6792546 Shanklin et al. Sep 2004 B1
6880156 Landherr et al. Apr 2005 B1
6892178 Zacharia May 2005 B1
6892179 Zacharia May 2005 B1
6892237 Gai et al. May 2005 B1
6895385 Zacharia et al. May 2005 B1
6895438 Ulrich May 2005 B1
6907430 Chong et al. Jun 2005 B2
6910135 Grainger Jun 2005 B1
6928556 Black et al. Aug 2005 B2
6941348 Petry et al. Sep 2005 B2
6941467 Judge et al. Sep 2005 B2
6968461 Lucas et al. Nov 2005 B1
6981143 Mullen et al. Dec 2005 B2
7051077 Lin May 2006 B2
7076527 Bellegarda et al. Jul 2006 B2
7089428 Farley et al. Aug 2006 B2
7089590 Judge et al. Aug 2006 B2
7092992 Yu Aug 2006 B1
7093129 Gavagni et al. Aug 2006 B1
7096498 Judge Aug 2006 B2
7117358 Bandini et al. Oct 2006 B2
7124372 Brin Oct 2006 B2
7124438 Judge et al. Oct 2006 B2
7131003 Lord et al. Oct 2006 B2
7143213 Need et al. Nov 2006 B2
7152105 McClure et al. Dec 2006 B2
7155243 Baldwin et al. Dec 2006 B2
7164678 Connor Jan 2007 B2
7206814 Kirsch Apr 2007 B2
7209954 Rothwell et al. Apr 2007 B1
7213260 Judge May 2007 B2
7219131 Banister et al. May 2007 B2
7225466 Judge May 2007 B2
7254608 Yeager et al. Aug 2007 B2
7254712 Godfrey et al. Aug 2007 B2
7260840 Swander et al. Aug 2007 B2
7272149 Bly et al. Sep 2007 B2
7272853 Goodman et al. Sep 2007 B2
7278159 Kaashoek et al. Oct 2007 B2
7349332 Srinivasan et al. Mar 2008 B1
7376731 Khan et al. May 2008 B2
7379900 Wren May 2008 B1
7385924 Riddle Jun 2008 B1
7458098 Judge et al. Nov 2008 B2
7460476 Morris et al. Dec 2008 B1
7461339 Liao et al. Dec 2008 B2
7496634 Cooley Feb 2009 B1
7502829 Radatti et al. Mar 2009 B2
7506155 Stewart et al. Mar 2009 B1
7519563 Urmanov et al. Apr 2009 B1
7519994 Judge et al. Apr 2009 B2
7522516 Parker Apr 2009 B1
7523092 Andreev et al. Apr 2009 B2
7543053 Goodman et al. Jun 2009 B2
7543056 McClure et al. Jun 2009 B2
7545748 Riddle Jun 2009 B1
7610344 Mehr et al. Oct 2009 B2
7617160 Grove et al. Nov 2009 B1
7620986 Jagannathan et al. Nov 2009 B1
7624448 Coffman Nov 2009 B2
7644127 Yu Jan 2010 B2
7668951 Lund et al. Feb 2010 B2
7693947 Judge et al. Apr 2010 B2
7694128 Judge et al. Apr 2010 B2
7711684 Sundaresan et al. May 2010 B2
7716310 Foti May 2010 B2
7730316 Baccash Jun 2010 B1
7739409 Yanovsky et al. Jun 2010 B2
7748038 Olivier et al. Jun 2010 B2
7779156 Alperovitch et al. Aug 2010 B2
7779466 Judge et al. Aug 2010 B2
7870203 Judge et al. Jan 2011 B2
7899866 Buckingham et al. Mar 2011 B1
7903549 Judge et al. Mar 2011 B2
7917627 Andriantsiferana et al. Mar 2011 B1
7937480 Alperovitch et al. May 2011 B2
7941523 Andreev et al. May 2011 B2
7949716 Alperovitch et al. May 2011 B2
7949992 Andreev et al. May 2011 B2
7966335 Sundaresan et al. Jun 2011 B2
8042149 Judge Oct 2011 B2
8042181 Judge Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8051134 Begeja et al. Nov 2011 B1
8069481 Judge Nov 2011 B2
8079087 Spies et al. Dec 2011 B1
8095876 Verstak et al. Jan 2012 B1
8132250 Judge et al. Mar 2012 B2
8160975 Tang et al. Apr 2012 B2
8179798 Alperovitch et al. May 2012 B2
8185930 Alperovitch et al. May 2012 B2
8214497 Alperovitch et al. Jul 2012 B2
20010037311 McCoy et al. Nov 2001 A1
20010049793 Sugimoto Dec 2001 A1
20020004902 Toh et al. Jan 2002 A1
20020009079 Jungck et al. Jan 2002 A1
20020013692 Chandhok et al. Jan 2002 A1
20020016824 Leeds Feb 2002 A1
20020016910 Wright et al. Feb 2002 A1
20020023089 Woo Feb 2002 A1
20020023140 Hile et al. Feb 2002 A1
20020026591 Hartley et al. Feb 2002 A1
20020032871 Malan et al. Mar 2002 A1
20020035683 Kaashoek et al. Mar 2002 A1
20020042876 Smith Apr 2002 A1
20020046041 Lang Apr 2002 A1
20020049853 Chu et al. Apr 2002 A1
20020051575 Myers et al. May 2002 A1
20020059454 Barrett et al. May 2002 A1
20020062368 Holtzman et al. May 2002 A1
20020078382 Sheikh et al. Jun 2002 A1
20020087882 Schneier et al. Jul 2002 A1
20020095492 Kaashoek et al. Jul 2002 A1
20020112013 Walsh Aug 2002 A1
20020112185 Hodges Aug 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020120853 Tyree Aug 2002 A1
20020133365 Grey et al. Sep 2002 A1
20020138416 Lovejoy et al. Sep 2002 A1
20020138755 Ko Sep 2002 A1
20020138759 Dutta Sep 2002 A1
20020138762 Horne Sep 2002 A1
20020143963 Converse et al. Oct 2002 A1
20020147734 Shoup et al. Oct 2002 A1
20020152399 Smith Oct 2002 A1
20020165971 Baron Nov 2002 A1
20020169954 Bandini et al. Nov 2002 A1
20020172367 Mulder et al. Nov 2002 A1
20020178227 Matsa et al. Nov 2002 A1
20020178383 Hrabik et al. Nov 2002 A1
20020178410 Haitsma et al. Nov 2002 A1
20020188732 Buckman et al. Dec 2002 A1
20020188864 Jackson Dec 2002 A1
20020194469 Dominique et al. Dec 2002 A1
20020199095 Bandini et al. Dec 2002 A1
20030005326 Flemming Jan 2003 A1
20030005331 Williams Jan 2003 A1
20030009554 Burch et al. Jan 2003 A1
20030009693 Brock et al. Jan 2003 A1
20030009696 Bunker V. et al. Jan 2003 A1
20030009699 Gupta et al. Jan 2003 A1
20030014664 Hentunen Jan 2003 A1
20030023692 Moroo Jan 2003 A1
20030023695 Kobata et al. Jan 2003 A1
20030023736 Abkemeier Jan 2003 A1
20030023873 Ben-Itzhak Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030023875 Hursey et al. Jan 2003 A1
20030028803 Bunker, V et al. Feb 2003 A1
20030033516 Howard et al. Feb 2003 A1
20030033542 Goseva-Popstojanova et al. Feb 2003 A1
20030041264 Black et al. Feb 2003 A1
20030046253 Shetty et al. Mar 2003 A1
20030051026 Carter et al. Mar 2003 A1
20030051163 Bidaud Mar 2003 A1
20030051168 King et al. Mar 2003 A1
20030055931 Cravo De Almeida et al. Mar 2003 A1
20030061506 Cooper et al. Mar 2003 A1
20030065943 Geis et al. Apr 2003 A1
20030084280 Bryan et al. May 2003 A1
20030084320 Tarquini et al. May 2003 A1
20030084323 Gales May 2003 A1
20030084347 Luzzatto May 2003 A1
20030088792 Card et al. May 2003 A1
20030093518 Hiraga May 2003 A1
20030093667 Dutta et al. May 2003 A1
20030093695 Dutta May 2003 A1
20030093696 Sugimoto May 2003 A1
20030095555 McNamara et al. May 2003 A1
20030097439 Strayer et al. May 2003 A1
20030097564 Tewari et al. May 2003 A1
20030105976 Copeland, III Jun 2003 A1
20030110392 Aucsmith et al. Jun 2003 A1
20030110396 Lewis et al. Jun 2003 A1
20030115485 Milliken Jun 2003 A1
20030115486 Choi et al. Jun 2003 A1
20030123665 Dunstan et al. Jul 2003 A1
20030126464 McDaniel et al. Jul 2003 A1
20030126472 Banzhof Jul 2003 A1
20030135749 Gales et al. Jul 2003 A1
20030140137 Joiner et al. Jul 2003 A1
20030140250 Taninaka et al. Jul 2003 A1
20030145212 Crumly Jul 2003 A1
20030145225 Bruton, III et al. Jul 2003 A1
20030145226 Bruton, III et al. Jul 2003 A1
20030149887 Yadav Aug 2003 A1
20030149888 Yadav Aug 2003 A1
20030152076 Lee et al. Aug 2003 A1
20030152096 Chapman Aug 2003 A1
20030154393 Young Aug 2003 A1
20030154399 Zuk et al. Aug 2003 A1
20030154402 Pandit et al. Aug 2003 A1
20030158905 Petry et al. Aug 2003 A1
20030159069 Choi et al. Aug 2003 A1
20030159070 Mayer et al. Aug 2003 A1
20030167308 Schran Sep 2003 A1
20030167402 Stolfo et al. Sep 2003 A1
20030172166 Judge et al. Sep 2003 A1
20030172167 Judge et al. Sep 2003 A1
20030172289 Soppera Sep 2003 A1
20030172291 Judge et al. Sep 2003 A1
20030172292 Judge Sep 2003 A1
20030172294 Judge Sep 2003 A1
20030172301 Judge et al. Sep 2003 A1
20030172302 Judge et al. Sep 2003 A1
20030182421 Faybishenko et al. Sep 2003 A1
20030187936 Bodin et al. Oct 2003 A1
20030187996 Cardina et al. Oct 2003 A1
20030204596 Yadav Oct 2003 A1
20030204719 Ben Oct 2003 A1
20030204741 Schoen et al. Oct 2003 A1
20030212791 Pickup Nov 2003 A1
20030233328 Scott et al. Dec 2003 A1
20040015554 Wilson Jan 2004 A1
20040025044 Day Feb 2004 A1
20040034794 Mayer et al. Feb 2004 A1
20040054886 Dickinson et al. Mar 2004 A1
20040058673 Iriam et al. Mar 2004 A1
20040059811 Sugauchi et al. Mar 2004 A1
20040088570 Roberts et al. May 2004 A1
20040098464 Koch et al. May 2004 A1
20040111519 Fu et al. Jun 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040122926 Moore et al. Jun 2004 A1
20040122967 Bressler et al. Jun 2004 A1
20040123157 Alagna et al. Jun 2004 A1
20040128355 Chao et al. Jul 2004 A1
20040139160 Wallace et al. Jul 2004 A1
20040139334 Wiseman Jul 2004 A1
20040165727 Moreh et al. Aug 2004 A1
20040167968 Wilson et al. Aug 2004 A1
20040177120 Kirsch Sep 2004 A1
20040203589 Wang et al. Oct 2004 A1
20040205135 Hallam Oct 2004 A1
20040221062 Starbuck et al. Nov 2004 A1
20040236884 Beetz Nov 2004 A1
20040249895 Way Dec 2004 A1
20040255122 Ingerman et al. Dec 2004 A1
20040267893 Lin Dec 2004 A1
20050021738 Goeller et al. Jan 2005 A1
20050021997 Beynon et al. Jan 2005 A1
20050033742 Kamvar et al. Feb 2005 A1
20050052998 Oliver et al. Mar 2005 A1
20050060295 Gould et al. Mar 2005 A1
20050060643 Glass et al. Mar 2005 A1
20050065810 Bouron Mar 2005 A1
20050086300 Yeager et al. Apr 2005 A1
20050091319 Kirsch Apr 2005 A1
20050091320 Kirsch et al. Apr 2005 A1
20050102366 Kirsch May 2005 A1
20050120019 Rigoutsos et al. Jun 2005 A1
20050141427 Bartky Jun 2005 A1
20050149383 Zacharia et al. Jul 2005 A1
20050159998 Buyukkokten et al. Jul 2005 A1
20050160148 Yu Jul 2005 A1
20050192958 Widjojo et al. Sep 2005 A1
20050193076 Flury et al. Sep 2005 A1
20050198159 Kirsch Sep 2005 A1
20050204001 Stein et al. Sep 2005 A1
20050216564 Myers et al. Sep 2005 A1
20050256866 Lu et al. Nov 2005 A1
20050262209 Yu Nov 2005 A1
20050262210 Yu Nov 2005 A1
20050262556 Waisman et al. Nov 2005 A1
20060007936 Shrum et al. Jan 2006 A1
20060009994 Hogg et al. Jan 2006 A1
20060015563 Judge et al. Jan 2006 A1
20060015942 Judge et al. Jan 2006 A1
20060021055 Judge et al. Jan 2006 A1
20060023940 Katsuyama Feb 2006 A1
20060031314 Brahms et al. Feb 2006 A1
20060031483 Lund et al. Feb 2006 A1
20060036693 Hulten et al. Feb 2006 A1
20060036727 Kurapati et al. Feb 2006 A1
20060041508 Pham et al. Feb 2006 A1
20060042483 Work et al. Mar 2006 A1
20060047794 Jezierski Mar 2006 A1
20060059238 Slater et al. Mar 2006 A1
20060095404 Adelman et al. May 2006 A1
20060095586 Adelman et al. May 2006 A1
20060112026 Graf et al. May 2006 A1
20060123083 Goutte et al. Jun 2006 A1
20060129810 Jeong et al. Jun 2006 A1
20060149821 Rajan et al. Jul 2006 A1
20060155553 Brohman et al. Jul 2006 A1
20060168024 Mehr et al. Jul 2006 A1
20060174337 Bernoth Aug 2006 A1
20060174341 Judge Aug 2006 A1
20060179113 Buckingham et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191002 Lee et al. Aug 2006 A1
20060212925 Shull et al. Sep 2006 A1
20060212930 Shull et al. Sep 2006 A1
20060212931 Shull et al. Sep 2006 A1
20060225136 Rounthwaite et al. Oct 2006 A1
20060230039 Shull et al. Oct 2006 A1
20060230134 Qian et al. Oct 2006 A1
20060248156 Judge et al. Nov 2006 A1
20060253447 Judge Nov 2006 A1
20060253458 Dixon et al. Nov 2006 A1
20060253578 Dixon et al. Nov 2006 A1
20060253579 Dixon et al. Nov 2006 A1
20060253582 Dixon et al. Nov 2006 A1
20060253584 Dixon et al. Nov 2006 A1
20060265747 Judge Nov 2006 A1
20060267802 Judge et al. Nov 2006 A1
20060277259 Murphy et al. Dec 2006 A1
20070002831 Allen et al. Jan 2007 A1
20070019235 Lee Jan 2007 A1
20070025304 Leelahakriengkrai Feb 2007 A1
20070027992 Judge et al. Feb 2007 A1
20070028301 Shull et al. Feb 2007 A1
20070043738 Morris et al. Feb 2007 A1
20070078675 Kaplan Apr 2007 A1
20070124803 Taraz May 2007 A1
20070130350 Alperovitch et al. Jun 2007 A1
20070130351 Alperovitch et al. Jun 2007 A1
20070168394 Vivekanand Jul 2007 A1
20070195753 Judge et al. Aug 2007 A1
20070195779 Judge et al. Aug 2007 A1
20070199070 Hughes Aug 2007 A1
20070203997 Ingerman et al. Aug 2007 A1
20070208817 Lund et al. Sep 2007 A1
20070214151 Thomas et al. Sep 2007 A1
20070233787 Pagan Oct 2007 A1
20070239642 Sindhwani et al. Oct 2007 A1
20070253412 Batteram et al. Nov 2007 A1
20080005223 Flake et al. Jan 2008 A1
20080022384 Yee et al. Jan 2008 A1
20080047009 Overcash et al. Feb 2008 A1
20080077517 Sappington Mar 2008 A1
20080082662 Dandliker et al. Apr 2008 A1
20080091765 Gammage et al. Apr 2008 A1
20080103843 Goeppert et al. May 2008 A1
20080104180 Gabe May 2008 A1
20080123823 Pirzada et al. May 2008 A1
20080159632 Oliver et al. Jul 2008 A1
20080175226 Alperovitch et al. Jul 2008 A1
20080175266 Alperovitch et al. Jul 2008 A1
20080177684 Laxman et al. Jul 2008 A1
20080177691 Alperovitch et al. Jul 2008 A1
20080178259 Alperovitch et al. Jul 2008 A1
20080178288 Alperovitch et al. Jul 2008 A1
20080184366 Alperovitch et al. Jul 2008 A1
20080301755 Sinha et al. Dec 2008 A1
20080303689 Iverson Dec 2008 A1
20090003204 Okholm et al. Jan 2009 A1
20090089279 Jeong et al. Apr 2009 A1
20090103524 Mantripragada et al. Apr 2009 A1
20090113016 Sen et al. Apr 2009 A1
20090119740 Alperovitch et al. May 2009 A1
20090122699 Alperovitch et al. May 2009 A1
20090125980 Alperovitch et al. May 2009 A1
20090164582 Dasgupta et al. Jun 2009 A1
20090192955 Tang et al. Jul 2009 A1
20090254499 Deyo Oct 2009 A1
20090254572 Redlich et al. Oct 2009 A1
20090254663 Alperovitch et al. Oct 2009 A1
20090282476 Nachenberg et al. Nov 2009 A1
20100115040 Sargent et al. May 2010 A1
20100306846 Alperovitch et al. Dec 2010 A1
20110280160 Yang Nov 2011 A1
20110296519 Ide et al. Dec 2011 A1
20120011252 Alperovitch et al. Jan 2012 A1
20120084441 Alperovitch et al. Apr 2012 A1
20120110672 Judge et al. May 2012 A1
20120174219 Hernandez et al. Jul 2012 A1
20120204265 Judge Aug 2012 A1
20120216248 Alperovitch et al. Aug 2012 A1
20120239751 Alperovitch et al. Sep 2012 A1
20120240228 Alperovitch et al. Sep 2012 A1
20120271890 Judge et al. Oct 2012 A1
Foreign Referenced Citations (123)
Number Date Country
2003230606 Oct 2003 AU
2005304883 May 2006 AU
2006315184 May 2007 AU
2008207924 Jul 2008 AU
2008207926 Jul 2008 AU
2008207930 Jul 2008 AU
2008323779 May 2009 AU
2008323784 May 2009 AU
2009203095 Aug 2009 AU
2478299 Sep 2003 CA
2564533 Dec 2005 CA
2586709 May 2006 CA
2628189 May 2007 CA
2654796 Dec 2007 CA
10140166 Apr 2009 CN
101443736 May 2009 CN
101730892 Jun 2010 CN
101730904 Jun 2010 CN
101730903 Nov 2012 CN
103095672 May 2013 CN
0375138 Jun 1990 EP
0413537 Feb 1991 EP
0420779 Apr 1991 EP
0720333 Jul 1996 EP
0838774 Apr 1998 EP
0869652 Oct 1998 EP
0907120 Apr 1999 EP
1326376 Jul 2003 EP
1488316 Dec 2004 EP
1271846 Jul 2005 EP
1672558 Jun 2006 EP
1820101 Aug 2007 EP
1819108 Jun 2008 EP
1982540 Oct 2008 EP
2036246 Mar 2009 EP
2115642 Nov 2009 EP
2115689 Nov 2009 EP
2213056 Aug 2010 EP
2223258 Sep 2010 EP
2562975 Feb 2013 EP
2562976 Feb 2013 EP
2562986 Feb 2013 EP
2562987 Feb 2013 EP
2271002 Mar 1994 GB
2357932 Jul 2001 GB
3279-DELNP-2007 Aug 2007 IN
4233-DELNP-2007 Aug 2008 IN
4842CHENP2009 Jan 2010 IN
4763CHENP2009 Jul 2010 IN
2000-148276 May 2000 JP
2000-215046 Aug 2000 JP
2001-028006 Jan 2001 JP
2003-150482 May 2003 JP
2004-533677 Nov 2004 JP
2004-537075 Dec 2004 JP
2005-520230 Jul 2005 JP
2006-268544 Oct 2006 JP
18350870 Dec 2006 JP
2006350870 Dec 2006 JP
2007-540073 Jun 2008 JP
2009-516269 Apr 2009 JP
10-0447082 Sep 2004 KR
2006-0012137 Feb 2006 KR
2006-0028200 Mar 2006 KR
1020060041934 May 2006 KR
10-0699531 Mar 2007 KR
10-0737523 Jul 2007 KR
10-0750377 Aug 2007 KR
447082 Dec 2009 KR
106744 Nov 2004 SG
142513 Jun 2008 SG
WO 9635994 Nov 1996 WO
WO 9905814 Feb 1999 WO
WO 9933188 Jul 1999 WO
WO 9937066 Jul 1999 WO
WO 0007312 Feb 2000 WO
WO 0008543 Feb 2000 WO
WO 0042748 Jul 2000 WO
WO 0059167 Oct 2000 WO
WO 0117165 Mar 2001 WO
WO 0122686 Mar 2001 WO
WO 0150691 Jul 2001 WO
WO 0176181 Oct 2001 WO
WO 0180480 Oct 2001 WO
WO 0188834 Nov 2001 WO
WO 0213469 Feb 2002 WO
WO 0213489 Feb 2002 WO
WO 0215521 Feb 2002 WO
WO 0275547 Sep 2002 WO
WO 02082293 Oct 2002 WO
WO 02091706 Nov 2002 WO
WO 03077071 Sep 2003 WO
WO 2004061698 Jul 2004 WO
WO 2004061703 Jul 2004 WO
WO 2004081734 Sep 2004 WO
WO 2004088455 Oct 2004 WO
WO 2005006139 Jan 2005 WO
WO 2005086437 Sep 2005 WO
WO 2005119485 Dec 2005 WO
WO 2005119488 Dec 2005 WO
WO 2005116851 Dec 2005 WO
WO 2006029399 Mar 2006 WO
WO 2006119509 Mar 2006 WO
WO 2006052736 May 2006 WO
WO 2007030951 Mar 2007 WO
WO 2007059428 May 2007 WO
WO 2007146690 Dec 2007 WO
WO 2007146696 Dec 2007 WO
WO 2007146701 Dec 2007 WO
WO 2008008543 Jan 2008 WO
WO 2008091980 Jul 2008 WO
WO 2008091982 Jul 2008 WO
WO 2008091986 Jul 2008 WO
WO 2009146118 Feb 2009 WO
WO 2009062018 May 2009 WO
WO 2009062023 May 2009 WO
Non-Patent Literature Citations (174)
Entry
Article entitled “Learning Limited Dependence Bayesian Classifiers” by Sahami, in Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, 1996, pp. 335-338.
Article entitled “An Evaluation of Phrasal and Clustered Representations on a Text Categorization Task” by Lewis, in 15th Ann Int'l SIGIR, Jun. 1992, pp. 37-50.
Book entitled Machine Learning by Mitchell, 1997, pp. 180-184.
Article entitled “Hierarchically classifying documents using very few words” by Koller et. al., in Proceedings of the Fourteenth International Conference on Machine Learning, 1997.
Article entitled “Classification of Text Documents” by Li et. al., in The Computer Journal, vol. 41, No. 8, 1998, pp. 537-546.
Article entitled “Issues when designing filters in messaging systems” by Palme et. al., in 19 Computer Communications, 1996, pp. 95-101.
Article entitled “Text Categorization with Support Vector Machines: Learning with Many Relevant Features” by Joachims in Machine Learning: ECML-98, Apr. 1998, pp. 1-14.
Article entitled “Smokey: Automatic Recognition of Hostile Messages” by Spertus in Innovative Applications, 1997, pp. 1058-1065.
Article entitled “CAFE: A Conceptual Model for Managing Information in Electronic Mail” by Takkinen et al. in Proc. 31st Annual Hawaii International Conference on System Sciences, 1998, pp. 44-53.
Article entitled “Spam!” by Cranor et. al. in Communications of the ACM, vol. 41, No. 8, Aug. 1998, pp. 74-83.
Article entitled “Sendmail and Spam” by LeFebvre in Performance Computing, Aug. 1998, pp. 55-58.
Article entitled “Implementing a Generalized Tool for Network Monitoring” by Ranum et. al. in LISA IX, Oct. 26-31, 1997, pp. 1-8.
Article entitled “Method for Automatic Contextual Transposition Upon Receipt of Item of Specified Criteria” printed Feb. 1994 in IBM Technical Disclosure Bulletin, vol. 37, No. 2B, p. 333.
Article entitled “Toward Optimal Feature Selection” by Koller et al., in Machine Learning: Proc. of the Thirteenth International Conference, 1996.
Article entitled “MIMEsweeper defuses virus network, net mail bombs” by Avery, in InfoWorld, May 20, 1996, vol. 12, No. 21, p. N1.
Article entitled “Stomping out mail viruses” by Wilkerson, in PC Week, Jul. 15, 1996, p. N8.
Article entitled “Securing Electronic Mail Systems” by Serenelli et al., in Communications—Fusing Command Control and Intelligence: MILCOM '92, 1992, pp. 677-680.
Article entitled “Integralis' Minesweeper defuses E-mail bombs” by Kramer et. al., in PC Week, Mar. 18, 1996, p. N17-N23.
Article entitled “A Toolkit and Methods for Internet Firewalls” by Ranum et. al., in Proc. of USENIX Summer 1994 Technical Conference, Jun. 6-10, 1994, pp. 37-44.
Article entitled “Firewall Systems: The Next Generation” by McGhie, in Integration Issues in Large Commercial Media Delivery Systems: Proc. of SPIE—The International Society for Optical Engineering, Oct. 23-24, 1995, pp. 270-281.
Article entitled “Design of the TTI Prototype Trusted Mail Agent” by Rose et. al., in Computer Message Systems-85: Proc. of the IFIP TC 6 International Symposium on Computer Message Systems, Sep. 5-7, 1985, pp. 377-399.
Article entitled “Designing an Academic Firewall: Policy, Practice, and Experience with SURF” by Greenwald et. al., in Proc. of the 1996 Symposium on Network and Distributed Systems Security, 1996, pp. 1-14.
Article entitled “X Through the Firewall, and Other Application Relays” by Treese et. al. in Proc. of the USENIX Summer 1993 Technical Conference, Jun. 21-25, 1993, pp. 87-99.
Article entitled “Firewalls for Sale” by Bryan, in BYTE, Apr. 1995, pp. 99-104.
Article entitled “A DNS Filter and Switch for Packet-filtering Gateways” by Cheswick et al., in Proc. of the Sixth Annual USENIX Security Symposium: Focusing on Applications of Cryptography, Jul. 22-25, 1996, pp. 15-19.
Article entitled “Safe Use of X Window System Protocol Across a Firewall” by Kahn, in Proc. of the Fifth USENIX UNIX Security Symposium, Jun. 5-7, 1995, pp. 105-116.
Article entitled “Automating the OSI to Internet Management Conversion Through the Use of an Object-Oriented Platform” by Pavlou et al., in Proc. of the IFIP TC6/WG6.4 International Conference on Advanced Information Processing Techniques for LAN and MAN Management, Apr. 7-9, 1993, pp. 245-260.
Article entitled “A Secure Email Gateway (Building an RCAS External Interface)” by Smith, in Tenth Annual Computer Security Applications Conference, Dec. 5-9, 1994, pp. 202-211.
Article entitled “Secure External References in Multimedia Email Messages” by Wiegel, in 3rd ACM Conference on Computer and Communications Security, Mar. 14-16, 1996, pp. 11-18.
Memo entitled “SOCKS Protocol Version 5” by Leech et. al., in Standards Track, Mar. 1996, pp. 1-9.
Article entitled “Securing the Web: fire walls, proxy servers, and data driven attacks” by Farrow in InfoWorld, Jun. 19, 1995, vol. 17, No. 25, p. 103.
Article entitled “Learning Rules that Classify E-mail” by Cohen, pp. 1-8. Conference: Machine learning in information access—Spring symposium—Technical Report—American Association for Artificial Intelligence SSS, AAAI Press, Mar. 1996.
Article entitled “Hierarchical Bayesian Clustering for Automatic Text Classification” by Iwayama et al. in Natural Language, pp. 1322-1327, vol. 2. Publication date, 1995.
Article entitled “A Comparison of Classifiers and Document Representations for the Routing Problem” by Schutze. pp. 229-237, Publication, 1995.
Article entitled “A Comparative Study on Feature Selection in Text Categorization” by Yang et. al. Machine learning—International Workshop Then Conference, p. 412-420, Jul. 1997.
Website: Technical Focus—Products—Entegrity AssureAccess. www2.entegrity.com. Published prior to May 2006, pp. 1-4.
Website: Create Secure Internet Communication Channels—Atabok Homepage. www.atabok.com, Published Feb. 19, 2002, pp. 1-3.
Website: ATABOK VCNMAIL™ Secure Email Solution—Atabok Related Products. www.atabok.com, Published Feb. 19, 2002, pp. 1-2.
Website: ATABOK VCN Auto-Exchange™—Atabok Related Products. www.atabok.com, Pub Feb. 19, 2002, 1 page.
Website: Controlling Digital Assets is a Paramount Need for All Business—Atabok Related Products. www.atabok.com, Pub. Feb. 19, 2002, 1 page.
Website: Control Your Confidential Communications with ATABOK—Atabok Related Products. www.atabok.com, Published prior to May 2006, 1 page.
Website: Entrust Entelligence—Entrust Homepage. www.entrust.com, Published prior to May 2006, 1 page.
Website: E-mail Plug-in—Get Technical/Interoperability—Entrust Entelligence. www.entrust.com, Pub. Feb. 19, 2002, 1 page.
Website: E-mail Plug-in—Get Technical/System Requirements—Entrust Entelligence. www.entrust.com, Published Feb. 19, 2002, 1 page.
Website: E-mail Plug-in—Features and Benefits—Entrust Entelligence. www.entrust.com, Published Feb. 19, 2002, pp. 1-2.
Website: Internet Filtering Software—Internet Manager Homepage. www.elronsw.com, Published Feb. 19, 2002, pp. 1-2.
Website: ESKE—Email with Secure Key Exchange—ESKE. www.danu.ie, Published prior to May 2006, p. 1.
Website: Terminet—ESKE. www.danu.ie, Pub. Feb. 19, 2002, p. 1.
Website: Baltimore Focus on e-Security—Baltimore Technologies. www.baltimore.com, Pub. Feb. 19, 2002, p. 1-2.
Website: Go Secure! for Microsoft Exchange—Products/Services—Verisign, Inc. www.verisign.com, Published prior to May 2006, p. 2.
Article entitled “A Comparison of Two Learning Algorithms for Text Categorization” by Lewis et al., in Third Annual Symposium on Document Analysis and Information Retrieval, Apr. 11-13, 1994, pp. 81-92.
Article entitled “An Example-Based Mapping Method for Text Categorization and Retrieval” by Yang et. al., in ACM Transactions on Information Systems, Jul. 1994, vol. 12, No. 3, pp. 252-277.
Edakandi, Ashwin, Examiner's Report for Australian Patent Application No. 2006315184, dated Mar. 31, 2010, 8 pages.
China Patent Agent (H.K.) Ltd., First Office Action for Chinese Patent Application No. 200680050707.7, dated Mar. 9, 2010, 31 pages.
US Patent and Trademark Office Final Office Action Summary for U.S. Appl. No. 11/423,329, mailed Jan. 14, 2010, 21 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/423,329, mailed Jun. 29, 2009, 43 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/626,470, mailed Jan. 19, 2010, 45 pages.
US Patent and Trademark Office Final Office Action Summary for U.S. Appl. No. 11/142,943, mailed Apr. 29, 2009, 18 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/142,943, mailed Dec. 31, 2009, 15 pages.
US Patent and Trademark Office Restriction Requirement for U.S. Appl. No. 11/142,943, mailed Jan. 13, 2009, 7 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/142,943, mailed Jun. 26, 2008, 59 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/937,274, mailed Dec. 9, 2009, 53 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/937,274, mailed Jun. 29, 2009, 46 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/626,603, mailed Dec. 2, 2009, 47 pages.
US Patent and Trademark Office Restriction Requirement for U.S. Appl. No. 11/626,603, mailed Aug. 11, 2009, 7 pages.
US Patent and Trademark Office Nonfinal Office Action Summary for U.S. Appl. No. 11/626,479, mailed Mar. 17, 2010, 65 pages.
Ando, Ruo, “Real-time neural detection with network capturing”, IPSJ SIG Notes, vol. 2002, No. 12, Information Processing Society of Japan, Feb. 15, 2002, pp. 145-150.
Aikawa, Narichika, “Q&A Collection: Personal computers have been introduced to junior high schools and accessing to the Internet has been started; however, we want to avoid the students from accessing harmful information. What can we do?”, DOS/V Power Report, vol. 8, No. 5, Japan, Impress Co., Ltd., May 1, 1998, p. 358 to 361.
Shishibori, Masami, et al., “A Filtering Method for Mail Documents Using Personal Profiles”, IEICE Technical Report, The Institute of Electronics, Information and Communication Engineers, vol. 98, No. 486, Dec. 17, 1998, pp. 9-16.
Lane, Terran et al., “Sequence Matching and Learning in Anomaly Detection for Computer Security,” AAAI Technical Report WS-97-07, 1997, p. 43 to 49.
Abika.com, “Trace IP address, email or IM to owner or user” http://www.abika.com/help/IPaddressmap.htm, 3 pp. (Jan. 25, 2006).
Abika.com, “Request a Persons Report”, http://www.abika.com/forms/Verifyemailaddress.asp, 1 p. (Jan. 26, 2006).
Lough et al., “A Short Tutorial on Wireless LANs and IEEE 802.11”, printed on May 27, 2002, in the IEEE Computer Society's Student Newsletter, Summer 1997, vol. 5, No. 2.
Feitelson et al., “Self-Tuning Systems”, Mar./Apr. 1999, IEEE, 0740-7459/99, pp. 52-60.
Natsev, Apostol et al., “WALRUS: A Similarity Retrieval Algorithm for Image Databases,” Mar. 2004.
Schleimer, Saul, et al., “Winnowing: Local Algorithms for Document Fingerprinting.” Jun. 2003.
Sobottka, K., et al., “Text Extraction from Colored Book and Journal Covers”, 2000 (pp. 163-176).
Thomas, R., et al., “The Game Goes on: An Analysis of Modern SPAM Techniques,” 2006.
Anklesaria, F. et al., “The Internet Gopher Protocol”, RFC 1436, Mar. 1993.
Berners-Lee, T. et al., “Uniform Resource Identifiers (URI): Generic Syntax”, RFC 2396, Aug. 1998.
Crispin, M., “Internet Message Access Protocol—Version 4rev1”, RFC 2060, Dec. 1996.
Franks, J. et al., “HTTP Authentication: Basic and Digest Access Authentication”, RFC 2617, Jun. 1999.
Klensin, J. et al., “SMTP Service Extensions”, RFC 1869, Nov. 1995.
Moats, R., “URN Syntax”, RFC 2141, May 1997.
Moore, K., “SMTP Service Extension for Delivery Status Notifications”, RFC 1891, Jan. 1996.
Myers, J. et al., “Post Office Protocol—Version 3”, RFC 1939, May 1996.
Nielsen, H., et al., “An HTTP Extension Framework”, RFC 2774, Feb. 2000.
Postel, J., “Simple Mail Transfer Protocol”, RFC 821, Aug. 1982.
Braden, R., “Requirements for Internet Hosts—Application and Support”, RFC 1123, Oct. 1989, 98 pages.
Fielding, R. et al., “Hypertext Transfer Protocol—HTTP/1.1”, RFC 2616, Jun. 1999, 114 pages.
Krishnaswamy et al., “Verity: A QoS Metric for Selecting Web Services and Providers”, Proceedings of the Fourth International Conference on Web Information Systems Engineering Workshops (WISEW'03), IEEE, 2004.
Kamvar et al., The Eigen Trust Algorithm for Reputation Management in P2P Networks, ACM, WWW2003, Budapest, Hungary, May 20-24, 2003, pp. 640-651.
Luk, W., et al. “Incremental Development of Hardware Packet Filters”, Proc. International Conference on Engineering of Reconfigurable Systems and Algorithms (ERSA). Jan. 1, 2001. pp. 115-118. XP055049950. Retrieved from the Internet: URL:www.doc.ic.ac.uk/-sy99/c1.ps.
Georgopoulos, C. et al., “A Protocol Processing Architecture Backing TCP/IP-based Security Applications in High Speed Networks”. Interworking 2000. Oct. 1, 2000. XP055049972. Bergen. Norway Available online at <URL:http://pelopas.uop.gr/fanis/html—files/pdf—files/papers/invited/12—IW2002.pdf>.
“Network Processor Designs for Next-Generation Networking Equipment”. White Paper Ezchip Technologies. XX. XX. Dec. 27, 1999. pp. 1-4. XP002262747.
Segal, Richard, et al. “Spam Guru: An Enterprise Anti-Spam Filtering System”, IBM, 2004 (7 pages).
Nilsson, Nils J., “Introduction to Machine Learning, an Early Draft of a Proposed Textbook”, Nov. 3, 1998; XP055050127; available online at <URL http://robotics.stanford.edu/˜nilsson/MLBOOK.pdf>.
Androutsopoulos, Ion et al., “Learning to Filter Spam E-Mail: A Comparison of a Naive Bayesian and a Memory-Based Approach”; Proceedings of the Workshop “Machine Learning and Textual Information Access”; 4th European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD-2000). Sep. 1, 2000 [XP055050141] Lyon, France; available online at <URL http://arxiv.org/ftp/cs/papers/0009/0009009.pdf>.
Rennie, J D M, “iFile: An application of Machine Learning to E-Mail Filtering”; Workshop on Text Mining; Aug. 1, 2000. [XP002904311]. pp. 1-6.
Clayton, Richard, “Good Practice for Combating Unsolicited Bulk Email,” Demon Internet, May 18, 1999 (16 pages).
IronMail™ Version 2.1, User's Manual. © 2001, published by CipherTrust, Inc., 114 pages [U.S. Appl. No. 10/361,067].
IronMail™ version 2.5, User's Manual, © 2001, published by CipherTrust, Inc., 195 pages [U.S. Appl. No. 10/361,067].
IronMail™ version 2.5.1, User's Manual, © 2001, published by CipherTrust, Inc., 203 pages [U.S. Appl. No. 10/361,067].
IronMail™ version 3.0, User's Manual, © 2002, published by CipherTrust, Inc., 280 pages.
IronMail™ version 3.0.1, User's Manual, © 2002, published by CipherTrust, Inc., 314 pages.
Website: Exchange Business Information Safely & Quickly—Without Compromising Security or Reliability—Atabok Secure Data Solutions, Feb. 19, 2002, 2 pages.
Yuchun Tang, “Granular Support Vector Machines Based on Granular Computing, Soft Computing and Statistical Learning.” Georgia State University: May 2006.
Drucker et al; “Support Vector Machines for Spam Categorization”; 1999; IEEE Transactions on Neural Networks; vol. 10, No. 5; pp. 1048-1054.
Graf et al.; “Parallel Support Vector Machines: The Cascade SVM”; 2005; pp. 1-8.
Rokach, Lior et al.; “Decomposition methodology for classification tasks”; 2005; Springer-Verlag London Limited; Pattern Analysis & Applications; pp. 257-271.
Wang, Jigang et al.; “Training Data Selection for Support Vector Machines”; 2005; ICNC 2005, LNCS 3610; pp. 554-564.
Skurichina, Marina et al.; Bagging, Boosting and the Random Subspace Method for Linear Classifiers; 2002; Springer-Verlag London Limited; pp. 121-135.
Tao, Dacheng et al.; “Asymmetric Bagging and Random Subspace for Support Vector Machines-Based Relevance Feedback in Image Retrieval”; 2006; IEEE Computer Society; pp. 1088-1099.
Kotsiantis, S. B. et al.; “Machine learning: a review of classification and combining techniques”; 2006; Springer; Artificial Intelligence Review; pp. 159-190.
Kane, Paul J. et al. “Quantification of Banding, Streaking and Grain in Flat Field Images”, 2000.
Kim, JiSoo et al. “Text Locating from Natural Scene Images Using Image Intensities”, 2005 IEEE.
Gupta, et al., “A Reputation System for Peer-to-Peer Networks,” ACM (2003).
Golbeck, et al., “Inferring Reputation on the Semantic Web,” ACM, 2004.
Okumura, Motonobu, “E-Mail Filtering by Relation Learning”, IEICE Technical Report, vol. 103, No. 603, The Institute of Electronics, Information and Communication Engineers, Jan. 19, 2004, pp. 1-5 [English Abstract Only].
Inoue, Naomi, “Computer and Communication: Recent State of Filtering Software,” IPSJ Magazine, vol. 40, No. 10, Japan, The Institute of Electronics, Information and Communication Engineers, Oct. 15, 1999, pp. 1007-1010 [English Abstract Only].
Australian Patent Office Examination Report in Australian Patent Application Serial No. 2003230606 mailed on Apr. 3, 2008.
Australian Patent Office Examination Report No. 1 in Australian Patent Application Serial No. 2009203095 mailed on Oct. 12, 2010.
Australian Patent Office Examination Report No. 2 in Australian Patent Application Serial No. 2009203095 mailed on Feb. 2, 2012.
Australian Patent Office Examination Report No. 3 in Australian Patent Application Serial No. 2009203095 mailed on Mar. 28, 2012.
Canadian Intellectual Property Office Examination Report in Canadian Patent Application Serial No. 2478299 mailed on Jul. 9, 2010.
European Supplementary Search Report for EP Application No. 03723691.6, dated Jun. 29, 2010, 6 pages.
European Patent Office Action for EP Application No. 03723691.6, dated Oct. 12, 2010, 6 pages.
European Patent Office Communication Pursuant to Article 94(3) EPC in EP Application Serial No. 03723691.3 mailed on Jan. 30, 2013.
European Patent Office Search Report and Opinion in EP Application Serial No. 12189404.2 mailed on Jan. 30, 2013.
European Patent Office Search Report and Opinion in EP Application Serial No. 12189407.5 mailed on Jan. 28, 2013.
European Patent Office Search Report and Opinion in EP Application Serial No. 12189412.5 mailed on Jan. 30, 2013.
European Patent Office Search Report and Opinion in EP Application Serial No. 12189413.3 mailed on Jan. 24, 2013.
First/Consequent Examination Report in Indian Patent Application No. 2639/DELNP/2004, dated Apr. 8, 2011, 3 pages.
Official Action (with uncertified Translation), Japanese Patent Application No. 2003-575222, Sep. 25, 2009, 13 pages.
PCT International Search Report in PCT International Application Serial No. PCT/US2003/007042 mailed on Nov. 13, 2003.
PCT International Preliminary Examination Report in PCT International Application Serial No. PCT/US2003/007042 mailed on Jan. 29, 2004.
Australian Patent Office Examination Report in Australian Patent Application Serial No. 2005304883 mailed on Apr. 16, 2010.
Canadian Patent Office Action in Canadian Patent Application Serial No. 2586709 mailed on Mar. 20, 2013.
China, State Intellectual Property Office, P.R. China, First Office Action in Chinese Patent Application Serial No. 20050046047 mailed on Mar. 1, 2010.
China, State Intellectual Property Office, P.R. China, Second Office Action in Chinese Patent Application Serial No. 20050046047 mailed on Dec. 7, 2010.
China, State Intellectual Property Office, P.R. China, Decision on Rejection in Chinese Patent Application Serial No. 20050046047 mailed on Jun. 27, 2011.
European Patent Office Supplementary Search Report and Written Opinion in EP Application Serial No. 05823134.1 mailed on Jun. 3, 2013.
Japanese Examiner Koji Tamaki, Office Action in JP App. Ser. No. 2007-540073 dated Dec. 16, 2010.
Japanese Patent Office Action in JP Application No. 2007-540073 dated Jul. 7, 2011 (with uncertified translation).
PCT International Search Report and Written Opinion in PCT Application Serial No. PCT/US2005/039978 mailed on Jul. 8, 2008.
PCT International Preliminary Report on Patentability in PCT Application Serial No. PCT/US2005/039978 mailed on May 5, 2009.
Canadian Office Action in Canadian Patent Application Serial No. 2,628,189 mailed on Dec. 8, 2011.
Canadian Office Action in Canadian Patent Application Serial No. 2,628,189 mailed on Jan. 31, 2013.
First Office Action for Chinese Patent Application Serial No. 200680050707.7 dated Mar. 9, 2010.
European Patent Office Search Report dated Nov. 26, 2010 and Written Opinion in EP Application Serial No. 06839820.5-2416 mailed on Dec. 3, 2010.
European Patent Office Communication Pursuant to Article 94(3) EPC 06839820.5-2416 mailed on Oct. 18, 2011 (including Annex EP Search Report dated Nov. 26, 2010).
PCT International Search Report and Written Opinion in PCT International Patent Application Serial No. PCT/US2006/060771 mailed on Feb. 12, 2008.
PCT International Preliminary Report on Patentability in PCT International Patent Application Serial No. PCT/US2006/060771 mailed on May 14, 2008.
PCT International Search Report and Written Opinion in PCT International Application Serial No. PCT/US2007/070483 mailed on Nov. 28, 2007.
PCT International Preliminary Report on Patentability in PCT International Application Serial No. PCT/US2007/070483 mailed on Dec. 10, 2008.
PCT International Search Report and Written Opinion in PCT International Application Serial No. PCT/US2007/070491 mailed on Dec. 20, 2007.
PCT International Preliminary Report on Patentability in PCT International Application Serial No. PCT/US2007/070491 mailed on Dec. 10, 2008.
Australian Patent Office First Examination Report and SIS in Australian Patent Application Serial No. 2008207924 mailed on Dec. 14, 2011.
State Intellectual Property Office, P.R. China First Office Action dated Nov. 9, 2011 in Chinese Patent Application Serial No. 200880009672.1.
State Intellectual Property Office, P.R. China Second Office Action dated Aug. 9, 2012 in Chinese Patent Application Serial No. 200880009672.1.
State Intellectual Property Office, P.R. China Third Office Action dated Nov. 9, 2012 in Chinese Patent Application Serial No. 200880009672.1.
European Patent Office Invitation Pursuant to Rule 62a(1) EPC mailed on Oct. 11, 2011.
PCT International Search Report in PCT International Application Serial No. PCT/US2008/051865 dated Jun. 4, 2008.
PCT International Preliminary Report on Patentability in PCT Application Serial No. PCT/US2008/051865 mailed on Jul. 28, 2009.
Australian Patent Office Patent Examination Report No. 1 issued in Australian Patent Application Serial No. 2008207930 on Dec. 9, 2011.
Australian Patent Office Examination Report No. 2 issued in Australian Patent Application Serial No. 2008207930 on Sep. 10, 2012.
State Intellectual Property Office, P.R. China First Office Action in Chinese Patent Application Serial No. 200880009762.0 mailed on Sep. 14, 2011.
EPO Extended Search Report and Opinion in EP Application Serial No. 08728178.8 mailed on Aug. 2, 2012.
PCT International Search Report and Written Opinion in PCT International Application Serial No. PCT/US2008/051876 mailed on Jun. 23, 2008.
PCT International Preliminary Report on Patentability in PCT Application Serial No. PCT/US2008/051876 mailed on Jul. 28, 2009.
EPO Communication Pursuant to Article 94(3) EPC in EP Application Serial No. 08847431.7-2416 mailed on Dec. 11, 2012.
EPO Supplementary European Search Report in EP Application Serial No. 08847431.7-2416 mailed on Dec. 3, 2012.
PCT International Search Report and Written Opinion in PCT Application Serial No. PCT/US2008/082771, mailed on Aug. 24, 2009.
PCT International Preliminary Report on Patentability in PCT Application Serial No. PCT/US2008/082771, mailed on May 11, 2010.
Related Publications (1)
Number            Date        Country
20060251068 A1    Nov 2006    US
Provisional Applications (2)
Number      Date        Country
60625507    Nov 2004    US
60736121    Nov 2005    US
Continuations (2)
Parent 10093553   Mar 2002   US      Child 11218689   US
Parent 11423313              US      Child 11218689   US
Continuation in Parts (11)
Parent 11173941   Jul 2005   US      Child 11423313   US
Parent 11142943   Jun 2005   US      Child 11173941   US
Parent 11423313              US      Child 11173941   US
Parent 11383347   May 2006   US      Child 11423313   US
Parent 11218689   Sep 2005   US      Child 11383347   US
Parent 10094211   Mar 2002   US      Child 11423313   US
Parent 10094266   Mar 2002   US      Child 10094211   US
Parent 10361091   Feb 2003   US      Child 10094266   US
Parent 10373325   Feb 2003   US      Child 10361091   US
Parent 10361067   Feb 2003   US      Child 10373325   US
Parent 10384924   Mar 2003   US      Child 10361067   US