System and method for adding context to prevent data leakage over a computer network

Information

  • Patent Grant
  • 8938773
  • Patent Number
    8,938,773
  • Date Filed
    Wednesday, January 30, 2008
  • Date Issued
    Tuesday, January 20, 2015
  • Inventors
  • Original Assignees
  • Examiners
    • Abrishamkar; Kaveh
  • Agents
    • Knobbe Martens Olson & Bear LLP
Abstract
Systems and methods for adding context to prevent data leakage over a computer network are disclosed. Data is classified and contextual information of the data is determined. A transmission policy is determined in response to the classification and contextual information. The data is either transmitted or blocked in response to the classification and the contextual information.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This application relates to computer network security.


2. Description of the Related Technology


Computer networks are used to transmit data between computers in a seamless manner. A user can send a file or an email with an attachment to another user connected to the network. In many instances, this data transfer occurs over the Internet. Often, the content may contain sensitive information (e.g., business plans, financial data, product drawings, trade secrets, etc.) that should not be sent to the wrong recipient.


The owners of the data have an interest in preventing the leakage of sensitive data over computer networks. Currently, there are methods for analyzing and classifying data being sent over the network. These methods determine the type of data and prevent the dissemination of data classified as being protected. In this regard, these methods classify the data and apply a protection/transmission policy depending on the type of data. For example, a policy might forbid the transmission of any business information containing social security numbers.


However, a policy based solely on the type of data may not provide the level of leakage prevention that is needed. For example, sometimes a company might want to limit the transmission of data to certain users or destinations. Analysis of the data itself cannot provide this level of classification, so a reliable policy cannot be developed.


SUMMARY OF CERTAIN INVENTIVE ASPECTS

In one inventive aspect, a system for preventing the unauthorized transmission of data over a computer network is disclosed. The system has a data source, containing data, in communication with the network. A network (Internet) gateway is in communication with the network and a destination. The network gateway is configured to determine a transmission policy in response to the type of data and contextual information of the data.


The contextual information can be sender contextual information and/or destination contextual information. For example, the sender contextual information may be an IP address of the data source, a user name or a group of users. The destination contextual information may be an IP address of the destination, a network of the destination or a category of the destination. The transmission policy may block transmission of the data, permit transmission of the data and/or report attempted transmission of the data.


Typically, the data source is an electronic device such as a PDA, computer, cell phone, etc., and the devices communicate over the Internet.


The network gateway can include a classification module for determining the type of data, a context information module for determining the contextual information, a policy/reporting module for generating the transmission policy and an enforcement module for either transmitting the data, blocking the data and/or reporting the transmission of the data.


In another inventive aspect, a method of preventing an unauthorized transmission of data over a computer network is disclosed. The method comprises classifying the data and determining contextual information of the data. Next, a transmission policy is determined in response to the classification and contextual information. The data is either transmitted or blocked in response to the classification and the contextual information.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a representation of a computer network whereby data leakage prevention using classification and contextual information can occur.



FIG. 2 is a block diagram illustrating some of the components to prevent data leakage over the computer network shown in FIG. 1.



FIG. 3 is a flowchart illustrating the process of preventing data leakage using classification and contextual information.



FIG. 4 is a table illustrating how transmission policies are applied to different scenarios.





DETAILED DESCRIPTION OF CERTAIN INVENTIVE EMBODIMENTS

The following detailed description is directed to certain specific embodiments of the invention. However, the invention can be embodied in a multitude of different systems and methods. In this description, reference is made to the drawings wherein like parts are designated with like numerals throughout.


Referring to FIG. 1, a system for preventing the unauthorized transmission of data over a computer network is shown. A user can use a digital device as a source of data 10 (i.e., PDA 10a, laptop computer 10b, cell phone 10c, computer 10d, or other type of digital device) to transmit data over a computer network 12 and Internet 18 to a destination device 14 such as another electronic device. It will be recognized that any type of device 10 can be used as the source of data (e.g., fax machine, scanner, network disk drive, USB memory device, etc.). The devices 10 are connected to an internal network 12 either through a wired or wireless connection. Each of the devices 10 contains data that may be transmitted. The network 12 can be a local area network (LAN) with a connection to the Internet 18. It will be recognized that multiple LANs can be connected together to form a wide area network (WAN) that could connect to the Internet 18. The network 12 can be an Ethernet 10BASE-T topology, or based on any networking protocol, including wireless networks, token ring networks, and the like.


The network 12 communicates with a network/Internet gateway 16 in order to provide the sources 10 a connection to the Internet 18. The Internet gateway 16 may be a server or combination of servers for translating TCP/IP protocols into proper protocols for communication across the local area network 12. The gateway 16 is well known in the art and normally communicates through routers or other data switching technology. Furthermore, the gateway 16 illustrated in FIG. 1 may include content filtering that prevents users from accessing prohibited websites, as well as data leakage prevention to prevent prohibited content from traveling outside the network 12, as will be further explained below.


The Internet gateway 16 communicates with the Internet 18 and hence the destination 14 through commonly known techniques. Accordingly, other gateways, routers, switches and/or other devices may be in the path of communication between the Internet 18, the Internet gateway 16, the destination 14, the network 12 and the sources 10. The Internet gateway 16 analyzes TCP/IP traffic passing therethrough. The destination 14 may be an electronic device, an IP address, an email address, a network address or other type of recipient.


Referring to FIG. 2, a block diagram showing the components of FIG. 1 is illustrated. The source 10 includes data 20 to be transmitted to the destination 14. The data 20 may be any type of data such as numerical, textual, graphical, etc. The data 20 may be transmitted as an email attachment, instant message, FTP transfer, or anything else that can be converted into TCP/IP traffic. The data 20 is transmitted to the Internet gateway 16, which contains software (i.e., modules) that prevents the unauthorized dissemination of confidential information. The term “module”, as used herein, may be, but is not limited to, a software or hardware component, such as an FPGA or ASIC, which performs certain tasks. A module may be configured to reside on an addressable storage medium and configured to execute on one or more processors. Accordingly, a module may include components such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules. The tasks to be performed by the modules may be programmed using ordinary techniques as is commonly known.


The Internet gateway 16 includes a classification module 22, a policy/reporting module 24, a who/where context information module 26 and an enforcement module 28. In addition, an administration module 30 may be in communication with the Internet gateway 16 and/or may be incorporated therein.


The classification module 22 analyzes the data 20 to determine whether the data 20 contains prohibited content by identifying a fingerprint and/or signature of the data 20. The fingerprint or signature of the data 20 is compared to a database of signatures in order to identify the content. As such, the classification module 22 determines the content of the data. The who/where context information module 26 determines who sent the data and the destination of the data. For example, the context information module 26 determines who sent the data by identifying the sender using directory services over the network 12. For example, senders can be identified by using a list of users compiled in a company's database or by using an LDAP service. Typically, the users are also mapped to IP addresses in order to obtain their location. The sender's identification is then used as contextual information that can be applied to the data. For instance, the sender's contextual information may be an IP address of the sender, the user's identity, group identity, or any other type of information that adds further context to the sender of the data 20.
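The fingerprint-and-lookup classification described above can be sketched as follows. This is an illustrative sketch only, not part of the disclosure; the use of SHA-256 as the fingerprint, the database contents, and the function name are all assumptions.

```python
import hashlib

# Hypothetical signature database: fingerprints of known-sensitive content.
# (This entry is the SHA-256 digest of the bytes b"test", for illustration.)
SENSITIVE_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def classify(data: bytes) -> str:
    """Fingerprint the data and compare it against the signature database."""
    fingerprint = hashlib.sha256(data).hexdigest()
    return "prohibited" if fingerprint in SENSITIVE_SIGNATURES else "unclassified"
```

A real classification module would likely use rolling or partial-document fingerprints rather than a whole-file hash, so that excerpts of a protected document are still recognized.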


The context information module 26 also determines contextual information regarding the destination of the data 20. For example, by comparing the IP address of the destination to a database of IP addresses sorted by categories, it is possible to categorize the destination. The database of IP addresses includes known IP addresses that are sorted based on contextual information. The IP addresses are grouped into categories depending on the type of destination and content contained. Some non-limiting examples of categories may be “malicious addresses”, “competitors”, “public sites”, etc. The IP addresses are grouped by categories, as is commonly known in the web filtering industry, and generated by analyzing the destination. In addition to categorizing the destination, it is also possible to add other contextual information such as the network of the destination or just the address of the destination. Therefore, the destination contextual information may be any additional information that further characterizes the destination of the data 20. For example, the destination contextual information may be the reputation of the destination, the name and/or type of entity, the location of the destination, known malicious recipients, etc.
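The destination categorization described above, comparing a destination IP address against a category-sorted database, might be sketched as follows. The network prefixes, category names, and function name are hypothetical (the addresses are from documentation-reserved ranges).

```python
import ipaddress

# Hypothetical category database keyed by network prefix, in the style of
# the web-filtering-industry databases the description mentions.
IP_CATEGORIES = {
    ipaddress.ip_network("203.0.113.0/24"): "malicious addresses",
    ipaddress.ip_network("198.51.100.0/24"): "competitors",
}

def categorize_destination(ip: str) -> str:
    """Return the category of the destination IP, or a default if unknown."""
    addr = ipaddress.ip_address(ip)
    for network, category in IP_CATEGORIES.items():
        if addr in network:
            return category
    return "uncategorized"
```

A production database would be indexed by prefix (e.g., a radix tree) rather than scanned linearly, but the lookup semantics are the same.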


The policy/reporting module 24 is used to determine the policy applied to the data 20. Specifically, based on the classification of the data 20 determined by the classification module 22 and the contextual information determined by the context information module 26, it is possible to generate a policy for the data 20. The policy determines whether the data 20 is to be transmitted, blocked and/or reported, as will be further explained in FIG. 3.


The enforcement module 28 applies the policy to the data 20 and either blocks or transmits the data 20 to the Internet 18. The administration module 30 allows an administrator to change policies and/or allow data 20 to be transmitted after further review.


Referring to FIG. 3, a flowchart for adding contextual information to data leakage prevention is shown. In block 300, data 20 is received or sent by the Internet gateway 16. Next, the data is inspected/analyzed in block 302 and classified in block 304 by the classification module 22. As previously mentioned, the data 20 is fingerprinted and a signature is identified in order to determine whether the data 20 is information that should be blocked or transmitted.


Next, in block 305, contextual information about the data 20 is determined by who/where context information module 26. Specifically, the “who” of the sender may be one or all of a specific IP address, an individual user or a member of a specified group. The “where” of the destination 14 may be one or all of a category of the destination, the network of the destination or the IP address of the destination.


In block 306 the policy for the data 20 is determined by policy/reporting module 24. The policy is determined using the classification determined from block 304 and the contextual information determined from block 305.


Referring to FIG. 4, a table showing some exemplary policies is illustrated. The classification of data/content derived from block 304 of FIG. 3 is shown in column 402. The sender's contextual information is listed in column 404, while the destination contextual information is listed in column 406. As previously described, the sender contextual information and the destination contextual information are generated in block 305 of FIG. 3. Column 408 of FIG. 4 lists the policy applied to the data/content for each respective row of the table. For example, in row 410, the data/content is business data, while the sender contextual information indicates that user A sent the information and the destination contextual information indicates that data is to be sent to network A. In this instance, the policy to be applied is to report that user A is attempting to send the data to network A. Rows 412 and 414 show similar scenarios except that the destination contextual information is different. Specifically, in row 412 the data/content is allowed to be transmitted, while in row 414 the data/content is blocked because the data/content is being sent to IP address 3, which may be associated with a malicious site. Accordingly, rows 410, 412 and 414 illustrate examples whereby the data/content is the same and the sender is the same, but the policy is different based upon the destination contextual information. Similarly, rows 416, 418 and 420 illustrate an example whereby the data/content is the same and the destination contextual information is the same, but the policy changes based upon the sender contextual information. It will be recognized by those of ordinary skill in the art that many different policies can be configured in order to provide the desired type of security. By both classifying the type of data and using the contextual information, it is possible to generate more granular policies.
Furthermore, reporting, incident handling, alias usage and priority handling are facilitated by using both classification and contextual information.
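The policy determination of FIG. 4 amounts to mapping a (classification, sender context, destination context) triple to an action. A minimal sketch, with hypothetical table entries loosely modeled on rows 410-414:

```python
# Hypothetical policy table in the spirit of FIG. 4:
# (data classification, sender context, destination context) -> action.
POLICY_TABLE = {
    ("business data", "user A", "network A"): "report",   # cf. row 410
    ("business data", "user A", "network B"): "permit",   # cf. row 412
    ("business data", "user A", "IP address 3"): "block", # cf. row 414
}

def lookup_policy(classification: str, sender: str, destination: str,
                  default: str = "block") -> str:
    """Resolve the transmission policy; unmatched triples get the default."""
    return POLICY_TABLE.get((classification, sender, destination), default)
```

The default action and the exact-match lookup are assumptions; an actual policy engine would typically support wildcards and group-level rules rather than only exact triples.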


Referring back to FIG. 3, in decision block 308, the policy/reporting module 24 determines whether the data should be blocked. If the data 20 should not be blocked, then the data is transmitted in block 318 by the enforcement module 28. However, if the policy for the data/content is to block the data 20 or report the data 20, then the process proceeds to block 310, whereby it is determined whether the attempt to transmit the data 20 should be reported. If the transmission is not to be reported, then the process proceeds to block 316, where the transmission of the data 20 is blocked by the enforcement module 28. However, if the transmission is to be reported, then the process proceeds to block 312, whereby an administrator or other supervisor can review the data, sender and recipient information and determine whether to send the data 20. If the data 20 is to be sent, the process proceeds to block 318, whereby the data 20 is sent. However, the process proceeds to block 316 and the data is not transmitted if the supervisor or administrator believes it is not appropriate to send the data. It will be recognized that it is possible to omit the manual review of block 312 and simply report and block the transmission of the data 20.
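The decision flow of blocks 308 through 318 can be sketched as a simple function. The policy labels and the reviewer hook are assumptions made for illustration, not part of the disclosure.

```python
def enforce(policy: str, reviewer_approves: bool = False) -> str:
    """Apply the FIG. 3 decision flow: transmit, block, or report-and-review."""
    if policy == "permit":
        return "transmitted"      # block 318: not blocked, send the data
    if policy == "block":
        return "blocked"          # block 316: block without reporting
    # policy == "report": block 312, an administrator reviews and decides
    if reviewer_approves:
        return "transmitted"      # reviewer allows the transmission
    return "blocked"              # reviewer declines; data is not sent
```

As the description notes, the review step can be omitted entirely, in which case a "report" policy would simply log the attempt and block the data.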


While block 308 has been described as blocking data 20, it will be recognized that other types of actions may occur after the policy has been determined in block 306. Specifically, block 308 may initiate a workflow based upon the policy determined in block 306 whereby the data 20 is further analyzed, classified, inspected, etc. . . .


While the above description has shown, described, and pointed out novel features of the invention as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the spirit and scope of the invention.

Claims
  • 1. A system for preventing unauthorized transmission of data over a computer network, the system comprising: a network gateway device in communication with the computer network, the network gateway device configured to receive data in transit between a source and a destination, wherein the network gateway device comprises: a classification module configured to determine whether the data in transit includes prohibited content;a context information module configured to generate sender contextual information related to the source of the received data and destination contextual information related to the destination of the received data, wherein the destination contextual information comprises a categorization of the Internet Protocol (IP) address of the destination, and wherein the categorization of the IP address of the destination is based at least in part on website content stored at the destination; anda transmission policy module configured to determine a transmission policy based on the determination of the classification module and the sender contextual information and the destination contextual information.
  • 2. The system of claim 1 further comprising a database of internet protocol addresses sorted by categories, wherein the categorization of the destination is further based on a comparison of an internet protocol address associated with the destination to the database of internet protocol addresses.
  • 3. The system of claim 1 wherein the destination contextual information is further based on a reputation of the destination.
  • 4. The system of claim 1 wherein the destination contextual information is further based on a geographic location of the destination.
  • 5. The system of claim 4 wherein the sender contextual information comprises an IP address of the data source, a user name or a group of users.
  • 6. The system of claim 4 wherein the destination contextual information further comprises an IP address of the destination, a network of the destination or a category of the destination.
  • 7. The system of claim 4 wherein the transmission policy module is further configured to determine whether the network gateway transmits the data or blocks transmission of the data.
  • 8. The system of claim 7 wherein the transmission policy module is further configured to report that the data source is attempting to transmit data.
  • 9. The system of claim 1 wherein the data source is an electronic device.
  • 10. The system of claim 9 wherein the electronic device is selected from the group consisting of: a PDA;a computer; anda cell phone.
  • 11. The system of claim 1 wherein the computer network is the Internet.
  • 12. The system of claim 1 wherein the network gateway further comprises an enforcement module configured to transmit or block the data in response to data received from the transmission policy module.
  • 13. A method of preventing an unauthorized transmission of data over a computer network, the method comprising: receiving at a network gateway device connected to a network, data in transit between a source and a destination, wherein the source and the destination are in communication with the network;classifying the data to determine whether the data includes prohibited content;generating sender contextual information related to the source of the data;generating destination contextual information related to the destination of the data, wherein the destination contextual information comprises a categorization of the Internet Protocol (IP) address of the destination, wherein the categorization of the IP address of the destination is based on website content stored at the destination; anddetermining a transmission policy for the data in response to the classification of the data and the sender contextual information and the destination contextual information.
  • 14. The method of claim 13 wherein the destination contextual information is further based on a reputation of the destination.
  • 15. The method of claim 14 wherein the destination contextual information is further based on a geographic location of the destination.
  • 16. The method of claim 13 further comprising storing a database of internet protocol addresses sorted by categories, wherein the categorization of the destination is further based on a comparison of an internet protocol address associated with the destination to the database of internet protocol addresses.
  • 17. The method of claim 13 wherein the sender contextual information comprises an IP address of the sender, a user name of the sender or a group name of the user.
  • 18. The method of claim 13 wherein the destination contextual information comprises an IP address of the destination, a network of the destination or a category of the destination.
  • 19. The method of claim 13 further comprising the step of reporting that the data is to be transmitted.
  • 20. The method of claim 13 wherein the step of classifying the data is performed by a classification module.
  • 21. The method of claim 13 wherein the step of determining the contextual information of the data is performed by a context information module.
  • 22. The method of claim 13 wherein the step of determining a transmission policy for the data is performed by a policy/reporting module.
  • 23. The method of claim 13 further comprising the step of sending the data to a network gateway prior to classifying the data.
  • 24. The method of claim 23 wherein the step of sending the data is performed with an electronic device connected to a network.
  • 25. The method of claim 24, wherein the electronic device is selected from the group consisting of: a PDA;a computer; anda cell phone.
  • 26. The method of claim 24 wherein the network is the Internet.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application No. 60/887,908, filed on Feb. 2, 2007, which is incorporated by reference herein in its entirety.

US Referenced Citations (171)
Number Name Date Kind
5414833 Hershey et al. May 1995 A
5581804 Cameron et al. Dec 1996 A
5590403 Cameron et al. Dec 1996 A
5596330 Yokev et al. Jan 1997 A
5712979 Graber et al. Jan 1998 A
5720033 Deo Feb 1998 A
5724576 Letourneau Mar 1998 A
5801747 Bedard Sep 1998 A
5828835 Isfeld et al. Oct 1998 A
5832228 Holden et al. Nov 1998 A
5899991 Karch May 1999 A
5905495 Tanaka et al. May 1999 A
5919257 Trostle Jul 1999 A
5937404 Csaszar et al. Aug 1999 A
6012832 Saunders et al. Jan 2000 A
6092194 Touboul Jul 2000 A
6185681 Zizzi Feb 2001 B1
6252884 Hunter Jun 2001 B1
6301658 Koehler Oct 2001 B1
6338088 Waters et al. Jan 2002 B1
6357010 Viets et al. Mar 2002 B1
6460141 Olden Oct 2002 B1
6493758 McLain Dec 2002 B1
6654787 Aronson et al. Nov 2003 B1
6804780 Touboul Oct 2004 B1
6832230 Zilliacus et al. Dec 2004 B1
6988209 Balasubramaniam et al. Jan 2006 B1
7051200 Manferdelli et al. May 2006 B1
7058822 Edery et al. Jun 2006 B2
7080000 Cambridge Jul 2006 B1
7089589 Chefalas et al. Aug 2006 B2
7100199 Ginter et al. Aug 2006 B2
7136867 Chatterjee et al. Nov 2006 B1
7155243 Baldwin et al. Dec 2006 B2
7185361 Ashoff et al. Feb 2007 B1
7249175 Donaldson Jul 2007 B1
7346512 Li-chun Wang et al. Mar 2008 B2
7376969 Njemanze et al. May 2008 B1
7447215 Lynch et al. Nov 2008 B2
7536437 Zmolek May 2009 B2
7617532 Alexander et al. Nov 2009 B1
7634463 Katragadda et al. Dec 2009 B1
7644127 Yu Jan 2010 B2
7693945 Dulitz et al. Apr 2010 B1
7707157 Shen Apr 2010 B1
7725937 Levy May 2010 B1
7783706 Robinson Aug 2010 B1
7814546 Strayer et al. Oct 2010 B1
7818800 Lemley, III et al. Oct 2010 B1
7991411 Johnson et al. Aug 2011 B2
8078625 Zhang et al. Dec 2011 B1
8498628 Shapiro et al. Jul 2013 B2
8695100 Cosoi Apr 2014 B1
20010047474 Takagi et al. Nov 2001 A1
20020078045 Dutta Jun 2002 A1
20020087882 Schneier et al. Jul 2002 A1
20020091947 Nakamura Jul 2002 A1
20020095592 Daniell et al. Jul 2002 A1
20020099952 Lambert et al. Jul 2002 A1
20020129140 Peled et al. Sep 2002 A1
20020129277 Caccavale Sep 2002 A1
20020162015 Tang Oct 2002 A1
20020174358 Wolff et al. Nov 2002 A1
20020199095 Bandini et al. Dec 2002 A1
20030018491 Nakahara et al. Jan 2003 A1
20030018903 Greca et al. Jan 2003 A1
20030074567 Charbonneau Apr 2003 A1
20030093694 Medvinsky et al. May 2003 A1
20030110168 Kester et al. Jun 2003 A1
20030135756 Verma Jul 2003 A1
20030172292 Judge Sep 2003 A1
20030177361 Wheeler et al. Sep 2003 A1
20030185395 Lee et al. Oct 2003 A1
20030185399 Ishiguro Oct 2003 A1
20030188197 Miyata et al. Oct 2003 A1
20030202536 Foster et al. Oct 2003 A1
20040003139 Cottrille et al. Jan 2004 A1
20040003286 Kaler et al. Jan 2004 A1
20040034794 Mayer et al. Feb 2004 A1
20040039921 Chuang Feb 2004 A1
20040111632 Halperin Jun 2004 A1
20040117624 Brandt et al. Jun 2004 A1
20040139351 Tsang Jul 2004 A1
20040153644 McCorkendale Aug 2004 A1
20040162876 Kohavi Aug 2004 A1
20040187029 Ting Sep 2004 A1
20040203615 Qu et al. Oct 2004 A1
20040255147 Peled et al. Dec 2004 A1
20040260924 Peled et al. Dec 2004 A1
20050025291 Peled et al. Feb 2005 A1
20050027980 Peled et al. Feb 2005 A1
20050033967 Morino et al. Feb 2005 A1
20050048958 Mousseau et al. Mar 2005 A1
20050055327 Agrawal et al. Mar 2005 A1
20050066197 Hirata et al. Mar 2005 A1
20050086520 Dharmapurikar et al. Apr 2005 A1
20050091535 Kavalam et al. Apr 2005 A1
20050108557 Kayo et al. May 2005 A1
20050111367 Jonathan Chao et al. May 2005 A1
20050120229 Lahti Jun 2005 A1
20050131868 Lin et al. Jun 2005 A1
20050138109 Redlich et al. Jun 2005 A1
20050138353 Spies et al. Jun 2005 A1
20050149726 Joshi et al. Jul 2005 A1
20050210035 Kester et al. Sep 2005 A1
20050223001 Kester et al. Oct 2005 A1
20050229250 Ring et al. Oct 2005 A1
20050251862 Talvitie Nov 2005 A1
20050273858 Zadok et al. Dec 2005 A1
20050283836 Lalonde et al. Dec 2005 A1
20050288939 Peled et al. Dec 2005 A1
20060004636 Kester et al. Jan 2006 A1
20060020814 Lieblich et al. Jan 2006 A1
20060021031 Leahy et al. Jan 2006 A1
20060026105 Endoh Feb 2006 A1
20060026681 Zakas Feb 2006 A1
20060031504 Hegli et al. Feb 2006 A1
20060036874 Cockerille et al. Feb 2006 A1
20060053488 Sinclair et al. Mar 2006 A1
20060068755 Shraim et al. Mar 2006 A1
20060080735 Brinson et al. Apr 2006 A1
20060095459 Adelman et al. May 2006 A1
20060095965 Phillips et al. May 2006 A1
20060098585 Singh et al. May 2006 A1
20060101514 Milener et al. May 2006 A1
20060129644 Owen et al. Jun 2006 A1
20060168006 Shannon et al. Jul 2006 A1
20060191008 Fernando et al. Aug 2006 A1
20060212723 Sheymov Sep 2006 A1
20060251068 Judge et al. Nov 2006 A1
20060259948 Calow et al. Nov 2006 A1
20060265750 Huddleston Nov 2006 A1
20060272024 Huang et al. Nov 2006 A1
20060277259 Murphy et al. Dec 2006 A1
20060282890 Gruper et al. Dec 2006 A1
20060288076 Cowings et al. Dec 2006 A1
20070005762 Knox et al. Jan 2007 A1
20070011739 Zamir et al. Jan 2007 A1
20070027965 Brenes et al. Feb 2007 A1
20070028302 Brennan et al. Feb 2007 A1
20070067844 Williamson et al. Mar 2007 A1
20070143424 Schirmer et al. Jun 2007 A1
20070150827 Singh et al. Jun 2007 A1
20070156833 Nikolov et al. Jul 2007 A1
20070195779 Judge et al. Aug 2007 A1
20070199054 Florencio et al. Aug 2007 A1
20070220607 Sprosts et al. Sep 2007 A1
20070260602 Taylor Nov 2007 A1
20070261112 Todd et al. Nov 2007 A1
20070294199 Nelken et al. Dec 2007 A1
20070294428 Guy et al. Dec 2007 A1
20070299915 Shraim et al. Dec 2007 A1
20080009268 Ramer et al. Jan 2008 A1
20080040804 Oliver et al. Feb 2008 A1
20080047017 Renaud Feb 2008 A1
20080100414 Diab et al. May 2008 A1
20080262991 Kapoor et al. Oct 2008 A1
20080267144 Jano et al. Oct 2008 A1
20080282338 Beer Nov 2008 A1
20080295177 Dettinger et al. Nov 2008 A1
20090064326 Goldstein Mar 2009 A1
20090100055 Wang Apr 2009 A1
20090100518 Overcash Apr 2009 A1
20090119402 Shull et al. May 2009 A1
20090131035 Aiglstorfer May 2009 A1
20090241191 Keromytis et al. Sep 2009 A1
20090320135 Cavanaugh Dec 2009 A1
20100024037 Grzymala-Busse et al. Jan 2010 A1
20100064347 More et al. Mar 2010 A1
20100198928 Almeida Aug 2010 A1
20100312843 Robinson Dec 2010 A1
Foreign Referenced Citations (24)
Number Date Country
1367595 Sep 2002 CN
1756147 Apr 2006 CN
101060421 Oct 2007 CN
1 180 889 Feb 2002 EP
1 278 330 Jan 2003 EP
1 280 040 Jan 2003 EP
1 457 885 Sep 2004 EP
1 510 945 Mar 2005 EP
1571578 Sep 2005 EP
1 638 016 Mar 2006 EP
1 643 701 Apr 2006 EP
1 643 701 Apr 2006 EP
2418330 Mar 2006 GB
2000-235540 Aug 2000 JP
WO 9605549 Feb 1996 WO
WO 9642041 Dec 1996 WO
WO 0124012 Apr 2001 WO
WO 2005017708 Feb 2005 WO
WO 2005119488 Dec 2005 WO
WO 2006027590 Mar 2006 WO
WO 2006062546 Jun 2006 WO
WO 2006136605 Dec 2006 WO
WO 2007059428 May 2007 WO
WO 2007106609 Sep 2007 WO
Non-Patent Literature Citations (34)
Entry
International Search Report and Written Opinion dated Feb. 11, 2009 for International Application No. PCT/US2008/052483.
“Google + StopBadward.org = Internet Gestapo?”, http://misterpoll.wordpress.com/2007/01/05/google-stopbadwareorg-internet-gestapo/, Jan. 5, 2007.
“Trends in Badware 2007”, StopBadware.org.
Borck, James R., Clearswift makes a clean sweep of Web threats, originally downloaded from http://www.infoworld.com/d/security-central/clearswift-makes-clean-sweep-web-threats-818., Aug. 22, 2007, pp. 2.
Broder et al., Network Applications of Bloom Filters: A Survey, Internet Mathematics, Apr. 13, 2004, vol. 1, Issue 4, pp. 485-509.
Honoroff, Jacob, An Examination of Bloom Filters and their Applications, originally downloaded from http://cs.unc.edu/˜fabian/courses/CS600.624/slides/bloomslides.pdf, Mar. 16, 2006, pp. 113.
IronPort Web Reputation White Paper, A Comprehensive, Proactive Approach to Web-Based Threats, IronPort Systems, 2009, pp. 10.
Long, John A., Risk and the Right Model, originally downloaded from http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GEtTRDoc.pdf&AD=ADA161757, Jan. 1986, pp. 13.
Rubenking, Neil J., Norton Confidential, originally downloaded from http://www.pcmag.com/article2/0,2817,1999239,00.asp, Aug. 4, 2006, pp. 3.
Ruffo et al., EnFilter: A Password Enforcement and Filter Tool Based on Pattern Recognition Techniques, ICIAP 2005, LNCS 3617, pp. 75-82, 2005.
Spafford, Eugene, Preventing Weak Password Choices, Computer Science Technical Reports, Paper 875, http://docs.lib.purdue.edu/cstech/875, 1991.
Yang et al., Performance of Full Text Search in Structured and Unstructured Peer-to-Peer Systems, Proceedings IEEE Infocom; originally downloaded from http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04146962, 2006, pp. 12.
Wang Ping, “Research on Content Filtering-based Anti-spam Technology,” Outstanding Master's Degree Thesis of China, Issue 11, Nov. 15, 2006.
Ma Zhe, “Research and Realization of Spam Filtering System,” Outstanding Master's Degree Thesis of China, Issue 2, Jun. 15, 2005.
Zhang Yao Long, “Research and Application of Behavior Recognition in Anti-spam System,” Outstanding Master's Degree Thesis of China, Issue 11, Nov. 15, 2006.
Shanmugasundaram et al, Payload Attribution via Hierarchical Bloom Filters, CCS, Oct. 25-29, 2004.
Shanmugasundaram et al., ForNet: A Distributed Forensics Network, In Proceedings of the Second International Workshop on Mathematical Methods, Models and Architectures for Computer Networks Security, 2003.
Clear Text Password Risk Assessment Documentation, SANS Institute, 2002.
Song et al., Multi-pattern signature matching for hardware network intrusion detection systems, IEEE Globecom 2005, Jan. 23, 2006.
Adam Lyon, “Free Spam Filtering Tactics Using Eudora,”, May 21, 2004, pp. 1-4.
Cohen, F., A Cryptographic Checksum for Integrity Protection, Computers & Security, Elsevier Science Publishers, Dec. 1, 1987, vol. 6, Issue 6, pp. 505-510, Amsterdam, NL.
Dahan, M. Ed., “The Internet and government censorship: the case of the Israeli secret service,” Online Information, Proceedings of the International Online Information Meeting, Oxford, Learned Information, GB, Dec. 12-14, 1989, vol. Meeting 13, December, Issue XP000601363, pp. 41-48, Sections 1, 3, London.
Gittler F., et al., The DCE Security Service, Pub: Hewlett-Packard Journal, Dec. 1995, pp. 41-48.
IBM Technical Disclosure Bulletin, Mean to Protect System from Virus, IBM Corp., Aug. 1, 1994, Issue 659-660.
Igakura, Tomohiro et al., Specific quality measurement and control of the service-oriented networking application., Technical Report of IEICE, IEICE Association, Jan. 18, 2002, vol. 101, Issue 563, pp. 51-56, Japan.
International Search Report and Written Opinion for International Application No. PCT/GB2005/003482, Dec. 9, 2005.
IronPort Web Reputation: Protect and Defend Against URL-Based Threats; Ironport Systems, Apr. 2006, 8 pages.
PCT International Search Report and Written Opinion for International Application No. PCT/US2008/052483, Feb. 11, 2009.
Reid, Open Systems Security: Traps and Pitfalls, Computer & Security, 1995, Issue 14, pp. 496-517.
Resnick, P. et al., “PICS: Internet Access Controls Without Censorship”, Communications of the Association for Comuting Machinery, ACM, Oct. 1, 1996, vol. 39, Issue 10, pp. 87-93, New York, NY.
Stein, Web Security—a step by step reference guide, Addison-Wesley, 1997, pp. 387-415.
Symantec Corporation, E-security begins with sound security policies, Announcement Symantec, XP002265695, Jun. 14, 2001, pp. 1,9.
Williams, R., Data Integrity with Veracity, Retrieved from the Internet: <URL: ftp://ftp.rocksoft.com/clients/rocksoft/papers/vercty10.ps>, Sep. 12, 1994.
Zhang et al., The Role of URLs in Objectionable Web Content Categorization, Web Intelligence, 2006.
Related Publications (1)
Number Date Country
20080307489 A1 Dec 2008 US
Provisional Applications (1)
Number Date Country
60887908 Feb 2007 US