System and method for malware containment

Information

  • Patent Grant
  • Patent Number
    10,068,091
  • Date Filed
    Monday, November 23, 2015
  • Date Issued
    Tuesday, September 4, 2018
Abstract
Systems and methods for malware containment on connection are provided. In exemplary embodiments, a malware containment method is described that performs a number of operations. The method involves redirecting network data received over a communication network to a virtual machine. The virtual machine is configured to simulate functionality of a digital device. Furthermore, the method involves analyzing the redirected network data, including analyzing a response of the virtual machine to processing of the network data within the virtual machine, to identify a malware attack. Thereafter, the method involves continuing to redirect the network data for processing by the virtual machine until expiration of a predetermined period of time without detection of malware, or continuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time when malware is detected.
Description

This application is also related to U.S. patent application Ser. No. 11/717,408, filed Mar. 12, 2007, entitled “Malware Containment and Security Analysis on Connection”, U.S. patent application Ser. No. 11/717,474, filed Mar. 12, 2007, entitled “Systems and Methods for Malware Attack Prevention”, and U.S. patent application Ser. No. 11/717,476, filed Mar. 12, 2007, entitled “Systems and Methods for Malware Attack Detection and Identification”. The above-referenced related nonprovisional patent applications are also incorporated by reference herein.


BACKGROUND OF THE INVENTION
Field of the Invention

The present invention relates generally to containment of malware. More particularly, the present invention relates to the containment of malware attacks from digital devices upon connection to a communication network.


Background Art

As the workplace becomes more automated, the use of computers and networks is commonplace. Computers have become indispensable tools that afford access to files and resources. Unfortunately, computers and networks can also place those files and resources at risk.


Computers can become infected with worms and viruses that replicate themselves and seek to damage files or limit network resources. As such, it is not uncommon to read in newspapers of a single infected computer that limited or destroyed the functionality of one or more networks. The cost caused by the damage from these attacks is enormous.


To date, information technology (IT) staff and administrators have sought to limit worms and viruses by cleaning individual computers of worms/viruses, requiring anti-virus applications, and installing firewall applications on network servers and routers. Once the network is clear of worms and viruses, the IT staff and administrators continue to upgrade anti-virus/firewall applications as well as virus/worm definitions for each server and router.


Even if the network is clean of viruses and worms, computers may still become infected. In one example, users of computers connected to an otherwise “clean” network may bring their computer home from work where the computer becomes infected over the Internet or a home network. Even if the computer has an anti-virus application resident on the machine, the anti-virus application may be insufficient to block or correct all possible attacking worms or viruses. Further, the anti-virus application or the worm/virus signature files may be out of date. Moreover, some worms or viruses may not be identified by some anti-virus applications or the worms or viruses may not be previously identified (e.g., a “zero day” attack) and, as such, a worm/virus signature that identifies the worm or virus may not exist. When the computer is brought back to work and reconnected to the network, the worm or virus may activate, make copies of itself, identify other machines on the network, gather information about the network, compromise network security, and/or infect other machines.


SUMMARY OF THE INVENTION

Systems and methods for malware containment on connection are provided. In exemplary embodiments, newly coupled digital devices are temporarily redirected for a predetermined period of time upon connection to the communication network. When a newly coupled digital device is quarantined, all network data transmitted by the digital device is temporarily redirected to a controller which then analyzes the network data to detect unauthorized activity and/or malware within the newly coupled digital device. An exemplary method to contain malware comprises detecting a digital device upon connection with a communication network, temporarily redirecting network data from the digital device for a predetermined period of time, and analyzing the network data received from the digital device to detect malware within the digital device. In some embodiments, the method further comprises determining if the digital device is associated with a white list.


Temporarily redirecting network data can comprise ARP manipulation, the configuration of DHCP services, or the reconfiguration of a switch to direct network data from the digital device to the controller. Analyzing the network data may comprise configuring a virtual machine to receive the network data and analyzing the response of the virtual machine to the network data to detect and/or identify a malware attack. In various embodiments, the method further comprises generating an unauthorized activity signature based on the detection of the malware attack.


Analyzing the network data may comprise analyzing the network data with a heuristic to identify network data containing suspicious activity, configuring a virtual machine to receive the network data, and analyzing the response of the virtual machine to the network data to detect and/or identify the malware within the digital device. Further, analyzing the network data may comprise retrieving a virtual machine configured to receive the network data, configuring a replayer to transmit the network data to the virtual machine, and analyzing a response by the virtual machine to the network data to detect and/or identify the malware within the digital device.


A malware containment system can comprise a controller for containing malware. The controller may comprise a quarantine module and an analysis environment. The quarantine module is configured to detect a digital device upon connection with a communication network and temporarily redirect network data from the digital device for a predetermined period of time. The analysis environment can be configured to analyze the network data to identify malware within the digital device.


In various embodiments, a machine readable medium may have embodied thereon executable code, the executable code being executable by a processor for performing a malware containment method. The malware containment method can comprise detecting a digital device upon connection with a communication network, temporarily redirecting the network data from the digital device for a predetermined period of time, and analyzing the network data to detect malware within the digital device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram of a malware containment environment in which embodiments of the present invention may be practiced.



FIG. 2 is a block diagram of an exemplary controller implementing embodiments of the present invention.



FIG. 3 is a block diagram of an exemplary analysis environment.



FIG. 4 is a flowchart of an exemplary method for malware containment upon connection of a digital device.



FIG. 5 is another flowchart of an exemplary method for malware containment upon connection of a digital device.



FIG. 6 is a flowchart of an exemplary method of generating and transmitting an unauthorized activity signature.



FIG. 7 is a block diagram of an exemplary controller in which embodiments of the present invention may be practiced.





DESCRIPTION OF EXEMPLARY EMBODIMENTS

Exemplary systems and methods for malware containment are provided. In exemplary embodiments, digital devices are quarantined for a predetermined period of time upon connection. When a digital device is quarantined, all network data transmitted by the digital device is directed to a controller which then analyzes the network data to identify unauthorized activity and/or malware within the newly connected digital device.


If malware is identified as present within the digital device, corrective action can be taken. Possible corrective actions include, but are not limited to, permanently quarantining the infected digital device, transmitting a patch to remove the malware, generating an unauthorized activity signature, and sending the unauthorized activity signature to the client to remove the malware.


If malware is not identified as present within the digital device, the network data directed to the controller can be re-transmitted to the proper destination. The quarantine continues until the predetermined period of time expires with no evidence of malware having been found.


Malware is software created and distributed for malicious purposes and can take the form of viruses, worms, trojan horses or adware, for example. A virus is an intrusive program that infects a computer file by inserting a copy of itself in the file. The copy is usually executed when the file is loaded into memory, allowing the virus to infect other files. A worm is a program that propagates itself across multiple computers, usually by creating copies of itself in each computer's memory. A worm might duplicate itself in a computer so many times that it causes the computer to crash. A trojan horse is a destructive program disguised as a game, utility, or application. When run by a user or computer program, a trojan horse can harm the computer system while appearing to do something useful.


Malware may also include adware and spyware. Adware is a program configured to direct advertisements to a computer or a particular user. In one example, adware identifies the computer and/or the user to various websites visited by a browser on the computer. The website may then use the adware to either generate pop-up advertisements or otherwise direct specific advertisements to the user's browser. Spyware is a program configured to collect information regarding the user, the computer, and/or a user's network habits. In an example, spyware may collect information regarding the names and types of websites that the user browses and then transmit the information to another computer. Adware and spyware are often added to the user's computer after the user browses to a website that hosts the adware and/or spyware. The user is often unaware that these programs have been added and is similarly unaware of the adware's and/or spyware's function.



FIG. 1 is a diagram of a malware containment environment 100 in which embodiments of the present invention may be practiced. The malware containment environment 100 comprises a newly coupled device 105 and a controller 110 coupled to a switch 115. The switch 115 is further coupled to a communication network 120. An intended recipient device 125 is also coupled to the communication network 120. In some embodiments, the controller 110 is coupled to the switch 115 over a tap (not depicted).


A newly coupled device 105 is any digital device that recently coupled to the switch 115 and/or the communication network 120. The intended recipient device 125 is any digital device that the newly coupled device 105 transmits network data to. A digital device is any device comprising one or more processors. Some examples of digital devices include computers, servers, laptops, personal digital assistants, and cellular telephones. Network data comprises signals and data transmitted from the newly coupled device 105. Although FIG. 1 depicts the intended recipient device 125 as coupled to the communication network 120, the intended recipient device 125 may be directly coupled to the newly coupled device 105 or the switch 115. There may be any number of newly coupled devices 105, controllers 110, switches 115, communication networks 120, and/or intended recipient devices 125.


The controller 110 is any digital device or software configured to receive and analyze network data for the presence of malware. In exemplary embodiments, the controller 110 detects the presence of a newly coupled device 105 when the digital device initially couples to the switch 115 or the communication network 120. The controller 110 intercepts network data transmitted from the newly coupled device 105 for a predetermined period of time. In other embodiments, the switch 115 is configured to direct network data transmitted from the newly coupled device 105 to the controller 110 for the predetermined period of time. The switch 115 is further discussed herein.


The network data is then analyzed by the controller 110 to determine evidence of a malware attack. If malware or an attack is detected, the controller 110 may perform corrective actions. If no malware is detected, the network data may be re-transmitted to the intended recipient device 125 (e.g., another digital device on the communication network 120). The controller 110 is further discussed in FIG. 2.


The switch 115 is any device configured to receive and direct network data between one or more digital devices. Examples of a switch 115 include, but are not limited to, a router, a gateway, a bridge, and/or a server.


The communication network 120 couples two or more digital devices together to allow the digital devices to communicate and transmit network data to each other. In some examples, the communication network 120 can be a public computer network such as the Internet, or a private computer network such as a wireless telecommunication network, wide area network, or local area network. In some embodiments, the communication network 120 comprises multiple routers, bridges, and hubs that couple a large number of digital devices.


In various embodiments, the controller 110 can receive network data from the communication network 120 over a tap (not depicted). The tap is a digital data tap configured to receive network data and provide a copy of the network data to the controller 110. In one example, the tap intercepts and copies network data without an appreciable decline in performance of devices coupled to the communication network 120, the newly coupled device 105, and/or the switch 115. The tap can copy any portion of the network data. For example, the tap can receive and copy any number of data packets from the network data. In other embodiments, the controller 110 receives network data from the communication network 120 over a span port.


In some embodiments, the network data can be organized into one or more data flows and provided to the controller 110. In various embodiments, the tap can sample the network data based on a sampling scheme. Data flows can be reconstructed based on the network data samples.



FIG. 2 is a block diagram of an exemplary controller 110 implementing embodiments of the present invention. The controller 110 can be any digital device or software that receives network data. The controller 110 can comprise a quarantine module 205, a heuristic module 210, a scheduler 215, a fingerprint module 220, a virtual machine pool 225, an analysis environment 230, a signature module 235, and a policy engine 240. In some embodiments, the controller 110 comprises a tap or span port which is further coupled to the communication network 120. In other embodiments, the controller 110 is coupled to an external tap, external span port, or may be directly coupled to the switch 115 or the communication network 120.


The quarantine module 205 detects one or more newly coupled devices 105 as they operatively couple to the network. When a newly coupled device 105 is detected, network data transmitted from the newly coupled device 105 is quarantined (i.e., temporarily redirected to the controller 110 for a predetermined time). Network data temporarily redirected to the controller 110 is analyzed to determine if the network data contains suspicious data (discussed below) or a malware attack. If the predetermined time expires and no suspicious data or malware is identified, then the quarantine module 205 ceases to redirect network data from the newly coupled device 105. However, if suspicious data or the presence of malware is determined, then corrective action may be taken.
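The redirect-until-expiry behavior described above can be illustrated with a brief sketch (all class, method, and parameter names here are hypothetical; the patent does not specify an implementation):

```python
import time

class QuarantineModule:
    """Illustrative sketch of the quarantine decision loop: redirect a
    newly coupled device's traffic until the predetermined time expires
    with nothing detected, or flag the device when malware is found."""

    def __init__(self, quarantine_seconds=60):
        self.quarantine_seconds = quarantine_seconds

    def quarantine(self, device, analyze):
        """`device.next_batch()` yields one batch of redirected network
        data (or None); `analyze` returns True if the batch shows
        suspicious data or a malware attack."""
        deadline = time.monotonic() + self.quarantine_seconds
        while True:
            batch = device.next_batch(timeout=1.0)
            if batch is not None and analyze(batch):
                return "malware-detected"  # quarantine continues; corrective action
            if time.monotonic() >= deadline:
                return "released"          # timer expired with no detection
```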


The quarantine module 205 can detect a newly coupled device 105 by detecting a request for network services. When a newly coupled device 105 couples to a network, the newly coupled device 105 is configured for communication with the communication network 120. In one example, a newly coupled device 105 may request an internet protocol (IP) address. The IP address request as well as the IP address assignment may be detected by the quarantine module 205. Thereafter, all network data from the IP address of the newly coupled device 105 may be quarantined for a predetermined period of time. Those skilled in the art will appreciate that there may be many ways to detect a newly coupled device 105 upon connection with the switch 115 and/or the communication network 120.


The quarantine module 205 can redirect network data from the newly coupled device 105 in any number of ways including, but not limited to, Address Resolution Protocol (ARP) manipulation, DHCP services, DHCP manipulation, or configuring the switch 115. In ARP manipulation (also known as ARP spoofing), a newly coupled device 105 may send an ARP request to the IP address of another digital device (e.g., intended recipient device 125) for the digital device's media access control (MAC) address. The quarantine module 205 may receive the ARP request, store the ARP request, and provide the controller 110 MAC address in an ARP reply to the switch 115 and/or the newly coupled device 105. Once the switch 115 and/or the newly coupled device 105 receives the controller 110 MAC address in the ARP reply, the IP address of the digital device (e.g., intended recipient device 125) will be associated with the controller 110 MAC address (e.g., in memory storage or cache). Network data intended for the intended recipient device 125 may then be transmitted from the newly coupled device 105 to the controller 110.


In one example of ARP manipulation, a newly coupled device 105 may be infected with malware which becomes active upon coupling to a switch 115 and/or a communication network 120. The malware may send network data to any number of other digital devices. Before the attack can proceed, the newly coupled device 105 may send a separate ARP request for the IP address of every other digital device the malware wishes to send data to. The controller 110 detects and responds to each ARP request by sending an ARP reply to each request with the controller 110 MAC address. The controller 110 MAC address may be associated with the IP address of the other digital devices on a table within the newly coupled device 105, switch 115, and/or server (not depicted). The table may be within memory, storage, buffered, and/or cached. Network data is then transmitted from the newly coupled device 105 to the controller 110 for the predetermined time.
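For illustration only, the ARP reply described above can be sketched as a raw packet builder (field layout per RFC 826; the function and its argument names are assumptions, not part of the described embodiment):

```python
import struct

def build_arp_reply(controller_mac, requested_ip, target_mac, target_ip):
    """Build a raw Ethernet frame carrying an ARP reply that answers a
    request for `requested_ip` with the controller's MAC address, so the
    requester caches the controller MAC for that IP. MACs are 6-byte
    `bytes`; IPs are 4-byte `bytes` (e.g. bytes([10, 0, 0, 5]))."""
    eth_header = target_mac + controller_mac + b"\x08\x06"  # EtherType: ARP
    arp = struct.pack(
        "!HHBBH",
        1,       # hardware type: Ethernet
        0x0800,  # protocol type: IPv4
        6,       # hardware address length
        4,       # protocol address length
        2,       # opcode: reply
    )
    arp += controller_mac + requested_ip  # sender: controller claims the IP
    arp += target_mac + target_ip         # target: the quarantined device
    return eth_header + arp
```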


If malware or suspicious data within the network data is not detected by the controller 110, the network data may then be forwarded to the intended recipient device 125. In one example, the controller 110 scans and stores the IP address of each intended recipient device 125 of every ARP request. The controller 110 may then transmit an ARP request to receive each intended recipient device's MAC address. If the network data intended for an intended recipient device 125 does not contain suspicious data or a malware attack, the controller 110 may then send the network data to the intended recipient device 125. If the network data contains suspicious data or a malware attack, the network data may not be transmitted by the controller 110.


If, after a predetermined time, no suspicious activity or malware is detected within the network data, then the controller 110 may transmit new ARP responses to the newly coupled device 105, switch 115, and/or server. The new ARP responses can contain the correct MAC address for every ARP request originally sent by the newly coupled device 105. As a result, network data will no longer be directed to the controller 110.


The quarantine module 205 may manipulate dynamic host configuration protocol (DHCP) services to quarantine network data. When a newly coupled device 105 couples to the switch 115 and/or the communication network 120, it requests an IP address from a DHCP server. The quarantine module 205 may respond to the DHCP services request to configure the newly coupled device 105 to transmit network data to the controller 110. In one example, the quarantine module 205 may configure the newly coupled device 105 with a gateway IP address that is the same as the controller 110 IP address so that all network data is sent to the controller 110. In other embodiments, the quarantine module 205 may perform DHCP services for the communication network 120 as a DHCP server. If, after the predetermined time, no suspicious data or malware is detected, the newly coupled device 105 can be reconfigured so that network data is no longer transmitted to the controller 110.
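As an illustrative sketch, the gateway-redirect configuration might be expressed as the DHCP options a controller-side server would hand out (option codes per RFC 2132; the function name, short lease value, and data shapes are assumptions):

```python
import ipaddress

def quarantine_dhcp_options(controller_ip, subnet_mask="255.255.255.0"):
    """Return DHCP options (option code -> encoded bytes) for an
    offer/ack that installs the controller as the client's default
    gateway, so off-subnet traffic flows through the controller. A real
    DHCP server would embed these in full RFC 2131 message exchanges."""
    return {
        1: ipaddress.IPv4Address(subnet_mask).packed,    # subnet mask
        3: ipaddress.IPv4Address(controller_ip).packed,  # router = controller
        51: (3600).to_bytes(4, "big"),                   # short lease (1 hour)
    }
```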


The quarantine module 205 may send a request to the switch 115 to redirect network data from any newly coupled device 105 to the controller 110 for the predetermined time. In some embodiments, executable code is loaded onto the switch 115. In one example, the executable code configures the switch 115 to direct network data from any newly coupled device 105 to the controller 110 for the predetermined time. In another example, the executable code allows the quarantine module 205 to transmit a request to the switch 115 to direct network data from the newly coupled device 105 to the controller 110. The predetermined time may be set by the quarantine module 205, preloaded into the switch 115, or configured by a user.


The quarantine module 205 may monitor network data directly or receive a copy of the network data over a tap. In one example, the quarantine module 205 monitors and scans network data to detect the presence of a newly coupled device 105. When a newly coupled device 105 is added to the communication network 120 and/or the switch 115, the quarantine module 205 quarantines network data from the newly coupled device 105 for a predetermined time. In another example, a tap may scan network data for newly connected digital devices 105 and alert the quarantine module 205 when such a newly coupled device 105 is discovered. The quarantine module 205 may redirect all network data from the newly coupled device 105 to the controller 110 over a separate link (not depicted) to the switch 115 or the communication network 120. In some embodiments, there is no tap but rather a span port.


The heuristic module 210 can receive network data from the quarantine module 205. The heuristic module 210 applies heuristics and/or probability analysis to determine if the network data might contain suspicious activity. In one example, the heuristic module 210 applies a heuristic which identifies suspicious data within the network data. The heuristic module 210 then flags the network data as suspicious. The network data can then be buffered and organized into a data flow. The data flow can be provided to the scheduler 215. In some embodiments, the network data is provided directly to the scheduler 215 without buffering or organizing the data flow.


The heuristic module 210 can perform any heuristic and/or probability analysis. In one example, the heuristic module 210 performs a dark internet protocol (IP) heuristic. A dark IP heuristic can flag network data coming from the newly coupled device 105 that has not previously been identified by the heuristic module 210. The dark IP heuristic can also flag network data going to an unassigned IP address. In an example, an attacker (e.g., malware within a newly coupled device 105) scans random IP addresses of the communication network 120 to identify an active server or workstation. The dark IP heuristic can flag network data directed to an unassigned IP address.


The heuristic module 210 can also perform a dark port heuristic. A dark port heuristic can flag network data transmitted to an unassigned or unusual port address. Such network data transmitted to an unusual port can be indicative of a port scan by malware such as a worm or a hacker. Further, the heuristic module 210 can flag network data from the newly coupled device 105 that is significantly different than traditional data traffic generally transmitted by the newly coupled device 105. For example, the heuristic module 210 can flag network data from the newly coupled device 105 such as a laptop that begins to transmit network data that is common to a server.
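The dark IP and dark port heuristics described above can be sketched together (the tuple shape and set names are illustrative assumptions):

```python
def dark_heuristics(packets, assigned_ips, common_ports):
    """Flag packets directed to unassigned IP addresses (dark IP
    heuristic) or to unusual ports (dark port heuristic). `packets` is
    an iterable of (dst_ip, dst_port) tuples; the sets of assigned IPs
    and commonly used ports are assumed to be maintained elsewhere."""
    flagged = []
    for dst_ip, dst_port in packets:
        if dst_ip not in assigned_ips:
            flagged.append(((dst_ip, dst_port), "dark-ip"))
        elif dst_port not in common_ports:
            flagged.append(((dst_ip, dst_port), "dark-port"))
    return flagged
```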


The heuristic module 210 can retain data packets belonging to a particular data flow previously received (e.g., received from a tap) or data flow provided by the quarantine module 205. In one example, the heuristic module 210 receives data packets and stores the data packets within a buffer or other memory. Once the heuristic module 210 receives a predetermined number of data packets from a particular data flow, the heuristic module 210 performs the heuristics and/or probability analysis.


In some embodiments, the heuristic module 210 performs heuristic and/or probability analysis on a set of data packets belonging to a data flow and then stores the data packets within a buffer or other memory. The heuristic module 210 can then continue to receive new data packets belonging to the same data flow. Once a predetermined number of new data packets belonging to the same data flow are received, the heuristic and/or probability analysis can be performed upon the combination of buffered and new data packets to determine a likelihood of suspicious activity.
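The buffered-flow analysis described above might be sketched as follows (class, method, and callback names are hypothetical):

```python
from collections import defaultdict

class FlowBuffer:
    """Accumulate packets per data flow and rerun analysis each time
    another `threshold` packets arrive, combining the buffered and new
    packets as described above."""

    def __init__(self, threshold, analyze):
        self.threshold = threshold
        self.analyze = analyze          # callback: list of packets -> result
        self.flows = defaultdict(list)  # flow id -> buffered packets
        self.pending = defaultdict(int) # new packets since last analysis

    def add(self, flow_id, packet):
        """Buffer one packet; return an analysis result once enough new
        packets have accumulated for this flow, else None."""
        self.flows[flow_id].append(packet)
        self.pending[flow_id] += 1
        if self.pending[flow_id] >= self.threshold:
            self.pending[flow_id] = 0
            return self.analyze(self.flows[flow_id])  # buffered + new packets
        return None
```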


In some embodiments, an optional buffer receives the flagged network data from the heuristic module 210. The buffer can buffer and organize the flagged network data into one or more data flows before providing the one or more data flows to the scheduler 215. In various embodiments, the buffer can buffer network data and stall before providing the network data to the scheduler 215. In one example, the buffer stalls the network data to allow other components of the controller 110 time to complete functions or otherwise clear data congestion.


The scheduler 215 is a module configured to retrieve a virtual machine associated with the newly coupled device 105 or the intended recipient device 125 of the network data. A virtual machine 315 is software that is configured to mimic the performance of a device (e.g., the intended recipient device 125 of the network data). The virtual machine 315 can be retrieved from the virtual machine pool 225.


In some embodiments, the heuristic module 210 transmits the metadata identifying the intended recipient device 125 to the scheduler 215. In other embodiments, the scheduler 215 receives one or more data packets of the network data from the heuristic module 210 and analyzes the one or more data packets to identify the intended recipient device 125. In yet other embodiments, the metadata can be received from the tap.


The scheduler 215 can retrieve and configure the virtual machine to mimic the pertinent performance characteristics of the intended recipient device 125. In one example, the scheduler 215 configures the characteristics of the virtual machine to mimic only those features of the intended recipient device 125 that are affected by the network data copied by the tap. The scheduler 215 can determine the features of the intended recipient device 125 that are affected by the network data by receiving and analyzing the network data from the quarantine module 205. Such features of the intended recipient device 125 can include opening ports that are to receive the network data, selecting device drivers that are to respond to the network data, and configuring any other devices coupled to or contained within the intended recipient device 125 that can respond to the network data. In other embodiments, the heuristic module 210 can determine the features of the intended recipient device 125 that are affected by the network data by receiving and analyzing the network data from the tap. The heuristic module 210 can then transmit the features of the intended recipient device 125 to the scheduler 215.


The optional fingerprint module 220 is configured to determine the packet format of the network data to assist the scheduler 215 in the retrieval and/or configuration of the virtual machine. In one example, the fingerprint module 220 determines that the network data is based on a transmission control protocol/internet protocol (TCP/IP). Thereafter, the scheduler 215 will configure a virtual machine with the appropriate ports to receive TCP/IP packets. In another example, the fingerprint module 220 can configure a virtual machine with the appropriate ports to receive user datagram protocol/internet protocol (UDP/IP) packets. The fingerprint module 220 can determine any type of packet format of a network data.


In other embodiments, the optional fingerprint module 220 passively determines a software profile of the network data to assist the scheduler 215 in the retrieval and/or configuration of the virtual machine. The software profile may comprise the operating system (e.g., Linux RH6.2) of the newly coupled device 105 that generated the network data. The determination can be based on analysis of the protocol information of the network data. In an example, the optional fingerprint module 220 determines that the software profile of network data is Windows XP, SP1. The optional fingerprint module 220 can then configure a virtual machine with the appropriate ports and capabilities to receive the network data based on the software profile. In other examples, the optional fingerprint module 220 passes the software profile of the network data to the scheduler 215 which either selects or configures the virtual machine based on the profile.
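Passive software profiling of this kind is commonly derived from protocol header fields. As an illustrative sketch, the TTL/window table below is a tiny p0f-style sample, not the fingerprint module's actual logic:

```python
def guess_software_profile(ip_ttl, tcp_window):
    """Very rough passive OS fingerprint from the IP time-to-live and
    TCP window size observed in a device's traffic. The thresholds and
    window values here are illustrative defaults only."""
    if ip_ttl <= 64:
        # Linux and most Unix-like stacks default to TTL 64
        return "linux" if tcp_window in (5840, 29200) else "unix-like"
    if ip_ttl <= 128:
        return "windows"  # Windows stacks default to TTL 128
    return "unknown"
```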


The virtual machine pool 225 is configured to store virtual machines. The virtual machine pool 225 can be any storage capable of storing software. In one example, the virtual machine pool 225 stores a single virtual machine that can be configured by the scheduler 215 to mimic the performance of any intended recipient device 125 on the communication network 120. The virtual machine pool 225 can store any number of distinct virtual machines that can be configured to simulate the performance of any intended recipient device 125.


The analysis environment 230 is a module that simulates transmission of the network data between the newly coupled device 105 and the intended recipient device 125 to analyze the effects of the network data upon the intended recipient device 125. The analysis environment 230 can identify the effects of malware or illegitimate computer users (e.g., a hacker, computer cracker, or other computer user) by analyzing the simulation of the effects of the network data upon the intended recipient device 125 that is carried out on the virtual machine. There can be multiple analysis environments 230 to simulate multiple network data.


As the analysis environment 230 simulates the transmission of the network data, behavior of the virtual machine can be closely monitored for unauthorized activity. If the virtual machine crashes, performs illegal operations, performs abnormally, or allows access of data to an unauthorized computer user, the analysis environment 230 can react. In some embodiments, the analysis environment 230 performs dynamic taint analysis to identify unauthorized activity (dynamic taint analysis is further described with respect to FIG. 3).
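Dynamic taint analysis can be illustrated with a toy model (the miniature instruction format below is invented purely for illustration and is not the mechanism described in FIG. 3):

```python
def taint_propagate(instructions, tainted):
    """Toy dynamic taint analysis: values derived from network input are
    marked tainted, and an alert is raised if a tainted value reaches a
    sensitive sink (here, a control-flow transfer). `instructions` is a
    list of ('mov', dst, src) or ('jump_to', src) tuples."""
    tainted = set(tainted)
    alerts = []
    for ins in instructions:
        if ins[0] == "mov":
            _, dst, src = ins
            if src in tainted:
                tainted.add(dst)      # taint propagates through the copy
            else:
                tainted.discard(dst)  # overwrite with clean data clears taint
        elif ins[0] == "jump_to" and ins[1] in tainted:
            alerts.append(ins)        # control flow steered by tainted data
    return alerts
```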


Once unauthorized activity is detected, the analysis environment 230 can generate the unauthorized activity signature configured to identify network data containing unauthorized activity. Since the unauthorized activity signature does not necessarily require probabilistic analysis to detect unauthorized activity within network data, unauthorized activity detection based on the unauthorized activity signature may be very fast and save computing time.


In various embodiments, the unauthorized activity signature may provide code that may be used to eliminate or “patch” portions of network data containing an attack. Further, in some embodiments, the unauthorized activity signature may be used to identify and eliminate (i.e., delete) the malware causing the attack. The unauthorized activity signature may also be used to configure digital devices to eliminate vulnerabilities (e.g., correct system settings such as disabling ActiveX controls in a browser or updating an operating system).


The analysis environment 230 may store the unauthorized activity signature within the signature module 235. The analysis environment 230 may also transmit or command the transmission of the unauthorized activity signature to one or more other controllers 110, switches 115, digital devices 105, and/or servers. By automatically storing and transmitting the unauthorized activity signature, known malware, previously unidentified malware, and the activities of illicit computer users can be quickly controlled and reduced before a computer system is damaged or compromised. The analysis environment 230 is further discussed with respect to FIG. 3.


The signature module 235 receives, authenticates, and stores unauthorized activity signatures. The unauthorized activity signatures may be generated by the analysis environment 230 or another controller 110. The unauthorized activity signatures may then be transmitted to the signature module 235 of one or more controllers 110.


The policy engine 240 is coupled to the heuristic module 210 and is a module that may identify network data as suspicious based upon policies contained within the policy engine 240. In one example, a newly coupled device 105 can be a computer designed to attract hackers and/or worms (e.g., a “honey pot”). The policy engine 240 can contain a policy to flag any network data directed to the “honey pot” as suspicious since the “honey pot” should not be receiving any legitimate network data. In another example, the policy engine 240 can contain a policy to flag network data directed to any intended recipient device 125 that contains highly sensitive or “mission critical” information.


The policy engine 240 can also dynamically apply a rule to copy all network data related to network data already flagged by the heuristic module 210. In one example, the heuristic module 210 flags a single packet of network data as suspicious. The policy engine 240 then applies a rule to flag all data related to the single packet (e.g., data flows) as suspicious. In some embodiments, the policy engine 240 flags network data related to suspicious network data until the analysis environment 230 determines that the network data flagged as suspicious is related to unauthorized activity.
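The rule described above, by which one flagged packet taints its whole data flow, can be illustrated with a minimal sketch. The 5-tuple flow key and the function names are illustrative choices, not part of any described embodiment.

```python
# Illustrative sketch: once one packet of a flow is flagged as suspicious,
# every later packet sharing the same flow key is flagged too.

suspicious_flows = set()

def flow_key(pkt):
    """5-tuple identifying the data flow a packet belongs to."""
    return (pkt["src"], pkt["sport"], pkt["dst"], pkt["dport"], pkt["proto"])

def flag_packet(pkt):
    """Called when the heuristic module flags a single packet."""
    suspicious_flows.add(flow_key(pkt))

def is_suspicious(pkt):
    """Policy-engine check: related packets inherit the suspicion."""
    return flow_key(pkt) in suspicious_flows
```

A packet on a different port (and thus a different flow) remains unflagged, which matches the per-flow scope of the rule described above.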


The policy engine 240 may scan network data to detect unauthorized activity based upon an unauthorized activity signature. In some embodiments, the policy engine 240 retrieves the unauthorized activity signature from the signature module 235. The network data is then scanned for unauthorized activity based on the unauthorized activity signature.


The policy engine 240 can scan the header of a packet of network data as well as the packet contents for unauthorized activity. In some embodiments, the policy engine 240 scans only the header of the packet for unauthorized activity based on the unauthorized activity signature. If unauthorized activity is found, then no further scanning may be performed. In other embodiments, the policy engine 240 scans the packet contents for unauthorized activity.


Advantageously, unauthorized activity may be found by scanning only the header of a packet, the contents of the packet, or both the header and the contents of the packet. As a result, unauthorized activity that might otherwise evade discovery can be detected. In one example, evidence of unauthorized activity may be located within the contents of the packet. By scanning only the contents of the packet, unauthorized activity may be detected.
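A minimal sketch of the two-stage scan described above follows, assuming byte-string signatures. The signature format and the header-first ordering are illustrative; real unauthorized activity signatures may be far richer than substring matches.

```python
# Hypothetical sketch: scan the packet header first and fall through to the
# payload only when the header is clean, so a header hit skips payload work.

def scan_packet(header: bytes, payload: bytes, signatures):
    """Return (matched, where) for the first signature hit."""
    for sig in signatures:
        if sig in header:
            return True, "header"   # header hit: no further scanning needed
    for sig in signatures:
        if sig in payload:
            return True, "payload"
    return False, None
```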


If the packet contents or the packet header indicate that the network data contains unauthorized activity, then the policy engine 240, the heuristic module 210, or the signature module 235 may take action. In one example, the policy engine 240 may generate a rule or command the quarantine module 205 to permanently quarantine the newly coupled device 105 and delete or bar the packet from the communication network 120. The policy engine 240 and/or the quarantine module 205 may also quarantine, delete, or bar other packets belonging to the same data flow as the unauthorized activity packet.



FIG. 3 depicts an analysis environment 230, in accordance with one embodiment of the present invention. The analysis environment 230 comprises an optional replayer 305, a virtual switch 310, and a virtual machine 315. The replayer 305 receives network data that has been flagged by the heuristic module 210 and replays the network data in the analysis environment 230. In some embodiments, the replayer 305 mimics the behavior of the newly coupled device 105 in transmitting the flagged network data. There can be any number of replayers 305 simulating the transmission of network data between the newly coupled device 105 and the intended recipient device 125. In a further embodiment, the replayer 305 dynamically modifies session variables, as appropriate, to emulate a “live” client or server of the protocol sequence being replayed. Examples of variables that may be dynamically substituted include dynamically assigned ports, transaction IDs, and any other value that is specific to each protocol session. In other embodiments, the network data received from the heuristic module 210 is transmitted to the virtual machine 315 without a replayer 305.
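The session-variable substitution performed by the replayer can be sketched as follows. The placeholder syntax, counter, and message template are invented for illustration; an actual replayer rewrites fields inside binary protocol messages rather than text templates.

```python
# Hedged sketch: recorded protocol messages contain placeholders for values
# (ports, transaction IDs) that must be fresh for each replayed session.

import itertools

_txn_counter = itertools.count(1000)   # fresh transaction IDs per session

def replay_message(template: str, session: dict) -> str:
    """Fill session-specific variables into a recorded message template."""
    session.setdefault("txn_id", next(_txn_counter))
    return (template
            .replace("{PORT}", str(session["port"]))
            .replace("{TXN}", str(session["txn_id"])))
```

Each new session dictionary receives its own transaction ID, emulating a “live” peer of the replayed protocol sequence.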


The virtual switch 310 is software that is capable of forwarding packets of flagged network data to the virtual machine 315. In one example, the replayer 305 simulates the transmission of the data flow by the newly coupled device 105. The virtual switch 310 simulates the communication network 120 and the virtual machine 315 simulates the intended recipient device 125. The virtual switch 310 can route the data packets of the data flow to the correct ports of the virtual machine 315.


The virtual machine 315 is a representation of the intended recipient device 125 that can be provided to the analysis environment 230 by the scheduler 215. In one example, the scheduler 215 retrieves a virtual machine 315 from the virtual machine pool 225 and configures the virtual machine 315 to mimic the intended recipient device 125. The configured virtual machine 315 is then provided to the analysis environment 230 where it can receive flagged network data from the virtual switch 310.


As the analysis environment 230 simulates the transmission of the network data, behavior of the virtual machine 315 can be closely monitored for unauthorized activity. If the virtual machine 315 crashes, performs illegal operations, performs abnormally, or allows access of data to an unauthorized computer user, the analysis environment 230 can react.


In some embodiments, the analysis environment 230 performs dynamic taint analysis to identify unauthorized activity. For a malware attack to change the execution of an otherwise legitimate program, the malware attack may cause a value that is normally derived from a trusted source to be derived from the user's own input. Program values (e.g., jump addresses and format strings) are traditionally supplied by a trusted program and not from external untrusted inputs. Malware, however, may attempt to exploit the program by overwriting these values.


In one example of dynamic taint analysis, all input data from untrusted or otherwise unknown sources are flagged. Program execution of programs with flagged input data is then monitored to track how the flagged data propagates (i.e., what other data becomes tainted) and to check when the flagged data is used in dangerous ways. For example, use of tainted data as jump addresses or format strings often indicates an exploit of a vulnerability such as a buffer overrun or format string vulnerability.
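The bookkeeping of dynamic taint analysis described above can be illustrated with a toy model: inputs from untrusted sources are tagged, the tag propagates through operations, and an alarm is raised if tainted data reaches a sensitive sink such as a jump address. A real implementation tracks taint at the instruction or byte level; the class and function names here are illustrative only.

```python
# Toy model of dynamic taint analysis: tag untrusted inputs, propagate the
# tag through operations, and alarm when tainted data reaches a sink.

class Tainted:
    """A value paired with a taint flag."""
    def __init__(self, value, tainted):
        self.value, self.tainted = value, tainted

def taint_input(value):
    """Everything arriving from the network is flagged as tainted."""
    return Tainted(value, True)

def combine(a, b, op):
    """Any operation touching tainted data yields tainted data."""
    return Tainted(op(a.value, b.value), a.tainted or b.tainted)

def use_as_jump_address(t):
    """Sensitive sink: a tainted jump target indicates a likely exploit,
    e.g., a buffer overrun overwriting a return address."""
    if t.tainted:
        raise RuntimeError("unauthorized activity: tainted jump address")
    return t.value
```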


In some embodiments, the analysis environment 230 monitors and analyzes the behavior of the virtual machine 315 in order to determine a specific type of malware or the presence of an illicit computer user. The analysis environment 230 can also generate computer code configured to eliminate new viruses, worms, or other malware. In various embodiments, the analysis environment 230 can generate computer code configured to identify data within the network data indicative of a malware attack, or to repair damage caused by malware or an illicit computer user. By simulating the transmission of suspicious network data and analyzing the response of the virtual machine, the analysis environment 230 can identify known and previously unidentified malware and the activities of illicit computer users before a computer system is damaged or compromised.


In other embodiments, the controller 110 does not comprise a heuristic module 210 and the analysis environment 230 does not comprise a replayer 305. In one example, the controller 110 receives network data. The policy engine 240 can scan the network data to determine if the newly coupled device 105 is on a white list (further described herein). Further, the policy engine 240 can compare some or all of the network data to a signature (i.e., an unauthorized activity signature) to detect and/or identify a malware attack. The analysis environment 230 can receive the network data and orchestrate the transmission of the network data by transmitting the copy of the network data to a virtual machine 315. The analysis environment 230 can then monitor the reaction of the virtual machine 315 to the copy of the network data to identify a malware attack.



FIG. 4 is a flowchart of an exemplary method for malware containment upon connection of a newly coupled device 105. In step 400, the controller 110 detects a newly coupled device 105 upon connection with the switch 115 and/or the communication network 120. In one example, a user brings a laptop or other computer from home and then couples the laptop or other computer to the work communication network 120. The newly coupled device 105 (i.e., the laptop or other computer) requests network resources, and this request is detected by the controller 110.


In various embodiments, the controller 110 receives copies of network data from the switch 115 or the communication network 120 over a tap. The tap can transparently copy network data from the switch 115 and/or the communication network 120. The copy of network data is analyzed to determine if a newly coupled device 105 is requesting network services. In other embodiments, the controller 110 monitors network data directly to identify requests for network services.


In step 405, the quarantine module 205 temporarily redirects the network data from the newly coupled device 105. In one example, the controller 110 identifies a newly coupled device 105. The quarantine module 205 redirects network data transmitted from the newly coupled device 105 for a predetermined time.


In step 410, the controller 110 receives the network data from the newly coupled device 105. In various embodiments, the network data is received over a separate link without a tap between the controller 110 and the switch 115 or communication network 120. In one example, the controller 110 comprises an IP address. Network data directed to the controller 110 IP address (e.g., network data transmitted by the newly coupled device 105 to a gateway IP address that is the same as the controller 110 IP address) may be received by the controller 110 over the link.


In step 415, the controller 110 determines if a malware attack is within the network data. The heuristic module 210 can determine if the network data contains suspicious activity. In some embodiments, if the network data contains suspicious activity, then the heuristic module 210 directs the quarantine module 205 to take corrective action in step 420. In other embodiments, if the network data contains suspicious activity, the network data flagged as suspicious is directed to the analysis environment 230 for analysis to identify unauthorized activity. If unauthorized activity is identified, then the analysis environment 230 directs the quarantine module 205 to take corrective action in step 420.


In step 420, the quarantine module 205 takes corrective action. Corrective actions can include, but are not limited to, the permanent quarantine of network data from the newly coupled device 105. In various embodiments, the controller 110 continues to analyze network data from the newly coupled device 105 to further identify the malware or identify different types of malware on the newly coupled device 105. Notifications of malware may be sent to the newly coupled device 105, server, or security stations on the communication network 120. The signature module 235 may generate unauthorized activity signatures based on the identified malware attack.


If the heuristic module 210 does not flag the network data as suspicious and/or malware is not found by the analysis environment 230, then the quarantine module 205 determines if the predetermined time has expired. If the predetermined time has not expired, the controller 110 continues to receive network data from the digital device in step 410. If the predetermined time has expired, then the method ends. In various embodiments, if the heuristic module 210 does not flag the network data as suspicious and/or malware is not found by the analysis environment 230, the controller 110 (e.g., the quarantine module 205) forwards the network data to the intended recipient device 125.


It will be appreciated by those skilled in the art that the process depicted in FIG. 4 may simply continue to repeat upon the continuation of quarantine (step 420) or the expiration of the predetermined time (step 425). In one example, if the network data contains a malware attack (step 415), the redirection of the network data from the newly coupled device 105 can continue until reset by the IT administrator or until the malware attack is no longer detected. In the meantime, however, other newly coupled devices 105 can join the network, which may trigger the method of FIG. 4. The method of FIG. 4 can run in parallel (e.g., simultaneously) or in series for many different newly coupled devices 105. In another example, once the predetermined time expires (step 425), the method can continue to detect digital devices upon connection (step 400).
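The control flow of FIG. 4 can be condensed into a short sketch: redirect traffic from a newly coupled device for a predetermined window, extend the quarantine if malware is found, and release the device when the window expires cleanly. The packet source, detector, and clock are caller-supplied hypothetical interfaces.

```python
# Compact sketch of the FIG. 4 loop. recv_packet and is_malicious are
# hypothetical callables standing in for the controller's modules.

import time

def contain_on_connection(recv_packet, is_malicious, window=2.0,
                          clock=time.monotonic):
    """Return 'quarantined' if malware appears within the window,
    else 'released' once the predetermined time expires."""
    deadline = clock() + window
    while clock() < deadline:
        pkt = recv_packet()
        if pkt is None:            # no traffic to inspect right now
            continue
        if is_malicious(pkt):
            return "quarantined"   # step 420: corrective action
    return "released"              # predetermined time expired, no malware
```

The injectable `clock` parameter keeps the sketch deterministic under test; the real predetermined time would be wall-clock based.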



FIG. 5 is another flowchart of an exemplary method for malware containment upon connection of a newly coupled device 105. In step 500, the controller 110 detects the newly coupled device 105 upon connection with the switch 115 and/or the communication network 120. In step 505, the quarantine module 205 determines if the newly coupled device 105 is associated with a white list. A white list comprises a table that identifies various digital devices and policies. In one example, the newly coupled device 105 of a chief executive officer (CEO) or chief information officer (CIO) of a company may be identified within the white list. The policy associated with the newly coupled device 105 of the CEO or CIO may command the quarantine module 205 not to quarantine the newly coupled device 105. If the newly coupled device 105 is associated with the white list, the method may end. If the newly coupled device 105 is not associated with a white list, then the quarantine module 205 manipulates ARP to direct network data transmitted from the newly coupled device 105 to the controller 110 in step 510.


Although step 505 as described indicates that the method may end if the newly coupled device 105 is associated with one or more white lists, other actions may be taken. In one example, if a newly coupled device 105 is associated with a white list, the quarantine may last for a shorter or longer predetermined time. As such, the quarantine module 205 would then manipulate ARP to direct the network data from the newly coupled device 105 to the controller 110 in step 510.
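Steps 505 and 510 can be modeled with a small sketch: consult the white list, and if the device is not on it, answer its ARP requests with the controller's address so its traffic is redirected through the controller. The MAC addresses and white-list contents are purely illustrative; an actual implementation must craft and transmit real ARP frames.

```python
# Simplified model of steps 505-510: white-list check plus ARP redirection.

WHITE_LIST = {"00:aa:bb:cc:dd:01"}     # e.g., the CEO's laptop (illustrative)
CONTROLLER_MAC = "00:11:22:33:44:55"   # the controller's hardware address

def arp_reply_for(device_mac, gateway_mac):
    """Decide which MAC to return when the device ARPs for the gateway:
    whitelisted devices get the real gateway, all others get the
    controller, whose address then attracts their traffic for analysis."""
    if device_mac in WHITE_LIST:
        return gateway_mac          # no quarantine for whitelisted devices
    return CONTROLLER_MAC           # redirect: traffic now flows to controller
```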


In step 515, the controller 110 receives the network data from the newly coupled device 105. In step 520, the controller 110 determines if the network data contains a malware attack. In various embodiments, the network data is analyzed to determine whether the network data is suspicious. For example, a heuristic module, such as the heuristic module 210, can analyze the network data. The heuristic module 210 can base the determination on heuristic and/or probabilistic analyses. In various embodiments, the heuristic module 210 has a very low threshold to determine whether the network data is suspicious. For example, a single command within the network data directed to an unusual port of the destination device can cause the network data to be flagged as suspicious.


The heuristic module 210 can alternatively flag network data as suspicious based on policies such as the identity of the newly coupled device 105, the intended recipient device 125, or the activity contained within the network data. In one example, even if the heuristic module 210 does not otherwise flag the network data, the network data can be flagged as suspicious based on a policy if the network data was transmitted from a device that does not normally transmit network data. Similarly, based on another policy, if the intended recipient device 125 or the newly coupled device 105 contains trade secrets or other critical data, then any network data so transmitted can be flagged as suspicious. Similarly, if the network data is directed to a particularly important database or is attempting to gain rights or privileges within the communication network 120, the switch 115, or the intended recipient device 125, then the network data can be flagged as suspicious. In various embodiments, the policy engine 240 flags network data based on these and/or other policies.


In some embodiments, if the network data is flagged as suspicious, the quarantine module 205 may continue the quarantine of the network data indefinitely in step 525. In other embodiments, if the network data is flagged as suspicious, the network data may be provided to the analysis environment 230 to analyze the response of a virtual machine to the network data to identify malware attacks or other unauthorized activity. If a malware attack or other unauthorized activity is identified, then the quarantine of the network data from the newly coupled device 105 may continue indefinitely in step 525. If the network data is not flagged as suspicious or there is no evidence of a malware attack or other unauthorized activity in the network data previously flagged as suspicious, then the quarantine module 205 determines if the predetermined time has expired in step 530. If the predetermined time has not expired, then the controller 110 continues to receive the network data in step 515. If the predetermined time has expired, the method ends.



FIG. 6 is a flowchart of an exemplary method of analyzing network data to generate an unauthorized activity signature. In step 600, the scheduler 215 scans the network data previously flagged as suspicious by the heuristic module 210 to determine the intended recipient device 125. In one example, the scheduler 215 scans the destination IP address contained within the network data to identify the intended recipient device 125. The scheduler 215 may then retrieve a virtual machine 315 from the virtual machine pool 225. The virtual machine 315 may be previously configured to be similar to the intended recipient device 125. If there is no virtual machine 315 previously configured to be similar to the intended recipient device 125, then a generic virtual machine 315 may be retrieved by the scheduler 215. The scheduler 215 may configure the virtual machine 315 retrieved from the virtual machine pool 225 to receive the network data (e.g., open ports, reduce or eliminate security settings, etc.). The scheduler 215 may then provide the virtual machine 315 to the analysis environment 230.


In step 610, the analysis environment 230 analyzes the response of the virtual machine 315 to the network data to identify a malware attack. In one example, an optional replayer 305 is configured to perform similarly to the newly coupled device 105 and transmit the network data over a virtual switch 310 to the virtual machine 315. In various embodiments, there may be any number of replayers 305 configured to transmit network data to different virtual machines 315 in parallel. Similarly, multiple analysis environments 230 may operate in parallel. The analysis environment 230 analyzes the response of the virtual machine 315 to the network data (e.g., with taint analysis).


If the network data does not contain unauthorized activity, then the method may end. If the network data contains unauthorized activity, then an unauthorized activity signature is generated based on the unauthorized activity in step 615. The unauthorized activity signature may be generated by the analysis environment 230 or the signature module 235.


In step 620, the unauthorized activity signature is transmitted to one or more other controllers 110 or any digital device (e.g., server, newly coupled device 105, switch 115). The receiving controller 110 can store the unauthorized activity signature within the receiving controller's signature module 235 or policy engine 240. The policy engine 240 may use the unauthorized activity signature to scan network data received by the controller 110 to flag the network data as suspicious or containing unauthorized activity without any further analysis (by either the heuristic module 210 or the analysis environment 230).


Optionally, the unauthorized activity signature may be authenticated. In some embodiments, the analysis environment 230 can generate an authentication code along with the unauthorized activity signature. The authentication code can then be scanned to determine that the unauthorized activity signature is verified. In one example, the analysis environment 230 generates the unauthorized activity signature and an authentication code. The analysis environment 230 transmits the unauthorized activity signature and the authentication code to another controller 110. The controller 110 verifies the authentication code to ensure that the unauthorized activity signature is genuine. If the unauthorized activity signature is authenticated, then the signature module 235 stores the unauthorized activity signature.


The unauthorized activity signature can also be encrypted. In one example, the controller 110 generates, encrypts, and transmits the unauthorized activity signature to another controller 110. The receiving controller 110 can decrypt the unauthorized activity signature and store the unauthorized activity signature within the signature module 235. In some embodiments, the controller 110 generates an authentication code and proceeds to encrypt the authentication code and the unauthorized activity signature prior to transmitting the authentication code and the unauthorized activity signature to another controller 110.
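The authentication-code exchange described above can be sketched using an HMAC as the code. The shared key, its out-of-band provisioning, and the choice of HMAC-SHA256 are assumptions; the specification does not prescribe a particular authentication scheme.

```python
# Hedged sketch of signature authentication between controllers using an
# HMAC as the "authentication code". Key management is assumed out of band.

import hashlib
import hmac

SHARED_KEY = b"controller-shared-secret"   # illustrative pre-shared key

def sign(activity_signature: bytes) -> bytes:
    """Generate the authentication code sent alongside the signature."""
    return hmac.new(SHARED_KEY, activity_signature, hashlib.sha256).digest()

def verify(activity_signature: bytes, auth_code: bytes) -> bool:
    """Receiving controller: store the signature only if the code checks
    out. compare_digest avoids a timing side channel on the comparison."""
    return hmac.compare_digest(sign(activity_signature), auth_code)
```

Encrypting the signature and code for transit, as described above, would be layered on top of this, e.g., with an authenticated cipher between controllers.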



FIG. 7 is a block diagram of the controller 110 (FIG. 1), in accordance with one embodiment of the present invention. The controller 110 comprises a processor 700, a memory system 705, a storage system 710, an I/O interface 715, a communication network interface 720, and a display interface 725 which are all coupled to a system bus 730. The processor 700 is configured to execute executable instructions. In some embodiments, the processor 700 comprises circuitry or any one or more processors capable of processing the executable instructions.


The memory system 705 is any memory configured to store data. Some examples of the memory system 705 include memory devices, such as RAM or ROM.


The storage system 710 is any storage configured to retrieve and store data. Some examples of the storage system 710 are flash drives, hard drives, optical drives, and/or magnetic tape. The storage system 710 can comprise a database or other data structure configured to hold and organize data (e.g., network data, copies of network data, buffered data). In some embodiments, the controller 110 includes the memory system 705 in the form of RAM and the storage system 710 in the form of flash memory. The memory system 705 and/or the storage system 710 can comprise cache and buffers configured to retain network data or copies of network data.


The input/output (I/O) interface 715 is any device that can receive input and provide output to a user. The I/O interface 715 can be, but is not limited to, a keyboard, a mouse, a touchscreen, a keypad, a biosensor, or a floppy disk drive.


The communication network interface 720 can be coupled to any user device via the links 735. The communication network interface 720 may support communication over a USB connection, a FireWire connection, an Ethernet connection, a serial connection, a parallel connection, or an ATA connection. The communication network interface 720 may also support wireless communication (e.g., 802.11a/b/g/n or wireless USB). It will be apparent to those skilled in the art that the communication network interface 720 can support many wired and wireless standards.


The display interface 725 is an interface configured to support a display, monitor, or screen. In some embodiments, the controller 110 comprises a graphical user interface to be displayed to a user over a monitor in order to allow the user to control the controller 110.


The above-described modules can be comprised of instructions that are stored on storage media. The instructions can be retrieved and executed by a processor (e.g., the processor 700). Some examples of instructions include software, program code, and firmware. Some examples of storage media comprise memory devices and integrated circuits. The instructions are operational when executed by the processor to direct the processor to operate in accordance with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage media.


The present invention is described above with reference to exemplary embodiments. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the present invention. Therefore, these and other variations upon the exemplary embodiments are intended to be covered by the present invention.

Claims
  • 1. A malware containment method comprising: redirecting network data received over a communication network to a virtual machine, the virtual machine being configured to simulate functionality of a digital device;analyzing of the redirected network data including analyzing a response of the virtual machine to processing of the network data within the virtual machine to identify a malware attack;continuing to redirect the network data for processing by the virtual machine until expiration of a predetermined period of time and a lack of detection of malware within the redirected network data prior to expiration of the predetermined period of time; andcontinuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time in response to detection of malware within the redirected network data.
  • 2. The method of claim 1, wherein the analyzing of the redirected network data further comprises determining if a digital device transmitting the network data over the communication network is associated with a white list, and halting the redirecting of the network data if the transmitting digital device is determined to be associated with the white list.
  • 3. The method of claim 1, wherein the continuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time comprises continuing to redirect the network data until reset by an administrator.
  • 4. The method of claim 1, wherein the continuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time comprises continuing to redirect the network data until the malware attack is no longer detected.
  • 5. The method of claim 1, further comprising generating an unauthorized activity signature based on a detection of the malware.
  • 6. The method of claim 5, further comprising: storing the unauthorized activity signature; andsending the unauthorized activity signature to another digital device.
  • 7. The method of claim 5, wherein the unauthorized activity signature is used to subsequently identify the malware and mitigate vulnerabilities including changing a setting in a browser application or an operating system.
  • 8. A system comprising: a processor; anda memory communicatively coupled to the processor, the memory comprises software instructions that, when executed by the processor, redirects a first portion of network data to a virtual machine, and analyzes the redirected first portion of network data by at least analyzing a response of the virtual machine processing the first portion of network data to identify suspicious activity,wherein a second portion of network data subsequently received after the first portion of network data continues to be redirected to the virtual machine for processing by the virtual machine and a lack of detection of suspicious activity during the processing of the first portion of network data by the virtual machine and expiration of a predetermined period of time; andthe second portion of the network data continues to be redirected to the virtual machine for processing beyond the predetermined period of time in response to detection of suspicious activity during processing of the second portion by the virtual machine.
  • 9. The system of claim 8, wherein the first portion of network data and the second portion of network data are part of a data flow.
  • 10. The system of claim 9, wherein the memory further comprises a policy engine that, when executed by the processor, analyzes the redirected first portion of network data by at least determining if a digital device that transmitted the first portion of the network data is associated with a white list.
  • 11. The system of claim 8, wherein the memory further comprises a policy engine that, when executed by the processor, analyzes the redirected first portion of network data by at least comparing some or all of the redirected first portion of network data with an unauthorized activity signature to detect malware.
  • 12. The system of claim 8, wherein the memory further comprises a scheduler that, when executed by the processor, configures the virtual machine based on metadata associated with the redirected first portion of network data.
  • 13. The system of claim 8, wherein the memory further comprising a signature module that, when executed by the processor, generates or stores an unauthorized activity signature based on a detection of malware that is causing the suspicious activity to occur.
  • 14. A malware containment method comprising: redirecting network data received over a communication network to a virtual machine;analyzing of the redirected network data including analyzing a response of the virtual machine to processing of the network data within the virtual machine to identify a malware attack;continuing to redirect the network data for processing by the virtual machine until expiration of a predetermined period of time and a lack of detection of malware within the redirected network data prior to expiration of the predetermined period of time; andcontinuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time in response to detection of malware within the redirected network data prior to expiration of the predetermined period of time.
  • 15. The method of claim 14, wherein the analyzing of the redirected network data further comprises determining if a digital device that transmitted the network data over the communication network is associated with a white list, and halting the redirecting of the network data if the digital device is determined to be associated with the white list.
  • 16. The method of claim 14, wherein the continuing to redirect the network data for processing by the virtual machine, which is configured to simulate functionality of a digital device, beyond the predetermined period of time comprises continuing to redirect the network data until reset by an administrator.
  • 17. The method of claim 14, wherein the continuing to redirect the network data for processing by the virtual machine beyond the predetermined period of time comprises continuing to redirect the network data until the malware attack is no longer detected.
  • 18. The method of claim 14, further comprising generating an unauthorized activity signature based on a detection of the malware, wherein the unauthorized activity signature is used to subsequently identify the malware and mitigate vulnerabilities, including changing a setting in a browser application or an operating system.
  • 19. The method of claim 18, further comprising: storing the unauthorized activity signature; and sending the unauthorized activity signature to another digital device.
  • 20. The method of claim 2, wherein the redirecting of the network data comprises returning an Address Resolution Protocol (ARP) reply to the transmitting digital device in response to receipt of an ARP request message, wherein the ARP reply includes an address of a controller different than the digital device.
  • 21. The method of claim 20, wherein the controller is a digital device or software configured to receive and analyze the network data for a presence of malware.
  • 22. The method of claim 2, wherein, after expiration of the predetermined period of time and the lack of detection of malware within the redirected network data, the controller transmits an ARP response, including an address of the digital device, to the transmitting digital device in response to receipt of each ARP request message sent from the transmitting digital device, to cease the redirecting of the network data.
  • 23. The method of claim 2, wherein the redirecting of the network data comprises returning an Internet Protocol (IP) address of a controller, different than the digital device, in response to receipt of a Dynamic Host Configuration Protocol (DHCP) services request from the transmitting digital device.
  • 24. The method of claim 1, further comprising: quarantining the network data from the digital device when a controller, analyzing the redirected network data, continues to detect malware.
  • 25. The method of claim 1 further comprising: continuing to analyze, by a controller, the redirected network data to further identify the malware or identify different types of malware on the digital device beyond the predetermined period of time in response to the detection of malware within the redirected network data.
  • 26. The system of claim 8, wherein the memory further comprises software instructions that, when executed by the processor, redirect the first portion of network data to the virtual machine by at least returning an address of a controller, which is different than a digital device intended to receive the network data, to the digital device that transmitted the first portion of the network data.
  • 27. The system of claim 26, wherein the address of the controller is contained in a response to an Address Resolution Protocol (ARP) request or a Dynamic Host Configuration Protocol (DHCP) services request.
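The time-bounded redirection policy recited in claim 14 can be summarized in code. The following is an illustrative sketch only (not part of the claims, and all names are chosen for illustration): network data is redirected to the analysis virtual machine for a predetermined period, redirection ceases if the period expires without a malware detection, and continues beyond the period once malware has been detected.

```python
import time

class RedirectionController:
    """Illustrative sketch of the time-bounded redirection policy of
    claim 14. Not part of the claims; names are for illustration only."""

    def __init__(self, analysis_window_seconds):
        # Predetermined period of time during which network data is
        # redirected to the virtual machine for analysis.
        self.analysis_window = analysis_window_seconds
        self.start = time.monotonic()
        self.malware_detected = False

    def record_detection(self):
        # Called when analysis of the virtual machine's response to
        # processing the redirected network data identifies a malware attack.
        self.malware_detected = True

    def should_redirect(self):
        elapsed = time.monotonic() - self.start
        if self.malware_detected:
            # Continue redirecting beyond the predetermined period.
            return True
        # Otherwise redirect only until the predetermined period expires.
        return elapsed < self.analysis_window
```

A caller would consult `should_redirect()` for each portion of the data flow, so that a clean analysis window releases the traffic while a detection keeps it contained.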
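Claims 20-22 describe redirection by answering an ARP request for another device's IP address with the controller's own hardware address. The following sketch (not part of the claims; parameter names are illustrative) builds such an ARP reply payload by hand from the fields defined in RFC 826:

```python
import struct

def build_arp_reply(controller_mac: bytes, spoofed_ip: bytes,
                    requester_mac: bytes, requester_ip: bytes) -> bytes:
    """Illustrative sketch of the ARP-based redirection of claims 20-22:
    the controller answers an ARP request for another device's IP address
    with its own hardware address, so the requester sends subsequent
    network data to the controller instead."""
    htype = 1         # hardware type: Ethernet
    ptype = 0x0800    # protocol type: IPv4
    hlen, plen = 6, 4 # MAC and IPv4 address lengths
    oper = 2          # operation: ARP reply
    return (struct.pack('!HHBBH', htype, ptype, hlen, plen, oper)
            + controller_mac + spoofed_ip      # sender: controller's MAC, target IP
            + requester_mac + requester_ip)    # target: the requesting device
```

To cease redirection after a clean analysis window (claim 22), the controller would instead answer each subsequent ARP request with the true device's hardware address.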
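Claims 13 and 18 recite a signature module that generates an unauthorized activity signature from a malware detection. One simple, purely illustrative realization (not part of the claims; a real module could derive the signature from any distinguishing feature of the attack) is a digest of the offending payload:

```python
import hashlib

def make_unauthorized_activity_signature(payload: bytes) -> str:
    """Illustrative sketch for claims 13 and 18: derive an unauthorized
    activity signature from network data associated with a malware
    detection. Here the signature is simply a SHA-256 digest of the
    payload; it can be stored and sent to other digital devices to
    subsequently identify the same malware."""
    return hashlib.sha256(payload).hexdigest()
```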
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/059,381, filed Oct. 21, 2013, and is a continuation of U.S. patent application Ser. No. 11/717,475, filed Mar. 12, 2007, and is a continuation-in-part of U.S. patent application Ser. No. 11/494,990, filed Jul. 28, 2006, entitled “Dynamic Signature Creation and Enforcement”, which is a continuation-in-part of U.S. patent application Ser. No. 11/471,072, filed Jun. 19, 2006, entitled “Virtual Machine with Dynamic Data Flow Analysis”, which is a continuation-in-part of U.S. patent application Ser. No. 11/409,355, filed Apr. 20, 2006, entitled “Heuristic Based Capture with Replay to Virtual Machine”, which claims the benefit of U.S. patent application Ser. No. 11/096,287, filed Mar. 31, 2005, entitled “System and Method of Detecting Computer Worms,” U.S. patent application Ser. No. 11/151,812, filed Jun. 13, 2005, entitled “System and Method of Containing Computer Worms,” and U.S. patent application Ser. No. 11/152,286, filed Jun. 13, 2005, entitled “Computer Worm Defense System and Method,” all of which are incorporated by reference herein. U.S. patent application Ser. No. 11/096,287 claims the benefit of provisional patent application No. 60/559,198, filed Apr. 1, 2004, entitled “System and Method of Detecting Computer Worms.” U.S. patent application Ser. No. 11/151,812 claims the benefit of provisional patent application No. 60/579,953, filed Jun. 14, 2004, entitled “System and Method of Containing Computer Worms.” U.S. patent application Ser. No. 11/152,286 claims the benefit of provisional patent application No. 60/579,910, filed Jun. 14, 2004, entitled “Computer Worm Defense System and Method.” The above-referenced provisional patent applications are also incorporated by reference herein.

US Referenced Citations (542)
Number Name Date Kind
4292580 Ott et al. Sep 1981 A
5175732 Hendel et al. Dec 1992 A
5440723 Arnold et al. Aug 1995 A
5490249 Miller Feb 1996 A
5657473 Killean et al. Aug 1997 A
5842002 Schnurer Nov 1998 A
5978917 Chi Nov 1999 A
6088803 Tso et al. Jul 2000 A
6094677 Capek Jul 2000 A
6108799 Boulay et al. Aug 2000 A
6269330 Cidon et al. Jul 2001 B1
6272641 Ji Aug 2001 B1
6279113 Vaidya Aug 2001 B1
6298445 Shostack et al. Oct 2001 B1
6357008 Nachenberg Mar 2002 B1
6363489 Comay Mar 2002 B1
6424627 Sørhaug et al. Jul 2002 B1
6442696 Wray et al. Aug 2002 B1
6484315 Ziese Nov 2002 B1
6487666 Shanklin et al. Nov 2002 B1
6493756 O'Brien et al. Dec 2002 B1
6550012 Villa et al. Apr 2003 B1
6775657 Baker Aug 2004 B1
6831893 Ben Nun et al. Dec 2004 B1
6832367 Choi et al. Dec 2004 B1
6854063 Qu Feb 2005 B1
6895550 Kanchirayappa et al. May 2005 B2
6898632 Gordy et al. May 2005 B2
6907396 Muttik et al. Jun 2005 B1
6941348 Petry et al. Sep 2005 B2
6971097 Wallman Nov 2005 B1
6981279 Arnold Dec 2005 B1
7007107 Ivchenko et al. Feb 2006 B1
7028179 Anderson et al. Apr 2006 B2
7043757 Hoefelmeyer et al. May 2006 B2
7069316 Gryaznov Jun 2006 B1
7080407 Zhao et al. Jul 2006 B1
7080408 Pak Jul 2006 B1
7093002 Wolff et al. Aug 2006 B2
7093239 van der Made Aug 2006 B1
7096498 Judge Aug 2006 B2
7100201 Izatt Aug 2006 B2
7107617 Hursey et al. Sep 2006 B2
7159149 Spiegel et al. Jan 2007 B2
7213260 Judge May 2007 B2
7231667 Jordan Jun 2007 B2
7240364 Branscomb et al. Jul 2007 B1
7240368 Roesch et al. Jul 2007 B1
7243371 Kasper et al. Jul 2007 B1
7249175 Donaldson Jul 2007 B1
7287278 Liang Oct 2007 B2
7308716 Danford et al. Dec 2007 B2
7328453 Merkle, Jr. et al. Feb 2008 B2
7346486 Ivancic et al. Mar 2008 B2
7356736 Natvig Apr 2008 B2
7386888 Liang et al. Jun 2008 B2
7392542 Bucher Jun 2008 B2
7418729 Szor Aug 2008 B2
7428300 Drew et al. Sep 2008 B1
7441272 Durham et al. Oct 2008 B2
7448084 Apap et al. Nov 2008 B1
7458098 Judge et al. Nov 2008 B2
7464404 Carpenter et al. Dec 2008 B2
7464407 Nakae et al. Dec 2008 B2
7467408 O'Toole, Jr. Dec 2008 B1
7478422 Valente Jan 2009 B2
7478428 Thomlinson Jan 2009 B1
7480773 Reed Jan 2009 B1
7487543 Arnold et al. Feb 2009 B2
7496960 Chen et al. Feb 2009 B1
7496961 Zimmer et al. Feb 2009 B2
7519990 Xie Apr 2009 B1
7523493 Liang et al. Apr 2009 B2
7530104 Thrower et al. May 2009 B1
7540025 Tzadikario May 2009 B2
7546638 Anderson et al. Jun 2009 B2
7565550 Liang et al. Jul 2009 B2
7568233 Szor et al. Jul 2009 B1
7584455 Ball Sep 2009 B2
7594009 Triulzi Sep 2009 B2
7603715 Costa et al. Oct 2009 B2
7607171 Marsden et al. Oct 2009 B1
7639714 Stolfo et al. Dec 2009 B2
7644441 Schmid et al. Jan 2010 B2
7657419 van der Made Feb 2010 B2
7676841 Sobchuk et al. Mar 2010 B2
7698548 Shelest et al. Apr 2010 B2
7707633 Danford et al. Apr 2010 B2
7711714 Takaragi May 2010 B2
7712136 Sprosts et al. May 2010 B2
7730011 Deninger et al. Jun 2010 B1
7739740 Nachenberg et al. Jun 2010 B1
7779463 Stolfo et al. Aug 2010 B2
7784097 Stolfo et al. Aug 2010 B1
7797752 Vaidya Sep 2010 B1
7832008 Kraemer Nov 2010 B1
7836502 Zhao et al. Nov 2010 B1
7849506 Dansey et al. Dec 2010 B1
7854007 Sprosts et al. Dec 2010 B2
7869073 Oshima Jan 2011 B2
7877803 Enstone et al. Jan 2011 B2
7904959 Sidiroglou et al. Mar 2011 B2
7908660 Bahl Mar 2011 B2
7930738 Petersen Apr 2011 B1
7937761 Bennett May 2011 B1
7949849 Lowe et al. May 2011 B2
7996556 Raghavan et al. Aug 2011 B2
7996836 McCorkendale et al. Aug 2011 B1
7996904 Chiueh et al. Aug 2011 B1
7996905 Arnold et al. Aug 2011 B2
8006305 Aziz Aug 2011 B2
8010667 Zhang et al. Aug 2011 B2
8020206 Hubbard et al. Sep 2011 B2
8028338 Schneider et al. Sep 2011 B1
8042184 Batenin Oct 2011 B1
8045094 Teragawa Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8069484 McMillan et al. Nov 2011 B2
8087086 Lai et al. Dec 2011 B1
8171553 Aziz May 2012 B2
8176049 Deninger et al. May 2012 B2
8176480 Spertus May 2012 B1
8201072 Matulic Jun 2012 B2
8201246 Wu et al. Jun 2012 B1
8204984 Aziz et al. Jun 2012 B1
8214905 Doukhvalov et al. Jul 2012 B1
8220055 Kennedy Jul 2012 B1
8225288 Miller et al. Jul 2012 B2
8225373 Kraemer Jul 2012 B2
8233882 Rogel Jul 2012 B2
8234640 Fitzgerald et al. Jul 2012 B1
8234709 Viljoen et al. Jul 2012 B2
8239944 Nachenberg et al. Aug 2012 B1
8260914 Ranjan Sep 2012 B1
8266091 Gubin et al. Sep 2012 B1
8286251 Eker et al. Oct 2012 B2
8291198 Mott et al. Oct 2012 B2
8291499 Aziz Oct 2012 B2
8307435 Mann et al. Nov 2012 B1
8307443 Wang et al. Nov 2012 B2
8312545 Tuvell et al. Nov 2012 B2
8321240 Lorsch Nov 2012 B2
8321936 Green et al. Nov 2012 B1
8321941 Tuvell et al. Nov 2012 B2
8332571 Edwards, Sr. Dec 2012 B1
8365286 Poston Jan 2013 B2
8365297 Parshin et al. Jan 2013 B1
8370938 Daswani et al. Feb 2013 B1
8370939 Zaitsev et al. Feb 2013 B2
8375444 Aziz et al. Feb 2013 B2
8381299 Stolfo et al. Feb 2013 B2
8402529 Green et al. Mar 2013 B1
8464340 Ahn et al. Jun 2013 B2
8479174 Chiriac Jul 2013 B2
8479276 Vaystikh et al. Jul 2013 B1
8479291 Bodke Jul 2013 B1
8510827 Leake et al. Aug 2013 B1
8510828 Guo et al. Aug 2013 B1
8510842 Amit et al. Aug 2013 B2
8516478 Edwards et al. Aug 2013 B1
8516590 Ranadive et al. Aug 2013 B1
8516593 Aziz Aug 2013 B2
8522348 Chen et al. Aug 2013 B2
8528086 Aziz Sep 2013 B1
8533824 Hutton et al. Sep 2013 B2
8539582 Aziz et al. Sep 2013 B1
8549638 Aziz Oct 2013 B2
8555391 Demir et al. Oct 2013 B1
8561177 Aziz et al. Oct 2013 B1
8566946 Aziz et al. Oct 2013 B1
8584094 Dadhia et al. Nov 2013 B2
8584234 Sobel et al. Nov 2013 B1
8584239 Aziz et al. Nov 2013 B2
8595834 Xie et al. Nov 2013 B2
8627476 Satish et al. Jan 2014 B1
8635696 Aziz Jan 2014 B1
8682054 Xue et al. Mar 2014 B2
8682812 Ranjan Mar 2014 B1
8689333 Aziz Apr 2014 B2
8695096 Zhang Apr 2014 B1
8713631 Pavlyushchik Apr 2014 B1
8713681 Silberman et al. Apr 2014 B2
8726392 McCorkendale et al. May 2014 B1
8739280 Chess et al. May 2014 B2
8776229 Aziz Jul 2014 B1
8782792 Bodke Jul 2014 B1
8789172 Stolfo et al. Jul 2014 B2
8789178 Kejriwal et al. Jul 2014 B2
8793787 Ismael Jul 2014 B2
8805947 Kuzkin et al. Aug 2014 B1
8806647 Daswani et al. Aug 2014 B1
8832829 Manni et al. Sep 2014 B2
8850570 Ramzan Sep 2014 B1
8850571 Staniford et al. Sep 2014 B2
8881234 Narasimhan et al. Nov 2014 B2
8881282 Aziz et al. Nov 2014 B1
8898788 Aziz et al. Nov 2014 B1
8935779 Manni et al. Jan 2015 B2
8984638 Aziz et al. Mar 2015 B1
8990939 Staniford et al. Mar 2015 B2
8990944 Singh et al. Mar 2015 B1
8997219 Staniford et al. Mar 2015 B2
9009822 Ismael et al. Apr 2015 B1
9009823 Ismael et al. Apr 2015 B1
9027135 Aziz May 2015 B1
9071638 Aziz et al. Jun 2015 B1
9104867 Thioux et al. Aug 2015 B1
9106694 Aziz Aug 2015 B2
9118715 Staniford et al. Aug 2015 B2
9159035 Ismael et al. Oct 2015 B1
9165136 VanLund Oct 2015 B1
9171160 Vincent et al. Oct 2015 B2
9176843 Ismael et al. Nov 2015 B1
9189627 Islam Nov 2015 B1
9195829 Goradia et al. Nov 2015 B1
9197664 Aziz et al. Nov 2015 B1
9223972 Vincent et al. Dec 2015 B1
9225740 Ismael et al. Dec 2015 B1
9241010 Bennett et al. Jan 2016 B1
9251343 Vincent Feb 2016 B1
9262635 Paithane et al. Feb 2016 B2
9282109 Aziz et al. Mar 2016 B1
9294501 Mesdaq et al. Mar 2016 B2
9300686 Pidathala et al. Mar 2016 B2
9306960 Aziz Apr 2016 B1
9306974 Aziz et al. Apr 2016 B1
9311479 Manni et al. Apr 2016 B1
20010005889 Albrecht Jun 2001 A1
20010047326 Broadbent et al. Nov 2001 A1
20020018903 Kokubo et al. Feb 2002 A1
20020038430 Edwards et al. Mar 2002 A1
20020091819 Melchione et al. Jul 2002 A1
20020095607 Lin-Hendel Jul 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020129264 Rowland Sep 2002 A1
20020133586 Shanklin Sep 2002 A1
20020144156 Copeland Oct 2002 A1
20020162015 Tang Oct 2002 A1
20020166063 Lachman et al. Nov 2002 A1
20020169952 DiSanto et al. Nov 2002 A1
20020184528 Shevenell Dec 2002 A1
20020188887 Largman et al. Dec 2002 A1
20020194490 Halperin et al. Dec 2002 A1
20030021728 Sharpe et al. Jan 2003 A1
20030074578 Ford et al. Apr 2003 A1
20030084318 Schertz May 2003 A1
20030101381 Mateev et al. May 2003 A1
20030115483 Liang Jun 2003 A1
20030145232 Poletto Jul 2003 A1
20030177476 Sarma Sep 2003 A1
20030188190 Aaron et al. Oct 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030200460 Morota et al. Oct 2003 A1
20030212902 van der Made Nov 2003 A1
20030229801 Kouznetsov et al. Dec 2003 A1
20030237000 Denton et al. Dec 2003 A1
20040003323 Bennett Jan 2004 A1
20040006473 Mills et al. Jan 2004 A1
20040015712 Szor Jan 2004 A1
20040019832 Arnold et al. Jan 2004 A1
20040047356 Bauer Mar 2004 A1
20040083408 Spiegel et al. Apr 2004 A1
20040088581 Brawn et al. May 2004 A1
20040093513 Cantrell et al. May 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040117478 Triulzi et al. Jun 2004 A1
20040117624 Brandt et al. Jun 2004 A1
20040128355 Chao et al. Jul 2004 A1
20040165588 Pandya Aug 2004 A1
20040236963 Danford et al. Nov 2004 A1
20040243349 Greifeneder et al. Dec 2004 A1
20040249911 Alkhatib et al. Dec 2004 A1
20040255161 Cavanaugh Dec 2004 A1
20040268147 Wiederin et al. Dec 2004 A1
20050005159 Oliphant Jan 2005 A1
20050021276 Southam Jan 2005 A1
20050021740 Bar et al. Jan 2005 A1
20050033960 Vialen et al. Feb 2005 A1
20050033989 Poletto et al. Feb 2005 A1
20050050148 Mohammadioun et al. Mar 2005 A1
20050086523 Zimmer et al. Apr 2005 A1
20050091513 Mitomo et al. Apr 2005 A1
20050091533 Omote et al. Apr 2005 A1
20050091652 Ross et al. Apr 2005 A1
20050108562 Khazan et al. May 2005 A1
20050114663 Cornell et al. May 2005 A1
20050125195 Brendel Jun 2005 A1
20050149726 Joshi et al. Jul 2005 A1
20050157662 Bingham et al. Jul 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050201297 Peikari Sep 2005 A1
20050210533 Copeland et al. Sep 2005 A1
20050238005 Chen et al. Oct 2005 A1
20050240781 Gassoway Oct 2005 A1
20050262562 Gassoway Nov 2005 A1
20050265331 Stolfo Dec 2005 A1
20050268342 Shay Dec 2005 A1
20050283839 Cowburn Dec 2005 A1
20060005043 Hsueh Jan 2006 A1
20060010495 Cohen et al. Jan 2006 A1
20060015416 Hoffman et al. Jan 2006 A1
20060015715 Anderson Jan 2006 A1
20060015747 Van de Ven Jan 2006 A1
20060021029 Brickell et al. Jan 2006 A1
20060021054 Costa et al. Jan 2006 A1
20060031476 Mathes et al. Feb 2006 A1
20060047665 Neil Mar 2006 A1
20060070130 Costea et al. Mar 2006 A1
20060075496 Carpenter et al. Apr 2006 A1
20060095968 Portolani et al. May 2006 A1
20060101516 Sudaharan et al. May 2006 A1
20060101517 Banzhof et al. May 2006 A1
20060117385 Mester et al. Jun 2006 A1
20060123477 Raghavan et al. Jun 2006 A1
20060143709 Brooks et al. Jun 2006 A1
20060150249 Gassen et al. Jul 2006 A1
20060161983 Cothrell et al. Jul 2006 A1
20060161987 Levy-Yurista Jul 2006 A1
20060161989 Reshef et al. Jul 2006 A1
20060164199 Gilde et al. Jul 2006 A1
20060173992 Weber et al. Aug 2006 A1
20060179147 Tran et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191010 Benjamin Aug 2006 A1
20060221956 Narayan et al. Oct 2006 A1
20060236393 Kramer et al. Oct 2006 A1
20060242709 Seinfeld et al. Oct 2006 A1
20060248519 Jaeger et al. Nov 2006 A1
20060248582 Panjwani et al. Nov 2006 A1
20060251104 Koga Nov 2006 A1
20060288417 Bookbinder et al. Dec 2006 A1
20070006288 Mayfield et al. Jan 2007 A1
20070006313 Porras et al. Jan 2007 A1
20070011174 Takaragi et al. Jan 2007 A1
20070016951 Piccard et al. Jan 2007 A1
20070019286 Kikuchi Jan 2007 A1
20070033617 Bloebaum Feb 2007 A1
20070033645 Jones Feb 2007 A1
20070038943 FitzGerald et al. Feb 2007 A1
20070050848 Khalid Mar 2007 A1
20070064689 Shin et al. Mar 2007 A1
20070074169 Chess et al. Mar 2007 A1
20070094730 Bhikkaji et al. Apr 2007 A1
20070101435 Konanka et al. May 2007 A1
20070128855 Cho et al. Jun 2007 A1
20070142030 Sinha et al. Jun 2007 A1
20070143827 Nicodemus et al. Jun 2007 A1
20070156895 Vuong Jul 2007 A1
20070157180 Tillmann et al. Jul 2007 A1
20070157306 Elrod et al. Jul 2007 A1
20070168988 Eisner et al. Jul 2007 A1
20070171824 Ruello et al. Jul 2007 A1
20070174915 Gribble et al. Jul 2007 A1
20070192500 Lum Aug 2007 A1
20070192858 Lum Aug 2007 A1
20070198275 Malden et al. Aug 2007 A1
20070208822 Wang et al. Sep 2007 A1
20070220607 Sprosts et al. Sep 2007 A1
20070240218 Tuvell et al. Oct 2007 A1
20070240219 Tuvell et al. Oct 2007 A1
20070240220 Tuvell et al. Oct 2007 A1
20070240222 Tuvell et al. Oct 2007 A1
20070250627 May Oct 2007 A1
20070250930 Aziz et al. Oct 2007 A1
20070256132 Oliphant Nov 2007 A2
20070271446 Nakamura Nov 2007 A1
20080005782 Aziz Jan 2008 A1
20080018122 Zierler et al. Jan 2008 A1
20080028463 Dagon et al. Jan 2008 A1
20080040710 Chiriac Feb 2008 A1
20080046781 Childs et al. Feb 2008 A1
20080066179 Liu Mar 2008 A1
20080072326 Danford et al. Mar 2008 A1
20080077793 Tan et al. Mar 2008 A1
20080080518 Hoeflin et al. Apr 2008 A1
20080086720 Lekel Apr 2008 A1
20080098476 Syversen Apr 2008 A1
20080120722 Sima et al. May 2008 A1
20080134178 Fitzgerald et al. Jun 2008 A1
20080134334 Kim et al. Jun 2008 A1
20080141376 Clausen et al. Jun 2008 A1
20080181227 Todd Jul 2008 A1
20080184373 Traut et al. Jul 2008 A1
20080189787 Arnold et al. Aug 2008 A1
20080201778 Guo et al. Aug 2008 A1
20080209557 Herley et al. Aug 2008 A1
20080215742 Goldszmidt et al. Sep 2008 A1
20080222729 Chen et al. Sep 2008 A1
20080263665 Ma et al. Oct 2008 A1
20080295172 Bohacek Nov 2008 A1
20080301810 Lehane et al. Dec 2008 A1
20080307524 Singh et al. Dec 2008 A1
20080313738 Enderby Dec 2008 A1
20080320594 Jiang Dec 2008 A1
20090003317 Kasralikar et al. Jan 2009 A1
20090007100 Field et al. Jan 2009 A1
20090013408 Schipka Jan 2009 A1
20090031423 Liu et al. Jan 2009 A1
20090036111 Danford et al. Feb 2009 A1
20090037835 Goldman Feb 2009 A1
20090044024 Oberheide et al. Feb 2009 A1
20090044274 Budko et al. Feb 2009 A1
20090064332 Porras et al. Mar 2009 A1
20090077666 Chen et al. Mar 2009 A1
20090083369 Marmor Mar 2009 A1
20090083855 Apap et al. Mar 2009 A1
20090089879 Wang et al. Apr 2009 A1
20090094697 Provos et al. Apr 2009 A1
20090113425 Ports et al. Apr 2009 A1
20090125976 Wassermann et al. May 2009 A1
20090126015 Monastyrsky et al. May 2009 A1
20090126016 Sobko et al. May 2009 A1
20090133125 Choi et al. May 2009 A1
20090144823 Lamastra et al. Jun 2009 A1
20090158430 Borders Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090187992 Poston Jul 2009 A1
20090193293 Stolfo et al. Jul 2009 A1
20090199296 Xie et al. Aug 2009 A1
20090228233 Anderson et al. Sep 2009 A1
20090241187 Troyansky Sep 2009 A1
20090241190 Todd et al. Sep 2009 A1
20090265692 Godefroid et al. Oct 2009 A1
20090271867 Zhang Oct 2009 A1
20090300415 Zhang et al. Dec 2009 A1
20090300761 Park et al. Dec 2009 A1
20090328185 Berg et al. Dec 2009 A1
20090328213 Blake Dec 2009 A1
20090328221 Blumfield et al. Dec 2009 A1
20100005146 Drako et al. Jan 2010 A1
20100011205 McKenna Jan 2010 A1
20100017546 Poo et al. Jan 2010 A1
20100031353 Thomas et al. Feb 2010 A1
20100037314 Perdisci et al. Feb 2010 A1
20100043073 Kuwamura Feb 2010 A1
20100054278 Stolfo et al. Mar 2010 A1
20100058474 Hicks Mar 2010 A1
20100064044 Nonoyama Mar 2010 A1
20100077481 Polyakov et al. Mar 2010 A1
20100083376 Pereira et al. Apr 2010 A1
20100115621 Staniford et al. May 2010 A1
20100132038 Zaitsev May 2010 A1
20100154056 Smith et al. Jun 2010 A1
20100180344 Malyshev et al. Jul 2010 A1
20100192223 Ismael et al. Jul 2010 A1
20100220863 Dupaquis et al. Sep 2010 A1
20100235831 Dittmer Sep 2010 A1
20100251104 Massand Sep 2010 A1
20100281102 Chinta et al. Nov 2010 A1
20100281541 Stolfo et al. Nov 2010 A1
20100281542 Stolfo et al. Nov 2010 A1
20100287260 Peterson et al. Nov 2010 A1
20100299754 Amit et al. Nov 2010 A1
20100306173 Frank Dec 2010 A1
20110004737 Greenebaum Jan 2011 A1
20110025504 Lyon et al. Feb 2011 A1
20110041179 St Hlberg Feb 2011 A1
20110047594 Mahaffey et al. Feb 2011 A1
20110047620 Mahaffey et al. Feb 2011 A1
20110055907 Narasimhan et al. Mar 2011 A1
20110078794 Manni et al. Mar 2011 A1
20110093951 Aziz Apr 2011 A1
20110099620 Stavrou et al. Apr 2011 A1
20110099633 Aziz Apr 2011 A1
20110113231 Kaminsky May 2011 A1
20110145918 Jung et al. Jun 2011 A1
20110145920 Mahaffey et al. Jun 2011 A1
20110145934 Abramovici et al. Jun 2011 A1
20110167493 Song et al. Jul 2011 A1
20110167494 Bowen et al. Jul 2011 A1
20110173460 Ito et al. Jul 2011 A1
20110219449 St. Neitzel et al. Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225624 Sawhney et al. Sep 2011 A1
20110225655 Niemela et al. Sep 2011 A1
20110247072 Staniford et al. Oct 2011 A1
20110265182 Peinado et al. Oct 2011 A1
20110289582 Kejriwal et al. Nov 2011 A1
20110302587 Nishikawa et al. Dec 2011 A1
20110307954 Melnik et al. Dec 2011 A1
20110307955 Kaplan et al. Dec 2011 A1
20110307956 Yermakov et al. Dec 2011 A1
20110314546 Aziz et al. Dec 2011 A1
20120023593 Puder et al. Jan 2012 A1
20120054869 Yen et al. Mar 2012 A1
20120066698 Yanoo Mar 2012 A1
20120079596 Thomas et al. Mar 2012 A1
20120084859 Radinsky et al. Apr 2012 A1
20120110667 Zubrilin et al. May 2012 A1
20120117652 Manni et al. May 2012 A1
20120121154 Xue et al. May 2012 A1
20120124426 Maybee et al. May 2012 A1
20120174186 Aziz et al. Jul 2012 A1
20120174196 Bhogavilli et al. Jul 2012 A1
20120174218 McCoy et al. Jul 2012 A1
20120198279 Schroeder Aug 2012 A1
20120210423 Friedrichs et al. Aug 2012 A1
20120222121 Staniford et al. Aug 2012 A1
20120255015 Sahita et al. Oct 2012 A1
20120255017 Sallam Oct 2012 A1
20120260342 Dube et al. Oct 2012 A1
20120266244 Green et al. Oct 2012 A1
20120278886 Luna Nov 2012 A1
20120297489 Dequevy Nov 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20130014259 Gribble et al. Jan 2013 A1
20130036472 Aziz Feb 2013 A1
20130047257 Aziz Feb 2013 A1
20130074185 McDougal et al. Mar 2013 A1
20130086684 Mohler Apr 2013 A1
20130097699 Balupari et al. Apr 2013 A1
20130097706 Titonis et al. Apr 2013 A1
20130111587 Goel et al. May 2013 A1
20130117852 Stute May 2013 A1
20130117855 Kim et al. May 2013 A1
20130139264 Brinkley et al. May 2013 A1
20130160125 Likhachev et al. Jun 2013 A1
20130160127 Jeong et al. Jun 2013 A1
20130160130 Mendelev et al. Jun 2013 A1
20130160131 Madou et al. Jun 2013 A1
20130167236 Sick Jun 2013 A1
20130174214 Duncan Jul 2013 A1
20130185789 Hagiwara et al. Jul 2013 A1
20130185795 Winn et al. Jul 2013 A1
20130185798 Saunders et al. Jul 2013 A1
20130191915 Antonakakis et al. Jul 2013 A1
20130196649 Paddon et al. Aug 2013 A1
20130227691 Aziz et al. Aug 2013 A1
20130246370 Bartram et al. Sep 2013 A1
20130263260 Mahaffey et al. Oct 2013 A1
20130291109 Staniford et al. Oct 2013 A1
20130298243 Kumar et al. Nov 2013 A1
20140053260 Gupta et al. Feb 2014 A1
20140053261 Gupta et al. Feb 2014 A1
20140130158 Wang et al. May 2014 A1
20140137180 Lukacs et al. May 2014 A1
20140169762 Ryu Jun 2014 A1
20140179360 Jackson et al. Jun 2014 A1
20140328204 Klotsche et al. Nov 2014 A1
20140337836 Ismael Nov 2014 A1
20140351935 Shao et al. Nov 2014 A1
20150096025 Ismael Apr 2015 A1
Foreign Referenced Citations (11)
Number Date Country
2439806 Jan 2008 GB
2490431 Oct 2012 GB
0223805 Mar 2002 WO
0206928 Nov 2003 WO
2007117636 Oct 2007 WO
2008041950 Apr 2008 WO
2011084431 Jul 2011 WO
2011112348 Sep 2011 WO
2012075336 Jun 2012 WO
2012145066 Oct 2012 WO
2013067505 May 2013 WO
Non-Patent Literature Citations (79)
Entry
Simulating Realistic Network Worm Traffic for Worm Warning System Design and Testing, Liljenstram et al , Worm'03, Oct. 27, 2003, Washington DC, USA, ACM 1-58113-785-0/03/0010, pp. 1-10.
ReVirt: Enabling Intrusion analysis through virtual-machine logging and replay, W.Dunlap et al , 2002 sysmposium on operating systems design and Implementation(OSDI), pp. 1-14.
When Virtual is better than Real, M.Chen et al , Department of electrical engineering and comptuer science University of Michigan, pp. 1-6.
NetDetector Captures Intrusions, Info World media group Inc 1-2, Jul. 14, 2003, Issue 27, pp. 1-2.
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003).
“Packet”, Microsoft Computer Dictionary, Microsoft Press, (Mar. 2002), 1 page.
“When Virtual is Better Than Real”, IEEEXplore Digital Library, available at, http://ieeexplore.ieee.org/xpl/articleDetails.isp?reload=true&arnumbe- r=990073, (Dec. 7, 2013).
Aziz, Ashar, System and Method for Malware Containment, U.S. Appl. No. 14/620,060, filed Feb. 11, 2015, non-Final Office Action dated Apr. 3, 2015.
Baecher, “The Nepenthes Platform: An Efficient Approach to collect Malware”, Springer-verlag Berlin Heidelberg, (2006), pp. 165-184.
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77.
Boubalos, Chris , “Extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives”, available at http://seclists.org/honeypots/2003/q2/319 (“Boubalos”), (Jun. 5, 2003).
Chaudet, C. , et al., “Optimal Positioning of Active and Passive Monitoring Devices”, International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulousse, France, (Oct. 2005), pp. 71-82.
Cisco, Configuring the Catalyst Switched Port Analyzer (SPAN) (“Cisco”), (1992-2003).
Costa, M. , et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005).
Crandall, J.R. , et al., “Minos:Control Data Attack Prevention Orthogonal to Memory Model”, 37th International Symposium on Microarchitecture, Portland, Oregon, (Dec. 2004).
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007).
Dunlap, George W. , et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceeding of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002).
Excerpt regarding First Printing Date for Merike Kaeo, Designing Network Security (“Kaeo”), (2005).
Kaeo, Merike , “Designing Network Security”, (“Kaeo”), (Nov. 2003).
Kim, H. , et al., “Autograph: Toward Automated, Distributed Worm Signature Detection”, Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286.
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”) (2003).
Krasnyansky, Max , et al., Universal TUN/TAP driver, available at https://www.kernel.org/doc/Documentation/networking/tuntap.txt (2002) (“Krasnyansky”).
Kreibich, C. , et al., “Honeycomb-Creating Intrusion Detection Signatures Using Honeypots”, 2nd Workshop on Hot Topics in Networks (HotNets-11), Boston, USA, (2003).
Kristoff, J. , “Botnets, Detection and Mitigation: DNS-Based Techniques”, NU Security Day, (2005), 23 pages.
Liljenstam, Michael , et al., “Simulating Realistic Network Traffic for Worm Warning System Design and Testing”, Institute for Security Technology studies, Dartmouth College (“Liljenstam”), (Oct. 27, 2003).
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001).
Margolis, P.E. , “Random House Webster's ‘Computer & Internet Dictionary 3rd Edition’”, ISBN 0375703519, (Dec. 1998).
Moore, D. , et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910.
Natvig, Kurt , “SANDBOXII: Internet”, Virus Bulletin Conference, (“Natvig”), (Sep. 2002).
Newsome, J. , et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software”, In Proceedings of the 12th Annual Network and Distributed System Security, Symposium (NDSS '05), (Feb. 2005).
Newsome, J. , et al., “Polygraph: Automatically Generating Signatures for Polymorphic Worms”, In Proceedings of the IEEE Symposium on Security and Privacy, (May 2005).
Nojiri, D. , et al., “Cooperation Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302.
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Roonald Perez, Leendert van Doom, John Linwood Griffin, Stefan Berger., sHype: Secure Hypervisor Appraoch to Trusted Virtualized Systems (Feb. 2, 2005) (“Sailer”).
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25.
Singh, S. , et al., “Automated Worm Fingerprinting”, Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004).
Spitzner, Lance , “Honeypots: Tracking Hackers”, (“Spizner”), (Sep. 17, 2002).
Thomas H. Ptacek, and Timothy N. Newsham , “Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection”, Secure Networks, (“Ptacek”), (Jan. 1998).
U.S. Appl. No. 14/059,381, filed Oct. 21, 2013 Non-Final Office Action dated Oct. 29, 2014.
Venezia, Paul , “NetDetector Captures Intrusions”, InfoWorld Issue 27, (“Venezia”), (Jul. 14, 2003).
Whyte, et al., “DNS-Based Detection of Scanning Works in an Enterprise Network”, Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages.
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9.
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108.
Adetoye, Adedayo , et al., “Network Intrusion Detection & Response System”, (“Adetoye”), (Sep. 2003).
AltaVista Advanced Search Results. “Attack vector identifier”. Http://www.altavista.com/web/results?ltag=ody&pg=aq&aqmode=aqa=Event+Orch- estrator . . . , (Accessed on Sep. 15, 2009).
AltaVista Advanced Search Results. “Event Orchestrator”. Http://www.altavista.com/web/results?ltag=ody&pg=aq&aqmode=aqa=Event+Orch- esrator . . . , (Accessed on Sep. 3, 2009).
Apostolopoulos, George; hassapis, Constantinos; “V-eM: A cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation”, 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126.
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006.
Baldi, Mario; Risso, Fulvio; "A Framework for Rapid Development and Portable Execution of Packet-Handling Applications", 5th IEEE International Symposium on Signal Processing and Information Technology, Dec. 21, 2005, pp. 233-238.
Cisco “Intrusion Prevention for the Cisco ASA 5500-x Series” Data Sheet (2012).
Clark, John, Sylvain Leblanc, and Scott Knight. "Risks associated with USB hardware trojan devices used by insiders." Systems Conference (SysCon), 2011 IEEE International. IEEE, 2011.
Cohen, M.I., "PyFlag—An advanced network forensic framework", Digital Investigation 5, Elsevier, (2008), pp. S112-S120.
Deutsch, P., "Zlib compressed data format specification version 3.3", RFC 1950, (1996).
Filiol, Eric, et al., "Combinatorial Optimisation of Worm Propagation on an Unknown Network", International Journal of Computer Science 2.2 (2007).
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010.
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010.
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011.
Gibler, Clint, et al. AndroidLeaks: automatically detecting potential privacy leaks in android applications on a large scale. Springer Berlin Heidelberg, 2012.
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28.
Gregg Keizer: "Microsoft's HoneyMonkeys Show Patching Windows Works", Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:https://web.archive.org/web/20121022220617/http://www.informationweek.com/microsofts-honeymonkeys-show-patching-wi/167600716 [retrieved on Sep. 29, 2014].
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase @ CMU, Carnegie Mellon University, 2007.
Hjelmvik, Erik, "Passive Network Security Analysis with NetworkMiner", (IN)Secure, Issue 18, (Oct. 2008), pp. 1-100.
Idika et al., A-Survey-of-Malware-Detection-Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University.
IEEE Xplore Digital Library Search Results for "detection of unknown computer worms". Http://ieeexplore.ieee.org/searchresult.jsp?SortField=Score&SortOrder=desc&ResultC . . . , (Accessed on Aug. 28, 2009).
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. “Kernel-based behavior analysis for android malware detection.” Computational intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011.
Kevin A. Roundy et al: "Hybrid Analysis and Control of Malware", Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454, ISBN: 978-3-642-15511-6.
Leading Colleges Select FireEye to Stop Malware-Related Data Breaches, FireEye Inc., 2009.
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711.
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011.
Lok Kwong et al: "DroidScope: Seamlessly Reconstructing the OS and Dalvik Semantic Views for Dynamic Android Malware Analysis", Aug. 10, 2012, XP055158513, Retrieved from the Internet: URL:https://www.usenix.org/system/files/conference/usenixsecurity12/sec12-final107.pdf [retrieved on Dec. 15, 2014].
Morales, Jose A., et al., “Analyzing and exploiting network behaviors of malware.”, Security and Privacy in Communication Networks. Springer Berlin Heidelberg, 2010. 20-34.
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg.
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987.
Oberheide et al., "CloudAV: N-Version Antivirus in the Network Cloud", 17th USENIX Security Symposium (USENIX Security '08), Jul. 28-Aug. 1, 2008, San Jose, CA.
The Sniffer's Guide to Raw Traffic, available at: yuba.stanford.edu/~casado/pcap/section1.html, (Jan. 6, 2014).
U.S. Pat. No. 8,171,553, filed Apr. 20, 2006, Inter Partes Review Decision dated Jul. 10, 2015.
U.S. Pat. No. 8,291,499, filed Mar. 16, 2012, Inter Partes Review Decision dated Jul. 10, 2015.
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350.
Yuhei Kawakoya et al: "Memory behavior-based automatic malware unpacking in stealth debugging environment", Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN: 978-1-4244-9353-1.
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82.
Provisional Applications (3)
Number      Date       Country
60/579,910  Jun. 2004  US
60/579,953  Jun. 2004  US
60/559,198  Apr. 2004  US
Continuations (2)
Number             Date       Country
Parent 14/059,381  Oct. 2013  US
Child  14/949,771             US
Parent 11/717,475  Mar. 2007  US
Child  14/059,381             US
Continuation in Parts (6)
Number             Date       Country
Parent 11/494,990  Jul. 2006  US
Child  11/717,475             US
Parent 11/471,072  Jun. 2006  US
Child  11/494,990             US
Parent 11/409,355  Apr. 2006  US
Child  11/471,072             US
Parent 11/152,286  Jun. 2005  US
Child  11/409,355             US
Parent 11/151,812  Jun. 2005  US
Child  11/152,286             US
Parent 11/096,287  Mar. 2005  US
Child  11/151,812             US