System and method for bot detection

Information

  • Patent Grant
  • Patent Number
    10,587,636
  • Date Filed
    Monday, April 17, 2017
  • Date Issued
    Tuesday, March 10, 2020
Abstract
Exemplary systems and methods for detecting a communication channel of a bot are provided. In exemplary embodiments, presence of a communication channel between a first network device and a second network device is detected. Data from the communication channel is scanned and used to determine if a suspected bot communication exists. If a bot communication is detected, then a recovery process may be initiated.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

The present invention relates generally to network security and more particularly to detecting command and control communication channels of a bot.


Background Art

Presently, malicious software (i.e., malware) can attack various devices via a network. For example, malware may include any program or file that is harmful to a computer user, such as bots, computer viruses, worms, Trojan horses, spyware, or any programming that gathers information about a computer user or otherwise operates without permission. Various processes and devices have been employed to prevent the problems that malware can cause.


For example, computers often include antivirus scanning software that scans a particular client device for viruses. The scanning may be performed based on a schedule specified by a user associated with the particular computer, a system administrator, and so forth. Unfortunately, by the time a virus is detected by the scanning software, some damage on the particular computer may have already occurred.


Another option for preventing malware is a honey pot. A honey pot is a computer system on the Internet that is expressly set up to attract and “trap” an illicit user that attempts to penetrate another's computer system. The illicit user can include a hacker, a cracker, or a script kiddy, for example. The honey pot records the activities associated with the invasion of the computer system. Disadvantageously, as the honey pot is being invaded, so too are other users' computer systems on the same network. Thus, other users' computer systems may be harmed while the honey pot determines the nature of the malware invading the honey pot's own computer system.


In some instances, malware comprises a bot. A bot is a software robot configured to remotely control all or a portion of a digital device (e.g., a computer) without authorization by the digital device's user. Bot related activities include bot propagation and attacking other computers on a network. Bots commonly propagate by scanning nodes (e.g., computers or other digital devices) available on a network to search for a vulnerable target. When a vulnerable computer is scanned, the bot may install a copy of itself. Once installed, the new bot may continue to seek other computers on a network to infect.


A bot may also, without the authority of the infected computer user, establish a command and control communication channel to receive instructions. Bots may receive command and control communication from a centralized bot server or another infected computer (e.g., via a peer-to-peer (P2P) network established by a bot on the infected computer).


The bot may receive instructions to perform bot related activities. When a plurality of bots (i.e., a botnet) act together, the infected computers (i.e., zombies) can perform organized attacks against one or more computers on a network. In one example, bot infected computers may be directed to ping another computer on a network in a denial-of-service attack. In another example, upon receiving instructions, one or more bots may direct the infected computer to transmit spam across a network.


A bot may also receive instructions to transmit information regarding the infected host computer. In one example, the bot may be instructed to act as a keylogger and record keystrokes on the infected host computer. The bot may also be instructed to search for personal information and email addresses of other users contained in an email or contacts file. This information may be transmitted to one or more other infected computers or a user in command of the bot or botnet.


SUMMARY OF THE INVENTION

Systems and methods for detecting a command and control communication channel of a bot are provided. In exemplary embodiments, presence of a communication channel between a first network device and a second network device is detected.


Data from the communication channel is scanned and used to determine if a suspected bot communication exists. Several different methods may be utilized to detect a command and control (C&C) communication within the communication channel. In one embodiment, a fingerprint module may scan for bot oriented command communications in an IRC channel. In one example, the fingerprint module scans for commands or messages that indicate that an IRC channel is being established. In an alternative embodiment, a port module may monitor for communications originating from a non-standard port. In a further embodiment, a virtual machine may be utilized to detect C&C communication channels either in a replay virtual machine environment or in a direct entry virtual machine environment. Accordingly, intercepted or replayed network data obtained from the communication channel is transmitted to the virtual machine, and the virtual machine response is then analyzed to determine if the virtual machine is infected. In some embodiments, an analysis environment may wait for an outbound domain name system (DNS) request, which may also identify the C&C channel. A pseudo-DNS server in the virtual machine can respond to the request with an IP address mapped to an internal-to-virtual machine-analysis pseudo-server. The outbound IRC or web request made to the supplied IP address may confirm the C&C channel.


If a bot communication is detected, then a recovery process may be initiated. In one embodiment, during the recovery process, the devices that are suspected as being infected may be flagged and/or proper users and administrators notified. For example, icons associated with nodes coupled to a network may be color coded based on their association with any infection propagation, command and control communication with a bot, and/or bot attack. In another embodiment, a router (i.e., switch) may be configured to direct all data from a bot server (e.g., from the source IP address of the bot server) to a controller. As a result, all the network data from the bot server, not only that which is transmitted to the network device, may be intercepted.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram of a channel detection environment in which embodiments of the present invention may be practiced.



FIG. 2 is a block diagram of an exemplary bot detector implementing some embodiments of the present invention.



FIG. 3 is a block diagram of an exemplary controller implementing some embodiments of the present invention.



FIG. 4 is a block diagram of an exemplary analysis environment, in accordance with some embodiments of the present invention.



FIG. 5 is a flowchart of an exemplary method for detecting a C&C channel of a bot.



FIG. 6 is a block diagram of the controller, in accordance with one embodiment of the present invention.





DESCRIPTION OF EXEMPLARY EMBODIMENTS

Exemplary systems and methods for detection of a command and control communication channel of a bot are provided. The bot running on a compromised device may be part of a plurality of software robots (e.g., a botnet) which run autonomously on a collection of compromised devices under a common command and control (C&C) infrastructure. In one example, a bot on the compromised device may open an Internet Relay Chat (IRC) channel with another device to receive commands. This IRC channel may be referred to as a C&C communication channel.


In some embodiments, the communication channel detection system may comprise a dynamic honey pot. A dynamic honey pot can monitor network traffic to detect the presence of a C&C communication channel. If a C&C channel or a suspected C&C channel is detected, then the network data from the C&C communication channel may be intercepted. In exemplary embodiments, the network traffic does not need to be directly transmitted to the dynamic honey pot. Rather, the dynamic honey pot can detect possible bot infection attempts or command and control communication with an existing bot on other devices on the network. Upon detection, the dynamic honey pot can then intercept future network data.


In exemplary embodiments, network data from a communication network may be copied and analyzed. If a C&C channel or a suspected C&C channel is detected, related network data may be intercepted. The intercepted network data may continue to be analyzed. If the intercepted network data comprises a network attack, command and control communication, and/or an attempt to propagate the bot, an unauthorized activity signature configured to identify the activity and/or bot may be generated.


The bot compromises one or more devices, and the compromised devices may send spam and malware, such as viruses, worms, or Trojan horses, for example. A virus is an intrusive program that infects a computer file by inserting a copy of itself in the file. The copy is usually executed when the file is loaded into memory, allowing the virus to infect other files. A worm is a program that propagates itself across multiple computers, usually by creating copies of itself in each computer's memory. A worm may duplicate itself in a computer so many times that it causes the computer to crash. A Trojan horse is a destructive program disguised as a game, utility, or application. When run by a user or computer program, a Trojan horse can harm the computer system while appearing to do something useful.


Malware may also include adware and spyware. Adware is a program configured to direct advertisements to a computer or a particular user. In one example, adware identifies the computer and/or the user to various websites visited by a browser on the computer. The website may then use the adware to either generate pop-up advertisements or otherwise direct specific advertisements to the user's browser. Spyware is a program configured to collect information regarding the user, the computer, and/or a user's network habits. In an example, spyware may collect information regarding the names and types of websites that the user browses and then transmit the information to another computer. Adware and spyware are often added to the user's computer after the user browses to a website that hosts the adware and/or spyware. The user is often unaware that these programs have been added and are similarly unaware of the adware and/or spyware's function.



FIG. 1 is a diagram of a channel detection environment 100 in which embodiments of the present invention may be practiced. The channel detection environment 100 may comprise a bot server 105 in communication via a communication network 110 with a network device 115. Additionally, a tap 120 may be coupled to the communication network 110. The tap 120 may be further coupled to a controller 125. Optionally, a router (not shown) may be provided for re-routing data from the communication network 110.


The bot server 105 and the network device 115 comprise digital devices. A digital device comprises any device with a processor. Some examples of digital devices include computers, servers, laptops, personal digital assistants, and cellular telephones. The bot server 105 is configured to transmit network data over the communication network 110 to the network device 115, which is configured to receive the network data. In some embodiments, the bot server 105 may establish a C&C communication channel with the network device 115 via the communication network 110. The C&C communication channel may be utilized by the bot server 105 to control a bot on the network device 115 or to control the network device 115 itself.


The bot server 105 may attempt to control the network device 115 by transmitting instructions or a bot to the network device 115. In one example, the bot server 105 is a computer controlled by an illicit user to control one or more bots or one or more network devices 115 through the use of bots. In another example, the bot server 105 is a network device similar to the network device 115; the bot server 105 may be a part of a P2P communication network for transmitting instructions to a bot on another digital device. In this example, once infected, the network device 115 may be a part of a P2P communication network whereby the network device 115 may transmit instructions to another network device similar to a bot server 105.


The tap 120 may comprise a digital data tap configured to monitor network data and provide a copy of the network data to the controller 125. In some embodiments, the tap 120 comprises a span port. The network data comprises signals and data that are transmitted over the communication network 110 including data flows from the bot server 105 to the network device 115. As discussed herein, the network data may include command and control instructions transmitted from the bot server 105. In one example, the tap 120 copies the network data without an appreciable decline in performance of the bot server 105, the network device 115, or the communication network 110. The tap 120 may copy any portion of the network data. For example, the tap 120 can receive and copy any number of data packets of the network data. In exemplary embodiments, the tap 120 can monitor and copy data transmitted from multiple devices without appreciably affecting the performance of the communication network 110 or the devices coupled to the communication network 110. In various embodiments, the tap 120 can sample the network data based on a sampling scheme.


The tap 120 can also capture metadata from the network data. The metadata can be associated with the bot server 105 and/or the network device 115. In one example, the metadata may identify the bot server 105 and/or the network device 115. In some embodiments, the bot server 105 transmits metadata, which is captured by the tap 120. In other embodiments, a heuristic module, described in more detail below, can detect the bot server 105 and/or the network device 115 by analyzing data packets within the network data and can generate the metadata.


The communication network 110 may comprise a public computer network such as the Internet, a private computer network such as a wireless telecommunication network, wide area network, local area network, or any other type of network enabled to provide communications between coupled devices.


Although FIG. 1 depicts data transmitted from the bot server 105 to the network device 115, either device can transmit and receive data from the other device. Similarly, although only one bot server 105, communication network 110, network device 115, tap 120, and controller 125 are depicted in FIG. 1, there may be any number of bot servers 105, communication networks 110, network devices 115, taps 120, and controllers 125.


The controller 125 may comprise a processor and/or software configured to receive and analyze network data for the presence of network data sent via the C&C communication channel. In exemplary embodiments, the controller 125 receives network data over the tap 120. If the controller 125 detects commands within network data that potentially establishes a C&C communication channel, the controller 125 may intercept the associated network data. In one example, the controller 125 may intercept network data from the same data flow as that which potentially established the C&C communication channel. In another example, the controller 125 may intercept all network data from a node on the communication network that either received or sent the commands (e.g., the bot server 105 and the network device 115). When network data is intercepted, the network data is no longer received by the intended recipient but rather is received by the controller 125. In some embodiments, the associated network data is intercepted when network data is flagged as suspicious.


In some embodiments, the controller 125 can organize the network data into one or more data flows. Data flows can then be reconstructed based on the network data samples received from the tap. The controller 125 is further discussed in more detail in connection with FIG. 3.



FIG. 2 is a block diagram of an exemplary bot detector 200 implementing some embodiments of the present invention. In various embodiments, the bot detector 200 may be coupled to or comprised within the controller 125. In other embodiments, the bot detector 200 is coupled to the communication network 110. In various embodiments, the bot detector 200 is software that is loaded on a digital device. For example, the bot detector 200 may be provided to a user for installation onto their LAN or a network device (e.g., network device 115).


The exemplary bot detector 200 may comprise a protocol fingerprint module 205, a protocol state description module 210, a port module 215, a signature module 220, and a tracking module 225. Alternative embodiments may comprise more, less, or functionally equivalent modules.


In various embodiments, the IRC protocol is used for bot command and control. Therefore, detecting the existence or establishment of an IRC channel in the network may indicate a possible botnet C&C communication channel. In one embodiment, the protocol fingerprint module 205 is utilized to detect an IRC C&C channel. The exemplary protocol fingerprint module 205 may rely on input/output related behavior that uniquely identifies a protocol implementation (e.g., version number, feature, vendor, etc.). In some embodiments, a network trace may map routes between the bot server 105 and the network device 115.


In exemplary embodiments, network data is scanned to detect a bot oriented IRC command, such as .advscan and SCAN, to highlight IRC channels to a potential bot server 105. Stateful protocol fingerprinting analysis by the protocol fingerprint module 205 may be performed to detect bot oriented commands in the IRC channels. For example, instead of simply scanning for .advscan in an input stream, the protocol fingerprint module 205 may first look for an IRC channel establishment (e.g., JOIN and JOIN confirm commands), and then scan for an .advscan message.
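
By way of illustration only (this sketch is not part of the patent disclosure), the following Python fragment shows the kind of stateful scan described above: a suspected bot command such as .advscan is only reported after an IRC channel establishment (a JOIN and a numeric confirmation) has been observed on the same connection. The class, state names, and regular expressions are hypothetical.

    import re

    JOIN_RE = re.compile(r"\bJOIN\s+#\S+", re.IGNORECASE)
    JOIN_CONFIRM_RE = re.compile(r"^\S+\s+(332|353|366)\s")   # numeric replies that follow a JOIN
    BOT_COMMAND_RE = re.compile(r"\.advscan|\bSCAN\b")

    class IrcChannelFingerprint:
        """Tracks one connection and reports bot oriented commands only
        after an IRC channel has been established on that connection."""

        def __init__(self):
            self.state = "idle"          # idle -> join_sent -> established

        def feed_line(self, line):
            """Returns True when a suspected C&C command is seen in an
            established IRC channel."""
            if self.state == "idle" and JOIN_RE.search(line):
                self.state = "join_sent"
            elif self.state == "join_sent" and JOIN_CONFIRM_RE.search(line):
                self.state = "established"
            elif self.state == "established" and BOT_COMMAND_RE.search(line):
                return True
            return False

Feeding the lines of a reassembled session to feed_line() would therefore not flag a stray .advscan string that appears outside of an established IRC channel.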


In some embodiments, the protocol fingerprint module 205 may be extensible to other protocols via protocol feature description using protocol state descriptions provided by the protocol state description module 210 and regular expressions. A description of the IRC protocol is made possible using this technique. For example, if the protocol state description module 210 determines that the protocol being used is IRC, the protocol fingerprint module 205 may be configured by the protocol state description module 210 to detect IRC commands.
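
As a further illustration (again, not part of the patent text), the extensibility described above can be pictured as a data-driven description that a generic engine walks; the dictionary layout and field names below are assumptions about how such a protocol state description might look.

    import re

    # Hypothetical protocol state description for an IRC-like protocol:
    # each state lists patterns that advance the state machine and
    # patterns that raise an alert while in that state.
    IRC_DESCRIPTION = {
        "initial": {"advance": [(r"\bJOIN\s+#\S+", "joining")], "alert": []},
        "joining": {"advance": [(r"\s(332|353|366)\s", "established")], "alert": []},
        "established": {"advance": [], "alert": [r"\.advscan", r"\bSCAN\b"]},
    }

    def scan_stream(description, lines):
        """Generic engine: walks the supplied state description and yields
        every line that matches an alert pattern in the current state."""
        state = "initial"
        for line in lines:
            for pattern, next_state in description[state]["advance"]:
                if re.search(pattern, line):
                    state = next_state
                    break
            for pattern in description[state]["alert"]:
                if re.search(pattern, line):
                    yield state, line

Supporting another protocol would then amount to supplying a different description dictionary rather than changing the engine.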


In various embodiments, the signature module 220 provides signatures which identify bot related behavior, known bot servers 105, suspected bot servers, mechanisms to block attacks, mechanisms to block bot propagation, and/or mechanisms that remove bots from network devices 115. These signatures may be provided to the protocol fingerprint module 205 and/or the controller 125 to take corrective action.


By correlating infection propagation with C&C communication activity, a higher degree of confidence can be ascribed to a suspected bot list. For example, if traffic is observed on a suspected IRC C&C channel and immediately thereafter there is discovery of infection propagation from the IRC server (e.g., bot server 105 or network device 115) that provided the C&C communication, then all nodes that have communicated to the same IRC server are highly suspect. This broadens the visibility of infected systems from those that are observed actively propagating infections to systems that have not been observed actively propagating but have been in communication with a confirmed active bot server 105.
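
The correlation step can be summarized with a short sketch (illustrative only; the data structures are assumptions): once infection propagation is traced to an IRC server that also carried suspected C&C traffic, every node observed communicating with that server is promoted to the highly suspect list.

    def escalate_suspects(irc_clients_by_server, propagating_servers):
        """irc_clients_by_server: maps a suspected C&C server address to the
        set of nodes observed communicating with it.
        propagating_servers: servers from which infection propagation has
        been observed. Returns the set of highly suspect nodes."""
        highly_suspect = set()
        for server in propagating_servers:
            highly_suspect |= irc_clients_by_server.get(server, set())
        return highly_suspect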


Furthermore, detection of a central C&C server allows authorities and system administrators to take the central C&C server offline and/or block communications with the central C&C server. Once the C&C server has been neutralized, the bots that may otherwise receive commands from the C&C server are no longer controlled and are, in some examples, unable to function.


However, because the bot server 105 may be easily neutralized by shutting down the central C&C server, botnets controlled using a Peer-to-Peer (P2P) communications protocol have been developed. Due to the distributed nature of a P2P communication channel, it is much harder to shut down a P2P controlled botnet.


In various embodiments, detection of a P2P C&C channel may be performed by the port module 215 detecting communications on a seldom used (non-standard) port. During base-lining, standard well-known ports are marked. For example, all well-known ports and services in a network environment may be categorized as “standard.” A standard list may be compiled and stored by the port module 215. In exemplary embodiments, the standard list may comprise all ports defined in Internet RFCs as well as ports and services used by standard versions of Windows and Linux.


In some embodiments, the port module 215 may “learn” standard ports through observation of network data on a communication network 110. In one example, the software on multiple network devices 115 may transmit and receive network data on a variety of ports. The network data is received by the controller 125 and the port module 215 may update the standard list based on the ports of the network devices 115 that receive and transmit data over a predetermined period of time.
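
A minimal sketch of the base-lining described in the two preceding paragraphs follows (illustrative only; the seed range and learning window are assumptions): well-known ports are marked as standard, ports observed in use during a learning period are added, and everything else is treated as non-standard.

    import time

    class PortBaseline:
        """Builds the 'standard' port list from well-known ports plus ports
        observed during a learning window."""

        def __init__(self, learning_seconds=86400):
            # Seed with the well-known range; a deployment would also add
            # ports from Internet RFCs and standard Windows/Linux services.
            self.standard_ports = set(range(0, 1024))
            self.learning_until = time.time() + learning_seconds

        def observe(self, port):
            if time.time() < self.learning_until:
                self.standard_ports.add(port)

        def is_non_standard(self, port):
            return port not in self.standard_ports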


Any port not on the standard list may be considered a non-standard port. In some embodiments, the port module 215 will mark a number of nodes communicating over a non-standard port within a predetermined period of time as a potential P2P communications channel when the number of nodes exceeds a threshold (e.g., 3 or 4 nodes). In one example, the port module 215 will mark a potential P2P communications channel when four network devices 115 communicate with each other over a non-standard port within 4 seconds.


These nodes do not need to be communicating on the same port, as long as the ports are seldom used non-standard ports. For example, the port module 215 may detect P2P chains that use a different port for each leg of the chain. In some embodiments, the time difference between anomaly propagation in the chain may be assumed to be small (e.g., less than 10 seconds). This short time difference allows the tracking module 225 to track various nodes without running into resource constraint issues. In one example, the tracking module 225 identifies network devices 115 that communicate with other network devices 115 over the predetermined period of time. The port module 215 may identify those network devices 115 communicating over non-standard ports. Once the port module 215 detects a network device 115 communicating over a non-standard port, the port module 215 may check the tracking module 225 to determine if any other network device 115 has been communicating over non-standard ports.
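
As a sketch of the threshold logic above (illustrative only; the node threshold and time window mirror the examples in the two preceding paragraphs but are otherwise assumptions), the port module could keep a sliding window of recent non-standard-port communications and flag a potential P2P channel when enough distinct nodes appear, even when each leg of a chain uses a different port.

    import collections
    import time

    class P2PChannelDetector:
        """Flags a potential P2P C&C channel when at least node_threshold
        distinct nodes communicate over non-standard ports within window
        seconds; the ports do not need to match."""

        def __init__(self, baseline, node_threshold=4, window=4.0):
            self.baseline = baseline           # e.g., the PortBaseline sketch above
            self.node_threshold = node_threshold
            self.window = window
            self.recent = collections.deque()  # (timestamp, node) pairs

        def observe(self, src, dst, dst_port, now=None):
            if not self.baseline.is_non_standard(dst_port):
                return False
            now = time.time() if now is None else now
            self.recent.append((now, src))
            self.recent.append((now, dst))
            # Drop entries that have fallen outside the window.
            while self.recent and now - self.recent[0][0] > self.window:
                self.recent.popleft()
            nodes = {node for _, node in self.recent}
            return len(nodes) >= self.node_threshold

Nodes flagged this way would then be handed to the tracking module for color coding and further observation.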


In other embodiments, the tracking module 225 tracks the source and destination of at least some communications over the communication network 110. If a bot server 105 or a potential bot server is detected, the tracking module 225 can provide a list of network devices 115 in communication with the bot server 105 or the potential bot server. In one example, the tracking module 225 can provide a list of nodes in communication with a suspected bot server 105 over a predetermined period of time.


While existence of a P2P channel is not conclusive evidence of a botnet, network operators may benefit from notification of P2P communications on their networks. If a P2P communication can be correlated to infection propagation via one or more nodes of the P2P chain, then all nodes of the P2P network may become highly suspect as members of a P2P controlled botnet.


In exemplary embodiments of the present invention, systems may be marked in order to identify infections. For example, any nodes that are not associated with any infection propagation may be placed in a yellow category. These nodes (e.g., network devices 115) may be considered “nodes of interest.” Nodes in an IRC or P2P network where at least one of the nodes (e.g., in a chain of nodes) is observed propagating an infection may be placed in, for example, an orange category. Nodes that are observed to be actively propagating an infection may be placed in a red category. Any nodes that have not been categorized as yellow, orange, or red may be assigned to a green category. In various embodiments, icons associated with nodes may be colored and/or associated with a color category.
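
The category assignment can be captured in a small sketch (illustrative only; the rule precedence reflects one reading of the paragraph above, and the field names are hypothetical):

    from collections import namedtuple

    NodeStatus = namedtuple(
        "NodeStatus", ["propagating", "channel_propagates", "in_suspect_channel"])

    def categorize_node(node):
        """propagating: observed actively propagating an infection (red).
        channel_propagates: member of an IRC or P2P network in which at
        least one node was observed propagating (orange).
        in_suspect_channel: node of interest in a suspected channel with
        no propagation observed (yellow); anything else is green."""
        if node.propagating:
            return "red"
        if node.channel_propagates:
            return "orange"
        if node.in_suspect_channel:
            return "yellow"
        return "green"

For example, categorize_node(NodeStatus(False, True, True)) returns "orange".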



FIG. 3 is a block diagram of an exemplary controller 125 implementing embodiments of the present invention. The controller 125 may be any digital device or software that receives network data. The exemplary controller 125 may comprise bot detection components similar to the bot detector 200 of FIG. 2 including a protocol fingerprint module 305, a protocol state description module 310, and a tracking module 315. In this example, the functions of the tracking module 225 (FIG. 2) and the port module 215 (FIG. 2) are combined in the tracking module 315.


The controller 125 may further comprise a heuristic module 320, a scheduler 325, a fingerprint module 330, a virtual machine pool 335, an analysis environment 340, a signature module 345, and a policy engine 350. In some embodiments, the controller 125 comprises a tap which is further coupled to the communication network 110. In other embodiments, the controller 125 is coupled to an external tap 120 or may be directly coupled to the communication network 110.


The exemplary heuristic module 320 may receive a copy of network data from the communication network 110. The heuristic module 320 applies heuristics and/or probability analysis to determine if the network data may contain suspicious activity (such as bot related activity). In one example, the heuristic module 320 flags network data as suspicious. The network data can then be buffered and organized into a data flow. The data flow is then provided to the scheduler 325. In some embodiments, the network data is provided directly to the scheduler 325 without buffering or organizing the data flow.


The heuristic module 320 can perform any heuristic and/or probability analysis. In some embodiments, once a C&C communication channel has been detected or suspected, analysis may be performed to confirm and/or verify the C&C channel. Once the protocol fingerprint module 305 identifies a potential C&C communication channel, network data from the channel is forwarded to the scheduler 325.


In other embodiments, the heuristic module 320 performs a dark internet protocol (IP) heuristic. A dark IP heuristic can flag network data coming from a bot server 105 that has not previously been identified by the heuristic module 320. The dark IP heuristic can also flag network data going to an unassigned IP address. In an example, an attacker scans random IP addresses of a network to identify an active server or workstation. The dark IP heuristic can flag network data directed to an unassigned IP address.


The heuristic module 320 can also perform a dark port heuristic. A dark port heuristic can flag network data transmitted to an unassigned or unusual port address. Such network data transmitted to an unusual port can be indicative of a port scan by a worm, hacker, or bot. Further, the heuristic module 320 can flag network data from the bot server 105 or network device 115 that is significantly different from traditional data traffic transmitted by the bot server 105 or network device 115. For example, the heuristic module 320 can flag network data from the bot server 105 (e.g., a laptop) that begins to transmit network data that is common to a server.
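
The dark IP and dark port heuristics can be sketched as follows (illustrative only; the packet representation and the assigned-address and assigned-port inventories are assumptions):

    def dark_ip_flag(packet, assigned_ips, previously_seen_sources):
        """Flag traffic going to an unassigned IP address or coming from a
        source the heuristic module has not previously identified."""
        return (packet["dst_ip"] not in assigned_ips
                or packet["src_ip"] not in previously_seen_sources)

    def dark_port_flag(packet, assigned_ports):
        """Flag traffic sent to an unassigned or unusual port, which can
        indicate a port scan by a worm, hacker, or bot."""
        return packet["dst_port"] not in assigned_ports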


The heuristic module 320 can retain data packets belonging to a particular data flow previously copied by the tap 120. In one example, the heuristic module 320 receives data packets from the tap 120 and stores the data packets within a buffer or other memory. Once the heuristic module 320 receives a predetermined number of data packets from a particular data flow, the heuristic module 320 performs the heuristics and/or probability analysis.


In some embodiments, the heuristic module 320 performs heuristic and/or probability analysis on a set of data packets belonging to a data flow. The heuristic module 320 can then continue to receive new data packets belonging to the same data flow. Once a predetermined number of new data packets belonging to the same data flow are received, the heuristic and/or probability analysis can be performed upon the combination of buffered and new data packets to determine a likelihood of suspicious activity.
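
As an illustration of this buffering behavior (the class, the packet threshold, and the analyze callback are assumptions, not part of the patent), the heuristic module could accumulate packets per data flow and re-run its analysis each time a further batch of packets arrives:

    import collections

    class FlowBuffer:
        """Buffers packets per data flow and calls analyze() on the combined
        buffered and new packets every time threshold new packets arrive."""

        def __init__(self, analyze, threshold=10):
            self.analyze = analyze
            self.threshold = threshold
            self.flows = collections.defaultdict(list)
            self.pending = collections.defaultdict(int)

        def add_packet(self, flow_key, packet):
            self.flows[flow_key].append(packet)
            self.pending[flow_key] += 1
            if self.pending[flow_key] >= self.threshold:
                self.pending[flow_key] = 0
                # Analysis runs over the combination of buffered and new packets.
                return self.analyze(self.flows[flow_key])
            return None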


In some embodiments, an optional buffer receives the flagged network data from the heuristic module 320. The buffer can buffer and organize the flagged network data into one or more data flows before providing the one or more data flows to the scheduler 325. In various embodiments, the buffer can buffer network data and stall before providing the network data to the scheduler 325. In one example, the buffer stalls the network data to allow other components of the controller 125 time to complete functions or otherwise clear data congestion.


The scheduler 325 is a module that identifies the network device 115 to receive the copied network data and retrieves a virtual machine associated with the network device 115. A virtual machine may be software that is configured to mimic the performance of a device (e.g., the network device 115). The virtual machine can be retrieved from the virtual machine pool 335.


In some embodiments, the heuristic module 320 transmits the metadata identifying the network device 115 to receive the copied network data to the scheduler 325. In other embodiments, the scheduler 325 receives one or more data packets of the network data from the heuristic module 320 and analyzes the one or more data packets to identify the network device 115. In yet other embodiments, the metadata can be received from the tap 120.


The scheduler 325 can retrieve and configure the virtual machine to mimic pertinent performance characteristics of the network device 115. In one example, the scheduler 325 configures characteristics of the virtual machine to mimic only those features of the network device 115 that are affected by the network data copied by the tap 120. The scheduler 325 can determine the features of the network device 115 that are affected by the network data by receiving and analyzing the network data from the tap 120. Such features of the network device 115 can include ports that are to receive the network data, select device drivers that are to respond to the network data and any other devices coupled to or contained within the network device 115 that can respond to the network data. In other embodiments, the heuristic module 320 can determine the features of the network device 115 that are affected by the network data by receiving and analyzing the network data from the tap 120. The heuristic module 320 can then transmit the features of the destination device to the scheduler 325.


The optional fingerprint module 330 is configured to determine the packet format of the network data to assist the scheduler 325 in the retrieval and/or configuration of the virtual machine. In one example, the fingerprint module 330 determines that the network data is based on a transmission control protocol/internet protocol (TCP/IP). Thereafter, the scheduler 325 will configure a virtual machine with the appropriate ports to receive TCP/IP packets. In another example, the fingerprint module 330 can configure a virtual machine with appropriate ports to receive user datagram protocol/internet protocol (UDP/IP) packets. The fingerprint module 330 can determine any type of packet format of the network data.


In other embodiments, the optional fingerprint module 330 passively determines a software profile of the network data to assist the scheduler 325 in the retrieval and/or configuration of the virtual machine. The software profile may comprise the operating system (e.g., Linux RH6.2) of the bot server 105 that generated the network data. The determination can be based on analysis of the protocol information of the network data. In an example, the fingerprint module 330 determines that the software profile of network data is Windows XP, SP1. The fingerprint module 330 can then configure a virtual machine with the appropriate ports and capabilities to receive the network data based on the software profile. In other examples, the fingerprint module 330 passes the software profile of the network data to the scheduler 325, and the scheduler 325 either selects or configures the virtual machine based on the profile.
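
How a passively determined software profile might drive virtual machine selection can be sketched as follows (illustrative only; the profile strings, image names, and port lists are placeholders and not part of the patent):

    # Hypothetical mapping from a fingerprinted software profile to a
    # virtual machine template and the ports it should expose.
    VM_CATALOG = {
        "Windows XP SP1": {"template": "winxp_sp1.img", "ports": [135, 139, 445]},
        "Linux RH6.2": {"template": "rh62.img", "ports": [22, 80, 111]},
    }

    def select_virtual_machine(software_profile):
        """Returns the template and port configuration for the profile,
        falling back to a generic image when the profile is unknown."""
        return VM_CATALOG.get(
            software_profile, {"template": "generic.img", "ports": [80, 443]})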


The virtual machine pool 335 is configured to store virtual machines. The virtual machine pool 335 may include any storage capable of storing virtual machines. In one example, the virtual machine pool 335 stores a single virtual machine that can be configured by the scheduler 325 to mimic the performance of any network device, such as the network device 115 on the communication network 110. The virtual machine pool 335 can store any number of distinct virtual machines that can be configured to simulate the performance of any of the network devices 115.


The analysis environment 340 is a module that simulates transmission of unencrypted or decrypted network data between the bot server 105 and the network device 115 to identify the effects of malware or illegitimate computer users (e.g., a hacker, computer cracker, or other computer user) by analyzing the simulation of the effects of the network data upon the network device 115 that is carried out on the virtual machine. In exemplary embodiments, there may be multiple analysis environments 340 in order to simulate multiple network data.


In one example, the analysis environment 340 simulates transmission of the network data between the bot server 105 and the network device 115 to analyze the effects of the network data upon the network device 115 to detect unauthorized activity. As the analysis environment 340 simulates the transmission of the network data, behavior of the virtual machine can be closely monitored for unauthorized activity. If the virtual machine crashes, performs illegal operations, or performs bot related activity, the analysis environment 340 can react. In some embodiments, the analysis environment 340 performs dynamic taint analysis to identify unauthorized activity.


Once unauthorized activity is detected, the analysis environment 340 can generate the unauthorized activity signature configured to identify network data containing unauthorized activity (e.g., malware attacks or bot related activity). Since the unauthorized activity signature does not necessarily require probabilistic analysis to detect unauthorized activity within network data, unauthorized activity detection based on the unauthorized activity signature may be very fast and save computing time.


In various embodiments, the unauthorized activity signature may provide code that may be used to eliminate or “patch” portions of network data containing an attack. Further, in some embodiments, the unauthorized activity signature may be used to identify and eliminate (i.e., delete) the malware causing the attack. The unauthorized activity signature may also be used to configure digital devices to eliminate vulnerabilities (e.g., correct system settings such as disabling ActiveX controls in a browser or updating an operating system).


The analysis environment 340 may store the unauthorized activity signature within the signature module 345. The analysis environment 340 may also transmit or command the transmission of the unauthorized activity signature to one or more other controllers 125, bot detectors 200 (e.g., to the signature module 220), network devices 115, switches, and/or servers. By automatically storing and transmitting the unauthorized activity signature, known malware, previously unidentified malware, and the activities of illicit computer users can be quickly controlled and reduced before a computer system is damaged or compromised. The analysis environment 340 is further discussed with respect to FIG. 4.


The signature module 345 receives, authenticates, and stores unauthorized activity signatures. The unauthorized activity signatures may be generated by the analysis environment 340 or another controller 125. The unauthorized activity signatures may then be transmitted to the signature module 345 of one or more controllers 125.


The policy engine 350 is coupled to the heuristic module 320 and is a module that can identify network data as suspicious based upon policies contained within the policy engine 350. In one example, the network device 115 can be a computer designed to attract hackers and/or worms (e.g., a “honey pot”). The policy engine 350 can contain a policy to flag any network data directed to the “honey pot” as suspicious since the “honey pot” should not be receiving any legitimate network data. In another example, the policy engine 350 can contain a policy to flag network data directed to any network device 115 that contains highly sensitive or “mission critical” information.


The policy engine 350 can also dynamically apply a rule to copy all network data related to network data already flagged by the heuristic module 320. In one example, the heuristic module 320 flags a single packet of network data as suspicious. The policy engine 350 then applies a rule to flag all data related to the single packet (e.g., associated data flows) as suspicious. In some embodiments, the policy engine 350 flags network data related to suspicious network data until the analysis environment 340 determines that the network data flagged as suspicious is related to unauthorized activity.


The policy engine 350 may scan network data to detect unauthorized activity based upon an unauthorized activity signature. In some embodiments, the policy engine 350 retrieves the unauthorized activity signature from the signature module 345. The network data is then scanned for unauthorized activity based on the unauthorized activity signature.


The policy engine 350 can scan both the header and body of a packet of network data. In some embodiments, the policy engine 350 scans only the header of the packet for unauthorized activity based on the unauthorized activity signature. If unauthorized activity is found, then no further scanning may be performed. In other embodiments, the policy engine 350 scans the packet contents for unauthorized activity.


Unauthorized activity may be found by scanning only the header of a packet, the contents of the packet, or both the header and the contents of the packet. As a result, unauthorized activity that might otherwise evade discovery can be detected. In one example, evidence of unauthorized activity may be located within the contents of the packet. By scanning only the contents of the packet, unauthorized activity may be detected.


If the packet contents or the packet header indicate that the network data contains unauthorized activity, then the policy engine 350, the protocol fingerprint module 305, the heuristic module 320, or the signature module 345 may take action. In one example, the policy engine 350 may generate a rule or command an interceptor module (not shown) to intercept network data from the node that transmitted the network data and delete or bar the packet from the communication network 110. The policy engine 350 and/or the interceptor module may also quarantine, delete, or bar other packets belonging to the same data flow as the unauthorized activity packet.


Based on a determination that the network data is suspicious, the interceptor module can re-route the associated network data to a virtual machine from the virtual machine pool 335. As discussed herein, the heuristic module 320 can provide information that the network data is suspicious. The interceptor module can intercept all of the network data that is initially flagged by the heuristic module 320. The interceptor module can also base the interception of data on the detection of a malware attack by the analysis environment 340 or a policy or signature by the policy engine 350.


The interceptor module can provide the intercepted data to the heuristic module 320 for analysis with a heuristic or to the analysis environment 340 to orchestrate the transmission of the intercepted data to detect a malware attack. If no malware attack is detected, the interceptor module can transmit some or all of the intercepted data to the intended recipient (e.g., network device 115.) If a malware attack is detected within the intercepted data, the unauthorized activity signature may be generated by the signature module 345 and transmitted to one or more controllers 125 or other digital devices.


The interceptor module can redirect network data from the bot server 105 in any number of ways including, but not limited to, configuring a switch, Address Resolution Protocol (ARP) manipulation, or DHCP services.


The interceptor module may send a request to a switch to redirect network data from any bot server 105 to the controller 125. The switch includes any device configured to receive and direct network data between one or more digital devices. Examples of a switch include, but are not limited to, a router, gateway, bridge, and/or server.


In some embodiments, executable code is loaded onto the switch. In one example, the executable code configures the switch to direct network data from any bot server 105 to the controller 125. In another example, the executable code allows the interceptor module to transmit a request to the switch to direct network data from the bot server 105 to the controller 125. In some embodiments, the interceptor module configures the router to intercept network data from the bot server 105 for a predetermined time. The predetermined time may be set by the interceptor module, preloaded into the switch, or configured by a user.


The interceptor module may manipulate dynamic host configuration protocol (DHCP) services to intercept network data. When the bot server 105 transmits network data that is flagged as suspicious or otherwise identified as containing a malware attack, the interceptor module may manipulate DHCP services to assign new IP addresses, associate the controller 125 MAC address with the IP address of the network device 115, or otherwise redirect network data from the bot server 105 to the controller 125.


In various embodiments, the interceptor module can manipulate the DHCP server to configure the bot server 105 with a gateway IP address which is the same as the controller's IP address to send all network data to the controller 125. In other embodiments, the interceptor module may perform DHCP services for the communication network 110 as a DHCP server.


In one example of ARP manipulation, the heuristic module 320 or the interceptor module scans the copied network data flagged as suspicious to identify a source IP address and a target IP address. In this example, the source IP address is the IP address of the bot server 105 and the target IP address is the IP address of the network device 115. In some embodiments, the interceptor module may send an ARP reply to the bot server 105. The ARP reply is configured to identify the MAC address of the controller 125 with the IP address of the network device 115. When the bot server 105 receives the ARP reply, the bot server 105 may begin to send network data intended for the destination device to the controller 125.
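
A hedged sketch of such an ARP reply is shown below using the third-party scapy library (scapy is not mentioned in the patent, and the addresses are placeholders). The forged reply associates the controller's MAC address with the network device's IP address so that the bot server sends subsequent traffic to the controller.

    # Illustrative only: forge an ARP reply so that traffic intended for the
    # network device (the target IP) is delivered to the controller instead.
    from scapy.all import ARP, send   # third-party library; an assumption here

    def redirect_bot_server(bot_server_ip, network_device_ip, controller_mac):
        reply = ARP(
            op=2,                     # 2 = ARP reply ("is-at")
            psrc=network_device_ip,   # claim to be the network device...
            hwsrc=controller_mac,     # ...but answer with the controller's MAC
            pdst=bot_server_ip,       # address the reply to the bot server
        )
        # A deployment would typically also set hwdst to the bot server's MAC
        # rather than allowing the reply to be broadcast.
        send(reply, verbose=False)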


In other embodiments, a policy within the policy engine 350 may indicate which IP addresses are bot servers 105. Whenever a bot server 105 sends network data for the first time to a network device 115, the bot server 105 may transmit an ARP request. The network data identifying the source IP address is copied by the tap 120 and the policy within the policy engine 350 can flag the source IP address as a bot server 105. Thereafter, the interceptor module may store the ARP request, and provide the controller 125 MAC address in an ARP reply to the switch and/or the bot server 105. Once the switch and/or the bot server 105 receives the controller 125 MAC address in the ARP reply, the IP address of the digital device (e.g., network device 115) will be associated with the controller 125 MAC address (e.g., in memory, storage, or cache). Network data intended for the network device 115 may then be transmitted from the bot server 105 to the controller 125.


The bot server 105 may send the network data to any number of digital devices. Before the attack can proceed, the bot server 105 may send a separate ARP request for the IP address of every other digital device the malware wishes to send data to. The controller 125 detects and responds to each ARP request by sending an ARP reply to each request with the controller 125 MAC address. The controller 125 MAC address may be associated with the IP address of the other digital devices on a table within the bot server 105, switch, and/or server (not depicted). The table may reside in memory or storage and may be buffered and/or cached. As a result, network data transmitted by the bot server 105 to multiple network devices 115 may be intercepted by the controller 125.


Once the network data is intercepted, the network data is re-routed to the virtual machine, as discussed herein. Because the network data is re-routed, the actual machine or the network device 115 for which the network data is intended may not receive the network data and is, as a result, unaffected. A plurality of the network data can be re-routed to more than one virtual machine at one time (e.g., in parallel.) Thus, if the network data intended for a plurality of the network devices 115 is flagged as suspicious, or as coming from the device that has previously been deemed suspicious (e.g., the bot server 105), the interceptor module can select a plurality of virtual machines on which to test the suspicious network data.


The policy engine 350 may scan network data to detect unauthorized activity (e.g., including some bot related activity) based upon an unauthorized activity signature. In some embodiments, the policy engine 350 retrieves the unauthorized activity signature from the signature module 345. The network data is then scanned for unauthorized activity based on the unauthorized activity signature. The policy engine 350 can also flag network data as suspicious based on policies, as discussed herein.


Although FIG. 3 depicts various modules comprising the controller 125, fewer or more modules can comprise the controller 125 and still fall within the scope of various embodiments.



FIG. 4 is a block diagram of an exemplary analysis environment 340, in accordance with some embodiments of the present invention. The analysis environment 340 comprises a replayer 405, a virtual switch 410, and a virtual machine 415. The replayer 405 is a module that receives network data that has been flagged by the heuristic module 320 and replays the network data in the analysis environment 340. In some embodiments, the replayer 405 mimics the behavior of the infected bot server 105 in transmitting the flagged network data. There can be any number of replayers 405 simulating the transmission of network data between nodes on the communication network (e.g., the bot server 105 and the network device 115). In a further embodiment, the replayer 405 dynamically modifies session variables, as is appropriate, to emulate a “live” client or server of the protocol sequence being replayed. In one example, dynamic variables that may be substituted include dynamically assigned ports, transaction IDs, and any other variable that is dynamic to each protocol session. In other embodiments, the network data received from the heuristic module 320 is transmitted to the virtual machine 415 without a replayer 405.


The virtual switch 410 is a module that is capable of forwarding packets of flagged network data to the virtual machine 415. The virtual switch 410 simulates the network device 115. The virtual switch 410 can route the data packets of the data flow to the correct ports of the virtual machine 415.


The virtual machine 415 is a representation of the network device 115 that can be provided to the analysis environment 340 by the scheduler 325. In one example, the scheduler 325 retrieves a virtual machine 415 from the virtual machine pool 335 and configures the virtual machine 415 to mimic the network device 115. The configured virtual machine 415 is then provided to the analysis environment 340 where it can receive flagged network data from the virtual switch 410.


As the analysis environment 340 simulates the transmission of the network data, behavior of the virtual machine 415 can be closely monitored for unauthorized activity. If the virtual machine 415 crashes, performs illegal operations, performs abnormally, or allows access of data to an unauthorized computer user, the analysis environment 340 can react.


In exemplary embodiments, virtual machines may be used to detect C&C channels and botnet infected systems using the C&C channels. C&C channel detection may occur in a replay virtual machine environment or in a direct entry virtual machine environment. While replay analysis of virtual machines may be leveraged to extract C&C channel information, this may not be possible for all infection protocols. For infection protocols that can be replayed to result in a full bot infection, this technique may yield positive results. For infection protocols that do not proceed to completion (for example, due to an inability to effectively replay unknown worm protocols), the replay environment may not result in a full infection of the virtual machine 415. This may prevent extraction of C&C channel information, which will only become evident post-infection. In those instances, the analysis environment 340 may flag the devices involved in the suspected C&C channel as possibly infected with a bot and continue to track the nodes that communicate with those devices that participate within the suspected C&C channel.


Passive replay virtual machine environments may be effective for C&C channel discovery, since a passive worm may introduce no new worm protocol. Instead, a passive worm may merely piggyback on an existing protocol. Therefore, the existing passive worm replay may be adequate to detect a full bot infection. Passive replay of, for example, web based exploits may be extended to result in full infection and extraction of C&C channel information. Direct entry virtual machine environments are effective in extracting C&C channel information, since there is no need to replay an unknown worm protocol.


In some embodiments, the analysis environment 340 performs dynamic taint analysis to identify unauthorized activity. For a malware attack to change the execution of an otherwise legitimate program, the malware attack may cause a value that is normally derived from a trusted source to be derived from the user's own input. Program values (e.g., jump addresses and format strings) are traditionally supplied by a trusted program and not from external untrusted inputs. Malware, however, may attempt to exploit the program by overwriting these values.


In one example of dynamic taint analysis, all input data from untrusted or otherwise unknown sources are flagged. Program execution of programs with flagged input data is then monitored to track how the flagged data propagates (i.e., what other data becomes tainted) and to check when the flagged data is used in dangerous ways. For example, use of tainted data as jump addresses or format strings often indicates an exploit of a vulnerability such as a buffer overrun or format string vulnerability.
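
A toy sketch of the taint-tracking idea follows (purely illustrative; real dynamic taint analysis instruments machine instructions rather than named Python values): inputs from untrusted sources are marked, taint propagates to derived values, and an alert is raised when a tainted value is used as a jump address or format string.

    class TaintTracker:
        """Toy taint tracker: values are identified by name, and taint
        propagates from sources to any value derived from them."""

        def __init__(self):
            self.tainted = set()

        def mark_untrusted(self, name):
            self.tainted.add(name)

        def propagate(self, dst, *sources):
            # dst becomes tainted if any of its inputs is tainted.
            if any(s in self.tainted for s in sources):
                self.tainted.add(dst)

        def check_use(self, name, use):
            # use: e.g. "jump address" or "format string"
            if name in self.tainted:
                raise RuntimeError(
                    "tainted value %r used as %s" % (name, use))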


In some embodiments, the analysis environment 340 monitors and analyzes the behavior of the virtual machine 415 in order to determine a specific type of malware or the presence of an illicit computer user. The analysis environment 340 can also generate computer code configured to eliminate new viruses, worms, bots, or other malware. In various embodiments, the analysis environment 340 can generate computer code configured to identify data within the network data indicative of a malware attack, repair damage performed by malware, or the illicit computer user. By simulating the transmission of suspicious network data and analyzing the response of the virtual machine, the analysis environment 340 can identify known and previously unidentified malware and the activities of illicit computer users before a computer system is damaged or compromised.


Once the virtual machine is infected, via either replay or direct entry, the environment can wait for an outbound domain name system (DNS) request. The requested name in the DNS request is likely a C&C channel. A pseudo-DNS server in the virtual machine environment can respond to this request with an IP address mapped to an internal-to-virtual machine-analysis pseudo-server. If an outbound IRC or web request is made to the supplied IP address, then this confirms the existence of the C&C channel.
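
The confirmation flow can be sketched as follows (illustrative only; DNS wire-format handling is omitted, and the function names, address, and port list are assumptions): every DNS query from the infected virtual machine is answered with the address of the internal pseudo-server, and a subsequent outbound IRC or web connection to that address confirms the C&C channel.

    PSEUDO_SERVER_IP = "10.0.0.250"     # internal-to-analysis pseudo-server
    C2_PORTS = {80, 443, 6667}          # web and IRC ports

    requested_names = {}                # pseudo-server IP -> DNS name requested

    def handle_dns_query(qname):
        """Answer any DNS query from the virtual machine with the
        pseudo-server address and remember the requested name."""
        requested_names[PSEUDO_SERVER_IP] = qname
        return PSEUDO_SERVER_IP

    def handle_outbound_connection(dst_ip, dst_port):
        """If the VM then opens an IRC or web connection to the supplied
        address, report the requested name as a confirmed C&C channel."""
        if dst_ip in requested_names and dst_port in C2_PORTS:
            return ("confirmed C&C channel", requested_names[dst_ip])
        return None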


In some embodiments, all outbound DNS requests may be logged in a circular buffer (not shown). Once a C&C channel DNS name is identified, a search may be performed on all entries in the buffer for other source IP addresses that have requested the same DNS name. These source IP addresses are now highly suspect to be infected with the same bot or malware family that infected the virtual machine, even though these other IP addresses may not have been acknowledged as propagating an infection.
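
A minimal sketch of this lookup (illustrative only; the buffer size is an assumption): outbound DNS requests are appended to a bounded, circular log, and once a C&C name is known the log is searched for every other source that asked for it.

    import collections

    dns_log = collections.deque(maxlen=100000)   # circular buffer of (src_ip, qname)

    def log_dns_request(src_ip, qname):
        dns_log.append((src_ip, qname))

    def other_requesters(c2_name, infected_vm_ip):
        """Return every other source IP that requested the identified C&C
        DNS name; these hosts become highly suspect."""
        return {src for src, name in dns_log
                if name == c2_name and src != infected_vm_ip}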


Once a C&C DNS name is discovered, the name may be communicated to all other devices as well as a cloud server. This allows other distributed devices to detect attempts to connect to the same C&C channel.



FIG. 5 is a flowchart 500 of an exemplary method for detecting a C&C channel of a bot. In step 505, the system (e.g., controller 125 and/or bot detector 200) determines if there is communication detected in the communication network 110. The determination may be performed, in accordance with some embodiments, by the tap 120, the bot detector 200, and/or the controller 125. If there is communication detected, then in step 510, data within the communication may be scanned. In one embodiment, the data may be copied. For example, the network data from the network device 115 to the bot server 105 may be copied by the tap 120. The network data is then sent from the tap 120 to the controller 125 for analysis. In an alternative embodiment, the data may be scanned directly by, for example, the bot detector 200.


In step 515, a bot communication analysis is performed. As discussed herein, the bot detector 200 or the controller 125 may utilize various modules (e.g., protocol fingerprint module 305, heuristic module 320, and analysis environment 340) to determine whether the copied network data contains a possible bot communication or may otherwise be potentially harmful to the network device 115 for which the copied network data may be intended. Subsequently, suspicious nodes can be flagged. If the controller 125 or bot detector 200 does not identify the copied network data as possibly containing a bot communication, the network data may be transmitted to the intended destination (e.g., network device 115).


As described herein, several different methods may be utilized to detect a C&C communication within a channel on the communication network 110. In one embodiment, a fingerprint module 205 or 305 may scan for a bot oriented command in an IRC channel, including IRC channel establishment commands or messages.


In an alternative embodiment, a port module 215 may monitor the use of non-standard ports. During base-lining a list of standard ports is compiled. Communications not originating from a standard port may be considered non-standard or an anomaly. As such, associated nodes may be flagged (e.g., color coded) and tracked.


In a further embodiment, a virtual machine may be utilized to detect C&C communication channels. The C&C communication channel detection may occur in a replay virtual machine environment or in a direct entry virtual machine environment. Accordingly, a virtual machine is retrieved which is used to mimic the network device 115. Intercepted or replayed network data obtained from the communication channel is transmitted to the virtual machine. The virtual machine response is then analyzed to determine if the virtual machine is infected. In some embodiments, the analysis environment 340 may wait for an outbound domain name system (DNS) request, which likely identifies the C&C channel. A pseudo-DNS server in the virtual machine can respond to the request with an IP address mapped to an internal-to-virtual machine-analysis pseudo-server. If the outbound IRC or web request is made to the supplied IP address, then this confirms a C&C channel.


If a suspected bot communication is detected in step 520, then a recovery process may be performed in step 525. In one embodiment, the associated devices may be flagged and/or the appropriate users and administrators notified. For example, any nodes that are not associated with any infection propagation may be placed in a yellow category. Nodes in an IRC or P2P network where at least one of the nodes (e.g., in a chain of nodes) is observed propagating an infection may be placed in, for example, an orange category. Nodes that are observed to be actively propagating an infection may be placed in a red category. Any nodes that have not been categorized as yellow, orange, or red may be assigned to a green category.
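One way to express this color categorization is sketched below, assuming the per-node observations (flagged nodes, propagating nodes, and IRC/P2P chains) have already been collected by the modules described above; the input collections and function name are assumptions of the example.

```python
def categorize_node(node, flagged_nodes, propagating_nodes, irc_or_p2p_chains):
    """Assign a node to a color category based on observed behavior."""
    if node in propagating_nodes:
        return "red"        # observed actively propagating an infection
    for chain in irc_or_p2p_chains:
        # Orange: member of an IRC/P2P chain in which at least one node has
        # been observed propagating an infection.
        if node in chain and any(n in propagating_nodes for n in chain):
            return "orange"
    if node in flagged_nodes:
        return "yellow"     # flagged, but not associated with infection propagation
    return "green"          # not categorized as yellow, orange, or red
```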


In another embodiment, a router (i.e., switch) may be configured to direct all data received from the bot server 105 (e.g., from the source IP address of the bot server 105) to the controller 125. As a result, all the network data from the bot server 105, not only that which is transmitted to the network device 115, may be intercepted.
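A schematic of that redirect decision is shown below in the same illustrative style, rather than in any particular router's configuration syntax; the packet fields and parameter names are assumptions of the example.

```python
def next_hop(packet, bot_server_ips, controller_address):
    """Steer any traffic sourced from a known bot server to the controller."""
    if packet.source_ip in bot_server_ips:
        # Intercept everything the bot server sends, not only the traffic
        # addressed to the originally targeted network device.
        return controller_address
    return packet.original_destination
```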



FIG. 6 is a block diagram of the controller 125, in accordance with one embodiment of the present invention. The controller 125 comprises a processor 600, a memory system 605, a storage system 610, an I/O interface 615, and a communication network interface 620 which are all coupled to a system bus 625. The processor 600 is configured to execute executable instructions. In some embodiments, the processor 600 comprises circuitry or any one or more processors capable of processing the executable instructions.


The memory system 605 is any memory configured to store data. Some examples of the memory system 605 include storage devices, such as RAM or ROM.


The storage system 610 is any storage configured to retrieve and store data (e.g., a computer readable medium). Some examples of the storage system 610 are flash drives, hard drives, optical drives, and/or magnetic tape. The storage system 610 can comprise a database or other data structure configured to hold and organize data (e.g., network data, copies of network data, buffered data). In some embodiments, the controller 125 includes memory in the form of RAM and storage in the form of flash memory. The memory system 605 and/or the storage system 610 can comprise cache and buffers configured to retain network data or copies of network data.


The input/output (I/O) interface 615 is any device that can receive input and provide output to a user. The I/O interface 615 can be, but is not limited to, a keyboard, a mouse, a touchscreen, a keypad, a biosensor, or a floppy disk drive.


The communication network interface 620 can be coupled to any user device via the links 630 and 635. The communication network interface 620 may support communication over a USB connection, a FireWire connection, an Ethernet connection, a serial connection, a parallel connection, or an ATA connection. The communication network interface 620 may also support wireless communication (e.g., 802.11 a/b/g/n or wireless USB). It will be apparent to those skilled in the art that the communication network interface 620 can support many wired and wireless standards.


Although only two links (630 and 635) are depicted in FIG. 6, there may be any number of links. In various embodiments, there may be one link 630 used by the tap 120 to transparently copy network data from the communication network 110. The other links may be used by the controller 125 to intercept data from one or more bot servers 105 in parallel. In one example, the controller 125 comprises multiple IP addresses that may be broadcast from different links. Network data may be intercepted from different infected devices 105 over different links.


The above-described modules can be comprised of instructions that are stored on storage media (e.g., computer readable media). The instructions can be retrieved and executed by a processor (e.g., the processor 600). Some examples of instructions include software, program code, and firmware. Some examples of storage media comprise memory devices and integrated circuits. The instructions are operational when executed by the processor to direct the processor to operate in accordance with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage media.


While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments.

Claims
  • 1. A method for detecting a communication channel of a bot in a network, comprising: analyzing a portion of network data being transmitted over the network; configuring a module within a controller to determine a communication protocol being used in a transmission of the network data over a communication channel; responsive to detecting the communication channel using the communication protocol, processing at least the portion of the network data within a first virtual machine to determine whether a bot communication exists by at least determining whether the portion of the network data includes a plurality of commands in a particular sequence that, according to the determined communication protocol, tend to be associated with the bot communication; and performing a recovery process when the bot communication is detected, the recovery process including determining one or more network devices that participated in communications using the communication channel operating as a command and control communication channel.
  • 2. The method of claim 1 further comprising detecting a presence of the communication channel in the network between a first network device and a second network device of the one or more network devices.
  • 3. The method of claim 1, wherein the processing of at least the portion of the network data within the first virtual machine to determine whether the bot communication exists further includes determining that the portion of the network data originated from a non-standard port.
  • 4. The method of claim 3, wherein prior to the processing of at least the portion of the network data within a first virtual machine, the method includes determining whether the portion of network data associated with the communication channel is associated with the suspicious activity, the suspicious activity including bot related activities that comprise at least one of: installing a copy of the bot on a third network device, establishing a command and control communication channel with the third network device, receiving one or more instructions from a bot-infected network device, transmitting a ping message to the third network device in a denial-of-service attack, receiving instructions to act as a keylogger, or transmitting personal information or email addresses over the network.
  • 5. The method of claim 4, wherein the third network device is different than both of the first network device and the second network device.
  • 6. The method of claim 1, wherein the processing of at least the portion of the network data includes analyzing content of at least the portion of the network data for a command of the plurality of commands directed to establishment of the communication channel being an Internet Relay Chat (IRC) channel.
  • 7. The method of claim 1, wherein the plurality of commands includes a first command to establish the communication channel followed by a scan command.
  • 8. The method of claim 7, wherein the first command establishes the communication channel being an Internet Relay Chat (IRC) channel.
  • 9. The method of claim 1, wherein the performing of the recovery process further includes identifying the one or more network devices that participated in communications using the communication channel operating as the command and control communication channel.
  • 10. The method of claim 9, wherein the identifying of the one or more network devices comprises adjusting a color of an icon associated with the one or more network devices.
  • 11. The method of claim 1, wherein the determining whether the portion of the network data includes the plurality of commands in the particular sequence comprises scanning for commands that indicate a particular protocol associated with a command and control (C&C) channel is being established.
  • 12. The method of claim 11, wherein the plurality of commands in the particular sequence include a JOIN command followed by a SCAN command.
  • 13. The method of claim 1, wherein the plurality of commands tend to establish a command and control (C&C) communication channel.
  • 14. A controller comprising: one or more processors; and a storage device communicatively coupled to the one or more processors, the storage device including: a first software module that, when executed by the one or more processors, to detect network data transmitted between a first network device and a second network device over a network, a second software module that, when executed by the one or more processors, to (i) determine a communication protocol being used in a transmission of the network data over a communication channel of the network, and (ii) scan at least a portion of the network data for a plurality of commands in a particular sequence that, according to the determined communication protocol, tend to be associated with a bot communication, the scan including analyzing content of at least the portion of the network data for bot related activities including analyzing for the plurality of commands, including a scan command utilized by a bot to gather information from a targeted source and transfer to a third network device over the communication channel; a third software module including a plurality of virtual machines, wherein each of the plurality of virtual machines is in communication with the second software module, wherein at least a first virtual machine of the plurality of virtual machines (1) receives at least the portion of the network data, and (2) processes at least the portion of the network data to detect a bot related activity, and wherein responsive to detection of the bot related activity, the third software module generates an activity signature based on at least the detected bot related activity, and a signature module to store generated activity signatures.
  • 15. The controller of claim 14, wherein the second software module is further configured to analyze whether the portion of the network data originated from a non-standard port.
  • 16. The controller of claim 14, wherein the second software module, when executed by the one or more processors, is further configured to scan at least the portion of the network data for suspicious activity including one or more bot related activities that comprise at least one of: installing a copy of the bot on the third network device, establishing a command and control communication channel with the third network device, transmitting a ping message to the third network device in a denial-of-service attack, or receiving instructions to act as a keylogger.
  • 17. The controller of claim 14, wherein the third network device is different than both of the first network device and the second network device.
  • 18. The controller of claim 14, wherein the scan by the second software module includes analyzing content of at least the portion of the network data for a JOIN command directed to establishing the communication channel between the first network device and the second network device via the network.
  • 19. The controller of claim 14, further comprising: a heuristic module that identifies a bot server, the first network device or the second network device by analyzing one or more data packets of the network data.
  • 20. The controller of claim 19, wherein the first virtual machine is selected by the third software module based on one or more characteristics of one of the first network device or the second network based on identification by the heuristic module.
  • 21. The controller of claim 14, further comprising: a fourth software module that, responsive to detection of the suspicious activities during scanning of at least the portion of the network data, performs a recovery process, wherein at least one of (i) the first network device, or (ii) the second network device are flagged as being associated with the suspicious activities.
  • 22. The controller of claim 14, wherein the second software module scans at least the portion of the network data for the plurality of commands in the particular sequence by at least scanning for commands that indicate a particular protocol associated with a command and control (C&C) communication channel is being established.
  • 23. The controller of claim 14, wherein the plurality of commands in the particular sequence includes a JOIN command followed by a SCAN command.
  • 24. A controller comprising: one or more processors; and a storage system communicatively coupled with the one or more processors, the storage system includes a bot detection logic that, when executed by the one or more processors: (i) analyzes a portion of network data that permits control, via a network, of a first network device without authorization by a user of the first network device, (ii) configures a module within a controller that determines a communication protocol being used in a transmission of the network data over a communication channel, (iii) provides at least a portion of network data associated with the communication channel to a first virtual machine, and (iv) analyzes operations of the first virtual machine based on processing of at least the portion of the network data including (a) a plurality of commands in a particular sequence that, according to the communication protocol as determined by the controller, are part of the portion of the network data and tend to be associated with a bot communication and (b) the network data originated from a non-standard port.
  • 25. The controller of claim 24, wherein the bot detection logic, when executed by the one or more processors, further: (v) generates and stores an activity signature based on analysis by the first virtual machine for use in subsequent analyses.
  • 26. The controller of claim 24, wherein detection of the communication channel that permits control of the first network device without authorization by the user of the network device includes detection of at least one or more bot related activities.
  • 27. The controller of claim 26, wherein the bot related activities include one or more activities from among a plurality of activities including: (i) bot propagation, (ii) initiating a malware attack on the first network device, (iii) installing a copy of the bot on the first network device, (iv) establishing a command and control (C&C) communication channel with the first network device, (v) receiving one or more instructions from a bot-infected network device, (vi) transmitting a ping message to the first network device in a denial-of-service attack, (vii) transmitting spam across the communication channel, (viii) receiving instructions to act as a keylogger, (ix) recording keystrokes, (x) receiving instructions to search the first network device for personal information or email addresses, and (xi) transmitting personal information or email addresses over the network.
  • 28. The controller of claim 24, wherein the bot detection logic analyzes the operations of the first virtual machine by at least analyzing content of at least the portion of the network data for a command of the plurality of commands directed to establishing the communication channel being an Internet Relay Chat (IRC) channel.
  • 29. The controller of claim 24, wherein the bot detection logic, when executed by the one or more processors, identifies a bot server or the first network device by analyzing one or more data packets of the network data.
  • 30. The controller of claim 29, wherein the first virtual machine is selected based on one or more characteristics of the first network device based on identification via analysis of the one or more data packets of the network data.
  • 31. The controller of claim 24, wherein the bot detection logic, when executed by the one or more processors and responsive to detection of the communication channel that permits control, via a network, of the first network device without authorization by the user of the first network device, performs a recovery process, wherein the first network device is flagged as being associated with bot related activities.
  • 32. The controller of claim 24, wherein the bot detection logic analyzes operations of the first virtual machine based on processing of at least the portion of the network data including the plurality of commands in the particular sequence by at least scanning for commands that indicate a particular protocol associated with a command and control (C&C) communication channel is being established.
  • 33. The controller of claim 24, wherein the plurality of commands in the particular sequence includes a JOIN command followed by a SCAN command.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/052,632 filed Oct. 11, 2013, now U.S. Pat. No. 9,628,498 issued on Apr. 18, 2017, which is a continuation of U.S. patent application Ser. No. 11/998,605, filed Nov. 30, 2007, now U.S. Pat. No. 8,561,177, which claims benefit of provisional patent application No. 60/868,323, filed Dec. 1, 2006, entitled, “Detecting Command & Control Communication Channels of Botnets”, and is a continuation-in-part of U.S. patent application Ser. No. 11/494,990, filed Jul. 28, 2006, entitled “Dynamic Signature Creation and Enforcement”, which is a continuation-in-part of U.S. patent application Ser. No. 11/471,072, filed Jun. 19, 2006, entitled “Virtual Machine with Dynamic Data Flow Analysis”, which is a continuation-in-part of U.S. patent application Ser. No. 11/409,355, filed Apr. 20, 2006, entitled “Heuristic Based Capture with Replay to Virtual Machine”, which is a continuation-in-part of U.S. patent application Ser. No. 11/096,287, filed Mar. 31, 2005, entitled “System and Method of Detecting Computer Worms”, and is a continuation-in-part of U.S. patent application Ser. No. 11/151,812, filed Jun. 13, 2005, entitled “System and Method of Containing Computer Worms,” and is a continuation-in-part of U.S. patent application Ser. No. 11/152,286, filed Jun. 13, 2005, entitled “Computer Worm Defense System and Method”; U.S. patent application Ser. No. 11/096,287 claims the benefit of U.S. Provisional Application No. 60/559,198 filed on Apr. 1, 2004, U.S. patent application Ser. No. 11/151,812 claims the benefit of U.S. Provisional Application No. 60/579,953 filed on Jun. 14, 2004, and U.S. patent application Ser. No. 11/152,286 claims the benefit of U.S. Provisional Application No. 60/579,910 filed on Jun. 14, 2004, all of which are incorporated by reference herein. This application is related to U.S. patent application Ser. No. 11/998,750, filed on Nov. 30, 2007, and entitled “Systems and Methods for Detecting Encrypted Bot Command & Control Channels.” The above-referenced related patent application is also incorporated by reference herein.

US Referenced Citations (724)
Number Name Date Kind
4292580 Ott et al. Sep 1981 A
5175732 Hendel et al. Dec 1992 A
5319776 Hile et al. Jun 1994 A
5440723 Arnold et al. Aug 1995 A
5490249 Miller Feb 1996 A
5657473 Killean et al. Aug 1997 A
5802277 Cowlard Sep 1998 A
5842002 Schnurer et al. Nov 1998 A
5960170 Chen et al. Sep 1999 A
5978917 Chi Nov 1999 A
5983348 Ji Nov 1999 A
6088803 Tso et al. Jul 2000 A
6092194 Touboul Jul 2000 A
6094677 Capek et al. Jul 2000 A
6108799 Boulay et al. Aug 2000 A
6118382 Hibbs et al. Sep 2000 A
6154844 Touboul et al. Nov 2000 A
6269330 Cidon et al. Jul 2001 B1
6272641 Ji Aug 2001 B1
6279113 Vaidya Aug 2001 B1
6298445 Shostack et al. Oct 2001 B1
6357008 Nachenberg Mar 2002 B1
6417774 Hibbs et al. Jul 2002 B1
6424627 Sørhaug et al. Jul 2002 B1
6442696 Wray et al. Aug 2002 B1
6484315 Ziese Nov 2002 B1
6487666 Shanklin et al. Nov 2002 B1
6493756 O'Brien et al. Dec 2002 B1
6550012 Villa et al. Apr 2003 B1
6700497 Hibbs et al. Mar 2004 B2
6775657 Baker Aug 2004 B1
6831893 Ben Nun et al. Dec 2004 B1
6832367 Choi et al. Dec 2004 B1
6895550 Kanchirayappa et al. May 2005 B2
6898632 Gordy et al. May 2005 B2
6907396 Muttik et al. Jun 2005 B1
6941348 Petry et al. Sep 2005 B2
6971097 Wallman Nov 2005 B1
6981279 Arnold et al. Dec 2005 B1
6995665 Appelt et al. Feb 2006 B2
7007107 Ivchenko et al. Feb 2006 B1
7028179 Anderson et al. Apr 2006 B2
7043757 Hoefelmeyer et al. May 2006 B2
7058822 Edery et al. Jun 2006 B2
7069316 Gryaznov Jun 2006 B1
7080407 Zhao et al. Jul 2006 B1
7080408 Pak et al. Jul 2006 B1
7093002 Wolff et al. Aug 2006 B2
7093239 van der Made Aug 2006 B1
7096498 Judge Aug 2006 B2
7100201 Izatt Aug 2006 B2
7107617 Hursey et al. Sep 2006 B2
7159149 Spiegel et al. Jan 2007 B2
7213260 Judge May 2007 B2
7231667 Jordan Jun 2007 B2
7240364 Branscomb et al. Jul 2007 B1
7240368 Roesch et al. Jul 2007 B1
7243371 Kasper et al. Jul 2007 B1
7249175 Donaldson Jul 2007 B1
7287278 Liang Oct 2007 B2
7308716 Danford et al. Dec 2007 B2
7328453 Merkle, Jr. et al. Feb 2008 B2
7346486 Ivancic et al. Mar 2008 B2
7356736 Natvig Apr 2008 B2
7386888 Liang et al. Jun 2008 B2
7392542 Bucher Jun 2008 B2
7418729 Szor Aug 2008 B2
7428300 Drew et al. Sep 2008 B1
7441272 Durham et al. Oct 2008 B2
7448084 Apap et al. Nov 2008 B1
7458098 Judge et al. Nov 2008 B2
7464404 Carpenter et al. Dec 2008 B2
7464407 Nakae et al. Dec 2008 B2
7467408 O'Toole, Jr. Dec 2008 B1
7478428 Thomlinson Jan 2009 B1
7480773 Reed Jan 2009 B1
7487543 Arnold et al. Feb 2009 B2
7496960 Chen et al. Feb 2009 B1
7496961 Zimmer et al. Feb 2009 B2
7519990 Xie Apr 2009 B1
7523493 Liang et al. Apr 2009 B2
7530104 Thrower et al. May 2009 B1
7540025 Tzadikario May 2009 B2
7546638 Anderson et al. Jun 2009 B2
7565550 Liang et al. Jul 2009 B2
7568233 Szor et al. Jul 2009 B1
7584455 Ball Sep 2009 B2
7603715 Costa et al. Oct 2009 B2
7607171 Marsden et al. Oct 2009 B1
7639714 Stolfo et al. Dec 2009 B2
7644441 Schmid et al. Jan 2010 B2
7657419 van der Made Feb 2010 B2
7676841 Sobchuk Mar 2010 B2
7698548 Shelest et al. Apr 2010 B2
7707633 Danford et al. Apr 2010 B2
7712136 Sprosts et al. May 2010 B2
7730011 Deninger et al. Jun 2010 B1
7739740 Nachenberg et al. Jun 2010 B1
7779463 Stolfo et al. Aug 2010 B2
7784097 Stolfo et al. Aug 2010 B1
7832008 Kraemer Nov 2010 B1
7836502 Zhao et al. Nov 2010 B1
7849506 Dansey et al. Dec 2010 B1
7854007 Sprosts et al. Dec 2010 B2
7869073 Oshima Jan 2011 B2
7877803 Enstone et al. Jan 2011 B2
7904959 Sidiroglou et al. Mar 2011 B2
7908660 Bahl Mar 2011 B2
7930738 Petersen Apr 2011 B1
7937387 Frazier et al. May 2011 B2
7937761 Bennett May 2011 B1
7949849 Lowe et al. May 2011 B2
7996556 Raghavan et al. Aug 2011 B2
7996836 McCorkendale et al. Aug 2011 B1
7996904 Chiueh et al. Aug 2011 B1
7996905 Arnold et al. Aug 2011 B2
8006305 Aziz Aug 2011 B2
8010667 Zhang et al. Aug 2011 B2
8020206 Hubbard et al. Sep 2011 B2
8028338 Schneider et al. Sep 2011 B1
8042184 Batenin Oct 2011 B1
8045094 Teragawa Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8069484 McMillan et al. Nov 2011 B2
8087086 Lai et al. Dec 2011 B1
8171553 Aziz et al. May 2012 B2
8176049 Deninger et al. May 2012 B2
8176480 Spertus May 2012 B1
8201246 Wu et al. Jun 2012 B1
8204984 Aziz et al. Jun 2012 B1
8214905 Doukhvalov et al. Jul 2012 B1
8220055 Kennedy Jul 2012 B1
8225288 Miller et al. Jul 2012 B2
8225373 Kraemer Jul 2012 B2
8233882 Rogel Jul 2012 B2
8234640 Fitzgerald et al. Jul 2012 B1
8234709 Viljoen et al. Jul 2012 B2
8239944 Nachenberg et al. Aug 2012 B1
8260914 Ranjan Sep 2012 B1
8266091 Gubin et al. Sep 2012 B1
8286251 Eker et al. Oct 2012 B2
8291499 Aziz et al. Oct 2012 B2
8307435 Mann et al. Nov 2012 B1
8307443 Wang et al. Nov 2012 B2
8312545 Tuvell et al. Nov 2012 B2
8321936 Green et al. Nov 2012 B1
8321941 Tuvell et al. Nov 2012 B2
8332571 Edwards, Sr. Dec 2012 B1
8365286 Poston Jan 2013 B2
8365297 Parshin et al. Jan 2013 B1
8370938 Daswani et al. Feb 2013 B1
8370939 Zaitsev et al. Feb 2013 B2
8375444 Aziz et al. Feb 2013 B2
8381299 Stolfo et al. Feb 2013 B2
8402529 Green et al. Mar 2013 B1
8464340 Ahn et al. Jun 2013 B2
8479174 Chiriac Jul 2013 B2
8479276 Vaystikh et al. Jul 2013 B1
8479291 Bodke Jul 2013 B1
8490190 Hernacki Jul 2013 B1
8510827 Leake et al. Aug 2013 B1
8510828 Guo et al. Aug 2013 B1
8510842 Amit et al. Aug 2013 B2
8516478 Edwards et al. Aug 2013 B1
8516590 Ranadive et al. Aug 2013 B1
8516593 Aziz Aug 2013 B2
8522348 Chen et al. Aug 2013 B2
8528086 Aziz Sep 2013 B1
8533824 Hutton et al. Sep 2013 B2
8539582 Aziz et al. Sep 2013 B1
8549638 Aziz Oct 2013 B2
8555391 Demir et al. Oct 2013 B1
8561177 Aziz et al. Oct 2013 B1
8566476 Shiffer et al. Oct 2013 B2
8566928 Dagon Oct 2013 B2
8566946 Aziz et al. Oct 2013 B1
8584094 Dadhia et al. Nov 2013 B2
8584234 Sobel et al. Nov 2013 B1
8584239 Aziz et al. Nov 2013 B2
8595834 Xie et al. Nov 2013 B2
8627476 Satish et al. Jan 2014 B1
8635696 Aziz Jan 2014 B1
8682054 Xue et al. Mar 2014 B2
8682812 Ranjan Mar 2014 B1
8689333 Aziz Apr 2014 B2
8695096 Zhang Apr 2014 B1
8713631 Pavlyushchik Apr 2014 B1
8713681 Silberman et al. Apr 2014 B2
8726392 McCorkendale et al. May 2014 B1
8739280 Chess et al. May 2014 B2
8776229 Aziz Jul 2014 B1
8782792 Bodke Jul 2014 B1
8789172 Stolfo et al. Jul 2014 B2
8789178 Kejriwal et al. Jul 2014 B2
8793278 Frazier et al. Jul 2014 B2
8793787 Ismael et al. Jul 2014 B2
8805947 Kuzkin et al. Aug 2014 B1
8806647 Daswani et al. Aug 2014 B1
8832829 Manni et al. Sep 2014 B2
8850570 Ramzan Sep 2014 B1
8850571 Staniford et al. Sep 2014 B2
8881234 Narasimhan et al. Nov 2014 B2
8881271 Butler, II Nov 2014 B2
8881282 Aziz et al. Nov 2014 B1
8898788 Aziz et al. Nov 2014 B1
8935779 Manni et al. Jan 2015 B2
8949257 Shiffer et al. Feb 2015 B2
8984638 Aziz et al. Mar 2015 B1
8990939 Staniford et al. Mar 2015 B2
8990944 Singh et al. Mar 2015 B1
8997219 Staniford et al. Mar 2015 B2
9009822 Ismael et al. Apr 2015 B1
9009823 Ismael et al. Apr 2015 B1
9027135 Aziz May 2015 B1
9071638 Aziz et al. Jun 2015 B1
9104867 Thioux et al. Aug 2015 B1
9106630 Frazier et al. Aug 2015 B2
9106694 Aziz et al. Aug 2015 B2
9118715 Staniford et al. Aug 2015 B2
9159035 Ismael et al. Oct 2015 B1
9171160 Vincent et al. Oct 2015 B2
9176843 Ismael et al. Nov 2015 B1
9189627 Islam Nov 2015 B1
9195829 Goradia et al. Nov 2015 B1
9197664 Aziz et al. Nov 2015 B1
9223972 Vincent et al. Dec 2015 B1
9225740 Ismael et al. Dec 2015 B1
9241010 Bennett et al. Jan 2016 B1
9251343 Vincent et al. Feb 2016 B1
9262635 Paithane et al. Feb 2016 B2
9268936 Butler Feb 2016 B2
9275229 LeMasters Mar 2016 B2
9282109 Aziz et al. Mar 2016 B1
9292686 Ismael et al. Mar 2016 B2
9294501 Mesdaq et al. Mar 2016 B2
9300686 Pidathala et al. Mar 2016 B2
9306960 Aziz Apr 2016 B1
9306974 Aziz et al. Apr 2016 B1
9311479 Manni et al. Apr 2016 B1
9355247 Thioux et al. May 2016 B1
9356944 Aziz May 2016 B1
9363280 Rivlin et al. Jun 2016 B1
9367681 Ismael et al. Jun 2016 B1
9398028 Karandikar et al. Jul 2016 B1
9413781 Cunningham et al. Aug 2016 B2
9426071 Caldejon et al. Aug 2016 B1
9430646 Mushtaq et al. Aug 2016 B1
9432389 Khalid et al. Aug 2016 B1
9438613 Paithane et al. Sep 2016 B1
9438622 Staniford et al. Sep 2016 B1
9438623 Thioux et al. Sep 2016 B1
9459901 Jung et al. Oct 2016 B2
9467460 Otvagin et al. Oct 2016 B1
9483644 Paithane et al. Nov 2016 B1
9495180 Ismael Nov 2016 B2
9497213 Thompson et al. Nov 2016 B2
9507935 Ismael et al. Nov 2016 B2
9516057 Aziz Dec 2016 B2
9519782 Aziz et al. Dec 2016 B2
9536091 Paithane et al. Jan 2017 B2
9537972 Edwards et al. Jan 2017 B1
9560059 Islam Jan 2017 B1
9565202 Kindlund et al. Feb 2017 B1
9591015 Amin et al. Mar 2017 B1
9591020 Aziz Mar 2017 B1
9594904 Jain et al. Mar 2017 B1
9594905 Ismael et al. Mar 2017 B1
9594912 Thioux et al. Mar 2017 B1
9609007 Rivlin et al. Mar 2017 B1
9626509 Khalid et al. Apr 2017 B1
9628498 Aziz et al. Apr 2017 B1
9628507 Haq et al. Apr 2017 B2
9633134 Ross Apr 2017 B2
9635039 Islam et al. Apr 2017 B1
9641546 Manni et al. May 2017 B1
9654485 Neumann May 2017 B1
9661009 Karandikar et al. May 2017 B1
9661018 Aziz May 2017 B1
9674298 Edwards et al. Jun 2017 B1
9680862 Ismael et al. Jun 2017 B2
9690606 Ha et al. Jun 2017 B1
9690933 Singh et al. Jun 2017 B1
9690935 Shiffer et al. Jun 2017 B2
9690936 Malik et al. Jun 2017 B1
9736179 Ismael Aug 2017 B2
9740857 Ismael et al. Aug 2017 B2
9747446 Pidathala et al. Aug 2017 B1
9756074 Aziz et al. Sep 2017 B2
9773112 Rathor et al. Sep 2017 B1
9781144 Otvagin et al. Oct 2017 B1
9787700 Amin et al. Oct 2017 B1
9787706 Otvagin et al. Oct 2017 B1
9792196 Ismael et al. Oct 2017 B1
9824209 Ismael et al. Nov 2017 B1
9824211 Wilson Nov 2017 B2
9824216 Khalid et al. Nov 2017 B1
9825976 Gomez et al. Nov 2017 B1
9825989 Mehra et al. Nov 2017 B1
9838408 Karandikar et al. Dec 2017 B1
9838411 Aziz Dec 2017 B1
9838416 Aziz Dec 2017 B1
9838417 Khalid et al. Dec 2017 B1
9846776 Paithane et al. Dec 2017 B1
9876701 Caldejon et al. Jan 2018 B1
9888016 Amin et al. Feb 2018 B1
9888019 Pidathala et al. Feb 2018 B1
9910988 Vincent et al. Mar 2018 B1
9912644 Cunningham Mar 2018 B2
9912681 Ismael et al. Mar 2018 B1
9912684 Aziz et al. Mar 2018 B1
9912691 Mesdaq et al. Mar 2018 B2
9912698 Thioux et al. Mar 2018 B1
9916440 Paithane et al. Mar 2018 B1
9921978 Chan et al. Mar 2018 B1
9934376 Ismael Apr 2018 B1
9934381 Kindlund et al. Apr 2018 B1
9946568 Ismael et al. Apr 2018 B1
9954890 Staniford et al. Apr 2018 B1
9973531 Thioux May 2018 B1
10002252 Ismael et al. Jun 2018 B2
10019338 Goradia et al. Jul 2018 B1
10019573 Silberman et al. Jul 2018 B2
10025691 Ismael et al. Jul 2018 B1
10025927 Khalid et al. Jul 2018 B1
10027689 Rathor et al. Jul 2018 B1
10027690 Aziz et al. Jul 2018 B2
10027696 Rivlin et al. Jul 2018 B1
10033747 Paithane et al. Jul 2018 B1
10033748 Cunningham et al. Jul 2018 B1
10033753 Islam et al. Jul 2018 B1
10033759 Kabra et al. Jul 2018 B1
10050998 Singh Aug 2018 B1
10068091 Aziz et al. Sep 2018 B1
10075455 Zafar et al. Sep 2018 B2
10083302 Paithane et al. Sep 2018 B1
10084813 Eyada Sep 2018 B2
10089461 Ha et al. Oct 2018 B1
10097573 Aziz Oct 2018 B1
10104102 Neumann Oct 2018 B1
10108446 Steinberg et al. Oct 2018 B1
10121000 Rivlin et al. Nov 2018 B1
10122746 Manni et al. Nov 2018 B1
10133863 Bu et al. Nov 2018 B2
10133866 Kumar et al. Nov 2018 B1
10146810 Shiffer et al. Dec 2018 B2
10148693 Singh et al. Dec 2018 B2
10165000 Aziz et al. Dec 2018 B1
10169585 Pilipenko et al. Jan 2019 B1
10176321 Abbasi et al. Jan 2019 B2
10181029 Ismael et al. Jan 2019 B1
10191861 Steinberg et al. Jan 2019 B1
10192052 Singh et al. Jan 2019 B1
10198574 Thioux et al. Feb 2019 B1
10200384 Mushtaq et al. Feb 2019 B1
10210329 Malik et al. Feb 2019 B1
10216927 Steinberg Feb 2019 B1
10218740 Mesdaq et al. Feb 2019 B1
10242185 Goradia Mar 2019 B1
20010005889 Albrecht Jun 2001 A1
20010047326 Broadbent et al. Nov 2001 A1
20020018903 Kokubo et al. Feb 2002 A1
20020038430 Edwards et al. Mar 2002 A1
20020091819 Melchione et al. Jul 2002 A1
20020095607 Lin-Hendel Jul 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020144156 Copeland Oct 2002 A1
20020162015 Tang Oct 2002 A1
20020166063 Lachman et al. Nov 2002 A1
20020169952 DiSanto et al. Nov 2002 A1
20020184528 Shevenell et al. Dec 2002 A1
20020188887 Largman et al. Dec 2002 A1
20020194490 Halperin et al. Dec 2002 A1
20030021728 Sharpe et al. Jan 2003 A1
20030074578 Ford et al. Apr 2003 A1
20030084318 Schertz May 2003 A1
20030101381 Mateev et al. May 2003 A1
20030115483 Liang Jun 2003 A1
20030188190 Aaron et al. Oct 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030200460 Morota et al. Oct 2003 A1
20030212902 van der Made Nov 2003 A1
20030229801 Kouznetsov et al. Dec 2003 A1
20030237000 Denton et al. Dec 2003 A1
20040003323 Bennett et al. Jan 2004 A1
20040006473 Mills et al. Jan 2004 A1
20040015712 Szor Jan 2004 A1
20040019832 Arnold et al. Jan 2004 A1
20040047356 Bauer Mar 2004 A1
20040083408 Spiegel Apr 2004 A1
20040088581 Brawn et al. May 2004 A1
20040093513 Cantrell et al. May 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040117478 Triulzi et al. Jun 2004 A1
20040117624 Brandt et al. Jun 2004 A1
20040128355 Chao et al. Jul 2004 A1
20040165588 Pandya Aug 2004 A1
20040236963 Danford et al. Nov 2004 A1
20040243349 Greifeneder et al. Dec 2004 A1
20040249911 Alkhatib et al. Dec 2004 A1
20040255161 Cavanaugh Dec 2004 A1
20040268147 Wiederin et al. Dec 2004 A1
20050005159 Oliphant Jan 2005 A1
20050021740 Bar et al. Jan 2005 A1
20050033960 Vialen et al. Feb 2005 A1
20050033989 Poletto et al. Feb 2005 A1
20050043548 Cates Feb 2005 A1
20050050148 Mohammadioun et al. Mar 2005 A1
20050086523 Zimmer et al. Apr 2005 A1
20050091513 Mitomo et al. Apr 2005 A1
20050091533 Omote et al. Apr 2005 A1
20050091652 Ross et al. Apr 2005 A1
20050108562 Khazan et al. May 2005 A1
20050114663 Cornell et al. May 2005 A1
20050125195 Brendel Jun 2005 A1
20050149726 Joshi et al. Jul 2005 A1
20050157662 Bingham et al. Jul 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050201297 Peikari Sep 2005 A1
20050210533 Copeland et al. Sep 2005 A1
20050238005 Chen et al. Oct 2005 A1
20050240781 Gassoway Oct 2005 A1
20050262562 Gassoway Nov 2005 A1
20050265331 Stolfo Dec 2005 A1
20050283839 Cowburn Dec 2005 A1
20060010495 Cohen et al. Jan 2006 A1
20060015416 Hoffman et al. Jan 2006 A1
20060015715 Anderson Jan 2006 A1
20060015747 Van de Ven Jan 2006 A1
20060021029 Brickell et al. Jan 2006 A1
20060021054 Costa et al. Jan 2006 A1
20060031476 Mathes et al. Feb 2006 A1
20060047665 Neil Mar 2006 A1
20060070130 Costea et al. Mar 2006 A1
20060075496 Carpenter et al. Apr 2006 A1
20060095968 Portolani et al. May 2006 A1
20060101516 Sudaharan et al. May 2006 A1
20060101517 Banzhaf et al. May 2006 A1
20060117385 Mester et al. Jun 2006 A1
20060123477 Raghavan et al. Jun 2006 A1
20060143709 Brooks et al. Jun 2006 A1
20060150249 Gassen et al. Jul 2006 A1
20060161983 Cothrell et al. Jul 2006 A1
20060161987 Levy-Yurista Jul 2006 A1
20060161989 Reshef et al. Jul 2006 A1
20060164199 Gilde et al. Jul 2006 A1
20060173992 Weber et al. Aug 2006 A1
20060179147 Tran et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191010 Benjamin Aug 2006 A1
20060221956 Narayan et al. Oct 2006 A1
20060236393 Kramer et al. Oct 2006 A1
20060242709 Seinfeld et al. Oct 2006 A1
20060248519 Jaeger et al. Nov 2006 A1
20060248582 Panjwani et al. Nov 2006 A1
20060251104 Koga Nov 2006 A1
20060288417 Bookbinder Dec 2006 A1
20070006288 Mayfield et al. Jan 2007 A1
20070006313 Porras et al. Jan 2007 A1
20070011174 Takaragi et al. Jan 2007 A1
20070016951 Piccard et al. Jan 2007 A1
20070019286 Kikuchi Jan 2007 A1
20070033645 Jones Feb 2007 A1
20070038943 FitzGerald et al. Feb 2007 A1
20070064689 Shin et al. Mar 2007 A1
20070074169 Chess et al. Mar 2007 A1
20070094730 Bhikkaji et al. Apr 2007 A1
20070097976 Wood May 2007 A1
20070101435 Konanka et al. May 2007 A1
20070128855 Cho et al. Jun 2007 A1
20070142030 Sinha et al. Jun 2007 A1
20070143827 Nicodemus et al. Jun 2007 A1
20070156895 Vuong Jul 2007 A1
20070157180 Tillmann et al. Jul 2007 A1
20070157306 Elrod et al. Jul 2007 A1
20070168988 Eisner et al. Jul 2007 A1
20070171824 Ruello et al. Jul 2007 A1
20070174915 Gribble et al. Jul 2007 A1
20070192500 Lum Aug 2007 A1
20070192858 Lum Aug 2007 A1
20070198275 Malden et al. Aug 2007 A1
20070208822 Wang Sep 2007 A1
20070220607 Sprosts et al. Sep 2007 A1
20070240218 Tuvell et al. Oct 2007 A1
20070240219 Tuvell et al. Oct 2007 A1
20070240220 Tuvell et al. Oct 2007 A1
20070240222 Tuvell et al. Oct 2007 A1
20070250930 Aziz et al. Oct 2007 A1
20070256132 Oliphant Nov 2007 A2
20070271446 Nakamura Nov 2007 A1
20070283192 Shevchenko Dec 2007 A1
20080005782 Aziz Jan 2008 A1
20080018122 Zierler et al. Jan 2008 A1
20080028463 Dagon Jan 2008 A1
20080032556 Schreier Feb 2008 A1
20080040710 Chiriac Feb 2008 A1
20080046781 Childs et al. Feb 2008 A1
20080059588 Ratliff Mar 2008 A1
20080066179 Liu Mar 2008 A1
20080072326 Danford et al. Mar 2008 A1
20080077793 Tan et al. Mar 2008 A1
20080080518 Hoeflin Apr 2008 A1
20080086720 Lekel Apr 2008 A1
20080098476 Syversen Apr 2008 A1
20080120722 Sima et al. May 2008 A1
20080134178 Fitzgerald et al. Jun 2008 A1
20080134334 Kim et al. Jun 2008 A1
20080141376 Clausen et al. Jun 2008 A1
20080181227 Todd Jul 2008 A1
20080184367 McMillan et al. Jul 2008 A1
20080184373 Traut et al. Jul 2008 A1
20080189787 Arnold et al. Aug 2008 A1
20080201778 Guo et al. Aug 2008 A1
20080209557 Herley et al. Aug 2008 A1
20080215742 Goldszmidt et al. Sep 2008 A1
20080222729 Chen et al. Sep 2008 A1
20080263665 Ma et al. Oct 2008 A1
20080295172 Bohacek Nov 2008 A1
20080301810 Lehane et al. Dec 2008 A1
20080307524 Singh et al. Dec 2008 A1
20080313738 Enderby Dec 2008 A1
20080320594 Jiang Dec 2008 A1
20090003317 Kasralikar et al. Jan 2009 A1
20090007100 Field et al. Jan 2009 A1
20090013408 Schipka Jan 2009 A1
20090031423 Liu et al. Jan 2009 A1
20090036111 Danford et al. Feb 2009 A1
20090037835 Goldman Feb 2009 A1
20090044024 Oberheide et al. Feb 2009 A1
20090044274 Budko et al. Feb 2009 A1
20090049551 Ahn Feb 2009 A1
20090064332 Porras et al. Mar 2009 A1
20090077666 Chen et al. Mar 2009 A1
20090083369 Marmor Mar 2009 A1
20090083855 Apap et al. Mar 2009 A1
20090089879 Wang et al. Apr 2009 A1
20090094697 Provos et al. Apr 2009 A1
20090113425 Ports et al. Apr 2009 A1
20090125976 Wassermann et al. May 2009 A1
20090126015 Monastyrsky et al. May 2009 A1
20090126016 Sobko et al. May 2009 A1
20090133125 Choi et al. May 2009 A1
20090144823 Lamastra et al. Jun 2009 A1
20090158430 Borders Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090187992 Poston Jul 2009 A1
20090193293 Stolfo et al. Jul 2009 A1
20090198651 Shiffer et al. Aug 2009 A1
20090198670 Shiffer et al. Aug 2009 A1
20090198689 Frazier et al. Aug 2009 A1
20090199274 Frazier et al. Aug 2009 A1
20090199296 Xie et al. Aug 2009 A1
20090228233 Anderson et al. Sep 2009 A1
20090241187 Troyansky Sep 2009 A1
20090241190 Todd et al. Sep 2009 A1
20090265692 Godefroid et al. Oct 2009 A1
20090271867 Zhang Oct 2009 A1
20090300415 Zhang et al. Dec 2009 A1
20090300761 Park et al. Dec 2009 A1
20090328185 Berg et al. Dec 2009 A1
20090328221 Blumfield et al. Dec 2009 A1
20100005146 Drako et al. Jan 2010 A1
20100011205 McKenna Jan 2010 A1
20100017546 Poo et al. Jan 2010 A1
20100030996 Butler, II Feb 2010 A1
20100031353 Thomas et al. Feb 2010 A1
20100037314 Perdisci et al. Feb 2010 A1
20100043073 Kuwamura Feb 2010 A1
20100054278 Stolfo et al. Mar 2010 A1
20100058474 Hicks Mar 2010 A1
20100064044 Nonoyama Mar 2010 A1
20100077481 Polyakov et al. Mar 2010 A1
20100083376 Pereira et al. Apr 2010 A1
20100107261 Nagoya Apr 2010 A1
20100115621 Staniford et al. May 2010 A1
20100132038 Zaitsev May 2010 A1
20100154056 Smith et al. Jun 2010 A1
20100180344 Malyshev et al. Jul 2010 A1
20100192223 Ismael et al. Jul 2010 A1
20100220863 Dupaquis et al. Sep 2010 A1
20100235831 Dittmer Sep 2010 A1
20100251104 Massand Sep 2010 A1
20100281102 Chinta et al. Nov 2010 A1
20100281541 Stolfo et al. Nov 2010 A1
20100281542 Stolfo et al. Nov 2010 A1
20100287260 Peterson et al. Nov 2010 A1
20100299754 Amit et al. Nov 2010 A1
20100306173 Frank Dec 2010 A1
20110004737 Greenebaum Jan 2011 A1
20110025504 Lyon et al. Feb 2011 A1
20110041179 Ståhlberg Feb 2011 A1
20110047594 Mahaffey et al. Feb 2011 A1
20110047620 Mahaffey et al. Feb 2011 A1
20110055907 Narasimhan et al. Mar 2011 A1
20110078794 Manni et al. Mar 2011 A1
20110093951 Aziz Apr 2011 A1
20110099620 Stavrou et al. Apr 2011 A1
20110099633 Aziz Apr 2011 A1
20110099635 Silberman et al. Apr 2011 A1
20110113231 Kaminsky May 2011 A1
20110145918 Jung et al. Jun 2011 A1
20110145920 Mahaffey et al. Jun 2011 A1
20110145934 Abramovici et al. Jun 2011 A1
20110167493 Song et al. Jul 2011 A1
20110167494 Bowen et al. Jul 2011 A1
20110173213 Frazier et al. Jul 2011 A1
20110173460 Ito et al. Jul 2011 A1
20110219449 St. Neitzel et al. Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225624 Sawhney et al. Sep 2011 A1
20110225655 Niemela et al. Sep 2011 A1
20110247072 Staniford et al. Oct 2011 A1
20110265182 Peinado et al. Oct 2011 A1
20110289582 Kejriwal et al. Nov 2011 A1
20110302587 Nishikawa et al. Dec 2011 A1
20110307954 Melnik et al. Dec 2011 A1
20110307955 Kaplan et al. Dec 2011 A1
20110307956 Yermakov et al. Dec 2011 A1
20110314546 Aziz et al. Dec 2011 A1
20120023593 Puder et al. Jan 2012 A1
20120054869 Yen et al. Mar 2012 A1
20120066698 Yanoo Mar 2012 A1
20120079596 Thomas et al. Mar 2012 A1
20120084859 Radinsky et al. Apr 2012 A1
20120096553 Srivastava et al. Apr 2012 A1
20120110667 Zubrilin et al. May 2012 A1
20120117652 Manni et al. May 2012 A1
20120121154 Xue et al. May 2012 A1
20120124426 Maybee et al. May 2012 A1
20120174186 Aziz et al. Jul 2012 A1
20120174196 Bhogavilli et al. Jul 2012 A1
20120174218 McCoy et al. Jul 2012 A1
20120198279 Schroeder Aug 2012 A1
20120210423 Friedrichs et al. Aug 2012 A1
20120222121 Staniford et al. Aug 2012 A1
20120255015 Sahita et al. Oct 2012 A1
20120255017 Sallam Oct 2012 A1
20120260342 Dube et al. Oct 2012 A1
20120266244 Green et al. Oct 2012 A1
20120278886 Luna Nov 2012 A1
20120297489 Dequevy Nov 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120331553 Aziz et al. Dec 2012 A1
20130014259 Gribble et al. Jan 2013 A1
20130036472 Aziz Feb 2013 A1
20130047257 Aziz Feb 2013 A1
20130074185 McDougal et al. Mar 2013 A1
20130086684 Mohler Apr 2013 A1
20130097699 Balupari et al. Apr 2013 A1
20130097706 Titonis et al. Apr 2013 A1
20130111587 Goel et al. May 2013 A1
20130117852 Stute May 2013 A1
20130117855 Kim et al. May 2013 A1
20130139264 Brinkley et al. May 2013 A1
20130160125 Likhachev et al. Jun 2013 A1
20130160127 Jeong et al. Jun 2013 A1
20130160130 Mendelev et al. Jun 2013 A1
20130160131 Madou et al. Jun 2013 A1
20130167236 Sick Jun 2013 A1
20130174214 Duncan Jul 2013 A1
20130185789 Hagiwara et al. Jul 2013 A1
20130185795 Winn et al. Jul 2013 A1
20130185798 Saunders et al. Jul 2013 A1
20130191915 Antonakakis et al. Jul 2013 A1
20130196649 Paddon et al. Aug 2013 A1
20130227691 Aziz et al. Aug 2013 A1
20130246370 Bartram et al. Sep 2013 A1
20130247186 LeMasters Sep 2013 A1
20130263260 Mahaffey et al. Oct 2013 A1
20130291109 Staniford et al. Oct 2013 A1
20130298243 Kumar et al. Nov 2013 A1
20130318038 Shiffer et al. Nov 2013 A1
20130318073 Shiffer et al. Nov 2013 A1
20130325791 Shiffer et al. Dec 2013 A1
20130325792 Shiffer et al. Dec 2013 A1
20130325871 Shiffer et al. Dec 2013 A1
20130325872 Shiffer et al. Dec 2013 A1
20140032875 Butler Jan 2014 A1
20140053260 Gupta et al. Feb 2014 A1
20140053261 Gupta et al. Feb 2014 A1
20140130158 Wang et al. May 2014 A1
20140137180 Lukacs et al. May 2014 A1
20140169762 Ryu Jun 2014 A1
20140179360 Jackson et al. Jun 2014 A1
20140181131 Ross Jun 2014 A1
20140189687 Jung et al. Jul 2014 A1
20140189866 Shiffer et al. Jul 2014 A1
20140189882 Jung et al. Jul 2014 A1
20140237600 Silberman et al. Aug 2014 A1
20140280245 Wilson Sep 2014 A1
20140283037 Sikorski et al. Sep 2014 A1
20140283063 Thompson et al. Sep 2014 A1
20140328204 Klotsche et al. Nov 2014 A1
20140337836 Ismael Nov 2014 A1
20140344926 Cunningham et al. Nov 2014 A1
20140351935 Shao et al. Nov 2014 A1
20140380473 Bu et al. Dec 2014 A1
20140380474 Paithane et al. Dec 2014 A1
20150007312 Pidathala et al. Jan 2015 A1
20150096022 Vincent et al. Apr 2015 A1
20150096023 Mesdaq et al. Apr 2015 A1
20150096024 Haq et al. Apr 2015 A1
20150096025 Ismael Apr 2015 A1
20150180886 Staniford et al. Jun 2015 A1
20150186645 Aziz et al. Jul 2015 A1
20150199513 Ismael et al. Jul 2015 A1
20150199531 Ismael et al. Jul 2015 A1
20150199532 Ismael et al. Jul 2015 A1
20150220735 Paithane et al. Aug 2015 A1
20150372980 Eyada Dec 2015 A1
20160004869 Ismael et al. Jan 2016 A1
20160006756 Ismael et al. Jan 2016 A1
20160044000 Cunningham Feb 2016 A1
20160127393 Aziz et al. May 2016 A1
20160191547 Zafar et al. Jun 2016 A1
20160191550 Ismael et al. Jun 2016 A1
20160261612 Mesdaq et al. Sep 2016 A1
20160285914 Singh et al. Sep 2016 A1
20160301703 Aziz Oct 2016 A1
20160335110 Paithane et al. Nov 2016 A1
20170083703 Abbasi et al. Mar 2017 A1
20180013770 Ismael Jan 2018 A1
20180048660 Paithane et al. Feb 2018 A1
20180121316 Ismael et al. May 2018 A1
20180288077 Siddiqui et al. Oct 2018 A1
Foreign Referenced Citations (11)
Number Date Country
2439806 Jan 2008 GB
2490431 Oct 2012 GB
02006928 Jan 2002 WO
0223805 Mar 2002 WO
2007117636 Oct 2007 WO
2008041950 Apr 2008 WO
2011084431 Jul 2011 WO
2011112348 Sep 2011 WO
2012075336 Jun 2012 WO
2012145066 Oct 2012 WO
2013067505 May 2013 WO
Non-Patent Literature Citations (92)
Entry
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001).
Margolis, P.E. , “Random House Webster's ‘Computer & Internet Dictionary 3rd Edition’”, ISBN 0375703519, (Dec. 1998).
Moore, D. , et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910.
Morales, Jose A., et al., ““Analyzing and exploiting network behaviors of malware.””, Security and Privacy in Communication Networks. Springer Berlin Heidelberg, 2010. 20-34.
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg.
Natvig, Kurt , “SANDBOXII: Internet”, Virus Bulletin Conference, (“Natvig”), (Sep. 2002).
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987.
Newsome, J. , et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software”, In Proceedings of the 12th Annual Network and Distributed System Security, Symposium (NDSS '05), (Feb. 2005).
Newsome, J. , et al., “Polygraph: Automatically Generating Signatures for Polymorphic Worms”, In Proceedings of the IEEE Symposium on Security and Privacy, (May 2005).
Nojiri, D. , et al., “Cooperation Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302.
Oberheide et al., CloudAV: N-Version Antivirus in the Network Cloud, 17th USENIX Security Symposium (USENIX Security '08), Jul. 28-Aug. 1, 2008, San Jose, CA.
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, Stefan Berger, sHype: Secure Hypervisor Approach to Trusted Virtualized Systems (Feb. 2, 2005) (“Sailer”).
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25.
Singh, S. , et al., “Automated Worm Fingerprinting”, Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004).
Spitzner, Lance, “Honeypots: Tracking Hackers”, (“Spitzner”), (Sep. 17, 2002).
The Sniffer's Guide to Raw Traffic, available at: yuba.stanford.edu/~casado/pcap/section1.html, (Jan. 6, 2014).
Thomas H. Ptacek, and Timothy N. Newsham , “Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection”, Secure Networks, (“Ptacek”), (Jan. 1998).
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Final Office Action dated Dec. 5, 2011.
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Final Office Action dated Jan. 8, 2013.
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Non-Final Office Action dated Aug. 2, 2012.
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Non-Final Office Action dated Mar. 16, 2012.
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Non-Final Office Action dated May 27, 2011.
U.S. Appl. No. 11/998,605, filed Nov. 30, 2007 Notice of Allowance dated Jul. 17, 2013.
U.S. Appl. No. 14/052,632, filed Oct. 11, 2013 Final Office Action dated Jan. 15, 2015.
U.S. Appl. No. 14/052,632, filed Oct. 11, 2013 Final Office Action dated Jun. 3, 2016.
U.S. Appl. No. 14/052,632, filed Oct. 11, 2013 Non-Final Office Action dated Jun. 26, 2014.
U.S. Appl. No. 14/052,632, filed Oct. 11, 2013 Non-Final Office Action dated Nov. 3, 2015.
U.S. Appl. No. 14/052,632, filed Oct. 11, 2013 Notice of Allowance dated Dec. 12, 2016.
U.S. Pat. No. 8,171,553 filed Apr. 20, 2006, Inter Parties Review Decision dated Jul. 10, 2015.
U.S. Pat. No. 8,291,499 filed Mar. 16, 2012, Inter Parties Review Decision dated Jul. 10, 2015.
Venezia, Paul , “NetDetector Captures Intrusions”, InfoWorld Issue 27, (“Venezia”), (Jul. 14, 2003).
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350.
Whyte, et al., “DNS-Based Detection of Scanning Worms in an Enterprise Network”, Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages.
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9.
Yuhei Kawakoya et al: “Memory behavior-based automatic malware unpacking in stealth debugging environment”, Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN:978-1-4244-8-9353-1.
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82.
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003).
“Packet”, Microsoft Computer Dictionary, Microsoft Press, (Mar. 2002), 1 page.
“When Virtual is Better Than Real”, IEEEXplore Digital Library, available at, http://ieeexplore.ieee.org/xpl/articleDetails.isp?reload=true&arnumbe- r=990073, (Dec. 7, 2013).
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108.
Adetoye, Adedayo , et al., “Network Intrusion Detection & Response System”, (“Adetoye”), (Sep. 2003).
Adobe Systems Incorporated, “PDF 32000-1:2008, Document management—Portable document format—Part1:PDF 1.7”, First Edition, Jul. 1, 2008, 756 pages.
AltaVista Advanced Search Results. “attack vector identifier”. Http://www.altavista.com/web/results?Itag=ody&pg=aq&aqmode=aqa=Event+Orch- estrator . . . , (Accessed on Sep. 15, 2009).
AltaVista Advanced Search Results. “Event Orchestrator”. Http://www.altavista.com/web/results?Itag=ody&pg=aq&aqmode=aqa=Event+Orch- esrator . . . , (Accessed on Sep. 3, 2009).
Apostolopoulos, George; Hassapis, Constantinos; “V-eM: A cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation”, 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126.
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006.
Baecher, “The Nepenthes Platform: An Efficient Approach to collect Malware”, Springer-verlag Berlin Heidelberg, (2006), pp. 165-184.
Baldi, Mario; Risso, Fulvio; “A Framework for Rapid Development and Portable Execution of Packet-Handling Applications”, 5th IEEE International Symposium Processing and Information Technology, Dec. 21, 2005, pp. 233-238.
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77.
Boubalos, Chris , “extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives”, available at http://seclists.org/honeypots/2003/q2/319 (“Boubalos”), (Jun. 5, 2003).
Chaudet, C., et al., “Optimal Positioning of Active and Passive Monitoring Devices”, International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulouse, France, (Oct. 2005), pp. 71-82.
Chen, P. M. and Noble, B. D., “When Virtual is Better Than Real, Department of Electrical Engineering and Computer Science”, University of Michigan (“Chen”) (2001).
Cisco “Intrusion Prevention for the Cisco ASA 5500-x Series” Data Sheet (2012).
Cisco, Configuring the Catalyst Switched Port Analyzer (SPAN) (“Cisco”), (1992).
Clark, John, Sylvian Leblanc,and Scott Knight. “Risks associated with usb hardware trojan devices used by insiders.” Systems Conference (SysCon), 2011 IEEE International. IEEE, 2011.
Cohen, M.I. , “PyFlag—An advanced network forensic framework”, Digital investigation 5, Elsevier, (2008), pp. S112-S120.
Costa, M. , et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005).
Crandall, J.R. , et al., “Minos:Control Data Attack Prevention Orthogonal to Memory Model”, 37th International Symposium on Microarchitecture, Portland, Oregon, (Dec. 2004).
Deutsch, P. , “Zlib compressed data format specification version 3.3” RFC 1950, (1996).
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007).
Dunlap, George W. , et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceeding of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002).
Excerpt regarding First Printing Date for Merike Kaeo, Designing Network Security (“Kaeo”), (2005).
Filiol, Eric , et al., “Combinatorial Optimisation of Worm Propagation on an Unknown Network”, International Journal of Computer Science 2.2 (2007).
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010.
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010.
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011.
Gibler, Clint, et al. AndroidLeaks: automatically detecting potential privacy leaks in android applications on a large scale. Springer Berlin Heidelberg, 2012.
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28.
Gregg Keizer: “Microsoft's HoneyMonkeys Show Patching Windows Works”, Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:http://www.informationweek.com/microsofts-honeymonkeys-show-patching-windows-works/d/d-id/1035069? [retrieved on Jun. 1, 2016].
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase @ CMU, Carnegie Mellon University, 2007.
Hjelmvik, Erik , “Passive Network Security Analysis with NetworkMiner”, (IN)Secure, Issue 18, (Oct. 2008), pp. 1-100.
Idika et al., A-Survey-of-Malware-Detection-Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University.
IEEE Xplore Digital Library Search Results for “detection of unknown computer worms”. Http://ieeexplore.ieee.org/searchresult.jsp?SortField=Score&SortOrder=desc- &ResultC . . . , (Accessed on Aug. 28, 2009).
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. “Kernel-based behavior analysis for android malware detection.” Computational intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011.
Kaeo, Merike , “Designing Network Security”, (“Kaeo”), (Nov. 2003).
Kevin A Roundy et al: “Hybrid Analysis and Control of Malware”, Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454 ISBN:978-3-642-15511-6.
Kim, H. , et al., “Autograph: Toward Automated, Distributed Worm Signature Detection”, Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286.
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”) (2003).
Krasnyansky, Max , et al., Universal TUN/TAP driver, available at https://www.kernel.org/doc/Documentation/networking/tuntap.txt (2002) (“Krasnyansky”).
Kreibich, C. , et al., “Honeycomb-Creating Intrusion Detection Signatures Using Honeypots”, 2nd Workshop on Hot Topics in Networks (HotNets-11), Boston, USA, (2003).
Kristoff, J. , “Botnets, Detection and Mitigation: DNS-Based Techniques”, NU Security Day, (2005), 23 pages.
Leading Colleges Select FireEye to Stop Malware-Related Data Breaches, FireEye Inc., 2009.
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711.
Liljenstam, Michael , et al., “Simulating Realistic Network Traffic for Worm Warning System Design and Testing”, Institute for Security Technology studies, Dartmouth College (“Liljenstam”), (Oct. 27, 2003).
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011.
Lok Kwong et al: “DroidScope: Seamlessly Reconstructing the OS and Dalvik Semantic Views for Dynamic Android Malware Analysis”, Aug. 10, 2012, XP055158513, Retrieved from the Internet: URL:https://www.usenix.org/system/files/conference/usenixsecurity12/sec12- -final107.pdf [retrieved on Dec. 15, 2014].
“Mining Specification of Malicious Behavior”—Jha et al, UCSB, Sep. 2007 https://www.cs.ucsb.edu/.about.chris/research/doc/esec07.sub.--mining.pdf-.
Didier Stevens, “Malicious PDF Documents Explained”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 9, No. 1, Jan. 1, 2011, pp. 80-82, XP011329453, ISSN: 1540-7993, DOI: 10.1109/MSP.2011.14.
Hiroshi Shinotsuka, Malware Authors Using New Techniques to Evade Automated Threat Analysis Systems, Oct. 26, 2012, http://www.symantec.com/connect/blogs/, pp. 1-4.
Khaled Salah et al: “Using Cloud Computing to Implement a Security Overlay Network”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 11, No. 1, Jan. 1, 2013 (Jan. 1, 2013).
Lastline Labs, The Threat of Evasive Malware, Feb. 25, 2013, Lastline Labs, pp. 1-8.
Vladimir Getov: “Security as a Service in Smart Clouds—Opportunities and Concerns”, Computer Software and Applications Conference (COMPSAC), 2012 IEEE 36th Annual, IEEE, Jul. 16, 2012 (Jul. 16, 2012).
Provisional Applications (3)
Number Date Country
60868323 Dec 2006 US
60559198 Apr 2004 US
60579910 Jun 2004 US
Continuations (2)
Number Date Country
Parent 14052632 Oct 2013 US
Child 15489661 US
Parent 11998605 Nov 2007 US
Child 14052632 US
Continuation in Parts (6)
Number Date Country
Parent 11494990 Jul 2006 US
Child 14052632 US
Parent 11471072 Jun 2006 US
Child 11494990 US
Parent 11409355 Apr 2006 US
Child 11471072 US
Parent 11096287 Mar 2005 US
Child 11409355 US
Parent 11151812 Jun 2005 US
Child 11409355 US
Parent 11152286 Jun 2005 US
Child 11151812 US