Detecting malicious network content

Information

  • Patent Grant
  • Patent Number
    9,519,782
  • Date Filed
    Friday, February 24, 2012
  • Date Issued
    Tuesday, December 13, 2016
Abstract
Systems and methods for detecting malicious content on portable data storage devices or remote network servers are provided. In an exemplary embodiment, a system comprises a quarantine module configured to detect one or more portable data storage devices upon insertion of the devices into a security appliance, wherein the security appliance is configured to receive the portable data storage devices, a controller configured to receive from the security appliance, via a communication network, data associated with the portable data storage devices, an analysis module configured to analyze the data to determine whether the data includes malware, and a security module to selectively identify, based on the determination, the one or more portable data storage devices storing the malware.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is related to U.S. patent application Ser. No. 13/011,344 entitled “Systems and Methods for Detecting Malicious PDF Network Content” which is a continuation-in-part of U.S. patent application Ser. No. 12/263,971 entitled “Systems and Methods for Detecting Malicious Network Content” and filed on Nov. 3, 2008. This application is also related to U.S. patent application Ser. No. 11/409,355 entitled “Heuristic Based Capture with Replay to Virtual Machine” and filed on Apr. 20, 2006, which is a continuation-in-part of U.S. patent application Ser. No. 11/152,286 entitled “Computer Worm Defense System and Method” and filed on Jun. 13, 2005, which claims the priority benefit of U.S. Provisional Patent Application Ser. No. 60/579,910 entitled “Computer Worm Defense System and Method” and filed on Jun. 14, 2004. U.S. patent application Ser. No. 11/409,355 is also a continuation-in-part of U.S. patent application Ser. No. 11/096,287 entitled “System and Method of Detecting Computer Worms” and filed on Mar. 31, 2005, which claims the priority benefit of U.S. Provisional Patent Application Ser. No. 60/559,198 entitled “System and Method of Detecting Computer Worms” and filed on Apr. 1, 2004. U.S. patent application Ser. No. 11/409,355 is also a continuation-in-part of U.S. patent application Ser. No. 11/151,812 entitled “System and Method of Containing Computer Worms” and filed on Jun. 13, 2005, which claims the priority benefit of U.S. Provisional Patent Application No. 60/579,953 entitled “System and Method of Containing Computer Worms” and filed on Jun. 14, 2004. Each of the aforementioned patent applications is incorporated by reference herein.


BACKGROUND

Field


The present disclosure relates generally to data processing. More particularly, the present disclosure relates to detecting malicious network content on portable data storage devices and remote network servers.


Related Art


Governments, the military, corporations, financial institutions, hospitals, and private businesses amass a great amount of confidential information about their employees, customers, products, research, and financial status. Furthermore, government information systems may include classified information related to national security, command and control of military forces, or fulfillment of intelligence missions. Protecting confidential information from theft and corruption, while allowing the information to remain accessible and productive to its intended users, has been one of the major goals of computer security. However, as computer security measures become more effective against malicious attacks via e-mail and other avenues, cybercriminals are turning to portable data storage devices for malware distribution. Portable data storage devices, such as Universal Serial Bus (USB) flash drives, are small, readily available, and inexpensive, making them popular for storing files and transporting them from one computer to another. However, these same characteristics make them appealing to attackers.


According to some research, a quarter of all malware today is developed to be disseminated through USB devices. One reason for the popularity of USB devices is the simplicity with which malware can be distributed. Most hackers do not wish to spend hours trying to break into secured computers; spreading malware through USB devices is a simple way to distribute it with just a few clicks. An attacker might infect a computer with malware that detects when a USB drive is plugged into the computer. The malware may then download malicious code onto the drive. When the USB drive is plugged into another computer, the malware infects that computer.


There are solutions on the market for addressing the threat, with varying degrees of success. Some of these solutions aim at preventing USB drives from being recognized by computers. Other solutions require disabling AutoRun functionality or maintaining a dedicated computer for USB-related activities. Some even advocate moving away from USB drives to cloud-based solutions. Most of these solutions limit the accessibility of the information contained on the USB drives instead of addressing the threat directly.


Network file sharing is another technology for transmitting data between computers. Lately, this technology has become a popular tool for sharing data over the Internet and/or local area networks. However, malware is often spread through remote network servers, making file sharing services one of the most frequent vectors for virus infections and computer failures. Remote network servers may contain malicious software that can be downloaded along with other files requested by users. Current anti-virus technology may be inefficient at detecting these malicious files because they may not become active until after the download is complete or until the requested files are run. Thus, it is desirable to detect malware on remote network servers before any files are downloaded.


SUMMARY

Exemplary embodiments provide for detecting malicious network content on portable data storage devices. In a first exemplary embodiment, a method is disclosed for detecting malicious network content on portable data storage devices upon insertion of the devices into a security appliance. The method may comprise detecting the insertion of portable data storage devices in a security appliance, receiving, via a communication network, data associated with the portable data storage devices, analyzing the data to determine whether the data storage devices include malware, and selectively identifying the malware stored on the one or more portable data storage devices.


In a second exemplary embodiment, a method is disclosed for detecting malicious network content on a portable data storage device when the device is connected to a host device. The method may comprise detecting a portable data storage device upon connection to a computer, accessing device data, analyzing the device data to determine whether the portable storage device includes malware, and selectively identifying the portable storage device as having the malware.


In a third exemplary embodiment, a method is disclosed for detecting malicious network content within remote network servers. The method may comprise detecting connecting of a client device to a remote network server, receiving data stored on the remote network server, analyzing the data of the remote network server to determine whether the data includes malware, and based on the determination, selectively identifying the remote network server as storing the malware.


In further embodiments, modules, subsystems, or devices can be adapted to perform the recited methods. Furthermore, in various embodiments, a non-transitory machine readable medium may have executable code embodied thereon, with the executable code being executable by a processor for performing the above methods.





BRIEF DESCRIPTION OF FIGURES


FIG. 1 is a diagram of an exemplary environment in which various embodiments for detecting malicious network content on portable data storage devices upon insertion of the devices into a security appliance may be practiced.



FIG. 2 is a diagram of an exemplary environment in which various embodiments for detecting malicious network content on portable data storage devices upon connection of the device to a host device may be practiced.



FIG. 3 is a diagram of an exemplary environment in which various embodiments for detecting malicious network content on remote network servers when the servers are connected to client devices may be practiced.



FIG. 4 is a block diagram of an exemplary controller implementing some embodiments of the present invention.



FIG. 5 is a block diagram of an exemplary analysis environment.



FIG. 6 is a flowchart of an exemplary method for detecting malicious network content on portable data storage devices upon insertion of the devices into a security appliance.



FIG. 7 is a flowchart of an exemplary method for detecting malicious network content on a portable data storage device upon connection to a host device.



FIG. 8 is a flowchart of an exemplary method for detecting malicious network content of a remote network server when the server is connected to a client device over a communication network.



FIG. 9 is a block diagram of an exemplary controller, in accordance with an embodiment of the present invention.





DESCRIPTION OF EXEMPLARY EMBODIMENTS

Exemplary systems and methods for detecting malicious network content (e.g., malicious software or malware) on portable data storage devices are provided. In some exemplary embodiments, the systems and methods may allow for protecting organizations from malware infections. The systems and methods may be embodied in three exemplary modes.


The first mode allows screening portable storage devices, such as USB flash drives or memory cards, upon inserting the devices into a security appliance. The security appliance may be associated with a security screening location such as a security checkpoint in a building. The security appliance may include a number of slots configured to receive a plurality of portable data storage devices simultaneously. Security personnel may direct owners of the portable data storage devices to insert the portable storage devices into the security appliance. The security appliance may detect insertion of the portable data storage devices and send, via a communication network, the data stored on the portable storage devices to a remotely located controller for analysis. Upon insertion of a portable data storage device into the security appliance, the security appliance may provide the owner of the portable data storage device with an estimated time to complete the analysis, with the estimate being based on the current latency due to the analysis.
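The estimated completion time described above could be derived from recent analysis latency. The sketch below is one illustrative way to do this (the `ScanTimeEstimator` class and its per-megabyte rate model are assumptions, not part of the patent):

```python
from collections import deque

class ScanTimeEstimator:
    """Illustrative sketch: estimate time-to-complete for a newly
    inserted device from recent per-megabyte analysis latency."""

    def __init__(self, window=10):
        self.samples = deque(maxlen=window)  # recent seconds-per-MB rates

    def record(self, megabytes, seconds):
        # Record how long a finished analysis actually took.
        if megabytes > 0:
            self.samples.append(seconds / megabytes)

    def estimate(self, megabytes, queued_devices=0, default_rate=2.0):
        # Average the recent rates; fall back to a default with no history.
        rate = sum(self.samples) / len(self.samples) if self.samples else default_rate
        # Devices already queued add their expected time ahead of this one.
        return megabytes * rate * (1 + queued_devices)
```

An appliance would call `record` after each completed scan and `estimate` when a new device is inserted, so the displayed wait time tracks current load.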


The controller may analyze the data and determine whether the portable data storage devices store malware. The controller may commence the procedure by analyzing the data with predetermined heuristics to determine whether the data includes certain suspicious traits. The latency due to the analysis process may be too high for the data to be analyzed with normal heuristics without exceeding the maximum time allotted for the analysis. Therefore, a faster heuristics analysis may be utilized. Since the faster heuristics analysis may not be as comprehensive, a copy of the data may be saved for later analysis. For example, at a security checkpoint, a determination may need to be made within a reasonable time to allow the queue to move quickly through a metal detector. Therefore, the controller should be able to analyze the files on a portable data storage device and send the result of the analysis rapidly. Depending on the latency of the communication network, the controller may need to decide which heuristics to use. If the heuristics analysis indicates suspicious activity, the controller may configure a virtual machine to safely receive and execute the suspected data in a simulated real-life environment. The response of the virtual machine to the deployment of the suspected data may be analyzed to determine whether the data contains malware. If the controller determines that one or more portable data storage devices store malware, the security appliance may provide a warning signal. For example, a pattern of beeps and/or flashes may indicate a security threat. If, on the other hand, there is no current indication of malware, the portable data storage device may receive provisional clearance. However, if a later analysis with normal heuristics indicates a problem, the owner of the portable data storage device may be located by the security personnel and appropriate measures may be taken. For example, the portable data storage device may be confiscated.
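The latency-driven choice between fast and normal heuristics can be sketched as a simple budget check. Everything below (`HeuristicSet`, the example checks, the cost figures) is illustrative, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class HeuristicSet:
    name: str
    cost_s: float                            # expected seconds per device
    checks: List[Callable[[bytes], bool]]    # suspicious-trait predicates

def select_heuristics(current_latency_s: float, max_wait_s: float,
                      fast: HeuristicSet, normal: HeuristicSet):
    """Return (heuristics_to_run_now, defer_full_scan). When the normal
    set cannot finish inside the checkpoint's time budget, run the fast
    set immediately and keep a copy of the data for later analysis."""
    if current_latency_s + normal.cost_s > max_wait_s:
        return fast, True
    return normal, False

def is_suspicious(data: bytes, hset: HeuristicSet) -> bool:
    # Any matching check marks the data for virtual-machine replay.
    return any(check(data) for check in hset.checks)
```

Data flagged by `is_suspicious` would then be deployed in a virtual machine whose response is observed, as the passage above describes.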


The second mode may allow detecting malicious content on a portable data storage device upon insertion of the device into a host device, such as a personal computer (PC). The techniques utilized to determine whether the portable data storage device contains malware are similar to the ones described above. The method may include detecting the portable data storage device upon connection to the host device and analyzing the data stored on the portable data storage device to determine whether the data includes malware.


The method may commence with the controller responsible for the data analysis quarantining the data within the host device. For example, the controller may prevent executing any files stored on the portable data storage device.
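The quarantine step, in which execution of files on a not-yet-cleared device is prevented, can be sketched as follows (the `Quarantine` class and device identifiers are hypothetical):

```python
class Quarantine:
    """Illustrative sketch: execution requests for files on a device
    still under analysis are denied until the device is cleared."""

    def __init__(self):
        self._cleared = set()

    def clear(self, device_id):
        # Called once the controller finds no malware on the device.
        self._cleared.add(device_id)

    def may_execute(self, device_id, path):
        # Deny execution of anything on a device not yet cleared.
        return device_id in self._cleared
```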


The third mode may allow detecting malicious content on a remote network server before files are downloaded over a network such as the Internet. Thus, files may be analyzed before downloading to a client device may proceed, i.e., in an active rather than a passive manner. The techniques for determining whether the remote network server contains malware are similar to the ones described above. The techniques may include detecting that a client device accesses the remote network server when a connection over a network is established and analyzing the data stored on the remote network server to determine whether or not the data includes malware. The technique may further include actively monitoring the remote network server for the presence of new files, actively downloading those new files, analyzing the files to determine whether they are suspicious, and running the files in a virtual machine environment to identify malware.
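The active-monitoring loop, in which newly appearing files on the remote server are analyzed before any download proceeds, could be sketched like this (the functions, the set-based listings, and the `analyze` predicate are all illustrative assumptions):

```python
def find_new_files(previous: set, current_listing: set) -> set:
    # Files present now that were not present in the last listing.
    return current_listing - previous

def screen_server(listings, analyze):
    """Illustrative sketch: walk successive directory listings of a
    remote server, analyzing every newly appearing file before it may
    be downloaded. `analyze(name)` returns True for suspicious files,
    which in the full scheme would be replayed in a virtual machine."""
    seen, flagged = set(), []
    for listing in listings:
        for name in sorted(find_new_files(seen, listing)):
            if analyze(name):
                flagged.append(name)
        seen |= listing
    return flagged
```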


The client device, such as a personal computer, may embed a controller, or the controller can be remotely located and accessible over the network. The controller may analyze the data stored on the remote network server with predetermined heuristics to determine whether the data includes certain suspicious traits. The controller may limit the data to be analyzed to the data intended for downloading by the client device. If the heuristics analysis indicates suspicious activity, the controller may configure a virtual machine to safely receive and execute the suspected data in a simulated real-life environment. The response of the virtual machine to the deployment of the suspected data may be analyzed to determine whether the data contains malware. If the controller determines that the remote network server stores malware, the controller may provide a warning signal.


Malware is software created and distributed for malicious purposes and can take the form of viruses, worms, Trojan horses or adware, for example. A virus is an intrusive program that infects a computer file by inserting a copy of itself in the file. The copy is usually executed when the file is loaded into memory, allowing the virus to infect other files. A worm is a program that propagates itself across multiple computers, usually by creating copies of itself in each computer's memory. A worm might duplicate itself in a computer so many times that it causes the computer to crash. A Trojan horse is a destructive program disguised as a game, utility, or application. When run by a user or computer program, a Trojan horse can harm the computer system while appearing to do something useful.


Malware may also include adware and spyware. Adware is a program configured to direct advertisements to a computer or a particular user. In one example, adware identifies the computer and/or the user to various websites visited by a browser on the computer. The website may then use the adware to either generate pop-up advertisements or otherwise direct specific advertisements to the user's browser. Spyware is a program configured to collect information regarding the user, the computer, and/or a user's network habits. In an example, spyware may collect information regarding the names and types of websites that the user browses and then transmit the information to another computer. Adware and spyware are often added to the user's computer after the user browses a website that hosts the adware and/or spyware. The user is often unaware that these programs have been added and is similarly unaware of the adware and/or spyware's function.


Referring now to the drawings, FIG. 1 is a diagram of an exemplary environment 100 in which various embodiments for detecting malicious network content on portable data storage devices by inserting the devices into a security appliance may be practiced. The environment 100 may include portable data storage devices 105 inserted into a security appliance 130. The security appliance 130 may be communicatively coupled to a communication network 120. A controller 110 may also be communicatively coupled to the communication network 120.


The portable data storage devices 105 are any combination of one or more storage devices designed to hold any kind of digital data. For example, a USB flash drive is a portable data storage device including a flash memory data storage device integrated with a USB interface. In another example, the portable data storage device may be a memory card.


The security appliance 130 is a digital device which may include a plurality of interfaces to simultaneously receive one or more of the portable data storage devices 105. Upon insertion of the portable data storage devices 105 into the security appliance 130, the data stored on the portable data storage devices 105 may be transmitted to the controller 110 via the communication network 120. The security appliance 130 may include dedicated circuitry and comprise one or more processors. The security appliance 130 may include any combination of computers and servers. The data stored on the portable data storage devices 105 may include any kind of digital data. Although FIG. 1 depicts the security appliance as coupled to the controller 110 via the communication network 120, the security appliance 130 may be directly coupled to the controller 110.


The controller 110 may be a digital device or software configured to receive and analyze data for the presence of malware. In exemplary embodiments, the controller 110 may detect the presence of the portable data storage devices 105 when the portable data storage devices 105 are initially inserted into the security appliance 130. The controller 110 may intercept data transmitted from the portable data storage devices 105 for a predetermined period of time. In other embodiments, the security appliance 130 may direct the data transmitted from the portable data storage devices 105 to the controller 110 for a predetermined period of time.


The controller 110 may also be configured to transmit a command to the security appliance 130 to activate one or more security programs. The one or more security programs can be resident within the security appliance 130 and are configured to operate security functions. In some embodiments, the controller 110 can scan and activate security programs on the portable data storage devices 105 without the necessity of installing any agents on the security appliance 130. As such, multiple security programs on the portable data storage devices 105 may be activated upon insertion into the security appliance. By performing security functions upon connection, the portable data storage devices 105 may be analyzed for the presence of malware. Security functions are further described in FIG. 4. The data may then be analyzed by the controller 110 for evidence of malware. If malware is detected, the controller 110 may report the threat. The controller 110 is further discussed in FIG. 4.


The communication network 120 couples two or more digital devices together to allow the digital devices to communicate and transmit data to each other. In some exemplary embodiments, the communication network 120 may be a public computer network such as the Internet, or a private computer network such as a wireless telecommunication network, wide area network (WAN), or local area network (LAN). In some embodiments, the communication network 120 comprises multiple routers, bridges, and hubs that couple a large number of digital devices.



FIG. 2 is a diagram of an exemplary environment 200 in which various embodiments for detecting malicious network content on portable data storage devices when the devices are connected to a host device may be practiced.


The environment 200 includes a host device 230 and a portable data storage device 205. The portable data storage device 205 may be connected to the host device 230 via a storage device interface (not shown). In some embodiments, a controller 210 may be run within the host device 230. In other embodiments, the controller 210 may be located remotely and coupled to the host device 230 via the communication network 120. The portable data storage device 205 may be a storage device designed to hold any kind of digital data. The host device 230 may be any device comprising one or more processors. Some examples of the host device 230 include computers, servers, laptops, tablet computers, personal digital assistants (PDAs), cellular telephones, and smart phones.


The portable data storage device 205 may include any kind of digital data. Although FIG. 2 depicts the controller 210 as optionally coupled to the communication network 120, the controller 210 may be directly coupled to the host device 230. The controller 210 may be a digital device or software configured to receive and analyze data for the presence of malware. In exemplary embodiments, the controller 210 may detect the presence of portable data storage device 205 when the portable data storage device 205 initially couples to host device 230. The controller 210 may intercept data transmitted from the portable data storage device 205 or the host device 230 for a predetermined period of time. In other embodiments, the host device 230 is configured to direct the data transmitted from the portable data storage device 205 to the controller 210 for a predetermined period of time.


The controller 210 may also be configured to transmit a command to the host device 230 to activate one or more security programs. In some exemplary embodiments, the one or more security programs can be resident within the portable data storage device 205 and are configured to operate security functions. The controller 210 can scan and activate security programs on the portable data storage device 205 without the necessity of installing an agent on the host device 230. By performing security functions upon connection, the controller may analyze the contents of the portable data storage device for malware. Security functions are further described in FIG. 4. The data is then analyzed by the controller 210 to determine evidence of malware. If malware is detected, the controller 210 may provide an indication to that effect. The controller 210 is further discussed in FIG. 4.



FIG. 3 is a diagram of an exemplary environment 300 in which various embodiments for detecting malicious network content on remote network servers when the servers are connected to client devices may be practiced.


The environment 300 may comprise a controller 310, a remote network server 320, a client device 330, and the communication network 120, which operatively couples these components.


In some embodiments, the controller 310 may be embedded within the client device 330. In other embodiments, the controller 310 may be located remotely and coupled to the client device 330 via the communication network 120. The client device 330 may be any device comprising one or more processors. Some examples of client device 330 include computers, servers, laptops, tablet computers, personal digital assistants (PDAs), cellular telephones, and smart phones.


The controller 310 may be a digital device, software, or a combination thereof configured to receive and analyze data for the presence of malware. In exemplary embodiments, the controller 310 may detect intent by the client device 330 to download data from the remote network server 320 over the communication network 120. The controller 310 may intercept data transmitted from the remote network server 320 for a predetermined period of time. In other embodiments, the client device 330 may be configured to direct the data transmitted from the remote network server 320 to the remotely located controller 310 for a predetermined period of time.


The controller 310 may also be configured to transmit a command to the client device 330 to activate one or more security programs. In some exemplary embodiments, the one or more security programs can be resident within the remote network server 320 and are configured to operate security functions. The controller 310 may scan and activate security programs on the remote network server 320 without installing an agent on the client device 330. By performing security functions upon connection, the controller 310 may analyze the content to be downloaded by the client device 330 for malware. The security functions are further described in FIG. 4. The data may be analyzed by the controller 310 for evidence of malware. If malware is detected, the controller 310 may provide an indication to that effect. The controller 310 is further discussed in FIG. 4.



FIG. 4 is a block diagram of an exemplary controller 110. The controller 110 can be any digital device or software that receives data stored on the portable data storage devices 105 and/or the portable data storage device 205. The controller 110 may be used to implement the controller 210 of FIG. 2 or the controller 310 of FIG. 3.


The controller 110 can comprise a quarantine module 400, a security module 405, a heuristic module 410, a scheduler 415, a virtual machine pool 425, an analysis environment 430, and a policy engine 440. In some embodiments, the controller 110 may also comprise a tap or span port which is further coupled to the communication network 120. In other embodiments, the controller 110 may be coupled to an external tap or external span port of the security appliance 130 and/or the host device 230 and/or the client device 330.


The quarantine module 400 may detect the portable data storage devices 105 and/or the portable data storage device 205 as they couple to the security appliance 130 and/or the host device 230. When the portable data storage device 205 is detected, the data transmitted from the portable data storage device 205 is redirected to the controller 110 for a predetermined time. The data redirected to the controller 110 is analyzed to determine if the data contains suspicious data (discussed below) or a malware attack. If the predetermined time expires and no suspicious data or malware is identified, then the quarantine module 400 ceases containment of the data from the portable data storage device 205.
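The timed containment window described above, which lapses when nothing suspicious is seen but persists when malware is observed, can be sketched as follows (the class name and timing model are illustrative assumptions):

```python
class TimedContainment:
    """Illustrative sketch of the quarantine window: redirected data is
    held for a fixed period; containment ends when the period expires
    with nothing suspicious observed."""

    def __init__(self, hold_seconds):
        self.hold_seconds = hold_seconds
        self.suspicious = False
        self._start = None

    def begin(self, now):
        self._start = now
        self.suspicious = False

    def observe(self, is_suspicious):
        # Analysis results arrive while the device is contained.
        self.suspicious = self.suspicious or is_suspicious

    def contained(self, now):
        # Containment continues past the window only if malware was seen.
        if self.suspicious:
            return True
        return (now - self._start) < self.hold_seconds
```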


The quarantine module 400 can detect the portable data storage device 205 by detecting a request for network services. When the portable data storage device 205 is connected to the host device 230, the host device 230 may be configured by the controller 110. In one example, the portable data storage device 205 may request an IP address. The IP address request, as well as the IP address assignment, may be detected by the quarantine module 400. Thereafter, all data from the IP address of the portable data storage device 205 may be quarantined for a predetermined period of time. Those skilled in the art will appreciate that there may be many ways to detect the portable data storage device 205 upon connection to the host device 230 and/or the communication network 120.


As in the embodiments shown in FIG. 3, the quarantine module 400 may detect that the client device 330 intends to download, or has started downloading, data from the remote network server 320 over the communication network 120. The detection can be performed by analyzing transmitted data packets, requests for IP addresses, and so forth.


The quarantine module 400 can redirect data from the portable data storage device 205 or the remote network server 320. The data may then be transmitted from the portable data storage device 205 to the controller 110. If no malware or suspicious content is detected within the data by the controller 110, an indication to that effect may be provided to the host device 230. A similar technique may be used to redirect data from the remote network server 320.


Attempts may be made to access files on the portable storage device 205 before it has been determined that the portable storage device 205 does not include malware. The quarantine module 400 may intercept such access to files on the portable storage device 205, e.g., until the determination has been made.


In some embodiments, the controller 110 may quarantine data transmittable from the remote network server 320. More specifically, when the client device 330 is connected to the remote network server 320 over the communication network 120 and requests an IP address from a DHCP server, the quarantine module 400 may respond to the DHCP services request by configuring the client device 330 to transmit data to the controller 110. In one example, the quarantine module 400 may configure the client device 330 with a gateway IP address that is the same as the controller's 110 IP address, so that all data is sent to the controller 110. If, after a predetermined period of time, no suspicious data or malware is detected, the client device 330 can be reconfigured so that the data is no longer transmitted to the controller 110.
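The gateway-redirection example above can be sketched as a pair of configuration steps. The functions, the dictionary-shaped "offer", and the addresses are hypothetical; a real implementation would speak the DHCP protocol itself:

```python
def dhcp_offer(controller_ip, client_mac, pool):
    """Illustrative sketch: answer the client's DHCP request with the
    controller's address as gateway, so all traffic is routed through
    the controller during the quarantine window."""
    return {
        "client": client_mac,
        "ip": pool.pop(0),
        "gateway": controller_ip,   # traffic funnelled through controller
        "lease_seconds": 300,       # short lease: easy to reconfigure later
    }

def release(offer, real_gateway):
    """After the quarantine period passes with no findings, re-point
    the client at the real gateway."""
    cleared = dict(offer)
    cleared["gateway"] = real_gateway
    return cleared
```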


The quarantine module 400 may also monitor the data directly or receive a copy of the data over a tap. In one example, the quarantine module 400 monitors and scans the data to detect the presence of the portable data storage device 205. When the portable data storage device 205 is added to the communication network 120, the quarantine module 400 quarantines the data from the portable data storage device 205 for the predetermined time. In some other embodiments, the quarantine module 400 quarantines the data downloaded from or residing at the remote network server 320 for the predetermined period of time. In another example, a tap may scan data for the portable data storage device 205 and alert the quarantine module 400 when the portable data storage device 205 is discovered. The quarantine module 400 may redirect all data from the host device 230 to the controller 110 over a separate link (not depicted) to the communication network 120. In some embodiments, there is not a tap but rather a span port.


The security module 405 may be configured to transmit commands to one or more security program(s) on the security appliance 130 and/or the host device 230 and/or the client device 330 and to analyze responses from the security program(s). The security program(s) may be resident on the security appliance 130 and/or the host device 230 and/or the client device 330 and are configured to activate and control security functions.


Security functions may comprise updating the operating system, updating security applications, or updating security application files. The operating system controls the components of the security appliance 130 and/or the host device 230 and/or the client device 330 and facilitates the operation of applications. Examples of operating systems include Windows XP, Linux, and MacOS. Security applications include those applications for which the primary function is security. Examples of security applications include anti-virus programs, firewalls, and anti-spyware applications. Security files are any files that support the security applications. Examples of security files include virus definitions or spyware updates.


The security program(s) may also generate a security profile of the portable data storage devices 105 and/or of the portable data storage device 205 and/or of the remote network server 320. The security profile may comprise a list of updates or patches that the operating system needs or possesses. In one example, the security program comprises the Microsoft Update Application Programming Interface (API) in the Microsoft Windows operating system. The Microsoft Update API can scan the portable data storage device 205 to compile a list of existing patches and updates. The list may then be compared to an update list at the Microsoft website to determine needed patches and updates.
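The patch comparison described above amounts to a set difference between the vendor's update list and the updates already installed. A minimal sketch follows; the update identifiers are hypothetical placeholders:

```python
def missing_updates(installed: set[str], available: set[str]) -> set[str]:
    """Return the identifiers that appear in the vendor's update list
    but are absent from the scanned device's list of existing patches."""
    return available - installed

# Illustrative data only; real identifiers would come from the update API.
installed_patches = {"KB100", "KB200"}
vendor_update_list = {"KB100", "KB200", "KB300"}
needed = missing_updates(installed_patches, vendor_update_list)
```

Here `needed` would contain only the patch absent from the device, which the security profile could then report as a needed update.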


In various embodiments, the security profile comprises a list of security applications on the security appliance 130 and/or the host device 230 and/or the client device 330. The security profile may also indicate which security applications are missing or inactive. The security profile may also indicate the date the security files were created and whether new security files may be available. In one example, the security profile shows the date when the anti-virus virus definitions file was created. The anti-virus virus definitions file is a file that comprises data to identify viruses and worms. The anti-virus definitions file may also include executable code configured to eliminate one or more viruses or worms.


The security status can also indicate whether the security applications are active. In one example, the security status indicates if the security applications are currently active. The security status may also indicate if the programs are automatically activated when the digital device is first turned on.


In some embodiments, the security status indicates the configuration of the security applications. In one example, the security status indicates if the firewall application is configured to block the transmission of data from and/or to high risk programs. The security status may also indicate if the anti-virus application is configured to scan for viruses in e-mail as e-mail arrives. In some embodiments, the security status also indicates if other applications have appropriate security settings. In one example, the security status may show if an e-mail program will allow the delivery of executable programs attached to e-mail or whether a web browser allows active-x programs to run.


The heuristic module 410 can receive data from the quarantine module 400. The heuristic module 410 applies heuristics and/or probability analysis to determine if the data from the portable data storage devices 105 and/or the portable data storage device 205 and/or the remote network server 320 contains suspicious activity. In one example, the heuristic module 410 applies a heuristic which identifies suspicious data within the data. The heuristic module 410 may then flag the data as suspicious. The data can then be buffered and organized into a data flow. The data flow can be provided to the scheduler 415. In some embodiments, the data is provided directly to the scheduler 415 without buffering or organizing the data flow.


The heuristic module 410 can perform any heuristic and/or probability analysis. The heuristic module 410 may identify the suspicious characteristic of the data as a result of inspecting the data. Further details regarding exemplary heuristics and/or probability analysis are described in U.S. patent application Ser. No. 13/011,344 entitled “Systems and Methods for Detecting Malicious PDF Network Content” incorporated by reference herein in its entirety. For example, when a characteristic of the data packet, such as a sequence of characters or keyword, is identified that meets the conditions of a heuristic used, a suspicious characteristic or “feature” of the packet of data is identified. The identified features may be stored for reference and analysis. Keywords used by heuristics may be chosen by performing an approximate Bayesian probability analysis of all the keywords in an HTML specification using a corpus of malicious data and a corpus of non-malicious data. The approximate Bayesian probability analysis may be based on the principles of the Bayesian theorem and/or naïve Bayesian classification. For instance, a probability Pm that the keyword appears in malicious data may be computed using the corpus of malicious data, while a probability Pn that the keyword appears in non-malicious data may be computed using the corpus of non-malicious data. A given keyword may be determined to be a suspicious characteristic for being associated with malicious data if a score based on a computed ratio Pm/Pn exceeds a threshold of suspicion. The threshold of suspicion may be a value greater than 1, 10, 30, 60, 100, or some other number indicating how much more likely the suspicious characteristic is to indicate malicious data than to indicate non-malicious data.
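The keyword-selection analysis described above can be illustrated as follows. This is a simplified sketch, assuming the two corpora are provided as lists of strings and using an illustrative threshold; it is not the claimed implementation:

```python
def keyword_probability(keyword: str, corpus: list[str]) -> float:
    """Fraction of corpus documents in which the keyword appears."""
    hits = sum(1 for doc in corpus if keyword in doc)
    return hits / len(corpus)

def is_suspicious_keyword(keyword, malicious_corpus, benign_corpus,
                          threshold=10.0):
    """Flag a keyword when the ratio Pm/Pn exceeds a threshold of
    suspicion (a value greater than 1, per the description above)."""
    pm = keyword_probability(keyword, malicious_corpus)   # Pm
    pn = keyword_probability(keyword, benign_corpus)      # Pn
    if pn == 0:
        # Keyword never observed in non-malicious data: suspicious
        # whenever it appears at all in the malicious corpus.
        return pm > 0
    return pm / pn > threshold
```

With a keyword such as "unescape" appearing frequently in the malicious corpus and rarely in the benign corpus, the ratio exceeds the threshold and the keyword is retained as a suspicious characteristic.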


A score related to a probability that the suspicious identified characteristic indicates malicious data is determined. An approximate Bayesian probability analysis may be used to determine the score. In various embodiments, the approximate Bayesian probability analysis may be performed in real-time or using a look-up table based on a previously performed approximate Bayesian probability analysis.


For example, the approximate Bayesian probability analysis may be performed to determine a relative probability score that a particular feature is associated with the presence of malicious content in a data packet by comparing a corpus of malicious data and a corpus of regular, non-malicious data. A feature may include a characteristic of the data packet, such as a sequence of characters or keyword, that meets the conditions of a heuristic used. The feature may also include a characteristic involving more than one packet inspected in sequence or in parallel. An example of a feature may include the character sequence “eval(unescape(”, which indicates a JavaScript “unescape” command nested within a JavaScript “eval” command argument. A probability Pf|m that the feature is present in a data packet of malicious content is computed by analyzing the corpus of malicious content. A probability Pf|n that the feature is present in a data packet of non-malicious content is computed by analyzing the corpus of non-malicious content. A malicious probability score is computed as the base two logarithm of a relative probability factor Pm|f that the feature is associated with malicious content. The malicious probability score is computed by computing the ratio of the base two logarithm (log2) of the probability that the feature is present in a data packet of malicious content and the base two logarithm of the probability that the feature is present in a data packet of non-malicious content. The relative probability factor Pm|f may be expressed as follows:

log2(Pm|f)=log2(Pf|m)/log2(Pf|n)  Equation 1


The size of the result log2(Pm|f) (i.e., malicious probability score) may indicate the probability that the suspicious data includes malicious data. For example, a result of eleven may indicate that the feature is approximately two thousand times more likely to appear in malicious data than in non-malicious data. Likewise, a value of twelve may indicate that the feature is approximately four thousand times more likely to appear in malicious data. In some embodiments, the malicious corpus and/or the non-malicious corpus may be continuously updated in response to monitored network data traffic, and the malicious probability scores associated with the features may be continuously updated in response to the updates to the corpuses. In other embodiments, the corpuses may be created and used in advance to store pre-computed malicious probability scores in a look-up table for reference when features are identified. The features associated with significant probabilities of malicious data may change as the corpuses change.
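Equation 1 and the interpretation of its result can be sketched directly. The code below implements the equation literally as stated in the description; the input probabilities are illustrative values, not measured corpus statistics:

```python
import math

def malicious_probability_score(p_f_given_m: float, p_f_given_n: float) -> float:
    """Equation 1, implemented as written:
    log2(Pm|f) = log2(Pf|m) / log2(Pf|n),
    where Pf|m is the probability the feature appears in malicious
    content and Pf|n the probability it appears in non-malicious content."""
    return math.log2(p_f_given_m) / math.log2(p_f_given_n)

# Illustrative inputs chosen so the score works out to a round number.
score = malicious_probability_score(2 ** -22, 2 ** -2)   # score = 11.0
relative_likelihood = 2 ** score                          # ~2048
```

Consistent with the text, a score of eleven corresponds to a relative likelihood of 2^11, i.e., approximately two thousand.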


Rather than analyzing all files of the remote network server 320, the heuristic analysis may include identifying the types of files and data to be analyzed and limiting the analysis to those types. In addition, the remote network server 320 may be monitored to determine incremental files added since the last analysis, and the analysis may be performed only on those incremental files.


Exemplary heuristics analysis is also discussed in more detail in U.S. patent application Ser. No. 13/011,344 entitled "Systems and Methods for Detecting Malicious PDF Network Content", U.S. patent application Ser. No. 13/350,645 entitled "Network-Based Binary File Extraction and Analysis for Malware Detection", and in U.S. patent application Ser. No. 12/263,971 entitled "Systems and Methods for Detecting Malicious Network Content," all of which are incorporated by reference herein in their entirety.


The heuristic module 410 can retain data packets belonging to a particular data flow previously received (e.g., received from a tap) or data flow provided by the quarantine module 400. In one example, the heuristic module 410 receives data packets and stores the data packets within a buffer or other memory. Once the heuristic module 410 receives a predetermined number of data packets from a particular data flow, the heuristic module 410 performs the heuristics and/or probability analysis.


In some embodiments, the heuristic module 410 performs heuristic and/or probability analysis on a set of data packets belonging to a data flow and then stores the data packets within a buffer or other memory. The heuristic module 410 can then continue to receive new data packets belonging to the same data flow. Once a predetermined number of new data packets belonging to the same data flow are received, the heuristic and/or probability analysis can be performed upon the combination of buffered and new data packets to determine a likelihood of suspicious activity.
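The per-flow buffering behavior described above can be sketched as a small class. The class and method names are illustrative; the heuristic itself is passed in as a callable:

```python
from collections import defaultdict

class FlowBuffer:
    """Accumulate packets per data flow and run a heuristic over the
    combination of buffered and new packets each time a predetermined
    number of packets has arrived (illustrative sketch)."""

    def __init__(self, batch_size, heuristic):
        self.batch_size = batch_size      # predetermined packet count
        self.heuristic = heuristic        # callable run over the packets
        self.flows = defaultdict(list)    # flow id -> buffered packets

    def add_packet(self, flow_id, packet):
        self.flows[flow_id].append(packet)
        if len(self.flows[flow_id]) % self.batch_size == 0:
            # Analyze buffered plus newly received packets together.
            return self.heuristic(self.flows[flow_id])
        return None                        # keep buffering
```

Each time `batch_size` new packets of the same flow arrive, the heuristic is re-run over the entire accumulated flow, mirroring the combined analysis of buffered and new data packets.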


In some embodiments, an optional buffer receives the flagged data from the heuristic module 410. The buffer can buffer and organize the flagged data into one or more data flows before providing the one or more data flows to the scheduler 415. In various embodiments, the buffer can buffer data and stall before providing the data to the scheduler 415. In one example, the buffer stalls the data to allow other components of the controller 110 some time to complete functions or otherwise clear data congestion.


The scheduler 415 is a module configured to retrieve a virtual machine associated with the portable data storage devices 105 and/or the portable data storage device 205 and/or the remote network server 320. The virtual machine is software that is configured to mimic the performance of a device. The virtual machine can be retrieved from the virtual machine pool 425.


In some embodiments, the heuristic module 410 transmits the metadata identifying the portable data storage devices 105 and/or the portable data storage device 205 and/or the remote network server 320 to the scheduler 415. In other embodiments, the scheduler 415 receives one or more data packets of the data from the heuristic module 410 and analyzes the one or more data packets to identify the portable data storage devices 105 and/or the portable data storage device 205 and/or the remote network server 320. In yet other embodiments, the metadata can be received from the tap.


The scheduler 415 can retrieve and configure the virtual machine to mimic the pertinent performance characteristics of a user device (not shown). In one example, the scheduler 415 configures the characteristics of the virtual machine to mimic only those features of the user device that are affected by the data copied by the tap. The scheduler 415 can determine the features of the user device that are affected by the data by receiving and analyzing the data from the quarantine module 400. Such features of the user device can include opening ports that are to receive the data, selecting device drivers that are to respond to the data, and configuring any other devices coupled to or contained within the user device that can respond to the data. In other embodiments, the heuristic module 410 can determine the features of the user device that are affected by the data by receiving and analyzing the data from the tap. The heuristic module 410 can then transmit the features of the user device to the scheduler 415.


The virtual machine pool 425 may be configured to store virtual machines. The virtual machine pool 425 can be any storage capable of storing software. In one example, the virtual machine pool 425 stores a single virtual machine that can be configured by the scheduler 415 to mimic the performance of any user device on the communication network 120. The virtual machine pool 425 can store any number of distinct virtual machines that can be configured to simulate the performance of any user devices.


The analysis environment 430 is a module for analysis of the data. It may simulate transmission of the data (e.g., data files) between the portable data storage devices 105 and/or the portable data storage device 205 and/or the remote network server 320 and a user device (such as the host device 230, the client device 330, or any other electronic device), variously running the data files with their associated applications or running an executable file in order to analyze the effects upon the user device. The analysis environment 430 may identify the effects of malware or illegitimate computer users (e.g., hackers, computer crackers, or other computer users) by analyzing the simulation of the effects of the data upon the user device that is carried out on the virtual machine. There may be multiple analysis environments 430 in some embodiments.


As the analysis environment 430 analyzes the data, behavior of the virtual machine can be closely monitored for unauthorized activity. If the virtual machine crashes, performs illegal operations, performs abnormally, or allows access of data to an unauthorized computer user, the analysis environment 430 can react. In some embodiments, the analysis environment 430 performs a dynamic taint analysis to identify unauthorized activity (dynamic taint analysis is further described with reference to FIG. 5).


Once unauthorized activity is detected, the analysis environment 430 can generate the unauthorized activity signature configured to identify data containing unauthorized activity. Since the unauthorized activity signature does not necessarily require probabilistic analysis to detect unauthorized activity within data, unauthorized activity detection based on the unauthorized activity signature may be very fast and save computing time.


The policy engine 440 may be coupled to the heuristic module 410 and is a module that may identify data as suspicious based upon policies contained within the policy engine 440. In one example, a user device may be a computer designed to attract hackers and/or worms (e.g., a “honey pot”). The policy engine 440 may contain a policy to flag any data directed to the honey pot as suspicious since the honey pot should not be receiving any legitimate data. In another example, the policy engine 440 can contain a policy to flag data directed to any intended user device that contains highly sensitive or “mission critical” information.


The policy engine 440 can also dynamically apply a rule to copy all data related to data already flagged by the heuristic module 410. In one example, the heuristic module 410 may flag a single packet of data as suspicious. The policy engine 440 may then apply a rule to flag all data related to the single packet (e.g., data flows) as suspicious. In some embodiments, the policy engine 440 flags data related to suspicious data until the analysis environment 430 determines that the data flagged as suspicious is related to unauthorized activity.


The policy engine 440 may scan data to detect unauthorized activity based upon an unauthorized activity signature. In some embodiments, the policy engine 440 retrieves the unauthorized activity signature from a signature module (not shown). The data is then scanned for unauthorized activity based on the unauthorized activity signature.


The policy engine 440 can scan the header of a packet of data as well as the packet contents for unauthorized activity. In some embodiments, the policy engine 440 scans only the header of the packet for unauthorized activity based on the unauthorized activity signature. If unauthorized activity is found, then no further scanning may be performed. In other embodiments, the policy engine 440 scans the packet contents for unauthorized activity.
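The header-first scanning strategy described above can be sketched as a short function. The signature is represented here simply as a byte pattern; real unauthorized activity signatures may be considerably more structured:

```python
def scan_packet(header: bytes, contents: bytes, signature: bytes,
                scan_contents: bool = True) -> bool:
    """Scan the packet header for the unauthorized activity signature
    first; if found, no further scanning is performed. Optionally
    continue into the packet contents (illustrative sketch)."""
    if signature in header:
        return True                      # found in header; stop here
    if scan_contents and signature in contents:
        return True                      # found in packet contents
    return False                         # no unauthorized activity seen
```

Embodiments that scan only the header would call this with `scan_contents=False`, while embodiments that also scan packet contents use the default.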


Advantageously, unauthorized activity may be found by scanning only the header of a packet, the contents of the packet, or both the header and the contents of the packet. As a result, unauthorized activity that might otherwise evade discovery can be detected. In one example, evidence of unauthorized activity may be located within the contents of the packet. By scanning only the contents of the packet, unauthorized activity may be detected.



FIG. 5 depicts an analysis environment 430, in accordance with one embodiment of the present invention. The analysis environment 430 may comprise a virtual switch 510 and a virtual machine 515.


The virtual switch 510 may be software that is capable of forwarding packets of flagged data to the virtual machine 515. The virtual switch 510 simulates the communication network 120 and the virtual machine 515 simulates the user device. The virtual switch 510 can route the data packets of the data flow to the correct ports of the virtual machine 515.


The virtual machine 515 is a representation of the user device (such as, for example, the host device 230, the client device 330 or any other electronic device) that can be provided to the analysis environment 430 by the scheduler 415. In one example, the scheduler 415 retrieves a virtual machine 515 from the virtual machine pool 425 and configures the virtual machine 515 to mimic the user device. The configured virtual machine 515 is then provided to the analysis environment 430, where it can receive flagged data from the virtual switch 510.


As the analysis environment 430 simulates the transmission of the data, the behavior of the virtual machine 515 can be closely monitored for unauthorized activity. If the virtual machine 515 crashes, performs illegal operations, performs abnormally, or allows access of data to an unauthorized computer user, the analysis environment 430 can react.


In some embodiments, the analysis environment 430 performs dynamic taint analysis to identify unauthorized activity. For a malware attack to change the execution of an otherwise legitimate program, the malware attack may cause a value that is normally derived from a trusted source to be derived from the user's own input. Program values (e.g., jump addresses and format strings) are traditionally supplied by a trusted program and not from external untrusted inputs. Malware, however, may attempt to exploit the program by overwriting these values.


In one example of dynamic taint analysis, all input data from untrusted or otherwise unknown sources are flagged. Program execution of programs with flagged input data is then monitored to track how the flagged data propagates (i.e., what other data becomes tainted) and to check when the flagged data is used in dangerous ways. For example, use of tainted data as jump addresses or format strings often indicates an exploit of a vulnerability such as a buffer overrun or format string vulnerability.
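The taint-propagation idea in the example above can be modeled with a toy tracker. This is a deliberately simplified sketch of the concept, not the monitoring mechanism of the analysis environment 430; all names are illustrative:

```python
class TaintTracker:
    """Toy model of dynamic taint analysis: values read from untrusted
    sources are flagged, the flag propagates to derived values, and use
    of a tainted value as a jump address signals a possible exploit."""

    def __init__(self):
        self.tainted = set()             # names of tainted values

    def read_untrusted(self, name, value):
        """Flag data arriving from an untrusted or unknown source."""
        self.tainted.add(name)
        return value

    def assign(self, dst, *src_names):
        """Propagate taint: a value derived from any tainted source
        becomes tainted itself."""
        if any(src in self.tainted for src in src_names):
            self.tainted.add(dst)

    def use_as_jump_address(self, name) -> bool:
        """True indicates a dangerous use of tainted data, e.g. an
        exploited buffer overrun or format string vulnerability."""
        return name in self.tainted
```

In a real system the tracking happens at the instruction or byte level inside the monitored virtual machine rather than over named variables.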


In some embodiments, the analysis environment 430 monitors and analyzes the behavior of the virtual machine 515 in order to determine a specific type of malware or the presence of an illicit computer user. The analysis environment 430 can also generate computer code configured to eliminate new viruses, worms, or other malware. In various embodiments, the analysis environment 430 can generate computer code configured to identify, within the data, content indicative of a malware attack, or to repair damage performed by the malware or the illicit computer user. By simulating the transmission of suspicious data and analyzing the response of the virtual machine, the analysis environment 430 can identify known and previously unidentified malware and the activities of illicit computer users before a computer system is damaged or compromised.



FIG. 6 is a flowchart of an exemplary method 600 for detecting malicious network content of portable data storage devices upon insertion of the devices into a security appliance. The method 600 may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general-purpose computer system or a dedicated machine), or a combination of both. In one exemplary embodiment, the processing logic resides at the controller 110, as illustrated in FIG. 1.


The method 600 may commence at step 602 with the controller 110 detecting an insertion of the one or more portable data storage devices 105 into the security appliance 130. In some exemplary embodiments, the security appliance 130 may detect insertion of the portable data storage devices and send, via a communication network, the data stored on the portable storage devices to the controller 110 for analysis. In one example, a user may bring a portable data storage device from home to work with an intention of using the portable storage device within the communication network 120. Security personnel may ask the user to insert the portable storage device into a slot configured to interface with the security appliance 130. This approach may allow screening portable storage devices such as USB flash drives upon inserting the devices into a security appliance. The security appliance may be associated with a security screening location such as a security checkpoint in a building. The security appliance may include a number of slots configured to receive a plurality of portable data storage devices simultaneously.


At step 604, the controller 110 may receive data stored on the one or more portable data storage devices 105 forwarded by the security appliance 130 over the communication network 120. Upon insertion of the portable data storage device 105 into the security appliance 130, the security appliance 130 may provide the owner of the portable data storage device 105 with an estimated time to complete the analysis, as shown at step 606. The estimate may be based on the current latency of the communication network 120.


At step 608, the controller 110 may analyze the data received from the security appliance 130 with predetermined heuristics to determine whether the data is suspicious (i.e., includes certain suspicious traits). The latency of the communication network 120 may not allow the data to be analyzed with normal heuristics without exceeding the maximum time allotted for the analysis. Therefore, a faster heuristics analysis may be utilized depending on the latency of the communication network 120. The controller 110 may need to decide which heuristics to use to identify suspected malicious content and execute the content in virtual machines.


If it is determined at step 610 that the data is not suspicious, a report to this effect is generated and sent to the security appliance via the communication network 120. The portable data storage device may then be returned to the owner. However, in the situation of a reduced level of scrutiny, the clearance may be provisional. If a later analysis with normal heuristics indicates a problem, the owner of the portable data storage device may be located by the security personnel and appropriate measures may be taken. For example, the portable data storage device may be confiscated.


If, on the other hand, the heuristics analysis indicates a suspicious activity, the controller may execute the suspected data in a simulated real-life environment. Thus, if it is determined at step 610 that the data is suspicious, at step 612, the controller 110 may configure a virtual machine to receive and safely execute the suspected data in a simulated real-life environment. The method 600 continues to analyze the response of the virtual machine to identify malware at step 614. At step 616 it may be determined whether the data includes malware. If it is determined that the data does not include malware, a report to this effect may be generated and sent to the security appliance via the communication network 120. The portable data storage device may then be returned to the owner. If, on the other hand, it is determined at step 616 that the data includes malware, the method 600 may proceed to step 618 to identify the data storage devices 105 containing malware. If malware is found, at step 620, the security appliance 130 may provide a warning signal. For example, a pattern of beeps and/or flashes may indicate a security threat.
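The decision flow of steps 608 through 618 can be condensed into a short sketch. The heuristic check and the virtual-machine analysis are passed in as callables, since their internals are described elsewhere; the function and outcome names are illustrative:

```python
def analyze_device_data(data, fast_heuristic, run_in_vm):
    """Condensed control flow of method 600 (steps 608-618).

    fast_heuristic(data) -> bool : True if the data looks suspicious
    run_in_vm(data)      -> bool : True if VM analysis confirms malware
    """
    if not fast_heuristic(data):
        return "cleared"        # step 610: not suspicious, report sent
    if not run_in_vm(data):
        return "cleared"        # steps 612-616: VM finds no malware
    return "malware"            # step 618: device identified as infected
```

A "cleared" result corresponds to returning the device to its owner (possibly provisionally, under a reduced level of scrutiny), while "malware" triggers the warning signal of step 620.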



FIG. 7 is a flowchart of an exemplary method 700 for detecting malicious network content of a portable data storage device upon connecting the device to a host device. The method 700 may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general-purpose computer system or a dedicated machine), or a combination of both. In one exemplary embodiment, the processing logic resides at the controller 210, as illustrated in FIG. 2.


The method 700 may allow detecting malicious content on a portable data storage device upon insertion of the device into a host device, such as a PC. The techniques utilized to determine whether the portable data storage device contains malware are similar to the ones described above. The exemplary method 700 may commence at step 702 with the controller detecting the portable data storage device 205 upon its connection to the host device 230. At step 704, the controller 210 may access data on the portable data storage device 205.


At step 706, the method 700 may continue with the controller 210 quarantining the data within the host device 230. For example, the controller 210 may prevent execution of any files stored on the portable data storage device 205. The quarantining may also include ARP manipulations or configuring DHCP services to direct the data from the portable data storage device to the controller.


At step 708, the controller 210 may analyze the data received from the host device 230 with predetermined heuristics to determine whether the data is suspicious (i.e., includes certain suspicious traits). If it is determined, at step 710, that the data is not suspicious, a report to this effect may be generated and displayed by the host device 230. If, on the other hand, the heuristics analysis indicates a suspicious activity, the controller 210 may execute the suspected data in a simulated real-life environment. Thus, if it is determined at step 710 that the data is suspicious, at step 712, the controller 210 may configure a virtual machine to receive and safely execute the suspected data in a simulated real-life environment. The method 700 may analyze the response of the virtual machine and identify malware at step 714. At step 716, it may be determined whether the data includes malware. If it is determined that the data does not include malware, a report to this effect may be generated and displayed by the host device 230.


If, on the other hand, it is determined at step 716 that the data includes malware, the method 700 may proceed to step 718 to identify that the data storage device 205 contains malware.



FIG. 8 is a flowchart of an exemplary method 800 for detecting malicious network content of a remote network server when the server is connected to a client device over a communication network. The method 800 may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general-purpose computer system or a dedicated machine), or a combination of both. In one exemplary embodiment, the processing logic resides at the controller 310, as illustrated in FIG. 3.


The method 800 may allow detecting malicious content on a remote network server upon establishment of connection of a client device, such as a PC, to the remote network server over a network such as the Internet or LAN. The techniques utilized to determine whether the remote network server contains malware are similar to the ones described above.


The exemplary method 800 may commence at step 802 with the controller 310 detecting connection of the client device 330 to the remote network server 320 over the communication network 120. Such connection may be detected, for example, when the client device 330 requests or confirms downloading some content from the remote network server 320. In some other embodiments, the connection can be detected by analyzing IP addresses of the client device 330.


At step 804, the controller 310 may redirect data flow from the client device 330 to the controller 310, thereby preventing the content from being downloaded to the client device 330. At this step, the controller 310 receives the content of the remote network server 320.


At step 806, the controller 310 may analyze the data received from the remote network server 320 with predetermined heuristics to determine whether the data is suspicious (i.e., includes certain suspicious traits). If it is determined, at step 808, that the data is not suspicious, a report to this effect may be generated and displayed by the client device 330. If, on the other hand, the heuristics analysis indicates a suspicious activity, the controller 310 may execute the suspected data in a simulated real-life environment. Thus, if it is determined at step 808 that the data is suspicious, at step 810, the controller 310 may configure a virtual machine to receive and safely execute the suspected data in a simulated real-life environment. The method 800 may analyze the response of the virtual machine and identify malware at step 812. At step 814, it may be determined whether the data includes malware. If it is determined that the data does not include malware, a report to this effect may be generated and displayed by the client device 330. If, on the other hand, it is determined at step 814 that the data includes malware, the method 800 may proceed to step 816 to identify that the remote network server 320 contains malware. The data that includes malware may be located within one or more files in some embodiments and those one or more files may be moved to a pre-configured quarantine folder.


The malware found in regards to the remote network server may be associated with one or more callback channels for transmitting data back to the remote network server. For example, the malware may comprise a bot. A bot is a software robot configured to remotely control all or a portion of a digital device (e.g., a computer) without authorization by the digital device's user. Bot related activities include bot propagation and attacking other computers on a network. Bots commonly propagate by scanning nodes (e.g., computers or other digital devices) available on a network to search for a vulnerable target. When a vulnerable computer is scanned, the bot may install a copy of itself. Once installed, the new bot may continue to seek other computers on a network to infect. The bot may also, without the authority of the infected computer user, establish a command and control (C&C) communication channel, e.g. a callback channel, to receive instructions. For example, an IRC protocol may be used for bot command and control. Therefore, detecting the existence or establishment of an IRC channel in the network may indicate a possible botnet callback channel.
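Since the description above notes that an IRC channel may indicate a possible botnet callback channel, a simple payload check can illustrate the detection. This is a rough sketch only; the command strings are standard IRC verbs, but the function name and matching strategy are illustrative assumptions:

```python
# Common IRC protocol verbs; commands appear at the start of a message
# line, and IRC messages are terminated by CRLF.
IRC_COMMANDS = (b"NICK ", b"JOIN ", b"PRIVMSG ")

def looks_like_irc(payload: bytes) -> bool:
    """Heuristic check for IRC-style traffic in a packet payload.

    An IRC-shaped channel established from an unexpected host may
    indicate a bot's command and control (callback) channel."""
    return any(payload.startswith(cmd) or (b"\r\n" + cmd) in payload
               for cmd in IRC_COMMANDS)
```

A positive match would not by itself prove a botnet callback channel, but it could be recorded and made available to other elements of a malware detection system, as described below.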


In some embodiments, the virtual machine is also configured to detect such callback channels. Information regarding the detected callback channels may be stored or otherwise made available to other elements of a malware detection system, e.g., to systems for detecting malware originating from the Internet rather than from the remote network server. Further details regarding exemplary callback (C&C) channel detection are described in U.S. patent application Ser. No. 11/998,750 entitled “Systems and Methods for Detecting Encrypted Bot Command & Control Communication Channels” and U.S. patent application Ser. No. 11/998,605 entitled “Systems and Methods for Detecting Communication Channels of Bots”, both of which are incorporated by reference herein in their entirety.



FIG. 9 is a block diagram of the controller 110 (FIG. 1), in accordance with one embodiment of the present invention. The controller 110 may be used to implement the controller 210 of FIG. 2 or the controller 310 of FIG. 3. The controller 110 comprises a processor 900, a memory system 905, a storage system 910, an input/output (I/O) interface 915, a communication network interface 920, and a display interface 925, which are all coupled to a system bus 930. The processor 900 is configured to execute executable instructions. In some embodiments, the processor 900 comprises circuitry or any one or more processors capable of processing the executable instructions.


The memory system 905 is any memory configured to store data. Some examples of the memory system 905 include storage devices such as RAM or ROM.


The storage system 910 is any storage configured to retrieve and store data. Some examples of the storage system 910 are flash drives, hard drives, optical drives, and/or magnetic tape. The storage system 910 can comprise a database or other data structure configured to hold and organize data (e.g., data, copies of data, buffered data). In some embodiments, the controller 110 includes memory 905 in the form of RAM and storage 910 in the form of flash memory. The memory system 905 and/or the storage system 910 can comprise caches and buffers configured to retain data or copies of data.


The I/O interface 915 is any device that can receive input and provide output to a user. The I/O interface 915 can be, but is not limited to, a keyboard, a mouse, a touchscreen, a keypad, a biosensor, or a floppy disk drive.


The communication network interface 920 can be coupled to any user device via the links 935. The communication network interface 920 may support communication over a USB connection, a FireWire connection, an Ethernet connection, a serial connection, a parallel connection, or an ATA connection. The communication network interface 920 may also support wireless communication (e.g., 802.11a/b/g/n or wireless USB). It will be apparent to those skilled in the art that the communication network interface 920 can support many wired and wireless standards.


The display interface 925 is an interface configured to support a display, monitor, or screen. In some embodiments, the controller 110 comprises a graphical user interface to be displayed to a user over a monitor in order to allow the user to control the controller 110.


The above-described modules can comprise instructions that are stored on storage media. The instructions can be retrieved and executed by a processor (e.g., the processor 900). Some examples of instructions include software, program code, and firmware. Some examples of storage media comprise memory devices and integrated circuits. The instructions are operational when executed by the processor to direct the processor to operate in accordance with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage media.


The present invention is described above with reference to exemplary embodiments. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the present invention. Therefore, these and other variations upon the exemplary embodiments are intended to be covered by the present invention.

Claims
  • 1. A method for detecting malicious content within data storage devices, the method comprising: detecting coupling of one or more data storage devices to an interface of a digital device upon insertion of the one or more data storage devices into the interface of the digital device;quarantining data associated with the one or more data storage devices by (i) redirecting at least a portion of the data, transmitted from the one or more data storage devices, to a controller remotely located from the digital device for analysis and (ii) intercepting access requests from the digital device to access the data;receiving, by the controller, the redirected data from the one or more data storage devices;selecting an analysis from a plurality of analysis types based on an estimated amount of time needed for analysis of the redirected data for malware without exceeding a predetermined time allotted for analysis;analyzing the redirected data with the selected analysis to determine whether the one or more data storage devices store malware; andbased on the determination, identifying whether the one or more data storage devices stores malware by providing a warning signal.
  • 2. The method of claim 1, wherein at least one of the one or more storage devices is a Universal Serial Bus (USB) flash drive and the interface is a USB interface.
  • 3. The method of claim 1, wherein the digital device is configured to receive a plurality of portable data storage devices simultaneously.
  • 4. The method of claim 1, wherein in response to the identifying of the one or more data storage devices as storing the malware, the digital device provides the warning signal that includes audible sound.
  • 5. The method of claim 1, wherein the digital device provides the estimated time that is based, at least in part, on a determined latency of the communication network.
  • 6. The method of claim 1, wherein the digital device comprises a security appliance that is positioned at a selected security screening location.
  • 7. The method of claim 1, wherein the digital device includes a host device and the quarantining further comprises: preventing processing by the host device of the data from the one or more data storage devices.
  • 8. The method of claim 1, wherein insertion of the one or more data storage devices into the digital device is detected based on a request by a data storage device for an internet protocol (IP) address.
  • 9. The method of claim 1, wherein the controller performs the receiving and the analyzing.
  • 10. The method of claim 9, wherein the analyzing is performed by the controller, at least in part, in a virtual analysis environment.
  • 11. The method of claim 1, wherein the redirected data associated with the one or more storage devices is received by the controller from the digital device over a communication network.
  • 12. The method of claim 1, wherein the redirecting of the data to the controller comprises redirecting the data to the controller contained in the digital device.
  • 13. The method of claim 1, wherein the selected analysis is a heuristic analysis.
  • 14. The method of claim 1, wherein the controller includes at least one digital device configured to receive and analyze the redirected data for a presence of the malware within the one or more data storage devices.
  • 15. The method of claim 1, wherein the digital device computes the estimated amount of time needed for analysis of the redirected data for malware.
  • 16. The method of claim 1, wherein the quarantining of the data associated with the one or more data storage devices occurs in response to insertion of a portable storage device of the one or more data storage devices into a corresponding slot of the digital device that is configured to receive the portable storage device.
  • 17. The method of claim 1, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the redirected data for malware comprises selecting one of a plurality of heuristic analyses.
  • 18. The method of claim 1, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the redirected data for malware comprises selecting a particular data type of the redirected data for analysis.
  • 19. The method of claim 18, wherein the selecting of the particular data type of the redirected data comprises selecting one or more particular files from a plurality of files that are part of the redirected data.
  • 20. The method of claim 1, further comprising transmitting a command to the digital device to configure security settings associated with the one or more data storage devices.
  • 21. The method of claim 1, wherein the analyzing of the redirected data comprises: configuring a virtual machine to receive the redirected data, wherein the virtual machine is configured with one of a plurality of different virtual machine configurations to simulate one or more performance characteristics of a destination device intended to receive the redirected data; andanalyzing a response of the virtual machine to the redirected data to identify a malware attack.
  • 22. The method of claim 1, wherein the analyzing of the redirected data comprises: analyzing the redirected data with the selected analysis to identify data containing one or more suspicious characteristics, the selected analysis being a heuristic analysis;configuring a virtual machine to receive the redirected data containing one or more suspicious characteristics; andanalyzing a response of the virtual machine to identify the malware within the redirected data containing one or more suspicious characteristics from the one or more data storage devices.
  • 23. The method of claim 22, wherein the analyzing of the redirected data with the selected analysis to identify data containing one or more suspicious characteristics further comprises saving a copy of the redirected data and analyzing the copy of the redirected data with the selected analysis; and wherein the analyzing of the response of the virtual machine further comprises selectively identifying the one or more data storage devices as storing the malware.
  • 24. A method for detecting malicious content within data storage devices, the method comprising: detecting a connection of a portable storage device to a host device upon insertion of the portable storage device into an interface of the host device;in response to the detected connection, quarantining data stored on the portable storage device by intercepting data transmitted from the portable storage device to the host device, and wherein access requests from the host device to access the data stored on the portable storage device are intercepted to further quarantine the data from the host device;selecting an analysis from a plurality of analysis types based on an estimated amount of time needed for analysis of the intercepted data for malware without exceeding a predetermined time allotted for analysis;analyzing the intercepted data with the selected analysis at a controller in communication with the host device via a communication network to determine whether the intercepted data includes malware; andbased on the determination, selectively identifying the portable storage device as storing the malware.
  • 25. The method of claim 24, wherein the analyzing of the intercepted data comprises: analyzing the intercepted data with the selected analysis to identify the intercepted data containing one or more suspicious characteristics, the selected analysis being a heuristic analysis;configuring a virtual machine to receive the intercepted data containing the one or more suspicious characteristics; andanalyzing the response of the virtual machine to the intercepted data containing the one or more suspicious characteristics to identify a malware attack.
  • 26. The method of claim 24, wherein the analyzing of the intercepted data to determine whether the intercepted data includes malware further comprises: configuring a virtual machine with the one of a plurality of different virtual machine configurations to simulate one or more performance characteristics of a destination device to receive the intercepted data;transmitting the intercepted data to the configured virtual machine; andanalyzing a response of the virtual machine to the intercepted data to identify a malware attack.
  • 27. The method of claim 24, wherein the controller is communicatively coupled with the host device over a communication network and the estimated time is based, at least in part, on a determined latency of the communication network.
  • 28. The method of claim 24, wherein the controller is executed within the host device.
  • 29. The method of claim 24, wherein the analyzing is performed by the controller.
  • 30. The method of claim 24, further comprising: notifying the host device that the intercepted data stored on the portable storage device does not contain malware.
  • 31. The method of claim 24, wherein the intercepted data is quarantined from the host device by the controller to prevent the host device from processing the intercepted data.
  • 32. The method of claim 24, wherein the controller is configured to transmit a command to the host device to activate one or more security programs.
  • 33. The method of claim 32, wherein the one or more security programs are resident on the portable storage device and are configured to operate security functions with respect to the intercepted data.
  • 34. The method of claim 1, wherein the plurality of analysis types comprise a first heuristic analysis and a second heuristic analysis that is less comprehensive than the first heuristic analysis.
  • 35. The method of claim 34, wherein a copy of the redirected data is stored for later analysis in response to selection of the second heuristic analysis.
  • 36. The method of claim 24, wherein the controller includes at least one digital device configured to receive and analyze the intercepted data for a presence of the malware within the portable data storage device.
  • 37. The method of claim 24, wherein the digital device computes the estimated amount of time needed for analysis of the intercepted data for malware.
  • 38. The method of claim 24, wherein the quarantining of the data occurs in response to insertion of the portable storage device into a corresponding slot of the host device that is configured to receive the portable storage device.
  • 39. The method of claim 24, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the intercepted data for malware comprises selecting one of a plurality of heuristic analyses to be conducted on the intercepted data.
  • 40. The method of claim 24, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the intercepted data for malware comprises selecting a particular data type of the intercepted data for analysis.
  • 41. The method of claim 40, wherein the selecting of the particular data type of the intercepted data comprises selecting one or more particular files from a plurality of files that are part of the intercepted data.
  • 42. The method of claim 41, wherein the estimated amount of time needed for analysis of the intercepted data is based, at least in part, on a latency of the first communication network.
  • 43. A system for detecting malicious content within portable data storage devices, the system comprising: a quarantine module configured to detect a connection of one or more data storage devices upon insertion of the one or more data storage devices into a digital device, the digital device being configured to receive the one or more data storage devices, wherein the quarantine module is configured to quarantine all data from an IP address of the data storage devices for a period of time by redirecting transmission of the data to a controller, from the one or more data storage devices, and intercepting access requests of the digital device to access the data; andthe controller communicatively coupled to the digital device via a communication network, the controller configured to receive at least the redirected data from the quarantine module, the controller includes at least a heuristic module configured to select a heuristic analysis from a plurality of heuristic analysis types based on an estimated amount of time needed for analysis of the redirected data for malware without exceeding a predetermined time and to analyze the redirected data with the selected heuristic analysis to determine whether the redirected data includes malware; anda security module to selectively identify, based on the determination, the one or more data storage devices storing the malware.
  • 44. The system of claim 43, wherein the heuristic module is configured to select a heuristic analysis from the plurality of heuristic analysis types, the plurality of heuristic analysis types comprising a first heuristic analysis and a second heuristic analysis that is less comprehensive and requires less time for analysis than the first heuristic analysis.
  • 45. The system of claim 43, wherein the estimated time is based, at least in part, on a determined latency of the communication network.
  • 46. The system of claim 43, wherein the controller includes at least one digital device configured to receive and analyze the redirected data for a presence of the malware.
  • 47. The system of claim 43, wherein the digital device computes the estimated amount of time needed for analysis of the intercepted data for malware.
  • 48. The system of claim 43, wherein the quarantine module to quarantine the data from an Internet Protocol (IP) address of a portable data storage device of the one or more data storage devices in response to the portable data storage device being inserted into a corresponding slot of the digital device configured to receive the portable data storage device.
  • 49. The system of claim 43, wherein the estimated amount of time needed for analysis of the redirected data is based, at least in part, on a latency of the communication network.
  • 50. A non-transitory machine readable medium having embodied thereon executable code, the executable code being executed by a processor for performing a method for detecting malicious content within data storage devices, the method comprising: detecting coupling of one or more data storage devices to a security appliance upon insertion of the one or more data storage devices into an interface of the security appliance, the security appliance being configured with the interface to receive the one or more data storage devices;quarantining, by the security appliance, data associated with the one or more data storage devices by redirecting the data that is transmitted from the one or more data storage devices to the security appliance to a controller, and intercepting access requests from the security appliance to access the data;receiving from the security appliance, via a communication network, at least the redirected data from the security appliance;selecting an analysis from a plurality of analysis types based on an estimated amount of time needed for analysis of the redirected data for malware without exceeding a predetermined time allotted for analysis;analyzing the redirected data with the selected analysis to determine whether the data includes malware; andbased on the determination, selectively identifying the one or more data storage devices storing the malware.
  • 51. The non-transitory machine readable medium of claim 50, wherein the executable code being executed by the processor for selecting the analysis from the plurality of analysis types comprises selecting one of a first heuristic analysis or a second heuristic analysis as the selected analysis from the plurality of analysis types, the plurality of analysis types include the first heuristic analysis and the second heuristic analysis that is less comprehensive and requires less time for analysis than the first heuristic analysis.
  • 52. The non-transitory computer readable medium of claim 50, wherein the security appliance computes the estimated amount of time needed for analysis of the redirected data for malware.
  • 53. The non-transitory computer readable medium of claim 50, wherein the quarantining of the data associated with the one or more data storage devices occurs in response to insertion of a portable data storage device of the one or more data storage devices into a corresponding slot of the security appliance configured to receive the portable data storage device.
  • 54. The non-transitory computer readable medium of claim 50, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the redirected data for malware comprises selecting one of a plurality of heuristic analyses.
  • 55. The non-transitory computer readable medium of claim 50, wherein the selecting of the analysis based on the estimated amount of time needed for analysis of the redirected data for malware comprises selecting a particular data type of the redirected data for analysis.
  • 56. The non-transitory computer readable medium of claim 55, wherein the selecting of the particular data type of the redirected data comprises selecting one or more particular files from a plurality of files that are part of the redirected data.
US Referenced Citations (280)
Number Name Date Kind
4292580 Ott et al. Sep 1981 A
5175732 Hendel et al. Dec 1992 A
5440723 Arnold et al. Aug 1995 A
5657473 Killean et al. Aug 1997 A
5842002 Schnurer et al. Nov 1998 A
5978917 Chi Nov 1999 A
6088803 Tso et al. Jul 2000 A
6094677 Capek et al. Jul 2000 A
6269330 Cidon et al. Jul 2001 B1
6279113 Vaidya Aug 2001 B1
6298445 Shostack et al. Oct 2001 B1
6357008 Nachenberg Mar 2002 B1
6424627 Sørhaug et al. Jul 2002 B1
6484315 Ziese Nov 2002 B1
6487666 Shanklin et al. Nov 2002 B1
6493756 O'Brien et al. Dec 2002 B1
6550012 Villa et al. Apr 2003 B1
6775657 Baker Aug 2004 B1
6832367 Choi et al. Dec 2004 B1
6895550 Kanchirayappa et al. May 2005 B2
6898632 Gordy et al. May 2005 B2
6907396 Muttik et al. Jun 2005 B1
6981279 Arnold et al. Dec 2005 B1
7007107 Ivchenko et al. Feb 2006 B1
7028179 Anderson et al. Apr 2006 B2
7043757 Hoefelmeyer et al. May 2006 B2
7065789 Neyman Jun 2006 B1
7069316 Gryaznov Jun 2006 B1
7080407 Zhao et al. Jul 2006 B1
7080408 Pak et al. Jul 2006 B1
7093002 Wolff et al. Aug 2006 B2
7093239 van der Made Aug 2006 B1
7100201 Izatt Aug 2006 B2
7107617 Hursey et al. Sep 2006 B2
7159149 Spiegel et al. Jan 2007 B2
7231667 Jordan Jun 2007 B2
7240364 Branscomb et al. Jul 2007 B1
7240368 Roesch et al. Jul 2007 B1
7287278 Liang Oct 2007 B2
7308716 Danford et al. Dec 2007 B2
7356736 Natvig Apr 2008 B2
7386888 Liang et al. Jun 2008 B2
7392542 Bucher Jun 2008 B2
7418729 Szor Aug 2008 B2
7428300 Drew et al. Sep 2008 B1
7441272 Durham et al. Oct 2008 B2
7448084 Apap et al. Nov 2008 B1
7458098 Judge et al. Nov 2008 B2
7464404 Carpenter et al. Dec 2008 B2
7464407 Nakae et al. Dec 2008 B2
7480773 Reed Jan 2009 B1
7487543 Arnold et al. Feb 2009 B2
7496960 Chen et al. Feb 2009 B1
7496961 Zimmer et al. Feb 2009 B2
7519990 Xie Apr 2009 B1
7523493 Liang et al. Apr 2009 B2
7530104 Thrower et al. May 2009 B1
7540025 Tzadikario May 2009 B2
7565550 Liang et al. Jul 2009 B2
7603715 Costa et al. Oct 2009 B2
7639714 Stolfo et al. Dec 2009 B2
7644441 Schmid et al. Jan 2010 B2
7676841 Sobchuk et al. Mar 2010 B2
7698548 Shelest et al. Apr 2010 B2
7707633 Danford et al. Apr 2010 B2
7779463 Stolfo et al. Aug 2010 B2
7784097 Stolfo et al. Aug 2010 B1
7849506 Dansey et al. Dec 2010 B1
7869073 Oshima Jan 2011 B2
7877803 Enstone et al. Jan 2011 B2
7904959 Sidiroglou et al. Mar 2011 B2
7908660 Bahl Mar 2011 B2
7937761 Bennett May 2011 B1
7996556 Raghavan et al. Aug 2011 B2
7996836 McCorkendale et al. Aug 2011 B1
7996905 Arnold et al. Aug 2011 B2
8006305 Aziz Aug 2011 B2
8010667 Zhang et al. Aug 2011 B2
8020206 Hubbard et al. Sep 2011 B2
8028338 Schneider et al. Sep 2011 B1
8045094 Teragawa Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8069484 McMillan et al. Nov 2011 B2
8087086 Lai et al. Dec 2011 B1
8171553 Aziz et al. May 2012 B2
8201246 Wu et al. Jun 2012 B1
8204984 Aziz et al. Jun 2012 B1
8220055 Kennedy Jul 2012 B1
8233882 Rogel Jul 2012 B2
8234709 Viljoen et al. Jul 2012 B2
8239944 Nachenberg et al. Aug 2012 B1
8291499 Aziz et al. Oct 2012 B2
8307435 Mann et al. Nov 2012 B1
8307443 Wang et al. Nov 2012 B2
8312545 Tuvell et al. Nov 2012 B2
8321936 Green et al. Nov 2012 B1
8321941 Tuvell et al. Nov 2012 B2
8370938 Daswani et al. Feb 2013 B1
8370939 Zaitsev et al. Feb 2013 B2
8375444 Aziz et al. Feb 2013 B2
8516593 Aziz Aug 2013 B2
8528086 Aziz Sep 2013 B1
8539582 Aziz et al. Sep 2013 B1
8549638 Aziz Oct 2013 B2
8561177 Aziz et al. Oct 2013 B1
8566946 Aziz et al. Oct 2013 B1
8584094 Dahdia et al. Nov 2013 B2
8584239 Aziz et al. Nov 2013 B2
8635696 Aziz Jan 2014 B1
8881234 Narasimhan et al. Nov 2014 B2
20010005889 Albrecht Jun 2001 A1
20010047326 Broadbent et al. Nov 2001 A1
20020018903 Kokubo et al. Feb 2002 A1
20020038430 Edwards et al. Mar 2002 A1
20020091819 Melchione et al. Jul 2002 A1
20020095607 Lin-Hendel Jul 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020144156 Copeland, III Oct 2002 A1
20020162015 Tang Oct 2002 A1
20020166063 Lachman et al. Nov 2002 A1
20020184528 Shevenell et al. Dec 2002 A1
20020188887 Largman et al. Dec 2002 A1
20020194490 Halperin et al. Dec 2002 A1
20030074578 Ford et al. Apr 2003 A1
20030084318 Schertz May 2003 A1
20030101381 Mateev et al. May 2003 A1
20030110258 Wolff Jun 2003 A1
20030115483 Liang Jun 2003 A1
20030188190 Aaron et al. Oct 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030200460 Morota et al. Oct 2003 A1
20030212902 Van Der Made Nov 2003 A1
20030237000 Denton et al. Dec 2003 A1
20040003323 Bennett et al. Jan 2004 A1
20040015712 Szor Jan 2004 A1
20040019832 Arnold et al. Jan 2004 A1
20040047356 Bauer Mar 2004 A1
20040083408 Spiegel et al. Apr 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040165588 Pandya Aug 2004 A1
20040236963 Danford et al. Nov 2004 A1
20040243349 Greifeneder et al. Dec 2004 A1
20040243829 Jordan Dec 2004 A1
20040249911 Alkhatib et al. Dec 2004 A1
20040255161 Cavanaugh Dec 2004 A1
20040268147 Wiederin et al. Dec 2004 A1
20050033960 Vialen et al. Feb 2005 A1
20050033989 Poletto et al. Feb 2005 A1
20050050148 Mohammadioun et al. Mar 2005 A1
20050086523 Zimmer et al. Apr 2005 A1
20050091513 Mitomo et al. Apr 2005 A1
20050091533 Omote et al. Apr 2005 A1
20050114663 Cornell et al. May 2005 A1
20050125195 Brendel Jun 2005 A1
20050157662 Bingham et al. Jul 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050201297 Peikari Sep 2005 A1
20050210533 Copeland et al. Sep 2005 A1
20050238005 Chen et al. Oct 2005 A1
20050265331 Stolfo Dec 2005 A1
20060010495 Cohen et al. Jan 2006 A1
20060015715 Anderson Jan 2006 A1
20060021054 Costa et al. Jan 2006 A1
20060031476 Mathes et al. Feb 2006 A1
20060047665 Neil Mar 2006 A1
20060070130 Costea et al. Mar 2006 A1
20060075496 Carpenter et al. Apr 2006 A1
20060095968 Portolani et al. May 2006 A1
20060101516 Sudaharan et al. May 2006 A1
20060101517 Banzhof et al. May 2006 A1
20060117385 Mester et al. Jun 2006 A1
20060123477 Raghavan et al. Jun 2006 A1
20060143316 Mills et al. Jun 2006 A1
20060143709 Brooks et al. Jun 2006 A1
20060150249 Gassen et al. Jul 2006 A1
20060161983 Cothrell et al. Jul 2006 A1
20060161987 Levy-Yurista Jul 2006 A1
20060161989 Reshef et al. Jul 2006 A1
20060164199 Gilde et al. Jul 2006 A1
20060173992 Weber et al. Aug 2006 A1
20060179147 Tran et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191010 Benjamin Aug 2006 A1
20060221956 Narayan et al. Oct 2006 A1
20060236393 Kramer et al. Oct 2006 A1
20060242709 Seinfeld et al. Oct 2006 A1
20060248582 Panjwani et al. Nov 2006 A1
20060251104 Koga Nov 2006 A1
20060282896 Qi Dec 2006 A1
20060288417 Bookbinder et al. Dec 2006 A1
20070006288 Mayfield et al. Jan 2007 A1
20070006313 Porras et al. Jan 2007 A1
20070011174 Takaragi et al. Jan 2007 A1
20070016951 Piccard et al. Jan 2007 A1
20070033645 Jones Feb 2007 A1
20070038943 FitzGerald et al. Feb 2007 A1
20070064689 Shin et al. Mar 2007 A1
20070094730 Bhikkaji et al. Apr 2007 A1
20070143827 Nicodemus et al. Jun 2007 A1
20070156895 Vuong Jul 2007 A1
20070157306 Elrod et al. Jul 2007 A1
20070171824 Ruello et al. Jul 2007 A1
20070174915 Gribble et al. Jul 2007 A1
20070192500 Lum Aug 2007 A1
20070192858 Lum Aug 2007 A1
20070198275 Malden et al. Aug 2007 A1
20070240218 Tuvell et al. Oct 2007 A1
20070240219 Tuvell et al. Oct 2007 A1
20070240220 Tuvell et al. Oct 2007 A1
20070240222 Tuvell et al. Oct 2007 A1
20070250930 Aziz et al. Oct 2007 A1
20070271446 Nakamura Nov 2007 A1
20080005782 Aziz Jan 2008 A1
20080072326 Danford et al. Mar 2008 A1
20080080518 Hoeflin et al. Apr 2008 A1
20080098476 Syversen Apr 2008 A1
20080120722 Sima et al. May 2008 A1
20080134334 Kim et al. Jun 2008 A1
20080141376 Clausen et al. Jun 2008 A1
20080184373 Traut et al. Jul 2008 A1
20080215742 Goldszmidt et al. Sep 2008 A1
20080222729 Chen et al. Sep 2008 A1
20080263665 Ma et al. Oct 2008 A1
20080295172 Bohacek Nov 2008 A1
20080301810 Lehane et al. Dec 2008 A1
20080320594 Jiang Dec 2008 A1
20090007100 Field et al. Jan 2009 A1
20090013408 Schipka Jan 2009 A1
20090031423 Liu et al. Jan 2009 A1
20090044024 Oberheide et al. Feb 2009 A1
20090044274 Budko et al. Feb 2009 A1
20090083369 Marmor Mar 2009 A1
20090083855 Apap et al. Mar 2009 A1
20090089879 Wang et al. Apr 2009 A1
20090094697 Provos et al. Apr 2009 A1
20090133125 Choi et al. May 2009 A1
20090193293 Stolfo et al. Jul 2009 A1
20090228233 Anderson et al. Sep 2009 A1
20090241190 Todd et al. Sep 2009 A1
20090271867 Zhang Oct 2009 A1
20090300761 Park et al. Dec 2009 A1
20090328185 Berg et al. Dec 2009 A1
20090328221 Blumfield et al. Dec 2009 A1
20100017546 Poo et al. Jan 2010 A1
20100043073 Kuwamura Feb 2010 A1
20100054278 Stolfo et al. Mar 2010 A1
20100058474 Hicks Mar 2010 A1
20100077481 Polyakov et al. Mar 2010 A1
20100083376 Pereira et al. Apr 2010 A1
20100115621 Staniford et al. May 2010 A1
20100192223 Ismael et al. Jul 2010 A1
20100251104 Massand Sep 2010 A1
20100281102 Chinta et al. Nov 2010 A1
20100281541 Stolfo et al. Nov 2010 A1
20100281542 Stolfo et al. Nov 2010 A1
20100287260 Peterson et al. Nov 2010 A1
20100293268 Jones Nov 2010 A1
20110025504 Lyon et al. Feb 2011 A1
20110047620 Mahaffey et al. Feb 2011 A1
20110055907 Narasimhan et al. Mar 2011 A1
20110078794 Manni et al. Mar 2011 A1
20110093951 Aziz Apr 2011 A1
20110099633 Aziz Apr 2011 A1
20110113231 Kaminsky May 2011 A1
20110247072 Staniford et al. Oct 2011 A1
20110265182 Peinado et al. Oct 2011 A1
20110307955 Kaplan et al. Dec 2011 A1
20110314546 Aziz et al. Dec 2011 A1
20120079596 Thomas et al. Mar 2012 A1
20120084859 Radinsky et al. Apr 2012 A1
20120117652 Manni et al. May 2012 A1
20120174186 Aziz et al. Jul 2012 A1
20120174218 McCoy et al. Jul 2012 A1
20120174227 Mashevsky Jul 2012 A1
20120210423 Friedrichs et al. Aug 2012 A1
20120222121 Staniford et al. Aug 2012 A1
20120330801 McDougal Dec 2012 A1
20130036472 Aziz Feb 2013 A1
20130227691 Aziz et al. Aug 2013 A1
20130291109 Staniford et al. Oct 2013 A1
Foreign Referenced Citations (5)
Number Date Country
GB2439806 Jan 2008 GB
WO2008041950 Apr 2008 SG
0206928 Jan 2002 WO
0223805 Mar 2002 WO
WO-2012145066 Oct 2012 WO
Non-Patent Literature Citations (61)
Entry
Office Actions mailed Sep. 5, 2008, Feb. 12, 2009, Oct. 26, 2009, Sep. 29, 2010, and Mar. 16, 2011, in U.S. Appl. No. 11/096,287, filed Mar. 31, 2005.
Office Actions mailed Apr. 2, 2009, Nov. 25, 2009, Oct. 5, 2010, and Aug. 29, 2011, in U.S. Appl. No. 11/151,812, filed Jun. 13, 2005.
Office Actions mailed Apr. 2, 2009, Oct. 28, 2009, and Aug. 31, 2010, and Notices of Allowance mailed Feb. 23, 2011 and Jun. 8, 2011, in U.S. Appl. No. 11/152,286, filed Jun. 13, 2005.
Office Actions mailed Oct. 13, 2009 and Apr. 23, 2010, in U.S. Appl. No. 11/494,990, filed Jul. 28, 2006.
Office Actions mailed Jul. 20, 2009, May 11, 2010, Dec. 21, 2010, and Aug. 19, 2011, in U.S. Appl. No. 11/471,072, filed Jun. 19, 2006.
Office Actions mailed Aug. 8, 2008, Feb. 11, 2009, Aug. 4, 2009, Mar. 29, 2010, Nov. 24, 2010, May 10, 2011, and Aug. 23, 2011, Advisory Action mailed Jul. 26, 2011, and Notices of Allowance mailed Dec. 15, 2011 and Feb. 7, 2012, in U.S. Appl. No. 11/409,355, filed Apr. 20, 2006.
Office Actions mailed May 6, 2010 and Nov. 22, 2010, in U.S. Appl. No. 11/717,475, filed Mar. 12, 2007.
Office Actions mailed Apr. 20, 2010, Sep. 28, 2010, and Dec. 20, 2011, in U.S. Appl. No. 11/717,408, filed Mar. 12, 2007.
Office Actions mailed May 25, 2010 and Feb. 2, 2011, in U.S. Appl. No. 11/717,474, filed Mar. 12, 2007.
Office Actions mailed Jun. 17, 2010 and Feb. 2, 2011, in U.S. Appl. No. 11/717,476, filed Mar. 12, 2007.
Office Actions mailed Feb. 2, 2010, Jul. 14, 2010, Nov. 10, 2010, and Jun. 9, 2011, and Notice of Allowance mailed Mar. 23, 2012, in U.S. Appl. No. 11/998,750, filed Nov. 30, 2007.
Office Actions mailed May 27, 2011, Dec. 5, 2011, and Mar. 16, 2012, in U.S. Appl. No. 11/998,605, filed Nov. 30, 2007.
Office Actions mailed Apr. 13, 2010, Oct. 28, 2010, and Jan. 31, 2012, in U.S. Appl. No. 11/709,570, filed Feb. 21, 2007.
Office Actions mailed Apr. 18, 2011, Aug. 26, 2011, and Apr. 11, 2012, and Advisory Action mailed Nov. 9, 2011, in U.S. Appl. No. 12/263,971, filed Nov. 3, 2008.
Costa, M. et al. “Vigilante: End-to-End Containment of Internet Worms,” SOSP '05, Oct. 23-26, 2005, Association for Computing Machinery, Inc., Brighton U.K.
Chaudet, C. et al. "Optimal Positioning of Active and Passive Monitoring Devices," International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, Oct. 2005, pp. 71-82, CoNEXT '05, Toulouse, France.
Crandall, J.R. et al., “Minos: Control Data Attack Prevention Orthogonal to Memory Model,” 37th International Symposium on Microarchitecture, Dec. 2004, Portland, Oregon.
Kim, H. et al., “Autograph: Toward Automated, Distributed Worm Signature Detection,” Proceedings of the 13th Usenix Security Symposium (Security 2004), Aug. 2004, pp. 271-286, San Diego.
Kreibich, C. et al., "Honeycomb—Creating Intrusion Detection Signatures Using Honeypots," 2nd Workshop on Hot Topics in Networks (HotNets-II), 2003, Boston, USA.
Newsome, J. et al., “Polygraph: Automatically Generating Signatures for Polymorphic Worms,” In Proceedings of the IEEE Symposium on Security and Privacy, May 2005, pp. 226-241.
Newsome, J. et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software,” In Proceedings of the 12th Annual Network and Distributed System Security, Symposium (NDSS '05), Feb. 2005.
Singh, S. et al., “Automated Worm Fingerprinting,” Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, Dec. 2004, San Francisco, California.
Margolis, P.E., “Computer & Internet Dictionary 3rd Edition,” ISBN 0375603519, Dec. 1998.
Whyte, D. et al. “DNS-Based Detection of Scanning Worms in an Enterprise Network,” Proceedings of the 12th Annual Network and Distributed System Security Symposium, Feb. 2005. 15 pages.
Kristoff, J. “Botnets, Detection and Mitigation: DNS-Based Techniques,” NU Security Day 2005, 23 pages.
IEEE Xplore Digital Library Search Results for "detection of unknown computer worms". http://ieeexplore.ieee.org/searchresult.jsp?SortField=Score&SortOrder=desc&ResultC . . . Accessed on Aug. 28, 2009.
AltaVista Advanced Search Results. "Event Orchestrator". http://www.altavista.com/web/results?Itag=ody&pg=aq&aqmode=aqa=Event+Orchestrator . . . Accessed on Sep. 3, 2009.
AltaVista Advanced Search Results. "attack vector identifier". http://www.altavista.com/web/results?Itag=ody&pg=aq&aqmode=aqa=attack+vector+ide . . . Accessed on Sep. 15, 2009.
Silicon Defense, “Worm Containment in the Internal Network”, Mar. 2003, pp. 1-25.
Nojiri, D. et al., “Cooperative Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, Apr. 22-24, 2003, vol. 1, pp. 293-302.
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”), Jun. 9, 2003.
Chen, Peter M., and Brian D. Noble, "When Virtual Is Better Than Real", Department of Electrical Engineering and Computer Science, University of Michigan ("Chen"), May 20, 2001.
International Search Report and Written Opinion of the International Searching Authority Dated May 10, 2012; International Application No. PCT/US 12/21916.
International Search Report and Written Opinion of the International Searching Authority Dated May 25, 2012; International Application No. PCT/US 12/26402.
Cisco, Configuring the Catalyst Switched Port Analyzer (SPAN) (“Cisco”), (1992-2003 Cisco Systems).
Sailer, Reiner, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, and Stefan Berger, "sHype: Secure Hypervisor Approach to Trusted Virtualized Systems" (Feb. 2, 2005) ("Sailer").
Excerpt regarding First Printing Date for Merike Kaeo, Designing Network Security (“Kaeo”), (Copyright 2005).
The Sniffer's Guide to Raw Traffic, available at: yuba.stanford.edu/~casado/pcap/section1.html, (Jan. 6, 2014).
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (Copyright 2003).
"When Virtual is Better Than Real", IEEE Xplore Digital Library, available at http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=990073, (Dec. 7, 2013).
Adetoye, Adedayo, et al., "Network Intrusion Detection & Response System", ("Adetoye"), (Sep. 2003).
Baecher, "The Nepenthes Platform: An Efficient Approach to Collect Malware", Springer-Verlag Berlin Heidelberg, (2006), pp. 165-184.
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77.
Boubalos, Chris, "extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives", available at http://seclists.org/honeypots/2003/q2/319 ("Boubalos"), (Jun. 5, 2003).
Cohen, M.I., "PyFlag—an advanced network forensic framework", Digital Investigation 5, Elsevier, (2008), pp. S112-S120.
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007).
Dunlap, George W., et al., "ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay", Proceedings of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, ("Dunlap"), (Dec. 9, 2002).
Hjelmvik, Erik, "Passive Network Security Analysis with NetworkMiner", (IN)SECURE, Issue 18, (Oct. 2008), pp. 1-100.
Kaeo, Merike, "Designing Network Security", ("Kaeo"), (Nov. 2003).
Krasnyansky, Max, et al., Universal TUN/TAP driver, available at https://www.kernel.org/doc/Documentation/networking/tuntap.txt (2002) ("Krasnyansky").
Liljenstam, Michael, et al., "Simulating Realistic Network Traffic for Worm Warning System Design and Testing", Institute for Security Technology Studies, Dartmouth College, ("Liljenstam"), (Oct. 27, 2003).
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001).
Natvig, Kurt, "SandBoxII: Internet", Virus Bulletin Conference, ("Natvig"), (Sep. 2002).
Spitzner, Lance, "Honeypots: Tracking Hackers", ("Spitzner"), Exhibit 1023, (Sep. 17, 2002).
Ptacek, Thomas H., and Timothy N. Newsham, "Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection", Secure Networks, ("Ptacek"), (Jan. 1998).
Venezia, Paul, "NetDetector Captures Intrusions", InfoWorld Issue 27, ("Venezia"), (Jul. 14, 2003).
Moore, D. et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, Mar. 30-Apr. 3, 2003, vol. 3, pp. 1901-1910.
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Dec. 2002, Las Vegas, NV, USA, pp. 1-9.
“Packet”, Microsoft Computer Dictionary, Microsoft Press, Mar. 2002, 1 pg.
Related Publications (1)
Number Date Country
20130227691 A1 Aug 2013 US