Nefarious individuals attempt to compromise computer systems in a variety of ways. As an example, such individuals may embed or otherwise include malicious software (“malware”) in email attachments and transmit or cause them to be transmitted to unsuspecting users. An administrator of the recipient computer system can attempt to prevent compromise by blocking all downloads of all attachments by the computer system. However, such a policy will also prevent legitimate attachments from being available to the user. As an alternate approach, the administrator can require that a security scan be performed prior to the download of an attachment. Unfortunately, malware authors are crafting increasingly sophisticated malware that is increasingly able to evade detection. Accordingly, there exists an ongoing need for improved techniques to detect malware and prevent its harm.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
In the example shown in
Appliance 102 can take a variety of forms. For example, appliance 102 can be a dedicated device or set of devices. The functionality provided by appliance 102 can also be integrated into or executed as software on a general purpose computer, a computer server, a gateway, and/or a network/routing device. In some embodiments services provided by data appliance 102 are instead (or in addition) provided to client 104 by software executing on client 104.
Whenever appliance 102 is described as performing a task, a single component, a subset of components, or all components of appliance 102 may cooperate to perform the task. Similarly, whenever a component of appliance 102 is described as performing a task, a subcomponent may perform the task and/or the component may perform the task in conjunction with other components. In various embodiments, portions of appliance 102 are provided by one or more third parties. Depending on factors such as the amount of computing resources available to appliance 102, various logical components and/or features of appliance 102 may be omitted and the techniques described herein adapted accordingly. Similarly, additional logical components/features can be added to appliance 102 as applicable.
As will be described in more detail below, appliance 102 can be configured to work in cooperation with one or more virtual machine servers (112, 124) to perform malware analysis/prevention. As one example, data appliance 102 can be configured to provide a copy of malware 130 to one or more of the virtual machine servers for real-time analysis. As another example, service 122 can provide a list of signatures of known-malicious documents to appliance 102 as part of a subscription. Those signatures can be generated by service 122 in conjunction with the techniques described herein.
An example of a virtual machine server is a physical machine comprising commercially available server-class hardware (e.g., a multi-core processor, 64+ Gigabytes of RAM, and one or more Gigabit network interface adapters) that runs commercially available virtualization software, such as VMware ESXi, Citrix XenServer, Kernel Based Virtual Machine (KVM), or Microsoft Hyper-V. The virtual machine servers can be separate from, but in communication with, data appliance 102, as shown in
Using Virtual Machines to Analyze Files
A virtual machine (VM) can be used to perform behavior profiling (e.g., in a VM sandbox environment) using various heuristic-based analysis techniques that can be performed in real-time during a file transfer (e.g., during an active file/attachment download) and/or on files previously collected (e.g., a collection of files submitted for batch analysis). Documents, executables, and other forms of potentially malicious software (e.g., to be evaluated) are referred to herein as “samples.”
Suppose a malicious user of system 120 sends an email message to a user of client 104 that includes a malicious attachment. The attachment may be an executable (e.g., having a .exe extension) and may also be a document (e.g., a .doc or .pdf file). The message is received by data appliance 102, which determines whether a signature for the attachment is present on data appliance 102. A signature, if present, can indicate that the attachment is known to be safe, and can also indicate that the attachment is known to be malicious. If no signature for the attachment is found, data appliance 102 is configured to provide the attachment to a virtual machine server, such as virtual machine server 112, for analysis.
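The signature lookup performed by data appliance 102 can be sketched as follows. The use of SHA-256 digests as signatures and the verdict strings are illustrative assumptions, not details prescribed by the embodiments; an attachment with no matching signature is the case forwarded to a virtual machine server for analysis.

```python
import hashlib

# Illustrative signature store: maps a SHA-256 digest to a known verdict
# ("benign" or "malicious"), as might be distributed by cloud security
# service 122 as part of a subscription.
SIGNATURES = {}

def check_attachment(data: bytes) -> str:
    """Return 'benign' or 'malicious' if a signature is present on the
    appliance; otherwise 'unknown', meaning the attachment should be
    provided to a virtual machine server (e.g., server 112) for analysis."""
    digest = hashlib.sha256(data).hexdigest()
    return SIGNATURES.get(digest, "unknown")
```

In this sketch, an "unknown" result triggers dynamic analysis, after which the resulting verdict can be added to the signature store for future transfers.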
Virtual machine server 112 is configured to execute (or open in an application, as applicable) the attachment in one or more virtual machines 114-116. The virtual machines may all execute the same operating system (e.g., Microsoft Windows XP SP 3), may execute different operating systems, and/or may collectively execute combinations of operating systems (and/or versions thereof) (e.g., with VM 116 emulating an Android operating system). In some embodiments, the VM(s) chosen to analyze the attachment are selected to match the operating system of the intended recipient of the attachment being analyzed (e.g., where the operating system of client 104 is Microsoft Windows XP SP 2). Observed behaviors resulting from executing/opening the attachment (such as changes to certain platform, software, or registry settings) are logged and analyzed for indications that the attachment is malicious. In some embodiments the log analysis is performed by the VM server (e.g., VM server 112). In other embodiments, the analysis is performed at least in part by appliance 102. The malware analysis and enforcement functionality illustrated in
If the attachment is determined to be malicious, appliance 102 can automatically block the file download based on the analysis result. Further, a signature can be generated and distributed (e.g., to other data appliances, and/or to cloud security service 122) to automatically block future file transfer requests to download the file determined to be malicious.
Configuring Virtual Machine Instances
Malware often leverages exploits that are specific to a particular system configuration or set of system configurations. For example, malware 130 might be able to successfully compromise a computer system running Windows XP SP 3 (e.g., running on client 104), but be unable to compromise a computer system running any versions of Windows 7 (e.g., running on client 106). If the only virtual machine used to evaluate malware 130 is a Windows 7 image, the malicious nature of malware 130 might not be discovered. As another example, malware 130 might be able to successfully compromise a system upon which a particular combination of software is installed (e.g., a specific version of Internet Explorer with a specific version of Java). If the only virtual machine image(s) used to evaluate malware 130 include only one but not both applications, or include different versions of those applications, the malicious nature of malware 130 might not be discovered.
Some computing environments are relatively homogenous. For example, every employee at a startup might be issued the same laptop, running the same operating system, and with the same base applications installed. More typically, however, a range of different platforms and configurations is supported (e.g., in an enterprise environment). Further, certain employees (e.g., in the Finance Department) may need access to additional software (e.g., Microsoft Access) not included on the systems of other users. And, employees are often allowed to customize their systems, e.g., by adding or removing software.
Suppose malware 130 targets Microsoft Windows systems. Further suppose that the IT Department of Acme Company supports the following: Windows XP SP 3 and 64-bit Windows 7 installations, with either Internet Explorer 9 or 10, and any of Microsoft Office 2003, 2007, and 2010 installed. An Acme Company employee may thus potentially be opening malware 130 on any of twelve different officially supported Windows system configurations. Using the techniques described herein, malware analysis system 132 can efficiently evaluate malware 130 using twelve virtual machine instances (corresponding to each of the twelve potential system configurations) and will be able to detect that malware 130 is malicious. Further, as will be described in more detail below, runtime customizations can efficiently be made to those virtual machine instances (e.g., to efficiently test customized systems).
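The twelve officially supported configurations in the Acme Company example follow from a Cartesian product of the supported options (two operating systems, two browser versions, three Office versions). A minimal sketch, with illustrative names:

```python
from itertools import product

OPERATING_SYSTEMS = ["Windows XP SP 3", "Windows 7 (64-bit)"]
BROWSERS = ["Internet Explorer 9", "Internet Explorer 10"]
OFFICE_VERSIONS = ["Office 2003", "Office 2007", "Office 2010"]

# 2 operating systems x 2 browsers x 3 Office versions = 12 configurations,
# one virtual machine instance per configuration.
CONFIGURATIONS = list(product(OPERATING_SYSTEMS, BROWSERS, OFFICE_VERSIONS))
```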
Copy-on-Write
One approach to providing each of the twelve different types of Windows emulation environments is to create a full virtual machine image for each configuration. In that scenario, virtual machine server 112 could be configured to store twelve full images, for each of the following, respectively:
A virtual machine executing malware is generally input/output bound, not CPU bound. By running the virtual machine in RAM, the input/output is reduced and the emulation can be performed significantly more quickly than where it is not run in RAM. Suppose each of the above full images occupies approximately 10 G of disk space. Executing each of the images will accordingly require a minimum of approximately 10 G of RAM, meaning virtual machine server 112 requires a minimum of 120 G of RAM, just for the twelve images (e.g., to run in a RAM disk). Unfortunately, while some systems (e.g., server 124) might have sufficient resources to support the parallel execution of each of the images, such an amount of memory is potentially very costly, and/or may be infeasible to leverage (e.g., due to limits of memory bandwidth). For example, if server 112 is required to execute dozens of malware samples at the same time, RAM requirements could potentially be thousands of gigabytes. The situation can be exacerbated if additional software is supported. For example, if the IT Department begins supporting Microsoft Office 2013, sixteen system configurations are thus supported by the IT Department (and would correspond to sixteen images and 160 G of RAM). If the IT Department instead begins supporting Windows 8, eighteen system configurations are thus supported by the IT Department (and would correspond to eighteen images and 180 G of RAM). As another example, if a service pack is released for an application (e.g., Microsoft Office 2010 is updated), each of the full images including that application will need to be regenerated (or otherwise obtained, e.g., downloaded from service 122).
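The memory figures above follow from multiplying the per-image size by the number of configurations. A sketch using the 10 G example figure from the text:

```python
IMAGE_SIZE_GB = 10  # example size of one full virtual machine image

def ram_needed_gb(num_configurations: int, size_gb: int = IMAGE_SIZE_GB) -> int:
    """Minimum RAM (in gigabytes) needed to hold one full image per
    supported system configuration in a RAM disk."""
    return num_configurations * size_gb
```

With twelve configurations this yields 120 G; adding Office 2013 (sixteen configurations) yields 160 G, and adding Windows 8 instead (eighteen configurations) yields 180 G.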
An alternate approach to providing emulators for each of Acme Company's system configurations is to create each of the emulators as a copy-on-write overlay of a base image (or hierarchy of images, described in more detail below). The following example assumes that the “Quick EMUlator” (QEMU) is used by virtual machine server 112. Other hypervisors can also be used, as applicable, to provide an appropriate environment for evaluating malware.
At 302, one or more original virtual machine images are copied to the RAM disk. Examples of such original virtual machine images include images reflecting a base installation of an operating system (e.g., Windows XP SP3 and 64-bit Windows 7). The base installation can be downloaded from a source of virtual machine images (e.g., by system 132 from service 122, from microsoft.com, or from any other appropriate source) and can also be created using original installation media (e.g., using an official Windows ISO file to create a base image). Similarly, one or more base images (or specifications for making such images) can be provided to service 122 by an administrator of enterprise network 110 (i.e., so that service 122 can evaluate malware 130 on behalf of Acme Company in accordance with the types of systems actually deployed in network 110).
In some embodiments, “original virtual machine images” also include images reflecting a base installation of an operating system (e.g., Windows XP SP3) after a major application package has been applied (e.g., Office 2013, requiring additional Gigabytes of changes to a system to install). Using the running example of Acme Company having twelve supported Windows-based system configurations, a total of six “original virtual machine images” could be created and used at 302. I.e., for each of the following combinations: Windows XP+Office 2003, Windows XP+Office 2007, Windows XP+Office 2010, Windows 7+Office 2003, Windows 7+Office 2007, and Windows 7+Office 2010; a virtual machine image can be created and copied to the RAM disk at 302 by installing the base operating system (e.g. Windows 7) and applicable version of Office (e.g., Office 2010) a total of six times to form six original virtual machine images. In other embodiments, such software packages are added as copy-on-write overlays, described in more detail below.
A copy-on-write overlay can be created (in QEMU) with the following example command: qemu-img create -b </path/to/base> -f qcow2 </path/to/overlay>. In this example, a copy-on-write overlay will be created at the location “/path/to/overlay” and will be based on the existing base image located at “/path/to/base.” Any differences from the base install (e.g., due to executing the copy-on-write overlay and installing additional software in it) will be stored in “/path/to/overlay.” Using this approach, the same six kinds of virtual machine images (i.e., the same six combinations of: Windows XP+Office 2003, Windows XP+Office 2007, Windows XP+Office 2010, Windows 7+Office 2003, Windows 7+Office 2007, and Windows 7+Office 2010) can be created as before, but will ultimately consume considerably less space (and thus less RAM). In particular, three copy-on-write overlays can be created using Windows XP as a base (and then installing each of the three Office versions in the overlays) and three copy-on-write overlays can be created using Windows 7 as a base (and installing each of the three Office versions in the overlays). As mentioned above, the amount of disk space (and thus RAM disk) consumed using this approach will be significantly less (e.g., 10 G for the Windows XP and Windows 7 base images, but only 500 M-1 G for each of the six copy-on-write overlays reflecting the three versions of Office having been installed on top of each of the two operating systems). As will be described in more detail below, additional modifications (e.g., installing specific versions of Internet Explorer and other software) can be done dynamically, in conjunction with the evaluation of samples.
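The six-overlay layout described above can be sketched by generating one qemu-img invocation per supported combination. The image paths and naming scheme below are hypothetical; only the qemu-img create -b <base> -f qcow2 <overlay> command form is taken from the text.

```python
# Hypothetical base image paths for the two supported operating systems.
BASES = {"win7": "/images/win7.img", "winxp": "/images/winxp.img"}
OFFICE = ["office2003", "office2007", "office2010"]

def overlay_command(base_path: str, overlay_path: str) -> str:
    # Mirrors the example command: qemu-img create -b <base> -f qcow2 <overlay>
    return f"qemu-img create -b {base_path} -f qcow2 {overlay_path}"

# One overlay per (operating system, Office version) pair: 2 x 3 = 6 overlays.
commands = [
    overlay_command(base_path, f"/images/{os_name}-{office}.qcow2")
    for os_name, base_path in sorted(BASES.items())
    for office in OFFICE
]
```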
At 304, a first virtual machine instance is initialized as a first copy-on-write overlay. One example of the processing performed at 304 is as follows: A copy-on-write overlay is initialized, using as a (read-only) base, the system configuration to be emulated (e.g., using an image of “Windows 7+Office 2010” located at /path/to/win7office2010). As explained above, the “base” used in initializing the copy-on-write overlay can either be a single base image (i.e., a 10 G image of Windows 7+Office 2010) or a copy-on-write overlay. The resulting first virtual machine instance can be used to evaluate a malware sample (e.g., malware 130).
The first virtual machine instance can also be initialized using a hierarchy of images. For example, the first virtual machine instance can be initialized as a copy-on-write overlay of “win7office2010IE9.qcow2,” which is a copy-on-write overlay of “win7office2010.qcow2,” which is in turn a copy-on-write overlay of a base image of Windows 7 (without Office 2010 installed). In this example, “win7office2010.qcow2” is a file that reflects only those changes made to a base install of Windows 7 when Office 2010 is installed (e.g., after the execution of a command such as startVM harddisk_file=win7office2010.qcow2 -cdrom “office2010.iso”, and consuming approximately 2 G of space once the install is completed). Similarly, “win7office2010IE9.qcow2” is a file that reflects only those changes made to “win7office2010.qcow2” after Internet Explorer 9 is installed (e.g., consuming 200 M of space).
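The fall-through behavior of such a hierarchy can be modeled conceptually as a chain in which each layer stores only its own changes and unsatisfied reads defer to the base. This is a sketch of the semantics only, not the qcow2 on-disk format; the file paths are illustrative.

```python
class Overlay:
    """Conceptual copy-on-write layer: writes stay local to the layer,
    reads fall through to the (read-only) base when not found locally."""

    def __init__(self, base=None):
        self.base = base      # parent layer (another Overlay) or None
        self.changes = {}     # only this layer's modifications

    def write(self, path, contents):
        self.changes[path] = contents

    def read(self, path):
        if path in self.changes:
            return self.changes[path]
        if self.base is not None:
            return self.base.read(path)
        raise FileNotFoundError(path)

# Hierarchy from the example: Windows 7 base -> +Office 2010 -> +IE 9.
win7 = Overlay()
win7.write("C:/Windows/explorer.exe", "base install")
win7_office = Overlay(base=win7)                      # win7office2010.qcow2
win7_office.write("C:/Office/winword.exe", "office 2010")
win7_office_ie9 = Overlay(base=win7_office)           # win7office2010IE9.qcow2
win7_office_ie9.write("C:/IE/iexplore.exe", "ie 9")
```

Reading an unmodified file through the topmost layer reaches the base image, while each overlay file remains small because it records only its own layer's changes.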
At 306, the first virtual machine instance is started and the first sample is executed. Various techniques for providing the sample to the virtual machine instance are provided below. In some embodiments, the sample is executed for a fixed amount of time, such as five minutes. Any changes to the virtual machine will be captured in the copy-on-write overlay file and can be analyzed (e.g., to determine which system files the malware adds, deletes, modifies, or changes when executed). Network traffic associated with the first virtual machine instance can also be analyzed (e.g., using pcap). As will be described in more detail below, in some embodiments, additional modifications/customizations are made prior to executing the sample. For example, additional user software can be installed (e.g., Internet Explorer 9 and/or Java Version 7 Update 25), as can honey or other files, which can help draw out malicious behaviors of the sample (e.g., including data of likely interest to malware, such as a passwords file). Further, various hooks (e.g., kernel hooks and/or user hooks) can be installed to facilitate analysis of the sample.
At 308 and 310, portions 304 and 306 of the process are repeated, with respect to a second virtual machine instance (e.g., using “Windows XP SP3+Office 2007” as the base) and a second sample (e.g. sample 130 or a different sample). The same sample (e.g., sample 130) can be evaluated in parallel across multiple virtual machine instances (e.g., associated with different system configurations). And/or, different samples can be evaluated in parallel across multiple virtual machine instances, at least some of which are the same (e.g., with two Windows XP SP3+Office 2010 instances being used in parallel to evaluate two different samples).
Analysis of the results of emulating the samples is performed. As explained above, conclusions can be made as to whether the samples are malicious, and signatures can be generated for future use. The virtual machine instances can then be abandoned and new instances used to evaluate new samples.
Executing Malware Samples
As explained above, in various embodiments, runtime customizations are made to virtual machine instances prior to executing samples for evaluation. As one example, certain software packages (e.g., Java) are updated very frequently (e.g., with updates occurring once per week). While such packages could be included in the base images described above, in some circumstances, it may be more efficient to install such packages in conjunction with evaluating a given sample. As another example, some software (e.g., Microsoft Access) may only be installed/used by a small subset of employees in an enterprise. Resources (e.g., of system 132) can be conserved by selectively testing samples in environments which include such software.
The process begins at 402 when one or more modifications are made to a virtual machine instance. As one example, suppose a virtual machine instance, “malwareevaluator1” is initialized at 304 of process 300 as a copy-on-write overlay of an image (whether itself a copy-on-write overlay, or a base image) of a Windows 7 installation that has Office 2010 installed. At 402, modifications are made to malwareevaluator1. One way to make modifications is to open the virtual machine instance and modify it, using a command such as “libguestfs.” Other approaches can also be used, as applicable (e.g., depending on which type of virtualization or emulation software is being used by the virtual machine server). As malwareevaluator1 is a copy-on-write overlay of another image, only those changes resulting from the modifications will be stored.
One example of a modification is the inserting of kernel level and/or user level hooks that facilitate analysis of samples. Such hooks may be frequently updated (e.g., by the operator of service 122). Inserting the hooks just prior to execution is efficient, and can also mitigate attempts by malware 130 to determine that it is operating in an instrumented (i.e., hooked) environment.
Another example of a modification is the copying of the malware sample to the virtual machine instance (which, prior to modification, is clean). Other software can also be copied to the virtual machine instance, such as the most current (or previous) versions of Java or other software that updates frequently (or, as mentioned above, is less commonly installed on a system than larger packages such as Office).
Yet another example of a modification is the inclusion in the virtual machine instance of a set of startup instructions. One example of startup instructions is an autoexec.bat file which, when a Windows-based virtual machine instance is started, will be executed as part of the startup process. Similar types of instructions can be created for other types of operating systems, as applicable. One example of instructions that can be included is a set of instructions to install/load additional programs (e.g., install Java). Another example of instructions that can be included is a set of instructions to execute the malware (e.g., where malware 130 is an executable file) or to load the malware (e.g., where malware 130 is a Microsoft Word or PDF document).
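A set of startup instructions along these lines can be sketched as a generated autoexec.bat; the file contents, installer paths, and sample path below are illustrative assumptions, not prescribed values.

```python
def startup_script(install=(), sample=None):
    """Build illustrative autoexec.bat contents: install/load additional
    programs first (e.g., a Java installer), then launch the sample so
    that execution begins automatically when the instance boots."""
    lines = ["@echo off"]
    # Install additional programs before the sample runs.
    lines += [f"start /wait {installer}" for installer in install]
    if sample is not None:
        lines.append(f"start {sample}")   # execute (or open) the sample
    return "\r\n".join(lines) + "\r\n"    # DOS/Windows line endings
```

For a document sample, the final line would instead open the sample in the appropriate application (e.g., Microsoft Word or a PDF reader) rather than executing it directly.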
Yet another example of a modification is to randomize certain environment values to thwart detection by the malware that it is being operated in a virtualized environment. For example, virtual machine server 112 can be configured to provide virtual machines such as virtual machine 114 with randomized product IDs. As another example, the computer name associated with virtual machine 114 can be randomized. As yet another example, the computer name associated with virtual machine 114 can be set to a very common name, such as “HOME” or “WORKGROUP.” As yet another example, the harddrive name can be randomized or set to a very common name. As yet another example, the MAC address can be randomized.
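The randomizations described above can be sketched as follows. The product ID digit layout, the MAC-address bit handling, and the pool of common computer names are illustrative assumptions in this sketch.

```python
import random
import string

def random_product_id(rng):
    """Randomized digits in a Windows-style 5-3-7-5 group layout
    (layout assumed for illustration), e.g. 12345-678-9012345-67890."""
    groups = (5, 3, 7, 5)
    return "-".join(
        "".join(rng.choice(string.digits) for _ in range(n)) for n in groups
    )

def random_mac(rng):
    """Random MAC address, marked locally administered and unicast."""
    octets = [rng.randrange(256) for _ in range(6)]
    octets[0] = (octets[0] & 0xFE) | 0x02
    return ":".join(f"{o:02x}" for o in octets)

def random_computer_name(rng, common=("HOME", "WORKGROUP")):
    """Either a very common name, or a randomized plausible one."""
    if rng.random() < 0.5:
        return rng.choice(common)
    return "PC-" + "".join(rng.choice(string.ascii_uppercase) for _ in range(6))
```

Seeding the generator differently per virtual machine instance yields distinct values, so the environment does not present the fixed identifiers that malware often checks for.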
Returning to process 400, at 404 the modified virtual machine instance is started. As explained above, the modification performed at 402 can include the installation of startup instructions (e.g., in an autoexec.bat file). Thus, as part of the processing performed at 404, actions such as installing additional software (e.g., Java) and executing the sample will occur.
Finally, at 406 data resulting from the executing of the virtual machine instance is captured. As one example, at 406, any modifications to the filesystem are captured (i.e., where the virtual machine instance is a copy-on-write overlay, the modifications are stored in the overlay file). As another example, at 406, any hooks installed at 402 can report log information (e.g., back to appliance 102) for analysis. As yet another example, at 406, network traffic can be captured and logged (e.g., using pcap).
Analysis of the results of emulating the sample is performed. As explained above, conclusions can be made as to whether the sample is malicious, and signatures can be generated for future use. The virtual machine instance can then be abandoned and new instances used to evaluate new samples.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
Number | Name | Date | Kind |
---|---|---|---|
5983348 | Ji | Nov 1999 | A |
7409718 | Hong et al. | Aug 2008 | B1 |
7496963 | Shipp | Feb 2009 | B2 |
7568233 | Szor et al. | Jul 2009 | B1 |
7603713 | Belov | Oct 2009 | B1 |
7649838 | Fishteyn et al. | Jan 2010 | B2 |
7664855 | Freed et al. | Feb 2010 | B1 |
7779472 | Lou | Aug 2010 | B1 |
7823202 | Nucci et al. | Oct 2010 | B1 |
7870610 | Mitchell et al. | Jan 2011 | B1 |
7930273 | Clark | Apr 2011 | B1 |
7945908 | Waldspurger | May 2011 | B1 |
7958555 | Chen et al. | Jun 2011 | B1 |
8011010 | Michael et al. | Aug 2011 | B2 |
8141132 | Oliver et al. | Mar 2012 | B2 |
8151352 | Novitchi | Apr 2012 | B1 |
8201246 | Wu et al. | Jun 2012 | B1 |
8209680 | Le et al. | Jun 2012 | B1 |
8225317 | Chiueh et al. | Jul 2012 | B1 |
8260914 | Ranjan | Sep 2012 | B1 |
8291468 | Chickering | Oct 2012 | B1 |
8316440 | Hsieh et al. | Nov 2012 | B1 |
8321936 | Green et al. | Nov 2012 | B1 |
8347100 | Thornewell et al. | Jan 2013 | B1 |
8359651 | Wu et al. | Jan 2013 | B1 |
8370938 | Daswani et al. | Feb 2013 | B1 |
8402543 | Ranjan et al. | Mar 2013 | B1 |
8407324 | McDougal | Mar 2013 | B2 |
8438639 | Lee et al. | May 2013 | B2 |
8443363 | Brennan, III et al. | May 2013 | B1 |
8443449 | Fan et al. | May 2013 | B1 |
8464341 | Cohen | Jun 2013 | B2 |
8479295 | Sahita et al. | Jul 2013 | B2 |
8484732 | Chen et al. | Jul 2013 | B1 |
8484739 | Seshadri | Jul 2013 | B1 |
8495742 | Abadi et al. | Jul 2013 | B2 |
8510827 | Leake et al. | Aug 2013 | B1 |
8516591 | Fly et al. | Aug 2013 | B2 |
8521667 | Zhu et al. | Aug 2013 | B2 |
8533842 | Satish | Sep 2013 | B1 |
8539577 | Stewart et al. | Sep 2013 | B1 |
8566946 | Aziz | Oct 2013 | B1 |
8572740 | Mashevsky et al. | Oct 2013 | B2 |
8578481 | Rowley | Nov 2013 | B2 |
8584239 | Aziz et al. | Nov 2013 | B2 |
8631489 | Antonakakis et al. | Jan 2014 | B2 |
8646071 | Pereira et al. | Feb 2014 | B2 |
8646088 | Pistoia et al. | Feb 2014 | B2 |
8656491 | Daswani et al. | Feb 2014 | B1 |
8677487 | Balupari et al. | Mar 2014 | B2 |
8683584 | Daswani et al. | Mar 2014 | B1 |
8707441 | Cidambi et al. | Apr 2014 | B1 |
8756691 | Nachenberg | Jun 2014 | B2 |
8763125 | Feng | Jun 2014 | B1 |
8826426 | Dubey | Sep 2014 | B1 |
8838570 | English | Sep 2014 | B1 |
8966625 | Zuk et al. | Feb 2015 | B1 |
9003526 | El-Moussa | Apr 2015 | B2 |
9117079 | Huang | Aug 2015 | B1 |
9223962 | Kashyap | Dec 2015 | B1 |
9317680 | Carter, III et al. | Apr 2016 | B2 |
20040030913 | Liang et al. | Feb 2004 | A1 |
20040107416 | Buban et al. | Jun 2004 | A1 |
20050177602 | Kaler et al. | Aug 2005 | A1 |
20050283837 | Olivier | Dec 2005 | A1 |
20060021029 | Brickell et al. | Jan 2006 | A1 |
20060168024 | Mehr et al. | Jul 2006 | A1 |
20070039053 | Dvir | Feb 2007 | A1 |
20070050850 | Katoh et al. | Mar 2007 | A1 |
20070106986 | Worley et al. | May 2007 | A1 |
20070192857 | Ben-Itzhak | Aug 2007 | A1 |
20070261112 | Todd et al. | Nov 2007 | A1 |
20080016552 | Hart | Jan 2008 | A1 |
20080127338 | Cho et al. | May 2008 | A1 |
20080155694 | Kwon et al. | Jun 2008 | A1 |
20080177755 | Stern et al. | Jul 2008 | A1 |
20080177994 | Mayer | Jul 2008 | A1 |
20080209562 | Szor | Aug 2008 | A1 |
20080229393 | Congdon | Sep 2008 | A1 |
20080256633 | Arnold et al. | Oct 2008 | A1 |
20080263658 | Michael et al. | Oct 2008 | A1 |
20080263659 | Alme | Oct 2008 | A1 |
20080320594 | Jiang | Dec 2008 | A1 |
20090007100 | Field et al. | Jan 2009 | A1 |
20090019547 | Palliyil et al. | Jan 2009 | A1 |
20090055928 | Kang et al. | Feb 2009 | A1 |
20090144826 | Piccard | Jun 2009 | A2 |
20090150419 | Kim et al. | Jun 2009 | A1 |
20090235357 | Ebringer | Sep 2009 | A1 |
20090241190 | Todd et al. | Sep 2009 | A1 |
20090254989 | Achan et al. | Oct 2009 | A1 |
20090265786 | Xie et al. | Oct 2009 | A1 |
20090282485 | Bennett | Nov 2009 | A1 |
20090288167 | Freericks et al. | Nov 2009 | A1 |
20100037314 | Perdisci et al. | Feb 2010 | A1 |
20100043072 | Rothwell | Feb 2010 | A1 |
20100077481 | Polyakov et al. | Mar 2010 | A1 |
20100107252 | Mertoguno | Apr 2010 | A1 |
20100115586 | Raghavan et al. | May 2010 | A1 |
20100154059 | McNamee et al. | Jun 2010 | A1 |
20100162350 | Jeong et al. | Jun 2010 | A1 |
20100175132 | Zawadowskiy et al. | Jul 2010 | A1 |
20100281458 | Paladino | Nov 2010 | A1 |
20110041179 | St. Hlberg | Feb 2011 | A1 |
20110055923 | Thomas | Mar 2011 | A1 |
20110090911 | Hao et al. | Apr 2011 | A1 |
20110099620 | Stavrou et al. | Apr 2011 | A1 |
20110161955 | Woller et al. | Jun 2011 | A1 |
20110167495 | Antonakakis et al. | Jul 2011 | A1 |
20110173698 | Polyakov et al. | Jul 2011 | A1 |
20110185425 | Lee et al. | Jul 2011 | A1 |
20110208714 | Soukal et al. | Aug 2011 | A1 |
20110239299 | Chen et al. | Sep 2011 | A1 |
20110252474 | Ward et al. | Oct 2011 | A1 |
20110271342 | Chung et al. | Nov 2011 | A1 |
20110283360 | Abadi et al. | Nov 2011 | A1 |
20110296412 | Banga et al. | Dec 2011 | A1 |
20110296486 | Burch et al. | Dec 2011 | A1 |
20120042381 | Antonakakis et al. | Feb 2012 | A1 |
20120054869 | Yen et al. | Mar 2012 | A1 |
20120084860 | Cao et al. | Apr 2012 | A1 |
20120089700 | Safruti et al. | Apr 2012 | A1 |
20120096549 | Amini et al. | Apr 2012 | A1 |
20120117650 | Nachenberg | May 2012 | A1 |
20120117652 | Manni et al. | May 2012 | A1 |
20120192274 | Odom et al. | Jul 2012 | A1 |
20120233691 | Jiang | Sep 2012 | A1 |
20120240224 | Payne et al. | Sep 2012 | A1 |
20120255018 | Sallam | Oct 2012 | A1 |
20120255019 | McNamee et al. | Oct 2012 | A1 |
20120255021 | Sallam | Oct 2012 | A1 |
20120255031 | Sallam | Oct 2012 | A1 |
20120278889 | El-Moussa | Nov 2012 | A1 |
20120291042 | Stubbs et al. | Nov 2012 | A1 |
20120291131 | Turkulainen et al. | Nov 2012 | A1 |
20130014259 | Gribble et al. | Jan 2013 | A1 |
20130047147 | McNeill | Feb 2013 | A1 |
20130055394 | Beresnevichiene et al. | Feb 2013 | A1 |
20130091350 | Gluck | Apr 2013 | A1 |
20130091570 | McCorkendale et al. | Apr 2013 | A1 |
20130104230 | Tang et al. | Apr 2013 | A1 |
20130145002 | Kannan et al. | Jun 2013 | A1 |
20130145008 | Kannan et al. | Jun 2013 | A1 |
20130152200 | Alme | Jun 2013 | A1 |
20130227165 | Liu | Aug 2013 | A1 |
20130232574 | Carothers | Sep 2013 | A1 |
20130246685 | Bhargava et al. | Sep 2013 | A1 |
20130298184 | Ermagan et al. | Nov 2013 | A1 |
20130298192 | Kumar et al. | Nov 2013 | A1 |
20130298230 | Kumar et al. | Nov 2013 | A1 |
20130298242 | Kumar et al. | Nov 2013 | A1 |
20130298243 | Kumar et al. | Nov 2013 | A1 |
20130298244 | Kumar et al. | Nov 2013 | A1 |
20130326625 | Anderson et al. | Dec 2013 | A1 |
20140059641 | Chapman et al. | Feb 2014 | A1 |
20140096131 | Sonnek et al. | Apr 2014 | A1 |
20140283037 | Sikorski et al. | Sep 2014 | A1 |
20140337836 | Ismael | Nov 2014 | A1 |
20140380474 | Paithane et al. | Dec 2014 | A1 |
20150058984 | Shen et al. | Feb 2015 | A1 |
20150067862 | Yu | Mar 2015 | A1 |
Number | Date | Country |
---|---|---|
2012134584 | Oct 2012 | WO |
2013067505 | May 2013 | WO |
2013067508 | May 2013 | WO |
2013134206 | Sep 2013 | WO |
Entry |
---|
Author Unknown, “FireEye Malware Analysis”, FireEye.com, FireEye, Inc., 2010. |
Author Unknown, “Hybrid Sandboxing for Detecting and Analyzing Advanced and Unknown Malware”, Blue Coat Systems, Inc., 2014. |
Lindorfer et al., “Detecting Environment-Sensitive Malware”, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, 2011. |
Singh et al., “Hot Knives Through Butter: Evading File-based Sandboxes”, FireEye, Inc., Feb. 2014. |
Author Unknown, “Multi-Vector Virtual Execution (MVX) Engine”, FireEye, Inc., http://www.fireeye.com/products-and-solutions/virtual-execution-engine.html, 2014. |
Wagener et al., “An Instrumented Analysis of Unknown Software and Malware Driven by Free Libre Open Source Software”, Signal Image Technology and Internet Based Systems, 2008. SITIS'08. IEEE International Conference on. IEEE, 2008. |
Lau et al., “Measuring Virtual Machine Detection in Malware using DSD Tracer”, Sophoslabs, Journal in Computer Virology, 2008. |
Davidoff et al., “Chapter 12: Malware Forensics”, Network Forensics: Tracking Hackers Through Cyberspace, Pearson Education Inc., Jun. 2012, 60 pages. |
Ligh et al., “Chapter 5: Researching Domains and IP Addresses,” Malware Analyst's Cookbook, John Wiley & Sons, 2011, 38 pages. |
van der Heide et al., “DNS Anomaly Detection,” System and Network Engineering Research Group, University of Amsterdam, Feb. 6, 2011, 20 pages. |
Abu Rajab et al., “A Multifaceted Approach to Understanding the Botnet Phenomenon,” Proceedings of the 6th ACM SIGCOMM Conference on Internet Measurement, 2006, 12 pages. |
Schechter et al., “Fast Detection of Scanning Worm Infections,” Recent Advances in Intrusion Detection: 7th International Symposium RAID 2004 Proceedings, 2004, 24 pages. |
Sikorski et al., “Chapter 14: Malware-Focused Network Signatures,” Practical Malware Analysis, No Starch Press, Feb. 2012, 13 pages. |
Chen et al., “Chapter 4: Guarding Against Network Intrusions,” Network and System Security, Elsevier Inc., 2009, 5 pages. |
Zang et al., “Botnet Detection Through Fine Flow Classification”, CSE Dept. Technical Report No. CSE11-001, p. 1-17, Jan. 31, 2011. |
Landecki, Grzegorz, Detecting Botnets, Linux Journal, Jan. 1, 2009. |
Karasaridis, Anestis et al., Wide-scale Botnet Detection and Characterization, Dec. 14, 2010. |
Author Unknown, Advanced Persistent Threats (APT), What's an APT? A Brief Definition, Damballa, Dec. 14, 2010. |
Author Unknown, Executive Overview, The Command Structure of the Aurora Botnet, Damballa, Mar. 2010. |
Strayer, W. Timothy et al. Detecting Botnets with Tight Command and Control, BBN Technologies, Nov. 2006. |
Ramachandran, Anirudh et al., Revealing Botnet Membership Using DNSBL Counter-Intelligence, Jul. 7, 2006. |
Goebel, Jan et al., Rishi: Identify Bot Contaminated Hosts by IRC Nickname Evaluation, Apr. 2007. |
Gu, Guofei et al., BotSniffer: Detecting Botnet Command and Control Channels in Network Traffic, Feb. 2008. |
Gu, Guofei et al., BotHunter: Detecting Malware Infection Through IDS-Driven Dialog Correlation, Aug. 2007. |
Gu, Guofei et al., BotMiner: Clustering Analysis of Network Traffic for Protocol- and Structure-Independent Botnet Detection, Jul. 2008. |
Royal, Paul, Analysis of the Kraken Botnet, Damballa, Apr. 9, 2008. |
Livadas, Carl et al., Using Machine Learning Techniques to Identify Botnet Traffic, BBN Technologies, Nov. 2006. |
Binkley, James R. et al., An Algorithm for Anomaly-based Botnet Detection, Jul. 2006. |
Yen, Ting-Fang et al., Traffic Aggregation for Malware Detection, Jul. 2008. |
Author Unknown, Anatomy of a Targeted Attack, Damballa, Dec. 3, 2008. |
Author Unknown, Layer 8, How and Why Targeted Attacks Exploit Your Users, Damballa, Nov. 2011. |
Author Unknown, Targeted Attacks for Fun and Profit, An Executive Guide to a New and Growing Enterprise Threat, Damballa, Oct. 13, 2008. |
Author Unknown, AV, IDS/IPS and Damballa's Response to Targeted Attacks, A Technology Comparison, Damballa, Nov. 2008. |
Author Unknown, Update on the Enemy, A Deconstruction of Who Profits From Botnets, Damballa, May 13, 2009. |
Author Unknown, A Day in the Life of a BotArmy, Damballa, 2008. |
Ollmann, Gunter, Botnet Communication Topologies, Understanding the Intricacies of Botnet Command and Control, Damballa, Jun. 2009. |
Ollmann, Gunter, The Botnet vs. Malware Relationship, The One-to-One Botnet Myth, Damballa, Jun. 2009. |
Author Unknown, Closed Window, How Failsafe Enhancements Dramatically Limit Opportunities for Malware Armies and other Targeted Attacks, Damballa, Sep. 23, 2009. |
Author Unknown, Damballa's In-The-Cloud Security Model, Enterprise Protection Moves Beyond the Network Perimeter, Damballa, Aug. 24, 2008. |
Ollmann, Gunter, Extracting CnC from Malware, The Role of malware Sample Analysis in Botnet Detection, Damballa, Dec. 8, 2009. |
Ollmann, Gunter, The Opt-In Botnet Generation, Hacktivism and Centrally Controlled Protesting, Social Networks, Damballa, Apr. 26, 2010. |
Ollmann, Gunter, Serial Variant Evasion Tactics, Techniques Used to Automatically Bypass Antivirus Technologies, Damballa, Oct. 7, 2009. |
Author Unknown, Damballa: A Different Approach, Targeted Attacks Require a New Solution, Damballa, Sep. 23, 2008. |
Author Unknown, Trust Betrayed, What to Do When a Targeted Attack Turns Your Networks Against You, Damballa, Feb. 22, 2008. |
Author Unknown, How to Be a Hero in the War Against BotArmies, Damballa, 2008. |
Giroire, Frederic et al., Exploiting Temporal Persistence to Detect Covert Botnet Channels, Sep. 2009. |
Russ White, “High Availability in Routing”, Mar. 2004, Cisco Systems, vol. 7, Issue 1, pp. 2-14. |
Yadav et al., “Detecting Algorithmically Generated Malicious Domain Names”, Nov. 2010. |
Dittrich et al., P2P as Botnet Command and Control; A Deeper Insight, 2008 3rd International Conference on Malicious and Unwanted Software (MALWARE), Oct. 2008, IEEE, vol. 10, pp. 41-48. |
Nazario et al., As the Net Churns: Fast-Flux Botnet Observations, IEEE, pp. 24-31, Sep. 5, 2008. |
Sun et al., “Malware Virtualization-Resistant Behavior Detection”, 2011 IEEE, pp. 912-917. |