Nefarious individuals attempt to compromise computer systems in a variety of ways. As an example, such individuals may embed or otherwise include malicious software (“malware”) in email attachments and transmit or cause them to be transmitted to unsuspecting users. An administrator of the recipient computer system can attempt to prevent compromise by blocking all downloads of all attachments by the computer system. However, such a policy will also prevent legitimate attachments from being available to the user. As an alternate approach, the administrator can require that a security scan be performed prior to the download of an attachment. Unfortunately, malware authors are crafting increasingly sophisticated malware that is able to evade detection. Accordingly, there exists an ongoing need for improved techniques to detect malware and prevent its harm.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
In the example shown in
Appliance 102 can take a variety of forms. For example, appliance 102 can be a dedicated device or set of devices. The functionality provided by appliance 102 can also be integrated into or executed as software on a general purpose computer, a computer server, a gateway, and/or a network/routing device. For example, in some embodiments services provided by data appliance 102 are instead (or in addition) provided to client 104 by software executing on client 104.
Whenever appliance 102 is described as performing a task, a single component, a subset of components, or all components of appliance 102 may cooperate to perform the task. Similarly, whenever a component of appliance 102 is described as performing a task, a subcomponent may perform the task and/or the component may perform the task in conjunction with other components. In various embodiments, portions of appliance 102 are provided by one or more third parties. Depending on factors such as the amount of computing resources available to appliance 102, various logical components and/or features of appliance 102 may be omitted and the techniques described herein adapted accordingly. Similarly, additional logical components/features can be added to appliance 102 as applicable.
As will be described in more detail below, appliance 102 can be configured to work in cooperation with one or more virtual machine servers (112, 124) to perform malware analysis/prevention. As one example, data appliance 102 can be configured to provide a copy of malware 130 to one or more of the virtual machine servers for real-time analysis. As another example, service 122 can provide a list of signatures of known-malicious documents to appliance 102 as part of a subscription. Those signatures can be generated by service 122 in conjunction with the techniques described herein.
An example of a virtual machine server is a physical machine comprising commercially available server-class hardware (e.g., a multi-core processor, 4+ Gigabytes of RAM, and one or more Gigabit network interface adapters) that runs commercially available virtualization software, such as VMware ESXi, Citrix XenServer, or Microsoft Hyper-V. The virtual machine servers may be separate from, but in communication with, data appliance 102, as shown in the accompanying figures.
Using Virtual Machines to Analyze Attachments
A virtual machine (VM) can be used to perform behavior profiling (e.g., in a VM sandbox environment) using various heuristic-based analysis techniques that can be performed in real-time during a file transfer (e.g., during a file/attachment download). As one example, suppose a malicious user of system 120 sends an email message to a user of client 104 that includes a malicious attachment. The attachment may be an executable (e.g., having a .exe extension) and may also be a document (e.g., a .doc or .pdf file). The message is received by data appliance 102, which determines whether a signature for the attachment is present on data appliance 102. A signature, if present, can indicate that the attachment is known to be safe, and can also indicate that the attachment is known to be malicious. If no signature for the attachment is found, data appliance 102 is configured to provide the attachment to a virtual machine server, such as virtual machine server 112, for analysis.
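By way of illustration only, the following is a minimal sketch of the signature-check flow described above. The helper names (submit_to_vm_server, deliver, block) and the use of simple digest sets as the signature store are assumptions made for the example and do not correspond to any particular appliance implementation.

```python
import hashlib

# Hypothetical signature stores: digests of attachments known to be safe/malicious.
known_safe = set()
known_malicious = set()

def handle_attachment(attachment_bytes, submit_to_vm_server, deliver, block):
    """Dispose of an attachment based on a hash-based signature lookup:
    deliver it if known safe, block it if known malicious, and otherwise
    hand it off to a virtual machine server for dynamic analysis."""
    digest = hashlib.md5(attachment_bytes).hexdigest()  # e.g., an MD5-based signature
    if digest in known_safe:
        deliver(attachment_bytes)
    elif digest in known_malicious:
        block(digest)
    else:
        submit_to_vm_server(attachment_bytes)

# Example usage with stand-in callbacks:
handle_attachment(
    b"%PDF-1.4 example attachment",
    submit_to_vm_server=lambda data: print("submitted for VM analysis"),
    deliver=lambda data: print("delivered to client"),
    block=lambda digest: print("blocked", digest),
)
```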
Virtual machine server 112 is configured to execute (or open, as applicable) the attachment in one or more virtual machines 114-116. The virtual machines may all execute the same operating system (e.g., Microsoft Windows) or may execute different operating systems or versions thereof (e.g., with VM 116 emulating an Android operating system). In some embodiments, the VM(s) chosen to analyze the attachment are selected to match the operating system of the intended recipient of the attachment being analyzed (e.g., the operating system of client 104). Observed behaviors resulting from executing/opening the attachment (such as changes to certain platform, software, or registry settings) are logged and analyzed for indications that the attachment is malicious. In some embodiments the log analysis is performed by the VM server (e.g., VM server 112). In other embodiments, the analysis is performed at least in part by appliance 102. The malware analysis and enforcement functionality provided by data appliance 102 and the virtual machine server(s) is also referred to herein as malware analysis system 132.
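By way of illustration only, the following sketch shows one way an analysis VM could be selected to match the intended recipient's operating system and its observed behaviors collected. The VM inventory, the run_in_vm helper, and the log format are assumptions for the example.

```python
# Hypothetical inventory mapping operating systems to analysis VM images.
AVAILABLE_VMS = {
    "windows7": "vm-114",
    "windows10": "vm-115",
    "android": "vm-116",
}

def analyze_attachment(attachment_bytes, recipient_os, run_in_vm):
    """Open/execute the attachment in a VM matching the recipient's OS and
    return the logged behaviors (e.g., platform, software, or setting changes)."""
    vm_id = AVAILABLE_VMS.get(recipient_os, "vm-114")  # fall back to a default image
    return run_in_vm(vm_id, attachment_bytes)

# Example usage with a stub analysis backend:
log = analyze_attachment(b"sample bytes", "android",
                         run_in_vm=lambda vm, data: ["setting_change:example"])
print(log)
```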
If the attachment is determined to be malicious, appliance 102 can automatically block the file download based on the analysis result. Further, a signature can be generated and distributed (e.g., to other data appliances) to automatically block future file transfer requests to download the file determined to be malicious.
Detecting Anti-Virtual Machine Actions
Malware authors use increasingly sophisticated techniques when crafting their malware so that it avoids detection by security systems. One such technique is to have the malware attempt to determine whether it is executing in a virtual machine environment, and if so, to stop executing or otherwise not engage in malicious activities.
In various embodiments, malware analysis system 132 is configured to detect attempts by a sample (e.g., an attachment executing in a virtual machine) to determine that it is executing within a virtual machine environment. Any such attempts (also referred to herein as “anti-virtual machine actions”) are treated as malicious actions and will result in the attachment being classified as malicious. A variety of techniques for detecting anti-virtual machine actions are described below.
At 304, the candidate malware (e.g., an attachment received for analysis) is analyzed using one or more virtual machines. For example, the candidate malware can be executed in virtual machine 114 and any behaviors logged for analysis by system 132. As another example, the candidate malware can be executed in virtual machines 126-128 and analyzed by service 122.
Examples of anti-virtual machine actions include querying identifiers whose values can reveal a virtualized environment, such as the product ID, the computer name, the hard drive name, or the MAC address.
Additional anti-virtual machine actions are discussed below.
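By way of illustration only, the following sketch classifies a logged behavior trace as containing an anti-virtual machine action when a call argument references a well-known virtualization artifact. The trace format and the indicator substrings are assumptions for the example, not an exhaustive rule set.

```python
# Illustrative indicators of virtualization artifacts a sample might probe for.
VM_ARTIFACT_SUBSTRINGS = ("vmware", "vbox", "virtualbox", "qemu", "xen")

def contains_anti_vm_action(call_log):
    """Return True if any logged call argument references a virtualization artifact."""
    for api_name, argument in call_log:
        if any(s in str(argument).lower() for s in VM_ARTIFACT_SUBSTRINGS):
            return True
    return False

# Example: a sample opening a registry key associated with VMware tooling.
print(contains_anti_vm_action([("RegOpenKeyA", r"SOFTWARE\VMware, Inc.\VMware Tools")]))  # True
```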
At 306, a determination is made as to whether anti-virtual machine actions (such as, but not limited to, those described herein) have taken place. If so, at 308, output is generated that indicates that the candidate malware is malicious. As one example, at 308 a signature for the attachment is generated (e.g., as an MD5 hash-based signature). As another example, instead of or in addition to generating a signature, an alert is generated that instructs data appliance 102 not to provide the attachment to client 104.
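By way of illustration only, the following sketch shows output generation for a sample judged malicious: an MD5 hash-based signature plus an alert record instructing the appliance not to deliver the attachment. The alert format is an assumption for the example.

```python
import hashlib
import json

def generate_verdict(attachment_bytes, is_malicious):
    """For a malicious sample, return an MD5 hash-based signature together with
    an alert that a data appliance could consume; otherwise return None."""
    if not is_malicious:
        return None
    signature = hashlib.md5(attachment_bytes).hexdigest()
    alert = {"action": "do_not_deliver", "signature": signature}
    return json.dumps(alert)

print(generate_verdict(b"example attachment bytes", is_malicious=True))
```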
In various embodiments, system 132 is configured to thwart anti-virtual machine actions. For example, virtual machine server 112 can be configured to provide virtual machines such as virtual machine 114 with randomized product IDs. As another example, the computer name associated with virtual machine 114 can be randomized. As yet another example, the computer name associated with virtual machine 114 can be set to a very common name, such as “HOME” or “WORKGROUP.” As yet another example, the hard drive name can be randomized or set to a very common name. As yet another example, the MAC address can be randomized.
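By way of illustration only, the following sketch generates randomized (or deliberately common) guest identifiers of the kinds mentioned above. Applying such values to an actual virtual machine would require hypervisor-specific configuration that is not shown, and the specific formats are assumptions for the example.

```python
import random
import string

COMMON_COMPUTER_NAMES = ["HOME", "WORKGROUP"]  # deliberately common names

def random_token(length=8):
    return "".join(random.choices(string.ascii_uppercase + string.digits, k=length))

def random_mac():
    return ":".join(f"{random.randint(0, 255):02x}" for _ in range(6))

def build_guest_profile():
    """Return a randomized guest configuration intended to defeat simple
    fingerprint checks of the product ID, computer name, disk name, and MAC."""
    return {
        "product_id": f"{random.randint(10000, 99999)}-OEM-{random_token(7)}",
        "computer_name": random.choice(COMMON_COMPUTER_NAMES + [random_token()]),
        "hard_drive_name": random.choice(["ST500DM002"] + [random_token()]),  # common model or random
        "mac_address": random_mac(),
    }

print(build_guest_profile())
```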
VM-Specific Opcodes
An additional way that malware can attempt to ascertain whether it is being run in a virtual machine is by attempting to use a specific sequence of opcodes that is only supported in given virtualized environments. An example of such a sequence is illustrated in the accompanying figures.
In some embodiments, system 132 is configured to perform static analysis of candidate malware. In particular, it is configured to look for the presence in the candidate malware of functions/methods/opcodes that are only supported in virtualized environments.
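By way of illustration only, the following sketch scans a sample's raw bytes for an instruction pattern associated with a particular virtualized environment. The single pattern shown is the x86 encoding of loading the widely documented VMware “backdoor” magic value into EAX; a real analyzer would maintain a larger, curated pattern set.

```python
# Illustrative VM-specific byte patterns (not exhaustive).
# b8 68 58 4d 56 encodes "mov eax, 0x564D5868" -- the 'VMXh' magic value.
VM_OPCODE_PATTERNS = {
    "vmware_backdoor_magic": bytes.fromhex("b868584d56"),
}

def find_vm_specific_opcodes(data: bytes):
    """Return the names of any VM-specific byte patterns present in the sample."""
    return [name for name, pattern in VM_OPCODE_PATTERNS.items() if pattern in data]

# Example: a buffer containing the pattern surrounded by NOPs.
sample = b"\x90\x90" + bytes.fromhex("b868584d56") + b"\x90"
print(find_vm_specific_opcodes(sample))  # ['vmware_backdoor_magic']
```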
In addition to or instead of performing such static analysis, system 132 can also be configured to apply one or more hotpatches to a virtual machine such as virtual machine 114. As one example, a hotpatch can be used to hook the hypervisor layer to return a FAIL (or a random string) instead of revealing to the malware that it is executing in a virtualized environment (e.g., because a string containing “vmware” would otherwise be returned). As another example, the hotpatch can be used to implement detailed logging (e.g., to obtain a detailed call graph).
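By way of illustration only, the following sketch mimics the effect of such a hook at the level of a single guest-visible query: the intercepted call is logged (which could feed a detailed call graph) and any result that would reveal a virtualization vendor is replaced with a random string. A real hotpatch would operate at the hypervisor or guest API layer rather than on a Python function; the query_bios_vendor stand-in is an assumption for the example.

```python
import random
import string

call_log = []  # detailed record of intercepted calls (usable for a call graph)

def hook(original_func):
    """Wrap a guest-visible query so the interception is logged and any result
    revealing a virtualization vendor is replaced with a random string."""
    def wrapper(*args, **kwargs):
        call_log.append((original_func.__name__, args, kwargs))
        result = original_func(*args, **kwargs)
        if isinstance(result, str) and "vmware" in result.lower():
            return "".join(random.choices(string.ascii_uppercase, k=8))  # or simply "FAIL"
        return result
    return wrapper

@hook
def query_bios_vendor():          # stand-in for a query the sample might make
    return "VMware, Inc."

print(query_bios_vendor())        # randomized string rather than "VMware, Inc."
print(call_log)                   # the interception was also logged
```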
Sophisticated malware might attempt to detect whether hotpatches have been applied (i.e., the running environment has been hooked). As an example, suppose the malware calls “LoadLibraryA” and “GetProcAddress” to get the original binary code from system files, and compares it with the version that is in memory (e.g., using memcmp). In some embodiments, system 132 is configured to monitor for such function calls. For example, which API calls are being made can be logged and analyzed. If calls indicative of attempts to detect API hooking are observed, the candidate malware is deemed to be taking anti-virtual machine actions and is flagged as malicious accordingly. Further, system 132 can be configured to thwart hotpatch detection by allowing a portion of memory to be writable and executable, but not readable.
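By way of illustration only, the following sketch flags hook-detection behavior from an ordered API-call trace by looking for the pattern described above (loading a system library, resolving a function address, and comparing the on-disk bytes with the in-memory copy). The trace format is an assumption for the example.

```python
# Calls that, in this order, suggest the sample is checking for API hooks.
HOOK_DETECTION_SEQUENCE = ["LoadLibraryA", "GetProcAddress", "memcmp"]

def detects_api_hooking(api_trace):
    """Return True if the calls in HOOK_DETECTION_SEQUENCE appear in order
    (not necessarily consecutively) within the trace."""
    position = 0
    for call in api_trace:
        if call == HOOK_DETECTION_SEQUENCE[position]:
            position += 1
            if position == len(HOOK_DETECTION_SEQUENCE):
                return True
    return False

trace = ["CreateFileA", "LoadLibraryA", "GetProcAddress", "ReadFile", "memcmp"]
print(detects_api_hooking(trace))  # True -> treat as an anti-virtual machine action
```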
In the event that malware determines it is operating in a hooked environment, it might simply cease executing. It may also attempt to restore the hooked portion back to its original state. System 132 can monitor for any such restore attempts and, if they are observed, treat them as anti-virtual machine actions. In some embodiments, system 132 is configured to take steps to counteract any efforts by malware to revert hooked portions. As one example, a standalone thread can be used to periodically check that hotpatches that ought to have been applied are still present. A second approach is to perform on-demand checking. For example, whenever the function “WriteFile” is called, system 132 can check whether “CreateFileA”/“CreateFileW” has been restored (i.e., whether its hook has been removed).
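By way of illustration only, the following sketch shows both re-checking strategies against a hypothetical hook registry: a standalone thread that periodically verifies expected hotpatches are still present, and an on-demand check performed whenever WriteFile is invoked. is_hook_present and reapply_hook stand in for whatever byte-level probe and re-patching a real system would use.

```python
import threading
import time

# Hypothetical registry of hooks that ought to be present in the guest.
hooks = {"CreateFileA": True, "CreateFileW": True, "WriteFile": True}

def is_hook_present(api_name):
    return hooks.get(api_name, False)   # stand-in for inspecting patched bytes

def reapply_hook(api_name):
    hooks[api_name] = True              # stand-in for re-installing the hotpatch

def periodic_hook_check(interval_seconds, stop_event):
    """Standalone thread: periodically verify expected hotpatches remain applied."""
    while not stop_event.is_set():
        for api_name in list(hooks):
            if not is_hook_present(api_name):
                reapply_hook(api_name)   # and/or record an anti-VM action
        time.sleep(interval_seconds)

def on_write_file(data):
    """On-demand check: whenever WriteFile runs, verify the CreateFileA/W hooks
    have not been reverted by the sample before proceeding."""
    for api_name in ("CreateFileA", "CreateFileW"):
        if not is_hook_present(api_name):
            reapply_hook(api_name)       # and/or record an anti-VM action
    # ... proceed with the (hooked) WriteFile behavior on `data` ...

stop = threading.Event()
threading.Thread(target=periodic_hook_check, args=(0.1, stop), daemon=True).start()
on_write_file(b"example")
stop.set()
```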
Sleep
One additional approach that malware might use to attempt to evade detection while executing in a virtualized environment is the use of a sleep function. Specifically, the malware may attempt to sleep for an extended period of time—either through repeated sleep calls or through a single, lengthy sleep. In some embodiments, system 132 is configured to modify the behavior of the sleep function through API-hooking. For example, a maximum sleep duration can be specified. As another example, repeated calls to sleep can be marked as suspicious and/or subsequent sleeps can be ignored.
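By way of illustration only, the following sketch shows a hooked sleep with the two mitigations mentioned above: a cap on any single sleep and a threshold after which repeated sleeps are flagged as suspicious and ignored. The particular limits are assumptions for the example.

```python
import time

MAX_SLEEP_SECONDS = 30        # illustrative cap on any single sleep
SUSPICIOUS_SLEEP_CALLS = 10   # illustrative threshold for "repeated" sleeps

sleep_calls = 0
suspicious = False

def hooked_sleep(requested_seconds):
    """Replacement for the guest sleep call: cap the duration, count calls, and
    ignore further sleeps once the sample has slept suspiciously many times."""
    global sleep_calls, suspicious
    sleep_calls += 1
    if sleep_calls > SUSPICIOUS_SLEEP_CALLS:
        suspicious = True
        return                                     # ignore subsequent sleeps
    time.sleep(min(requested_seconds, MAX_SLEEP_SECONDS))

for _ in range(12):
    hooked_sleep(0.01)
print("flagged as suspicious:", suspicious)        # True
```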
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
This application is a continuation of co-pending U.S. patent application Ser. No. 13/631,654, entitled DETECTING MALWARE, filed Sep. 28, 2012, which is incorporated herein by reference for all purposes.
Related U.S. Application Data: parent U.S. application Ser. No. 13/631,654, filed Sep. 2012 (US); child U.S. application Ser. No. 14/793,459 (US).