Embodiments of the disclosure relate to cybersecurity. More particularly, one embodiment of the disclosure relates to an analytic tool and corresponding method for automatically generating malware detection rule recommendations based on events monitored by a cybersecurity system.
Network devices provide useful and necessary services that assist individuals in business and in their everyday lives. In recent years, a growing number of cyberattacks are being conducted on all types of network devices. In some cases, these cyberattacks are orchestrated in an attempt to gain access to content stored on one or more network devices. Such access is for illicit (i.e., unauthorized) purposes, such as spying or other malicious or nefarious activities. For protection, cybersecurity appliances may be deployed at a local network in efforts to detect a cyberattack caused by a malicious object being uploaded to a network device.
Currently, some advanced cybersecurity appliances perform a two-phase approach for detecting malware contained in network traffic. This two-phase approach includes a static phase and a dynamic phase. During the dynamic phase, a virtual machine deployed within the cybersecurity appliance executes objects obtained from the network traffic being analyzed and monitors the behaviors of each object during execution. Each behavior, also referred to as an "event," includes meta-information associated with that event.
Conventional cybersecurity appliances rely on malware detection rules to control which events are monitored and subsequently analyzed in efforts to classify the objects under analysis as either malicious (malware) or benign. As a result, the malware detection rules greatly influence the effectiveness of the cybersecurity appliance in determining whether or not an object is associated with malware. Given a constantly changing threat landscape, the malware detection rules are frequently evaluated and updated to maintain their effectiveness.
The generation of malware detection rules is a highly specialized, time-intensive task. Currently, in response to an uncovered analytical error committed by a cybersecurity system (e.g., an object misclassification), a human analyst may receive an arcane report listing hundreds or even thousands of detected events that were captured during analysis of the misclassified object at the cybersecurity system. From these detected events, besides attempting to identify trends associated with malware, the analyst is responsible for (i) identifying detected events that are highly suggestive of the object being malicious or benign, and (ii) generating malware detection rule updates to avoid such object misclassifications in the future. Given the subjective nature of the review, the manual generation of these malware detection rule updates is prone to sub-optimal detection rule generation or even (in some cases) error. Also, the slow, arduous review and selection of detected events by an analyst for use as the basis for the malware detection rules greatly delays the release of malware detection rule updates, leaving the analyst with little-to-no time to re-evaluate whether any currently deployed malware detection rules are becoming stale (e.g., less efficient or efficacious). Hence, over time, a significant percentage of the malware detection rules become repetitive (e.g., in that they detect the same malware as other malware detection rules) or non-effective (e.g., in that they no longer detect malware that may be modifying its form or functioning to avoid detection), which inadvertently wastes system resources. This waste of system resources may lead to resource over-utilization, namely system detection inefficiencies resulting in an increase in false negative (FN) detections and/or false positive (FP) classifications.
Embodiments of the disclosure are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
In general, one embodiment of the disclosure relates to a malware detection rule generation system that is designed to shorten the time duration currently needed to create efficacious malware detection rules in efforts to address the constantly changing threat landscape. For this embodiment of the disclosure, the rule generation system includes a receiver configured to receive meta-information associated with a plurality of events (e.g., monitored characteristics or behaviors) detected during malware analysis of an object by one or more cybersecurity systems (e.g., cybersecurity appliances and/or detection agents deployed on network devices). The received meta-information may be obtained from a log that maintains detected events based on operations performed by malware detection rules associated with dynamic analysis (e.g., isolated execution) as well as static analysis (in which object characteristics are analyzed without execution), correlation of detected events, and/or classification of the object (determining whether to classify the object as malicious, benign or any other selected classification to categorize the object).
Each cybersecurity system is configured to conduct analyses of objects to determine whether any of the objects may be associated with malware. Each analysis may include (i) inspection of the objects as to form or content, (ii) processing an object within one or more virtual machines, and monitoring for selected events or combinations of events of the object and/or the virtual machine produced during such processing, and/or (iii) detecting occurrences of any of these monitored events or combinations of events or an absence of such event(s). Hence, the receiver is configured to receive meta-information associated with monitored events from each cybersecurity system.
As described below, according to one embodiment of the disclosure, the receiver of the rule generation system includes a parser and feature extraction logic. The parser extracts meta-information associated with the monitored events and converts the meta-information into a structured format (if not already in such format) according to event type. The feature extraction logic is configured to access one or more portions of the meta-information associated with each monitored event (hereinafter, “feature”) and provide that feature to the rule recommendation subsystem as described below.
Herein, a “feature” may be categorized as either (1) a portion of the meta-information where an occurrence of such meta-information, which is associated with one or more events of the plurality of events, may assist in determining a level of maliciousness of an object, or (2) repetitive patterns within text of the meta-information discovered using a sliding window (e.g., N-Gram, Skip-Gram, etc.), which may lead to rule candidate patterns. For the first category, the feature may constitute the portion of the meta-information itself in which a single occurrence of that portion of meta-information is a probative factor in determining the maliciousness of an object (e.g., object hash value, path identifying location of the object, Application Programming Interface (API) name, etc.). Alternatively, the “feature” may constitute a number of occurrences (i.e., aggregate) of a particular event within a determined period of time that exceeds a set threshold signifying a malicious activity (e.g., excessive API calls, excessive function calls such as a Sleep function, etc.).
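As an illustrative, non-limiting sketch of the two feature categories described above, the following functions show (1) a sliding-window (N-Gram) search for repetitive patterns within meta-information text, and (2) an aggregate feature counting occurrences of an event within a time period against a threshold. The function names, window sizes and threshold values are hypothetical and are provided for illustration only.

```python
def ngram_features(text, n=3):
    """Slide an N-character window over meta-information text to
    collect repetitive patterns that may lead to rule candidate
    patterns (second feature category)."""
    counts = {}
    for i in range(len(text) - n + 1):
        gram = text[i:i + n]
        counts[gram] = counts.get(gram, 0) + 1
    # Retain only patterns that repeat within the text.
    return {g: c for g, c in counts.items() if c > 1}

def aggregate_feature(event_timestamps, window_secs, threshold):
    """Alternative feature: the number of occurrences of a particular
    event within a determined period of time; exceeding the set
    threshold signifies malicious activity (e.g., excessive API or
    Sleep function calls)."""
    start = min(event_timestamps)
    count = sum(1 for t in event_timestamps if t - start <= window_secs)
    return count > threshold
```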
The rule generation system further includes a data store and a rule recommendation subsystem. The data store provides temporary storage for the selected features associated with the events received via the receiver. The rule recommendation subsystem is configured to generate one or more rule recommendations, which are based on the selected features that are analyzed using one or more software machine learning (ML) models trained using supervised learning with malicious events associated with known malware or benign events associated with known goodware (that is, non-malicious software). For instance, a family of supervised learning models, such as Gradient Boosted Tree Ensemble models, Support Vector Machines, Bayesian Graphical Models, Hidden Markov Models, or Deep Neural Network models for example, may be applied to build a predictive model to identify salient features that lead to rule recommendations. A "salient" feature is a feature that, as determined by the rule recommendation subsystem, is statistically discriminative in differentiating a "malicious" event from a "benign" event. Hence, a feature may not be "salient" when that feature does not contribute to prediction, as some features may operate as a basis or foundation for enabling discovery of certain salient features.
More specifically, according to one embodiment of the disclosure, a plurality of ML models are utilized by the rule recommendation subsystem, where each ML model is configured to analyze features associated with a specific event type or a specific combination of event types. Hence, upon selecting certain features from meta-information of an event or a combination of events (e.g., a first event, a second event, a combination of first and second events, etc.), the feature extraction logic provides such features to a dedicated ML model trained to analyze features associated with that specific event type (e.g., first event type, second event type, combination of first and second event types, etc.). Examples of event types may include, but are not limited or restricted to communication port accesses, various file commands (e.g., open file, close file, etc.), mutex, registry key changes, or the like.
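The routing of features to event-type-specific ML models may be sketched as follows. This is an illustrative, non-limiting example: the model registry, the stand-in scoring class, and the feature strings are hypothetical and do not represent an actual trained model.

```python
class EventTypeModel:
    """Stand-in for an ML model dedicated to one event type; a real
    deployment would load a trained supervised model instead."""
    def __init__(self, weights):
        self.weights = weights

    def score(self, feature):
        # Toy scoring: look the feature up in the learned weights.
        return self.weights.get(feature, 0.0)

# One dedicated model per event type (or combination of event types).
MODEL_REGISTRY = {
    "file_command": EventTypeModel({"open:C:\\Windows\\evil.dll": 0.9}),
    "mutex": EventTypeModel({"Global\\RansomLock": 0.95}),
    "registry_key": EventTypeModel({}),
}

def score_feature(event_type, feature):
    """Route each extracted feature to the ML model trained to analyze
    features associated with that specific event type."""
    model = MODEL_REGISTRY.get(event_type)
    return model.score(feature) if model else None
```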
The application of the ML model may result in a predicted classification for the feature as potentially malicious, benign or suspicious (i.e., neither malicious nor benign) based on known malware/goodware. The predicted classification may be represented by a score that conveys a level of maliciousness for the feature and thereby its usefulness in classifying an object as malicious or benign based on this feature. The score assigned to the feature may be further adjusted based on a selected weighting scheme (e.g., increase scores for features with a higher probability of being associated with malware and/or decrease scores for features with a lesser probability of being associated with malware). The ML model result may further include information supporting (or explaining the rationale behind) the assigned score, as described below.
Additionally, while some ML models may correspond to different event types (e.g., an API-based ML model applied to feature(s) associated with an API call), other ML models may be configured to analyze parameters that are based on an aggregate of events occurring within a set time period or an absence of certain events. For these ML models, the aggregate of the events (e.g., number of total API calls within the set period of time), and not the presence of the event (or feature) per se, is considered when generating a score.
A group of features having an ML prediction score that surpasses a selected threshold, and that is not concentrated in a specific event type, is selected as the "salient" features; the concentration may be controlled by limiting the number of features associated with any event type to less than a maximum event threshold. The salient features form the basis for the rule recommendations provided to an analytic system. According to one embodiment of the disclosure, the format of the rule recommendations is selected to reduce the amount of meta-information provided to the analytic system (i.e., exclude meta-information associated with events without a salient feature) and highlight the salient features. For example, the salient features may be highlighted by (1) altering the ordering of the meta-information associated with each event including one or more salient features in order to prominently display the salient feature(s); (2) modifying the visual presentation of the salient features referenced in the meta-information (e.g., display window or portion of a window, color, size, type, style, and/or effects); (3) extracting the salient features and providing only the salient features to the analytic system; and/or (4) reordering the salient features within the meta-information for placement at a prescribed location within the meta-information. The last two examples may be used for automated rule generation, where the extraction and/or reordering allow for parsing of the salient features and automated rule generation by logic implemented within the analytic system (e.g., artificial neural network logic).
Thereafter, one or more provisional malware detection rules (i.e., Boolean logic representations of the salient features) are generated based on the rule recommendations, and these provisional malware detection rule(s) are tested at one or more cybersecurity systems. After a prescribed period of time, for each provisional malware detection rule, if the malware analysis performance results (telemetry) associated with that provisional malware detection rule convey a number or rate of false positive (FP) classifications below a first test threshold and/or a number or rate of false negative (FN) classifications below a prescribed second test threshold, where the first and second test thresholds may differ from each other, the provisional malware detection rule is uploaded to one or more cybersecurity systems as a final malware detection rule for detecting and blocking malware. Otherwise, the features associated with the provisional malware detection rules may undergo additional back testing, in which some provisional malware detection rules associated with certain features may be removed and other provisional malware detection rules associated with additional features may be added in efforts to address the FPs and/or FNs.
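The telemetry gate described above, under which a provisional rule is promoted to a final rule only when both its FP and FN measures stay below their respective (possibly differing) test thresholds, may be sketched as follows. The threshold values shown are illustrative assumptions, not values prescribed by the disclosure.

```python
def promote_rule(fp_rate, fn_rate, fp_threshold=0.01, fn_threshold=0.05):
    """Return True when a provisional malware detection rule may be
    uploaded as a final rule: its false positive rate must fall below
    the first test threshold and its false negative rate below the
    second test threshold (the two thresholds may differ)."""
    return fp_rate < fp_threshold and fn_rate < fn_threshold
```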
As an illustrative example, the cybersecurity system, such as a cybersecurity appliance for example, may perform a malware detection analysis on an object and the detected events are collected and provided, directly or indirectly, to a rule generation system. The events may be captured at an operating system (OS) level in the VM or even outside the VM and relate to behaviors of the object. For this embodiment, the plurality of detected events are provided as an indexed aggregate of detected events sometimes referred to as an “Event report,” as shown in
According to one embodiment of the disclosure, the malware detection rule recommendations may be further altered based on testing and experiential knowledge by an analyst. The alteration of the rule recommendations may involve removal of, or modification or addition to some of these rule recommendations (i.e., selected malicious and/or suspicious features). The alteration of the rule recommendations can be further tested (verified) against known malicious events and known benign events to determine the suitability of finalized malware detection rules to be uploaded to the cybersecurity appliance. Through ML-based formulation of these rule recommendations, the generation of the finalized malware detection rules to address newly uncovered threats may be more quickly developed.
The rule generation system also comprises a pipeline for re-training the ML models, either on demand or when key performance indicators (KPIs) deteriorate, as determined using open source or commercially available tools. For instance, occasionally there may be new malware or threat actors with new TTPs (Tactics, Techniques and Procedures) that can evade an existing ML model trained using stale training sets. When the KPIs deteriorate, the rule generation system may alert a system administrator to re-train the supervised ML model on a new dataset with malware and goodware, so that the ML model can adapt to the constantly evolving threat landscape.
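The re-training trigger may be sketched as follows: re-training occurs on demand or when the most recent KPI reading deteriorates below a floor. The function name, KPI representation, and floor value are hypothetical illustrations.

```python
def needs_retraining(kpi_history, kpi_floor=0.90, on_demand=False):
    """Flag the supervised ML model for re-training either on demand
    or when the latest key performance indicator reading deteriorates
    below an illustrative floor value."""
    if on_demand:
        return True
    return bool(kpi_history) and kpi_history[-1] < kpi_floor
```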
In the following description, certain terminology is used to describe various features of the invention. For example, each of the terms “logic,” “system,” and “subsystem” may be representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, the term logic (or engine or component) may include circuitry having data processing and/or storage functionality. Examples of such circuitry may include, but are not limited or restricted to a hardware processor (e.g., microprocessor, one or more processor cores, a digital signal processor, a programmable gate array, a microcontroller, an application specific integrated circuit “ASIC”, etc.), a semiconductor memory, or combinatorial elements.
Additionally, or in the alternative, the logic (or system or subsystem) may include software such as one or more processes, one or more instances, Application Programming Interface(s) (API), subroutine(s), function(s), applet(s), servlet(s), routine(s), source code, object code, shared library/dynamic link library (dll), or even one or more instructions. This software may be stored in any type of a suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to a programmable circuit; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); or persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the logic (or component) may be stored in persistent storage.
The term “object” generally relates to information having a logical structure or organization for malware analysis. The information may include an executable (e.g., an application, program, code segment, a script, dll or any file in a format that can be directly executed by a computer such as a file with an “.exe” extension, etc.), a non-executable (e.g., a file; any document such as a Portable Document Format “PDF” document; a word processing document such as Word® document; an electronic mail “email” message, web page, etc.), or simply a collection of related data (e.g., packets).
The term “computerized” generally represents that any corresponding operations are conducted by hardware in combination with software and/or firmware. The term “data store” generally refers to a data storage device such as the non-transitory storage medium described above, which provides non-persistent or persistent storage for the information (e.g., events). An “event” may be construed as an activity that is performed by an object during execution and/or the meta-information associated with the activity. The meta-information may include, but is not limited or restricted to event type (e.g., file command, mutex, time query, API call, etc.), object name, object path, hash value of the object, timestamp, process identifier, or the like.
According to one embodiment of the disclosure, the term “malware” may be broadly construed as any code, communication or activity that initiates or furthers a cyberattack. Malware may prompt or cause unauthorized, anomalous, unintended and/or unwanted behaviors or operations constituting a security compromise of information infrastructure. For instance, malware may correspond to a type of malicious computer code that, as an illustrative example, executes an exploit to take advantage of a vulnerability in a network, network device or software, to gain unauthorized access, harm or co-opt operations of the network, the network device or the software, or to misappropriate, modify or delete data. Alternatively, as another illustrative example, malware may correspond to information (e.g., executable code, script(s), data, command(s), etc.) that is designed to cause a network device to experience anomalous (unexpected or undesirable) behaviors. The anomalous behaviors may include a communication-based anomaly or an execution-based anomaly, which, for example, could (1) alter the functionality of a network device executing application software in an unauthorized or malicious manner; (2) alter the functionality of the network device executing that application software without any malicious intent; and/or (3) provide unwanted functionality which may be generally acceptable in another context.
The term “network device” may be construed as any electronic computing system with the capability of processing data and connecting to a network. The network may be a public network such as the Internet and/or a local (private) network such as an enterprise network, a wireless local area network (WLAN), a local area network (LAN), a wide area network (WAN), or the like. Examples of a network device may include, but are not limited or restricted to an endpoint (e.g., a laptop, a mobile phone, a tablet, a computer, a video console, a copier, etc.), a network appliance, a server, a router or other intermediary communication device, a firewall, etc.
The term “transmission medium” may be construed as a physical or logical communication path between two or more network devices or between components within a network device. For instance, as a physical communication path, wired and/or wireless interconnects in the form of electrical wiring, optical fiber, cable, bus trace, or a wireless channel using radio frequency (RF) or infrared (IR), may be used. A logical communication path may simply represent a communication path between two or more network devices or between components within a network device such as one or more Application Programming Interfaces (APIs).
Finally, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
As this invention is susceptible to embodiments of many different forms, it is intended that the present disclosure is to be considered as an example of the principles of the invention and not intended to limit the invention to the specific embodiments shown and described.
Referring to
More specifically, each of the cybersecurity systems 1101-110N (e.g., cybersecurity system 1101) may be deployed as a network device, which is communicatively coupled to receive and analyze objects within network traffic (e.g., objects of incoming network traffic, objects propagating in network traffic over a local network 130, etc.). As a network device, the cybersecurity system 1101 includes logic in the form of physical components that analyze incoming objects for malware, such as a processor and memory including, in some embodiments, one or more virtual machines, software (e.g., OS(es), application(s), plug-in(s), etc.) to instantiate each of the virtual machines, and monitoring logic to monitor for certain events (e.g., behaviors) conducted by an object running in a virtual machine (VM). Alternatively, the cybersecurity system 1101 may be deployed as a virtual device, namely a software (daemon) agent to detect cyberattacks, which may operate in the foreground or background for a network device (e.g., an endpoint). For both of these deployments, component(s) within the cybersecurity system 1101 monitor for certain events performed by the object and collect meta-information associated with the events, which is provided to the rule generation system 120 for analysis. Stated differently, the collected meta-information may be obtained from a log as described above, such as a behavior log, endpoint dynamic behavior monitor log, or a static PE (Portable Executable) file that contains API calls, file accesses, etc.
Each of the cybersecurity systems 1101-110N (e.g., cybersecurity system 1101) may be deployed on-premises (e.g., as an edge network device for the local network 130, as a network device within the local network 130) to detect and analyze objects propagating into or through the local network 130 for malware. Alternatively, although not shown, each of the cybersecurity systems 1101-110N may be deployed as a cloud-based solution in which the objects (or a representation thereof) are captured at the local network 130 and submitted to at least one of the cloud-based cybersecurity systems 1101-110N. Additionally, although not shown, at least one of the cybersecurity systems 1101-110N (e.g., cybersecurity system 1101) may be deployed at an endpoint as a software agent operating in the background to analyze and monitor for certain behaviors by the object.
Referring still to
As shown in
Besides the analysis results 140, the rule generation system 120 further receives an event summary 150, namely a plurality of events being monitored and detected during processing of a particular object upon which more in-depth analysis is requested. This particular object may correspond to an object upon which a malware detection analysis by the cybersecurity system 1101 has completed. For example, the particular object may correspond to an object that, based on telemetry (e.g., malware analysis performance results), has been incorrectly classified in a prior malware detection analysis by the cybersecurity system 1101 (e.g., FP or FN misclassification). The telemetry may be stored remotely from the rule generation system such as within a private cloud service, public cloud service or other scalable “big data” platform.
According to one embodiment of the disclosure, the event summary 150 may be provided as an indexed aggregate of the detected events for the particular object (hereinafter, “Event report 150”). Based on receipt of the events 152 included in the Event report 150 as shown in
According to one embodiment of the disclosure, as shown in
Herein, the removal or addition of salient features may be accomplished by issuance of the rule modification message 180 from the analytic system 170, which may cause the rule generation system 120 to increase or decrease at least one threshold parameter used in selecting the salient features and reissue new rule recommendations 185. According to one embodiment, removal or addition of a salient feature by the rule generation system 120 may be accomplished by returning the feature to a non-highlighted or highlighted form and the newly added salient features may be highlighted in the same manner or perhaps in a different manner to more easily identify the new salient features from the prior salient features.
As a first illustrative example, the rule modification message 180 may request a decrease/increase in a first score threshold utilized by the rule generation system 120, where the first score threshold identifies whether a scored feature is “malicious” upon exceeding the first score threshold. Hence, an increase or decrease of the first score threshold caused by the rule modification message 180 may decrease or increase the number of salient features selected for rule consideration. Additionally or as an alternative, as a second illustrative example, the rule modification message 180 may request a decrease/increase in a second score threshold utilized by the rule generation system 120, where the second score threshold identifies whether a scored feature is “benign” upon falling below the second score threshold. Hence, an increase or decrease of the second score threshold caused by the rule modification message 180 may increase or decrease the number of salient “benign” features selected for rule consideration.
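The effect of a rule modification message on the number of salient features selected for rule consideration may be sketched as follows: raising the first score threshold shrinks the set of "malicious" features, while lowering it grows the set. The feature names and score values are hypothetical.

```python
def select_malicious(scored, first_score_threshold):
    """A scored feature is 'malicious' upon exceeding the first score
    threshold; the threshold may be raised or lowered in response to a
    rule modification message."""
    return [f for f, s in scored.items() if s > first_score_threshold]

# Hypothetical scored features from the rule recommendation subsystem.
scored = {"api_sleep_burst": 0.8, "mutex_name": 0.6, "file_path": 0.3}
before = select_malicious(scored, 0.5)  # threshold before the message
after = select_malicious(scored, 0.7)   # threshold raised by the message
```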
Once the rule recommendations are finalized at the analytic system 170, provisional malware detection rules 190 are generated from the finalized rule recommendations. The analytic system 170 transmits the provisional malware detection rules 190 via network 195 to one or more of the cybersecurity systems 1101-110N (e.g., cybersecurity system 1101) for initial testing and, generally, verification. Further verification may be conducted by analysis of the operability of the cybersecurity systems 1101-110N, and the results of the verification may be reported therefrom to the analytic system 170.
Referring now to
Upon receipt of the Event report 150 from the cybersecurity system 1101, the rule generation system 120 parses the Event report 150 to identify and extract meta-information associated with the monitored events. The rule generation system 120 further selects features from the extracted meta-information and conducts an analysis of each feature, using machine learning (ML) models (not shown), to predict the salient features, namely the features having a higher likelihood of assisting in the classification of an object. Herein, each of the ML models is specifically configured to analyze features associated with a different event type and a measure of the predicted likelihood may be referred to as a “score.” The ML models are trained, in accordance with supervised learning, using at least the analysis results 140 from the cybersecurity system 1101 as a training set. It is contemplated that other data, besides the analysis results 140 (e.g., third party data, etc.) may be used in training the ML models, as described below.
Based on the scores assigned to the analyzed features, the rule generation system 120 operates as a filter by (i) reducing the number of analyzed features to be considered as salient features (referred to as “potential salient features”) to a first subset of analyzed features, and (ii) further reducing the first subset of analyzed features to a second subset of analyzed features that represent the salient features. According to one embodiment of the disclosure, the potential salient features may be determined by selecting the analyzed features having an assigned score that meets or exceeds a first score threshold (malicious features) and/or analyzed features having an assigned score that meets or falls below a second score threshold (benign features). Thereafter, the salient features may be determined by restricting the number of features associated with an event type from exceeding a maximum event threshold. This may be accomplished by retaining the features for each event type having the highest and/or lowest scores.
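The two-stage filtering described above may be sketched as follows: stage one selects potential salient features by comparing each assigned score against the first (malicious) and second (benign) score thresholds, and stage two caps the number of features retained per event type by keeping those with the most extreme scores. The data layout, threshold values, and cap are illustrative assumptions.

```python
def salient_features(scored_events, first_thr, second_thr, max_per_type):
    """scored_events: list of (event_type, feature, score) tuples.
    Stage 1: keep features scoring at/above the malicious threshold or
    at/below the benign threshold (the 'potential salient features').
    Stage 2: limit each event type to max_per_type features, retaining
    those with the highest and/or lowest scores."""
    potential = [e for e in scored_events
                 if e[2] >= first_thr or e[2] <= second_thr]
    by_type = {}
    for ev_type, feature, score in potential:
        by_type.setdefault(ev_type, []).append((feature, score))
    salient = []
    for ev_type, feats in by_type.items():
        # Rank by distance from the midpoint so the most extreme
        # (most discriminative) scores are retained under the cap.
        feats.sort(key=lambda fs: abs(fs[1] - (first_thr + second_thr) / 2),
                   reverse=True)
        salient.extend((ev_type, f, s) for f, s in feats[:max_per_type])
    return salient
```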
According to one embodiment of the disclosure, referring back to
Thereafter, when finalized, the rule recommendations 160 (or the new rule recommendations 185 in response to alteration of the rule recommendations 160) may be converted into one or more provisional (i.e., recommended) malware detection rules 190. Each provisional malware detection rule 190 may be tested against a searchable data store 220 including meta-information associated with known malware and/or known goodware to determine the suitability of the provisional malware detection rule 190, as shown in
Referring to
The processor 310 is a multi-purpose, programmable component that accepts digital data as input, processes the input data according to stored instructions, and provides results as output. One example of a processor may include an Intel® central processing unit (CPU) with an x86 instruction set architecture. Alternatively, the processor 310 may include another type of CPU, a digital signal processor (DSP), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA), or the like.
As shown in
The administrative interface 340 is a portal that allows an administrator, after credential exchange and authentication, to access and update logic stored within the memory 330 of the rule generation system 120. For instance, the administrative interface 340 may include authentication logic (not shown) to authenticate an administrator requesting access to stored logic within the rule generation system 120. Upon authentication, the administrator is able to modify (i) the parser 385 of the receiver 380 to change parsing operations as well as the type of events to be parsed from the Event report 150 and (ii) the rule recommendation subsystem 390 to alter one or more of the machine learning models (not shown) corresponding to the event types being parsed by the parser 385 in efforts to more quickly and more accurately generate malware detection rules based on the results produced from the ML models.
As an optional interface, the analyst interface 350 is a portal that allows an analyst, after credential exchange and authentication, to access and update stored meta-information associated with monitored events within the data store 335. For instance, the analyst interface 350 may provide a graphical user interface (GUI) that allows an analyst to conduct search queries for different events based on a variety of search parameters. The search parameters may be directed to the type of event, a source (upload and/or originating source), or a time of occurrence, as the meta-information for each event provided by the Event report 150 may include, but is not limited or restricted to, (i) an event type, (ii) a time of occurrence (timestamp), (iii) a path for accessing the object under analysis (i.e., file path in memory), (iv) an identifier associated with (e.g., a hash value of) the object under analysis, and/or (v) a process identifier for the process running during detection of the event, or the like. Additionally, the analyst interface 350 allows for adjustment of the score threshold(s) to adjust the throughput of salient features.
Referring now to
More specifically, the parser 385 may be configured to extract the meta-information 325 associated with each of the event types. As an illustrative example, where the event pertains to a file operation (e.g., an Open( ) file command), the meta-information 325 may identify the type of event (e.g., file Open( )), the file path or URL, the file size, the time of occurrence of the Open( ) command (e.g., a timestamp value), the name of the file being opened (filename), and/or a hash value of the file. Additionally, or in the alternative, the parser 385 may be configured to generate the meta-information 325 associated with a selected event type, such as an aggregation of the number of activities associated with the identified event (e.g., the number of API calls directed to a particular API within a prescribed period of time, the number of times that a SLEEP command is called within a prescribed period of time, etc.). The extracted and/or generated meta-information 325 may be placed into a readable format and the formatted meta-information 405 is provided to both the feature extraction logic 410 and a searchable data store 220 of labeled events associated with known malware and known goodware. Alternatively, the extracted and/or generated meta-information may be placed into a format that may be parsed and/or processed automatically without human intervention.
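For illustration only, the extraction of meta-information from an individual event record might be sketched as follows, assuming a simple dictionary-based record; the field names (`type`, `ts`, `path`, `sha256`, `pid`) and the function `parse_event` are hypothetical and do not reflect the actual schema of the Event report 150:

```python
# Hypothetical sketch of per-event meta-information extraction from an
# event report; field names are illustrative, not the actual report schema.
def parse_event(raw_event):
    """Extract meta-information for a single monitored event."""
    meta = {
        "event_type": raw_event.get("type"),     # e.g., "file_open"
        "timestamp": raw_event.get("ts"),        # time of occurrence
        "file_path": raw_event.get("path"),      # path of the object under analysis
        "object_hash": raw_event.get("sha256"),  # identifier of the object
        "process_id": raw_event.get("pid"),      # process running at detection
    }
    # Drop fields absent from this particular event record.
    return {k: v for k, v in meta.items() if v is not None}

report = [
    {"type": "file_open", "ts": 1000, "path": "C:\\tmp\\a.exe",
     "sha256": "ab12", "pid": 4},
    {"type": "sleep", "ts": 1001, "pid": 4},
]
formatted = [parse_event(e) for e in report]
```

The resulting list of normalized dictionaries stands in for the "readable format" that downstream logic (such as the feature extraction logic 410) would consume.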
The feature extraction logic 410 extracts features 430 from the formatted meta-information 405, where the features 430 may be categorized as either (1) a portion of the formatted meta-information 405, where an occurrence of the formatted meta-information 405 may represent a useful factor in determining a level of maliciousness of an object, or (2) repetitive patterns within text of the formatted meta-information 405 discovered using a sliding window (e.g., N-Gram, Skip-Gram, etc.), which may lead to rule candidate patterns. For example, one of the features 430 may constitute the portion of the formatted meta-information 405 in which an occurrence of particular meta-information is discovered (e.g., object hash value, path identifying a location of the object, API name, or any combination thereof). Additionally, or in the alternative, one of the features 430 may constitute a parameter provided in response to at least a prescribed number of occurrences (aggregate) of a particular event (e.g., API calls, Sleep function calls, etc.) within a determined period of time.
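The sliding-window pattern discovery mentioned above (e.g., N-Gram analysis) can be sketched in Python as follows; the window size, character-level tokenization, and function names are assumptions made for illustration rather than the disclosed implementation:

```python
# Illustrative N-Gram extraction over meta-information text using a sliding
# window: patterns that recur may become rule candidate patterns.
def char_ngrams(text, n=4):
    """Slide an n-character window over the text and count each pattern."""
    counts = {}
    for i in range(len(text) - n + 1):
        gram = text[i:i + n]
        counts[gram] = counts.get(gram, 0) + 1
    return counts

def repeated_patterns(text, n=4, min_count=2):
    """Keep only patterns that recur at least min_count times."""
    return {g: c for g, c in char_ngrams(text, n).items() if c >= min_count}

# A repeated 3-character pattern surfaces as a candidate.
patterns = repeated_patterns("abcXabcYabc", n=3)
```

A production system might instead slide the window over tokens of the formatted meta-information or use Skip-Grams, as the text notes; the mechanism of counting windowed patterns is the same.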
It is contemplated that, for some embodiments, the particular features selected from meta-information 325 associated with an event may depend on the event type. For instance, where the event is directed to behaviors conducted by a file within a virtual machine, the features may be directed to content specific to that file that may be more difficult to change (e.g., hash value of the file, file path, etc.) instead of more easily modified content (e.g., filename). Such features, when used as a basis for malware detection rules, are more dispositive in determining whether the object is malware or benign.
Thereafter, according to one embodiment of the disclosure, the features 430 are made available to the rule recommendation subsystem 390. As shown, the feature extraction logic 410 stores the features 430 in the data store 335, which is accessible by the rule recommendation subsystem 390. Alternatively, the feature extraction logic 410 may be configured to provide each extracted feature to a corresponding one of a plurality of machine learning (ML) models 440 specific to that identified event type (or combination of events).
Referring still to
In addition to providing the extracted features 430 to the rule recommendation subsystem 390, the feature extraction logic 410 may be configured to conduct a sliding window analysis on incoming meta-information 405 associated with the features to detect the presence of a particular event pattern within the sliding window over a prescribed period of time. The sliding window analysis may be conducted by analyzing timestamps maintained as part of the features and determining a count value representing the number of occurrences (aggregate) of a particular event that occur during a determined period of time. The count value is provided to one or more of the plurality of machine learning (ML) models (e.g., fourth ML model 448), which determines a score for the repetitive occurrence of a particular event. For the fourth ML model 448, the frequency of the event, not the meta-information of the event, determines the likelihood of the particular events denoting a presence of malware.
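A minimal sketch of the timestamp-based sliding window analysis described above, assuming features carry numeric timestamps; the one-second window, the example event, and the helper name are hypothetical:

```python
# Sketch of counting occurrences (aggregate) of a particular event within a
# sliding time window over timestamped features; a high count may feed an
# ML model keyed to event frequency rather than event meta-information.
from collections import deque

def max_count_in_window(timestamps, window):
    """Return the maximum number of events falling in any window-wide span."""
    q = deque()
    best = 0
    for ts in sorted(timestamps):
        q.append(ts)
        while q and ts - q[0] > window:
            q.popleft()  # expire events older than the window
        best = max(best, len(q))
    return best

# E.g., timestamps of Sleep calls observed during execution.
sleep_calls = [0.1, 0.2, 0.25, 0.9, 5.0]
count = max_count_in_window(sleep_calls, window=1.0)
```

Here, four of the five calls fall within a single one-second span; a count value of that kind is what would be provided to a frequency-oriented model such as the fourth ML model 448.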
As also shown in
According to one embodiment of the disclosure, the predictive filtering logic 450 automatically selects a plurality of features (or a combination of features) that are predicted, based on their score values, to operate as the salient features in the formation of the rule recommendations 160. These salient features may be restricted in number (i) on a per object basis (e.g., maximum number of salient features in total may not exceed a first feature threshold such as 20 salient features) and/or (ii) on a per event type basis (e.g., maximum number of salient features per event type may not exceed a second feature threshold such as 4 features per event type). In many analyses, the rule recommendations 160 will include salient features for some, but not all, of the event types.
In light of the foregoing, the predictive filtering logic 450 is configured to exclude, from the rule recommendations 160, those features that may be less effective for detecting malware. As a result, where the score for a feature (or combination of features) exceeds the first score threshold, the predictive filtering logic 450 may include that feature (or combination of features) as part of the rule recommendations 160, provided the per event type restriction (described above) is maintained. Each rule recommendation 160 may include the same syntax as the feature (e.g., a string of meta-information as shown in
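The selection performed by the predictive filtering logic 450 might be illustrated as follows; the threshold value, the caps (which mirror the 20-feature and 4-per-event-type examples above), and the data structures are assumptions for illustration:

```python
# Hedged sketch of predictive filtering: keep features whose scores exceed
# a score threshold, capped per event type and in total.
def select_salient(scored, score_threshold=0.8, per_type_cap=4, total_cap=20):
    """scored: list of (event_type, feature, score) tuples."""
    ranked = sorted(scored, key=lambda t: t[2], reverse=True)
    per_type, selected = {}, []
    for etype, feat, score in ranked:
        if score < score_threshold or len(selected) >= total_cap:
            continue  # below threshold, or total salient-feature cap reached
        if per_type.get(etype, 0) >= per_type_cap:
            continue  # per-event-type restriction
        per_type[etype] = per_type.get(etype, 0) + 1
        selected.append((etype, feat, score))
    return selected

scored = [
    ("file", "hash=ab12", 0.95),
    ("file", "path=C:\\tmp", 0.85),
    ("api", "Sleep burst", 0.90),
    ("api", "GetTick", 0.40),   # below threshold: excluded
]
salient = select_salient(scored, per_type_cap=1)
```

With a per-event-type cap of one, only the highest-scoring feature of each event type survives, consistent with the observation that not all event types need be represented among the salient features.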
According to one embodiment of the disclosure, rule recommendation verification may be accomplished by conducting query searches 460 to evaluate the rule recommendations 160, where the query searches 460 are directed to evaluating each salient feature of the rule recommendations 160 against features associated with known malicious objects and/or benign objects maintained by the data store 220. The data store 220 may be positioned locally to the analytic system 170 or remotely therefrom (e.g., within a cloud service).
Upon initial verification of their applicability in accurately detecting an object incorrectly classified by a cybersecurity system based on the Event report 150 provided, the rule recommendations 160 are translated into provisional malware detection rules 190, which are uploaded to at least one cybersecurity system 1101 for malware detection and, in some embodiments, for subsequent use in malware blocking as a final set of malware detection rules once the provisional malware detection rules 190 have been confirmed to operate satisfactorily in the field (e.g., the number of detected FPs as determined by the telemetry does not exceed corresponding thresholds). According to one embodiment of the disclosure, if the rule recommendations 160 require an additional iteration before the rule translation (i.e., more rule recommendations are requested), the rule recommendations 160 (or a portion thereof) may be returned to the rule recommendation subsystem 390 and the new rule recommendations 185 may be generated for features that were previously filtered from the rule recommendations 160.
Where the malware detection rules 190 are determined to be operating unsatisfactorily or unreliably (e.g., the number of detected FPs exceeds the corresponding thresholds), the rule modification message 180 from the analytic system 170 may cause the rule generation system 120 to supplement the recommended rules 190.
Referring now to
Referring now to
More specifically, the plurality of ML models are configured for a corresponding plurality of event types and/or combinations or sequences of events. Stated differently, each ML model of the plurality of ML models corresponds to a specific event type or event combination and is applied to features associated with the specific event type or event combination. As an illustrative example, the first ML model 442 may be applied to features associated with a first event type (feature A) to determine whether the feature (and corresponding event type) is indicative of a potentially malicious behavior, while the second ML model 444 may be applied to features associated with a second event type (feature B) and the fourth ML model 448 may be applied to features associated with a combination of events (features A&B) to determine whether features associated with certain events (or the combination of events) are indicative of a potentially malicious behavior.
According to one embodiment of the disclosure, upon being applied to features associated with a first event type (feature A), the first ML model 442 generates a score representing a scale (e.g., from highest to lowest or lowest to highest) as to the level of correlation between the feature associated with a first event type (feature A) and a level of maliciousness (e.g., association with malware). The same operations are performed by a second ML model 444, which may be applied to features associated with a second event type (feature B), and/or a third ML model 446 may be applied to features associated with a combination of events (features A&B) to determine whether features associated with a particular event (or a combination of events) are indicative of a finding of maliciousness.
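The per-event-type model selection described above can be illustrated with a toy dispatch table; the `ThresholdModel` class is a trivial stand-in for a trained ML model and is included only to show how a model may be chosen and applied according to event type:

```python
# Illustrative mapping of event types to per-type scoring models; the
# models here are toy stand-ins, not trained ML models.
class ThresholdModel:
    """Stand-in model: scores a feature by a fixed per-type weight."""
    def __init__(self, weight):
        self.weight = weight

    def score(self, feature):
        # Placeholder scoring; a real model would compute a learned
        # correlation between the feature and maliciousness.
        return self.weight * len(feature) / 10.0

MODELS = {
    "file": ThresholdModel(0.9),      # e.g., a model for file-event features
    "registry": ThresholdModel(0.5),  # e.g., a model for registry events
}

def score_feature(event_type, feature):
    model = MODELS[event_type]  # model selected by event type
    return model.score(feature)
```

The essential point is the dispatch: each feature is routed to the model trained for its event type (or event combination), and that model emits the score used downstream by the predictive filtering logic.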
Thereafter, according to one embodiment of the disclosure, predictive filtering logic may be configured to analyze the scores associated with each event type and automatically select a prescribed number of features that are predicted, based on the score values, to achieve a higher likelihood of being associated with malware (operations 630 and 640). The prescribed number of features for each event type (or combination of events) forms the rule recommendations. Hence, the predictive filtering logic excludes, from the rule recommendations, those features that are less effective for detecting malware. Alternatively, according to another embodiment of the disclosure, the predictive filtering logic may be configured to analyze the scores associated with each event type and display the salient features (and score values) on a graphical user interface (GUI) accessible via the administrative interface 340 and/or the analyst interface 350 of
Referring now to
Thereafter, an ML model is selected for each feature according to event type (i.e., the event type from which the feature was extracted), where the ML models are trained to process features extracted from a specific event type (operation 720). The selected ML model of the plurality of ML models is applied to each feature according to its event type or combination of events, where the ML models generate predicted scores associated with each feature (operation 730).
After applying the ML models to each feature, the salient features are determined from the totality of analyzed (modeled) features based on the assigned score (with any optional weighting applied) and taking into account a limit on the maximum number of features per event type (operation 740). As an illustrative example, a first subset of analyzed features (potential salient features) is selected for those features having an assigned score that either (i) meets or exceeds a first score threshold (malicious features) or (ii) meets or falls below a second score threshold (benign features). Thereafter, the salient features are determined from the potential salient features by restricting the number of features associated with an event type from exceeding a maximum event threshold (e.g., less than 5 events, less than 3 events, etc.). Therefore, not all of the event types monitored by the rules generation system may be represented by a salient feature.
Based on these salient features, rule recommendations are generated (operation 750). According to one embodiment of the disclosure, the rule recommendation includes the salient features along with meta-information associated with the event from which the features were extracted.
Thereafter, the rule recommendations are verified by comparing the salient features to features associated with known malware and/or known goodware (operation 760). This verification may be conducted within the same network, same public cloud computing service, or same private cloud computing service in which the rule generation system is deployed. If the rule recommendations are verified, the salient features are used as a basis for generation of the provisional malware detection rules that control malware detection analyses by one or more cybersecurity appliances, and the provisional malware detection rules are uploaded to the cybersecurity appliance for use (operations 770 & 780). Alternatively, if the rule recommendations are not verified (i.e., local testing results in FPs and/or FNs exceeding a prescribed threshold), the rule recommendations may be altered by adjusting a threshold parameter (e.g., decrease/increase the first score threshold or increase/decrease the second score threshold) used in selecting the salient features. This adjustment may cause reissuance of a new set of rule recommendations (operation 790) for verification and roll-out.
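The verification of operation 760 might be sketched as a membership check against labeled feature sets; the sets, the feature strings, and the `verify` helper below are toy stand-ins for queries against a searchable data store of known malware and goodware:

```python
# Rough sketch of verifying rule recommendations against a labeled data
# store: a candidate feature that matches known goodware is flagged as a
# potential false-positive source, and a recommendation verifies only if
# it matches known malware without exceeding the allowed FP count.
KNOWN_MALWARE_FEATURES = {"hash=ab12", "api:Sleep-burst"}
KNOWN_GOODWARE_FEATURES = {"path=C:\\Windows", "api:GetTick"}

def verify(salient_features, max_fp=0):
    fp_hits = [f for f in salient_features if f in KNOWN_GOODWARE_FEATURES]
    tp_hits = [f for f in salient_features if f in KNOWN_MALWARE_FEATURES]
    verified = len(fp_hits) <= max_fp and len(tp_hits) > 0
    return verified, fp_hits

# One feature matches goodware, so this recommendation fails verification
# and would be altered (e.g., by adjusting a score threshold) and reissued.
ok, fps = verify({"hash=ab12", "api:GetTick"})
```

A failed verification of this kind corresponds to the alternative branch above, in which thresholds are adjusted and a new set of rule recommendations is issued.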
In the foregoing description, the invention is described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
4292580 | Ott et al. | Sep 1981 | A |
5175732 | Hendel et al. | Dec 1992 | A |
5319776 | Hile et al. | Jun 1994 | A |
5440723 | Arnold et al. | Aug 1995 | A |
5490249 | Miller | Feb 1996 | A |
5657473 | Killean et al. | Aug 1997 | A |
5802277 | Cowlard | Sep 1998 | A |
5842002 | Schnurer et al. | Nov 1998 | A |
5960170 | Chen et al. | Sep 1999 | A |
5978917 | Chi | Nov 1999 | A |
5983348 | Ji | Nov 1999 | A |
6088803 | Tso et al. | Jul 2000 | A |
6092194 | Touboul | Jul 2000 | A |
6094677 | Capek et al. | Jul 2000 | A |
6108799 | Boulay et al. | Aug 2000 | A |
6154844 | Touboul et al. | Nov 2000 | A |
6269330 | Cidon et al. | Jul 2001 | B1 |
6272641 | Ji | Aug 2001 | B1 |
6279113 | Vaidya | Aug 2001 | B1 |
6298445 | Shostack et al. | Oct 2001 | B1 |
6357008 | Nachenberg | Mar 2002 | B1 |
6424627 | Sorhaug et al. | Jul 2002 | B1 |
6442696 | Wray et al. | Aug 2002 | B1 |
6484315 | Ziese | Nov 2002 | B1 |
6487666 | Shanklin et al. | Nov 2002 | B1 |
6493756 | O'Brien et al. | Dec 2002 | B1 |
6550012 | Villa et al. | Apr 2003 | B1 |
6775657 | Baker | Aug 2004 | B1 |
6831893 | Ben Nun et al. | Dec 2004 | B1 |
6832367 | Choi et al. | Dec 2004 | B1 |
6895550 | Kanchirayappa et al. | May 2005 | B2 |
6898632 | Gordy et al. | May 2005 | B2 |
6907396 | Muttik et al. | Jun 2005 | B1 |
6941348 | Petry et al. | Sep 2005 | B2 |
6971097 | Wallman | Nov 2005 | B1 |
6981279 | Arnold et al. | Dec 2005 | B1 |
7007107 | Ivchenko et al. | Feb 2006 | B1 |
7028179 | Anderson et al. | Apr 2006 | B2 |
7043757 | Hoefelmeyer et al. | May 2006 | B2 |
7058822 | Edery et al. | Jun 2006 | B2 |
7069316 | Gryaznov | Jun 2006 | B1 |
7080407 | Zhao et al. | Jul 2006 | B1 |
7080408 | Pak et al. | Jul 2006 | B1 |
7093002 | Wolff et al. | Aug 2006 | B2 |
7093239 | van der Made | Aug 2006 | B1 |
7096498 | Judge | Aug 2006 | B2 |
7100201 | Izatt | Aug 2006 | B2 |
7107617 | Hursey et al. | Sep 2006 | B2 |
7159149 | Spiegel et al. | Jan 2007 | B2 |
7213260 | Judge | May 2007 | B2 |
7231667 | Jordan | Jun 2007 | B2 |
7240364 | Branscomb et al. | Jul 2007 | B1 |
7240368 | Roesch et al. | Jul 2007 | B1 |
7243371 | Kasper et al. | Jul 2007 | B1 |
7249175 | Donaldson | Jul 2007 | B1 |
7287278 | Liang | Oct 2007 | B2 |
7308716 | Danford et al. | Dec 2007 | B2 |
7328453 | Merkle, Jr. et al. | Feb 2008 | B2 |
7346486 | Ivancic et al. | Mar 2008 | B2 |
7356736 | Natvig | Apr 2008 | B2 |
7386888 | Liang et al. | Jun 2008 | B2 |
7392542 | Bucher | Jun 2008 | B2 |
7418729 | Szor | Aug 2008 | B2 |
7428300 | Drew et al. | Sep 2008 | B1 |
7441272 | Durham et al. | Oct 2008 | B2 |
7448084 | Apap et al. | Nov 2008 | B1 |
7458098 | Judge et al. | Nov 2008 | B2 |
7464404 | Carpenter et al. | Dec 2008 | B2 |
7464407 | Nakae et al. | Dec 2008 | B2 |
7467408 | O'Toole, Jr. | Dec 2008 | B1 |
7478428 | Thomlinson | Jan 2009 | B1 |
7480773 | Reed | Jan 2009 | B1 |
7487543 | Arnold et al. | Feb 2009 | B2 |
7496960 | Chen et al. | Feb 2009 | B1 |
7496961 | Zimmer et al. | Feb 2009 | B2 |
7519990 | Xie | Apr 2009 | B1 |
7523493 | Liang et al. | Apr 2009 | B2 |
7530104 | Thrower et al. | May 2009 | B1 |
7540025 | Tzadikario | May 2009 | B2 |
7546638 | Anderson et al. | Jun 2009 | B2 |
7565550 | Liang et al. | Jul 2009 | B2 |
7568233 | Szor et al. | Jul 2009 | B1 |
7584455 | Ball | Sep 2009 | B2 |
7603715 | Costa et al. | Oct 2009 | B2 |
7607171 | Marsden et al. | Oct 2009 | B1 |
7639714 | Stolfo et al. | Dec 2009 | B2 |
7644441 | Schmid et al. | Jan 2010 | B2 |
7657419 | van der Made | Feb 2010 | B2 |
7676841 | Sobchuk et al. | Mar 2010 | B2 |
7698548 | Shelest et al. | Apr 2010 | B2 |
7707633 | Danford et al. | Apr 2010 | B2 |
7712136 | Sprosts et al. | May 2010 | B2 |
7730011 | Deninger et al. | Jun 2010 | B1 |
7739740 | Nachenberg et al. | Jun 2010 | B1 |
7779463 | Stolfo et al. | Aug 2010 | B2 |
7784097 | Stolfo et al. | Aug 2010 | B1 |
7832008 | Kraemer | Nov 2010 | B1 |
7836502 | Zhao et al. | Nov 2010 | B1 |
7849506 | Dansey et al. | Dec 2010 | B1 |
7854007 | Sprosts et al. | Dec 2010 | B2 |
7869073 | Oshima | Jan 2011 | B2 |
7877803 | Enstone et al. | Jan 2011 | B2 |
7904959 | Sidiroglou et al. | Mar 2011 | B2 |
7908660 | Bahl | Mar 2011 | B2 |
7930738 | Petersen | Apr 2011 | B1 |
7937387 | Frazier et al. | May 2011 | B2 |
7937761 | Bennett | May 2011 | B1 |
7949849 | Lowe et al. | May 2011 | B2 |
7996556 | Raghavan et al. | Aug 2011 | B2 |
7996836 | McCorkendale et al. | Aug 2011 | B1 |
7996904 | Chiueh et al. | Aug 2011 | B1 |
7996905 | Arnold et al. | Aug 2011 | B2 |
8006305 | Aziz | Aug 2011 | B2 |
8010667 | Zhang et al. | Aug 2011 | B2 |
8020206 | Hubbard et al. | Sep 2011 | B2 |
8028338 | Schneider et al. | Sep 2011 | B1 |
8042184 | Batenin | Oct 2011 | B1 |
8045094 | Teragawa | Oct 2011 | B2 |
8045458 | Alperovitch et al. | Oct 2011 | B2 |
8069484 | McMillan et al. | Nov 2011 | B2 |
8087086 | Lai et al. | Dec 2011 | B1 |
8171553 | Aziz et al. | May 2012 | B2 |
8176049 | Deninger et al. | May 2012 | B2 |
8176480 | Spertus | May 2012 | B1 |
8201246 | Wu et al. | Jun 2012 | B1 |
8204984 | Aziz et al. | Jun 2012 | B1 |
8214905 | Doukhvalov et al. | Jul 2012 | B1 |
8220055 | Kennedy | Jul 2012 | B1 |
8225288 | Miller et al. | Jul 2012 | B2 |
8225373 | Kraemer | Jul 2012 | B2 |
8233882 | Rogel | Jul 2012 | B2 |
8234640 | Fitzgerald et al. | Jul 2012 | B1 |
8234709 | Viljoen et al. | Jul 2012 | B2 |
8239944 | Nachenberg et al. | Aug 2012 | B1 |
8260914 | Ranjan | Sep 2012 | B1 |
8266091 | Gubin et al. | Sep 2012 | B1 |
8286251 | Eker et al. | Oct 2012 | B2 |
8291499 | Aziz et al. | Oct 2012 | B2 |
8307435 | Mann et al. | Nov 2012 | B1 |
8307443 | Wang et al. | Nov 2012 | B2 |
8312545 | Tuvell et al. | Nov 2012 | B2 |
8321936 | Green et al. | Nov 2012 | B1 |
8321941 | Tuvell et al. | Nov 2012 | B2 |
8332571 | Edwards, Sr. | Dec 2012 | B1 |
8365286 | Poston | Jan 2013 | B2 |
8365297 | Parshin et al. | Jan 2013 | B1 |
8370938 | Daswani et al. | Feb 2013 | B1 |
8370939 | Zaitsev et al. | Feb 2013 | B2 |
8375444 | Aziz et al. | Feb 2013 | B2 |
8381299 | Stolfo et al. | Feb 2013 | B2 |
8402529 | Green et al. | Mar 2013 | B1 |
8464340 | Ahn et al. | Jun 2013 | B2 |
8479174 | Chiriac | Jul 2013 | B2 |
8479276 | Vaystikh et al. | Jul 2013 | B1 |
8479291 | Bodke | Jul 2013 | B1 |
8510827 | Leake et al. | Aug 2013 | B1 |
8510828 | Guo et al. | Aug 2013 | B1 |
8510842 | Amit et al. | Aug 2013 | B2 |
8516478 | Edwards et al. | Aug 2013 | B1 |
8516590 | Ranadive et al. | Aug 2013 | B1 |
8516593 | Aziz | Aug 2013 | B2 |
8522348 | Chen et al. | Aug 2013 | B2 |
8528086 | Aziz | Sep 2013 | B1 |
8533824 | Hutton et al. | Sep 2013 | B2 |
8539582 | Aziz et al. | Sep 2013 | B1 |
8549638 | Aziz | Oct 2013 | B2 |
8555385 | Bhatkar | Oct 2013 | B1 |
8555391 | Demir et al. | Oct 2013 | B1 |
8561177 | Aziz et al. | Oct 2013 | B1 |
8566476 | Shifter et al. | Oct 2013 | B2 |
8566946 | Aziz et al. | Oct 2013 | B1 |
8584094 | Dadhia et al. | Nov 2013 | B2 |
8584234 | Sobel et al. | Nov 2013 | B1 |
8584239 | Aziz et al. | Nov 2013 | B2 |
8595834 | Xie et al. | Nov 2013 | B2 |
8627476 | Satish et al. | Jan 2014 | B1 |
8635696 | Aziz | Jan 2014 | B1 |
8640245 | Zaitsev | Jan 2014 | B2 |
8682054 | Xue et al. | Mar 2014 | B2 |
8682812 | Ranjan | Mar 2014 | B1 |
8689333 | Aziz | Apr 2014 | B2 |
8695096 | Zhang | Apr 2014 | B1 |
8713631 | Pavlyushchik | Apr 2014 | B1 |
8713681 | Silberman et al. | Apr 2014 | B2 |
8726392 | McCorkendale et al. | May 2014 | B1 |
8739280 | Chess et al. | May 2014 | B2 |
8776229 | Aziz | Jul 2014 | B1 |
8782792 | Bodke | Jul 2014 | B1 |
8789172 | Stolfo et al. | Jul 2014 | B2 |
8789178 | Kejriwal et al. | Jul 2014 | B2 |
8793278 | Frazier et al. | Jul 2014 | B2 |
8793787 | Ismael et al. | Jul 2014 | B2 |
8805947 | Kuzkin et al. | Aug 2014 | B1 |
8806647 | Daswani et al. | Aug 2014 | B1 |
8832829 | Manni et al. | Sep 2014 | B2 |
8850570 | Ramzan | Sep 2014 | B1 |
8850571 | Staniford et al. | Sep 2014 | B2 |
8881234 | Narasimhan et al. | Nov 2014 | B2 |
8881271 | Butler, II | Nov 2014 | B2 |
8881282 | Aziz et al. | Nov 2014 | B1 |
8898788 | Aziz et al. | Nov 2014 | B1 |
8935779 | Manni et al. | Jan 2015 | B2 |
8949257 | Shiffer et al. | Feb 2015 | B2 |
8984638 | Aziz et al. | Mar 2015 | B1 |
8990939 | Staniford et al. | Mar 2015 | B2 |
8990944 | Singh et al. | Mar 2015 | B1 |
8997219 | Staniford et al. | Mar 2015 | B2 |
9009822 | Ismael et al. | Apr 2015 | B1 |
9009823 | Ismael et al. | Apr 2015 | B1 |
9027135 | Aziz | May 2015 | B1 |
9071638 | Aziz et al. | Jun 2015 | B1 |
9104867 | Thioux et al. | Aug 2015 | B1 |
9106630 | Frazier et al. | Aug 2015 | B2 |
9106694 | Aziz et al. | Aug 2015 | B2 |
9118715 | Staniford et al. | Aug 2015 | B2 |
9159035 | Ismael et al. | Oct 2015 | B1 |
9171160 | Vincent et al. | Oct 2015 | B2 |
9176843 | Ismael et al. | Nov 2015 | B1 |
9189627 | Islam | Nov 2015 | B1 |
9195829 | Goradia et al. | Nov 2015 | B1 |
9197664 | Aziz et al. | Nov 2015 | B1 |
9223972 | Vincent et al. | Dec 2015 | B1 |
9225740 | Ismael et al. | Dec 2015 | B1 |
9241010 | Bennett et al. | Jan 2016 | B1 |
9251343 | Vincent et al. | Feb 2016 | B1 |
9262635 | Paithane et al. | Feb 2016 | B2 |
9268936 | Butler | Feb 2016 | B2 |
9275229 | LeMasters | Mar 2016 | B2 |
9282109 | Aziz et al. | Mar 2016 | B1 |
9288220 | Raugas | Mar 2016 | B2 |
9292686 | Ismael et al. | Mar 2016 | B2 |
9294501 | Mesdaq et al. | Mar 2016 | B2 |
9300686 | Pidathala et al. | Mar 2016 | B2 |
9306960 | Aziz | Apr 2016 | B1 |
9306974 | Aziz et al. | Apr 2016 | B1 |
9311479 | Manni et al. | Apr 2016 | B1 |
9355247 | Thioux et al. | May 2016 | B1 |
9356944 | Aziz | May 2016 | B1 |
9363280 | Rivlin et al. | Jun 2016 | B1 |
9367681 | Ismael et al. | Jun 2016 | B1 |
9398028 | Karandikar et al. | Jul 2016 | B1 |
9413781 | Cunningham et al. | Aug 2016 | B2 |
9426071 | Caldejon et al. | Aug 2016 | B1 |
9430646 | Mushtaq et al. | Aug 2016 | B1 |
9432389 | Khalid et al. | Aug 2016 | B1 |
9438613 | Paithane et al. | Sep 2016 | B1 |
9438622 | Staniford et al. | Sep 2016 | B1 |
9438623 | Thioux et al. | Sep 2016 | B1 |
9459901 | Jung et al. | Oct 2016 | B2 |
9467460 | Otvagin et al. | Oct 2016 | B1 |
9483644 | Paithane et al. | Nov 2016 | B1 |
9495180 | Ismael | Nov 2016 | B2 |
9497213 | Thompson et al. | Nov 2016 | B2 |
9507935 | Ismael et al. | Nov 2016 | B2 |
9516057 | Aziz | Dec 2016 | B2 |
9519782 | Aziz et al. | Dec 2016 | B2 |
9536091 | Paithane et al. | Jan 2017 | B2 |
9537972 | Edwards et al. | Jan 2017 | B1 |
9560059 | Islam | Jan 2017 | B1 |
9565202 | Kindlund et al. | Feb 2017 | B1 |
9591015 | Amin et al. | Mar 2017 | B1 |
9591020 | Aziz | Mar 2017 | B1 |
9594904 | Jain et al. | Mar 2017 | B1 |
9594905 | Ismael et al. | Mar 2017 | B1 |
9594912 | Thioux et al. | Mar 2017 | B1 |
9609007 | Rivlin et al. | Mar 2017 | B1 |
9626509 | Khalid et al. | Apr 2017 | B1 |
9628498 | Aziz et al. | Apr 2017 | B1 |
9628507 | Haq et al. | Apr 2017 | B2 |
9633134 | Ross | Apr 2017 | B2 |
9635039 | Islam et al. | Apr 2017 | B1 |
9641546 | Manni et al. | May 2017 | B1 |
9654485 | Neumann | May 2017 | B1 |
9661009 | Karandikar et al. | May 2017 | B1 |
9661018 | Aziz | May 2017 | B1 |
9674298 | Edwards et al. | Jun 2017 | B1 |
9680862 | Ismael et al. | Jun 2017 | B2 |
9690606 | Ha et al. | Jun 2017 | B1 |
9690933 | Singh et al. | Jun 2017 | B1 |
9690935 | Shiffer et al. | Jun 2017 | B2 |
9690936 | Malik et al. | Jun 2017 | B1 |
9690937 | Duchin | Jun 2017 | B1 |
9690938 | Saxe | Jun 2017 | B1 |
9736179 | Ismael | Aug 2017 | B2 |
9740857 | Ismael et al. | Aug 2017 | B2 |
9747446 | Pidathala et al. | Aug 2017 | B1 |
9756074 | Aziz et al. | Sep 2017 | B2 |
9773112 | Rathor et al. | Sep 2017 | B1 |
9781144 | Otvagin et al. | Oct 2017 | B1 |
9787700 | Amin et al. | Oct 2017 | B1 |
9787706 | Otvagin et al. | Oct 2017 | B1 |
9792196 | Ismael et al. | Oct 2017 | B1 |
9824209 | Ismael et al. | Nov 2017 | B1 |
9824211 | Wilson | Nov 2017 | B2 |
9824216 | Khalid et al. | Nov 2017 | B1 |
9825976 | Gomez et al. | Nov 2017 | B1 |
9825989 | Mehra et al. | Nov 2017 | B1 |
9838408 | Karandikar et al. | Dec 2017 | B1 |
9838411 | Aziz | Dec 2017 | B1 |
9838416 | Aziz | Dec 2017 | B1 |
9838417 | Khalid et al. | Dec 2017 | B1 |
9846776 | Paithane et al. | Dec 2017 | B1 |
9876701 | Caldejon et al. | Jan 2018 | B1 |
9888016 | Amin et al. | Feb 2018 | B1 |
9888019 | Pidathala et al. | Feb 2018 | B1 |
9910988 | Vincent et al. | Mar 2018 | B1 |
9912644 | Cunningham | Mar 2018 | B2 |
9912681 | Ismael et al. | Mar 2018 | B1 |
9912684 | Aziz et al. | Mar 2018 | B1 |
9912691 | Mesdaq et al. | Mar 2018 | B2 |
9912698 | Thioux et al. | Mar 2018 | B1 |
9916440 | Paithane et al. | Mar 2018 | B1 |
9921978 | Chan et al. | Mar 2018 | B1 |
9934376 | Ismael | Apr 2018 | B1 |
9934381 | Kindlund et al. | Apr 2018 | B1 |
9946568 | Ismael et al. | Apr 2018 | B1 |
9954890 | Staniford et al. | Apr 2018 | B1 |
9973531 | Thioux | May 2018 | B1 |
10002252 | Ismael et al. | Jun 2018 | B2 |
10019338 | Goradia et al. | Jul 2018 | B1 |
10019573 | Silberman et al. | Jul 2018 | B2 |
10025691 | Ismael et al. | Jul 2018 | B1 |
10025927 | Khalid et al. | Jul 2018 | B1 |
10027689 | Rathor et al. | Jul 2018 | B1 |
10027690 | Aziz et al. | Jul 2018 | B2 |
10027696 | Rivlin et al. | Jul 2018 | B1 |
10033747 | Paithane et al. | Jul 2018 | B1 |
10033748 | Cunningham et al. | Jul 2018 | B1 |
10033753 | Islam et al. | Jul 2018 | B1 |
10033759 | Kabra et al. | Jul 2018 | B1 |
10050998 | Singh | Aug 2018 | B1 |
10068091 | Aziz et al. | Sep 2018 | B1 |
10075455 | Zafar et al. | Sep 2018 | B2 |
10083302 | Paithane et al. | Sep 2018 | B1 |
10084813 | Eyada | Sep 2018 | B2 |
10089461 | Ha et al. | Oct 2018 | B1 |
10097573 | Aziz | Oct 2018 | B1 |
10104102 | Neumann | Oct 2018 | B1 |
10108446 | Steinberg et al. | Oct 2018 | B1 |
10121000 | Rivlin et al. | Nov 2018 | B1 |
10122746 | Manni et al. | Nov 2018 | B1 |
10133863 | Bu et al. | Nov 2018 | B2 |
10133866 | Kumar et al. | Nov 2018 | B1 |
10146810 | Shiffer et al. | Dec 2018 | B2 |
10148693 | Singh et al. | Dec 2018 | B2 |
10165000 | Aziz et al. | Dec 2018 | B1 |
10169585 | Pilipenko et al. | Jan 2019 | B1 |
10176321 | Abbasi et al. | Jan 2019 | B2 |
10181029 | Ismael et al. | Jan 2019 | B1 |
10187401 | Machlica | Jan 2019 | B2 |
10191861 | Steinberg et al. | Jan 2019 | B1 |
10192052 | Singh et al. | Jan 2019 | B1 |
10198574 | Thioux et al. | Feb 2019 | B1 |
10200384 | Mushtaq et al. | Feb 2019 | B1 |
10210329 | Malik et al. | Feb 2019 | B1 |
10216927 | Steinberg | Feb 2019 | B1 |
10218740 | Mesdaq et al. | Feb 2019 | B1 |
10230749 | Rostami-Hesarsorkh | Mar 2019 | B1 |
10242185 | Goradia | Mar 2019 | B1 |
10313379 | Han | Jun 2019 | B1 |
20010005889 | Albrecht | Jun 2001 | A1 |
20010047326 | Broadbent et al. | Nov 2001 | A1 |
20020018903 | Kokubo et al. | Feb 2002 | A1 |
20020038430 | Edwards et al. | Mar 2002 | A1 |
20020091819 | Melchione et al. | Jul 2002 | A1 |
20020095607 | Lin-Hendel | Jul 2002 | A1 |
20020116627 | Tarbotton et al. | Aug 2002 | A1 |
20020144156 | Copeland | Oct 2002 | A1 |
20020162015 | Tang | Oct 2002 | A1 |
20020166063 | Lachman et al. | Nov 2002 | A1 |
20020169952 | DiSanto et al. | Nov 2002 | A1 |
20020184528 | Shevenell et al. | Dec 2002 | A1 |
20020188887 | Largman et al. | Dec 2002 | A1 |
20020194490 | Halperin et al. | Dec 2002 | A1 |
20030021728 | Sharpe et al. | Jan 2003 | A1 |
20030074578 | Ford et al. | Apr 2003 | A1 |
20030084318 | Schertz | May 2003 | A1 |
20030101381 | Mateev et al. | May 2003 | A1 |
20030115483 | Liang | Jun 2003 | A1 |
20030188190 | Aaron et al. | Oct 2003 | A1 |
20030191957 | Hypponen et al. | Oct 2003 | A1 |
20030200460 | Morota et al. | Oct 2003 | A1 |
20030212902 | van der Made | Nov 2003 | A1 |
20030229801 | Kouznetsov et al. | Dec 2003 | A1 |
20030237000 | Denton et al. | Dec 2003 | A1 |
20040003323 | Bennett et al. | Jan 2004 | A1 |
20040006473 | Mills et al. | Jan 2004 | A1 |
20040015712 | Szor | Jan 2004 | A1 |
20040019832 | Arnold et al. | Jan 2004 | A1 |
20040047356 | Bauer | Mar 2004 | A1 |
20040083408 | Spiegel et al. | Apr 2004 | A1 |
20040088581 | Brawn et al. | May 2004 | A1 |
20040093513 | Cantrell et al. | May 2004 | A1 |
20040111531 | Staniford et al. | Jun 2004 | A1 |
20040117478 | Triulzi et al. | Jun 2004 | A1 |
20040117624 | Brandt et al. | Jun 2004 | A1 |
20040128355 | Chao et al. | Jul 2004 | A1 |
20040165588 | Pandya | Aug 2004 | A1 |
20040236963 | Danford et al. | Nov 2004 | A1 |
20040243349 | Greifeneder et al. | Dec 2004 | A1 |
20040249911 | Alkhatib et al. | Dec 2004 | A1 |
20040255161 | Cavanaugh | Dec 2004 | A1 |
20040268147 | Wiederin et al. | Dec 2004 | A1 |
20050005159 | Oliphant | Jan 2005 | A1 |
20050021740 | Bar et al. | Jan 2005 | A1 |
20050033960 | Vialen et al. | Feb 2005 | A1 |
20050033989 | Poletto et al. | Feb 2005 | A1 |
20050050148 | Mohammadioun et al. | Mar 2005 | A1 |
20050086523 | Zimmer et al. | Apr 2005 | A1 |
20050091513 | Mitomo et al. | Apr 2005 | A1 |
20050091533 | Omote et al. | Apr 2005 | A1 |
20050091652 | Ross et al. | Apr 2005 | A1 |
20050108562 | Khazan et al. | May 2005 | A1 |
20050114663 | Cornell et al. | May 2005 | A1 |
20050125195 | Brendel | Jun 2005 | A1 |
20050149726 | Joshi et al. | Jul 2005 | A1 |
20050157662 | Bingham et al. | Jul 2005 | A1 |
20050183143 | Anderholm et al. | Aug 2005 | A1 |
20050201297 | Peikari | Sep 2005 | A1 |
20050210533 | Copeland et al. | Sep 2005 | A1 |
20050238005 | Chen et al. | Oct 2005 | A1 |
20050240781 | Gassoway | Oct 2005 | A1 |
20050262562 | Gassoway | Nov 2005 | A1 |
20050265331 | Stolfo | Dec 2005 | A1 |
20050283839 | Cowburn | Dec 2005 | A1 |
20060010495 | Cohen et al. | Jan 2006 | A1 |
20060015416 | Hoffman et al. | Jan 2006 | A1 |
20060015715 | Anderson | Jan 2006 | A1 |
20060015747 | Van de Ven | Jan 2006 | A1 |
20060021029 | Brickell et al. | Jan 2006 | A1 |
20060021054 | Costa et al. | Jan 2006 | A1 |
20060031476 | Mathes et al. | Feb 2006 | A1 |
20060047665 | Neil | Mar 2006 | A1 |
20060070130 | Costea et al. | Mar 2006 | A1 |
20060075496 | Carpenter et al. | Apr 2006 | A1 |
20060095968 | Portolani et al. | May 2006 | A1 |
20060101516 | Sudaharan et al. | May 2006 | A1 |
20060101517 | Banzhof et al. | May 2006 | A1 |
20060117385 | Mester et al. | Jun 2006 | A1 |
20060123477 | Raghavan et al. | Jun 2006 | A1 |
20060143709 | Brooks et al. | Jun 2006 | A1 |
20060150249 | Gassen et al. | Jul 2006 | A1 |
20060161983 | Cothrell et al. | Jul 2006 | A1 |
20060161987 | Levy-Yurista | Jul 2006 | A1 |
20060161989 | Reshef et al. | Jul 2006 | A1 |
20060164199 | Gilde et al. | Jul 2006 | A1 |
20060173992 | Weber et al. | Aug 2006 | A1 |
20060179147 | Tran et al. | Aug 2006 | A1 |
20060184632 | Marino et al. | Aug 2006 | A1 |
20060191010 | Benjamin | Aug 2006 | A1 |
20060221956 | Narayan et al. | Oct 2006 | A1 |
20060236393 | Kramer et al. | Oct 2006 | A1 |
20060242709 | Seinfeld et al. | Oct 2006 | A1 |
20060248519 | Jaeger et al. | Nov 2006 | A1 |
20060248582 | Panjwani et al. | Nov 2006 | A1 |
20060251104 | Koga | Nov 2006 | A1 |
20060288417 | Bookbinder et al. | Dec 2006 | A1 |
20070006288 | Mayfield et al. | Jan 2007 | A1 |
20070006313 | Porras et al. | Jan 2007 | A1 |
20070011174 | Takaragi et al. | Jan 2007 | A1 |
20070016951 | Piccard et al. | Jan 2007 | A1 |
20070019286 | Kikuchi | Jan 2007 | A1 |
20070033645 | Jones | Feb 2007 | A1 |
20070038943 | FitzGerald et al. | Feb 2007 | A1 |
20070064689 | Shin et al. | Mar 2007 | A1 |
20070074169 | Chess et al. | Mar 2007 | A1 |
20070094730 | Bhikkaji et al. | Apr 2007 | A1 |
20070101435 | Konanka et al. | May 2007 | A1 |
20070128855 | Cho et al. | Jun 2007 | A1 |
20070142030 | Sinha et al. | Jun 2007 | A1 |
20070143827 | Nicodemus et al. | Jun 2007 | A1 |
20070156895 | Vuong | Jul 2007 | A1 |
20070157180 | Tillmann et al. | Jul 2007 | A1 |
20070157306 | Elrod et al. | Jul 2007 | A1 |
20070168988 | Eisner et al. | Jul 2007 | A1 |
20070171824 | Ruello et al. | Jul 2007 | A1 |
20070174915 | Gribble et al. | Jul 2007 | A1 |
20070192500 | Lum | Aug 2007 | A1 |
20070192858 | Lum | Aug 2007 | A1 |
20070198275 | Malden et al. | Aug 2007 | A1 |
20070208822 | Wang et al. | Sep 2007 | A1 |
20070220607 | Sprosts et al. | Sep 2007 | A1 |
20070240218 | Tuvell et al. | Oct 2007 | A1 |
20070240219 | Tuvell et al. | Oct 2007 | A1 |
20070240220 | Tuvell et al. | Oct 2007 | A1 |
20070240222 | Tuvell et al. | Oct 2007 | A1 |
20070250930 | Aziz et al. | Oct 2007 | A1 |
20070256132 | Oliphant | Nov 2007 | A2 |
20070271446 | Nakamura | Nov 2007 | A1 |
20080005782 | Aziz | Jan 2008 | A1 |
20080018122 | Zierler et al. | Jan 2008 | A1 |
20080028463 | Dagon et al. | Jan 2008 | A1 |
20080040710 | Chiriac | Feb 2008 | A1 |
20080046781 | Childs et al. | Feb 2008 | A1 |
20080066179 | Liu | Mar 2008 | A1 |
20080072326 | Danford et al. | Mar 2008 | A1 |
20080077793 | Tan et al. | Mar 2008 | A1 |
20080080518 | Hoeflin et al. | Apr 2008 | A1 |
20080086720 | Lekel | Apr 2008 | A1 |
20080098476 | Syversen | Apr 2008 | A1 |
20080120722 | Sima et al. | May 2008 | A1 |
20080134178 | Fitzgerald et al. | Jun 2008 | A1 |
20080134334 | Kim et al. | Jun 2008 | A1 |
20080141376 | Clausen et al. | Jun 2008 | A1 |
20080184367 | McMillan et al. | Jul 2008 | A1 |
20080184373 | Traut et al. | Jul 2008 | A1 |
20080189787 | Arnold et al. | Aug 2008 | A1 |
20080201778 | Guo et al. | Aug 2008 | A1 |
20080209557 | Herley et al. | Aug 2008 | A1 |
20080215742 | Goldszmidt et al. | Sep 2008 | A1 |
20080222729 | Chen et al. | Sep 2008 | A1 |
20080263665 | Ma et al. | Oct 2008 | A1 |
20080295172 | Bohacek | Nov 2008 | A1 |
20080301810 | Lehane et al. | Dec 2008 | A1 |
20080307524 | Singh et al. | Dec 2008 | A1 |
20080313738 | Enderby | Dec 2008 | A1 |
20080320594 | Jiang | Dec 2008 | A1 |
20090003317 | Kasralikar et al. | Jan 2009 | A1 |
20090007100 | Field et al. | Jan 2009 | A1 |
20090013408 | Schipka | Jan 2009 | A1 |
20090031423 | Liu et al. | Jan 2009 | A1 |
20090036111 | Danford et al. | Feb 2009 | A1 |
20090037835 | Goldman | Feb 2009 | A1 |
20090044024 | Oberheide et al. | Feb 2009 | A1 |
20090044274 | Budko et al. | Feb 2009 | A1 |
20090064332 | Porras et al. | Mar 2009 | A1 |
20090077666 | Chen et al. | Mar 2009 | A1 |
20090083369 | Marmor | Mar 2009 | A1 |
20090083855 | Apap et al. | Mar 2009 | A1 |
20090089879 | Wang et al. | Apr 2009 | A1 |
20090094697 | Provos et al. | Apr 2009 | A1 |
20090113425 | Ports et al. | Apr 2009 | A1 |
20090125976 | Wassermann et al. | May 2009 | A1 |
20090126015 | Monastyrsky et al. | May 2009 | A1 |
20090126016 | Sobko et al. | May 2009 | A1 |
20090133125 | Choi et al. | May 2009 | A1 |
20090144823 | Lamastra et al. | Jun 2009 | A1 |
20090158430 | Borders | Jun 2009 | A1 |
20090172815 | Gu et al. | Jul 2009 | A1 |
20090187992 | Poston | Jul 2009 | A1 |
20090193293 | Stolfo et al. | Jul 2009 | A1 |
20090198651 | Shiffer et al. | Aug 2009 | A1 |
20090198670 | Shiffer et al. | Aug 2009 | A1 |
20090198689 | Frazier et al. | Aug 2009 | A1 |
20090199274 | Frazier et al. | Aug 2009 | A1 |
20090199296 | Xie et al. | Aug 2009 | A1 |
20090228233 | Anderson et al. | Sep 2009 | A1 |
20090241187 | Troyansky | Sep 2009 | A1 |
20090241190 | Todd et al. | Sep 2009 | A1 |
20090265692 | Godefroid et al. | Oct 2009 | A1 |
20090271867 | Zhang | Oct 2009 | A1 |
20090300415 | Zhang et al. | Dec 2009 | A1 |
20090300761 | Park et al. | Dec 2009 | A1 |
20090328185 | Berg et al. | Dec 2009 | A1 |
20090328221 | Blumfield et al. | Dec 2009 | A1 |
20100005146 | Drako et al. | Jan 2010 | A1 |
20100011205 | McKenna | Jan 2010 | A1 |
20100017546 | Poo et al. | Jan 2010 | A1 |
20100030996 | Butler, II | Feb 2010 | A1 |
20100031353 | Thomas et al. | Feb 2010 | A1 |
20100037314 | Perdisci et al. | Feb 2010 | A1 |
20100043073 | Kuwamura | Feb 2010 | A1 |
20100054278 | Stolfo et al. | Mar 2010 | A1 |
20100058474 | Hicks | Mar 2010 | A1 |
20100064044 | Nonoyama | Mar 2010 | A1 |
20100077481 | Polyakov et al. | Mar 2010 | A1 |
20100083376 | Pereira et al. | Apr 2010 | A1 |
20100115621 | Staniford et al. | May 2010 | A1 |
20100132038 | Zaitsev | May 2010 | A1 |
20100154056 | Smith et al. | Jun 2010 | A1 |
20100180344 | Malyshev et al. | Jul 2010 | A1 |
20100192223 | Ismael et al. | Jul 2010 | A1 |
20100220863 | Dupaquis et al. | Sep 2010 | A1 |
20100235831 | Dittmer | Sep 2010 | A1 |
20100251104 | Massand | Sep 2010 | A1 |
20100281102 | Chinta et al. | Nov 2010 | A1 |
20100281541 | Stolfo et al. | Nov 2010 | A1 |
20100281542 | Stolfo et al. | Nov 2010 | A1 |
20100287260 | Peterson et al. | Nov 2010 | A1 |
20100299754 | Amit et al. | Nov 2010 | A1 |
20100306173 | Frank | Dec 2010 | A1 |
20110004737 | Greenebaum | Jan 2011 | A1 |
20110025504 | Lyon et al. | Feb 2011 | A1 |
20110041179 | Stahlberg | Feb 2011 | A1 |
20110047594 | Mahaffey et al. | Feb 2011 | A1 |
20110047620 | Mahaffey et al. | Feb 2011 | A1 |
20110055907 | Narasimhan et al. | Mar 2011 | A1 |
20110078794 | Manni et al. | Mar 2011 | A1 |
20110093951 | Aziz | Apr 2011 | A1 |
20110099620 | Stavrou et al. | Apr 2011 | A1 |
20110099633 | Aziz | Apr 2011 | A1 |
20110099635 | Silberman et al. | Apr 2011 | A1 |
20110113231 | Kaminsky | May 2011 | A1 |
20110145918 | Jung et al. | Jun 2011 | A1 |
20110145920 | Mahaffey et al. | Jun 2011 | A1 |
20110145934 | Abramovici et al. | Jun 2011 | A1 |
20110167493 | Song et al. | Jul 2011 | A1 |
20110167494 | Bowen et al. | Jul 2011 | A1 |
20110173213 | Frazier et al. | Jul 2011 | A1 |
20110173460 | Ito et al. | Jul 2011 | A1 |
20110219449 | St. Neitzel et al. | Sep 2011 | A1 |
20110219450 | McDougal et al. | Sep 2011 | A1 |
20110225624 | Sawhney et al. | Sep 2011 | A1 |
20110225655 | Niemela et al. | Sep 2011 | A1 |
20110247072 | Staniford et al. | Oct 2011 | A1 |
20110265182 | Peinado et al. | Oct 2011 | A1 |
20110289582 | Kejriwal et al. | Nov 2011 | A1 |
20110302587 | Nishikawa et al. | Dec 2011 | A1 |
20110307954 | Melnik et al. | Dec 2011 | A1 |
20110307955 | Kaplan et al. | Dec 2011 | A1 |
20110307956 | Yermakov et al. | Dec 2011 | A1 |
20110314546 | Aziz et al. | Dec 2011 | A1 |
20120023593 | Puder et al. | Jan 2012 | A1 |
20120054869 | Yen et al. | Mar 2012 | A1 |
20120066698 | Yanoo | Mar 2012 | A1 |
20120079596 | Thomas et al. | Mar 2012 | A1 |
20120084859 | Radinsky et al. | Apr 2012 | A1 |
20120096553 | Srivastava et al. | Apr 2012 | A1 |
20120110667 | Zubrilin et al. | May 2012 | A1 |
20120117652 | Manni et al. | May 2012 | A1 |
20120121154 | Xue et al. | May 2012 | A1 |
20120124426 | Maybee et al. | May 2012 | A1 |
20120174186 | Aziz et al. | Jul 2012 | A1 |
20120174196 | Bhogavilli et al. | Jul 2012 | A1 |
20120174218 | McCoy et al. | Jul 2012 | A1 |
20120198279 | Schroeder | Aug 2012 | A1 |
20120210423 | Friedrichs et al. | Aug 2012 | A1 |
20120222121 | Staniford et al. | Aug 2012 | A1 |
20120255015 | Sahita et al. | Oct 2012 | A1 |
20120255017 | Sallam | Oct 2012 | A1 |
20120260342 | Dube et al. | Oct 2012 | A1 |
20120266244 | Green et al. | Oct 2012 | A1 |
20120278886 | Luna | Nov 2012 | A1 |
20120297489 | Dequevy | Nov 2012 | A1 |
20120304244 | Xie | Nov 2012 | A1 |
20120330801 | McDougal et al. | Dec 2012 | A1 |
20120331553 | Aziz et al. | Dec 2012 | A1 |
20130014259 | Gribble et al. | Jan 2013 | A1 |
20130036472 | Aziz | Feb 2013 | A1 |
20130047257 | Aziz | Feb 2013 | A1 |
20130074185 | McDougal et al. | Mar 2013 | A1 |
20130086684 | Mohler | Apr 2013 | A1 |
20130097699 | Balupari et al. | Apr 2013 | A1 |
20130097706 | Titonis et al. | Apr 2013 | A1 |
20130111587 | Goel et al. | May 2013 | A1 |
20130117852 | Stute | May 2013 | A1 |
20130117855 | Kim et al. | May 2013 | A1 |
20130139264 | Brinkley et al. | May 2013 | A1 |
20130160125 | Likhachev et al. | Jun 2013 | A1 |
20130160127 | Jeong et al. | Jun 2013 | A1 |
20130160130 | Mendelev et al. | Jun 2013 | A1 |
20130160131 | Madou et al. | Jun 2013 | A1 |
20130167236 | Sick | Jun 2013 | A1 |
20130174214 | Duncan | Jul 2013 | A1 |
20130185789 | Hagiwara et al. | Jul 2013 | A1 |
20130185795 | Winn et al. | Jul 2013 | A1 |
20130185798 | Saunders et al. | Jul 2013 | A1 |
20130191915 | Antonakakis et al. | Jul 2013 | A1 |
20130196649 | Paddon et al. | Aug 2013 | A1 |
20130227691 | Aziz et al. | Aug 2013 | A1 |
20130246370 | Bartram et al. | Sep 2013 | A1 |
20130247186 | LeMasters | Sep 2013 | A1 |
20130263260 | Mahaffey et al. | Oct 2013 | A1 |
20130291109 | Staniford et al. | Oct 2013 | A1 |
20130298243 | Kumar et al. | Nov 2013 | A1 |
20130318038 | Shiffer et al. | Nov 2013 | A1 |
20130318073 | Shiffer et al. | Nov 2013 | A1 |
20130325791 | Shiffer et al. | Dec 2013 | A1 |
20130325792 | Shiffer et al. | Dec 2013 | A1 |
20130325871 | Shiffer et al. | Dec 2013 | A1 |
20130325872 | Shiffer et al. | Dec 2013 | A1 |
20140032875 | Butler | Jan 2014 | A1 |
20140053260 | Gupta et al. | Feb 2014 | A1 |
20140053261 | Gupta et al. | Feb 2014 | A1 |
20140130158 | Wang et al. | May 2014 | A1 |
20140137180 | Lukacs et al. | May 2014 | A1 |
20140169762 | Ryu | Jun 2014 | A1 |
20140179360 | Jackson et al. | Jun 2014 | A1 |
20140181131 | Ross | Jun 2014 | A1 |
20140189687 | Jung et al. | Jul 2014 | A1 |
20140189866 | Shiffer et al. | Jul 2014 | A1 |
20140189882 | Jung et al. | Jul 2014 | A1 |
20140237600 | Silberman et al. | Aug 2014 | A1 |
20140280245 | Wilson | Sep 2014 | A1 |
20140283037 | Sikorski et al. | Sep 2014 | A1 |
20140283063 | Thompson et al. | Sep 2014 | A1 |
20140328204 | Klotsche et al. | Nov 2014 | A1 |
20140337836 | Ismael | Nov 2014 | A1 |
20140344926 | Cunningham et al. | Nov 2014 | A1 |
20140351935 | Shao et al. | Nov 2014 | A1 |
20140380473 | Bu et al. | Dec 2014 | A1 |
20140380474 | Paithane et al. | Dec 2014 | A1 |
20150007312 | Pidathala et al. | Jan 2015 | A1 |
20150096022 | Vincent et al. | Apr 2015 | A1 |
20150096023 | Mesdaq et al. | Apr 2015 | A1 |
20150096024 | Haq et al. | Apr 2015 | A1 |
20150096025 | Ismael | Apr 2015 | A1 |
20150180883 | Aktas | Jun 2015 | A1 |
20150180886 | Staniford et al. | Jun 2015 | A1 |
20150186645 | Aziz et al. | Jul 2015 | A1 |
20150199513 | Ismael et al. | Jul 2015 | A1 |
20150199531 | Ismael et al. | Jul 2015 | A1 |
20150199532 | Ismael | Jul 2015 | A1 |
20150220735 | Paithane et al. | Aug 2015 | A1 |
20150372980 | Eyada | Dec 2015 | A1 |
20160004869 | Ismael et al. | Jan 2016 | A1 |
20160006756 | Ismael et al. | Jan 2016 | A1 |
20160044000 | Cunningham | Feb 2016 | A1 |
20160127393 | Aziz et al. | May 2016 | A1 |
20160191547 | Zafar et al. | Jun 2016 | A1 |
20160191550 | Ismael et al. | Jun 2016 | A1 |
20160261612 | Mesdaq et al. | Sep 2016 | A1 |
20160277423 | Apostolescu | Sep 2016 | A1 |
20160285914 | Singh et al. | Sep 2016 | A1 |
20160301703 | Aziz | Oct 2016 | A1 |
20160335110 | Paithane et al. | Nov 2016 | A1 |
20170083703 | Abbasi et al. | Mar 2017 | A1 |
20170099304 | Anderson | Apr 2017 | A1 |
20180013770 | Ismael | Jan 2018 | A1 |
20180048660 | Paithane et al. | Feb 2018 | A1 |
20180060738 | Achin | Mar 2018 | A1 |
20180121316 | Ismael et al. | May 2018 | A1 |
20180276560 | Hu | Sep 2018 | A1 |
20180288077 | Siddiqui et al. | Oct 2018 | A1 |
20190132334 | Johns | May 2019 | A1 |
20190199736 | Howard | Jun 2019 | A1 |
20190260779 | Bazalgette | Aug 2019 | A1 |
Number | Date | Country |
---|---|---
2439806 | Jan 2008 | GB |
2490431 | Oct 2012 | GB |
0206928 | Jan 2002 | WO |
0223805 | Mar 2002 | WO |
2007117636 | Oct 2007 | WO |
2008041950 | Apr 2008 | WO |
2011084431 | Jul 2011 | WO |
2011112348 | Sep 2011 | WO |
2012075336 | Jun 2012 | WO |
2012145066 | Oct 2012 | WO |
2013067505 | May 2013 | WO |
Entry |
---|
“Mining Specification of Malicious Behavior”, Jha et al., UCSB, Sep. 2007, https://www.cs.ucsb.edu/~chris/research/doc/esec07_mining.pdf. |
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003). |
“When Virtual is Better Than Real”, IEEE Xplore Digital Library, available at http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=990073, (Dec. 7, 2013). |
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108. |
Adetoye, Adedayo , et al., “Network Intrusion Detection & Response System”, (“Adetoye”), (Sep. 2003). |
Apostolopoulos, George; Hassapis, Constantinos; “V-eM: A Cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation”, 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126. |
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006. |
Baecher, “The Nepenthes Platform: An Efficient Approach to collect Malware”, Springer-verlag Berlin Heidelberg, (2006), pp. 165-184. |
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77. |
Boubalos, Chris, “extracting syslog data out of raw pcap dumps”, seclists.org Honeypots mailing list archives, available at http://seclists.org/honeypots/2003/q2/319 (“Boubalos”), (Jun. 5, 2003). |
Chaudet, C., et al., “Optimal Positioning of Active and Passive Monitoring Devices”, International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulouse, France, (Oct. 2005), pp. 71-82. |
Chen, P. M. and Noble, B. D., “When Virtual is Better Than Real, Department of Electrical Engineering and Computer Science”, University of Michigan (“Chen”) (2001). |
Cisco “Intrusion Prevention for the Cisco ASA 5500-x Series” Data Sheet (2012). |
Cohen, M.I. , “PyFlag—An advanced network forensic framework”, Digital investigation 5, Elsevier, (2008), pp. S112-S120. |
Costa, M. , et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005). |
Didier Stevens, “Malicious PDF Documents Explained”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 9, No. 1, Jan. 1, 2011, pp. 80-82, XP011329453, ISSN: 1540-7993, DOI: 10.1109/MSP.2011.14. |
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007). |
Dunlap, George W. , et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceeding of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002). |
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010. |
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010. |
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011. |
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28. |
Gregg Keizer: “Microsoft's HoneyMonkeys Show Patching Windows Works”, Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:http://www.informationweek.com/microsofts-honeymonkeys-show-patching-windows-works/d/d-id/1035069? [retrieved on Jun. 1, 2016]. |
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase @ CMU, Carnegie Mellon University, 2007. |
Hiroshi Shinotsuka, Malware Authors Using New Techniques to Evade Automated Threat Analysis Systems, Oct. 26, 2012, http://www.symantec.com/connect/blogs/, pp. 1-4. |
Idika et al., A-Survey-of-Malware-Detection-Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University. |
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. “Kernel-based behavior analysis for android malware detection.” Computational intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011. |
Kaeo, Merike , “Designing Network Security”, (“Kaeo”), (Nov. 2003). |
Kevin A. Roundy et al: “Hybrid Analysis and Control of Malware”, Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454, ISBN: 978-3-642-15511-6. |
Khaled Salah et al: “Using Cloud Computing to Implement a Security Overlay Network”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 11, No. 1, Jan. 1, 2013 (Jan. 1, 2013). |
Kim, H. , et al., “Autograph: Toward Automated, Distributed Worm Signature Detection”, Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286. |
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”), (2003). |
Kreibich, C., et al., “Honeycomb - Creating Intrusion Detection Signatures Using Honeypots”, 2nd Workshop on Hot Topics in Networks (HotNets-II), Boston, USA, (2003). |
Kristoff, J. , “Botnets, Detection and Mitigation: DNS-Based Techniques”, NU Security Day, (2005), 23 pages. |
Lastline Labs, The Threat of Evasive Malware, Feb. 25, 2013, Lastline Labs, pp. 1-8. |
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711. |
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011. |
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001). |
Moore, D. , et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910. |
Morales, Jose A., et al., “Analyzing and exploiting network behaviors of malware”, Security and Privacy in Communication Networks, Springer Berlin Heidelberg, 2010, pp. 20-34. |
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg. |
Natvig, Kurt , “SANDBOXII: Internet”, Virus Bulletin Conference, (“Natvig”), (Sep. 2002). |
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987. |
Newsome, J. , et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software”, In Proceedings of the 12th Annual Network and Distributed System Security, Symposium NDSS '05), (Feb. 2005). |
Nojiri, D. , et al., “Cooperation Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302. |
Oberheide et al., “CloudAV: N-Version Antivirus in the Network Cloud”, 17th USENIX Security Symposium (USENIX Security '08), Jul. 28-Aug. 1, 2008, San Jose, CA. |
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, Stefan Berger, “sHype: Secure Hypervisor Approach to Trusted Virtualized Systems” (Feb. 2, 2005) (“Sailer”). |
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25. |
Singh, S. , et al., “Automated Worm Fingerprinting”, Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004). |
Thomas H. Ptacek, and Timothy N. Newsham , “Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection”, Secure Networks, (“Ptacek”), (Jan. 1998). |
Venezia, Paul , “NetDetector Captures Intrusions”, InfoWorld Issue 27, (“Venezia”), (Jul. 14, 2003). |
Vladimir Getov: “Security as a Service in Smart Clouds—Opportunities and Concerns”, Computer Software and Applications Conference (COMPSAC), 2012 IEEE 36th Annual, IEEE, Jul. 16, 2012 (Jul. 16, 2012). |
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350. |
Whyte, et al., “DNS-Based Detection of Scanning Worms in an Enterprise Network”, Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages. |
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9. |
Yuhei Kawakoya et al: “Memory behavior-based automatic malware unpacking in stealth debugging environment”, Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN: 978-1-4244-9353-1. |
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82. |