System and method for predicting and mitigating cybersecurity system misconfigurations

Information

  • Patent Grant
  • Patent Number
    10,826,931
  • Date Filed
    Thursday, March 29, 2018
  • Date Issued
    Tuesday, November 3, 2020
Abstract
A computerized method for reconfiguring one or more malware detection systems each performing cybersecurity analyses on incoming data is described. The method involves receiving meta-information including metrics associated with a malware detection system. Based on the meta-information, a determination is made whether the malware detection system is operating at an optimal performance level. If not, results produced by conducting behavior analyses predicting operability of the malware detection system are determined and the results are provided as feedback to the malware detection system to update one or more configuration parameter values thereof.
Description
FIELD

Embodiments of the disclosure relate to cyber security. More particularly, one embodiment of the disclosure relates to a system and method for predicting the current performance level of a malware detection system and altering its configuration based on the predicted performance level.


GENERAL BACKGROUND

Network devices provide useful and necessary services that assist individuals in business and in their everyday lives. In recent years, a growing number of cyberattacks are being conducted on all types of network devices, especially network devices deployed at an enterprise (e.g., private or publicly-traded company, a governmental agency, etc.). In some cases, these cyberattacks are orchestrated in an attempt to gain access to content stored on one or more of these enterprise-based network devices. Such access is for illicit (i.e., unauthorized) purposes, such as spying or other malicious or nefarious activities. For protection, many enterprises deploy cybersecurity systems, such as on-premises malware detection systems that monitor and analyze content propagating over a local network in efforts to detect a cyberattack.


Typically, on-premises malware detection systems are installed in accordance with configurations that are either factory set or user-configurable, e.g., per specifications of installation guides provided by the manufacturers. Typically, these malware detection systems are initially configured to operate efficiently in accordance with network traffic patterns generally prevailing at the time of installation. Sometimes, the malware detection systems may not be properly configured by customers (users). Moreover, as network traffic patterns are dynamic and the threat landscape confronting customers may differ and may even change over time, in some situations, the malware detection systems' configurations should be tuned upon installation and re-tuned from time to time after installation for optimal effectiveness.


In extreme situations, the malware detection systems may be significantly misconfigured, resulting in under-utilization of their detection capabilities or over-utilization of their detection capacity. This may reduce operational efficiencies or efficacies, or both, below that achievable by properly configured malware detection systems, and, in worst-case scenarios, may result in inadequately analyzed network traffic and an unnecessarily increased risk of a successful cyberattack on an enterprise. However, this activity of reconfiguring or tuning of the malware detection system is rarely performed due to both increased costs for the customer and a reduced availability of skilled technicians to perform such services.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the disclosure are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is an exemplary embodiment of an architecture of a cybersecurity protection service.



FIG. 2 is an exemplary embodiment of a malware detection system of FIG. 1.



FIG. 3 is an exemplary embodiment of the system configuration optimization engine of FIG. 1.



FIG. 4 is a first exemplary embodiment of an operation flow between a cybersecurity system and the system configuration optimization engine of FIG. 1.



FIG. 5 is an exemplary embodiment of an interactive dashboard of a first interface display screen produced by the monitoring and reporting service of FIG. 4.



FIG. 6 is an exemplary embodiment of an interactive dashboard of a second interface display screen to provide greater granularity in the analytic results illustrated by the first display screen of FIG. 5.



FIG. 7 is an exemplary embodiment of the logical operations performed during communications between a malware detection system deployed within a cybersecurity system and the system configuration optimization engine.



FIG. 8 is an exemplary embodiment of a flowchart illustrating operations by the system configuration optimization engine of FIG. 1.



FIG. 9 is a second exemplary embodiment of an operation flow between a cybersecurity system and the system configuration optimization engine of FIG. 1.





DETAILED DESCRIPTION
I. Overview

In general, embodiments of the disclosure describe a system configuration optimization engine that is configured to (i) receive meta-information including different metrics associated with one or more malware detection systems (situated on-premises or as part of a cloud service), (ii) determine whether each of the malware detection system(s) is operating at an optimal performance level, and (iii) generate results provided as feedback to update one or more configuration parameter values for a particular malware detection system that is operating at a non-optimal performance level. Each configuration parameter includes information that partially controls the operating state of a resource (e.g., hardware, software or firmware) deployed within a network device (e.g., malware detection system). Examples of configuration parameters may be directed to hardware characteristics (e.g., number of active processor cores, memory capacity, utilization levels, etc.), operational settings such as virtual machine (VM) characteristics (e.g., number of active VMs, VM utilization, etc.), kernel-related optimizations (e.g., enabling/disabling kernel filters according to operating system performance, etc.), software characteristics (e.g., number of active processes, applications utilized by the active processes, etc.), or the like.


As described below, the system configuration optimization engine is remotely located from and communicatively coupled to one or more cybersecurity systems, which may be associated with different customers. A cybersecurity system includes one or more malware detection systems, each configured to provide meta-information to the system configuration optimization engine. The meta-information may include (a) a first metric being statistics associated with certain configuration parameters of the malware detection system, and/or (b) a second metric being events monitored during operations of the malware detection system (e.g., data retrieved in response to a certain event such as a system crash, etc.).


Based on the received meta-information, the system configuration optimization engine assigns a performance level for each malware detection system of a cybersecurity system. For illustrative purposes, the assigned performance level may be one of a plurality of performance levels, either (i) an optimal performance level or (ii) a non-optimal performance level. The non-optimal performance level may include multiple levels of granularity, such as an over-utilized performance level and an under-utilized performance level. The “optimal performance level” refers to a preferred operating state for a network device performing cybersecurity analyses, such as performed by a malware detection system for example, which may be measured by certain metrics, such as hardware utilization statistics, virtual machine utilization statistics, and/or software utilization statistics. A “non-optimal performance level” (e.g., over-utilized or under-utilized) indicates that the malware detection system is operating outside of its desired operating configuration.
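For illustration only, the assignment of one of these performance levels to a single utilization metric may be sketched as follows; the threshold values are hypothetical examples, not values specified by this disclosure:

```python
# Minimal sketch: classify a single utilization metric into one of the
# three performance levels described above. The cutoff values are
# hypothetical assumptions for illustration.
def classify_utilization(value, optimal_low=0.40, optimal_high=0.85):
    """Return 'under-utilized', 'optimal', or 'over-utilized'."""
    if value < optimal_low:
        return "under-utilized"
    if value > optimal_high:
        return "over-utilized"
    return "optimal"

print(classify_utilization(0.95))  # over-utilized
print(classify_utilization(0.60))  # optimal
```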


More specifically, as described herein, the system configuration optimization engine analyzes the meta-information provided by each malware detection system and, based on such analysis, assigns a performance level to that malware detection system. The meta-information may be provided to the system configuration optimization engine on a periodic basis and/or an aperiodic basis in response to a certain event (e.g., system crash, system operability exceeds or falls below a prescribed threshold, request initiated by a network administrator or cybersecurity system manufacturer, etc.).


As described above, the “optimal” performance level refers to a preferred operating state for a network device performing cybersecurity analyses. This preferred operating state may be represented through a collection of system metrics, and thus, the performance level for a malware detection system may be determined through the collective analysis of configuration parameter values directed to these system metrics, which are provided as part of the meta-information. As part of this collective analysis, the system configuration optimization engine determines the degree of correlation between the received metrics in the meta-information and desired metrics of the malware detection system (referred to as a “baseline configuration”) that are gathered using experiential knowledge of operational meta-information of known misconfigured malware detection systems and/or known optimal malware detection systems.


According to one embodiment of the disclosure, a baseline configuration may include a certain hardware utilization threshold (or range), a VM utilization threshold (or range), and/or a software utilization threshold (or range) representing a desired operating configuration for a malware detection system with a certain hardware profile. Hence, the system configuration optimization engine may include a plurality of baseline configurations, each associated with a different hardware profile (e.g., number of processors, memory size, etc.). When analyzing the performance level for a malware detection system, a baseline configuration may be selected based on the hardware profile for that malware detection system, along with other factors (e.g., the threat landscape confronting the customer (for instance, as indicated by the industry protected by the malware detection system), subscription or customer type, etc.). For instance, given the same hardware profile, a malware detection system deployed for a customer in a high-risk industry (e.g., governmental defense agency, utility, etc.) may be assigned a more stringent baseline configuration (e.g., less range tolerance, a different weighting scheme targeted to ensure higher average performance levels, etc.) than a baseline configuration reserved for malware detection systems deployed in lower-risk industries (e.g., textiles, etc.). As a result, subscription levels for malware detection systems deployed for high-risk industry customers may be more costly given a likely increased frequency of re-configuration of the malware detection system for placement into an optimal performance level.
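The selection of a baseline configuration keyed on a hardware profile and customer risk tier may be sketched, purely for illustration, as follows; the profiles, risk tiers, and utilization ranges are invented examples:

```python
# Hypothetical sketch of baseline-configuration selection. A high-risk
# customer with the same hardware profile receives a tighter (more
# stringent) allowed utilization range than a low-risk customer.
BASELINES = {
    # (num_processors, memory_gb, risk_tier) -> allowed CPU-utilization range
    (8, 32, "high-risk"): (0.50, 0.75),   # less range tolerance
    (8, 32, "low-risk"):  (0.40, 0.90),   # looser tolerance
}

def select_baseline(num_processors, memory_gb, risk_tier):
    """Look up the baseline range for a given hardware profile and tier."""
    return BASELINES[(num_processors, memory_gb, risk_tier)]

lo, hi = select_baseline(8, 32, "high-risk")
print(lo, hi)  # 0.5 0.75
```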


According to one embodiment of the disclosure, the performance level for the malware detection system is determined by conducting one or more arithmetic or logical operations on performance level determinations conducted for a plurality of metrics provided as part of the meta-information. More specifically, each statistic of a configuration parameter may be assigned a weighting depending on its importance in representing the health of the malware detection system. For instance, a first statistic directed to processor utilization may be assigned a larger weighting (i.e., assigned a higher importance) than a second statistic directed to the number of virtual machines (VMs) currently active. Hence, the performance level for the malware detection system is based on a collection of weighted performance level determinations (e.g., over-utilized, optimal, under-utilized) based on the statistics provided as part of the meta-information.
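The weighted aggregation of per-metric performance-level determinations described above may be sketched as follows; the score mapping, weights, and decision cutoffs are hypothetical assumptions:

```python
# Sketch of the weighted performance-level determination: each statistic's
# level is mapped to a score, weighted by importance, and the weighted
# average decides the system-wide level. All numeric choices are invented.
LEVEL_SCORE = {"under-utilized": -1, "optimal": 0, "over-utilized": 1}

def aggregate_level(determinations):
    """determinations: list of (level, weight) tuples."""
    total = sum(w for _, w in determinations)
    score = sum(LEVEL_SCORE[lvl] * w for lvl, w in determinations) / total
    if score > 0.5:
        return "over-utilized"
    if score < -0.5:
        return "under-utilized"
    return "optimal"

# Processor utilization weighted more heavily than active-VM count.
print(aggregate_level([("over-utilized", 0.7), ("optimal", 0.3)]))
```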


Responsive to the meta-information, the system configuration optimization engine may be configured to return information (referred to as a “system health report”) to the malware detection system supplying the meta-information. For one embodiment, the system health report includes (i) an identifier of the malware detection system supplying the analyzed meta-information; (ii) a determined performance level for the malware detection system; (iii) the performance level determinations for some or all of the plurality of configuration parameters; and/or (iv) one or more modified configuration parameter values that are used by the malware detection system to adjust its configuration to better remain in or return to an optimal performance level.
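One possible shape for the system health report enumerated above is sketched below; the field names are illustrative assumptions rather than a format mandated by the disclosure:

```python
# Hypothetical structure mirroring items (i)-(iv) of the system health
# report described above; field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class SystemHealthReport:
    system_id: str                        # (i) identifier of the system
    performance_level: str                # (ii) determined overall level
    per_parameter_levels: dict = field(default_factory=dict)  # (iii)
    modified_parameters: dict = field(default_factory=dict)   # (iv)

report = SystemHealthReport(
    system_id="mds-120-1",
    performance_level="over-utilized",
    per_parameter_levels={"cpu_utilization": "over-utilized"},
    modified_parameters={"active_vms": 6},
)
print(report.performance_level)  # over-utilized
```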


The system configuration optimization engine may be configured, prior to selection and passing of one or more modified configuration parameter values to the malware detection system, to select the modified configuration parameter values by at least comparing the received meta-information to predefined operational bounds (e.g., a blacklist including one or more statistics associated with configuration parameters for devices with the same hardware profile operating at non-optimal performance levels (e.g., misconfigured systems, etc.), and/or whitelist including statistics of configuration parameters for devices with the same hardware profile operating at optimal performance levels). Additionally, the system configuration optimization engine may provide the modified configuration parameter values to a monitoring and reporting service, which may issue an alert to a customer of the malware detection systems upon determining that one or more of the malware detection systems is operating at a non-optimal performance level.
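A minimal sketch of comparing received statistics against whitelist/blacklist operational bounds, with invented bound values, might look like:

```python
# Hedged sketch of checking received statistics against per-profile
# whitelist/blacklist bounds before selecting modified values. The bound
# values below are illustrative assumptions.
def check_against_bounds(stats, whitelist, blacklist):
    """Return 'optimal' if all stats fall in whitelisted ranges,
    'non-optimal' if an out-of-range stat matches a blacklisted range,
    and 'unknown' otherwise."""
    for name, value in stats.items():
        lo, hi = whitelist.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            bl = blacklist.get(name)
            if bl and bl[0] <= value <= bl[1]:
                return "non-optimal"
            return "unknown"
    return "optimal"

whitelist = {"cpu_utilization": (0.40, 0.85)}  # known-good ranges
blacklist = {"cpu_utilization": (0.95, 1.00)}  # known-misconfigured ranges
print(check_against_bounds({"cpu_utilization": 0.97}, whitelist, blacklist))
```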


According to one embodiment of the disclosure, the configuration of a malware detection system operating at a non-optimal performance level may be updated automatically without customer approval. Alternatively, before configuration parameter value(s) for the particular malware detection system are updated, approval from the network administrator is needed. Herein, administrator approval may be secured by a network device implemented with the system configuration optimization engine prior to providing the results (i.e., one or more modified configuration parameter values) as feedback to the particular malware detection system.


As an illustrative example, the network device may send an alert message to the administrator (e.g., text, email, notice to access a dashboard, etc.), where the alert message requires an action by the administrator before the modified configuration parameter values are sent. The action may include, but is not limited or restricted to, any reply mechanism such as selection of a radio button, selection of a display element (or entry of information) on the dashboard, or the like. As another illustrative example, the network device may send the modified configuration parameter values to the particular malware detection system and provide the results as input to a monitoring and reporting service, which generates a display that is accessible by at least an authorized administrator and illustrates performance levels of the malware detection systems utilized by a customer. Any of these types of reply mechanisms allows the network administrator to interact with the dashboard to authorize the configuration parameter update.


II. Terminology

In the following description, certain terminology is used to describe various features of the invention. For example, each of the terms “logic,” “engine,” and “component” may be representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, the term logic (or engine or component) may include circuitry having data processing and/or storage functionality. Examples of such circuitry may include, but are not limited or restricted to a hardware processor (e.g., microprocessor, one or more processor cores, a digital signal processor, a programmable gate array, a microcontroller, an application specific integrated circuit “ASIC”, etc.), a semiconductor memory, or combinatorial elements.


Additionally, or in the alternative, the logic (or engine or component) may include software such as one or more processes, one or more instances, Application Programming Interface(s) (API), subroutine(s), function(s), applet(s), servlet(s), routine(s), source code, object code, shared library/dynamic link library (dll), or even one or more instructions. This software may be stored in any type of a suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to a programmable circuit; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); or persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the logic (or component) may be stored in persistent storage.


Herein, a “message” generally refers to related data that is received, transmitted, or exchanged over a communication session. The message may include one or more packets, where a “packet” broadly refers to a series of bits or bytes having a prescribed format. Alternatively, the data may include a collection of data that may take the form of an individual or a number of packets carrying related payloads, e.g., a single webpage received over a network.


The term “object” generally relates to content (or information for accessing such content) having a logical structure or organization that enables the object to be classified for purposes of malware analysis. The content may include an executable (e.g., an application, program, code segment, a script, dynamic link library “dll” or any file in a format that can be directly executed by a computer such as a file with an “.exe” extension, etc.), a non-executable (e.g., a file; any document such as a Portable Document Format “PDF” document; a word processing document such as Word® document; an electronic mail “email” message, web page, etc.), or simply a collection of related data (e.g., packets).


The term “computerized” generally represents that any corresponding operations are conducted by hardware in combination with software and/or firmware. The term “data store” generally refers to a data storage device such as the non-transitory storage medium described above, which provides non-persistent or persistent storage for information (e.g., data, meta-information, etc.).


According to one embodiment of the disclosure, the term “malware” may be broadly construed as any code, communication or activity that initiates or furthers a cyberattack. Malware may prompt or cause unauthorized, anomalous, unintended and/or unwanted behaviors or operations constituting a security compromise of information infrastructure. For instance, malware may correspond to a type of malicious computer code that, as an illustrative example, executes an exploit to take advantage of a vulnerability in a network, network device or software, to gain unauthorized access, harm or co-opt operations of the network, the network device of the software or to misappropriate, modify or delete data. Alternatively, as another illustrative example, malware may correspond to information (e.g., executable code, script(s), data, command(s), etc.) that is designed to cause a network device to experience anomalous (unexpected or undesirable) behaviors. The anomalous behaviors may include a communication-based anomaly or an execution-based anomaly, which, for example, could (1) alter the functionality of a network device executing application software in an atypical manner; (2) alter the functionality of the network device executing that application software without any malicious intent; and/or (3) provide unwanted functionality which may be generally acceptable in another context.


The term “network device” may be construed as any electronic computing system with the capability of processing data and connecting to a network. The network may be a public network such as the Internet and/or a local (private) network such as an enterprise network, a wireless local area network (WLAN), a local area network (LAN), a wide area network (WAN), or the like. Examples of a network device may include, but are not limited or restricted to an endpoint (e.g., a laptop, a mobile phone, a tablet, a computer, a video console, a copier, etc.), a network appliance, a server, a router or other intermediary communication device, a firewall, etc.


The term “transmission medium” may be construed as a physical or logical communication path between two or more network devices or between components within a network device. For instance, as a physical communication path, wired and/or wireless interconnects in the form of electrical wiring, optical fiber, cable, bus trace, or a wireless channel using radio frequency (RF) or infrared (IR), may be used. A logical communication path may simply represent a communication path between two or more network devices or between components within a network device such as one or more Application Programming Interfaces (APIs).


Finally, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.


As this invention is susceptible to embodiments of many different forms, it is intended that the present disclosure is to be considered as an example of the principles of the invention and not intended to limit the invention to the specific embodiments shown and described.


III. General Architecture

Referring to FIG. 1, an exemplary block diagram of an embodiment of an architecture of a cybersecurity protection service 100 is shown. Herein, the cybersecurity protection service 100 features one or more cybersecurity systems 1101-110N (N≥1) communicatively coupled to a system configuration optimization engine 150. Deployed for detecting and protecting a local network of a customer against cyberattacks, each of the cybersecurity systems 1101-110N (e.g., cybersecurity system 1101) includes one or more malware detection systems 1201-120M (M≥1). As shown, the malware detection systems 1201-120M may be deployed on-premises (in the local network) to detect and analyze incoming objects propagating into or through the local network for malware. Alternatively, the malware detection systems 1201-120M may be deployed as a cloud-based solution in which the incoming objects (or a representation thereof) are captured at the local network and provided to at least one of the cloud-based malware detection systems 1201-120M.


Herein, each of the malware detection systems 1201-120M may be configured to perform a two-phase approach for detecting malware contained in network traffic. This two-phase approach includes a static phase and a dynamic phase. During the static phase, an initial analysis of the characteristics of an object is conducted, without execution or processing of the object, to determine whether the object is “malicious” (malware) or “benign” (not malware). Where the object is “suspicious” (e.g., inconclusive whether malicious or benign), a further (dynamic) analysis of the object may be conducted. During the dynamic phase, the object is executed within one or more virtual machines. A virtual machine (VM) executes an incoming object, and the behaviors of the object during execution (or of the VM) are monitored. Each behavior may also be referred to as an “event.” In some embodiments, one or more of the malware detection systems 1201-120M (e.g., malware detection system 1201) may be deployed as a network device, which is communicatively coupled to receive and analyze objects within network traffic. As a network device, the malware detection system 1201 includes logic being physical components that analyze objects for malware. Alternatively, the malware detection system 1201 may be deployed as a virtual device, namely a software (daemon) agent operating (in the foreground or background) within a network device (e.g., an endpoint) to detect cyberattacks. An example of the two-phase malware detection system is described in U.S. Pat. No. 9,311,479, entitled “Correlation and Consolidation of Analytic Data For Holistic View of A Malware Attack,” and U.S. Pat. No. 9,483,644, entitled “Methods for Detecting File Altering Malware in VM based Analysis,” the entire contents of both of which are hereby incorporated by reference.


As shown in FIG. 1, each malware detection system 1201, . . . , or 120M is configured to transmit meta-information 130 to the system configuration optimization engine 150. Such transmission may be initiated periodically upon detection of a timeout condition (e.g., prescribed amount of time has elapsed, prescribed count value achieved, certain period of time detected, etc.) or aperiodically upon detection of a predetermined event (e.g., a system crash, completion of analyses of a prescribed number of objects, etc.). For instance, from the malware detection system 1201, the meta-information 130 may include statistics 135 associated with one or more configuration parameters that identify a current operating state of the malware detection system 1201. The statistics 135 may include, but are not limited or restricted to, measured values associated with certain configuration parameters, such as hardware utilization statistics, virtual machine utilization statistics, and/or software utilization statistics, as described above. Additionally, or in the alternative, the meta-information may include events as described herein.
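The periodic/aperiodic transmit trigger described above may be sketched as follows; the interval and event names are assumptions for illustration:

```python
# Illustrative sketch of the transmit trigger: send meta-information when
# a timeout condition is detected (periodic) or when a predetermined event
# occurs (aperiodic). Interval and event names are invented examples.
import time

class MetaInfoSender:
    def __init__(self, interval_s=3600.0):
        self.interval_s = interval_s
        self.last_sent = time.monotonic()

    def should_send(self, event=None, now=None):
        now = time.monotonic() if now is None else now
        if event in ("system_crash", "analysis_batch_complete"):
            return True                                   # aperiodic trigger
        return (now - self.last_sent) >= self.interval_s  # periodic trigger

sender = MetaInfoSender(interval_s=3600.0)
print(sender.should_send(event="system_crash"))  # True
```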


According to one embodiment of the disclosure, for any deployment of a malware detection system, certain component(s) within the malware detection system 1201 periodically or aperiodically determine the current operating state of the malware detection system. As an illustrative example, as shown in FIGS. 1-2, system configuration analysis logic 265 may be configured, in response to a timeout or predetermined event detected by monitoring logic 260, to ascertain statistics associated with certain configuration parameters. The statistics may be collected via the OS. The configuration parameter values may be temporarily stored prior to transmission to the system configuration optimization engine 150.


Referring back to FIG. 1, the system configuration optimization engine 150 may be implemented within a dedicated network device, which is located remotely from the cybersecurity systems 1101-110N. As an illustrative embodiment, the system configuration optimization engine 150 may be deployed as logic being part of public cloud computing or private cloud computing (e.g., a private cloud, a virtual private cloud or a hybrid cloud). When operating as part of public cloud computing, the system configuration optimization engine 150 is accessible by each of the cybersecurity systems 1101-110N via a public network, as public cloud computing supports a multi-tenant environment. In contrast, when operating as part of private cloud computing, the system configuration optimization engine 150 is accessible by a single cybersecurity system (e.g., cybersecurity system 1101), where each of the cybersecurity systems 1101-110N is associated with a different customer, as private cloud computing supports a single-tenant environment.


Upon receipt of the meta-information 130 from the cybersecurity system 1101, the system configuration optimization engine 150 analyzes the meta-information 130 to determine whether the malware detection system 1201 is operating at an optimal performance level. Such analysis may involve an evaluation of the meta-information 130 against a predictive model based on heuristic information including configuration parameter values of a known body of past configurations (e.g., blacklist and/or whitelist and/or hardware profile) that are associated with non-optimal and/or optimal performance levels. By identifying the misconfigurations, the system configuration optimization engine 150 may recommend modifications to the malware detection system configuration. These modifications may be reported to the customer via an alert and/or provided directly to the malware detection system(s).


According to one embodiment of the disclosure, the determination of whether the malware detection system 1201 is operating at an optimal performance level, namely a preferred operating state for malware detection analyses, is made based on metrics such as system parameter values and/or detection parameter values. Collectively, these configuration parameter values identify a health of the malware detection system 1201 or the cybersecurity system 1101 including at least the malware detection system 1201. Herein, the system parameters may be directed to features that influence operability of the malware detection system 1201, such as hardware utilization statistics (e.g., processor utilization, amount or degree of available memory for storage, etc.), virtual machine utilization statistics (e.g., virtual machine “VM” utilization or the number of VMs activated, etc.) and/or software utilization statistics (e.g., what processes are running, statistics associated with the processes, queue length, etc.). In contrast, the detection parameters may be directed to features associated with a malware detection analysis being conducted (e.g., type of analyses, duration of analysis per object, classification threshold being used to determine performance level, etc.).


As an example, as described above, when adjusting the detection parameter values, the type of analysis may be varied (dynamic, emulation, types of static analysis, etc.), or the thresholds that determine suspiciousness (requiring further analysis, e.g., dynamic analysis) may be varied in order to control the number of objects that are subjected to further analysis. For under-utilization, by changing the threshold to a lower level, more objects may be subjected to deeper analysis (e.g., dynamic analysis), which increases consumption of available system resources and increases the rate of object analysis. The additional objects subjected to further analysis as a result of a reduced threshold may have a lower probability (based on preliminary analysis only) of being malicious. However, such analysis may reduce the number of false negatives. For over-utilization, by raising the threshold, fewer objects may be subjected to further (dynamic) analysis. As the threshold is related to the likelihood of maliciousness, adjustment may be slow to ensure that there is no appreciable increase in the risk of false negatives. The availability of such adjustments may be related to the prevailing threat landscape for the particular customer protected by the system, or its industry.
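The threshold adjustments described above may be sketched as follows; the step sizes, which lower the threshold more aggressively for under-utilization and raise it cautiously for over-utilization, are hypothetical:

```python
# Sketch of the suspiciousness-threshold adjustment: lower the threshold
# for under-utilized systems (more objects go to dynamic analysis), raise
# it slowly for over-utilized ones to limit false-negative risk. The step
# sizes are invented for illustration.
def adjust_suspiciousness_threshold(threshold, level,
                                    lower_step=0.05, raise_step=0.01):
    if level == "under-utilized":
        # Lower threshold: more objects receive deeper (dynamic) analysis.
        return max(0.0, threshold - lower_step)
    if level == "over-utilized":
        # Raise slowly so the false-negative risk does not appreciably grow.
        return min(1.0, threshold + raise_step)
    return threshold

print(round(adjust_suspiciousness_threshold(0.50, "under-utilized"), 2))  # 0.45
```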


In response to determining that the malware detection system 1201 is operating at a non-optimal performance level representing that the malware detection system 1201 is operating outside of its desired operating configuration (e.g., under-utilized or over-utilized), the system configuration optimization engine 150 may be configured to recommend configuration modifications for the malware detection system 1201 and return a configuration modification message 170. The configuration modification message 170 may include one or more modified configuration parameter values 175, which may be different from the values reflected in the received statistics 135. Upon modification of the configuration of the malware detection system 1201 in accordance with the modified configuration parameter values 175, either automatically or upon approval by an administrator as described above before the automated solution is given effect (e.g., agreement as to the modifications, acceptance of increased charges if applicable, etc.), the functionality of the malware detection system 1201 is modified in an effort to return to an optimal performance level.


As an illustrative example, the configuration modification message 170 may include a modified configuration parameter value 175 signifying a change in the number of active virtual machines currently being utilized by the malware detection system 1201, as represented by the statistics 135 (e.g., inclusive of statistics associated with the number of active virtual machines) provided within the meta-information 130. Herein, according to one embodiment, the malware detection system 1201, upon receipt of the modified configuration parameter value 175 (with administrator approval if needed), may decrease the number of active virtual machines deployed, provided the malware detection system 1201 is determined by the system configuration optimization engine 150 to be operating at an “over-utilized” VM utilization level. Herein, the “over-utilized” VM utilization level may be determined by any configuration parameter value or combination of configuration parameter values indicating that available resources at the malware detection system 1201 are incapable of supporting the current performance level (e.g., the number of VMs running concurrently, the number of objects queued and awaiting VM analysis, etc.), with the modified configuration parameter value 175 temporarily reducing the performance level of the malware detection system 1201. Alternatively, according to another embodiment, upon receipt of the modified configuration parameter value 175 (with administrator approval if needed), the malware detection system 1201 may increase the number of active virtual machines from the number represented by the statistics 135, provided the malware detection system 1201 is operating at an “under-utilized” VM utilization level where resources at the malware detection system 1201 are available to support a higher performance level (e.g., more VMs, etc.).
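The VM-count adjustment in this example may be sketched, for illustration only, as a single-step change toward the optimal range (the function name and the one-VM step size are hypothetical):

```python
def recommend_vm_count(active_vms, utilization_level, max_vms, min_vms=1):
    """Produce a modified configuration parameter value (a new VM count)
    based on the determined VM utilization level."""
    if utilization_level == "over-utilized":
        # Temporarily reduce the performance level of the system.
        return max(min_vms, active_vms - 1)
    if utilization_level == "under-utilized":
        # Resources are available to support a higher performance level.
        return min(max_vms, active_vms + 1)
    return active_vms  # optimal: leave the configuration unchanged
```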


Referring to FIG. 2, an illustrative embodiment of a malware detection system (e.g., malware detection system 1201) is shown. Herein, the malware detection system 1201 features a plurality of components, including a processor 200, a network interface 210, a memory 220, and an optional administrative interface 230, which are communicatively coupled together via a transmission medium 240. As shown, when deployed as a network appliance, the components are at least partially encased in a housing 250 made entirely or partially of a rigid material (e.g., hardened plastic, metal, glass, composite, or any combination thereof). The housing 250 protects these components from environmental conditions. As a virtual device, however, the malware detection system 1201 includes some or all of the functionality provided by the logic within the memory 220.


The processor 200 is a multi-purpose, programmable component that accepts digital data as input, processes the input data according to stored instructions, and provides results as output. One example of a processor may include an Intel® central processing unit (CPU) with an x86 instruction set architecture. Alternatively, the processor 200 may include another type of CPU, a digital signal processor (DSP), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA), or the like.


As shown in FIG. 2, the processor 200 is communicatively coupled to the memory 220 via the transmission medium 240. According to one embodiment of the disclosure, the memory 220 is adapted to store (i) event/timeout monitoring logic 260, (ii) system configuration analysis logic 265, (iii) optional timestamp generation logic 270, (iv) meta-information storage logic 275, (v) configuration readjustment logic 280 and corresponding data store 285, and (vi) malware detection logic 290.


The configuration analysis logic 265, in response to a particular event or timeout detected by the monitoring logic 260, obtains meta-information (e.g., one or more configuration parameter values 135) associated with the current operating state of the malware detection system 1201. As shown, the configuration analysis logic 265 includes an operating system (OS) statistics module 266 to collect hardware utilization statistics from the OS (e.g., processor utilization, amount or degree of available memory for storage, etc.); VM statistics module 267 to collect VM utilization statistics (e.g., VM utilization or the number of VMs activated, etc.); and/or application statistics module 268 to collect software utilization statistics (e.g., what processes are running, statistics associated with the processes, etc.).
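For illustration only, the collection performed by modules 266-268 might resemble the following sketch; the standard-library load-average call stands in for the actual OS, hypervisor, and process-table queries, and the stubbed field names are hypothetical:

```python
import os
import time

def collect_meta_information():
    """Gather a snapshot of configuration parameter statistics
    (statistics 135) for the current operating state."""
    # os.getloadavg is Unix-only, so guard its availability.
    load1 = os.getloadavg()[0] if hasattr(os, "getloadavg") else 0.0
    return {
        "timestamp": time.time(),        # optional timestamp (logic 270)
        "hw": {"load_avg_1m": load1},    # OS statistics module 266
        "vm": {"active_vms": None},      # VM statistics module 267 (stubbed)
        "app": {"process_count": None},  # application statistics module 268 (stubbed)
    }
```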


Thereafter, the configuration analysis logic 265 may temporarily store statistics associated with certain configuration parameter(s) 135 (hereinafter, “statistics”) within the meta-information storage logic 275. As an optional operation, the timestamp generation logic 270 may generate a timestamp (not shown) that is applied to each value of the statistics 135 prior to storage within the meta-information storage logic 275. The statistics 135 may include the processor utilization level, the amount of hard disk space available, the number of active virtual machines, the number of processes currently running, or the like. The current configuration parameter values 135, stored in the meta-information storage logic 275, are subsequently accessed from the meta-information storage logic 275 for transmission to the system configuration optimization engine 150 of FIG. 1.


The configuration readjustment logic 280 is adapted to receive the configuration information 170 (i.e., modified configuration parameter values 175) from the system configuration optimization engine 150 of FIG. 1 for storage with the data store 285. The configuration readjustment logic 280 (e.g., a script running on the malware detection system 1201) is configured to change the operating state of the malware detection system 1201 by altering certain configuration parameter values with the values included in the modified configuration parameter values 175. The adjustment of the current configuration parameter values, represented by the statistics 135, may occur upon receipt of the modified configuration parameter values 175 or after the configuration readjustment logic 280 initiates a message to an administrator to approve alteration of the operating state of the malware detection system 1201 with the modified configuration parameter values 175. Alternatively, the current configuration parameter values may be adjusted in response to a change in operating state by the malware detection system 1201 such as initialization of a new process or a time in which the processor utilization falls below a first predetermined value or exceeds a second predetermined value different than the first predetermined value.
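An illustrative sketch of the configuration readjustment follows, with an approval callback standing in for the administrator-approval message; all names here are hypothetical:

```python
def apply_modified_parameters(current, modified, approve=None):
    """Alter current configuration parameter values with the modified
    configuration parameter values, optionally gated on approval."""
    if approve is not None and not approve(modified):
        return current              # approval withheld; operating state unchanged
    updated = dict(current)
    updated.update(modified)        # modified values 175 override statistics 135
    return updated
```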


The administrative interface 230 is a portal that allows an administrator, after credential exchange and authentication, to access and update logic stored within the memory 220 of the malware detection system 1201. For instance, the administrative interface 230 may include authentication logic (not shown) to authenticate an administrator requesting access to stored logic within the malware detection system 1201. Upon authentication, the administrator is able to modify (i) the triggering events or timeout parameters within the event/timeout monitoring logic 260, or (ii) code of the system configuration analysis logic 265, configuration readjustment logic 280, and/or malware detection logic 290 (e.g., code associated with static analysis of an object or the behavioral analysis of the object in efforts to detect a presence of malware within the object or its association with a cyberattack), or (iii) operability of the malware detection system 1201 (e.g., hardware changes, operational setting changes or software changes as described below).


Referring now to FIG. 3, an exemplary embodiment of the system configuration optimization engine 150 of FIG. 1 is shown. The system configuration optimization engine 150 features one or more processors 300, a network interface 310, a memory 320, and an optional system administrative interface 330, which allows an administrator to directly access data within the system configuration optimization engine 150 (e.g., adjust the code associated with any of the components set forth in the memory 320 when the administrator is authorized to perform such actions).


As shown, the memory 320 comprises a parser 350, training data storage 360, machine learning (ML) modeling logic 365, ML training model 370, ML predictive model 375, and system health reporting logic 380. Herein, the parser 350 is configured to parse both structured and unstructured data, which is provided as meta-information 130 from a malware detection system (e.g., malware detection system 1201). More specifically, the parser 350 features a plurality of sub-parsers 355, including a first sub-parser 356 and a second sub-parser 357. The first sub-parser 356 is configured to parse structured data to recover meta-information including the values associated with one or more configuration parameters positioned at specific locations within the structured data (hereinafter, “recovered configuration parameter values”). The recovered configuration parameter values may be analyzed by the ML modeling logic 365 in accordance with the ML predictive model 375. The second sub-parser 357 is configured to parse unstructured data (e.g., a line in a text file) for relevant information, including information associated with an event. For instance, the second sub-parser 357 may conduct a search for one or more keywords (e.g., “kernel crash” keyword, etc.) and extract information subsequent to the keywords (e.g., information identifying a nature and/or reason for the crash).
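The two sub-parsers may be sketched, for illustration only, as follows; the field names follow the examples above, the “kernel crash” keyword is taken from the passage, and the function names are hypothetical:

```python
def parse_structured(record, fields=("cpu_util", "active_vms")):
    """First sub-parser: recover configuration parameter values from
    known positions (here, dictionary keys) in structured data."""
    return {f: record[f] for f in fields if f in record}

def parse_unstructured(lines, keywords=("kernel crash",)):
    """Second sub-parser: keyword search over text lines, extracting the
    information that follows each keyword (e.g., reason for a crash)."""
    hits = []
    for line in lines:
        for kw in keywords:
            idx = line.lower().find(kw)
            if idx != -1:
                hits.append(line[idx + len(kw):].strip(" :-"))
    return hits
```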


The training data storage 360 is a data store that is adapted to temporarily store a labeled training set 362 and/or an unlabeled training set 363 (collectively referred to as “training data” 364) for use by the machine learning modeling logic 365 in “training” the ML training model 370 to produce the ML predictive model 375. The training data storage 360 may include data from the cybersecurity systems 1101-110N as well as third party sources. Herein, the training data 364 includes normalized, heuristic data pertaining to a plurality of configuration parameters directed to operability of a malware detection system, where some of the heuristic data may be directed to the same configuration parameter associated with a different hardware profile. The heuristic data may include a normalized value for a specific configuration parameter, as measured for a network device with a specific hardware profile over a prescribed time period. Alternatively, the heuristic data may include a prescribed number of sampled values. A “hardware profile” is a specific representation of a network device having certain functionality, such as a number of processing elements (e.g., processors, processor cores, etc.), certain memory storage capacity, certain VM capacity, manufacturer/model name of the network device, device identification number, or the like.


For different hardware profiles, each training data set 364 may correspond to a different configuration parameter, a different combination of configuration parameters, and/or different configuration parameter values or weighting used by different classifications. Stated differently, the ML modeling logic 365 uses the training data sets 364 to establish baselines in classifying incoming meta-information 130 (using the received configuration parameter values), and these baselines may vary between hardware profiles. Furthermore, the ML predictive model 375 may be trained to apply different weighting factors for different configuration parameters to determine a verdict for each incoming configuration parameter and/or an aggregate of weighted configuration parameters for classifying of the malware detection system 1201 providing the meta-information 130.


According to one embodiment of the disclosure, each set of training data 364 includes normalized, heuristic data associated with one or more configuration parameters, where the training data 364 is labeled to correspond to a particular classification of a plurality of classifications. Based on the foregoing, each classification of the training data 364 may correspond to a different aggregation of configuration parameter values, as different hardware profiles may be associated with different normalized, heuristic data operating as a baseline and/or different weighting factors assigned to configuration parameter values for determining a verdict for each incoming configuration parameter value and an aggregate of weighted configuration parameter values.


The machine learning modeling logic 365 processes the ML training model 370 using the labeled training data 362 as well as the unlabeled training data 363 to produce the updated predictive model 375. For instance, using the labeled training data 362, the machine learning modeling logic 365 continues to update and improve the detection accuracy of the ML training model 370 until a prescribed accuracy (e.g., 90% accuracy) is achieved. Thereafter, the ML training model 370 is released for initial testing as the ML predictive model 375, and based on continued reliable testing of the ML predictive model 375, the ML predictive model 375 is utilized by the system configuration optimization engine 150 for determining whether certain malware detection systems are operating at an optimal performance level or a non-optimal performance level. Thereafter, the ML training model 370 (corresponding to the current ML predictive model) continues further training to improve operability of the ML predictive model 375.
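The train-until-prescribed-accuracy loop may be sketched as follows, for illustration only; the callback structure and the round limit are hypothetical:

```python
def train_until_accurate(model_step, evaluate, target=0.90, max_rounds=100):
    """Iterate training rounds until the model reaches a prescribed
    accuracy (e.g., 90%), at which point it may be released for testing
    as the predictive model."""
    for _ in range(max_rounds):
        model_step()                 # one update of the ML training model
        if evaluate() >= target:
            return True              # release as the ML predictive model
    return False                     # prescribed accuracy not yet achieved
```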


The system health reporting logic 380 is adapted to receive the incoming meta-information and utilize the ML predictive model 375 in (1) determining whether the malware detection system is operating at an optimal performance level or not, and (2) determining what configuration parameters are modifiable in order for the malware detection system 1201 to be adjusted to operate at the optimal level. The system health reporting logic 380 may perform a number of operations iteratively by modifying different configuration parameters and analyzing the results of the modification to determine whether certain configuration parameter values provided by the meta-information gravitate toward the optimal performance level.


For instance, the system health reporting logic 380 may detect an “over-utilized” processor utilization level (e.g., exceeding a first threshold, such as a percentage of processing capacity exceeding 80% utilization, where optimal utilization resides within a 60%-80% range), and thus, mimic activation of additional processor cores in order to determine whether the activation of a single processor core would be sufficient to reduce the processor utilization level back to an optimal performance level (e.g., operating utilization normalized to reside within a 60%-80% range). Hence, the system health reporting logic 380 performs behavioral analysis in accordance with the hardware profile in order to determine that the alteration of certain configuration parameter values is sufficient to return the malware detection system back to an optimal operating range. Besides the percentage of processing capacity, the utilization level may be directed to the time spent on idle tasks (e.g., the optimal performance level corresponds to a prescribed percentage range of processing time being spent on idle tasks, where over-utilization exceeds the prescribed percentage range) or the type and/or amount of computing tasks being performed for a determined measure of time, such as per second, hour, day or the like (e.g., the optimal performance level may correspond to a prescribed range of computing tasks performed according to the determined measure of time, where over-utilization exceeds the prescribed computing task range).


Similarly, the system health reporting logic 380 may detect an “under-utilized” processor utilization level (e.g., utilization falling below a second threshold, such as 30% utilization, where utilization is normalized to reside within a 60%-80% range), and thus, mimic deactivation of a processor core, if multiple processor cores are active, in order to determine whether the deactivation of a single processor core would be sufficient to increase the processor utilization level back to the optimal performance level. Furthermore, processor under-utilization may be detected where the processing time being spent on idle tasks falls below the prescribed processing range or the number of computing tasks performed over the measured unit of time falls below the prescribed computing task range.
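The threshold comparison and the “mimic” estimate described in these two passages may be sketched as follows, for illustration only. The even-spread-of-load assumption in `mimic_core_change` is a simplification, as is treating utilization between the under-utilization threshold and the optimal range as in-range:

```python
def classify_processor_utilization(util, under=0.30, opt_low=0.60, opt_high=0.80):
    """Compare utilization against the first (over) and second (under)
    thresholds; values between `under` and `opt_low` are treated as
    in-range here purely for simplicity."""
    if util > opt_high:
        return "over-utilized"
    if util < under:
        return "under-utilized"
    return "optimal"

def mimic_core_change(util, active_cores, delta):
    """Estimate utilization after (de)activating `delta` cores, assuming
    the load spreads evenly across the active cores."""
    new_cores = max(1, active_cores + delta)
    return util * active_cores / new_cores
```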


Referring to FIG. 4, a first exemplary embodiment of an operation flow between a cybersecurity system 1101 and the system configuration optimization engine 150 is shown. Herein, both the malware detection systems 1201 and 1202 communicate with a cloud service 400, which operates as a datacenter that aggregates meta-information 130 received from the malware detection system 1201 via a first message 410 and meta-information 1302 received from the malware detection system 1202 via a second message 415. As shown, one embodiment of the first message 410 may include an identifier of the malware detection system 1201 (e.g., System ID 420), an optional timestamp 425, and statistics 135 associated with one or more configuration parameters representing the system metrics, such as hardware utilization statistics 430 (e.g., processor or memory-based statistics), virtual machine utilization statistics 435 (e.g., VM-based statistics) and/or software utilization statistics 440 (e.g., process-based statistics) as described above.


During aggregation, according to one embodiment of the disclosure, the unstructured data within the meta-information 130 may be formatted and placed into a prescribed data structure. Otherwise, the meta-information 130, including structured and/or unstructured data, may be provided to the system configuration optimization engine 150. According to one embodiment of the disclosure, the cloud service 400 controls delivery of the meta-information 130 (e.g., “push” delivery) while, according to another embodiment, the system configuration optimization engine 150 controls delivery of the meta-information 130 (e.g., “pull” delivery).


As shown, the system configuration optimization engine 150 includes the parser 350, which features a plurality of sub-parsers 355 including the first sub-parser 356 and the second sub-parser 357. As described above, the first sub-parser 356 is configured to parse structured data contained in the meta-information 130 in order to recover one or more configuration parameter values. The recovered configuration parameter values are used by the ML predictive model 375, being processed by the ML modeling logic (not shown), in determining a performance level at which the malware detection system 1201 is currently operating. Additionally, the ML predictive model 375, being processed by the ML modeling logic 365 of FIG. 3, determines the performance level (e.g., values) for each of the recovered configuration parameter values.


Additionally, the second sub-parser 357 is configured to parse unstructured data for relevant information (e.g., analysis of text strings such as lines of a text file). The “relevant” information includes information associated with a monitored event, where the information may be obtained from keyword searches, as described above. The relevant information may be used by the ML modeling logic in determining, independently or in combination with the recovered configuration parameter values, the performance level at which the malware detection system 1201 is currently operating.


More specifically, the configuration parameter value(s) and/or relevant information are provided to the ML modeling logic 365. The ML predictive model 375 is generated as a result of the ML modeling logic performing “training” operations on the ML training model using the training data, as described in FIG. 3. Herein, the ML predictive model 375 determines the hardware profile of a source of the configuration parameter value(s) and/or relevant information, where the hardware profile may influence what normalized, heuristic data is referenced in the classification of the source. For instance, using the ML predictive model 375, the ML modeling logic analyzes portions of the meta-information 130 (e.g., recovered configuration parameter statistics and/or the relevant information) to classify the malware detection system 1201 (e.g., over-utilized performance level, optimal performance level, or under-utilized performance level).


As described above, the classification operations are dependent on the detected hardware profile for the malware detection system 1201 and the content of the configuration parameters supplied by the meta-information 130. For example, the ML predictive model 375 may apply prescribed weightings to the configuration parameter values, where the aggregate of the weighted values is used to determine whether the malware detection system is operating at an optimal performance level, or is operating at a non-optimal performance level (e.g., over-utilized where processor utilization exceeds a first prescribed percentage and/or available memory falls below a first prescribed byte size, or under-utilized where processor utilization falls below a second prescribed percentage and/or available memory exceeds a second prescribed byte size).
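The weighted-aggregate classification may be sketched as follows, for illustration only; the weights, thresholds, and function name are hypothetical, and a deployed embodiment would derive them per hardware profile via the ML predictive model 375:

```python
def classify_system(values, weights, over=0.8, under=0.3):
    """Aggregate weighted configuration parameter values into a single
    normalized score and map it to a performance-level classification."""
    total_weight = sum(weights[k] for k in values)
    score = sum(values[k] * weights[k] for k in values) / total_weight
    if score > over:
        return "over-utilized", score
    if score < under:
        return "under-utilized", score
    return "optimal", score
```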


The system health reporting logic 380 is adapted to receive the incoming meta-information and utilize the ML predictive model 375 to determine (1) whether the malware detection system is operating at an optimal performance level or not, and (2) what configuration parameters are modifiable for adjusting operability of the malware detection system 1201 to operate at the optimal performance level. The system health reporting logic 380 may iteratively modify certain configuration parameters based on what configuration parameters are negatively influencing performance, using the resulting effects as feedback to adjust the next iteration so as to tune, in steps, the configuration of the malware detection system 1201 to a value or values that produce an optimal performance level. This analysis may be performed through behavioral analysis of a virtual machine configured in accordance with a determined hardware profile or through heuristics based on prior configuration parameter adjustments. For instance, the system health reporting logic 380 may detect an “over-utilized” processor utilization level, and thus, mimic activation of additional processor cores in order to determine whether the activation of a single processor core would be sufficient to reduce the processor utilization level back to an optimal performance level (e.g., operating utilization normalized to reside within a 60%-80% range). Hence, the system health reporting logic 380 performs behavioral analysis in accordance with the hardware profile in order to determine that the alteration of certain configuration parameters is sufficient to return the malware detection system back to an optimal performance level.


As described above, the system configuration optimization engine 150 determines, using the ML predictive model 375, whether the meta-information 130 identifies the malware detection system 1201 as running in an over-utilized performance level, an optimal performance level, or an under-utilized performance level. In response to determining that the malware detection system 1201 operates at an over-utilized performance level, the system health reporting logic 380 determines which configuration parameters may be altered in order to return the malware detection system 120 back to its optimal performance level. This may involve an increase (or reduction) in active processor cores, an increase (or reduction) in active virtual machine instances, an increase (or reduction) in memory usage, or the like.


Upon completion of the analysis of the meta-information 130 supplied by the malware detection system 1201, the system health reporting logic 380 generates a system health message 450, which is provided to the malware detection system 1201 that supplied the meta-information 130. Herein, the system health message 450 may include (i) an identifier of the malware detection system supplying the analyzed meta-information; (ii) the performance level for the malware detection system; (iii) the performance level determinations for some or all of the plurality of configuration parameters; and/or (iv) one or more modified configuration parameter values that are used by the malware detection system to adjust its configuration to remain in or return to its optimal performance level. The system health message 450 is consistent with the configuration modification message 170 of FIG. 1, including one or more modified configuration parameter values 175.
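For illustration only, the four enumerated fields of the system health message 450 may be assembled as follows (the function and key names are hypothetical):

```python
def build_system_health_message(system_id, level, per_param, modified):
    """Assemble the fields of a system health message as enumerated:
    (i) system identifier, (ii) overall performance level, (iii) per-
    parameter determinations, (iv) modified configuration values."""
    return {
        "system_id": system_id,           # (i)
        "performance_level": level,       # (ii)
        "parameter_levels": per_param,    # (iii)
        "modified_parameters": modified,  # (iv)
    }
```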


As shown, the system health reporting logic 380 may provide the system health message 450 to a customer support service 460. The customer support service 460 may automatically analyze the contents of the system health message 450 and generate subsequent communications 465 (e.g., via electronic mail, text, automated audio, signaling to monitoring and reporting logic 470, etc.) to advise the customer as to proposed modifications to the cybersecurity system. Furthermore, the customer support service 460 may provide a portion of the system health message 450, such as modified configuration parameter values, to a targeted malware detection system. The modified configuration parameter values may be selected to perform system modifications directed to (i) hardware characteristic changes (e.g., number of active processor cores, network connector types or functionality such as activation of wireless transceivers supporting different wireless frequencies, memory capacity thresholds, etc.), (ii) operational setting changes (e.g., OS setting changes, number of active VMs, VM utilization, additional systems or services available for purchase to improve operability of the cybersecurity system, etc.), and/or (iii) software characteristic changes (e.g., number of active processes, applications utilized by the active processes, etc.) or the like. Additionally, or in the alternative, the system health reporting logic 380 may provide the system health message 450 to the monitoring and reporting service 470. The monitoring and reporting service 470 generates automatically, without user interaction, a report (e.g., information for generation of one or more display screens, a printed report, etc.). The report may be provided to a management console or an administrative interface of a targeted malware detection system. 
Similarly, a portion of the system health message 450, including modified configuration parameter values, may be provided to the targeted malware detection system (e.g., malware detection system 1201) via the administrative interface 230 of FIG. 2 or via a management console to which each cybersecurity system and/or malware detection system has access.


As shown in FIG. 5, based on the contents of the system health message 450, the monitoring and reporting service 470 generates a report that highlights information associated with performance levels for each of the malware detection systems for each particular customer. Particular types of performance levels (e.g., over-utilized performance levels and/or under-utilized performance levels) may be highlighted to visibly denote a deviation from the optimal performance level. Examples as to how the performance level may be highlighted include (1) altering location or ordering of at least certain portions of the performance level information to prominently display such information within the report; (2) modifying the font (e.g., color, size, type style, and/or effects) used in conveying some of the malware detection systems operating at non-optimal performance levels; (3) placement of one or more images proximate to a listing of certain types of performance levels (e.g., optimal, non-optimal, etc.); and (4) placement in a special window or windows associated with listings of certain types of performance levels.


Referring still to FIG. 5, an exemplary embodiment of an interface display screen 500 produced by the monitoring and reporting service 470 of FIG. 4 that provides an interactive dashboard is shown. Herein, rendered by the monitoring and reporting service 470, the display screen 500 features a first display area 510 that illustrates information directed to the performance level determined for malware detection systems deployed on-premises at a customer site. Multiple highlighting techniques are shown in display screen 500, although it is contemplated that any one or more highlighting technique may be conducted for a particular display.


More specifically, according to one embodiment of the disclosure, the display area 510 displays a plurality of entries 5201-520R (R>1, R=3 for this embodiment) that provide information directed to performance levels of the malware detection systems 1201-120M for each customer. As shown, each row of entries (e.g., 5201) rendered by the display logic comprises a plurality of fields, including one or more of the following: (1) a first field 530 including an identifier of the malware detection system; (2) a second field 532 including a timestamp that identifies when an analysis of the performance level for the malware detection system was conducted by the system configuration optimization engine 150; and/or (3) a third field 534 including the predicted performance level determined for the malware detection system by the system configuration optimization engine 150. The display area 510 may include additional fields to provide more details directed to the malware detection systems associated with a particular customer, including a fourth field 536 that lists a host address for the corresponding malware detection system, and/or a fifth field 538 that lists a hardware profile for the corresponding malware detection system.


Herein, the fields 530, 532, and 534 associated with malware detection systems operating at non-optimal performance levels may warrant a heightened scrutiny level; namely, information is displayed more prominently than in those fields associated with malware detection systems operating at optimal performance levels, for example. This allows a network administrator to more quickly and easily identify malware detection systems that may need re-configuration to improve system operability.


As an example, as a highlighting technique illustrated for the first field 530, the font associated with the malware detection systems operating at non-optimal performance levels (SYSTEM 1; SYSTEM 3) may be displayed differently than the font associated with the host names for malware detection system operating at an optimal performance level (SYSTEM 2). Alternatively, or in addition to the font changes in display, the highlighting technique may be accomplished by ordering malware detection systems operating at non-optimal performance levels (SYSTEM 1; SYSTEM 3) at the top of a listing while any malware detection systems operating at optimal performance levels (SYSTEM 2) are ordered toward the bottom of the listing. As another alternative embodiment, although not shown, a single display screen may produce two areas, where a first area includes the malware detection systems operating at non-optimal performance levels (SYSTEM 1; SYSTEM 3) while a second area includes one or more malware detection systems operating at optimal performance levels (SYSTEM 2).
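The ordering-based highlighting technique may be sketched, for illustration only, as a stable sort that places non-optimal systems at the top of the listing (the function name is hypothetical):

```python
def order_for_display(entries):
    """Order entries so that systems at non-optimal performance levels
    appear at the top of the listing; Python's sort is stable, so the
    original order is preserved within each group."""
    return sorted(entries, key=lambda e: e["level"] == "optimal")
```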


As further granularity of the operability of the malware detection system under analysis may be needed, according to one embodiment, selection of a field associated with a targeted malware detection system (e.g., performance level field 534 determined for the malware detection system by the system configuration optimization engine 150) allows the user to visualize the performance level of each individual configuration parameter, as shown in FIG. 6.


Referring to FIG. 6, it is contemplated that selection of a particular entry (e.g., third entry 534 including the performance level represented by an underlined portion) may enable the network administrator to obtain more detailed information on the configuration parameter values that resulted in determining whether a certain malware detection system is operating at an optimal performance level or a non-optimal performance level. For instance, as shown in FIG. 6, by selecting a particular entry (e.g., third entry 534), the administrator may be able to uncover the predictive results that led to the predicted performance level.


According to one embodiment of the disclosure, the predictive results may include each statistic associated with a configuration parameter 610 supplied by the malware detection system 1201 as part of the meta-information 130, the value 620 associated with each current configuration parameter received from the malware detection system 1201, and the performance level 630 determined for that particular configuration parameter (i.e., “over-utilized,” “optimal,” and “under-utilized”). Also, as an optional feature, the weighting 640 allocated for each configuration parameter may be displayed with the performance level determination along with the normalized optimal range 650 based on heuristic data for the particular hardware profile.
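A minimal sketch of how such per-parameter predictive results might be represented and classified, assuming invented parameter names, weightings, and normalized optimal ranges (the disclosure does not specify these values):

```python
def classify(value, optimal_range):
    """Map a configuration parameter value to a performance level."""
    lo, hi = optimal_range
    if value < lo:
        return "under-utilized"
    if value > hi:
        return "over-utilized"
    return "optimal"

predictive_results = [
    # (parameter 610, current value 620, weighting 640, normalized optimal range 650)
    ("processor_utilization", 0.93, 0.40, (0.30, 0.85)),
    ("vm_instances_running",  12,   0.35, (8, 16)),
    ("available_memory_gb",   5,    0.25, (4, 32)),
]

levels = {name: classify(value, rng)
          for name, value, _w, rng in predictive_results}
# total weighting carried by parameters currently at an optimal level
optimal_weight = sum(w for _n, v, w, rng in predictive_results
                     if classify(v, rng) == "optimal")
```

Here the per-parameter level 630 falls out of a simple range check, while the weighting 640 lets heavily weighted parameters dominate an aggregate view.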


Referring to FIG. 7, an exemplary embodiment of the logical operations performed by the malware detection systems 1201 (deployed within the cybersecurity system 1101) and the system configuration optimization engine 150 is shown. Herein, on a periodic basis and/or an aperiodic basis, the configuration analysis logic 265 collects configuration data associated with the malware detection system 1201. According to this embodiment, the OS statistics module 266 is configured to collect hardware utilization statistics 700, which may include current operating state information associated with the processor(s) (processor statistics 705) and the memory (memory statistics 710). Examples of the processor statistics 705 may include, but are not limited to, processor utilization and/or the number of active processor cores. Examples of the memory statistics 710 may include, but are not limited to, available memory (size) and/or average (disk) access speed.


As shown in FIG. 7, the configuration analysis logic 265 further includes the VM statistics module 267 and the application statistics module 268. The VM statistics module 267 may be configured to collect VM utilization statistics such as the current virtual operating state associated with a virtual processor (VM processor statistics 715) and/or a virtual memory (virtual memory statistics 720). Examples of the VM processor statistics 715 may include, but are not limited to, the maximum number of VM instances available or running, the number of concurrent VMs running, the queue length of objects awaiting dynamic analysis, or the guest CPU utilization of each VM. Examples of the virtual memory statistics 720 may include, but are not limited to, virtual memory available (size). The application statistics module 268 may be configured to collect software statistics such as which processes are running 725 and an event log outlining operations of those processes (not shown).
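The statistics collected by these modules might be assembled as below; the probe values are placeholders (a real OS statistics module would query the operating system and hypervisor), and all field names are assumptions for illustration only.

```python
import os

def collect_meta_information():
    """Assemble hardware, VM, and software utilization statistics into
    a single meta-information record (a sketch with placeholder probes)."""
    hardware = {
        "processor": {
            "cores_active": os.cpu_count() or 1,  # number of processor cores
            "utilization_pct": 72.0,              # placeholder probe value
        },
        "memory": {
            "available_mb": 2048,                 # placeholder probe value
            "avg_disk_access_ms": 4.2,            # placeholder probe value
        },
    }
    vm = {
        "max_vm_instances": 16,                   # placeholder probe value
        "concurrent_vms_running": 12,             # placeholder probe value
        "analysis_queue_length": 30,              # objects awaiting dynamic analysis
    }
    software = {
        "processes_running": ["analysis_svc", "logger"],  # placeholder
    }
    return {"hardware": hardware, "vm": vm, "software": software}

meta = collect_meta_information()
```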


Thereafter, the configuration analysis logic 265 may temporarily store these statistics as the meta-information associated with the malware detection system 1201, and the meta-information 130 is made available to the system configuration optimization engine 150. For this embodiment, the meta-information 130 is transmitted to the system configuration optimization engine 150 and evaluated using a predictive model developed from heuristic data gathered through experiential knowledge of operational meta-information of known misconfigured and optimally configured systems. For instance, when processed by the ML modeling logic, the configuration parameters (and/or groups of configuration parameters) are compared to heuristic data associated with optimal and non-optimal performance levels for the respective configuration parameters to determine whether each configuration parameter value (and/or group of configuration parameter values) falls within a prescribed range as determined by the heuristic data.
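The range comparison described above can be sketched as follows, assuming a hypothetical table of heuristic optimal ranges keyed by hardware profile (profile names and bounds are invented for illustration):

```python
# Hypothetical normalized, heuristic data: prescribed optimal ranges
# per hardware profile, derived from known well/misconfigured systems.
HEURISTIC_RANGES = {
    "profile_A": {
        "processor_utilization": (0.30, 0.85),
        "concurrent_vms_running": (4, 12),
    },
}

def within_optimal(profile, parameter, value):
    """Check whether a configuration parameter value falls inside the
    prescribed range for the given hardware profile."""
    lo, hi = HEURISTIC_RANGES[profile][parameter]
    return lo <= value <= hi

# e.g., a processor utilization of 0.95 falls outside the prescribed range
busy = within_optimal("profile_A", "processor_utilization", 0.95)
```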


For those configuration parameter values falling outside of the optimal performance level, the system configuration optimization engine 150 determines what configuration parameter values are modifiable for adjusting operability of the malware detection system 1201 to operate at the optimal performance level. The system configuration optimization engine 150 iteratively modifies certain configuration parameter values, based on which configuration parameter values are negatively influencing performance of the malware detection system 1201, and analyzes the potential effects of a similar adjustment to the configuration of the malware detection system 1201.


As shown in FIG. 7, the system configuration optimization engine 150 determines which configuration parameter values may be altered in order to return the malware detection system 1201 back to its optimal performance level. More specifically, the system configuration optimization engine 150 may perform hardware optimization modifications 730 such as adjusting (i.e., increasing or reducing) the number of active processor cores or adjusting the number of log events maintained by the malware detection system 1201 (to increase or reduce processor utilization). Additionally, or in the alternative, the system configuration optimization engine 150 may perform VM optimization modifications 735 such as adjusting (i.e., increasing or reducing) the number of active virtual machine instances. Likewise, additionally or in the alternative, the system configuration optimization engine 150 may perform software optimization modifications 740 such as enabling or disabling certain software features or adjusting (i.e., increasing or reducing) resources available to a particular software process.
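The iterative modification loop might look like the following sketch, where `predict_level` is a stand-in for the predictive model and the single tunable parameter (number of active VM instances) is chosen for illustration:

```python
def predict_level(config):
    """Stand-in predictive model: deems the configuration optimal when
    at most 12 VM instances are active (an invented threshold)."""
    return "optimal" if config["active_vms"] <= 12 else "over-utilized"

def tune(config, max_steps=10):
    """Iteratively nudge the offending parameter and re-evaluate the
    predicted performance level until optimal or out of steps."""
    cfg = dict(config)
    for _ in range(max_steps):
        if predict_level(cfg) == "optimal":
            break
        cfg["active_vms"] -= 1  # reduce VM instances to lower utilization
    return cfg

tuned = tune({"active_vms": 16})
# the loop steps 16 -> 15 -> 14 -> 13 -> 12, where the model predicts optimal
```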


Referring now to FIG. 8, an exemplary embodiment of a flowchart illustrating operations of the system configuration optimization engine of FIG. 1, FIG. 3 and FIG. 4 is shown. Initially, meta-information including configuration parameters directed to the operability of a malware detection system is received (block 800). From the meta-information, the hardware profile of the malware detection system is determined to identify a desired operating configuration when analyzing the configuration parameter values associated with the malware detection system (block 810). The incoming meta-information is analyzed against the heuristic configuration data in accordance with the hardware profile of the malware detection system in order to determine whether or not the incoming meta-information indicates that the malware detection system is operating at an optimal performance level (blocks 820, 830 and 840).


After the determination of the performance level of the malware detection system, the system configuration optimization engine determines which configuration parameter values associated with the incoming meta-information are modifiable to adjust the configuration (and operating state) of the malware detection system (block 850). Thereafter, the system configuration optimization engine may be configured to perform, within a virtualized environment representative of the hardware profile of the malware detection system, iterative adjustments of different configuration parameter values to determine whether such adjustments allow the current performance level to remain in or return to its optimal performance level (block 860). The performance levels of the malware detection system and configuration parameter values may be provided for display (block 870). The modified configuration parameter values may be automatically returned to the malware detection system for reconfiguring the malware detection system (blocks 880 and 890).
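The flowchart operations can be summarized in code; the block numbers in the comments refer to FIG. 8, while the helper class and stub logic are assumptions standing in for the engine's internals:

```python
class Heuristics:
    """Stand-in for the heuristic configuration data (blocks 820-840)."""
    def classify(self, profile, meta):
        return ("over-utilized"
                if meta["params"]["active_vms"] > 12 else "optimal")

    def modifiable(self, profile):
        # block 850: which configuration parameter values may be adjusted
        return ["active_vms"]

def simulate(profile, params, modifiable):
    # block 860: iterative adjustment within a virtualized environment
    tuned = dict(params)
    if "active_vms" in modifiable and tuned["active_vms"] > 12:
        tuned["active_vms"] = 12
    return tuned

shown, pushed = [], []

def optimize(meta, heuristics):
    profile = meta["hardware_profile"]                     # block 810
    level = heuristics.classify(profile, meta)             # blocks 820-840
    if level == "optimal":
        shown.append((level, meta["params"]))              # block 870
        return meta["params"]
    modifiable = heuristics.modifiable(profile)            # block 850
    tuned = simulate(profile, meta["params"], modifiable)  # block 860
    shown.append((level, tuned))                           # block 870
    pushed.append(tuned)                                   # blocks 880-890
    return tuned

result = optimize(
    {"hardware_profile": "profile_A", "params": {"active_vms": 16}},
    Heuristics(),
)
```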


Referring to FIG. 9, a second exemplary embodiment of an operation flow between the cybersecurity system 1101 and the system configuration optimization engine 150 is shown. Herein, as shown, the malware detection system 1201 is in communication with the system configuration optimization engine 150 via a cloud service 900. The cloud service 900 is configured to aggregate incoming meta-information 130 from different malware detection systems 1201-120N of the cybersecurity system 1101 and control the return of modified configuration parameter values and the updating of the predictive model 375.


More specifically, the malware detection system 1201, in response to a detected event or timeout, obtains meta-information associated with its current operating state. Herein, the meta-information 130 includes hardware utilization statistics, VM utilization statistics, and/or software utilization statistics as described above.


As shown, the system configuration optimization engine 150 includes the parser 350 to recover one or more of the configuration parameter values contained in the meta-information 130. The statistics are used by the ML predictive model 375, being processed by the ML modeling logic 365, in determining a performance level at which the malware detection system 1201 is currently operating. Additionally, the ML predictive model 375, being processed by the ML modeling logic 365, determines the performance level for each of the configuration parameter values pertaining to the statistics. The ML predictive model 375 is repeatedly updated and configured as a result of the ML modeling logic 365 performing “training” operations on the training data 364.
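The disclosure does not fix a model family for the ML predictive model 375; as a hedged illustration, the toy nearest-centroid classifier below shows the training-then-classification flow on labeled utilization statistics (all feature values and labels are invented):

```python
def train(training_data):
    """training_data: list of (feature_vector, label). Returns a centroid
    per label, computed as the per-dimension mean of that label's samples."""
    sums, counts = {}, {}
    for x, label in training_data:
        s = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def predict(centroids, x):
    """Classify x as the label of the nearest centroid (squared distance)."""
    def dist2(lbl):
        return sum((a - b) ** 2 for a, b in zip(centroids[lbl], x))
    return min(centroids, key=dist2)

# features: (processor utilization, fraction of maximum VMs running)
training = [
    ((0.95, 0.95), "over-utilized"),
    ((0.90, 0.85), "over-utilized"),
    ((0.60, 0.55), "optimal"),
    ((0.55, 0.60), "optimal"),
    ((0.15, 0.10), "under-utilized"),
    ((0.10, 0.15), "under-utilized"),
]
model = train(training)
level = predict(model, (0.92, 0.88))  # classifies as "over-utilized"
```

Repeated retraining on an updated training set corresponds to the "training" operations on the training data 364 described above.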


Herein, the system configuration optimization engine 150 determines the hardware profile of a source of the configuration parameter value(s), where the hardware profile may influence what normalized, heuristic data is referenced in the classification of the source. For instance, using the ML predictive model 375, the ML modeling logic 365 analyzes portions of the meta-information 130 (e.g., statistics associated with configuration parameters) to classify the malware detection system 1201 (e.g., over-utilized performance level, optimal performance level, or under-utilized performance level). Furthermore, the system configuration optimization engine 150 determines what configuration parameter values are modifiable for adjusting operability of the malware detection system 1201 to operate at the optimal performance level. Thereafter, the system configuration optimization engine 150 iteratively modifies a subset of these configuration parameter values to determine what configuration modifications of the malware detection system 1201 will maintain the system at an optimal performance level.


More specifically, the system configuration optimization engine 150 determines, using the ML predictive model 375, whether the meta-information 130 identifies the malware detection system 1201 as running at a non-optimal performance level and the configuration modification necessary to return the malware detection system 1201 to an optimal performance level. In response to determining that the malware detection system 1201 operates at an over-utilized performance level, the system configuration optimization engine 150 determines one or more configuration parameter values that, if altered, improve the performance level of the malware detection system 1201 and returns update information 900 that would cause the configuration modification at the malware detection system 1201 to occur in real-time. Updates 910 to the ML predictive model 375 are provided to the malware detection system 1201 to analyze the performance level of the malware detection system 1201 locally and in real-time.


In the foregoing description, the invention is described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims.

Claims
  • 1. A computerized method for reconfiguring one or more malware detection systems each performing cybersecurity analyses on incoming data, the method comprising: receiving meta-information including metrics associated with the one or more malware detection systems; determining whether each of the one or more malware detection systems is operating at an optimal performance level by at least determining a correlation between the metrics and a baseline configuration being metrics gathered through experiential knowledge of meta-information associated with known malware detection systems, wherein the baseline configuration comprises utilization statistics including one or more of (i) hardware utilization statistics, (ii) virtual machine utilization statistics, or (iii) software utilization statistics; and generating results, based on the correlation between the metrics and the baseline configuration, provided as feedback to the one or more malware detection systems to update one or more configuration parameters of the one or more malware detection systems.
  • 2. The computerized method of claim 1, wherein the metrics include statistics associated with the one or more configuration parameters.
  • 3. The computerized method of claim 1, wherein the metrics include events monitored during operations of a malware detection system of the one or more malware detection systems.
  • 4. The computerized method of claim 1, wherein each of the one or more malware detection systems being deployed as a cloud-based service and the optimal performance level associated with a malware detection system of the one or more malware detection systems corresponds to a preferred operating state for the malware detection system as measured by any one or more of the hardware utilization statistics, the virtual machine utilization statistics or the software utilization statistics associated with the malware detection system.
  • 5. The computerized method of claim 1 further comprising: sending an alert message that requires an action by an administrator before one or more modified configuration parameters are provided to a malware detection system of the one or more malware detection systems to update the one or more configuration parameters of the malware detection system.
  • 6. The computerized method of claim 1, wherein each of the one or more configuration parameters includes information that partially controls an operating state of a resource deployed within a malware detection system of the one or more malware detection systems.
  • 7. The computerized method of claim 6, wherein a configuration parameter of the one or more configuration parameters is directed to virtual machine characteristics including a virtual machine utilization level.
  • 8. The computerized method of claim 6, wherein the determining of the correlation between the metrics and the baseline configuration is conducted by a predictive model in operation within a system configuration optimization engine.
  • 9. The computerized method of claim 8, wherein the correlation conducted by the predictive model comprises an evaluation of configuration parameter values being part of the metrics with configuration parameter values pertaining to past configurations that are associated with either (i) non-optimal performance levels, (ii) optimal performance levels, or (iii) a combination of non-optimal performance levels and optimal performance levels.
  • 10. The computerized method of claim 8, wherein the determining whether each of the one or more malware detection systems is operating at the optimal performance level comprises determining, by the predictive model, whether a malware detection system of the one or more malware detection system is operating at the optimal performance level that represents a preferred operating state for the malware detection system performing cybersecurity analyses.
  • 11. The computerized method of claim 8, wherein the predictive model is produced by at least training a machine learning model that includes applying different weighting factors to at least one of the one or more configuration parameters.
  • 12. The computerized method of claim 1, wherein the utilization statistics are represented by at least one of (i) a threshold, (ii) a range, (iii) a number, or (iv) an operational bound.
  • 13. The computerized method of claim 1, wherein the one or more configuration parameters include information to at least partially control an operating state of software operating within a malware detection system of the one or more malware detection systems.
  • 14. The computerized method of claim 13, wherein the one or more configuration parameters includes a first configuration parameter directed to an operational setting of the malware detection system.
  • 15. The computerized method of claim 14, wherein the operational setting includes one or more of (i) virtual machine characteristics including a number of active virtual machines used by the malware detection system or (ii) software characteristics including a number of active processes or a number of applications utilized by the active processes.
  • 16. The computerized method of claim 13, wherein the baseline configuration associated with the malware detection system is selected based, at least in part, on a hardware profile for the malware detection system, along with other factors (e.g., threat landscape confronting the customer (for instance, as indicated by the industry protected by the malware detection system), subscription or customer type, etc.).
  • 17. The computerized method of claim 13, wherein the baseline configuration associated with the malware detection system is selected based, at least in part, on a threat landscape confronting a customer associated with the one or more malware detection systems, the threat landscape reflects one or more factors including an industry protected by the malware detection system, a subscription type utilized by the customer, or a type of customer.
  • 18. The computerized method of claim 1, wherein the one or more malware detection systems are deployed within a public cloud network.
  • 19. The computerized method of claim 18, wherein a system configuration optimization engine configured to conduct the receiving, determining and generating operations is deployed within the public cloud network.
  • 20. The computerized method of claim 1, wherein the optimal performance level corresponds to or is associated with a preferred operating state of a malware detection system of the one or more malware detection system as measured by the hardware utilization statistics, the virtual machine utilization statistics, and/or the software utilization statistics.
  • 21. The computerized method of claim 1, wherein the determining whether each of the one or more malware detection systems is operating at the optimal performance level comprises a determination as to a degree of correlation between the metrics associated with the received meta-information and metrics associated with the baseline configuration including metrics associated with one or more of (i) known misconfigured malware detection systems, (ii) known optimal malware detection systems, or (iii) any combination of known misconfigured malware detection systems and known optimal malware detection systems.
  • 22. The computerized method of claim 1, wherein the determining whether each of the one or more malware detection systems is operating at the optimal performance level comprises determining a performance level for at least a malware detection system of the one or more malware detection systems by conducting operations on metrics provided by the malware detection system as part of the meta-information, the operations include assigning weighting to different metrics of the metrics provided by the malware detection system, the weighting to adjust the performance level toward one or more configuration parameters associated with the metrics that are more applicable to representing the performance level of the malware detection system when correlating with the baseline configuration.
  • 23. The computerized method of claim 1, wherein the generating of the results to be provided as feedback to the one or more malware detection systems is conducted to reduce a number of false negatives.
  • 24. The computerized method of claim 1, wherein the hardware utilization statistics includes at least one of processor utilization or an amount or degree of available memory for storage.
  • 25. The computerized method of claim 1, wherein the virtual machine utilization statistics includes information directed to a degree of utilization of one or more virtual machines operating within a selected malware detection system or information directed to identifying a number of virtual machines activated and operational within a selected malware detection system.
  • 26. The computerized method of claim 1, wherein the software utilization statistics include (i) information to determine what processes are running within a selected malware detection system or (ii) statistics associated with the processes or queue length.
  • 27. The computerized method of claim 1, wherein the one or more malware detection systems are part of a cloud service.
  • 28. The computerized method of claim 1 is performed in a cloud service.
  • 29. The computerized method of claim 1, wherein the correlation between the metrics and the baseline configuration is conducted to determine whether a malware detection system of the one or more malware detection systems is operating at a non-optimal performance level, the non-optimal performance level indicating that the malware detection system is operating outside of a desired operating configuration.
  • 30. The computerized method of claim 1, wherein the feedback includes one or more modified configuration parameter values that differ from values of the one or more configuration parameters received as part of the meta-information.
  • 31. The computerized method of claim 30 further comprising: updating the one or more configuration parameters of the one or more malware detection systems with the one or more modified configuration parameter values.
  • 32. The computerized method of claim 31, wherein the updating of the one or more configuration parameters with the one or more modified configuration parameter values occurs in response to receipt of an approval from an administrator.
  • 33. The computerized method of claim 1, wherein the one or more malware detection system includes a first malware detection system and the baseline configuration is based on a profile of the first malware detection system.
  • 34. A system for detecting a cyber-attack comprising: one or more processors; a memory communicatively coupled to the one or more processors, the memory comprises (i) a parser that, upon execution by the one or more processors, receives incoming data from a network device and recovers meta-information associated with one or more configuration parameters, (ii) a machine learning modeling logic that, upon execution by the one or more processors, conducts training on a machine learning model to produce a predictive model being applied to training data including data associated with the recovered meta-information to determine whether the network device is operating at a non-optimal performance level, and (iii) a system health reporting logic that, upon execution by the one or more processors and based on the recovered meta-information, (a) determines whether the network device is operating at the non-optimal performance level, (b) determines one or more configuration parameter values associated with the one or more configuration parameters that, if modifiable, readjusts operability of the network device from the non-optimal performance level to an optimal performance level, and (c) generates a message including information to alter the one or more configuration parameter values at the network device.
  • 35. The system of claim 34, wherein the parser of the memory comprises a first sub-parser configured to parse structured data of the incoming data to recover the meta-information associated with the one or more configuration parameter values.
  • 36. The system of claim 35, wherein the parser of the memory further comprises a second sub-parser configured to parse unstructured data of the incoming data to recover information subsequent to specific keywords being monitored.
  • 37. The system of claim 35, wherein the recovered meta-information including statistics associated with the one or more configuration parameter values that represent a performance level of the network device.
  • 38. The system of claim 37, wherein the parser that, upon execution by the one or more processors, receives incoming data from the network device being a malware detection system including circuitry to detect and analyze incoming objects propagating into or through a local network for malware.
  • 39. The system of claim 35, wherein the system health reporting logic that, upon execution by the one or more processors, determines whether the network device is operating at the non-optimal performance level based on at least a prescribed degree of correlation between the statistics associated with a first configuration parameter of the one or more configuration parameters and the optimal performance level being a preferred operating state for the network device as measured by hardware utilization statistics.
  • 40. The system of claim 35, wherein the system health reporting logic that, upon execution by the one or more processors, determines whether the network device is operating at the non-optimal performance level based on at least a prescribed degree of correlation between the statistics associated with a first configuration parameter of the one or more configuration parameters and the optimal performance level being a preferred operating state for the network device as measured by virtual machine utilization statistics.
  • 41. The system of claim 35, wherein the system health reporting logic that, upon execution by the one or more processors, determines whether the network device is operating at the non-optimal performance level based on at least a prescribed degree of correlation between the statistics associated with a first configuration parameter of the one or more configuration parameters and the optimal performance level being a preferred operating state for the network device as measured by software utilization statistics.
  • 42. The system of claim 34, wherein the system health reporting logic, upon execution by the one or more processors, performs iterative operations including modifying at least one configuration parameter value and analyzing predicted changes in performance level of the source in returning the network device to the optimal performance level.
  • 43. The system of claim 8 further comprising an administrative interface operating as a portal that allows an administrator, after credential exchange and authentication, to access and update logic stored within the memory, the portal allows for accessing and updating logic by at least modifying (i) one or more triggering events or timeout parameters associated with an event/timeout monitoring logic executed by the one or more processors, or (ii) code configured to detect a presence of malware within an object or its association with a cyberattack, or (iii) operational settings associated with the malware detection system, or (iv) software associated with the malware detection system.
  • 44. The system of claim 34, wherein the machine learning modeling logic, upon execution by the one or more processors, conducts training on the machine learning model to produce the predictive model using labeled training data and unlabeled training data.
  • 45. The system of claim 44, wherein the machine learning modeling logic, upon execution by the one or more processors, conducts the supervised training on the machine learning model using the labeled training data to initially train the machine learning model to improve detection accuracy of the machine learning model in determining whether the network device is operating at a non-optimal performance level until a prescribed accuracy is achieved.
  • 46. A system for detecting a cyber-attack comprising: one or more processors; a memory communicatively coupled to the one or more processors, the memory comprises (i) a parser that, upon execution by the one or more processors, receives incoming data from a network device and recovers meta-information associated with one or more configuration parameters and determines a hardware profile of the network device, (ii) a machine learning modeling logic that, upon execution by the one or more processors, processes a machine learning predictive model that determines whether the network device is operating at a non-optimal performance level or an optimal performance level based on the recovered meta-information, and (iii) a system health reporting logic that, upon execution by the one or more processors and utilizes the machine learning predictive model, (a) determines whether the network device is operating at the non-optimal performance level, (b) determines one or more configuration parameter values associated with the one or more configuration parameters that, if modifiable, readjusts operability of the network device from the non-optimal performance level to an optimal performance level, and (c) generates a message including information to alter the one or more configuration parameter values at the network device.
  • 47. The system of claim 46, wherein the system health reporting logic that, upon execution by the one or more processors, further generates an update of the machine learning predictive model to enable the network device to analyze performance level locally and in real time.
  • 48. The system of claim 46, wherein the machine learning modeling logic being configured to conduct training on a machine learning model using training data to produce the machine learning predictive model, the training data includes information associated with the meta-information.
  • 49. The system of claim 48, wherein the information associated with the meta-information includes normalized, heuristic data pertaining to the configuration parameter values.
  • 50. The system of claim 48, wherein the machine learning modeling logic is configured to evaluate operability of the machine learning model by applying different weighting factors for different configuration parameter values of the configuration parameter values received from the parser.
  • 51. The system of claim 46, wherein the machine learning modeling logic is configured to evaluate operability of the machine learning model based on the training data including at least configuration parameter values associated with a machine learning model operating on the system health reporting logic.
7639714 Stolfo et al. Dec 2009 B2
7644441 Schmid et al. Jan 2010 B2
7657419 van der Made Feb 2010 B2
7676841 Sobchuk et al. Mar 2010 B2
7698548 Shelest et al. Apr 2010 B2
7707633 Danford et al. Apr 2010 B2
7712136 Sprosts et al. May 2010 B2
7730011 Deninger et al. Jun 2010 B1
7739740 Nachenberg et al. Jun 2010 B1
7779463 Stolfo et al. Aug 2010 B2
7784097 Stolfo et al. Aug 2010 B1
7832008 Kraemer Nov 2010 B1
7836502 Zhao et al. Nov 2010 B1
7849506 Dansey et al. Dec 2010 B1
7854007 Sprosts et al. Dec 2010 B2
7869073 Oshima Jan 2011 B2
7877803 Enstone et al. Jan 2011 B2
7904959 Sidiroglou et al. Mar 2011 B2
7908660 Bahl Mar 2011 B2
7930738 Petersen Apr 2011 B1
7937387 Frazier et al. May 2011 B2
7937761 Bennett May 2011 B1
7949849 Lowe et al. May 2011 B2
7996556 Raghavan et al. Aug 2011 B2
7996836 McCorkendale et al. Aug 2011 B1
7996904 Chiueh et al. Aug 2011 B1
7996905 Arnold et al. Aug 2011 B2
8006305 Aziz Aug 2011 B2
8010667 Zhang et al. Aug 2011 B2
8020206 Hubbard et al. Sep 2011 B2
8028338 Schneider et al. Sep 2011 B1
8042184 Batenin Oct 2011 B1
8045094 Teragawa Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8069484 McMillan et al. Nov 2011 B2
8074256 Valente et al. Dec 2011 B2
8087086 Lai et al. Dec 2011 B1
8171553 Aziz et al. May 2012 B2
8176049 Deninger et al. May 2012 B2
8176480 Spertus May 2012 B1
8181251 Kennedy May 2012 B2
8201246 Wu et al. Jun 2012 B1
8204984 Aziz et al. Jun 2012 B1
8214905 Doukhvalov et al. Jul 2012 B1
8220055 Kennedy Jul 2012 B1
8225288 Miller et al. Jul 2012 B2
8225373 Kraemer Jul 2012 B2
8233882 Rogel Jul 2012 B2
8234640 Fitzgerald et al. Jul 2012 B1
8234709 Viljoen et al. Jul 2012 B2
8239944 Nachenberg et al. Aug 2012 B1
8260914 Ranjan Sep 2012 B1
8266091 Gubin et al. Sep 2012 B1
8286251 Eker et al. Oct 2012 B2
8291499 Aziz et al. Oct 2012 B2
8307435 Mann et al. Nov 2012 B1
8307443 Wang et al. Nov 2012 B2
8312545 Tuvell et al. Nov 2012 B2
8321936 Green et al. Nov 2012 B1
8321941 Tuvell et al. Nov 2012 B2
8332571 Edwards, Sr. Dec 2012 B1
8365286 Poston Jan 2013 B2
8365297 Parshin et al. Jan 2013 B1
8370938 Daswani et al. Feb 2013 B1
8370939 Zaitsev et al. Feb 2013 B2
8375444 Aziz et al. Feb 2013 B2
8381299 Stolfo et al. Feb 2013 B2
8402529 Green et al. Mar 2013 B1
8464340 Ahn et al. Jun 2013 B2
8468602 McDougal Jun 2013 B2
8479174 Chiriac Jul 2013 B2
8479276 Vaystikh et al. Jul 2013 B1
8479291 Bodke Jul 2013 B1
8510827 Leake et al. Aug 2013 B1
8510828 Guo et al. Aug 2013 B1
8510842 Amit et al. Aug 2013 B2
8516478 Edwards et al. Aug 2013 B1
8516575 Burnside et al. Aug 2013 B2
8516583 Thomas Aug 2013 B2
8516590 Ranadive et al. Aug 2013 B1
8516593 Aziz Aug 2013 B2
8522348 Chen et al. Aug 2013 B2
8528086 Aziz Sep 2013 B1
8533824 Hutton et al. Sep 2013 B2
8539582 Aziz et al. Sep 2013 B1
8549638 Aziz Oct 2013 B2
8555391 Demir et al. Oct 2013 B1
8561177 Aziz et al. Oct 2013 B1
8566476 Shiffer et al. Oct 2013 B2
8566946 Aziz et al. Oct 2013 B1
8584094 Dadhia et al. Nov 2013 B2
8584234 Sobel et al. Nov 2013 B1
8584239 Aziz et al. Nov 2013 B2
8595834 Xie et al. Nov 2013 B2
8627404 McDougal Jan 2014 B2
8627476 Satish et al. Jan 2014 B1
8635079 McDougal Jan 2014 B2
8635696 Aziz Jan 2014 B1
8682054 Xue et al. Mar 2014 B2
8682812 Ranjan Mar 2014 B1
8689333 Aziz Apr 2014 B2
8695096 Zhang Apr 2014 B1
8713631 Pavlyushchik Apr 2014 B1
8713681 Silberman et al. Apr 2014 B2
8726392 McCorkendale et al. May 2014 B1
8739280 Chess et al. May 2014 B2
8776229 Aziz Jul 2014 B1
8782792 Bodke Jul 2014 B1
8789172 Stolfo et al. Jul 2014 B2
8789178 Kejriwal et al. Jul 2014 B2
8793278 Frazier et al. Jul 2014 B2
8793787 Ismael et al. Jul 2014 B2
8805947 Kuzkin et al. Aug 2014 B1
8806629 Cherepov Aug 2014 B1
8806647 Daswani et al. Aug 2014 B1
8832829 Manni et al. Sep 2014 B2
8850570 Ramzan Sep 2014 B1
8850571 Staniford et al. Sep 2014 B2
8881234 Narasimhan et al. Nov 2014 B2
8881271 Butler, II Nov 2014 B2
8881282 Aziz et al. Nov 2014 B1
8898788 Aziz et al. Nov 2014 B1
8935779 Manni et al. Jan 2015 B2
8949257 Shiffer et al. Feb 2015 B2
8984638 Aziz et al. Mar 2015 B1
8990939 Staniford et al. Mar 2015 B2
8990944 Singh et al. Mar 2015 B1
8997219 Staniford et al. Mar 2015 B2
9009822 Ismael et al. Apr 2015 B1
9009823 Ismael et al. Apr 2015 B1
9027135 Aziz May 2015 B1
9071638 Aziz et al. Jun 2015 B1
9104867 Thioux et al. Aug 2015 B1
9106630 Frazier et al. Aug 2015 B2
9106694 Aziz et al. Aug 2015 B2
9118715 Staniford et al. Aug 2015 B2
9159035 Ismael et al. Oct 2015 B1
9171160 Vincent et al. Oct 2015 B2
9176843 Ismael et al. Nov 2015 B1
9189627 Islam Nov 2015 B1
9195829 Goradia et al. Nov 2015 B1
9197664 Aziz et al. Nov 2015 B1
9223972 Vincent et al. Dec 2015 B1
9225740 Ismael et al. Dec 2015 B1
9241010 Bennett et al. Jan 2016 B1
9251343 Vincent et al. Feb 2016 B1
9262635 Paithane et al. Feb 2016 B2
9268936 Butler Feb 2016 B2
9275229 LeMasters Mar 2016 B2
9282109 Aziz et al. Mar 2016 B1
9292686 Ismael et al. Mar 2016 B2
9294486 Chiang Mar 2016 B1
9294501 Mesdaq et al. Mar 2016 B2
9300686 Pidathala et al. Mar 2016 B2
9306960 Aziz Apr 2016 B1
9306974 Aziz et al. Apr 2016 B1
9311479 Manni et al. Apr 2016 B1
9350747 McLarnon May 2016 B2
9355247 Thioux et al. May 2016 B1
9356944 Aziz May 2016 B1
9363280 Rivlin et al. Jun 2016 B1
9367681 Ismael et al. Jun 2016 B1
9398028 Karandikar et al. Jul 2016 B1
9413781 Cunningham et al. Aug 2016 B2
9426071 Caldejon et al. Aug 2016 B1
9430646 Mushtaq et al. Aug 2016 B1
9432389 Khalid et al. Aug 2016 B1
9438613 Paithane et al. Sep 2016 B1
9438622 Staniford et al. Sep 2016 B1
9438623 Thioux et al. Sep 2016 B1
9459901 Jung et al. Oct 2016 B2
9467460 Otvagin et al. Oct 2016 B1
9483644 Paithane et al. Nov 2016 B1
9495180 Ismael Nov 2016 B2
9497213 Thompson et al. Nov 2016 B2
9507935 Ismael et al. Nov 2016 B2
9516057 Aziz Dec 2016 B2
9519782 Aziz et al. Dec 2016 B2
9536091 Paithane et al. Jan 2017 B2
9537972 Edwards et al. Jan 2017 B1
9560059 Islam Jan 2017 B1
9565202 Kindlund et al. Feb 2017 B1
9591015 Amin et al. Mar 2017 B1
9591020 Aziz Mar 2017 B1
9594904 Jain et al. Mar 2017 B1
9594905 Ismael et al. Mar 2017 B1
9594912 Thioux et al. Mar 2017 B1
9609007 Rivlin et al. Mar 2017 B1
9626509 Khalid et al. Apr 2017 B1
9628498 Aziz et al. Apr 2017 B1
9628507 Haq et al. Apr 2017 B2
9633134 Ross Apr 2017 B2
9635039 Islam et al. Apr 2017 B1
9641546 Manni et al. May 2017 B1
9654485 Neumann May 2017 B1
9661009 Karandikar et al. May 2017 B1
9661018 Aziz May 2017 B1
9674298 Edwards et al. Jun 2017 B1
9680862 Ismael et al. Jun 2017 B2
9690606 Ha et al. Jun 2017 B1
9690933 Singh et al. Jun 2017 B1
9690935 Shiffer et al. Jun 2017 B2
9690936 Malik et al. Jun 2017 B1
9716617 Ahuja et al. Jul 2017 B1
9736179 Ismael Aug 2017 B2
9740857 Ismael et al. Aug 2017 B2
9747446 Pidathala et al. Aug 2017 B1
9756074 Aziz et al. Sep 2017 B2
9773112 Rathor et al. Sep 2017 B1
9781144 Otvagin et al. Oct 2017 B1
9787700 Amin et al. Oct 2017 B1
9787706 Otvagin et al. Oct 2017 B1
9792196 Ismael et al. Oct 2017 B1
9824209 Ismael et al. Nov 2017 B1
9824211 Wilson Nov 2017 B2
9824216 Khalid et al. Nov 2017 B1
9825976 Gomez et al. Nov 2017 B1
9825989 Mehra et al. Nov 2017 B1
9838408 Karandikar et al. Dec 2017 B1
9838411 Aziz Dec 2017 B1
9838416 Aziz Dec 2017 B1
9838417 Khalid et al. Dec 2017 B1
9846776 Paithane et al. Dec 2017 B1
9876701 Caldejon et al. Jan 2018 B1
9888016 Amin et al. Feb 2018 B1
9888019 Pidathala et al. Feb 2018 B1
9910988 Vincent et al. Mar 2018 B1
9912644 Cunningham Mar 2018 B2
9912681 Ismael et al. Mar 2018 B1
9912684 Aziz et al. Mar 2018 B1
9912691 Mesdaq et al. Mar 2018 B2
9912698 Thioux et al. Mar 2018 B1
9916440 Paithane et al. Mar 2018 B1
9921978 Chan et al. Mar 2018 B1
9934376 Ismael Apr 2018 B1
9934381 Kindlund et al. Apr 2018 B1
9946568 Ismael et al. Apr 2018 B1
9954890 Staniford et al. Apr 2018 B1
9973531 Thioux May 2018 B1
10002252 Ismael et al. Jun 2018 B2
10019338 Goradia et al. Jul 2018 B1
10019573 Silberman et al. Jul 2018 B2
10025691 Ismael et al. Jul 2018 B1
10025927 Khalid et al. Jul 2018 B1
10027689 Rathor et al. Jul 2018 B1
10027690 Aziz et al. Jul 2018 B2
10027696 Rivlin et al. Jul 2018 B1
10033747 Paithane et al. Jul 2018 B1
10033748 Cunningham et al. Jul 2018 B1
10033753 Islam et al. Jul 2018 B1
10033759 Kabra et al. Jul 2018 B1
10050998 Singh Aug 2018 B1
10057356 Sherman Aug 2018 B2
10068091 Aziz et al. Sep 2018 B1
10075455 Zafar et al. Sep 2018 B2
10083302 Paithane et al. Sep 2018 B1
10084813 Eyada Sep 2018 B2
10089461 Ha et al. Oct 2018 B1
10097573 Aziz Oct 2018 B1
10104102 Neumann Oct 2018 B1
10108446 Steinberg et al. Oct 2018 B1
10121000 Rivlin et al. Nov 2018 B1
10122746 Manni et al. Nov 2018 B1
10133863 Bu et al. Nov 2018 B2
10133866 Kumar et al. Nov 2018 B1
10146810 Shiffer et al. Dec 2018 B2
10148693 Singh et al. Dec 2018 B2
10165000 Aziz et al. Dec 2018 B1
10169585 Pilipenko et al. Jan 2019 B1
10176321 Abbasi et al. Jan 2019 B2
10181029 Ismael et al. Jan 2019 B1
10191861 Steinberg et al. Jan 2019 B1
10192052 Singh et al. Jan 2019 B1
10198574 Thioux et al. Feb 2019 B1
10200384 Mushtaq et al. Feb 2019 B1
10210329 Malik et al. Feb 2019 B1
10216927 Steinberg Feb 2019 B1
10218740 Mesdaq et al. Feb 2019 B1
10242185 Goradia Mar 2019 B1
10432669 Badhwar et al. Oct 2019 B1
10439897 Komarla et al. Oct 2019 B1
10721275 Kung et al. Jul 2020 B2
20010005889 Albrecht Jun 2001 A1
20010047326 Broadbent et al. Nov 2001 A1
20020018903 Kokubo et al. Feb 2002 A1
20020038430 Edwards et al. Mar 2002 A1
20020091819 Melchione et al. Jul 2002 A1
20020095607 Lin-Hendel Jul 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020144156 Copeland Oct 2002 A1
20020162015 Tang Oct 2002 A1
20020166063 Lachman et al. Nov 2002 A1
20020169952 DiSanto et al. Nov 2002 A1
20020184528 Shevenell et al. Dec 2002 A1
20020188887 Largman et al. Dec 2002 A1
20020194490 Halperin et al. Dec 2002 A1
20030021728 Sharpe et al. Jan 2003 A1
20030074578 Ford et al. Apr 2003 A1
20030084318 Schertz May 2003 A1
20030101381 Mateev et al. May 2003 A1
20030115483 Liang Jun 2003 A1
20030126464 McDaniel et al. Jul 2003 A1
20030161265 Cao et al. Aug 2003 A1
20030188190 Aaron et al. Oct 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030200460 Morota et al. Oct 2003 A1
20030212902 van der Made Nov 2003 A1
20030229801 Kouznetsov et al. Dec 2003 A1
20030237000 Denton et al. Dec 2003 A1
20040003323 Bennett et al. Jan 2004 A1
20040006473 Mills et al. Jan 2004 A1
20040015712 Szor Jan 2004 A1
20040019832 Arnold et al. Jan 2004 A1
20040047356 Bauer Mar 2004 A1
20040083408 Spiegel et al. Apr 2004 A1
20040088581 Brawn et al. May 2004 A1
20040093513 Cantrell et al. May 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040117478 Triulzi et al. Jun 2004 A1
20040117624 Brandt et al. Jun 2004 A1
20040128355 Chao et al. Jul 2004 A1
20040165588 Pandya Aug 2004 A1
20040236963 Danford et al. Nov 2004 A1
20040243349 Greifeneder et al. Dec 2004 A1
20040249911 Alkhatib et al. Dec 2004 A1
20040255161 Cavanaugh Dec 2004 A1
20040268147 Wiederin et al. Dec 2004 A1
20050005159 Oliphant Jan 2005 A1
20050021740 Bar et al. Jan 2005 A1
20050033960 Vialen et al. Feb 2005 A1
20050033989 Poletto et al. Feb 2005 A1
20050050148 Mohammadioun et al. Mar 2005 A1
20050086523 Zimmer et al. Apr 2005 A1
20050091513 Mitomo et al. Apr 2005 A1
20050091533 Omote et al. Apr 2005 A1
20050091652 Ross et al. Apr 2005 A1
20050108562 Khazan et al. May 2005 A1
20050114663 Cornell et al. May 2005 A1
20050125195 Brendel Jun 2005 A1
20050149726 Joshi et al. Jul 2005 A1
20050157662 Bingham et al. Jul 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050201297 Peikari Sep 2005 A1
20050210533 Copeland et al. Sep 2005 A1
20050238005 Chen et al. Oct 2005 A1
20050240781 Gassoway Oct 2005 A1
20050262562 Gassoway Nov 2005 A1
20050265331 Stolfo Dec 2005 A1
20050283839 Cowburn Dec 2005 A1
20060010495 Cohen et al. Jan 2006 A1
20060015416 Hoffman et al. Jan 2006 A1
20060015715 Anderson Jan 2006 A1
20060015747 Van de Ven Jan 2006 A1
20060021029 Brickell et al. Jan 2006 A1
20060021054 Costa et al. Jan 2006 A1
20060031476 Mathes et al. Feb 2006 A1
20060047665 Neil Mar 2006 A1
20060070130 Costea et al. Mar 2006 A1
20060075496 Carpenter et al. Apr 2006 A1
20060095968 Portolani et al. May 2006 A1
20060101516 Sudaharan et al. May 2006 A1
20060101517 Banzhof et al. May 2006 A1
20060117385 Mester et al. Jun 2006 A1
20060123477 Raghavan et al. Jun 2006 A1
20060143709 Brooks et al. Jun 2006 A1
20060150249 Gassen et al. Jul 2006 A1
20060161983 Cothrell et al. Jul 2006 A1
20060161987 Levy-Yurista Jul 2006 A1
20060161989 Reshef et al. Jul 2006 A1
20060164199 Glide et al. Jul 2006 A1
20060173992 Weber et al. Aug 2006 A1
20060179147 Tran et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191010 Benjamin Aug 2006 A1
20060221956 Narayan et al. Oct 2006 A1
20060236393 Kramer et al. Oct 2006 A1
20060242709 Seinfeld et al. Oct 2006 A1
20060248519 Jaeger et al. Nov 2006 A1
20060248582 Panjwani et al. Nov 2006 A1
20060251104 Koga Nov 2006 A1
20060288417 Bookbinder et al. Dec 2006 A1
20070006288 Mayfield et al. Jan 2007 A1
20070006313 Porras et al. Jan 2007 A1
20070011174 Takaragi et al. Jan 2007 A1
20070016951 Piccard et al. Jan 2007 A1
20070019286 Kikuchi Jan 2007 A1
20070033645 Jones Feb 2007 A1
20070038943 FitzGerald et al. Feb 2007 A1
20070064689 Shin et al. Mar 2007 A1
20070074169 Chess et al. Mar 2007 A1
20070094730 Bhikkaji et al. Apr 2007 A1
20070101435 Konanka et al. May 2007 A1
20070128855 Cho et al. Jun 2007 A1
20070142030 Sinha et al. Jun 2007 A1
20070143827 Nicodemus et al. Jun 2007 A1
20070156895 Vuong Jul 2007 A1
20070157180 Tillmann et al. Jul 2007 A1
20070157306 Elrod et al. Jul 2007 A1
20070168988 Eisner et al. Jul 2007 A1
20070171824 Ruello et al. Jul 2007 A1
20070174915 Gribble et al. Jul 2007 A1
20070192500 Lum Aug 2007 A1
20070192858 Lum Aug 2007 A1
20070198275 Malden et al. Aug 2007 A1
20070208822 Wang et al. Sep 2007 A1
20070220607 Sprosts et al. Sep 2007 A1
20070240218 Tuvell et al. Oct 2007 A1
20070240219 Tuvell et al. Oct 2007 A1
20070240220 Tuvell et al. Oct 2007 A1
20070240222 Tuvell et al. Oct 2007 A1
20070250930 Aziz et al. Oct 2007 A1
20070256132 Oliphant Nov 2007 A2
20070271446 Nakamura Nov 2007 A1
20080005782 Aziz Jan 2008 A1
20080018122 Zierler et al. Jan 2008 A1
20080028463 Dagon et al. Jan 2008 A1
20080040710 Chiriac Feb 2008 A1
20080046781 Childs et al. Feb 2008 A1
20080066179 Liu Mar 2008 A1
20080072326 Danford et al. Mar 2008 A1
20080077793 Tan et al. Mar 2008 A1
20080080518 Hoeflin et al. Apr 2008 A1
20080086720 Lekel Apr 2008 A1
20080098476 Syversen Apr 2008 A1
20080120722 Sima et al. May 2008 A1
20080134178 Fitzgerald et al. Jun 2008 A1
20080134334 Kim et al. Jun 2008 A1
20080141376 Clausen et al. Jun 2008 A1
20080184367 McMillan et al. Jul 2008 A1
20080184373 Traut et al. Jul 2008 A1
20080189787 Arnold et al. Aug 2008 A1
20080201778 Guo et al. Aug 2008 A1
20080209557 Herley et al. Aug 2008 A1
20080215742 Goldszmidt et al. Sep 2008 A1
20080222729 Chen et al. Sep 2008 A1
20080263665 Ma et al. Oct 2008 A1
20080295172 Bohacek Nov 2008 A1
20080301810 Lehane et al. Dec 2008 A1
20080307524 Singh et al. Dec 2008 A1
20080313738 Enderby Dec 2008 A1
20080320594 Jiang Dec 2008 A1
20090003317 Kasralikar et al. Jan 2009 A1
20090007100 Field et al. Jan 2009 A1
20090013408 Schipka Jan 2009 A1
20090031423 Liu et al. Jan 2009 A1
20090036111 Danford et al. Feb 2009 A1
20090037835 Goldman Feb 2009 A1
20090044024 Oberheide et al. Feb 2009 A1
20090044274 Budko et al. Feb 2009 A1
20090064332 Porras et al. Mar 2009 A1
20090077666 Chen et al. Mar 2009 A1
20090083369 Marmor Mar 2009 A1
20090083855 Apap et al. Mar 2009 A1
20090089879 Wang et al. Apr 2009 A1
20090094697 Provos et al. Apr 2009 A1
20090113425 Ports et al. Apr 2009 A1
20090125976 Wassermann et al. May 2009 A1
20090126015 Monastyrsky et al. May 2009 A1
20090126016 Sobko et al. May 2009 A1
20090133125 Choi et al. May 2009 A1
20090144823 Lamastra et al. Jun 2009 A1
20090158430 Borders Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090187992 Poston Jul 2009 A1
20090193293 Stolfo et al. Jul 2009 A1
20090198651 Shiffer et al. Aug 2009 A1
20090198670 Shiffer et al. Aug 2009 A1
20090198689 Frazier et al. Aug 2009 A1
20090199274 Frazier et al. Aug 2009 A1
20090199296 Xie et al. Aug 2009 A1
20090228233 Anderson et al. Sep 2009 A1
20090241187 Troyansky Sep 2009 A1
20090241190 Todd et al. Sep 2009 A1
20090265692 Godefroid et al. Oct 2009 A1
20090271867 Zhang Oct 2009 A1
20090300415 Zhang et al. Dec 2009 A1
20090300761 Park et al. Dec 2009 A1
20090328185 Berg et al. Dec 2009 A1
20090328221 Blumfield et al. Dec 2009 A1
20100005146 Drako et al. Jan 2010 A1
20100011205 McKenna Jan 2010 A1
20100017546 Poo et al. Jan 2010 A1
20100030996 Butler, II Feb 2010 A1
20100031353 Thomas et al. Feb 2010 A1
20100037314 Perdisci et al. Feb 2010 A1
20100043073 Kuwamura Feb 2010 A1
20100054278 Stolfo et al. Mar 2010 A1
20100058474 Hicks Mar 2010 A1
20100064044 Nonoyama Mar 2010 A1
20100077481 Polyakov et al. Mar 2010 A1
20100083376 Pereira et al. Apr 2010 A1
20100115621 Staniford et al. May 2010 A1
20100132038 Zaitsev May 2010 A1
20100154056 Smith et al. Jun 2010 A1
20100180344 Malyshev et al. Jul 2010 A1
20100192223 Ismael et al. Jul 2010 A1
20100220863 Dupaquis et al. Sep 2010 A1
20100235831 Dittmer Sep 2010 A1
20100251104 Massand Sep 2010 A1
20100281102 Chinta et al. Nov 2010 A1
20100281541 Stolfo et al. Nov 2010 A1
20100281542 Stolfo et al. Nov 2010 A1
20100287260 Peterson et al. Nov 2010 A1
20100299754 Amit et al. Nov 2010 A1
20100306173 Frank Dec 2010 A1
20110004737 Greenebaum Jan 2011 A1
20110025504 Lyon et al. Feb 2011 A1
20110041179 Ståhlberg Feb 2011 A1
20110047594 Mahaffey et al. Feb 2011 A1
20110047620 Mahaffey et al. Feb 2011 A1
20110055907 Narasimhan et al. Mar 2011 A1
20110078794 Manni et al. Mar 2011 A1
20110093951 Aziz Apr 2011 A1
20110099620 Stavrou et al. Apr 2011 A1
20110099633 Aziz Apr 2011 A1
20110099635 Silberman et al. Apr 2011 A1
20110113231 Kaminsky May 2011 A1
20110145918 Jung et al. Jun 2011 A1
20110145920 Mahaffey et al. Jun 2011 A1
20110145934 Abramovici et al. Jun 2011 A1
20110167493 Song et al. Jul 2011 A1
20110167494 Bowen et al. Jul 2011 A1
20110173213 Frazier et al. Jul 2011 A1
20110173460 Ito et al. Jul 2011 A1
20110219449 St. Neitzel et al. Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225624 Sawhney et al. Sep 2011 A1
20110225655 Niemela et al. Sep 2011 A1
20110247072 Staniford et al. Oct 2011 A1
20110265182 Peinado et al. Oct 2011 A1
20110289582 Kejriwal et al. Nov 2011 A1
20110302587 Nishikawa et al. Dec 2011 A1
20110307954 Melnik et al. Dec 2011 A1
20110307955 Kaplan et al. Dec 2011 A1
20110307956 Yermakov et al. Dec 2011 A1
20110314546 Aziz et al. Dec 2011 A1
20120011560 Natarajan et al. Jan 2012 A1
20120023593 Puder et al. Jan 2012 A1
20120054869 Yen et al. Mar 2012 A1
20120066698 Yanoo Mar 2012 A1
20120079596 Thomas et al. Mar 2012 A1
20120084859 Radinsky et al. Apr 2012 A1
20120096553 Srivastava et al. Apr 2012 A1
20120110667 Zubrilin et al. May 2012 A1
20120117652 Manni et al. May 2012 A1
20120121154 Xue et al. May 2012 A1
20120124426 Maybee et al. May 2012 A1
20120174186 Aziz et al. Jul 2012 A1
20120174196 Bhogavilli et al. Jul 2012 A1
20120174218 McCoy et al. Jul 2012 A1
20120198279 Schroeder Aug 2012 A1
20120210423 Friedrichs et al. Aug 2012 A1
20120222121 Staniford et al. Aug 2012 A1
20120255015 Sahita et al. Oct 2012 A1
20120255017 Sallam Oct 2012 A1
20120260342 Dube et al. Oct 2012 A1
20120266244 Green et al. Oct 2012 A1
20120278886 Luna Nov 2012 A1
20120297489 Dequevy Nov 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120331553 Aziz et al. Dec 2012 A1
20130014259 Gribble et al. Jan 2013 A1
20130036472 Aziz Feb 2013 A1
20130047257 Aziz Feb 2013 A1
20130074185 McDougal et al. Mar 2013 A1
20130086684 Mohler Apr 2013 A1
20130097699 Balupari et al. Apr 2013 A1
20130097706 Titonis et al. Apr 2013 A1
20130111587 Goel et al. May 2013 A1
20130117852 Stute May 2013 A1
20130117855 Kim et al. May 2013 A1
20130139264 Brinkley et al. May 2013 A1
20130160125 Likhachev et al. Jun 2013 A1
20130160127 Jeong et al. Jun 2013 A1
20130160130 Mendelev et al. Jun 2013 A1
20130160131 Madou et al. Jun 2013 A1
20130167236 Sick Jun 2013 A1
20130174214 Duncan Jul 2013 A1
20130185789 Hagiwara et al. Jul 2013 A1
20130185795 Winn et al. Jul 2013 A1
20130185798 Saunders et al. Jul 2013 A1
20130191915 Antonakakis et al. Jul 2013 A1
20130196649 Paddon et al. Aug 2013 A1
20130227691 Aziz et al. Aug 2013 A1
20130246370 Bartram et al. Sep 2013 A1
20130247186 LeMasters Sep 2013 A1
20130263260 Mahaffey et al. Oct 2013 A1
20130291109 Staniford et al. Oct 2013 A1
20130298243 Kumar et al. Nov 2013 A1
20130318038 Shiffer et al. Nov 2013 A1
20130318073 Shiffer et al. Nov 2013 A1
20130325791 Shiffer et al. Dec 2013 A1
20130325792 Shiffer et al. Dec 2013 A1
20130325871 Shiffer et al. Dec 2013 A1
20130325872 Shiffer et al. Dec 2013 A1
20140032875 Butler Jan 2014 A1
20140053260 Gupta et al. Feb 2014 A1
20140053261 Gupta et al. Feb 2014 A1
20140130158 Wang et al. May 2014 A1
20140137180 Lukacs et al. May 2014 A1
20140169762 Ryu Jun 2014 A1
20140179360 Jackson et al. Jun 2014 A1
20140181131 Ross Jun 2014 A1
20140189687 Jung et al. Jul 2014 A1
20140189866 Shiffer et al. Jul 2014 A1
20140189882 Jung et al. Jul 2014 A1
20140237600 Silberman et al. Aug 2014 A1
20140280245 Wilson Sep 2014 A1
20140283037 Sikorski et al. Sep 2014 A1
20140283063 Thompson et al. Sep 2014 A1
20140328204 Klotsche et al. Nov 2014 A1
20140337836 Ismael Nov 2014 A1
20140344926 Cunningham et al. Nov 2014 A1
20140351935 Shao et al. Nov 2014 A1
20140380473 Bu et al. Dec 2014 A1
20140380474 Paithane et al. Dec 2014 A1
20150007312 Pidathala et al. Jan 2015 A1
20150096022 Vincent et al. Apr 2015 A1
20150096023 Mesdaq et al. Apr 2015 A1
20150096024 Haq et al. Apr 2015 A1
20150096025 Ismael Apr 2015 A1
20150180886 Staniford et al. Jun 2015 A1
20150186645 Aziz et al. Jul 2015 A1
20150199513 Ismael et al. Jul 2015 A1
20150199531 Ismael et al. Jul 2015 A1
20150199532 Ismael et al. Jul 2015 A1
20150220735 Paithane et al. Aug 2015 A1
20150372980 Eyada Dec 2015 A1
20160004869 Ismael et al. Jan 2016 A1
20160006756 Ismael et al. Jan 2016 A1
20160044000 Cunningham Feb 2016 A1
20160127393 Aziz et al. May 2016 A1
20160191547 Zafar et al. Jun 2016 A1
20160191550 Ismael et al. Jun 2016 A1
20160224787 Guy Aug 2016 A1
20160261612 Mesdaq et al. Sep 2016 A1
20160285914 Singh et al. Sep 2016 A1
20160301703 Aziz Oct 2016 A1
20160314491 Shani et al. Oct 2016 A1
20160335110 Paithane et al. Nov 2016 A1
20170083703 Abbasi et al. Mar 2017 A1
20180013770 Ismael Jan 2018 A1
20180048660 Paithane et al. Feb 2018 A1
20180121316 Ismael et al. May 2018 A1
20180234459 Kung et al. Aug 2018 A1
20180288077 Siddiqui et al. Oct 2018 A1
20190327271 Saxena et al. Oct 2019 A1
Foreign Referenced Citations (11)
Number Date Country
2439806 Jan 2008 GB
2490431 Oct 2012 GB
0206928 Jan 2002 WO
0223805 Mar 2002 WO
2007117636 Oct 2007 WO
2008041950 Apr 2008 WO
2011084431 Jul 2011 WO
2011112348 Sep 2011 WO
2012075336 Jun 2012 WO
2012145066 Oct 2012 WO
2013067505 May 2013 WO
Non-Patent Literature Citations (60)
Entry
“Mining Specification of Malicious Behavior”—Jha et al., UCSB, Sep. 2007, https://www.cs.ucsb.edu/~chris/research/doc/esec07_mining.pdf.
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003).
“When Virtual is Better Than Real”, IEEE Xplore Digital Library, available at http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=990073, (Dec. 7, 2013).
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108.
Adetoye, Adedayo , et al., “Network Intrusion Detection & Response System”, (“Adetoye”), (Sep. 2003).
Apostolopoulos, George; Hassapis, Constantinos; “V-eM: A Cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation”, 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126.
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006.
Baecher, “The Nepenthes Platform: An Efficient Approach to collect Malware”, Springer-verlag Berlin Heidelberg, (2006), pp. 165-184.
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77.
Boubalos, Chris , “extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives”, available at http://seclists.org/honeypots/2003/q2/319 (“Boubalos”), (Jun. 5, 2003).
Chaudet, C. , et al., “Optimal Positioning of Active and Passive Monitoring Devices”, International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulousse, France, (Oct. 2005), pp. 71-82.
Chen, P. M. and Noble, B. D., “When Virtual is Better Than Real, Department of Electrical Engineering and Computer Science”, University of Michigan (“Chen”) (2001).
Cisco “Intrusion Prevention for the Cisco ASA 5500-x Series” Data Sheet (2012).
Cohen, M.I. , “PyFlag—An advanced network forensic framework”, Digital investigation 5, Elsevier, (2008), pp. S112-S120.
Costa, M. , et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005).
Didier Stevens, “Malicious PDF Documents Explained”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 9, No. 1, Jan. 1, 2011, pp. 80-82, XP011329453, ISSN: 1540-7993, DOI: 10.1109/MSP.2011.14.
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007).
Dunlap, George W. , et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceeding of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002).
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010.
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010.
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011.
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28.
Gregg Keizer: “Microsoft's HoneyMonkeys Show Patching Windows Works”, Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:http://www.informationweek.com/microsofts-honeymonkeys-show-patching-windows-works/d/d-id/1035069? [retrieved on Jun. 1, 2016].
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase @ CMU, Carnegie Mellon University, 2007.
Hiroshi Shinotsuka, Malware Authors Using New Techniques to Evade Automated Threat Analysis Systems, Oct. 26, 2012, http://www.symantec.com/connect/blogs/, pp. 1-4.
Idika et al., A-Survey-of-Malware-Detection-Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University.
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. “Kernel-based behavior analysis for android malware detection.” Computational intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011.
Kaeo, Merike , “Designing Network Security”, (“Kaeo”), (Nov. 2003).
Kevin A Roundy et al: “Hybrid Analysis and Control of Malware”, Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454 ISBN:978-3-642-15511-6.
Khaled Salah et al: “Using Cloud Computing to Implement a Security Overlay Network”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 11, No. 1, Jan. 1, 2013 (Jan. 1, 2013).
Kim, H. , et al., “Autograph: Toward Automated, Distributed Worm Signature Detection”, Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286.
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”), (2003).
Kreibich, C., et al., “Honeycomb-Creating Intrusion Detection Signatures Using Honeypots”, 2nd Workshop on Hot Topics in Networks (HotNets-II), Boston, USA, (2003).
Kristoff, J. , “Botnets, Detection and Mitigation: DNS-Based Techniques”, NU Security Day, (2005), 23 pages.
Lastline Labs, The Threat of Evasive Malware, Feb. 25, 2013, Lastline Labs, pp. 1-8.
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711.
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011.
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001).
Moore, D. , et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910.
Morales, Jose A., et al., “Analyzing and exploiting network behaviors of malware.”, Security and Privacy in Communication Networks. Springer Berlin Heidelberg, 2010. 20-34.
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg.
Natvig, Kurt , “SANDBOXII: Internet”, Virus Bulletin Conference, (“Natvig”), (Sep. 2002).
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987.
Newsome, J. , et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software”, In Proceedings of the 12th Annual Network and Distributed System Security, Symposium (NDSS '05), (Feb. 2005).
Nojiri, D. , et al., “Cooperation Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302.
Oberheide et al., "CloudAV: N-Version Antivirus in the Network Cloud", 17th USENIX Security Symposium (USENIX Security '08), Jul. 28-Aug. 1, 2008, San Jose, CA.
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, Stefan Berger, "sHype: Secure Hypervisor Approach to Trusted Virtualized Systems", (Feb. 2, 2005) ("Sailer").
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25.
Singh, S., et al., "Automated Worm Fingerprinting", Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004).
Thomas H. Ptacek and Timothy N. Newsham, "Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection", Secure Networks, ("Ptacek"), (Jan. 1998).
Venezia, Paul, "NetDetector Captures Intrusions", InfoWorld Issue 27, ("Venezia"), (Jul. 14, 2003).
Vladimir Getov, "Security as a Service in Smart Clouds—Opportunities and Concerns", Computer Software and Applications Conference (COMPSAC), 2012 IEEE 36th Annual, IEEE, (Jul. 16, 2012).
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350.
Whyte, et al., "DNS-Based Detection of Scanning Worms in an Enterprise Network", Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages.
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9.
Yuhei Kawakoya et al., "Memory behavior-based automatic malware unpacking in stealth debugging environment", Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN: 978-1-4244-9353-1.
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82.
MacDonald, Neil et al. “How to make Cloud IaaS Workloads More Secure Than Your Own Data Center” Gartner Research ID: G00300337, https://www.gartner.com/en/documents/3352444/how-to-make-cloud-iaas-workloads-more-secure-than-your-o, last accessed May 13, 2020.
U.S. Appl. No. 15/878,386, filed Jan. 23, 2018, Non-Final Office Action dated Nov. 12, 2019.
U.S. Appl. No. 15/878,386, filed Jan. 23, 2018, Notice of Allowance dated Mar. 3, 2020.