Embodiments of the disclosure relate to the field of cybersecurity. More specifically, certain embodiments of the disclosure relate to a system, apparatus and method for updating and concurrently managing (sometimes referred to as seamlessly updating) multiple threat analyzers for cybersecurity systems.
Over the last decade, malicious software (malware) has become a pervasive problem for Internet users. Often, malware is a program or file embedded within downloadable content and designed to adversely influence (e.g., attack) normal operations of a computer. Examples of different types of malware may include bots, computer viruses, worms, Trojan horses, spyware, adware, or any other programming that operates within the computer without permission.
For example, malware may be embedded within objects hosted on a web site, which can be subsequently downloaded onto a user's computing device in response to requesting a web page from the web site containing those objects. Once downloaded and processed, this malware (i.e., malicious content) can, without the user's knowledge, download and install additional malicious objects that can further a cyberattack on the computing device. Similarly, malware may also be installed on a computing device upon receipt, or opening, of an electronic mail (email) message. For example, an email message may contain an attachment, such as a Portable Document Format (PDF) document, containing embedded executable malware. Further, malware may exist in computer files infected through any of a variety of attack vectors, which can then be uploaded from the infected computer onto other network devices, thereby furthering the spread of the malware.
Recently, various types of cybersecurity methods have been deployed that attempt to find malware within files or other network content. Often, these methods evaluate objects suspected of being a threat (i.e., malicious) and determine whether each object is a cybersecurity threat or not a threat (i.e., benign). Malware detection systems can utilize, for example, software to inspect processes, stored files and network content (e.g., objects of email, webpages, network traffic) in real-time or near real-time on a variety of devices, including endpoint devices of a network. In many cases, indicators such as signatures may be utilized by the systems to detect suspicious objects and other artifacts of cyberattacks.
As malware adapts and evolves in response to existing cybersecurity measures and new cyberattacks are launched, the techniques utilized in such detection, and the indicators used, must also be updated. Such updating can be achieved by a remote updating process over a network connection, or through a direct update delivered via a physical interconnect such as Universal Serial Bus (“USB”). In many systems, updating can require a short, yet discrete, period of downtime to complete, requiring the cybersecurity system to cease functioning. This downtime can lead to potential lapses in detection of cyberthreats by the cybersecurity system during the update process. Additionally, conventional update processes can create a queue of objects to be evaluated and/or allow certain objects to go unevaluated for threats, leading to an increased cybersecurity risk.
Conventional systems for updating endpoints typically perform these updates by bringing the currently running analyses to an end and then performing any updates to the necessary components within the system. During the update process, the endpoint device cannot process newly received suspicious objects. In this condition, the endpoint device must either delay cybersecurity analyses on these suspicious objects until the update process is complete or ignore the suspicious objects entirely. In the former situation, the endpoint device may halt processing temporarily until the update process is completed, which can create bottlenecks and inefficiencies. In the latter situation, the endpoint device is left vulnerable to attacks, as newly received suspicious objects are ignored and not processed. Other cybersecurity systems may wait to apply the update until a reboot or other shutdown process is occurring so as not to interfere with normal operation of the endpoint device. In these instances, the endpoint device may be left vulnerable to cyberattacks that would have been detected had the update been applied immediately.
Embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
Based on the problems identified above, there is a need for a cybersecurity system that may be updated without creating a break in the analysis of suspicious objects. Various embodiments of the disclosure relate to a system or method configured to concurrently manage cybersecurity analyzers, such as those resident on a protected endpoint device, during updates. The methods and systems disclosed herein utilize at least two cybersecurity analyzers, wherein a first analyzer is operable to protect an endpoint device and, during an update process, at least a second analyzer is launched (i.e., generated) to protect the endpoint device. The second, generated analyzer may be a software instance of the first analyzer, but the second analyzer includes or is modified with at least some of the updated functions, features, or data received from the update process. In some embodiments, the two analyzers may constitute different instances (i.e., processes) of a single analysis program, sharing some, but not necessarily all, of the components of the analysis program. Once the second analyzer is generated successfully, a queue or queues (i.e., ordered storage for objects or object identifiers (IDs)) of the cybersecurity system that manages the inflow of suspicious objects for processing can direct (i.e., make available) all subsequently received suspicious objects to the updated second analyzer instead of the first analyzer.
The components (e.g., algorithms, functions, resources, memory, etc.) shared by the first and second analyzers may include the core logic or engines of the analysis program that perform analysis. The components that differ between the first and second analyzers (that is, the just-mentioned functions, features, or data) may include the cybersecurity analytics content (sometimes referred to as security content), which, in various embodiments of the disclosure, includes, for example, one or more sets of rules. These rule sets may include detection rules, which, when applied, may be used in detecting specific indicators (e.g., artifacts or characteristics of the object, such as compliance with applicable communication protocols, or its behavior during processing) that may be associated with suspiciousness, maliciousness or at least unwanted content/activities. The rule set may also include classification rules that may be used in determining whether the detected indicators correspond (e.g., correlate) to known indicators of a cyberattack sufficiently to allow the analyzer to declare (and report) the object as malicious, that is, a threat associated with a cyberattack.
In some embodiments, queued items (e.g., objects or object IDs) already in the queue for the first analyzer are allowed to finish under the original, non-updated cybersecurity analytics content, e.g., a non-updated rule set applied by the first analyzer. In response to the queue for the first analyzer being depleted (i.e., drained), the first analyzer is terminated or otherwise marked for reclamation via typical memory management methods. At that point, analysis of any additionally received objects may be carried out by the second analyzer, using the updated set of rules or other security content. In some embodiments, once the second analyzer has been successfully created, some or all of the remaining queue (e.g., objects or object IDs) of the first analyzer may be transferred to the second analyzer if doing so would not create a significant gap in processing or other degradation in service. In yet other embodiments, the remaining suspicious objects in the first queue may be duplicated within a second queue, allowing for comparisons to be made between the results of analysis by the first analyzer applying a first set of rules or other cybersecurity analytics content and the results of analysis by the second analyzer applying a second set of rules or other cybersecurity analytics content. As a result, the cybersecurity system or subsystem can generate comparison or correlation data that may aid in the detection of cybersecurity threats or be sent to a cybersecurity analyst for troubleshooting and analysis of the first and second cybersecurity analytics content. Thus, the system can be configured to manage the queue in such a way that the threat detection process is not paused, stopped or otherwise interrupted by the update process.
As an illustrative example, a server may signal to a cybersecurity agent, namely logic operating on an endpoint device supporting rule-based cybersecurity analytics, that an update is available. The cybersecurity agent receives the cybersecurity analytics content update from the server and identifies the appropriate analyzer associated with the updated rule content. The cybersecurity agent initializes a second analyzer and configures its operation with the updated rule content while allowing, in the meantime, the first analyzer to continue operating with current, pre-update rule content. Once the cybersecurity agent initializes the second analyzer, objects to be analyzed may be “routed” (i.e., sent or otherwise made available) to the second analyzer while no further objects are provided to the first analyzer. Once the first analyzer completes processing of the objects being analyzed, the cybersecurity agent terminates (i.e., exits) the first analyzer while the second analyzer remains in operation. In this way, the cybersecurity protection provided to the endpoint device does not experience downtime in its processing of objects during the update process.
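The handover just described can be pictured with a short sketch. The following Python code is illustrative only and assumes a simple threaded agent; the names Analyzer, CybersecurityAgent and apply_update are hypothetical stand-ins for the analyzer logic and cybersecurity agent of this disclosure, not an implementation of them.

```python
import queue
import threading

class Analyzer:
    """Hypothetical analyzer: applies a rule set to objects from its own queue on a worker thread."""

    def __init__(self, name, rules):
        self.name = name
        self.rules = rules            # the cybersecurity analytics content (e.g., detection rules)
        self.objects = queue.Queue()  # per-analyzer object queue
        self.accepting = True
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def enqueue(self, obj):
        if self.accepting:
            self.objects.put(obj)

    def _run(self):
        while True:
            obj = self.objects.get()
            if obj is None:           # sentinel: the queue has drained and the analyzer retires
                return
            verdict = "malicious" if any(rule in obj for rule in self.rules) else "benign"
            print(f"{self.name}: {obj!r} -> {verdict}")

    def retire(self):
        """Stop accepting new objects, drain what is already queued, then exit the worker."""
        self.accepting = False
        self.objects.put(None)
        self._thread.join()

class CybersecurityAgent:
    """Routes suspicious objects to whichever analyzer is currently active."""

    def __init__(self, rules):
        self.active = Analyzer("analyzer-1", rules)

    def inspect(self, obj):
        self.active.enqueue(obj)

    def apply_update(self, updated_rules):
        # Launch the updated analyzer first, redirect new objects to it, and only then
        # let the prior analyzer finish its queued work and terminate.
        previous, self.active = self.active, Analyzer("analyzer-2", updated_rules)
        previous.retire()

if __name__ == "__main__":
    agent = CybersecurityAgent(rules=["evil.exe"])
    agent.inspect("invoice.pdf")
    agent.inspect("evil.exe dropper")
    agent.apply_update(updated_rules=["evil.exe", "new-threat.js"])
    agent.inspect("new-threat.js payload")  # analyzed under the updated rule set
    agent.active.retire()                   # drain and shut down cleanly for the demo
```

Launching the replacement before retiring the original is what keeps the object flow uninterrupted in this sketch: new objects are routed to the updated analyzer the moment it exists, while the original simply drains whatever it had already accepted.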
The cybersecurity analytics content may include a set of core data that can include, but is not limited to, new indicators, new rules, new methods of operation for pre-configured analyzers within the cybersecurity system, new analyzers that can be generated via an analyzer generation logic, an updated analyzer generation logic that can parse and select analyzers based on the type of object and associated context data, and/or updated thresholds and reporting rules for threat evaluation and remedial actions.
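For illustration, such a content update might be represented as a simple structured package. The field names below (indicators, detection_rules, thresholds, and so on) are assumptions made for this sketch and do not reflect a defined format of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ContentUpdate:
    """Illustrative cybersecurity analytics content update; field names are assumptions."""
    version: str
    indicators: list[str] = field(default_factory=list)            # new indicators (e.g., signatures)
    detection_rules: list[str] = field(default_factory=list)       # rules for spotting suspicious indicators
    classification_rules: list[str] = field(default_factory=list)  # rules correlating indicators to cyberattacks
    new_analyzer_types: list[str] = field(default_factory=list)    # analyzers the generation logic may now produce
    thresholds: dict[str, float] = field(default_factory=dict)     # updated thresholds for threat evaluation
    reporting_rules: list[str] = field(default_factory=list)       # updated reporting and remediation rules

update = ContentUpdate(
    version="2019.03",
    indicators=["hash:0000...0000", "bad-domain.example"],          # placeholder values
    detection_rules=["flag macro-enabled attachments"],
    thresholds={"malicious_score": 0.8},
)
print(update.version, len(update.indicators), "indicators")
```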
These analytics updates may be provided periodically or aperiodically in response to detecting new threats or other security concerns. In many embodiments, the analytics updates may be provided by a threat detection vendor. Alternatively, in certain embodiments, cybersecurity analytics updates may be created by an enterprise or other organization and distributed on a private network within and for use by the organization.
One practical application of the invention is to reduce or eliminate downtime in threat analysis during the update period. In this way, the threat analysis logic, such as inspection logic, is maintained and the protected devices are less vulnerable to cyberattacks. Unlike conventional fail-safe systems that simply utilize redundant components to avoid single points of failure, embodiments disclosed herein allow the concurrent analyzer management and cybersecurity analytics update process to occur on a single endpoint device, such as through a plurality of differentiated (updated and non-updated) analyzer processes, without the need for additional systems. By having such a method and system in place, solutions can be provided that allow users to reduce costs and spend less time planning and scheduling updates on a system or within a network.
In the following description, certain terminology is used to describe features of the invention. For example, in certain situations, the term “logic” is representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, logic may include circuitry such as processing circuitry (e.g., a microprocessor, one or more processor cores, a programmable gate array, a microcontroller, an application specific integrated circuit, etc.), wireless receiver, transmitter and/or transceiver circuitry, semiconductor memory, combinatorial logic, or other types of electronic components.
As software, logic may be in the form of one or more software modules, such as executable code in the form of an executable application, an application programming interface (API), a subroutine, a function, a procedure, an applet, a servlet, a routine, source code, object code, a shared library/dynamic load library, or one or more instructions. These software modules may be stored in any type of suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of non-transitory storage medium may include, but are not limited or restricted to, a programmable circuit; a semiconductor memory; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the executable code is stored in persistent storage.
The term “network content” generally refers to information transmitted over a network as one or more messages, which generally refer to signaling (wired or wireless) as either information placed in a prescribed format and transmitted in accordance with a suitable delivery protocol or information made accessible through a logical data structure such as an API. Examples of the delivery protocol include, but are not limited or restricted to, HTTP (Hypertext Transfer Protocol); HTTPS (HTTP Secure); Simple Mail Transfer Protocol (SMTP); File Transfer Protocol (FTP); iMessage; Instant Message Access Protocol (IMAP); or the like. Hence, each message may be in the form of one or more packets, frames, or any other series of bits having the prescribed, structured format.
Yet another example of network content includes one or more files that are transferred using a data transfer protocol such as File Transfer Protocol (FTP) for subsequent storage on a file share. Where the network content is email, Instant Message or a file, the header may include the sender/recipient address, the sender/recipient phone number, or a targeted network location of the file, respectively.
The term “malware” is directed to software that produces an undesirable behavior upon execution, where the behavior is deemed to be “undesirable” based on customer-specific rules, manufacturer-based rules, or any other type of rules formulated by public opinion or a particular governmental or commercial entity. This undesired behavior may include a communication-based anomaly or an execution-based anomaly that (1) alters the functionality of an electronic device executing that application software in a malicious manner; (2) alters the functionality of an electronic device executing that application software without any malicious intent; and/or (3) provides an unwanted functionality which is generally acceptable in other contexts.
The term “object” generally refers to content in the form of an item of information having a logical structure or organization that enables it to be classified for purposes of analysis for malware. One example of the object may include an email message or a portion of the email message. Another example of the object may include a storage file or a document such as a Portable Document Format (PDF) document, a word processing document such as a Word® document, or other information that may be subjected to cybersecurity analysis. The object may also include an executable such as an application, program, code segment, a script, dynamic link library “dll,” URL link, or any other element having a format that can be directly executed or interpreted by logic within the electronic device.
Logic may be software that includes code being one or more instructions, commands or other data structures that, when processed (e.g., executed), perform a particular operation or a series of operations. Examples of software include an application, a process, an instance, an Application Programming Interface (API), a subroutine, a plug-in, a function, an applet, a servlet, a routine, source code, object code, a shared library/dynamic link library (dll), or a collection of HTML elements. This software may be stored in any type of suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of non-transitory storage medium may include, but are not limited or restricted to, a programmable circuit; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); or persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the logic may be stored in persistent storage.
The term “transmission medium” may constitute a physical or virtual communication link between two or more systems. For instance, the transmission medium may correspond to a physical (wired or wireless) communication link between two or more network devices (defined below). The physical communication link may include electrical wiring, optical fiber, cable, bus trace, or a wireless channel using infrared, radio frequency (RF), or any other wired/wireless signaling mechanism. Alternatively, the transmission medium may correspond to a virtual communication link that provides logical connectivity between different logic (e.g. software modules).
The term “network device” should be generally construed as electronics with data processing capability and/or a capability of connecting to any type of network, such as a public network (e.g., Internet), a private network (e.g., a wireless data telecommunication network, a local area network “LAN”, etc.), or a combination of networks. Examples of a network device may include, but are not limited or restricted to, the following: a server, a mainframe, a cybersecurity device, a firewall, a router; an info-entertainment device, industrial controllers, vehicles, or an endpoint device (e.g., a laptop, a smartphone, a tablet, a desktop computer, a netbook, gaming console, a medical device, or any general-purpose or special-purpose, user-controlled electronic device).
Lastly, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
As this invention is susceptible to embodiments in many different forms, it is intended that the present disclosure be considered as an example of the principles of the invention and not intended to limit the invention to the specific embodiments shown and described.
Referring to FIG. 1, an exemplary embodiment of a cybersecurity analyzer update and concurrent management system 100 is shown.
The content update system 130 is depicted in FIG. 1 as communicating over the network 120 with the network devices 115, 125, 135 and the cybersecurity appliance 110, and may include a content repository data store 131.
In many embodiments, the cybersecurity agent 116 and cybersecurity appliance 110 may be configured to perform cybersecurity threat analytics, such as inspecting ingress data traffic, identifying whether any artifacts of the data traffic may include malware, and if so, analyzing at least those artifacts. In certain embodiments, these analytics may be conducted in a virtual machine to detect anomalous behaviors that would be present if the data traffic was actually processed by network devices 115, 125, 135. As a result of the inspections done by the cybersecurity agent 116 and/or the cybersecurity appliance 110, artifacts and objects can be determined to either be malicious or benign.
In a variety of embodiments, when new threats or methods of detection have been developed, data to update the method of operation or particular parameters used in the analytics (“cybersecurity analytics content data”) may be developed and stored in the content repository data store 131 for transmission over the network 120 to network-connected endpoint devices (having cybersecurity agents 116) or cybersecurity appliances 110. The delivery of the updated cybersecurity analytics content data may be automatic (e.g., periodic, or responsive to a particular trigger such as a prescribed number of updates available, a prescribed time, hours elapsed, etc.) or manually triggered by an administrator. In certain embodiments, the updated cybersecurity analytics content data can be provided in an offline process such as, but not limited to, presenting the updated cybersecurity analytics content data in the form of a drive coupled to the network device 115, 125, 135, or cybersecurity appliance 110. The cybersecurity analytics content data may comprise various data packages as needed, including functions, applications, data files, and/or databases. As those skilled in the art will recognize, a variety of data formats may be utilized to transmit cybersecurity analytics content update data.
In many embodiments, the creation or reception of updated cybersecurity analytics content data will begin the analyzer update process within the cybersecurity agent 116 or cybersecurity appliance 110. The structure of the cybersecurity agents 116 and cybersecurity appliances 110 is discussed in more detail below in conjunction with FIG. 2.
As those skilled in the art will recognize, the specific naming, order, and number of devices and logics may vary within the cybersecurity analyzer update and concurrent management system 100 based on the desired application. It is contemplated that the content update system 130 may be deployed as a device but may also be implemented in a cloud computing service for delivery and distribution of the updated cybersecurity analytics content data as described. Furthermore, it is also contemplated that the functionality of one or more content update systems 130 may be incorporated into another management system when malware updating is to be conducted at a centralized resource. By way of example and not limitation, a private organization may incorporate a content update system 130 to distribute updated cybersecurity analytics content data developed on its own cybersecurity systems.
Referring now to FIG. 2, an embodiment of a network device 200 comprising a cybersecurity agent 116 is shown.
Within the persistent memory system 220 of the embodiment depicted in FIG. 2, the cybersecurity agent 116 may comprise an inspection logic 221, a reporting logic 222, an analyzer logic 223, an update logic 224, and an external interface logic 225.
In a number of embodiments, the update logic 224 can be utilized by the network device 115 to receive and manage updated cybersecurity analytics content data that is then stored within the security content data store 240. This process may be effected through the use of a user interface (e.g., a graphical user interface or a network interface to a remote administrator console), represented by input/output 230, that prompts the user/system administrator for authorization to commence an update. In certain embodiments, the cybersecurity analyzer update process may automatically be authorized or executed without any user input. Additionally, the choice between manual and automatic updating may be evaluated by the update logic 224 through one of a number of predetermined or administrator-customizable rules (which may be included in the security content data and themselves subject to update). Initiation of the update process can be based on a number of factors, including, but not limited to, the type of threat, the number of evaluations done over a given period of time, the time of day at which generation occurs, and/or the security level of the given user.
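As a concrete illustration of such rule-based gating, a minimal sketch follows; the predicate functions, context keys, and threshold values are hypothetical and only stand in for the predetermined or administrator-customizable rules mentioned above.

```python
def authorize_update(context, rules):
    """Return True to apply the update automatically, False to prompt the administrator.

    `context` describes the pending update and environment; `rules` is a list of
    predicates, all of which must pass for automatic authorization. Both are illustrative.
    """
    return all(rule(context) for rule in rules)

# Hypothetical administrator-customizable rules.
auto_update_rules = [
    lambda c: c["threat_severity"] >= 7,         # only auto-apply for high-severity threats
    lambda c: c["evaluations_last_hour"] < 500,  # avoid updating while under heavy analysis load
    lambda c: 2 <= c["hour_of_day"] <= 5,        # prefer off-peak hours
]

context = {"threat_severity": 9, "evaluations_last_hour": 120, "hour_of_day": 3}
print("auto-update" if authorize_update(context, auto_update_rules) else "prompt administrator")
```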
In further embodiments, the inspection logic 221 can be utilized to inspect objects for threats. In certain embodiments, the inspection logic 221 may facilitate the updating process in lieu of the update logic 224 which may not be present in said embodiments. In additional embodiments, the inspection logic 221 may inspect network content for threats or select which objects may need to be further inspected for threats. In some embodiments, the inspection logic 221 may receive an object for inspection and communicate with the analyzer logic 223, which can generate an analyzer based on the type of content (e.g., network traffic, email, documents, etc.) that needs to be inspected. In some embodiments, the inspection logic 221 may be configured such that no changes are needed to the system for the inspection logic 221 to facilitate the cybersecurity analyzer update and concurrent management process. In this way, the cybersecurity analyzer update and concurrent management system 100 may be deployed more easily on legacy systems. In still further embodiments, the inspection logic 221 may be configured to manage the queueing process for the objects that are to be inspected for threats.
In a variety of embodiments, the inspection logic 221 may also facilitate selection of suspicious network content and objects for further inspection. The inspection logic 221 can be configured to store and otherwise utilize indicators such as signatures or other identifying characteristics of objects in order to better facilitate selection and detection of items that require further analysis for potential threats. In some embodiments, the inspection logic 221 may evaluate indicators of compromise that may or may not be determinative of the classification of the object along with characteristics of the object including associated communication protocol anomalies. In additional embodiments, the inspection logic 221 may be updated through the analyzer update process to include new methods, triggers, or signatures for improved inspection.
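A minimal sketch of indicator-based pre-selection of this kind is shown below, assuming indicators reduce to hash lookups and byte-pattern matches; the specific markers and the placeholder digest are illustrative, not indicators defined by this disclosure.

```python
import hashlib

KNOWN_BAD_HASHES = {"0" * 64}                                 # placeholder SHA-256 digests of known malware
SUSPICIOUS_MARKERS = [b"powershell -enc", b"<script>eval("]   # illustrative byte patterns

def needs_further_analysis(payload: bytes) -> bool:
    """Flag an object for deeper analysis if any stored indicator matches (illustrative only)."""
    if hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES:
        return True
    return any(marker in payload for marker in SUSPICIOUS_MARKERS)

print(needs_further_analysis(b"quarterly report text"))        # False
print(needs_further_analysis(b"cmd /c powershell -enc AAAA"))  # True
```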
In many embodiments, reporting logic 222 can be utilized to generate reports, display screens (e.g., on administrator consoles), emails, or other communications via the input/output interface 230 to the user/system administrator informing them of the results of the threat inspections. In further embodiments, the reporting logic 222 may trigger an action (e.g., to remediate a threat) based on a predetermined or customizable rule set subject to update through the analyzer update process. In still further embodiments, the reporting logic 222 may be updated by the updated cybersecurity analytics content data such that responses to certain threats may change and thus yield a different response, such as when a particular threat is found to be more harmful or pervasive requiring a more aggressive and immediate response, or vice-versa.
In numerous embodiments, the analyzer logic 223 can be utilized to generate a plurality of analyzers to evaluate threat levels of certain objects. In some embodiments, the analyzer logic 223 may include an analyzer generation logic, shown as 305 of FIG. 3.
It can be understood by those skilled in the art, that context data can be utilized to add relevant information to evaluation of the threat level of an object. In various embodiments, the context data may aid in the determination of what/which analyzers to utilize. Context data may also include a variety of types of information regarding the computing environment associated with the network device 115 the suspicious object was retrieved from, including, but not limited to, operating system data, operating system settings, network settings, network security settings, local evaluation data, software data, software settings, and/or software version. It would be obvious to those skilled in the art that context data may include other forms of data not specifically outlined above which may aid in the analysis of the threat level undertaken on a suspicious object.
In still further embodiments, the analyzer generation logic 305 may also analyze context data associated with the object and determine, for example, what version of email client might be used by a user or system. In response to this determination, the analyzer generation logic may select a second set of analyzers from within the analyzer logic 223 that correspond to different versions of the email client. The selected analyzers may analyze the object within multiple environments wherein the client versions are different. The analyzer logic 223 may employ a single analyzer or multiple analyzers to evaluate a single object. Additionally, these multiple analyzers may be synchronous or asynchronous in nature, depending on the settings or computing environment involved. Analyzer logic is discussed in more detail in conjunction with FIG. 3.
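A sketch of this kind of type- and context-driven selection is given below; the catalog, analyzer names, and context key (email_client_versions) are assumptions made for illustration rather than analyzers defined by the disclosure.

```python
ANALYZER_CATALOG = {
    "email": ["email_analyzer"],
    "pdf": ["document_analyzer"],
    "executable": ["static_analyzer", "sandbox_analyzer"],
}

def select_analyzers(object_type: str, context: dict) -> list:
    """Pick analyzers by object type; context data may add version-specific variants."""
    selected = list(ANALYZER_CATALOG.get(object_type, ["generic_analyzer"]))
    for version in context.get("email_client_versions", []):
        # Analyze the object in environments matching each client version in use.
        selected.append(f"email_analyzer_v{version}")
    return selected

print(select_analyzers("email", {"email_client_versions": ["15.0", "16.0"]}))
# ['email_analyzer', 'email_analyzer_v15.0', 'email_analyzer_v16.0']
```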
In a variety of embodiments, the security content data store 240 is configured to store content utilized by logics within the cybersecurity agent 116. The content update data may include, but is not limited or restricted to, analyzer data, scanning or inspection rules, heuristics, other digital signatures indicative of threats, threat correlation/classification rules and/or remediation rules. For example, during the analyzer update process, the update logic 224 may direct the received data to be stored within the security content data store 240 as updated security content data prior to deploying within the analyzer logic 223. In certain embodiments, the security content data store 240 can be located externally from the network device 200. In some embodiments, the external updated cybersecurity analytics content data can be provided via a cloud-based service. In additional embodiments, the security content data store 240 may be contained within another logic 221, 222, 223, 224, 225 within the cybersecurity agent.
In a number of embodiments, an external interface logic 225 may be utilized to provide a method of incorporating external or third-party tools within the cybersecurity agent 116. By way of example and not limitation, a third-party virus scanner may be utilized to supplement the malware detection methods. In certain embodiments, the external interface logic 225 may be utilized to receive updated cybersecurity analytics content data from sources not associated with the vendor of the cybersecurity agent 116. In these cases, the settings or rules regarding such third-party data and related interfaces may be administered by the system user/administrator.
Cybersecurity appliances 110 may also be deployed and function equivalently to the network devices 115 comprising cybersecurity agents 116; however, the cybersecurity appliance often constitutes a specific network device designated, either solely or partially, to conduct cybersecurity analytics on one or more objects to determine whether any of the objects is a cybersecurity threat. A cybersecurity appliance 110 can be communicatively coupled with a network 120 via an input/output interface, where the network 120 may operate as a public network such as the Internet or a private network (e.g., a local area network “LAN”, wireless LAN, etc.). A cybersecurity appliance 110 may be configured to monitor or scan nearly all network traffic going into and out of a monitored system or devices. In some embodiments, the cybersecurity appliance 110 may work in cooperation with monitoring software installed on a network device without a cybersecurity agent installed to scan all newly stored, deleted or modified files within a file system of the monitored network device. In other embodiments, the cybersecurity appliance 110 may work cooperatively with a cybersecurity agent 116 communicatively coupled with the cybersecurity appliance 110. In these embodiments, the cybersecurity agent 116 may offload suspicious objects for analysis or may request a secondary or more thorough inspection of the suspicious object. It is contemplated that the cybersecurity appliance 110 may comprise logics and data stores equivalent to those found within a cybersecurity agent 116. As those skilled in the art would understand, due to the increased volume of suspicious objects to evaluate, a cybersecurity appliance 110 may be configured in some embodiments to operate such that real-time or near real-time analysis is not required, unlike cybersecurity agents 116, which typically operate at real-time or near real-time on suspicious objects encountered during normal operation of the network device 115. In some instances, the cybersecurity agent 116 may accomplish real-time or near real-time operation by prioritizing suspicious objects for inspection and/or offloading objects to external cybersecurity agents 116 operated on other network devices 125, 135 or to connected cybersecurity appliances 110.
Referring to FIG. 3, an embodiment of the analyzer logic 2231 is shown, which may comprise an analyzer generation logic 305 and an object processing logic 311.
In some embodiments, the analyzer logic 2231 may utilize pre-generated analyzers available in other systems. In certain embodiments, the analyzers may be behavioral analyzers comprising instrumented virtual machines to process the object. The analyzers can be configured according to the context data on which to base the settings of the analyzer. In additional embodiments, the analyzers may emulate a specific computing environment and monitor various settings such as, but not limited to, buffer, memory reads/writes, and/or function calls and returns. It is contemplated that the analyzers utilized by the analyzer logic may include a variety of analyzers and may also include other analyzer types based upon the application needed.
In further embodiments, the object processing logic 311 can be utilized to process the received object to determine an overall threat level. In still further embodiments, this threat determination may result in a numerical score that is passed back to the inspection logic 221 (FIG. 2).
In embodiments wherein the analyzer generation logic 305 receives objects for analysis, the object can be parsed to determine the object type (or the object type can be determined from received context data) in order to select at least one analyzer configured to analyze the object. The selected analyzers may perform a variety of analytics on the suspicious object, including static and dynamic analyses. In certain embodiments, the analyzers may utilize dynamic methods of analysis, including virtual machines that attempt to open, execute or otherwise process the objects in a monitored runtime environment. For these purposes, the virtual machine may be provisioned with an operating system and applications (e.g., web browser, email application, document reader, etc.) required to process the object (e.g., webpage, email, document, etc.). In additional embodiments, the analyzers may analyze the suspicious object in an emulated environment. Under such dynamic analyses, the analyzer may examine the characteristics of the file during and after execution. These characteristics may include the types of data stored as a result of the object's execution, calls made to other systems, contents of memory stacks, etc. The analyzers can then log the results of the monitoring and classify the object. In some embodiments, classification is done by correlating the results against known malware. Correlation can be configured to aggregate the analyzer data in accordance with any desired grouping. One grouping may be based on the source of the suspicious object under analysis, while another grouping may be based upon the file type of the suspicious object. The correlation may be compared to a plurality of thresholds which can be indicative of an object's overall threat level. These thresholds may be based on known malware and can be updated based on newly updated cybersecurity analytics content data. In further embodiments, the analyzers may utilize static means of threat analysis including, but not limited to, examining object characteristics without execution and/or generating a hash of the object for comparison with a pre-existing hash database.
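One way to picture the correlation and threshold comparison described above is the following sketch, in which monitored behaviors are reduced to weighted scores; the behavior names, weights, and threshold values are illustrative assumptions, not values specified by the disclosure.

```python
# Illustrative weights for behaviors observed during monitored (dynamic) execution.
BEHAVIOR_WEIGHTS = {
    "writes_to_startup_folder": 0.4,
    "disables_security_service": 0.5,
    "contacts_known_bad_domain": 0.6,
    "reads_user_documents": 0.1,
}
# Thresholds of this kind could themselves be replaced by updated analytics content.
THRESHOLDS = {"malicious": 0.8, "suspicious": 0.4}

def classify(observed_behaviors):
    """Correlate observed behaviors into a score and compare it against the thresholds."""
    score = min(1.0, sum(BEHAVIOR_WEIGHTS.get(b, 0.0) for b in observed_behaviors))
    if score >= THRESHOLDS["malicious"]:
        return score, "malicious"
    if score >= THRESHOLDS["suspicious"]:
        return score, "suspicious"
    return score, "benign"

print(classify(["writes_to_startup_folder", "contacts_known_bad_domain"]))  # (1.0, 'malicious')
print(classify(["reads_user_documents"]))                                   # (0.1, 'benign')
```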
In certain instances, the object being analyzed by the analyzer may generate or locate an embedded object (sub-object) that itself may need to be evaluated. By way of example and not limitation, a text file may contain a hyperlink to an external website, which itself should be evaluated as part of the overall threat analysis. As such, the analyzer generation logic 305 may determine if new objects have been found within the analyzed object. When embedded objects are found, the analyzer generation logic 305 can again select an analyzer based upon the embedded object to conduct the analysis described above on the embedded object. When all objects have been found and no further embedded objects are detected, the analyzer logic 223 may then generate and return a value related to the threat level of the object and any sub-object, which can then be aggregated with the results of either the inspection logic or other similar logic to determine the overall results and generate a final threat score. In additional embodiments, the analyzer logic 2231 correlates the threat values against other analyzers or static analyses, generates a final score, and then provides those results back to the calling logic, such as the inspection logic, for further processing.
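The recursive handling of embedded sub-objects can be sketched as follows; the object structure and the max-aggregation of scores are assumptions chosen for illustration, not the aggregation mandated by the disclosure.

```python
def analyze_object(obj):
    """Return an aggregate threat value for an object and any embedded sub-objects.

    `obj` is an illustrative structure: {"type": ..., "score": ..., "embedded": [...]},
    where "score" stands in for the result of analyzing that object on its own.
    """
    score = obj.get("score", 0.0)
    for sub in obj.get("embedded", []):   # e.g., a hyperlink located inside a text file
        score = max(score, analyze_object(sub))
    return score

document = {
    "type": "text", "score": 0.1,
    "embedded": [{"type": "url", "score": 0.9, "embedded": []}],
}
print(analyze_object(document))  # 0.9: the embedded link's threat level dominates the aggregate
```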
The classification of threat level (e.g., as malware or benign) can be realized through a variety of techniques, including comparison of the results and/or final score to a set of predetermined thresholds or rules. In many embodiments, a responsive action may be taken and/or a report generated based upon the classification of the score data generated by the analyzer logic 2231. Those skilled in the art will recognize that various types of rules, heuristics, or other comparisons can be made in order to generate an overall threat determination. When updated cybersecurity analytics content data is received by the cybersecurity agent or appliance, the analyzer logic 2231 may be updated, which could include receiving new types of analyzers that can be generated by the analyzer generation logic 305, or updating the types of rules, heuristics, or comparisons within the object processing logic 311. The processing of updates to these components in a seamless updating system is described in more detail below.
Referring again to FIG. 3, the operation of the analyzer logic 223 during an analyzer update process is now described.
In a variety of embodiments, the analyzer logic 223 would have a first analyzer 310 receiving and queueing objects within the first object queue logic 312 for threat analysis. In response to received updated cybersecurity analytics content data, the analyzer generation logic 305 can generate a second analyzer 320, which may then be configured to receive any subsequent objects for threat analysis that would have otherwise been directed to the first analyzer 310. In certain embodiments, the updated second analyzer 320 may be similar to the first analyzer 310 with the exception of an updated object processing logic 321. In other embodiments, the second analyzer 320 may be an entirely new analyzer type or a variation different from the first analyzer 310.
During the analyzer update process, once the updated second analyzer 320 is generated and accepting new objects for threat analysis, the first analyzer 310, and in particular the object queue logic 312, may then cease to receive objects for threat evaluation. In some embodiments, the object queue of the first analyzer is linked to the corresponding queue within the second analyzer and, when the second, updated analyzer is successfully launched, will terminate queueing actions within the first analyzer object queue. In response to the queue of the first analyzer 310 being depleted (i.e. having no more objects to analyze), the first analyzer 310 may then be subject to termination or reclamation via normal memory management systems. In this way, the analyzer logic 2232 may continue to accept and process objects for threat detection without a pause in normal operation. Additionally, because the analyzer generation logic 305 can generate entirely new analyzers as they are updated, the analyzers 310, 320 do not require additional programming in order to facilitate the analyzer update process, reducing programming complexity and allowing for implementation in previously installed systems. The analyzer update and concurrent management process is described in more detail below.
Referring now to FIG. 4, a flowchart of a seamless update process 400 for analyzer update and concurrent management is shown.
In response to updated cybersecurity analytics content data being generated, it is transmitted to corresponding cybersecurity agents and/or appliances (block 420). In some embodiments, the updated cybersecurity analytics content data may be transferred to an endpoint device from a cybersecurity appliance or administrator console or other content management device. When the updated cybersecurity analytics content data is transmitted successfully, the receiving cybersecurity agent or cybersecurity appliance can then parse the updated cybersecurity analytics content data (block 430). In some embodiments, the updated cybersecurity analytics content data may comprise any mixture of updated rules, heuristics, digital signatures, analyzers, or threshold settings. The parsing of the data allows the updated cybersecurity analytics content data to be identified, extracted and transmitted to the proper logic or data store within the network device.
Once received, the process can begin to update the security content data store 240 as well as any cybersecurity agent logic as required (block 440). In response to a successful update, an inspection logic may be tasked with determining a threat analysis to be performed on a suspicious object and request the use of an analyzer. In a variety of embodiments, the analyzer generation logic may then determine what type of analyzer should be instantiated to assess the received object and conduct the threat analysis. With the newly updated analyzer generation logic or other cybersecurity analytics content data, the analyzer generation logic can generate an additional analyzer comprising the updated rules and other logic (block 450).
Once an updated analyzer has been generated, any subsequent objects for threat analysis that would otherwise be fed into the queue of the first analyzer are now directed into the queue of the second analyzer (block 460). In certain embodiments, the queue can signal that a successful launching of the updated second analyzer has occurred and may start transferring (that is, making available) subsequent suspicious objects to the second analyzer. In some embodiments, some or all of the contents (or pointers/references to the contents) of entries within the queue for the first analyzer are copied to the queue of the second analyzer. In a number of embodiments, this queueing operation to the updated analyzer is handled by the inspection logic that requested the threat analysis. As a result of the successful generation of an updated analyzer, which is operable to accept suspicious objects for analysis, the seamless update process 400 then stops queueing new suspicious objects to the original, first analyzer (block 470).
Any objects that remain in the queue of the first analyzer remain active and are processed within the first analyzer (block 480). If, after processing a suspicious object in the first analyzer queue, there are more objects to analyze, the original analyzer continues to analyze the objects for threats until there are no more objects remaining in the original object queue (block 485). Once the queue for the original analyzer is depleted, the original analyzer may then be terminated or marked for reclamation by the system (block 490). Once the original analyzer is terminated, the process 400 can end, with the updated analyzer having taken control of the incoming object queue without any disruption in the flow of objects to evaluate for threats. The timing of the analyzer update and concurrent management process is described in more detail below.
Referring to FIG. 5, a timing diagram of an analyzer update and concurrent management system 500 is shown.
In response to the successful creation/launching of an analyzer 310, a first return signal 551 may be generated that is received by the inspection logic 221, which can signal that the first analyzer 310 may now receive objects to evaluate. In response to a subsequent inspect request 512, the inspection logic 221 can enqueue the objects related to the inspect request 512 into the first analyzer 310. It should be noted that the embodiment depicted in FIG. 5 is provided by way of example and not limitation.
In a number of embodiments, the analyzers 310, 320 receive objects for a threat analysis and return a value to the inspection logic 221 related to that analysis. The analyzers 310, 320 depicted in FIG. 5 may be generated by the analyzer generation logic 305 as described above.
In the analyzer update and concurrent management system 500 depicted in FIG. 5, an updated cybersecurity analytics content data transmission 561 may be received from the content repository data store 131 of the content update system 130.
In response to the updated cybersecurity analytics content data transmission 561, the rules/signatures regarding inspection of objects and/or the analyzers themselves may be updated within the cybersecurity agent or security content data store 240. If a new inspect request 513 comes in before the updated cybersecurity analytics content data transmission 561 is finished, or before the inspection logic 221 or cybersecurity agent 116 has finished updating the core content, the object may still be enqueued within the first analyzer 310. After a period of time, the download of the updated cybersecurity analytics content data is completed and ready to be utilized. When this occurs, the inspection logic 221 may send a second create analyzer call 526 to the analyzer generation logic 305 to generate a second analyzer 320, which comprises the updated rules from the content update data provided by the content repository data store 131.
Both the first analyzer 310 and second analyzer 320 may be active at the same time. However, subsequent objects to be evaluated can now be directed to the second analyzer 320 which utilizes the updated data. The first analyzer 310, in many embodiments, can stay active and continue analyzing the objects still within its queue.
Thus, the inspect request 514 can be evaluated by the second analyzer 320 utilizing the updated rules. For example, the network device 115 may have a first analyzer 310 generated to analyze email content, and may subsequently receive a new suspicious email after the second analyzer is successfully generated. In this case, however, the inspection logic 221 enqueues the new suspicious email to the second analyzer 320 comprising the updated cybersecurity analytics data. In certain embodiments, suspicious objects may be enqueued within an internal queue associated with either the first analyzer 310 or the second analyzer 320. In additional embodiments, the inspection logic 221 manages the enqueuing process between the multiple analyzers 310, 320. In further embodiments, the queueing for the two analyzers 310, 320 may be handled by a separate queue management logic that is not included within either the inspection logic 221 or the analyzers 310, 320.
It should be noted that the seamless update process as depicted in FIG. 5 may be implemented in either a synchronous or an asynchronous manner.
It should be understood by those skilled in the art that asynchronous implementations may come in a variety of forms. By way of example and not limitation, the first and second analyzers 310, 320 may utilize a “promises and futures” system. In these embodiments, the inspection logic 221 may generate a future for the value of each object that is sent to the analyzers 310, 320 for threat evaluation. In response, the analyzers 310, 320 may generate a promise for each of the received futures. In this way, the inspection logic 221 may hold a reference to each analyzer as long as at least one future for that analyzer remains outstanding, and the analyzers 310, 320 may remain active as long as they hold at least one unfulfilled promise. When the second analyzer 320 has been instantiated and begins to receive new objects for analysis, the inspection logic 221 can release the reference to the first analyzer 310. In some embodiments, the termination or marking for release is not done until all futures have been returned by the first analyzer 310. In additional embodiments, the first analyzer 310 would remain active until it runs out of promises. When this happens, the first analyzer 310 is then subject to termination or reclamation by the system in accordance with whatever resource optimization scheme is being utilized.
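For illustration, the futures-based variant can be approximated with Python's concurrent.futures, where each analyzer is modeled as a small worker pool that is shut down only after its outstanding futures have resolved; the object strings and rule lists are hypothetical, and the sketch does not claim to reproduce the promise/future bookkeeping of the disclosure exactly.

```python
from concurrent.futures import ThreadPoolExecutor, wait

def make_analyzer(rules):
    """Model an analyzer as a one-worker pool applying a rule set (an illustrative stand-in)."""
    pool = ThreadPoolExecutor(max_workers=1)
    def analyze(obj):
        return "malicious" if any(rule in obj for rule in rules) else "benign"
    return pool, analyze

first_pool, first_analyze = make_analyzer(["evil.exe"])
pending = [first_pool.submit(first_analyze, o) for o in ["report.pdf", "evil.exe stub"]]

# An update arrives: launch the second analyzer and route new objects to it immediately.
second_pool, second_analyze = make_analyzer(["evil.exe", "new-threat.js"])
new_future = second_pool.submit(second_analyze, "new-threat.js payload")

# Release the first analyzer only after all of its outstanding futures have resolved.
wait(pending)
first_pool.shutdown()

print([f.result() for f in pending], new_future.result())
second_pool.shutdown()
```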
As shown in FIG. 5, once the queue of the first analyzer 310 is depleted, the first analyzer 310 may be terminated or reclaimed while the second analyzer 320 continues to process subsequently received objects, so that the flow of objects being evaluated for threats is never interrupted by the update.
In the foregoing description, the invention is described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims.
This application claims the benefit of U.S. Provisional Application No. 62/827,039, filed Mar. 30, 2019, the entire contents of which are incorporated herein by reference.
10757134 | Eyada | Aug 2020 | B1 |
10785255 | Otvagin et al. | Sep 2020 | B1 |
10791138 | Siddiqui et al. | Sep 2020 | B1 |
10795991 | Ross et al. | Oct 2020 | B1 |
10798112 | Siddiqui et al. | Oct 2020 | B2 |
10798121 | Khalid et al. | Oct 2020 | B1 |
10805340 | Goradia | Oct 2020 | B1 |
10805346 | Kumar et al. | Oct 2020 | B2 |
10812513 | Manni et al. | Oct 2020 | B1 |
10817606 | Vincent | Oct 2020 | B1 |
10826931 | Quan et al. | Nov 2020 | B1 |
10826933 | Ismael et al. | Nov 2020 | B1 |
10834107 | Paithane et al. | Nov 2020 | B1 |
10846117 | Steinberg | Nov 2020 | B1 |
10848397 | Siddiqui et al. | Nov 2020 | B1 |
10848521 | Thioux et al. | Nov 2020 | B1 |
10855700 | Jeyaraman et al. | Dec 2020 | B1 |
10868818 | Rathor et al. | Dec 2020 | B1 |
10872151 | Kumar et al. | Dec 2020 | B1 |
10873597 | Mehra et al. | Dec 2020 | B1 |
10887328 | Paithane et al. | Jan 2021 | B1 |
10893059 | Aziz et al. | Jan 2021 | B1 |
10893068 | Khalid et al. | Jan 2021 | B1 |
10902117 | Singh et al. | Jan 2021 | B1 |
10902119 | Vashisht et al. | Jan 2021 | B1 |
10904286 | Liu | Jan 2021 | B1 |
10929266 | Goradia et al. | Feb 2021 | B1 |
20020038430 | Edwards et al. | Mar 2002 | A1 |
20020091819 | Melchione et al. | Jul 2002 | A1 |
20020095607 | Lin-Hendel | Jul 2002 | A1 |
20020169952 | DiSanto et al. | Nov 2002 | A1 |
20020184528 | Shevenell et al. | Dec 2002 | A1 |
20020188887 | Largman et al. | Dec 2002 | A1 |
20030084318 | Schertz | May 2003 | A1 |
20030188190 | Aaron et al. | Oct 2003 | A1 |
20030191957 | Hypponen et al. | Oct 2003 | A1 |
20040015712 | Szor | Jan 2004 | A1 |
20040019832 | Arnold et al. | Jan 2004 | A1 |
20040117478 | Triulzi et al. | Jun 2004 | A1 |
20040117624 | Brandt et al. | Jun 2004 | A1 |
20040236963 | Danford et al. | Nov 2004 | A1 |
20040255161 | Cavanaugh | Dec 2004 | A1 |
20040268147 | Wiederin et al. | Dec 2004 | A1 |
20050021740 | Bar et al. | Jan 2005 | A1 |
20050086523 | Zimmer et al. | Apr 2005 | A1 |
20050091513 | Mitomo et al. | Apr 2005 | A1 |
20050108562 | Khazan et al. | May 2005 | A1 |
20050125195 | Brendel | Jun 2005 | A1 |
20050149726 | Joshi et al. | Jul 2005 | A1 |
20050157662 | Bingham et al. | Jul 2005 | A1 |
20050238005 | Chen et al. | Oct 2005 | A1 |
20050262562 | Gassoway | Nov 2005 | A1 |
20050283839 | Cowburn | Dec 2005 | A1 |
20060010495 | Cohen et al. | Jan 2006 | A1 |
20060015715 | Anderson | Jan 2006 | A1 |
20060015747 | Van de Ven | Jan 2006 | A1 |
20060021029 | Brickell et al. | Jan 2006 | A1 |
20060031476 | Mathes et al. | Feb 2006 | A1 |
20060070130 | Costea et al. | Mar 2006 | A1 |
20060117385 | Mester et al. | Jun 2006 | A1 |
20060123477 | Raghavan et al. | Jun 2006 | A1 |
20060150249 | Gassen et al. | Jul 2006 | A1 |
20060161987 | Levy-Yurista | Jul 2006 | A1 |
20060173992 | Weber et al. | Aug 2006 | A1 |
20060191010 | Benjamin | Aug 2006 | A1 |
20060242709 | Seinfeld et al. | Oct 2006 | A1 |
20060251104 | Koga | Nov 2006 | A1 |
20060288417 | Bookbinder et al. | Dec 2006 | A1 |
20070006288 | Mayfield et al. | Jan 2007 | A1 |
20070006313 | Porras et al. | Jan 2007 | A1 |
20070011174 | Takaragi et al. | Jan 2007 | A1 |
20070016951 | Piccard et al. | Jan 2007 | A1 |
20070064689 | Shin et al. | Mar 2007 | A1 |
20070143827 | Nicodemus et al. | Jun 2007 | A1 |
20070157306 | Elrod et al. | Jul 2007 | A1 |
20070192858 | Lum | Aug 2007 | A1 |
20070208822 | Wang et al. | Sep 2007 | A1 |
20070240218 | Tuvell et al. | Oct 2007 | A1 |
20070240220 | Tuvell et al. | Oct 2007 | A1 |
20070240222 | Tuvell et al. | Oct 2007 | A1 |
20070250930 | Aziz et al. | Oct 2007 | A1 |
20080005782 | Aziz | Jan 2008 | A1 |
20080040710 | Chiriac | Feb 2008 | A1 |
20080072326 | Danford et al. | Mar 2008 | A1 |
20080077793 | Tan et al. | Mar 2008 | A1 |
20080134334 | Kim et al. | Jun 2008 | A1 |
20080141376 | Clausen et al. | Jun 2008 | A1 |
20080184367 | McMillan et al. | Jul 2008 | A1 |
20080189787 | Arnold et al. | Aug 2008 | A1 |
20080307524 | Singh et al. | Dec 2008 | A1 |
20080320594 | Jiang | Dec 2008 | A1 |
20090003317 | Kasralikar et al. | Jan 2009 | A1 |
20090064332 | Porras et al. | Mar 2009 | A1 |
20090083855 | Apap et al. | Mar 2009 | A1 |
20090125976 | Wassermann et al. | May 2009 | A1 |
20090126015 | Monastyrsky et al. | May 2009 | A1 |
20090144823 | Lamastra et al. | Jun 2009 | A1 |
20090158430 | Borders | Jun 2009 | A1 |
20090172815 | Gu et al. | Jul 2009 | A1 |
20090198651 | Shiffer et al. | Aug 2009 | A1 |
20090198670 | Shiffer et al. | Aug 2009 | A1 |
20090198689 | Frazier et al. | Aug 2009 | A1 |
20090199274 | Frazier et al. | Aug 2009 | A1 |
20090241190 | Todd et al. | Sep 2009 | A1 |
20090300589 | Watters et al. | Dec 2009 | A1 |
20100017546 | Poo et al. | Jan 2010 | A1 |
20100030996 | Butler, II | Feb 2010 | A1 |
20100058474 | Hicks | Mar 2010 | A1 |
20100077481 | Polyakov et al. | Mar 2010 | A1 |
20100115621 | Staniford et al. | May 2010 | A1 |
20100132038 | Zaitsev | May 2010 | A1 |
20100154056 | Smith et al. | Jun 2010 | A1 |
20100192223 | Ismael et al. | Jul 2010 | A1 |
20100281542 | Stolfo et al. | Nov 2010 | A1 |
20110078794 | Manni et al. | Mar 2011 | A1 |
20110093951 | Aziz | Apr 2011 | A1 |
20110099633 | Aziz | Apr 2011 | A1 |
20110099635 | Silberman et al. | Apr 2011 | A1 |
20110167493 | Song et al. | Jul 2011 | A1 |
20110173213 | Frazier et al. | Jul 2011 | A1 |
20110178942 | Watters et al. | Jul 2011 | A1 |
20110219450 | McDougal et al. | Sep 2011 | A1 |
20110225624 | Sawhney et al. | Sep 2011 | A1 |
20110247072 | Staniford et al. | Oct 2011 | A1 |
20110307954 | Melnik et al. | Dec 2011 | A1 |
20110307955 | Kaplan et al. | Dec 2011 | A1 |
20110307956 | Yermakov et al. | Dec 2011 | A1 |
20110314546 | Aziz et al. | Dec 2011 | A1 |
20120117652 | Manni et al. | May 2012 | A1 |
20120174186 | Aziz et al. | Jul 2012 | A1 |
20120174218 | McCoy et al. | Jul 2012 | A1 |
20120210423 | Friedrichs et al. | Aug 2012 | A1 |
20120222121 | Staniford et al. | Aug 2012 | A1 |
20120233698 | Watters et al. | Sep 2012 | A1 |
20120278886 | Luna | Nov 2012 | A1 |
20120331553 | Aziz et al. | Dec 2012 | A1 |
20130036472 | Aziz | Feb 2013 | A1 |
20130047257 | Aziz | Feb 2013 | A1 |
20130097706 | Titonis et al. | Apr 2013 | A1 |
20130185795 | Winn et al. | Jul 2013 | A1 |
20130227691 | Aziz et al. | Aug 2013 | A1 |
20130232577 | Watters et al. | Sep 2013 | A1 |
20130247186 | LeMasters | Sep 2013 | A1 |
20130282426 | Watters et al. | Oct 2013 | A1 |
20130291109 | Staniford et al. | Oct 2013 | A1 |
20130318038 | Shiffer et al. | Nov 2013 | A1 |
20130318073 | Shiffer et al. | Nov 2013 | A1 |
20130325791 | Shiffer et al. | Dec 2013 | A1 |
20130325792 | Shiffer et al. | Dec 2013 | A1 |
20130325871 | Shiffer et al. | Dec 2013 | A1 |
20130325872 | Shiffer et al. | Dec 2013 | A1 |
20140032875 | Butler | Jan 2014 | A1 |
20140181131 | Ross | Jun 2014 | A1 |
20140189687 | Jung et al. | Jul 2014 | A1 |
20140189866 | Shiffer et al. | Jul 2014 | A1 |
20140189882 | Jung et al. | Jul 2014 | A1 |
20140237600 | Silberman et al. | Aug 2014 | A1 |
20140280245 | Wilson | Sep 2014 | A1 |
20140283037 | Sikorski et al. | Sep 2014 | A1 |
20140283063 | Thompson et al. | Sep 2014 | A1 |
20140297494 | Watters et al. | Oct 2014 | A1 |
20140337836 | Ismael | Nov 2014 | A1 |
20140344926 | Cunningham et al. | Nov 2014 | A1 |
20140380473 | Bu et al. | Dec 2014 | A1 |
20140380474 | Paithane et al. | Dec 2014 | A1 |
20150007312 | Pidathala et al. | Jan 2015 | A1 |
20150096018 | Mircescu | Apr 2015 | A1 |
20150096022 | Vincent et al. | Apr 2015 | A1 |
20150096023 | Mesdaq et al. | Apr 2015 | A1 |
20150096024 | Haq et al. | Apr 2015 | A1 |
20150096025 | Ismael | Apr 2015 | A1 |
20150113653 | Song | Apr 2015 | A1 |
20150180886 | Staniford et al. | Jun 2015 | A1 |
20150186645 | Aziz et al. | Jul 2015 | A1 |
20150199513 | Ismael et al. | Jul 2015 | A1 |
20150199531 | Ismael et al. | Jul 2015 | A1 |
20150199532 | Ismael et al. | Jul 2015 | A1 |
20150220735 | Paithane et al. | Aug 2015 | A1 |
20150372980 | Eyada | Dec 2015 | A1 |
20160004869 | Ismael et al. | Jan 2016 | A1 |
20160006756 | Ismael et al. | Jan 2016 | A1 |
20160044000 | Cunningham | Feb 2016 | A1 |
20160127393 | Aziz et al. | May 2016 | A1 |
20160191547 | Zafar et al. | Jun 2016 | A1 |
20160191550 | Ismael et al. | Jun 2016 | A1 |
20160241580 | Watters et al. | Aug 2016 | A1 |
20160241581 | Watters et al. | Aug 2016 | A1 |
20160261612 | Mesdaq et al. | Sep 2016 | A1 |
20160285914 | Singh et al. | Sep 2016 | A1 |
20160301703 | Aziz | Oct 2016 | A1 |
20160323295 | Joram et al. | Nov 2016 | A1 |
20160335110 | Paithane et al. | Nov 2016 | A1 |
20170083703 | Abbasi et al. | Mar 2017 | A1 |
20170359217 | Ahuja | Dec 2017 | A1 |
20180013770 | Ismael | Jan 2018 | A1 |
20180048660 | Paithane et al. | Feb 2018 | A1 |
20180069891 | Watters et al. | Mar 2018 | A1 |
20180121316 | Ismael et al. | May 2018 | A1 |
20180288077 | Siddiqui et al. | Oct 2018 | A1 |
20190104154 | Kumar et al. | Apr 2019 | A1 |
20190132334 | Johns et al. | May 2019 | A1 |
20190207966 | Vashisht et al. | Jul 2019 | A1 |
20190207967 | Vashisht et al. | Jul 2019 | A1 |
20200252428 | Gardezi et al. | Aug 2020 | A1 |
20220103593 | Singh | Mar 2022 | A1 |
Number | Date | Country |
---|---|---|
2439806 | Jan 2008 | GB |
2490431 | Oct 2012 | GB |
0206928 | Jan 2002 | WO |
0223805 | Mar 2002 | WO |
2007117636 | Oct 2007 | WO |
2008041950 | Apr 2008 | WO |
2011084431 | Jul 2011 | WO |
2011112348 | Sep 2011 | WO |
2012075336 | Jun 2012 | WO |
2012145066 | Oct 2012 | WO |
2013067505 | May 2013 | WO |
Entry |
---|
Venezia, Paul, “NetDetector Captures Intrusions”, InfoWorld Issue 27, (“Venezia”), (Jul. 14, 2003). |
Vladimir Getov: “Security as a Service in Smart Clouds—Opportunities and Concerns”, Computer Software and Applications Conference (COMPSAC), 2012 IEEE 36th Annual, IEEE, Jul. 16, 2012 (Jul. 16, 2012). |
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350. |
Whyte, et al., "DNS-Based Detection of Scanning Worms in an Enterprise Network", Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages. |
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9. |
Yuhei Kawakoya et al: "Memory behavior-based automatic malware unpacking in stealth debugging environment", Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN: 978-1-4244-9353-1. |
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82. |
“Mining Specification of Malicious Behavior”—Jha et al, UCSB, Sep. 2007 https://www.cs.ucsb.edu/.about.chris/research/doc/esec07.sub.-mining.pdf-. |
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003). |
“When Virtual is Better Than Real”, IEEEXplore Digital Library, available at, http://ieeexplore.ieee.org/xpl/articleDetails.sp?reload=true&arnumbe- r=990073, (Dec. 7, 2013). |
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108. |
Adetoye, Adedayo, et al., "Network Intrusion Detection & Response System", ("Adetoye"), (Sep. 2003). |
Apostolopoulos, George; Hassapis, Constantinos; "V-eM: A cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation", 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126. |
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006. |
Baecher, "The Nepenthes Platform: An Efficient Approach to Collect Malware", Springer-Verlag Berlin Heidelberg, (2006), pp. 165-184. |
Bayer, et al., "Dynamic Analysis of Malicious Code", J Comput Virol, Springer-Verlag, France, (2006), pp. 67-77. |
Boubalos, Chris, "extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives", available at http://seclists.org/honeypots/2003/q2/319 ("Boubalos"), (Jun. 5, 2003). |
Chaudet, C., et al., "Optimal Positioning of Active and Passive Monitoring Devices", International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulouse, France, (Oct. 2005), pp. 71-82. |
Chen, P. M. and Noble, B. D., "When Virtual is Better Than Real", Department of Electrical Engineering and Computer Science, University of Michigan ("Chen"), (2001). |
Cisco, "Intrusion Prevention for the Cisco ASA 5500-x Series" Data Sheet (2012). |
Cohen, M.I., "PyFlag—An advanced network forensic framework", Digital Investigation 5, Elsevier, (2008), pp. S112-S120. |
Costa, M., et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005). |
Didier Stevens, "Malicious PDF Documents Explained", Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 9, No. 1, Jan. 1, 2011, pp. 80-82, XP011329453, ISSN: 1540-7993, DOI: 10.1109/MSP.2011.14. |
Distler, “Malware Analysis: An Introduction”, SANS Institute InfoSec Reading Room, SANS Institute, (2007). |
Dunlap, George W., et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceeding of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002). |
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010. |
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010. |
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011. |
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28. |
Gregg Keizer: “Microsoft's HoneyMonkeys Show Patching Windows Works”, Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:http://www.informationweek.com/microsofts-honeymonkeys-show-patching-windows-works/d/d-id/1035069? [retrieved on Jun. 1, 2016]. |
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase @ CMU, Carnegie Mellon University, 2007. |
Hiroshi Shinotsuka, Malware Authors Using New Techniques to Evade Automated Threat Analysis Systems, Oct. 26, 2012, http://www.symantec.com/connect/blogs/, pp. 1-4. |
Idika et al., A Survey of Malware Detection Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University. |
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. "Kernel-based behavior analysis for Android malware detection." Computational Intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011. |
Kaeo, Merike, "Designing Network Security", ("Kaeo"), (Nov. 2003). |
Kevin A Roundy et al: "Hybrid Analysis and Control of Malware", Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454, ISBN: 978-3-642-15511-6. |
Khaled Salah et al: "Using Cloud Computing to Implement a Security Overlay Network", Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 11, No. 1, Jan. 1, 2013 (Jan. 1, 2013). |
Kim, H., et al., "Autograph: Toward Automated, Distributed Worm Signature Detection", Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286. |
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”), (2003). |
Kreibich, C., et al., "Honeycomb-Creating Intrusion Detection Signatures Using Honeypots", 2nd Workshop on Hot Topics in Networks (HotNets-II), Boston, USA, (2003). |
Kristoff, J., "Botnets, Detection and Mitigation: DNS-Based Techniques", NU Security Day, (2005), 23 pages. |
Lastline Labs, The Threat of Evasive Malware, Feb. 25, 2013, Lastline Labs, pp. 1-8. |
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711. |
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011. |
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001). |
Moore, D., et al., "Internet Quarantine: Requirements for Containing Self-Propagating Code", INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910. |
Morales, Jose A., et al., "Analyzing and exploiting network behaviors of malware", Security and Privacy in Communication Networks, Springer Berlin Heidelberg, 2010, pp. 20-34. |
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg. |
Natvig, Kurt, "SANDBOXII: Internet", Virus Bulletin Conference, ("Natvig"), (Sep. 2002). |
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987. |
Newsome, J., et al., "Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software", In Proceedings of the 12th Annual Network and Distributed System Security Symposium (NDSS '05), (Feb. 2005). |
Nojiri, D., et al., "Cooperation Response Strategies for Large Scale Attack Mitigation", DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302. |
Oberheide et al., CloudAV: N-Version Antivirus in the Network Cloud, 17th USENIX Security Symposium (USENIX Security '08), Jul. 28-Aug. 1, 2008, San Jose, CA. |
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, Stefan Berger, sHype: Secure Hypervisor Approach to Trusted Virtualized Systems (Feb. 2, 2005) ("Sailer"). |
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25. |
Singh, S., et al., “Automated Worm Fingerprinting”, Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004). |
Thomas H. Ptacek and Timothy N. Newsham, "Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection", Secure Networks, ("Ptacek"), (Jan. 1998). |
Number | Date | Country |
---|---|---|
62827039 | Mar 2019 | US |