Embodiments of the invention relate to the field of application software testing. More specifically, one embodiment of the disclosure relates to a system, apparatus and method for providing a user interface to visually display, in real-time, event/time indexed video that illustrates simulated operations of an application undergoing anomalous behavior detection analysis and a textual log synchronized with the video in accordance with execution flow.
Generally, malware features one or more programs or files that disrupt the operations of an infected electronic device, typically by attacking and adversely influencing its operations. In some instances, malware may, unbeknownst to the user, gather and transmit passwords and other sensitive information from the electronic device. In other instances, malware may alter the functionality of the electronic device without the user's permission. Examples of different types of malware may include bots, computer viruses, worms, Trojan horses, spyware, adware, or any other programming that operates within the electronic device without permission.
Over the last decade, various types of malware detection applications have been introduced in order to uncover the presence of malware within an electronic device, especially within software downloaded from a remote source and installed within the electronic device. However, these applications neither provide an ability to customize the behavioral analysis nor obtain the benefits of a real-time, interactive visual display of such analysis.
Embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
Various embodiments of the invention relate to a system, apparatus and method for providing a user interface to control a real-time visual display of an anomalous behavior detection analysis being conducted on simulated operations of application software running within a virtual machine (VM) emulated run-time test and observation environment (hereinafter virtual “run-time environment”). For example, according to one embodiment, the visual display may be video depicting the simulated operations during this detection analysis, where the video is synchronized with a textual log displayed alongside the video.
The video features a multiple order indexing scheme, where a first order of indexing permits a user, through user interaction, to access a particular segment of video in accordance with either a particular playback time in the video or a particular analyzed event. An example of an “analyzed event” is a test behavior, namely a particular behavior being monitored during the anomalous behavior detection analysis. The second order of indexing provides a user, during display of the video, with information as to where the analyzed event occurs within the execution flow of the application software.
Hence, the video not only enables an administrator to visually witness anomalous behaviors suggesting that the application software under test contains malware, suspicious code or pernicious code, but also provides evidence for use in policy enforcement and information to further refine (or harden) the anomalous behavior detection analysis and/or the application software.
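For purposes of illustration only, the two orders of indexing described above may be pictured as a pair of lookups: one locating a video segment by playback time or by analyzed event, and one reporting where the currently displayed segment falls within the execution flow. The following minimal Python sketch uses hypothetical names (Segment, VideoIndex) that do not appear in the disclosure and is offered only as a reading aid, not as the claimed implementation:

    from dataclasses import dataclass, field

    @dataclass
    class Segment:
        start_time: float   # seconds into the video
        end_time: float
        event_id: str       # analyzed event (test behavior) covered

    @dataclass
    class VideoIndex:
        segments: list = field(default_factory=list)

        # First order of indexing: reach a segment by playback time or event.
        def by_time(self, t: float):
            return next((s for s in self.segments
                         if s.start_time <= t < s.end_time), None)

        def by_event(self, event_id: str):
            return next((s for s in self.segments
                         if s.event_id == event_id), None)

        # Second order of indexing: the segment's ordinal position stands in
        # for where the analyzed event occurs within the execution flow.
        def flow_position(self, t: float):
            seg = self.by_time(t)
            return self.segments.index(seg) if seg else None

    idx = VideoIndex()
    idx.segments.append(Segment(0.0, 4.5, "net.connect.unexpected"))
    idx.segments.append(Segment(4.5, 9.0, "camera.activate"))
    assert idx.by_event("camera.activate").start_time == 4.5
    assert idx.flow_position(5.0) == 1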
In the following description, certain terminology is used to describe features of the invention. For example, in certain situations, the terms “logic”, “engine” and “unit” are representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, logic may include circuitry such as processing circuitry (e.g., a microprocessor, one or more processor cores, a programmable gate array, a microcontroller, an application specific integrated circuit, etc.), wireless receiver, transmitter and/or transceiver circuitry, semiconductor memory, combinatorial logic, or other types of electronic components.
As software, logic may be in the form of one or more software modules, such as executable code in the form of an executable application, an application programming interface (API), a subroutine, a function, a procedure, an applet, a servlet, a routine, source code, object code, a shared library/dynamic load library, or one or more instructions. These software modules may be stored in any type of suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to, a programmable circuit; a semiconductor memory; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, a hard disk drive, an optical disc drive, or a portable memory device. As firmware, the executable code is stored in persistent storage.
It is contemplated that an electronic device may include hardware logic such as one or more of the following: (i) processing circuitry; (ii) a communication interface which may include one or more radio units (for supporting wireless data transmission/reception) and/or a physical connector to support wired connectivity; (iii) a non-transitory storage medium; and/or (iv) a display. Types of electronic devices may include mobile electronic devices (e.g., cellular smartphones, tablets, laptop computers, netbooks, etc.), stationary electronic devices (e.g., desktop computers, servers, controllers, access points, base stations, routers, etc.) that are adapted for network connectivity.
The term “transmission medium” is a communication path between two or more electronic devices. The communication path may include wired and/or wireless segments. Examples of wired and/or wireless segments include electrical wiring, optical fiber, cable, bus trace, or a wireless channel using infrared, radio frequency (RF), or any other wired/wireless signaling mechanism.
The term “video” is generally defined as a series of successive display images, including VM-emulated graphical representations (screenshots) of operations that would have been displayed if an electronic device executed the application software under test (“test application”) natively (e.g., if the application under test were executed on a mobile device OS). Hence, video may have a number of different formats, for example, a series of graphic images sequenced to represent a series of video frames; a series of compressed video frames in compliance with H.264, MPEG-2 or another video format; or a series of static images (such as a slide show) that together define a time-based sequence. The video may even be vector-based graphic representations that collectively produce an animated sequence of images.
The term “anomalous behavior” is directed to an undesirable behavior occurring during execution of application software, where a behavior may be deemed to be “undesirable” based on customer-specific rules, manufacturer-based rules, or any other type of rules formulated by public opinion or a particular governmental or commercial entity. This undesired behavior may include (1) altering the functionality of the device executing that application software in a malicious manner (malware-based behavior); (2) altering the functionality of the device executing that application software without any malicious intent (suspicious code-based behavior); and/or (3) providing an unwanted functionality which is generally acceptable in other contexts (pernicious code-based behavior). Examples of unwanted functionality by pernicious code may include tracking and/or disseminating user activity on the device (e.g., websites visited, email recipients, etc.), tracking and/or disseminating user location (e.g., global satellite positioning “GPS” location), privacy intrusion (e.g., accessing certain files such as contact lists), or the like.
For instance, as illustrative examples, an “anomalous behavior” may include a communication-based anomaly, such as an unexpected attempt to establish a network communication, an unexpected attempt to transfer data (e.g., GPS data or other location data resulting in a privacy violation, contact lists, etc.), an unexpected attempt to activate a video capture device (e.g., web camera), or an unexpected activation of an audio capture device (e.g., microphone). Anomalous behavior also may include an execution anomaly, for example, an unexpected execution of computer program code, an unexpected Application Programming Interface (API) function call, an unexpected alteration of a registry key, or the like.
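Purely as an illustrative aid (hypothetical names; not part of the described embodiments), such anomalous behaviors can be modeled as declarative rule records that a detection engine checks against the events it observes:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class BehaviorRule:
        rule_id: str
        category: str    # "communication" or "execution"
        severity: int    # assumed contribution toward a 0-10 threat score

    # Example rules mirroring the anomalies described above.
    RULES = [
        BehaviorRule("net.connect.unexpected", "communication", 6),
        BehaviorRule("data.exfil.gps",         "communication", 8),
        BehaviorRule("camera.activate",        "communication", 7),
        BehaviorRule("registry.key.altered",   "execution",     5),
        BehaviorRule("api.call.unexpected",    "execution",     4),
    ]

    def matching_rules(observed_event_ids):
        """Return the rules triggered by a set of observed event IDs."""
        ids = set(observed_event_ids)
        return [r for r in RULES if r.rule_id in ids]

    hits = matching_rules({"data.exfil.gps", "api.call.unexpected"})
    assert {r.rule_id for r in hits} == {"data.exfil.gps", "api.call.unexpected"}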
Lastly, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
As this invention is susceptible of embodiment in many different forms, it is intended that the present disclosure be considered as an example of the principles of the invention and not intended to limit the invention to the specific embodiments shown and described.
I. General Architecture
Referring to
Furthermore, anomalous behavior detection device 110 is communicatively coupled via a transmission medium 140 to one or more electronic devices 1501-150N (N≥1). Through a graphics user interface (GUI) provided by anomalous behavior detection device 110, an administrator is able to (i) control the anomalous behavior detection analysis and (ii) watch, in real-time, VM-emulated operations of the test application in concert with analysis of the presence or absence of certain behaviors chosen to be monitored during such operations (hereinafter referred to as “test behaviors”).
As shown in
It is contemplated that communication system 100 may represent a dedicated anomalous behavior detection process for a particular network or subnetwork that is part of a larger network. In such a deployment, anomalous behavior detection device 110 may be communicatively coupled to a central management system (CMS) 180, which communicatively couples communication system 100 with other communication systems. This allows multiple communication systems to operate in tandem and exchange information as needed.
It is further contemplated that anomalous behavior detection device 110 may be deployed to provide cloud computing anomalous behavior detection services. Alternatively, anomalous behavior detection device 110 may be deployed as an appliance (electronic device) integrated as part of a local or enterprise network, or any combination thereof.
Referring now to
Processor 200 is further coupled to persistent storage 230 via transmission medium 240. According to one embodiment of the disclosure, persistent storage 230 may include video storage unit 250, application analyzer logic 260, graphics user interface (GUI) logic 270, and identity verification logic 280. Of course, when implemented as hardware logic, any of these logic/units 250, 260, 270 and/or 280 would be implemented separately from persistent storage 230.
Application analyzer 260 is adapted to conduct testing of the safety/security of application software, including mobile device application software. Such testing involves at least analysis of one or more test behaviors in response to a sequence of simulated (e.g., VM-emulated) operations performed by the test application. From the analysis of these test behaviors, anomalous behaviors may be detected.
During this testing, application analyzer 260 also generates video by capturing display images (or frames), on a continuous or periodic sampling basis, produced by the simulated operations of the test application during the anomalous behavior detection analysis. As the anomalous behavior detection analysis at least partially involves analysis of a sequence of test behaviors, a time stamp is associated with at least a first display image (or frame) of a video segment for each test behavior being analyzed. This enables the video to be indexed by time and by test behavior. The time stamp, along with information directed to the corresponding test behavior, is stored within a time-stamp storage unit 262 accessible to application analyzer 260, while the video may be stored in video storage unit 250 for later review. Of course, time stamps may be applied to every display image (or frame) to provide greater precision as to the location within the video where analysis for particular test behaviors is conducted.
Additionally, application analyzer 260 features index tracking logic 264 that is adapted to track and record which display images (or frames) of the video correspond to a particular test behavior being analyzed. For example, it is contemplated that index tracking logic 264 may include a table where each entry maintains an identifier (or index) associated with a particular display image (or frame) along with a corresponding identifier (or index) associated with a particular aspect of the test behavior being analyzed. As a result, the display of the video is synchronized with the display (and illustrated progress) of the analyzed test behaviors.
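As a non-authoritative sketch of such a table (names invented for illustration), frame identifiers can be paired with test-behavior identifiers so that a renderer can determine, for any displayed frame, which behavior is under analysis:

    import bisect

    class IndexTracker:
        """Pairs video frame indices with the test behavior analyzed there."""

        def __init__(self):
            self._frames = []     # sorted frame indices (capture order)
            self._behaviors = []  # behavior ID whose analysis starts there

        def record(self, frame_index: int, behavior_id: str):
            # Called as capture proceeds; frame indices arrive increasing.
            self._frames.append(frame_index)
            self._behaviors.append(behavior_id)

        def behavior_at(self, frame_index: int):
            # Latest entry at or before the requested frame, i.e., the
            # behavior to which the displayed frame belongs.
            pos = bisect.bisect_right(self._frames, frame_index) - 1
            return self._behaviors[pos] if pos >= 0 else None

    tracker = IndexTracker()
    tracker.record(0, "net.connect.unexpected")
    tracker.record(120, "camera.activate")
    assert tracker.behavior_at(90) == "net.connect.unexpected"

A playback loop could then call behavior_at(current_frame) on each refresh to keep the textual log highlighting in step with the video.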
Furthermore, after completion of this testing, application analyzer 260 assigns a threat score to the test application. The threat score, ranging between minima and maxima values (e.g., 0-10), represents the severity of the test behavior(s) detected during the anomalous behavior detection analysis. In other words, the threat score may be considered to represent the amount of potential harm that detected anomalous behavior(s) may cause an electronic device executing that test application.
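The disclosure does not specify how the score is computed; one plausible reading, offered only as an assumption, is that each detected behavior carries a severity and the score reflects the worst observed harm, clamped to the stated range:

    def threat_score(detected_severities, floor=0, ceiling=10):
        """Aggregate per-behavior severities into a single 0-10 score.

        Assumed policy: take the maximum severity, nudged upward when
        several behaviors fire, then clamp to the allowed range.
        """
        if not detected_severities:
            return floor
        base = max(detected_severities)
        crowd_bonus = min(len(detected_severities) - 1, 2)  # at most +2
        return max(floor, min(ceiling, base + crowd_bonus))

    # e.g., three detected behaviors with severities 6, 4 and 7 score a 9
    assert threat_score([6, 4, 7]) == 9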
As further shown in
Identity verification logic 280 is used to control authentication of users seeking to access application analyzer 260. Furthermore, identity verification logic 280 may set access privileges for each authenticated user, as certain users may have restricted access to only certain functionality offered by application analyzer 260. As an example, one user may have access to replay video stored in video storage unit 250 but be unable to initiate anomalous behavior analysis testing on application software. Another user may have complete access to all functionality offered by application analyzer 260.
Referring now to
Static instrumentation engine 300 receives a test application (APPN) 305 and generates a representation of test application 305 that is analyzed with one or more software analysis techniques (e.g., control information analysis or data analysis). Static instrumentation engine 300 then modifies the application code to include special monitoring functions and/or special stimuli functions that are operable during execution of the test application in dynamic run-time test and observation (RTO) engine 320. The monitoring functions report their results to control logic 325, and the stimuli functions are instructed by control logic 325 as to what stimuli to generate. During such analysis by static instrumentation engine 300, video 310 is captured and/or other graphics related to the analysis are generated and provided to GUI logic 270 to produce one or more user interface display screens. Furthermore, video 310 is stored in video storage unit 250 for subsequent playback.
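As an illustrative sketch only (the disclosure describes the monitoring and stimuli functions, not their code; all names here are invented), inserting monitoring functions can be pictured as wrapping callables of the test application so that each invocation is reported back to the control logic:

    import functools

    def instrument(report, stimuli=None):
        """Wrap a function so calls are reported and optional stimuli applied."""
        def decorator(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                report("enter " + fn.__name__, args, kwargs)   # monitoring hook
                if stimuli is not None:
                    args, kwargs = stimuli(args, kwargs)       # stimuli hook
                result = fn(*args, **kwargs)
                report("exit " + fn.__name__, result)
                return result
            return wrapper
        return decorator

    # report() stands in for the channel back to control logic 325.
    events = []
    def report(*record):
        events.append(record)

    @instrument(report)
    def send_message(recipient, body):
        return "sent to " + recipient

    send_message("alice", "hi")   # events now holds the enter/exit records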
It is contemplated that static instrumentation engine 300 may be adapted to receive information from dynamic RTO engine 320 in order to instrument the code to better analyze specific behaviors targeted in the heuristics and/or probability analysis.
After processing is completed by static instrumentation engine 300, test application 305 is then provided to control logic 325 within dynamic RTO engine 320. Control logic 325 operates as a scheduler to dynamically control the anomalous behavior detection analysis among different applications and/or the same application software among different virtual run-time environments. Furthermore, control logic 325 maintains time-stamp storage unit 262 and index tracking logic 264 as previously described.
In general, dynamic RTO engine 320 acts as an intelligent testing function. According to one approach, dynamic RTO engine 320 recursively collects information describing the current state of test application 305 and selects a subset of rules, corresponding at least in part to the test behaviors set by the user, to be monitored during virtual execution of test application 305. The strategic selection and application of various rules over a number of recursions in view of each new observed operational state permits control logic 325 to resolve a specific conclusion about test application 305, namely a threat score denoting whether the application is “safe” or “unsafe”.
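A minimal sketch of such a recursive observe-select-apply loop follows; the names and the convergence condition are assumptions, since the disclosure leaves the selection strategy open:

    def analyze(app_state, all_rules, observe, apply_rules, max_rounds=10):
        """Recursively narrow the rule set until a conclusion is reached.

        observe(app_state) describes the current operational state;
        rules exposing applies_to(state) are filtered to a relevant subset;
        apply_rules(state, rules) returns the set of behaviors detected.
        """
        detected = set()
        for _ in range(max_rounds):
            state = observe(app_state)
            relevant = [r for r in all_rules if r.applies_to(state)]
            newly = apply_rules(state, relevant)
            if not newly - detected:    # no new observations: converged
                break
            detected |= newly
        return detected   # feeds the threat score / safe-unsafe conclusion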
As shown in
One or more virtual run-time environments 350 simulate operations of test application 305 to detect anomalous behavior produced by this application. For instance, run-time environment 3551 can be used to identify the presence of anomalous behavior during analysis of simulated operations of test application 305 performed on a virtual machine 3401. Of course, there can be multiple run-time environments 3551-355M (M≥2) to simulate multiple types of processing environments for test application 305.
A virtual machine may be considered a representation of a specific electronic device that is provided to a selected run-time environment by control logic 325. In one example, control logic 325 retrieves virtual machine 3401 from virtual machine repository 330 and configures virtual machine 3401 to mimic an Android® based smartphone. The configured virtual machine 3401 is then provided to one of the run-time environments 3551-355M (e.g., run-time environment 3551).
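Illustratively (hypothetical names and an invented repository structure, since the disclosure describes the behavior rather than an API), retrieving a virtual machine and shaping it to mimic a target device might look like:

    from dataclasses import dataclass, field

    @dataclass
    class VirtualMachine:
        base_image: str
        settings: dict = field(default_factory=dict)

        def configure(self, **settings):
            self.settings.update(settings)

    def provision_android_vm(repository: dict) -> VirtualMachine:
        """Fetch a base image and configure it as an Android smartphone."""
        vm = VirtualMachine(base_image=repository["android"])
        vm.configure(os_version="4.4", device_model="smartphone",
                     sensors=["gps", "microphone", "camera"])
        return vm

    repo = {"android": "android-base.img"}   # stand-in for VM repository 330
    vm = provision_android_vm(repo)          # ready to hand to a run-time env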
As run-time environment 3551 simulates the operations of test application 305, virtual machine 3401 can be closely monitored for any test behaviors set by the user (or set by default) in behavior setting logic 370. By simulating the operations of test application 305 and analyzing the response of virtual machine 3401, run-time environment 3551 can identify known and previously unidentified anomalous behavior and report the same through the indexed video and a dynamic textual log.
Besides VM 3401, run-time environment 3551 is provided with an instance 360 of the test application (App) and an instance 365 of the type of operating system on which test application 305 will run if deemed sufficiently safe during the dynamic anomalous behavior detection process. Here, the use of virtual machines (VMs) permits the instantiation of multiple additional run-time environments 3551-355M, each having its own test application and OS instance, where the various run-time environments 3551-355M are isolated from one another.
As previously described, the simultaneous existence of multiple run-time environments 3551-355M permits different types of observations/tests to be run on a particular test application. That is, different instances of the same test application may be provided in different run-time environments so that different types of tests/observations can be concurrently performed on the same application. Alternatively, different test applications can be concurrently tested/observed.
For instance, a first application may be tested/observed in a first run-time environment (e.g., environment 3551) while a second, different application is tested/observed in another run-time environment (e.g., environment 355M). Notably, instances of different operating system types and even different versions of the same type of operating system may be located in different run-time environments. For example, an Android® operating system instance 365 may be located in first run-time test environment 3551 while an iOS® operating system instance (not shown) may be located in a second run-time test environment 355M. Concurrent testing of one or more test applications (whether different instances of the same application or respective instances of different applications or some combination thereof) enhances the overall performance of the communication system.
II. Anomalous Behavior Analysis and Video Generation/Playback
Referring to
Next, upon user interaction with the second user interface display screen (e.g., selection by the user of a particular object), the GUI logic provides a third user interface display screen (See
Once the test behaviors for the anomalous behavior detection analysis are set, the application analyzer virtually processes the test application to detect anomalous behavior (block 430). The simulated operations conducted during the virtual processing of the test application produce video, which is sent to the GUI logic for rendering a fourth user interface display screen in real time (block 435). Additionally, a textual log is generated that provides information as to what events (e.g., test behaviors) are being analyzed and when the analyzed events occur within the execution flow of the application software. This information may be provided through the placement and ordering of display objects corresponding to test behaviors alongside the video, corresponding to the order of display images (or frames) rendered during the simulated operations of the test application.
As a result, progress changes in the anomalous behavior analysis displayed by the video are synchronized with progress changes shown by the textual log. Concurrently with or subsequent to the supply of the video to the GUI logic, the video is provided to video storage unit for storage and subsequent retrieval for playback (block 440).
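One hedged way to picture this synchronization (invented names; the disclosure describes the effect, not a mechanism) is a render loop that derives both the frame to paint and the log rows to mark from the same playback clock:

    def render_tick(clock_seconds, tracker, log_entries, fps=30):
        """Derive the frame and the synchronized log state for one tick.

        tracker is assumed to expose behavior_at() as in the IndexTracker
        sketch above; log_entries are ordered by execution flow.
        """
        frame = int(clock_seconds * fps)
        current = tracker.behavior_at(frame)
        rows = []
        for entry in log_entries:
            if entry["behavior_id"] == current:
                status = "analyzing"
            elif entry["first_frame"] <= frame:
                status = "completed"
            else:
                status = "pending"
            rows.append((entry["behavior_id"], status))
        return frame, rows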
Referring to
Thereafter, playback of the video continues unless disrupted by video playback alteration events (e.g., Pause, Stop, Fast-Forward, Reverse, etc.), in which case playback of the video is halted to service these events (blocks 480, 482, 484 and 486). Once playback of the video has completed, this playback session ends (block 490). The user may be provided the opportunity to commence a new playback session or select another video.
III. User Interface Display Screens to Control the Application Analyzer
Referring now to
As shown, an initial request for access to the application analyzer is redirected to login display screen 500, which features at least two entry fields, namely a User Name 510 and a Password 520. The User Name entry field 510 requires the user to enter a registered user name in order to identify the user seeking access to the application analyzer. Password entry field 520 allows the user to enter his or her password.
Once a login object 530 is selected by the user, the user name and password are provided to identity verification logic 280 of
As shown in
For instance, first area 610 displays a plurality of objects that provide information directed to application software that has been analyzed or is currently being analyzed within a first selected time period (e.g., 24 hours). Provided by the application analyzer to the GUI logic for rendering, the information associated with these objects identifies: (1) the number of applications submitted (object 620); (2) the number of applications analyzed (object 622); (3) the number of applications currently being analyzed (object 624); (4) the number of applications analyzed according to customized rule settings (object 626); and (5) the number of “unsafe” applications detected (object 628). Some or all of these numeric values are stored for a period of time that may be set by the manufacturer or the user.
It is contemplated that the first selected time period may be adjusted through a drop-down list 615 that features multiple time periods using the current time as a reference (e.g., 24 hours ago, 1 week ago, 1 month ago, 3 months ago, 1 year ago, etc.). Although not shown, drop-down list 615 may also support user selection of the start and end times that define the first selected time period.
Second area 640 provides graphical depictions of application software analyzed over a second selected time period 645, which may differ from the selected time period for first display area 610. As shown, a first graphical depiction 650 represents a line graph that identifies different categories of analyzed applications (vertical axis) analyzed at different times within the selected time period (horizontal axis). The different categories include (1) “safe” applications 652 (applications with a threat score not greater than a predetermined threshold); (2) unsafe applications 654 (applications with a threat score greater than the predetermined threshold); and (3) applications submitted for analysis 656.
A second graphical depiction 660 represents a bar graph directed to applications that have completed their anomalous behavior analysis testing. For this bar graph, the horizontal axis represents the measured threat score (0-10) while the vertical axis represents the number of analyzed applications associated with the measured threat score.
A third graphical depiction 665 represents a pie chart also directed to applications that have completed their anomalous behavior analysis testing. A first color 667 denotes those applications having a threat score indicating the application is considered “safe” for use while a second color 668 denotes those applications having a threat score that identifies them as being “unsafe” for use.
Third area 670 provides a graphical and/or textual depiction entry 675 for each application that has been analyzed or is in the process of being analyzed. Each entry 675 includes a plurality of parameters, including at least three or more of the following: (1) date the application was submitted 680; (2) application name 681; (3) status (safe, unsafe, complete with error, in progress) 682; (4) threat score 683; and (5) custom rule matching status 684. The order of these entries can be adjusted according to submission date, alphabetically by application name, status and threat score.
With respect to the status parameter 682, currently, there are four status levels. As previously mentioned, “safe” is a status level assigned to applications having a threat score no greater than a predetermined threshold, while “unsafe” is a status level assigned to applications having a threat score greater than the predetermined threshold, normally indicating the presence of malware or some sort of suspicious or pernicious code that causes behaviors unsuitable for the targeted device. Another status level is “in progress,” which indicates that the corresponding application is currently undergoing the anomalous behavior analysis. Lastly, “complete-error” is a status level which identifies that an anomalous behavior has been detected, but the risk level may vary widely depending on the particular customer.
For instance, as an illustrative example, for application software that establishes a network connection to a server for upgrades without any malicious intent, the assigned level of risk would be minimal for most customers. However, where the electronic device is for use by a high-ranking governmental official, any unknown network connectivity may be assigned a high risk. Hence, the test application is assigned the “complete-error” status with a medium threat score upon detection of a test behavior that is considered by the anomalous behavior detection analysis to be customer dependent. This status level encourages user interaction (e.g., selecting the “Go To Details” link located next to the threat score) to obtain a more detailed explanation of the findings associated with the threat score, although more detailed explanations are provided for all status levels.
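Under assumed thresholds (the exact values are deployment-specific and not given in the disclosure), the four status levels can be summarized as:

    def status_for(app, unsafe_threshold=7):
        """Map an analyzed application to one of the four status levels."""
        if app["in_progress"]:
            return "in progress"
        if app["customer_dependent_behavior"]:
            return "complete-error"    # risk varies by customer policy
        if app["threat_score"] > unsafe_threshold:
            return "unsafe"
        return "safe"

    app = {"in_progress": False, "customer_dependent_behavior": False,
           "threat_score": 3}
    assert status_for(app) == "safe"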
Referring now to
Upload display area 710 enables the user to enter addressing information (e.g., Uniform Resource Locator “URL”, File Transfer Protocol “FTP” address, etc.) into an input field 715. Thereafter, once the “Submit” object 720 is selected, an HTTP Request message is sent in order to fetch the test application from the website or database specified by the addressing information.
Search display area 730 features an input field 735 into which the user can enter the name (or at least a portion of the name) of the test application. For instance, as shown in
Referring to
Referring back to
Referring now to
For ease of illustration, only some of the test behaviors are set forth in
As further shown in
As further shown in
Referring to
More specifically, sequence based analysis builder 860 provides a listing 865 of test behaviors chosen by the user as illustrated in
Similarly, group & sequence based analysis builder 875 enables use of test behaviors 815 to formulate groupings using logical operators (AND, OR) 880. Test behaviors 815 may be dragged into position along with logical operators 880.
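For illustration only (the disclosure presents this as a drag-and-drop builder, not code; the representation below is an assumption), such a grouping of test behaviors can be modeled as a small expression tree evaluated against the behaviors actually detected:

    def evaluate(expr, detected):
        """Evaluate a grouping of test behaviors against detected ones.

        expr is either a behavior ID (string) or a tuple of the form
        ("AND" | "OR", subexpr, subexpr, ...).
        """
        if isinstance(expr, str):
            return expr in detected
        op, *operands = expr
        results = (evaluate(e, detected) for e in operands)
        return all(results) if op == "AND" else any(results)

    # (gps tracking AND (camera OR microphone)) as built by the user
    rule = ("AND", "gps.tracking", ("OR", "camera.activate", "mic.activate"))
    assert evaluate(rule, {"gps.tracking", "mic.activate"}) is True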
Referring now to
Video display area 920 is allocated to display video 925, which captures simulated operations of the test application (XYZ Messenger) during the anomalous behavior detection analysis in concert with analysis of the presence or absence of the selected events. During anomalous behavior analysis of the XYZ Messenger by one or more VMs in a run-time environment, as illustrated in
Synchronous with playback of video 925, a textual log 945 is provided to identify the execution flow and which test behaviors have been completed, are awaiting analysis, or are currently being analyzed. The display of user interface display screen 900, especially textual log 945, may be conducted in two different modes: Regular Display mode and Analyst Display mode. In Regular Display mode, only detected anomalous behaviors, such as suspicious or important events/results, are shown/listed. In Analyst Display mode, all events occurring in the application are shown/listed, including those related only to the execution of the application and those events that would have been forced by the mobile electronic device.
During Analyst Display mode, for example, a first image (check mark) 950 is rendered to identify that the test behavior was not present; a second image (“X”) 952 is rendered to identify that the test behavior was detected; a third image (“Δ”) 954 is rendered where, at this point in the analysis, the test behavior has not yet been analyzed; and a fourth image (progress bar) 956 is rendered where the test behavior is currently being analyzed. The updating of entries within textual log 945 is synchronized with video 925 being displayed.
Referring to
User interface display screen 1000 provides a first object (REPLAY) 1030 that, based upon user interaction, signals the GUI logic to replay video 925 as shown in
In general terms, the video replay provides context for each event to explain away or confirm certain anomalous behaviors in light of what display images (screenshots) may have been shown or what user interactions may have occurred. Some applications exhibit anomalies which may be viewed/verified as unwanted behaviors depending on when/where in the application the event occurred (e.g., audio recording started when expected or at an unexpected time, or whether a permission is noted in a manifest). In order to provide such context, the displayed images of video 925 may capture the display output of the application software for at least a period of time (window) before and after an event included in the displayed textual log 945 has occurred.
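A hedged sketch of such a context window follows (names invented; the window length is an assumption, as the disclosure only requires a period of time before and after the event):

    def context_clip(event_time, video_duration, window=5.0):
        """Return (start, end) bounds of a clip around an event timestamp.

        Captures `window` seconds before and after the event, clamped to
        the bounds of the recorded video.
        """
        start = max(0.0, event_time - window)
        end = min(video_duration, event_time + window)
        return start, end

    # An event logged at t=42.0s of a 60s video yields a 37.0-47.0s clip.
    assert context_clip(42.0, 60.0) == (37.0, 47.0)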
Referring back
Additionally, display screen 1100 features a search field 1120 that enables the user to search for a particular event or test behavior at a particular point in the video replay. Also, an activity graph 1130 identifies the activities (e.g., number and frequency of API function calls, Java™ events, etc.) during the testing period for the anomalous behavior detection analysis. The particular activities may be obtained by selecting activity graph 1130 to denote a request for deeper analysis of the findings from the anomalous behavior detection analysis.
Referring back to
In the foregoing description, the invention is described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the present invention as set forth in the appended claims. The specification and drawings are accordingly to be regarded in an illustrative rather than in a restrictive sense.
This application is a continuation of U.S. patent application Ser. No. 13/775,174, filed Feb. 23, 2013, now U.S. Pat. No. 9,195,829 issued on Nov. 24, 2015, the entire contents of which are incorporated herein by reference.