Real-time visual playback with synchronous textual analysis log display and event/time indexing

Information

  • Patent Grant
  • Patent Number
    10,929,266
  • Date Filed
    Monday, July 9, 2018
  • Date Issued
    Tuesday, February 23, 2021
Abstract
In one embodiment, a method for detecting one or more behaviors by software under test that indicate a presence of malware is described. First, an analysis of operations conducted by the software being processed by a virtual machine is performed. The analysis includes monitoring one or more behaviors conducted by the software during processing within the virtual machine. Next, a video corresponding to at least the one or more monitored behaviors, which are conducted by the software during processing of the software within the virtual machine, is generated. Also, text information associated with each of the one or more monitored behaviors is generated, where the text information is displayed on an electronic device contemporaneously with the video corresponding to the one or more monitored behaviors.
Description
FIELD

Embodiments of the invention relate to the field of application software testing. More specifically, one embodiment of the disclosure relates to a system, apparatus and method for providing a user interface to visually display, in real-time, event/time indexed video that illustrates simulated operations of an application undergoing anomalous behavior detection analysis and a textual log synchronized with the video in accordance with execution flow.


GENERAL BACKGROUND

Malware typically features one or more programs or files that disrupt the operations of an infected electronic device, normally by attacking and adversely influencing its operations. In some instances, malware may, unbeknownst to the user, gather and transmit passwords and other sensitive information from the electronic device. In other instances, malware may alter the functionality of the electronic device without the user's permission. Examples of different types of malware may include bots, computer viruses, worms, Trojan horses, spyware, adware, or any other programming that operates within the electronic device without permission.


Over the last decade, various types of malware detection applications have been introduced in order to uncover the presence of malware within an electronic device, especially within software downloaded from a remote source and installed within the electronic device. However, these applications neither provide the ability to customize the behavioral analysis nor offer the benefits of a real-time, interactive visual display of such analysis.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is an exemplary block diagram of an embodiment of a communication system.



FIG. 2 is an exemplary block diagram of logic implemented within an anomalous behavior detection device being part of the communication system of FIG. 1.



FIG. 3 is an exemplary block diagram of logic within the application analyzer of FIG. 2.



FIG. 4A is an exemplary diagram of a flowchart partially illustrating the anomalous behavior detection analysis conducted by the application analyzer to generate, in real time, a video of simulated operations of a targeted software application in order to detect particular anomalous behaviors.



FIG. 4B is an exemplary diagram of a flowchart partially illustrating the anomalous behavior detection analysis conducted by the application analyzer to display the video of the simulated operations which is searchable/indexed according to particular anomalous behaviors.



FIG. 5 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 to obtain access privileges to anomalous behavior processes.



FIG. 6 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 to operate as a dashboard for the anomalous behavior detection analysis.



FIG. 7A is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 to upload an application or search for an application in an on-line store to be analyzed for anomalous behavior.



FIG. 7B is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 in which a search within one or more on-line stores is conducted for one or more versions of the “XYZ Messenger” application.



FIG. 7C is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that identifies the “XYZ Messenger” application being located within the on-line store(s) for use as the test application for the anomalous behavior detection analysis.



FIG. 8A is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that lists user-interaction test behaviors for analysis.



FIG. 8B is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that provides a user-interactive mechanism for sequencing and/or grouping anomalous behaviors for analysis.



FIG. 9 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that illustrates real-time activity during the anomalous behavior detection analysis.



FIG. 10 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that illustrates completion of the anomalous behavior detection analysis along with the final image or frame of the real-time analysis of the analyzed application and a dynamic textual log of the analyzed test behaviors.



FIG. 11 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that illustrates replay of video associated with the anomalous behavior detection analysis with event or behavioral searching and time-based indexing based on user interaction.



FIG. 12 is an exemplary embodiment of a user interface produced by the application analyzer of FIG. 3 that illustrates details of the results of the anomalous behavior detection analysis based on user interaction.





DETAILED DESCRIPTION

Various embodiments of the invention relate to a system, apparatus and method for providing a user interface to control a real-time visual display of an anomalous behavior detection analysis being conducted on simulated operations of application software running within one or more virtual machine (VM) emulated run-time test and observation environments (hereinafter the virtual “run-time environment”). For example, according to one embodiment, the visual display may be video depicting the simulated operations during this detection analysis, where the video is synchronized with a textual log displayed with the video.


The video features a multiple order indexing scheme, namely an index scheme that provides a user (viewer) with the ability to access additional data during rendering (display) of the video. For instance, the multiple order indexing scheme may include a first order and a second order of indexing, where the first order of indexing permits a user, by user interaction, to access a particular segment of video in accordance with either a particular playback time in the video or a particular analyzed event. An example of an “analyzed event” is a test behavior, namely a particular behavior being monitored during the anomalous behavior detection analysis. The second order of indexing provides a user, during display of the video, with information related to where the analyzed event occurs within the execution flow of the application software.
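
By way of illustration, the following minimal sketch (in Python, using hypothetical names and data structures not taken from the patent) shows one way such a two-order indexing scheme might be organized, with the first order supporting access by playback time or by analyzed event, and the second order reporting where an analyzed event falls within the execution flow:

    from dataclasses import dataclass

    @dataclass
    class VideoSegment:
        start_time: float    # playback offset (seconds) of the segment's first frame
        end_time: float
        behavior: str        # analyzed event (test behavior) covered by the segment
        flow_position: int   # step index within the execution flow

    class MultiOrderIndex:
        def __init__(self, segments):
            self.segments = segments

        # First order, variant (a): access a segment by playback time.
        def seek_by_time(self, t):
            return next(s for s in self.segments if s.start_time <= t < s.end_time)

        # First order, variant (b): access a segment by analyzed event.
        def seek_by_event(self, behavior):
            return next(s for s in self.segments if s.behavior == behavior)

        # Second order: where the analyzed event occurs in the execution flow.
        def flow_context(self, behavior):
            seg = self.seek_by_event(behavior)
            return f"{behavior!r} occurs at step {seg.flow_position} of the execution flow"

    index = MultiOrderIndex([
        VideoSegment(0.0, 42.0, "send SMS to any number", 1),
        VideoSegment(42.0, 85.0, "access suspicious domain", 2),
    ])
    print(index.seek_by_event("access suspicious domain").start_time)  # 42.0
    print(index.flow_context("access suspicious domain"))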


Hence, the video not only enables an administrator to visually witness anomalous behaviors that suggest the application software under test has malware, suspicious code or pernicious code, but also provides an administrator with evidence for use in policy enforcement and information to further refine (or harden) the anomalous behavior detection analysis and/or application software.


In the following description, certain terminology is used to describe features of the invention. For example, in certain situations, the terms “logic”, “engine” and “unit” are representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, logic may include circuitry such as processing circuitry (e.g., a microprocessor, one or more processor cores, a programmable gate array, a microcontroller, an application specific integrated circuit, etc.), wireless receiver, transmitter and/or transceiver circuitry, semiconductor memory, combinatorial logic, or other types of electronic components.


As software, logic may be in the form of one or more software modules, such as executable code in the form of an executable application, an application programming interface (API), a subroutine, a function, a procedure, an applet, a servlet, a routine, source code, object code, a shared library/dynamic load library, or one or more instructions. These software modules may be stored in any type of suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to, a programmable circuit; a semiconductor memory; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the executable code is stored in persistent storage.


It is contemplated that an electronic device may include hardware logic such as one or more of the following: (i) processing circuitry; (ii) a communication interface which may include one or more radio units (for supporting wireless data transmission/reception) and/or a physical connector to support wired connectivity; (iii) a non-transitory storage medium; and/or (iv) a display. Types of electronic devices may include mobile electronic devices (e.g., cellular smartphones, tablets, laptop computers, netbooks, etc.), stationary electronic devices (e.g., desktop computers, servers, controllers, access points, base stations, routers, etc.) that are adapted for network connectivity.


The term “transmission medium” is a communication path between two or more electronic devices. The communication path may include wired and/or wireless segments. Examples of wired and/or wireless segments include electrical wiring, optical fiber, cable, bus trace, or a wireless channel using infrared, radio frequency (RF), or any other wired/wireless signaling mechanism.


The term “video” is generally defined as a series of successive display images, including VM-emulated graphical representations (screenshots) of operations that would have been displayed if an electronic device executed the application software under test (“test application”) natively (e.g., if the application under test were executed on a mobile device OS). Hence, video may have a number of different formats, for example, a series of graphic images sequenced to represent a series of video frames; a series of compressed video frames in compliance with H.264, MPEG-2 or another video format; and a series of static images (such as a slide show) that together define a time-based sequence. The video may even be vector-based graphic representations that collectively produce an animated sequence of images.


The term “anomalous behavior” is directed to an undesirable behavior occurring during execution of application software, where a behavior may be deemed to be “undesirable” based on customer-specific rules, manufacturer-based rules, or any other type of rules formulated by public opinion or a particular governmental or commercial entity. This undesired behavior may include (1) altering the functionality of the device executing that application software in a malicious manner (malware-based behavior); (2) altering the functionality of the device executing that application software without any malicious intent (suspicious code-based behavior); and/or (3) providing an unwanted functionality which is generally acceptable in other contexts (pernicious code-based behavior). Examples of unwanted functionality by pernicious code may include tracking and/or disseminating user activity on the device (e.g., websites visited, email recipients, etc.), tracking and/or disseminating user location (e.g., global positioning system “GPS” location), privacy intrusion (e.g., accessing certain files such as contact lists), or the like.


For instance, as illustrative examples, an “anomalous behavior” may include a communication-based anomaly, such as an unexpected attempt to establish a network communication, unexpected attempt to transfer data (e.g., GPS data or other location data resulting in a privacy violation, contact lists, etc.), unexpected attempt to activate a video capture device (e.g., web camera), or unexpected activation of an audio capture device (e.g. microphone). Anomalous behavior also may include an execution anomaly, for example, an unexpected execution of computer program code, an unexpected Application Programming Interface (API) function call, an unexpected alteration of a registry key, or the like.


Lastly, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.


As this invention is susceptible to embodiment in many different forms, it is intended that the present disclosure be considered as an example of the principles of the invention and not intended to limit the invention to the specific embodiments shown and described.


I. General Architecture


Referring to FIG. 1, an exemplary block diagram of an embodiment of a communication system 100 is shown. Communication system 100 comprises an anomalous behavior detection device 110 for testing the safety/security of application software such as mobile device application software for example. As shown, anomalous behavior detection device 110 is communicatively coupled via a transmission medium 120 to an on-line store 130, namely one or more servers operating as a source from which application software may be downloaded to anomalous behavior detection device 110.


Furthermore, anomalous behavior detection device 110 is communicatively coupled via a transmission medium 140 to one or more electronic devices 150_1-150_N (N≥1). Through a graphics user interface (GUI) provided by anomalous behavior detection device 110, an administrator is able to (i) control the anomalous behavior detection analysis and (ii) watch, in real-time, VM-emulated operations of the test application in concert with analysis of the presence or absence of certain behaviors chosen to be monitored during such operations (hereinafter referred to as “test behaviors”).


As shown in FIG. 1, electronic device(s) 150_1-150_N may include an electronic device 150_1 communicatively coupled to transmission medium 140 via a wireless transmission medium 160. Alternatively, electronic device(s) 150_1-150_N may include electronic device 150_2 (N=2) that is communicatively coupled to transmission medium 140 via a wired transmission medium 170. As shown, electronic device 150_1 is a dual-mode cellular telephone while electronic device 150_N is a computer.


It is contemplated that communication system 100 may represent a dedicated anomalous behavior detection process for a particular network or subnetwork, being a part of a larger network. In such a deployment, anomalous behavior detection device 110 may be communicatively coupled to a central management system (CMS) 180, which communicatively couples communication system 100 along with other communication systems. This allows multiple communication systems to operate in tandem and exchange information as needed.


It is further contemplated that anomalous behavior detection device 110 may be deployed to provide cloud computing anomalous behavior detection services. Alternatively, anomalous behavior detection device 110 may be deployed as an appliance (electronic device) integrated as part of a local or enterprise network, or any combination thereof.


Referring now to FIG. 2, an exemplary block diagram of logic that is implemented within anomalous behavior detection device 110 is shown. Anomalous behavior detection device 110 comprises one or more processors 200 that are coupled to a communication interface logic 210 via a first transmission medium 220. Communication interface 210 enables communications with other electronic devices over private and/or public networks, such as a display device 190 used to view the results of the anomalous behavior detection analysis. According to one embodiment of the disclosure, communication interface 210 may be implemented as a physical interface including one or more ports for wired connectors. Additionally, or in the alternative, interface 210 may be implemented with one or more radio units for supporting wireless communications with other electronic devices.


Processor 200 is further coupled to persistent storage 230 via transmission medium 240. According to one embodiment of the disclosure, persistent storage 230 may include video storage unit 250, application analyzer logic 260, graphics user interface (GUI) logic 270, and identity verification logic 280. Of course, when implemented as hardware logic, any of these logic/units 250, 260, 270 and/or 280 would be implemented separately from persistent storage 230.


Application analyzer 260 is adapted to conduct testing of the safety/security of application software, including mobile device application software. Such testing involves at least analysis of one or more test behaviors in response to a sequence of simulated (e.g. VM-emulated) operations performed by the test application. From the analysis of these test behaviors, anomalous behaviors may be detected.


During this testing, application analyzer 260 also generates video by capturing display images (or frames), on a continuous or periodic sampling basis, produced during simulated operations of the test application during the anomalous behavior detection analysis. As the anomalous behavior detection analysis at least partially involves analysis of a sequence of test behaviors, a time stamp is associated with at least a first display image (or frame) of a video segment for each test behavior being analyzed. This enables the video to be indexed by time and by test behavior. The time stamp along with information directed to the corresponding test behavior is stored within a time-stamp storage unit 262 accessible to application analyzer 260 while the video may be stored in video storage unit 250 for later review. Of course, time-stamps may be applied to every display image (or frame) to provide greater precision on the location within the video where analysis for particular test behaviors is conducted.


Additionally, application analyzer 260 features index tracking logic 264 that is adapted to track and record which display images (or frames) of the video correspond to a particular test behavior being analyzed. For example, it is contemplated that index tracking logic 264 may include a table where each entry maintains an identifier (or index) associated with a particular display image (or frame) along with a corresponding identifier (or index) associated with a particular aspect of the test behavior being analyzed. As a result, the display of the video is synchronized with the display (and illustrated progress) of the analyzed test behaviors.
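
The recording side of this scheme can be pictured with a short sketch (hypothetical Python; the patent specifies only that time stamps and a frame-to-behavior table are maintained): each incoming display image is tagged with the behavior under analysis, and the first frame of each behavior's segment receives the time stamp that later serves as the index point:

    import time

    class IndexTracker:
        def __init__(self):
            self.behavior_start = {}  # test behavior -> time stamp of its first frame
            self.frame_table = {}     # frame identifier -> behavior being analyzed

        def record_frame(self, frame_id, behavior):
            # Stamp only the first frame of each behavior's video segment;
            # this is what permits indexing by time and by test behavior.
            self.behavior_start.setdefault(behavior, time.monotonic())
            self.frame_table[frame_id] = behavior

    tracker = IndexTracker()
    for frame_id, behavior in [(0, "record audio"), (1, "record audio"), (2, "access camera")]:
        tracker.record_frame(frame_id, behavior)
    print(tracker.frame_table[2])  # 'access camera'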


Furthermore, after completion of this testing, application analyzer 260 assigns a threat score to the test application. The threat score, ranging between minimum and maximum values (e.g., 0-10), represents the severity of the test behavior(s) detected during the anomalous behavior detection analysis. In other words, the threat score may be considered to represent the amount of potential harm that detected anomalous behavior(s) may cause an electronic device executing that test application.
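
As a rough illustration of such scoring (the patent states only that the score spans minimum and maximum values such as 0-10; the severity weights and the worst-behavior policy below are assumptions):

    # Hypothetical severity weights; not specified by the patent.
    SEVERITY = {
        "send SMS to premium number": 8,
        "access suspicious domain": 6,
        "access camera": 3,
    }

    def threat_score(detected_behaviors, floor=0, ceiling=10):
        """Map detected behaviors onto the 0-10 scale; here the score is
        driven by the most severe behavior observed."""
        if not detected_behaviors:
            return floor
        return min(ceiling, max(SEVERITY.get(b, 1) for b in detected_behaviors))

    print(threat_score(["access camera", "access suspicious domain"]))  # 6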


As further shown in FIG. 2, GUI logic 270 provides user interface screen displays for controlling the operational state of application analyzer 260 as described above. As examples, GUI logic 270 enables user-control of the anomalous behavior detection analysis by producing a behavior selection display screen (see FIG. 8A) and a behavior ordering display screen (see FIG. 8B). The behavior selection display screen enables user interaction as to the particular test behaviors for application analyzer 260 to monitor during simulated operations of the test application. The behavior ordering display screen allows the user to place these behaviors into a particular sequence or grouping. Also, GUI logic 270 produces user interface display screens to convey the anomalous behavior analysis results, including real-time display of (i) video representing simulated operations of the test application in concert with analysis of the presence or absence of anomalous behaviors and/or (ii) a textual log synchronized with the display of the video to show the progress and completion of the analyzed events and execution flow. In some embodiments, GUI logic 270 generates, for display contemporaneously (i.e., in a temporally overlapping manner) with the video, a textual log that provides information as to when each event occurs within an execution flow of the operations of the test application, and provides, during playback of the video on screen, reciprocal graphic interaction between the displayed video and the displayed textual log responsive to a user input.


Identity verification logic 280 is used to control authentication of users seeking to access application analyzer 260. Furthermore, identity verification logic 280 may set access privileges for each authenticated user, as certain users may have restricted access to only certain functionality offered by application analyzer 260. As an example, one user may have access to replay video stored in video storage unit 250, but be unable to initiate anomalous behavior analysis testing on application software. Another user, such as an administrator, may have complete access to all functionality offered by application analyzer 260.


Referring now to FIG. 3, an exemplary block diagram of logic within application analyzer 260 of FIG. 2 is shown. Herein, application analyzer 260 comprises (1) a static instrumentation engine 300; (2) a dynamic run-time test and observation (RTO) engine 320; and (3) behavior setting logic 370. As shown, static instrumentation engine 300 and dynamic RTO engine 320 are deployed within the same device. However, it is contemplated that static instrumentation engine 300 and dynamic RTO engine 320 may be employed within different devices and/or executed by different processors when implemented as software.


Static instrumentation engine 300 receives a test application (APPN) 305 and generates a representation of test application 305 that is analyzed with one or more various software analysis techniques (e.g., control information analysis, or data analysis). Static instrumentation engine 300 then modifies the application code itself to include special monitoring functions and/or special stimuli functions operable during execution of the test application in dynamic run-time test and observation engine 320. The monitoring functions report their results to control logic 325 and the stimuli functions are told what stimuli to generate by control logic 325. During such analysis by static instrumentation engine 300, video 310 is captured and/or other graphics related to the analysis are generated and provided to GUI logic 270 to produce one or more user interface display screens. Furthermore, video 310 is stored in video storage unit 250 for subsequent playback.
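
As a loose analogue of this instrumentation step (a Python decorator standing in for the code rewriting that the patent performs on the application itself, with all names hypothetical), a monitoring hook can wrap a function of interest, let control logic supply stimuli, and report each result back:

    def instrument(report, stimulus=None):
        """Wrap a function with a monitoring hook and an optional stimulus
        function, mimicking the special functions woven into the test app."""
        def wrap(fn):
            def wrapper(*args, **kwargs):
                if stimulus is not None:
                    args = stimulus(args)              # control logic chooses the stimuli
                result = fn(*args, **kwargs)
                report(fn.__name__, args, result)      # monitoring result to control logic
                return result
            return wrapper
        return wrap

    events = []

    @instrument(report=lambda name, args, res: events.append((name, args, res)))
    def send_sms(number, body):
        return f"SMS to {number}: {body}"

    send_sms("+15551234", "hello")
    print(events)  # [('send_sms', ('+15551234', 'hello'), 'SMS to +15551234: hello')]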


It is contemplated that static instrumentation engine 300 may be adapted to receive information from dynamic RTO engine 320 in order to instrument the code to better analyze specific behaviors targeted in the heuristics and/or probability analysis.


After processing is completed by static instrumentation engine 300, test application 305 is then provided to control logic 325 within dynamic RTO engine 320. Control logic 325 operates as a scheduler to dynamically control the anomalous behavior detection analysis among different applications and/or the same application software among different virtual run-time environments. Furthermore, control logic 325 maintains time-stamp storage unit 262 and index tracking logic 264 as previously described.


In general, dynamic RTO engine 320 acts as an intelligent testing function. According to one approach, dynamic RTO engine 320 recursively collects information describing the current state of test application 305 and selects a subset of rules, corresponding at least in part to the test behaviors set by the user, to be monitored during virtual execution of test application 305. The strategic selection and application of various rules over a number of recursions in view of each new observed operational state permits control logic 325 to resolve a specific conclusion about test application 305, namely a threat score denoting whether the application is “safe” or “unsafe”.
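
The recursion described above might be pictured as follows (a sketch under assumed rule and state shapes; the actual rule engine is not disclosed by the patent):

    def analyze(state, rules, depth=0, max_depth=5):
        """Recursively observe the current state, apply the subset of rules
        that match it, and accumulate evidence toward a threat verdict."""
        if depth == max_depth:
            return 0
        applicable = [r for r in rules if r["when"](state)]
        score = sum(r["weight"] for r in applicable)
        for r in applicable:
            state = r["next_state"](state)   # each observation yields a new state
        return score + (analyze(state, rules, depth + 1, max_depth) if applicable else 0)

    rules = [{
        "when": lambda s: s.get("network") == "suspicious domain",
        "weight": 6,
        "next_state": lambda s: {**s, "network": None},
    }]
    score = analyze({"network": "suspicious domain"}, rules)
    print("unsafe" if score > 5 else "safe")  # unsafe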


As shown in FIG. 3, dynamic RTO engine 320 comprises a virtual machine repository 330 that is configured to store one or more virtual machines 340_1-340_P (where P≥1). More specifically, virtual machine repository 330 may be adapted to store a single virtual machine (VM) that can be configured by scheduling functionality within control logic 325 to simulate the performance of multiple types of electronic devices. Virtual machine repository 330 also can store any number of distinct VMs each configured to simulate performance of a different electronic device and/or different operating systems (or versions) for such electronic devices.


One or more virtual run-time environments 350 simulate operations of test application 305 to detect anomalous behavior produced by this application. For instance, run-time environment 355_1 can be used to identify the presence of anomalous behavior during analysis of simulated operations of test application 305 performed on a virtual machine 340_1. Of course, there can be multiple run-time environments 355_1-355_M (M≥2) to simulate multiple types of processing environments for test application 305.


A virtual machine may be considered a representation of a specific electronic device that is provided to a selected run-time environment by control logic 325. In one example, control logic 325 retrieves virtual machine 340_1 from virtual machine repository 330 and configures virtual machine 340_1 to mimic an Android® based smart phone. The configured virtual machine 340_1 is then provided to one of the run-time environments 355_1-355_M (e.g., run-time environment 355_1).
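
A minimal sketch of this retrieve-configure-provide flow (profile names and fields are illustrative only, not taken from the patent):

    class VirtualMachineRepository:
        def __init__(self):
            # A single stored VM image, configurable per device profile.
            self._profiles = {"android_smartphone": {"os": "Android 4.2", "ram_mb": 2048}}

        def checkout(self, profile):
            config = dict(self._profiles[profile])
            config["state"] = "configured"
            return config

    def dispatch(repo, profile, free_environments):
        """Control logic retrieves and configures a VM, then hands it to a
        free run-time environment."""
        vm = repo.checkout(profile)
        env = free_environments.pop(0)
        env["vm"] = vm
        return env

    envs = [{"name": "runtime_env_1"}]
    print(dispatch(VirtualMachineRepository(), "android_smartphone", envs))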


As run-time environment 355_1 simulates the operations of test application 305, virtual machine 340_1 can be closely monitored for any test behaviors set by the user (or set by default) in behavior setting logic 370. By simulating the operations of test application 305 and analyzing the response of virtual machine 340_1, run-time environment 355_1 can identify known and previously unidentified anomalous behavior and report the same through the indexed video and a dynamic textual log.


Besides VM 340_1, run-time environment 355_1 is provided test application 305 along with an instance 360 of the test application (App) and an instance 365 of the type of operating system on which test application 305 will run if deemed sufficiently safe during the dynamic anomalous behavior detection process. Here, the use of virtual machines (VMs) permits the instantiation of multiple additional run-time environments 355_1-355_M, each having its own test application and OS instance, where the various run-time environments 355_1-355_M are isolated from one another.


As previously described, the simultaneous existence of multiple run-time environments 355_1-355_M permits different types of observations/tests to be run on a particular test application. That is, different instances of the same test application may be provided in different run-time environments so that different types of tests/observances can be concurrently performed on the same application. Alternatively, different test applications can be concurrently tested/observed.


For instance, a first application may be tested/observed in a first run-time environment (e.g., environment 355_1) while a second, different application is tested/observed in another run-time environment (e.g., environment 355_M). Notably, instances of different operating system types and even different versions of the same type of operating system may be located in different run-time environments. For example, an Android® operating system instance 365 may be located in first run-time test environment 355_1 while an iOS® operating system instance (not shown) may be located in a second run-time test environment 355_M. Concurrent testing of one or more test applications (whether different instances of the same application or respective instances of different applications or some combination thereof) enhances the overall performance of the communication system.


II. Anomalous Behavior Analysis and Video Generation/Playback


Referring to FIG. 4A, an exemplary diagram of a flowchart is shown, partially illustrating the anomalous behavior detection analysis conducted by the application analyzer, which generates, in real time, video that may capture anomalous behavior detected in response to simulated operations of the test application and may provide a visual correlation of the anomalous behavior with the video segment at which it occurred. However, prior to conducting this anomalous behavior analysis, the anomalous behavior detection device receives a message from an electronic device requesting access to the application analyzer. In response, a first user interface (login) display screen (see FIG. 5) is provided by GUI logic within the anomalous behavior detection device. After authentication of the user operating the electronic device and/or the electronic device initiating the request message, the GUI logic fetches heuristic data related to operations previously and currently being performed by the application analyzer. Such heuristic data is provided to the GUI logic to generate textual and/or visual representations displayed in a second user interface (dashboard) display screen (see FIG. 6).


Next, upon user interaction with the second user interface display screen (e.g. selection by the user of a particular object), the GUI logic provides a third user interface display screen (See FIGS. 7A-7C) that enables a user to select the test application, which may be uploaded from a web server (or an application database accessible to the application analyzer), or retrieved by searching an on-line store for that application (block 400). Once the test application is received by the application analyzer, a determination is made as to whether default test behaviors are to be used for the anomalous behavior detection analysis (blocks 410 and 420). If not, the GUI logic provides user interface display screens that enable modification of the test behaviors through user interaction (e.g. by selecting and deselecting listed behaviors that are available for analysis as well as altering the sequence order or groupings in the analysis of the behaviors) as set forth in block 425.


Once the test behaviors for the anomalous behavior detection analysis are set, the application analyzer virtually processes the test application to detect anomalous behavior (block 430). The simulated operations conducted during the virtual processing of the test application produce video, which is sent to the GUI logic for rendering a fourth user interface display screen in real time (block 435). Additionally, a textual log is generated that provides information as to what events (e.g., test behaviors) are being analyzed and when the analyzed events occur within the execution flow of the application software. This information may be provided through the placement and ordering of display objects corresponding to test behaviors alongside the video, corresponding to the order of display images (or frames) rendered during the simulated operations of the test application.
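
One plausible shape for such a log (hypothetical; the patent describes the displayed result, not the data structure) keeps each entry pinned to the frame at which analysis of its behavior began, so the log and the video advance together:

    from dataclasses import dataclass

    @dataclass
    class LogEntry:
        behavior: str
        status: str       # 'analyzing' | 'clean' | 'detected'
        first_frame: int  # frame at which analysis of this behavior began

    def update_log(log, frame_id, behavior, detected=None):
        """Create or advance the entry for a behavior in lock-step with the
        frames being rendered."""
        for entry in log:
            if entry.behavior == behavior:
                if detected is not None:
                    entry.status = "detected" if detected else "clean"
                return
        log.append(LogEntry(behavior, "analyzing", frame_id))

    log = []
    update_log(log, frame_id=120, behavior="add/delete files on storage")
    update_log(log, frame_id=185, behavior="add/delete files on storage", detected=False)
    print(log[0])  # status='clean', first_frame=120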


As a result, progress changes in the anomalous behavior analysis displayed by the video are synchronized with progress changes shown by the textual log. Concurrently with or subsequent to the supply of the video to the GUI logic, the video is provided to video storage unit for storage and subsequent retrieval for playback (block 440).


Referring to FIG. 4B, an exemplary diagram of a flowchart partially illustrating the replay of video produced by the application analyzer performing anomalous behavior detection analysis is shown, where the video is indexed according to the particular test behaviors. As illustrated, upon conducting playback of video associated with the anomalous behavior analysis conducted on the test application, a determination is made whether the playback is directed to viewing a particular test behavior (blocks 450 and 460). If not, the video commences playback at the beginning or at an elapsed start time selected by the user (blocks 470, 472 and 474). However, if the playback is directed to viewing video associated with a particular test behavior, the application analyzer accesses a time stamp associated with a first frame for a video segment corresponding to the test behavior and uses the time-stamp to index a starting point for the video playback (block 475).
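
The decision in blocks 450-475 reduces to resolving a starting offset for the replay, sketched here (names and the request shape are assumptions):

    def playback_start(request, behavior_timestamps):
        """Resolve where replay begins: at the time stamp of the first frame
        of a behavior's video segment, at a user-chosen elapsed time, or at
        the beginning."""
        if "behavior" in request:
            return behavior_timestamps[request["behavior"]]
        return request.get("start_time", 0.0)

    stamps = {"add/delete files on storage": 85.0}
    print(playback_start({"behavior": "add/delete files on storage"}, stamps))  # 85.0
    print(playback_start({"start_time": 30.0}, stamps))                         # 30.0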


Thereafter, playback of the video continues unless disrupted by video playback alteration events (e.g., Pause, Stop, Fast-Forward, Reverse, etc.), in which case playback of the video is halted to service these events (blocks 480, 482, 484 and 486). Once playback of the video has completed, this playback session ends (block 490). The user may be provided the opportunity to commence a new playback session or select another video.


III. User Interface Display Screens to Control the Application Analyzer


Referring now to FIG. 5, an exemplary embodiment of a first user interface (Login) display screen 500 produced by application analyzer 260 of FIG. 3 is shown. Herein, in order to gain access to the application analyzer to perform anomalous behavior detection analysis, the user initially establishes a network connection with the anomalous behavior detection device. This network connection may be established in accordance with Hypertext Transfer Protocol (HTTP) or HTTP Secure (HTTPS) communication protocols.


As shown, an initial request for access to the application analyzer is redirected to login display screen 500 that features at least two entry fields; namely a User Name 510 and a Password 520. The User Name entry field 510 requires the user to enter a registered user name in order to identify the user seeking access to the application analyzer. Password entry field 520 allows the user to enter his or her password.


Once a login object 530 is selected by the user, the user name and password are provided to identity verification logic 280 of FIG. 2 within anomalous behavior detection device 110. Once the user is verified by identity verification logic 280, access privileges for that user are set and the user is provided with a second user interface display screen 600 as shown in FIG. 6.


As shown in FIG. 6, an exemplary embodiment of second user interface display screen 600 produced by the application analyzer of FIG. 3 to operate as a dashboard is shown. Herein, dashboard display screen 600 comprises a plurality of areas 610, 640 and 670 that display results of anomalous behavior analysis testing over a selected time period.


For instance, first area 610 displays a plurality of objects that provide information directed to application software that has been analyzed or is currently being analyzed within a first selected time period (e.g., 24 hours). Provided by the application analyzer to the GUI logic for rendering, the information associated with these objects identifies: (1) the number of applications submitted (object 620); (2) the number of applications analyzed (object 622); (3) the number of applications currently being analyzed (object 624); (4) the number of applications analyzed according to customized rule settings (object 626); and (5) the number of “unsafe” applications detected (object 628). Some or all of these numeric values are stored for a period of time that may be set by the manufacturer or the user.


It is contemplated that the first selected time period may be adjusted through a drop-down list 615 that features multiple time periods using the current time as a reference (e.g., 24 hours ago, 1 week ago, 1 month ago, 3 months ago, 1 year ago, etc.). However, although not shown, drop-down list 615 may also feature user interaction to select the start and end time periods for the first selected time period.


Second area 640 provides graphical depictions of application software analyzed over a second selected time period 645, which may differ from the selected time period for first display area 610. As shown, a first graphical depiction 650 represents a line graph that identifies different categories of analyzed applications (vertical axis) analyzed at different times within the selected time period (horizontal axis). The different categories include (1) “safe” applications 652 (applications with a threat score not greater than a predetermined threshold); (2) unsafe applications 654 (applications with a threat score greater than the predetermined threshold); and (3) applications submitted for analysis 656.


A second graphical depiction 660 represents a bar graph directed to applications that have completed their anomalous behavior analysis testing. For this bar graph, the horizontal axis represents the measured threat score (0-10) while the vertical axis represents the number of analyzed applications associated with the measured threat score.


A third graphical depiction 665 represents a pie chart also directed to applications that have completed their anomalous behavior analysis testing. A first color 667 denotes those applications having a threat score indicating the application is considered “safe” for use while a second color 668 denotes those applications having a threat score that identifies them as being “unsafe” for use.


Third area 670 provides a graphical and/or textual depiction entry 675 for each application that has been analyzed or is in the process of being analyzed. Each entry 675 includes a plurality of parameters, including at least three or more of the following: (1) date the application was submitted 680; (2) application name 681; (3) status (safe, unsafe, complete with error, in progress) 682; (4) threat score 683; and (5) custom rule matching status 684. The order of these entries can be adjusted according to submission date, application name (alphabetically), status, or threat score.


With respect to the status parameter 682, currently, there are four status levels. As previously mentioned, “safe” is a status level assigned to applications having a threat score no greater than a predetermined threshold while “unsafe” is a status level assigned to applications having a threat score greater than the predetermined threshold, normally indicating the presence of malware or some sort of suspicious or pernicious code that causes behaviors unsuitable for the targeted device. Another status level is “in progress”, which indicates that the corresponding application is currently undergoing the anomalous behavior analysis. Lastly, “complete-error” is a status level which identifies that an anomalous behavior has been detected, but the risk level may vary widely depending on the particular customer.


For instance, as an illustrative example, for application software that establishes a network connection to a server for upgrades, without any malicious intent, the assigned level of risk would be minimal for most clients. However, where the electronic device is for use by a high-ranking governmental official, any unknown network connectivity may be assigned a high risk. Hence, the test application is assigned the “complete-error” status with a medium threat score upon detecting a test behavior that the anomalous behavior detection analysis considers customer dependent. This status level encourages user interaction (e.g., selecting the “Go To Details” link located next to the threat score) to obtain a more detailed explanation of the findings associated with the threat score, although more detailed explanations are provided for all status levels.


Referring now to FIGS. 7A-7C, exemplary embodiments of a third user interface display screen 700, which is produced by the application analyzer to provide upload and search capabilities for the test applications to be analyzed for anomalous behavior, is shown. Herein, FIG. 7A illustrates screen display 700 that is generated in response to user interaction (e.g. selection of a particular menu object 705). According to this embodiment of the disclosure, third user interface display screen 700 comprises an upload display area 710, a search display area 730 and a submission area 760.


Upload display area 710 enables the user to enter addressing information (e.g. Uniform Resource Locator “URL”, File Transfer Protocol “FTP” address, etc.) with an input field 715. Thereafter, once the “Submit” object 720 is selected, an HTTP Request message is sent in order to fetch the test application from the website or database specified by the addressing information.


Search display area 730 features an input field 735 into which the user can enter the name (or at least a portion of the name) of the test application. For instance, as shown in FIG. 7B, application software entitled “XYZ Messenger” is input into input field 735. A drop-down list 740 enables the user to select from a list of on-line stores from which to search and acquire the XYZ Messenger as the test application. These on-line stores may include Google® Play® store, Apple® App Store™, Amazon® Appstore, Windows® Phone store or BlackBerry® World™ app store, or combinations of such on-line stores. After entering at least a portion of the application name and selecting the on-line store, a “Search” object 745 is selected. This activates a web browser to search for the identified application software at websites associated with the selected on-line store(s).


Referring to FIG. 7C, if the on-line store has a copy of the test application, the test application is returned and displayed as object 750 along with metadata 755 associated with the test application (e.g., publisher name, size, version type, OS type supported, or user rating). It is contemplated that, if the on-line store has multiple versions of the test application (XYZ Messenger), all versions are returned to the application analyzer and displayed. This allows user interaction to select the particular version to undergo anomalous behavior analysis, and based on certain activity, such as upon selecting the “Submit for Analysis” object 780, the anomalous behavior analysis of the test application begins. This enables the user to upgrade and downgrade applications to whatever version is desired.


Referring back to FIG. 7A, submission area 760 displays objects 765 that identify applications that have been analyzed for anomalous behavior or are currently being analyzed for anomalous behavior. It is contemplated that, based on user interaction, each of these objects may extract either (1) a website (or server address) from which the application was obtained along with the application name (and perhaps its version number) for insertion into input field 715 or (2) the application name (and perhaps its version number) for insertion into input field 735. This enables the user to conduct searches for updates to the particular application software without having to re-enter information to locate that application.


Referring now to FIG. 8A, an exemplary embodiment of a behavior display screen 800 produced by application analyzer 260 of FIG. 3 that lists selectable test behaviors for anomalous behavior analysis is shown. In response to user interaction (e.g., after selecting the “Submit” object 720 within upload display area 710 or “Search” object 745) and upon retrieval of the test application, application analyzer uploads available test behaviors into the GUI logic to produce behavior display screen 800. As shown, behavior display screen 800 comprises information 805 to identify the test application, a drop-down list 810 identifying a selected operating system (e.g., Android® 4.2) to be emulated by the VM when conducting the anomalous behavior analysis testing, and a listing of test behaviors 815 supported by the application analyzer.


For ease of illustration, only some of the test behaviors are set forth in FIG. 8A. As a default setting, certain test behaviors are pre-selected for anomalous behavior analysis, although it is contemplated that each listed test behavior may be subsequently selected or deselected by the user. Examples of the test behaviors are shown below in Table A, by category and behavior description.












TABLE A

Category         Behavior Description

PHONE            Incoming/outgoing call notification
                 Make/receive calls

SMS              Send SMS to any number
                 Send SMS to premium number
                 Receive SMS (or be notified of incoming/outgoing messages)
                 Modify/delete SMS
                 Leak SMS contents

NETWORK          Access suspicious domain

LOCATION         Access coarse location
                 Access fine location
                 Leak geo-location

USER ACCOUNTS    Access multimedia (photos/videos/documents)
                 Leak contacts

CAMERA           Access camera

MICROPHONE       Leak recorded audio
                 Record audio

BLUETOOTH/NFC    Access Bluetooth/NFC device
                 Pair with external devices

FILESYSTEM       Add/delete files on storage
                 Execute arbitrary files
                 Modify system folder contents

FRAMEWORK        Bypass framework/access internal APIs
                 Access kernel drivers

SYSTEM           Install other apps
                 Wipe user data
                 Wipe cache
                 Access data of other apps
                 Gain root access

Custom           Custom sequence #1

As further shown in FIG. 8A, six (6) test behaviors are set for the anomalous behavior detection analysis, namely (1) send Short Message Service (SMS) message to any number 820; (2) access suspicious domain 821; (3) add/delete files in local storage 822; (4) install other applications 823; (5) record audio 824; and (6) access camera 825. After the test behaviors are set, based on user interaction (e.g., the user selects the “Start Analysis” object 830), the anomalous behavior detection analysis commences.


As further shown in FIG. 8A, based on user interaction (e.g., selection of the “create new behavior” link 840), a user-interactive display screen is provided for sequencing and/or grouping of test behaviors for analysis, as shown in FIG. 8B.


Referring to FIG. 8B, an exemplary embodiment of a behavior group/sequence display screen 850 produced by the application analyzer of FIG. 3 is shown. As shown, display screen 850 provides a user-interaction mechanism for sequencing and/or grouping test behaviors for analysis. The sequence and/or grouping are used by the application analyzer to customize when test behaviors are monitored during simulated operations of the test application in one or more run-time environments.


More specifically, sequence based analysis builder 860 provides a listing 865 of test behaviors chosen by the user as illustrated in FIG. 8A and allows the user to click and drag any test behavior 815 within listing 865 to alter its position within the listing. The sequence order of the test behaviors (from top-to-bottom) defines the order of processing as represented by a textual and graphical representation 870.


Similarly, group & sequence based analysis builder 875 enables use of test behaviors 815 to formulate groupings using logical operators (AND, OR) 880. Test behaviors 815 may be dragged into position along with logical operators 880.
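
Such groupings amount to a small boolean expression over test behaviors; a minimal evaluator (the tuple-tree shape is an assumption, since the patent shows only the drag-and-drop interface) might look like:

    def evaluate(expr, detected):
        """Evaluate a user-built grouping tree against the set of behaviors
        detected so far."""
        op = expr[0]
        if op == "BEHAVIOR":
            return expr[1] in detected
        results = [evaluate(sub, detected) for sub in expr[1:]]
        return all(results) if op == "AND" else any(results)

    grouping = ("OR",
                ("BEHAVIOR", "record audio"),
                ("AND", ("BEHAVIOR", "access camera"),
                        ("BEHAVIOR", "access suspicious domain")))
    print(evaluate(grouping, {"access camera", "access suspicious domain"}))  # True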


Referring now to FIG. 9, an exemplary embodiment of fourth user interface display screen 900, which is produced by the application analyzer of FIG. 3 and illustrates real-time activity during the anomalous behavior detection analysis, is shown. As illustrated, fourth user interface display screen 900 is produced by user interaction, such as in response to selection of any of the “Start Analysis” objects 830, 890 and 895 of FIGS. 8A and 8B, for example. Display screen 900 comprises a plurality of display areas that are dynamically updated during virtual processing of the test application. Such updates may be performed contemporaneously and in real time (e.g., <1 sec. between successive updates), although other update mechanisms may be used in which updates are performed less often. These display areas include a video display area 920, a log display area 940 and a progress bar display area 960.


Video display area 920 is allocated to display video 925, which captures simulated operations of the test application (XYZ Messenger) during the anomalous behavior detection analysis in concert with analysis of the presence or absence of the selected events. During anomalous behavior analysis of the XYZ Messenger by one or more VMs in a run-time environment, as illustrated in FIGS. 2 and 3, the application analyzer uploads video 925 to the GUI logic, which renders, in real time, video 925 within video display area 920. The video 925 may illustrate static and/or dynamic testing of XYZ Messenger for anomalous behavior. The progress of the anomalous behavior detection analysis is represented by progress bar 965, where such progress is determined by the application analyzer.


Synchronous with playback of video 925, a textual log 945 is provided to identify the execution flow and which test behaviors have been completed, are awaiting analysis or are currently being analyzed. The display of interface display screen 900, especially textual log 945, may be conducted in two different modes: Regular Display mode and Analyst Display mode. In Regular Display mode, only detected anomalous behaviors, such as suspicious or important events/results, are shown/listed. In Analyst Display mode, all events occurring in the application are shown/listed, including those related only to the execution of the application and those events that would have been forced by the mobile electronic device.


For completed test behaviors, during Analyst mode for example, a first image (check mark) 950 is rendered to identify that the test behavior was not present; a second image (“X”) 952 is rendered to identify that the test behavior was detected; a third image (“Δ”) 954 is rendered where, at this point in the analysis, the test behavior has not yet been analyzed; and a fourth image (progress bar) 956 is rendered where the test behavior is currently being analyzed. The updating of entries within textual log 945 is synchronized with video 925 being displayed.
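
Mapping these analysis states to the displayed indicia is straightforward; a sketch (the glyph choices only approximate the figures):

    # Check mark = not present, X = detected, delta = not yet analyzed,
    # bar = currently being analyzed (per FIG. 9).
    STATUS_GLYPH = {"clean": "✓", "detected": "X", "pending": "Δ", "analyzing": "[===  ]"}

    def render_log_line(behavior, status):
        return f"{STATUS_GLYPH[status]} {behavior}"

    print(render_log_line("send SMS to any number", "detected"))  # X send SMS to any number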


Referring to FIG. 10, an exemplary embodiment of a user interface display screen 1000 produced by the application analyzer of FIG. 3 is shown. User interface display screen 1000 illustrates completion of the anomalous behavior analysis testing by display of a completion message 1010 and a final image or frame 1020 of video 925, with no progress bars being present in textual log 945. Herein, completion message 1010 is rendered that identifies (i) whether the test application has been successfully analyzed and (ii) whether the test application is “safe” or “unsafe”. The findings for each particular test behavior, represented by indicia (e.g., symbols, color, etc.) along with the elapsed time 1025 of that test behavior in the video, are set forth in the completed textual log 945.


User interface display screen 1000 provides a first object (REPLAY) 1030 that, based upon user interaction, signals the GUI logic to replay video 925 as shown in FIG. 11. Based on user interaction with first object 1030, video 925 is replayed from the start, where a time bar 1040, positioned below video 925, may be used to replay certain segments of video 925 at selected times. Textual log 945 is synchronized with video 925 to illustrate the status of different test behaviors in accordance with the default sequence or the sequence selected by the user as illustrated in FIGS. 8A and 8B. The illustration of status may be through images or by highlighting the text description of the test behavior (e.g., bold, different colors, different font types, etc.).


In general terms, the video replay provides context for each event to explain away or confirm certain anomalous behaviors in light of what images (screenshots) may have been displayed or what user interactions may have occurred. Some applications exhibit anomalies which may be viewed/verified as unwanted behaviors depending on when/where in the application the event occurred (e.g., audio recording started when expected or at an unexpected time, or whether a permission is noted in a manifest). In order to provide such context, the displayed images of video 925 may capture the display output of the application software for at least a period of time (window) before and after an event included in the displayed textual log 945 has occurred.
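
A rolling buffer captures this before-and-after window; the sketch below (buffer sizes arbitrary, names hypothetical) keeps recent frames and, on each logged event, emits a clip spanning frames on both sides of it:

    from collections import deque

    class ContextRecorder:
        """Retain display frames for a window before and after each logged
        event so replay shows the surrounding context."""
        def __init__(self, pre_frames=30, post_frames=30):
            self.pre = deque(maxlen=pre_frames)  # rolling pre-event buffer
            self.post_frames = post_frames
            self.clips = []
            self._current = None
            self._remaining = 0

        def on_frame(self, frame):
            if self._remaining > 0:
                self._current.append(frame)
                self._remaining -= 1
                if self._remaining == 0:
                    self.clips.append(self._current)
            self.pre.append(frame)

        def on_event(self, name):
            # Seed the clip with the buffered pre-event frames.
            self._current = [name, *self.pre]
            self._remaining = self.post_frames

    rec = ContextRecorder(pre_frames=2, post_frames=2)
    for f in range(5):
        rec.on_frame(f)
    rec.on_event("record audio")
    for f in range(5, 8):
        rec.on_frame(f)
    print(rec.clips)  # [['record audio', 3, 4, 5, 6]]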


Referring back to FIG. 10, one or more displayed test behaviors in textual log 945 are user interactive. When selected by the user, the GUI logic replays video 925 starting at the time in the anomalous behavior analysis when monitoring of the selected test behavior commenced. This start time may be obtained by extracting a time stamp associated with the first captured image (or frame) of video 925 when the anomalous behavior analysis began to monitor for the selected test behavior. For example, upon user interaction with a third test behavior in the sequence (e.g., add/delete files on storage) as shown in FIG. 11, video 925 commences at an elapsed time of 1:25 minutes for this test behavior, as represented by a blank progress bar 1110.


Additionally, display screen 1100 features a search field 1120 that enables the user to search for a particular event or test behavior at a particular point in the video replay. Also, an activity graph 1130 identifies the activities (e.g., number and frequency of API function calls, Java™ events, etc.) during the testing period for the anomalous behavior detection analysis. The particular activities may be obtained by selecting activity graph 1130 to denote a request for deeper analysis of the findings from the anomalous behavior detection analysis.


Referring back to FIG. 10, user interface display screen 1000 further signals the GUI logic, based on user interaction (e.g., selection of a second object (SHOW DETAILED ANALYSIS) 1040 by the user), to produce a screen display 1200 with a summary of test behavior failures as set forth in FIG. 12. Screen display 1200 comprises metadata 1210; alerts 1220 based on test behaviors where the security risk for the test behavior may vary for different customers; a listing of permissions 1230 requested during the anomalous behavior detection analysis; and a scrolling log 1240 outlining the successes and failures of custom defined rules, similar in form to textual log 945 for the test behaviors as shown in FIG. 10.


In the foregoing description, the invention is described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the present invention as set forth in the appended claims. The specification and drawings are accordingly to be regarded in an illustrative rather than in a restrictive sense.

Claims
  • 1. A method for detecting one or more behaviors by software under test that correspond to at least one anomalous behavior, the method comprising: performing an analysis of operations conducted by the software being processed, the analysis includes at least monitoring one or more behaviors conducted by the software during processing; generating a video illustrative of at least the one or more monitored behaviors conducted by the software during processing, the video being generated in accordance with an indexing scheme including both a first order of indexing that permits user access to a selected segment of a plurality of segments forming the video and a second order of indexing that provides information related to when at least one monitored behavior of the one or more monitored behaviors occurs during an execution flow of the software; and generating text information associated with each of the one or more monitored behaviors being analyzed and when each of the one or more monitored behaviors occurs during the execution flow, the text information being displayed on an electronic device contemporaneously with the video illustrative of the one or more monitored behaviors, wherein the contemporaneous display of the video and the text information enables visual monitoring of a segment of the video during an occurrence of the at least one anomalous behavior that is part of the one or more monitored behaviors and indicates the software potentially includes malware or suspicious code or pernicious code.
  • 2. The method of claim 1, wherein the generating of the video comprises capturing display output of the software during processing of the software within a virtual machine, the display output including the plurality of segments forming the video.
  • 3. The method of claim 1, wherein the performing of the analysis of the operations conducted by the software further comprises determining whether any behavior of the one or more behaviors is an anomalous behavior indicating that the software includes malware or suspicious code or pernicious code.
  • 4. The method of claim 1, wherein the first order of indexing permits user access in accordance with playback time in the video.
  • 5. The method of claim 1, wherein the video corresponds to a plurality of display images that would have been displayed if the software executed natively.
  • 6. The method of claim 1, wherein the first order of indexing permits user access in accordance with a particular monitored behavior of the one or more monitored behaviors.
  • 7. The method of claim 1, wherein the performing of the analysis of operations conducted by the software being processed by a virtual machine comprises: generating a display screen for selecting and setting the one or more monitored behaviors; performing operations of the software within the virtual machine; and monitoring the one or more monitored behaviors to determine whether at least one of the one or more behaviors occurs during analysis of the operations of the software performed within the virtual machine.
  • 8. The method of claim 7, wherein the generating of the display screen comprises presenting a user interface to enable a user to customize a grouping or sequencing of behaviors of the one or more behaviors to be monitored.
  • 9. The method of claim 1 being performed within a virtual machine running in a mobile electronic device, wherein the software is designed for native execution on the mobile electronic device.
  • 10. The method of claim 1 further comprising indexing of the video so as to permit a user, by user interaction, to access a desired display image of the video in accordance with at least one of (i) a playback time and (ii) an analyzed event of the one or more events.
  • 11. The method of claim 1, wherein the conducting of the analysis for the occurrence of the one or more events comprises monitoring for one or more anomalous behaviors each being an undesirable behavior corresponding to (i) a malware-based behavior directed to altering functionality of the electronic device executing the software in a malicious manner, or (ii) a suspicious code-based behavior directed to altering the functionality of the electronic device executing the software without any malicious intent, or (iii) a pernicious code-based behavior directed to providing an unwanted functionality that is generally acceptable in other contexts.
  • 12. An apparatus for detecting one or more behaviors by software under test that correspond to at least one anomalous behavior, the apparatus comprising: a processor; and a non-transitory storage medium communicatively coupled to the processor, the non-transitory storage medium comprises logic that, when executed by the processor, (i) performs an analysis of operations conducted by the software being processed by a virtual machine, the analysis includes at least monitoring one or more behaviors conducted by the software during processing within the virtual machine, (ii) generates a video illustrative of at least the one or more monitored behaviors conducted by the software during processing of the software within the virtual machine, and (iii) generates text information associated with each of the one or more monitored behaviors, wherein the video is generated in accordance with an indexing scheme including both a first order of indexing that permits user access to a selected segment of a plurality of segments forming the video and a second order of indexing that provides the text information related to at least one monitored behavior of the one or more monitored behaviors and when the at least one monitored behavior occurs within the execution flow of the software, and wherein the text information is displayed on an electronic device contemporaneously with the video illustrative of the one or more monitored behaviors to enable visual monitoring of the one or more monitored behaviors for the at least one anomalous behavior that indicates the software potentially includes malware or suspicious code or pernicious code.
  • 13. The apparatus of claim 12, wherein the logic, when executed by the processor, performs the analysis by (a) responsive at least in part to user input, setting one or more behaviors to be monitored; (b) controlling the operations of the software conducted within the virtual machine; and (c) determining whether the one or more monitored behaviors are detected during analysis of the operations conducted by the software being processed by the virtual machine.
  • 14. The apparatus of claim 12, wherein the video includes a sequence of display images.
  • 15. The apparatus of claim 14 further comprising a display, and wherein the logic, when executed by the processor, to perform the analysis of operations conducted by the software to detect an occurrence of at least one anomalous behavior being part of the one or more monitored behaviors.
  • 16. The apparatus of claim 14, wherein the text information includes a textual log.
  • 17. The apparatus of claim 16, wherein the logic, when executed by the processor, to further provide, during playback of the video, reciprocal graphic interaction between the displayed video and the displayed textual log responsive to user input.
  • 18. The apparatus of claim 17, wherein the logic, executed by the processor, to conduct the first order of indexing of the video so as to permit access to a desired segment of the plurality of segments forming the video in accordance with either a particular playback time in the video or a particular analyzed behavior of the one or more monitored behaviors.
  • 19. A non-transitory storage medium implemented within an electronic device and including software that, upon execution by a processor deployed within the electronic device, performs operations for detecting potential malware or suspicious code or pernicious code within an application software being tested, comprising: performing an analysis of operations conducted by the application software being processed, the analysis includes at least monitoring one or more behaviors conducted by the application software during processing; generating a video illustrative of at least the one or more monitored behaviors conducted by the application software during processing of the application software, the video being generated in accordance with an indexing scheme including both a first order of indexing that permits user access to a selected segment of a plurality of segments forming the video and a second order of indexing that provides information related to when at least one monitored behavior of the one or more monitored behaviors occurs during an execution flow of the software; and generating text information associated with each of the one or more monitored behaviors being analyzed and when each of the one or more monitored behaviors occurs within the execution flow, the text information being displayed on the electronic device contemporaneously with the video illustrative of the one or more monitored behaviors, wherein the contemporaneous display of the video and the text information enables visual monitoring of a segment of the video during an occurrence of at least one anomalous behavior that is part of the one or more monitored behaviors and indicates the software potentially includes malware or suspicious code or pernicious code, the contemporaneous display of the segment of the video and the text information associated with the at least one anomalous behavior provides information capable of being used in hardening the software to reduce susceptibility of the software to at least malware.
  • 20. The non-transitory storage medium of claim 19, wherein the video corresponds to a plurality of display images that would have been displayed as if the application software was executed on a cellular telephone under control of an operating system deployed within the cellular telephone.
  • 21. The non-transitory storage medium of claim 19, wherein the generating of the video comprises capturing display output of the software during processing of the software within a virtual machine, the display output including the plurality of segments forming the video.
  • 22. The non-transitory storage medium of claim 19, wherein the performing of the analysis of the operations conducted by the application software further comprises determining whether any behavior of the one or more behaviors is an anomalous behavior indicating that the application software includes malware or suspicious code or pernicious code.
  • 23. The non-transitory storage medium of claim 19, wherein the first order of indexing permits user access in accordance with playback time in the video.
  • 24. The non-transitory storage medium of claim 19, wherein the first order of indexing permits user access in accordance with a particular monitored behavior of the one or more monitored behaviors.
  • 25. The non-transitory storage medium of claim 19, wherein the performing of the analysis of operations conducted by the application software comprises: generating a display screen for selecting and setting the one or more monitored behaviors; performing operations of the application software within a virtual machine; and monitoring the one or more monitored behaviors to determine whether at least one of the one or more behaviors occurs during analysis of the operations of the application software performed within the virtual machine.
  • 26. The non-transitory storage medium of claim 25, wherein the generating of the display screen comprises presenting a user interface to enable a user to customize a grouping or sequencing of behaviors of the one or more behaviors to be monitored.
  • 27. The method of claim 1, wherein the contemporaneous display of the segment of the video and the text information associated with the at least one anomalous behavior provides information for use in hardening the software to reduce susceptibility of the software to at least malware.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/949,770, filed Nov. 23, 2015, now U.S. Pat. No. 10,019,338 issued Jul. 10, 2018, which is a continuation of U.S. patent application Ser. No. 13/775,174, filed Feb. 23, 2013, now U.S. Pat. No. 9,195,829 issued Nov. 24, 2015, the entire contents of which are incorporated herein by reference.

US Referenced Citations (723)
Number Name Date Kind
4292580 Ott et al. Sep 1981 A
5175732 Hendel et al. Dec 1992 A
5319776 Hile et al. Jun 1994 A
5440723 Arnold et al. Aug 1995 A
5490249 Miller Feb 1996 A
5657473 Killean et al. Aug 1997 A
5802277 Cowlard Sep 1998 A
5842002 Schnurer et al. Nov 1998 A
5960170 Chen et al. Sep 1999 A
5978917 Chi Nov 1999 A
5983348 Ji Nov 1999 A
6088803 Tso et al. Jul 2000 A
6092194 Touboul Jul 2000 A
6094677 Capek et al. Jul 2000 A
6108799 Boulay et al. Aug 2000 A
6154844 Touboul et al. Nov 2000 A
6269330 Cidon et al. Jul 2001 B1
6272641 Ji Aug 2001 B1
6279113 Vaidya Aug 2001 B1
6298445 Shostack et al. Oct 2001 B1
6357008 Nachenberg Mar 2002 B1
6424627 Sorhaug et al. Jul 2002 B1
6442696 Wray et al. Aug 2002 B1
6484315 Ziese Nov 2002 B1
6487666 Shanklin et al. Nov 2002 B1
6493756 O'Brien et al. Dec 2002 B1
6550012 Villa et al. Apr 2003 B1
6775657 Baker Aug 2004 B1
6831893 Ben Nun et al. Dec 2004 B1
6832367 Choi et al. Dec 2004 B1
6895550 Kanchirayappa et al. May 2005 B2
6898632 Gordy et al. May 2005 B2
6907396 Muttik et al. Jun 2005 B1
6941348 Petry et al. Sep 2005 B2
6971097 Wallman Nov 2005 B1
6981279 Arnold et al. Dec 2005 B1
7007107 Ivchenko et al. Feb 2006 B1
7028179 Anderson et al. Apr 2006 B2
7043757 Hoefelmeyer et al. May 2006 B2
7058822 Edery et al. Jun 2006 B2
7069316 Gryaznov Jun 2006 B1
7080407 Zhao et al. Jul 2006 B1
7080408 Pak et al. Jul 2006 B1
7093002 Wolff et al. Aug 2006 B2
7093239 van der Made Aug 2006 B1
7096498 Judge Aug 2006 B2
7100201 Izatt Aug 2006 B2
7107617 Hursey et al. Sep 2006 B2
7159149 Spiegel et al. Jan 2007 B2
7213260 Judge May 2007 B2
7231667 Jordan Jun 2007 B2
7240364 Branscomb et al. Jul 2007 B1
7240368 Roesch et al. Jul 2007 B1
7243371 Kasper et al. Jul 2007 B1
7249175 Donaldson Jul 2007 B1
7287278 Liang Oct 2007 B2
7308716 Danford et al. Dec 2007 B2
7328453 Merkle, Jr. et al. Feb 2008 B2
7346486 Ivancic et al. Mar 2008 B2
7356736 Natvig Apr 2008 B2
7386888 Liang et al. Jun 2008 B2
7392542 Bucher Jun 2008 B2
7418729 Szor Aug 2008 B2
7428300 Drew et al. Sep 2008 B1
7441272 Durham et al. Oct 2008 B2
7448084 Apap et al. Nov 2008 B1
7458098 Judge et al. Nov 2008 B2
7464404 Carpenter et al. Dec 2008 B2
7464407 Nakae et al. Dec 2008 B2
7467408 O'Toole, Jr. Dec 2008 B1
7478428 Thomlinson Jan 2009 B1
7480773 Reed Jan 2009 B1
7487543 Arnold et al. Feb 2009 B2
7496960 Chen et al. Feb 2009 B1
7496961 Zimmer et al. Feb 2009 B2
7519990 Xie Apr 2009 B1
7523493 Liang et al. Apr 2009 B2
7530104 Thrower et al. May 2009 B1
7540025 Tzadikario May 2009 B2
7546638 Anderson et al. Jun 2009 B2
7565550 Liang et al. Jul 2009 B2
7568233 Szor et al. Jul 2009 B1
7584455 Ball Sep 2009 B2
7603715 Costa et al. Oct 2009 B2
7607171 Marsden et al. Oct 2009 B1
7639714 Stolfo et al. Dec 2009 B2
7644441 Schmid et al. Jan 2010 B2
7657419 van der Made Feb 2010 B2
7676841 Sobchuk et al. Mar 2010 B2
7698548 Shelest et al. Apr 2010 B2
7707633 Danford et al. Apr 2010 B2
7712136 Sprosts et al. May 2010 B2
7730011 Deninger et al. Jun 2010 B1
7739740 Nachenberg et al. Jun 2010 B1
7779463 Stolfo et al. Aug 2010 B2
7784097 Stolfo et al. Aug 2010 B1
7832008 Kraemer Nov 2010 B1
7836502 Zhao et al. Nov 2010 B1
7849506 Dansey et al. Dec 2010 B1
7854007 Sprosts et al. Dec 2010 B2
7869073 Oshima Jan 2011 B2
7877803 Enstone et al. Jan 2011 B2
7904959 Sidiroglou et al. Mar 2011 B2
7908660 Bahl Mar 2011 B2
7930738 Petersen Apr 2011 B1
7937387 Frazier et al. May 2011 B2
7937761 Bennett May 2011 B1
7949849 Lowe et al. May 2011 B2
7996556 Raghavan et al. Aug 2011 B2
7996836 McCorkendale et al. Aug 2011 B1
7996904 Chiueh et al. Aug 2011 B1
7996905 Arnold et al. Aug 2011 B2
8006305 Aziz Aug 2011 B2
8010667 Zhang et al. Aug 2011 B2
8020206 Hubbard et al. Sep 2011 B2
8028338 Schneider et al. Sep 2011 B1
8042184 Batenin Oct 2011 B1
8045094 Teragawa Oct 2011 B2
8045458 Alperovitch et al. Oct 2011 B2
8069484 McMillan et al. Nov 2011 B2
8087086 Lai et al. Dec 2011 B1
8171553 Aziz et al. May 2012 B2
8176049 Deninger et al. May 2012 B2
8176480 Spertus May 2012 B1
8201246 Wu et al. Jun 2012 B1
8204984 Aziz et al. Jun 2012 B1
8214905 Doukhvalov et al. Jul 2012 B1
8220055 Kennedy Jul 2012 B1
8225288 Miller et al. Jul 2012 B2
8225373 Kraemer Jul 2012 B2
8233882 Rogel Jul 2012 B2
8234640 Fitzgerald et al. Jul 2012 B1
8234709 Viljoen et al. Jul 2012 B2
8239944 Nachenberg et al. Aug 2012 B1
8260914 Ranjan Sep 2012 B1
8266091 Gubin et al. Sep 2012 B1
8286251 Eker et al. Oct 2012 B2
8291499 Aziz et al. Oct 2012 B2
8307435 Mann et al. Nov 2012 B1
8307443 Wang et al. Nov 2012 B2
8312545 Tuvell et al. Nov 2012 B2
8321936 Green et al. Nov 2012 B1
8321941 Tuvell et al. Nov 2012 B2
8332571 Edwards, Sr. Dec 2012 B1
8365286 Poston Jan 2013 B2
8365297 Parshin et al. Jan 2013 B1
8370938 Daswani et al. Feb 2013 B1
8370939 Zaitsev et al. Feb 2013 B2
8375444 Aziz et al. Feb 2013 B2
8381299 Stolfo et al. Feb 2013 B2
8402529 Green et al. Mar 2013 B1
8464340 Ahn et al. Jun 2013 B2
8479174 Chiriac Jul 2013 B2
8479276 Vaystikh et al. Jul 2013 B1
8479291 Bodke Jul 2013 B1
8510827 Leake et al. Aug 2013 B1
8510828 Guo et al. Aug 2013 B1
8510842 Amit et al. Aug 2013 B2
8516478 Edwards et al. Aug 2013 B1
8516590 Ranadive et al. Aug 2013 B1
8516593 Aziz Aug 2013 B2
8522348 Chen et al. Aug 2013 B2
8528086 Aziz Sep 2013 B1
8533824 Hutton et al. Sep 2013 B2
8539582 Aziz et al. Sep 2013 B1
8549638 Aziz Oct 2013 B2
8555391 Demir et al. Oct 2013 B1
8561177 Aziz et al. Oct 2013 B1
8566476 Shifter et al. Oct 2013 B2
8566946 Aziz et al. Oct 2013 B1
8584094 Dadhia et al. Nov 2013 B2
8584234 Sobel et al. Nov 2013 B1
8584239 Aziz et al. Nov 2013 B2
8595834 Xie et al. Nov 2013 B2
8627476 Satish et al. Jan 2014 B1
8635696 Aziz Jan 2014 B1
8682054 Xue et al. Mar 2014 B2
8682812 Ranjan Mar 2014 B1
8689333 Aziz Apr 2014 B2
8695096 Zhang Apr 2014 B1
8713631 Pavlyushchik Apr 2014 B1
8713681 Silberman et al. Apr 2014 B2
8726392 McCorkendale et al. May 2014 B1
8739280 Chess et al. May 2014 B2
8776229 Aziz Jul 2014 B1
8782792 Bodke Jul 2014 B1
8789172 Stolfo et al. Jul 2014 B2
8789178 Kejriwal et al. Jul 2014 B2
8793278 Frazier et al. Jul 2014 B2
8793787 Ismael et al. Jul 2014 B2
8805947 Kuzkin et al. Aug 2014 B1
8806647 Daswani et al. Aug 2014 B1
8826240 Lachwani et al. Sep 2014 B1
8832829 Manni et al. Sep 2014 B2
8850570 Ramzan Sep 2014 B1
8850571 Staniford et al. Sep 2014 B2
8881234 Narasimhan et al. Nov 2014 B2
8881271 Butler, II Nov 2014 B2
8881282 Aziz et al. Nov 2014 B1
8898788 Aziz et al. Nov 2014 B1
8935779 Manni et al. Jan 2015 B2
8949257 Shiffer et al. Feb 2015 B2
8984585 Martini Mar 2015 B2
8984638 Aziz et al. Mar 2015 B1
8990939 Staniford et al. Mar 2015 B2
8990944 Singh et al. Mar 2015 B1
8997219 Staniford et al. Mar 2015 B2
9009822 Ismael et al. Apr 2015 B1
9009823 Ismael et al. Apr 2015 B1
9027135 Aziz May 2015 B1
9071638 Aziz et al. Jun 2015 B1
9104867 Thioux et al. Aug 2015 B1
9106630 Frazier et al. Aug 2015 B2
9106694 Aziz et al. Aug 2015 B2
9118715 Staniford et al. Aug 2015 B2
9152541 Kuo Oct 2015 B1
9159035 Ismael et al. Oct 2015 B1
9171160 Vincent et al. Oct 2015 B2
9176843 Ismael et al. Nov 2015 B1
9189627 Islam Nov 2015 B1
9195829 Goradia et al. Nov 2015 B1
9197664 Aziz et al. Nov 2015 B1
9223972 Vincent et al. Dec 2015 B1
9225740 Ismael et al. Dec 2015 B1
9241010 Bennett et al. Jan 2016 B1
9251343 Vincent et al. Feb 2016 B1
9262635 Paithane et al. Feb 2016 B2
9268936 Butler Feb 2016 B2
9275229 LeMasters Mar 2016 B2
9282109 Aziz et al. Mar 2016 B1
9292686 Ismael et al. Mar 2016 B2
9294501 Mesdaq et al. Mar 2016 B2
9300686 Pidathala et al. Mar 2016 B2
9306960 Aziz Apr 2016 B1
9306974 Aziz et al. Apr 2016 B1
9311479 Manni et al. Apr 2016 B1
9355247 Thioux et al. May 2016 B1
9356944 Aziz May 2016 B1
9363280 Rivlin et al. Jun 2016 B1
9367681 Ismael et al. Jun 2016 B1
9398028 Karandikar et al. Jul 2016 B1
9409090 McKenzie et al. Aug 2016 B1
9413781 Cunningham et al. Aug 2016 B2
9426071 Caldejon et al. Aug 2016 B1
9430646 Mushtaq et al. Aug 2016 B1
9432389 Khalid et al. Aug 2016 B1
9438613 Paithane et al. Sep 2016 B1
9438622 Staniford et al. Sep 2016 B1
9438623 Thioux et al. Sep 2016 B1
9459901 Jung et al. Oct 2016 B2
9467460 Otvagin et al. Oct 2016 B1
9483644 Paithane et al. Nov 2016 B1
9495180 Ismael Nov 2016 B2
9497213 Thompson et al. Nov 2016 B2
9507935 Ismael et al. Nov 2016 B2
9516057 Aziz Dec 2016 B2
9519782 Aziz et al. Dec 2016 B2
9536091 Paithane et al. Jan 2017 B2
9537972 Edwards et al. Jan 2017 B1
9560059 Islam Jan 2017 B1
9565202 Kindlund et al. Feb 2017 B1
9591015 Amin et al. Mar 2017 B1
9591020 Aziz Mar 2017 B1
9594904 Jain et al. Mar 2017 B1
9594905 Ismael et al. Mar 2017 B1
9594912 Thioux et al. Mar 2017 B1
9609007 Rivlin et al. Mar 2017 B1
9626509 Khalid et al. Apr 2017 B1
9628498 Aziz et al. Apr 2017 B1
9628507 Haq et al. Apr 2017 B2
9633134 Ross Apr 2017 B2
9635039 Islam et al. Apr 2017 B1
9641546 Manni et al. May 2017 B1
9654485 Neumann May 2017 B1
9661009 Karandikar et al. May 2017 B1
9661018 Aziz May 2017 B1
9674298 Edwards et al. Jun 2017 B1
9680862 Ismael et al. Jun 2017 B2
9690606 Ha et al. Jun 2017 B1
9690933 Singh et al. Jun 2017 B1
9690935 Shiffer et al. Jun 2017 B2
9690936 Malik et al. Jun 2017 B1
9736179 Ismael Aug 2017 B2
9740857 Ismael et al. Aug 2017 B2
9747446 Pidathala et al. Aug 2017 B1
9756074 Aziz et al. Sep 2017 B2
9773112 Rathor et al. Sep 2017 B1
9781144 Otvagin et al. Oct 2017 B1
9787700 Amin et al. Oct 2017 B1
9787706 Otvagin et al. Oct 2017 B1
9792196 Ismael et al. Oct 2017 B1
9824209 Ismael et al. Nov 2017 B1
9824211 Wilson Nov 2017 B2
9824216 Khalid et al. Nov 2017 B1
9825976 Gomez et al. Nov 2017 B1
9825989 Mehra et al. Nov 2017 B1
9838408 Karandikar et al. Dec 2017 B1
9838411 Aziz Dec 2017 B1
9838416 Aziz Dec 2017 B1
9838417 Khalid et al. Dec 2017 B1
9846776 Paithane et al. Dec 2017 B1
9876701 Caldejon et al. Jan 2018 B1
9888016 Amin et al. Feb 2018 B1
9888019 Pidathala et al. Feb 2018 B1
9910988 Vincent et al. Mar 2018 B1
9912644 Cunningham Mar 2018 B2
9912681 Ismael et al. Mar 2018 B1
9912684 Aziz et al. Mar 2018 B1
9912691 Mesdaq et al. Mar 2018 B2
9912698 Thioux et al. Mar 2018 B1
9916440 Paithane et al. Mar 2018 B1
9921978 Chan et al. Mar 2018 B1
9934376 Ismael Apr 2018 B1
9934381 Kindlund et al. Apr 2018 B1
9946568 Ismael et al. Apr 2018 B1
9954890 Staniford et al. Apr 2018 B1
9973531 Thioux May 2018 B1
10002252 Ismael et al. Jun 2018 B2
10019338 Goradia et al. Jul 2018 B1
10019573 Silberman et al. Jul 2018 B2
10025691 Ismael et al. Jul 2018 B1
10025927 Khalid et al. Jul 2018 B1
10027689 Rathor et al. Jul 2018 B1
10027690 Aziz et al. Jul 2018 B2
10027696 Rivlin et al. Jul 2018 B1
10033747 Paithane et al. Jul 2018 B1
10033748 Cunningham et al. Jul 2018 B1
10033753 Islam et al. Jul 2018 B1
10033759 Kabra et al. Jul 2018 B1
10050998 Singh Aug 2018 B1
10068091 Aziz et al. Sep 2018 B1
10075455 Zafar et al. Sep 2018 B2
10083302 Paithane et al. Sep 2018 B1
10084813 Eyada Sep 2018 B2
10089461 Ha et al. Oct 2018 B1
10097573 Aziz Oct 2018 B1
10104102 Neumann Oct 2018 B1
10108446 Steinberg et al. Oct 2018 B1
10121000 Rivlin et al. Nov 2018 B1
10122746 Manni et al. Nov 2018 B1
10133863 Bu et al. Nov 2018 B2
10133866 Kumar et al. Nov 2018 B1
10146810 Shiffer et al. Dec 2018 B2
10148693 Singh et al. Dec 2018 B2
10165000 Aziz et al. Dec 2018 B1
10169585 Pilipenko et al. Jan 2019 B1
10176321 Abbasi et al. Jan 2019 B2
10181029 Ismael et al. Jan 2019 B1
10191861 Steinberg et al. Jan 2019 B1
10192052 Singh et al. Jan 2019 B1
10198574 Thioux et al. Feb 2019 B1
10200384 Mushtaq et al. Feb 2019 B1
10210329 Malik et al. Feb 2019 B1
10216927 Steinberg Feb 2019 B1
10218740 Mesdaq et al. Feb 2019 B1
10242185 Goradia Mar 2019 B1
20010005889 Albrecht Jun 2001 A1
20010047326 Broadbent et al. Nov 2001 A1
20020018903 Kokubo et al. Feb 2002 A1
20020038430 Edwards et al. Mar 2002 A1
20020091819 Melchione et al. Jul 2002 A1
20020095607 Lin-Hendel Jul 2002 A1
20020116627 Tarbotton et al. Aug 2002 A1
20020144156 Copeland Oct 2002 A1
20020162015 Tang Oct 2002 A1
20020166063 Lachman et al. Nov 2002 A1
20020169952 DiSanto et al. Nov 2002 A1
20020184528 Shevenell et al. Dec 2002 A1
20020188887 Largman et al. Dec 2002 A1
20020194490 Halperin et al. Dec 2002 A1
20030021728 Sharpe et al. Jan 2003 A1
20030074578 Ford et al. Apr 2003 A1
20030084318 Schertz May 2003 A1
20030101381 Mateev et al. May 2003 A1
20030115483 Liang Jun 2003 A1
20030188190 Aaron et al. Oct 2003 A1
20030191957 Hypponen et al. Oct 2003 A1
20030200460 Morota et al. Oct 2003 A1
20030212902 van der Made Nov 2003 A1
20030229801 Kouznetsov et al. Dec 2003 A1
20030237000 Denton et al. Dec 2003 A1
20040003323 Bennett et al. Jan 2004 A1
20040006473 Mills et al. Jan 2004 A1
20040015712 Szor Jan 2004 A1
20040019832 Arnold et al. Jan 2004 A1
20040047356 Bauer Mar 2004 A1
20040083408 Spiegel et al. Apr 2004 A1
20040088581 Brawn et al. May 2004 A1
20040093513 Cantrell et al. May 2004 A1
20040111531 Staniford et al. Jun 2004 A1
20040117478 Triulzi et al. Jun 2004 A1
20040117624 Brandt et al. Jun 2004 A1
20040128355 Chao et al. Jul 2004 A1
20040165588 Pandya Aug 2004 A1
20040236963 Danford et al. Nov 2004 A1
20040243349 Greifeneder et al. Dec 2004 A1
20040249911 Alkhatib et al. Dec 2004 A1
20040255161 Cavanaugh Dec 2004 A1
20040268147 Wiederin et al. Dec 2004 A1
20050005159 Oliphant Jan 2005 A1
20050021740 Bar et al. Jan 2005 A1
20050033960 Vialen et al. Feb 2005 A1
20050033989 Poletto et al. Feb 2005 A1
20050050148 Mohammadioun et al. Mar 2005 A1
20050086523 Zimmer et al. Apr 2005 A1
20050091513 Mitomo et al. Apr 2005 A1
20050091533 Omote et al. Apr 2005 A1
20050091652 Ross et al. Apr 2005 A1
20050108562 Khazan et al. May 2005 A1
20050114663 Cornell et al. May 2005 A1
20050125195 Brendel Jun 2005 A1
20050149726 Joshi et al. Jul 2005 A1
20050157662 Bingham et al. Jul 2005 A1
20050183143 Anderholm et al. Aug 2005 A1
20050201297 Peikari Sep 2005 A1
20050210533 Copeland et al. Sep 2005 A1
20050238005 Chen et al. Oct 2005 A1
20050240781 Gassoway Oct 2005 A1
20050254775 Hamilton Nov 2005 A1
20050262562 Gassoway Nov 2005 A1
20050265331 Stolfo Dec 2005 A1
20050283839 Cowburn Dec 2005 A1
20060010495 Cohen et al. Jan 2006 A1
20060015416 Hoffman et al. Jan 2006 A1
20060015715 Anderson Jan 2006 A1
20060015747 Van de Ven Jan 2006 A1
20060021029 Brickell et al. Jan 2006 A1
20060021054 Costa et al. Jan 2006 A1
20060031476 Mathes et al. Feb 2006 A1
20060047665 Neil Mar 2006 A1
20060070130 Costea et al. Mar 2006 A1
20060075496 Carpenter et al. Apr 2006 A1
20060095968 Portolani et al. May 2006 A1
20060101516 Sudaharan et al. May 2006 A1
20060101517 Banzhof et al. May 2006 A1
20060117385 Mester et al. Jun 2006 A1
20060123477 Raghavan et al. Jun 2006 A1
20060143709 Brooks et al. Jun 2006 A1
20060150249 Gassen et al. Jul 2006 A1
20060161983 Cothrell et al. Jul 2006 A1
20060161987 Levy-Yurista Jul 2006 A1
20060161989 Reshef et al. Jul 2006 A1
20060164199 Gilde et al. Jul 2006 A1
20060173992 Weber et al. Aug 2006 A1
20060179147 Tran et al. Aug 2006 A1
20060184632 Marino et al. Aug 2006 A1
20060191010 Benjamin Aug 2006 A1
20060221956 Narayan et al. Oct 2006 A1
20060236393 Kramer et al. Oct 2006 A1
20060242709 Seinfeld et al. Oct 2006 A1
20060248519 Jaeger et al. Nov 2006 A1
20060248582 Panjwani et al. Nov 2006 A1
20060251104 Koga Nov 2006 A1
20060288417 Bookbinder et al. Dec 2006 A1
20070006288 Mayfield et al. Jan 2007 A1
20070006313 Porras et al. Jan 2007 A1
20070011174 Takaragi et al. Jan 2007 A1
20070016951 Piccard et al. Jan 2007 A1
20070019286 Kikuchi Jan 2007 A1
20070033645 Jones Feb 2007 A1
20070038943 FitzGerald et al. Feb 2007 A1
20070064689 Shin et al. Mar 2007 A1
20070074169 Chess et al. Mar 2007 A1
20070094730 Bhikkaji et al. Apr 2007 A1
20070101435 Konanka et al. May 2007 A1
20070128855 Cho et al. Jun 2007 A1
20070142030 Sinha et al. Jun 2007 A1
20070143827 Nicodemus et al. Jun 2007 A1
20070156895 Vuong Jul 2007 A1
20070157180 Tillmann et al. Jul 2007 A1
20070157306 Elrod et al. Jul 2007 A1
20070168988 Eisner et al. Jul 2007 A1
20070171824 Ruello et al. Jul 2007 A1
20070174915 Gribble et al. Jul 2007 A1
20070192500 Lum Aug 2007 A1
20070192858 Lum Aug 2007 A1
20070192863 Kapoor et al. Aug 2007 A1
20070198275 Malden et al. Aug 2007 A1
20070208822 Wang et al. Sep 2007 A1
20070220607 Sprosts et al. Sep 2007 A1
20070240218 Tuvell et al. Oct 2007 A1
20070240219 Tuvell et al. Oct 2007 A1
20070240220 Tuvell et al. Oct 2007 A1
20070240222 Tuvell et al. Oct 2007 A1
20070250930 Aziz et al. Oct 2007 A1
20070256132 Oliphant Nov 2007 A2
20070271446 Nakamura Nov 2007 A1
20070285578 Hirayama et al. Dec 2007 A1
20080005782 Aziz Jan 2008 A1
20080018122 Zierler et al. Jan 2008 A1
20080028463 Dagon et al. Jan 2008 A1
20080040710 Chiriac Feb 2008 A1
20080046781 Childs et al. Feb 2008 A1
20080066179 Liu Mar 2008 A1
20080072326 Danford et al. Mar 2008 A1
20080077793 Tan et al. Mar 2008 A1
20080080518 Hoeflin et al. Apr 2008 A1
20080086720 Lekel Apr 2008 A1
20080098476 Syversen Apr 2008 A1
20080120722 Sima et al. May 2008 A1
20080134178 Fitzgerald et al. Jun 2008 A1
20080134334 Kim et al. Jun 2008 A1
20080141376 Clausen et al. Jun 2008 A1
20080184367 McMillan et al. Jul 2008 A1
20080184373 Traut et al. Jul 2008 A1
20080189787 Arnold et al. Aug 2008 A1
20080201778 Guo et al. Aug 2008 A1
20080209557 Herley et al. Aug 2008 A1
20080215742 Goldszmidt et al. Sep 2008 A1
20080222729 Chen et al. Sep 2008 A1
20080263665 Ma et al. Oct 2008 A1
20080295172 Bohacek Nov 2008 A1
20080301810 Lehane et al. Dec 2008 A1
20080307524 Singh et al. Dec 2008 A1
20080313738 Enderby Dec 2008 A1
20080320594 Jiang Dec 2008 A1
20080320595 van der Made Dec 2008 A1
20090003317 Kasralikar et al. Jan 2009 A1
20090007100 Field et al. Jan 2009 A1
20090013408 Schipka Jan 2009 A1
20090031423 Liu et al. Jan 2009 A1
20090036111 Danford et al. Feb 2009 A1
20090037835 Goldman Feb 2009 A1
20090044024 Oberheide et al. Feb 2009 A1
20090044274 Budko et al. Feb 2009 A1
20090064000 Garbow Mar 2009 A1
20090064332 Porras et al. Mar 2009 A1
20090077666 Chen et al. Mar 2009 A1
20090083369 Marmor Mar 2009 A1
20090083855 Apap et al. Mar 2009 A1
20090089879 Wang et al. Apr 2009 A1
20090094697 Provos et al. Apr 2009 A1
20090113425 Ports et al. Apr 2009 A1
20090125976 Wassermann et al. May 2009 A1
20090126015 Monastyrsky et al. May 2009 A1
20090126016 Sobko et al. May 2009 A1
20090133125 Choi et al. May 2009 A1
20090144823 Lamastra et al. Jun 2009 A1
20090158430 Borders Jun 2009 A1
20090172815 Gu et al. Jul 2009 A1
20090187992 Poston Jul 2009 A1
20090193293 Stolfo et al. Jul 2009 A1
20090198651 Shiffer et al. Aug 2009 A1
20090198670 Shiffer et al. Aug 2009 A1
20090198689 Frazier et al. Aug 2009 A1
20090199274 Frazier et al. Aug 2009 A1
20090199296 Xie et al. Aug 2009 A1
20090228233 Anderson et al. Sep 2009 A1
20090241187 Troyansky Sep 2009 A1
20090241190 Todd et al. Sep 2009 A1
20090265692 Godefroid et al. Oct 2009 A1
20090271867 Zhang Oct 2009 A1
20090300415 Zhang et al. Dec 2009 A1
20090300761 Park et al. Dec 2009 A1
20090328185 Berg et al. Dec 2009 A1
20090328221 Blumfield et al. Dec 2009 A1
20100005146 Drako et al. Jan 2010 A1
20100011205 McKenna Jan 2010 A1
20100017546 Poo et al. Jan 2010 A1
20100030996 Butler, II Feb 2010 A1
20100031353 Thomas et al. Feb 2010 A1
20100037314 Perdisci et al. Feb 2010 A1
20100043073 Kuwamura Feb 2010 A1
20100054278 Stolfo et al. Mar 2010 A1
20100058474 Hicks Mar 2010 A1
20100064044 Nonoyama Mar 2010 A1
20100077481 Polyakov et al. Mar 2010 A1
20100083376 Pereira et al. Apr 2010 A1
20100115621 Staniford et al. May 2010 A1
20100132038 Zaitsev May 2010 A1
20100154056 Smith et al. Jun 2010 A1
20100180344 Malyshev et al. Jul 2010 A1
20100192223 Ismael et al. Jul 2010 A1
20100220863 Dupaquis et al. Sep 2010 A1
20100235831 Dittmer Sep 2010 A1
20100251104 Massand Sep 2010 A1
20100281102 Chinta et al. Nov 2010 A1
20100281541 Stolfo et al. Nov 2010 A1
20100281542 Stolfo et al. Nov 2010 A1
20100287260 Peterson et al. Nov 2010 A1
20100299754 Amit et al. Nov 2010 A1
20100306173 Frank Dec 2010 A1
20110004737 Greenebaum Jan 2011 A1
20110025504 Lyon et al. Feb 2011 A1
20110041179 Ståhlberg Feb 2011 A1
20110047594 Mahaffey et al. Feb 2011 A1
20110047620 Mahaffey et al. Feb 2011 A1
20110055907 Narasimhan et al. Mar 2011 A1
20110078794 Manni et al. Mar 2011 A1
20110093951 Aziz Apr 2011 A1
20110099620 Stavrou et al. Apr 2011 A1
20110099633 Aziz Apr 2011 A1
20110099635 Silberman et al. Apr 2011 A1
20110113231 Kaminsky May 2011 A1
20110145918 Jung et al. Jun 2011 A1
20110145920 Mahaffey et al. Jun 2011 A1
20110145934 Abramovici et al. Jun 2011 A1
20110167493 Song et al. Jul 2011 A1
20110167494 Bowen et al. Jul 2011 A1
20110173213 Frazier et al. Jul 2011 A1
20110173460 Ito et al. Jul 2011 A1
20110219449 St. Neitzel et al. Sep 2011 A1
20110219450 McDougal et al. Sep 2011 A1
20110225624 Sawhney et al. Sep 2011 A1
20110225655 Niemela et al. Sep 2011 A1
20110247072 Staniford et al. Oct 2011 A1
20110265182 Peinado et al. Oct 2011 A1
20110289582 Kejriwal et al. Nov 2011 A1
20110302587 Nishikawa et al. Dec 2011 A1
20110307954 Melnik et al. Dec 2011 A1
20110307955 Kaplan et al. Dec 2011 A1
20110307956 Yermakov et al. Dec 2011 A1
20110314546 Aziz et al. Dec 2011 A1
20120023593 Puder et al. Jan 2012 A1
20120054869 Yen et al. Mar 2012 A1
20120066698 Yanoo Mar 2012 A1
20120079596 Thomas et al. Mar 2012 A1
20120084859 Radinsky et al. Apr 2012 A1
20120096553 Srivastava et al. Apr 2012 A1
20120110667 Zubrilin et al. May 2012 A1
20120117652 Manni et al. May 2012 A1
20120121154 Xue et al. May 2012 A1
20120124426 Maybee et al. May 2012 A1
20120174186 Aziz et al. Jul 2012 A1
20120174196 Bhogavilli et al. Jul 2012 A1
20120174218 McCoy et al. Jul 2012 A1
20120198279 Schroeder Aug 2012 A1
20120210423 Friedrichs et al. Aug 2012 A1
20120222121 Staniford et al. Aug 2012 A1
20120255015 Sahita et al. Oct 2012 A1
20120255017 Sallam Oct 2012 A1
20120260342 Dube et al. Oct 2012 A1
20120266244 Green et al. Oct 2012 A1
20120278886 Luna Nov 2012 A1
20120297489 Dequevy Nov 2012 A1
20120330801 McDougal et al. Dec 2012 A1
20120331553 Aziz et al. Dec 2012 A1
20130014259 Gribble et al. Jan 2013 A1
20130036472 Aziz Feb 2013 A1
20130047257 Aziz Feb 2013 A1
20130074185 McDougal et al. Mar 2013 A1
20130080634 Grelewicz et al. Mar 2013 A1
20130086684 Mohler Apr 2013 A1
20130097699 Balupari et al. Apr 2013 A1
20130097706 Titonis et al. Apr 2013 A1
20130111587 Goel et al. May 2013 A1
20130117852 Stute May 2013 A1
20130117855 Kim et al. May 2013 A1
20130139264 Brinkley et al. May 2013 A1
20130160125 Likhachev et al. Jun 2013 A1
20130160127 Jeong et al. Jun 2013 A1
20130160130 Mendelev et al. Jun 2013 A1
20130160131 Madou et al. Jun 2013 A1
20130167236 Sick Jun 2013 A1
20130174214 Duncan Jul 2013 A1
20130185789 Hagiwara et al. Jul 2013 A1
20130185795 Winn et al. Jul 2013 A1
20130185798 Saunders et al. Jul 2013 A1
20130191915 Antonakakis et al. Jul 2013 A1
20130196649 Paddon et al. Aug 2013 A1
20130227691 Aziz et al. Aug 2013 A1
20130246370 Bartram et al. Sep 2013 A1
20130247186 LeMasters Sep 2013 A1
20130263260 Mahaffey et al. Oct 2013 A1
20130291109 Staniford et al. Oct 2013 A1
20130298243 Kumar et al. Nov 2013 A1
20130318038 Shiffer et al. Nov 2013 A1
20130318073 Shiffer et al. Nov 2013 A1
20130325791 Shiffer et al. Dec 2013 A1
20130325792 Shiffer et al. Dec 2013 A1
20130325871 Shiffer et al. Dec 2013 A1
20130325872 Shiffer et al. Dec 2013 A1
20140032875 Butler Jan 2014 A1
20140053260 Gupta et al. Feb 2014 A1
20140053261 Gupta et al. Feb 2014 A1
20140130158 Wang et al. May 2014 A1
20140137180 Lukacs et al. May 2014 A1
20140169762 Ryu Jun 2014 A1
20140179360 Jackson et al. Jun 2014 A1
20140181131 Ross Jun 2014 A1
20140189687 Jung et al. Jul 2014 A1
20140189866 Shiffer et al. Jul 2014 A1
20140189882 Jung et al. Jul 2014 A1
20140237600 Silberman et al. Aug 2014 A1
20140280245 Wilson Sep 2014 A1
20140283037 Sikorski et al. Sep 2014 A1
20140283063 Thompson et al. Sep 2014 A1
20140328204 Klotsche et al. Nov 2014 A1
20140337836 Ismael Nov 2014 A1
20140344926 Cunningham et al. Nov 2014 A1
20140351935 Shao et al. Nov 2014 A1
20140380473 Bu et al. Dec 2014 A1
20140380474 Paithane et al. Dec 2014 A1
20150007312 Pidathala et al. Jan 2015 A1
20150012647 Grelewicz Jan 2015 A1
20150096022 Vincent et al. Apr 2015 A1
20150096023 Mesdaq et al. Apr 2015 A1
20150096024 Haq et al. Apr 2015 A1
20150096025 Ismael Apr 2015 A1
20150180886 Staniford et al. Jun 2015 A1
20150186645 Aziz et al. Jul 2015 A1
20150199513 Ismael et al. Jul 2015 A1
20150199531 Ismael et al. Jul 2015 A1
20150199532 Ismael et al. Jul 2015 A1
20150220735 Paithane et al. Aug 2015 A1
20150363300 Luan Dec 2015 A1
20150372980 Eyada Dec 2015 A1
20160004869 Ismael et al. Jan 2016 A1
20160006756 Ismael et al. Jan 2016 A1
20160044000 Cunningham Feb 2016 A1
20160127393 Aziz et al. May 2016 A1
20160191547 Zafar et al. Jun 2016 A1
20160191550 Ismael et al. Jun 2016 A1
20160249106 Lachwani et al. Aug 2016 A1
20160261612 Mesdaq et al. Sep 2016 A1
20160285914 Singh et al. Sep 2016 A1
20160301703 Aziz Oct 2016 A1
20160335110 Paithane et al. Nov 2016 A1
20170083703 Abbasi et al. Mar 2017 A1
20180013770 Ismael Jan 2018 A1
20180048660 Paithane et al. Feb 2018 A1
20180121316 Ismael et al. May 2018 A1
20180288077 Siddiqui et al. Oct 2018 A1
Foreign Referenced Citations (11)
Number Date Country
2439806 Jan 2008 GB
2490431 Oct 2012 GB
0206928 Jan 2002 WO
0223805 Mar 2002 WO
2007-117636 Oct 2007 WO
2008041950 Apr 2008 WO
2011084431 Jul 2011 WO
2011112348 Sep 2011 WO
2012075336 Jun 2012 WO
2012145066 Oct 2012 WO
2013067505 May 2013 WO
Non-Patent Literature Citations (79)
Entry
U.S. Appl. No. 14/949,770, filed Nov. 23, 2015 Final Office Action dated Jul. 19, 2017.
U.S. Appl. No. 14/949,770, filed Nov. 23, 2015 Non-Final Office Action dated Oct. 17, 2016.
U.S. Appl. No. 14/949,770, filed Nov. 23, 2015 Notice of Allowance dated Mar. 1, 2018.
Venezia, Paul , “NetDetector Captures Intrusions”, InfoWorld Issue 27, (“Venezia”), (Jul. 14, 2003).
Whyte, et al., “DNS-Based Detection of Scanning Worms in an Enterprise Network”, Proceedings of the 12th Annual Network and Distributed System Security Symposium, (Feb. 2005), 15 pages.
Williamson, Matthew M., “Throttling Viruses: Restricting Propagation to Defeat Malicious Mobile Code”, ACSAC Conference, Las Vegas, NV, USA, (Dec. 2002), pp. 1-9.
“Network Security: NetDetector—Network Intrusion Forensic System (NIFS) Whitepaper”, (“NetDetector Whitepaper”), (2003).
“Packet”, Microsoft Computer Dictionary, Microsoft Press, (Mar. 2002), 1 page.
“When Virtual is Better Than Real”, IEEEXplore Digital Library, available at http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=990073, (Dec. 7, 2013).
AltaVista Advanced Search Results. “attack vector identifier”. Http://www.altavista.com/web/results?ltag=ody&pg=ag&aqmode=aqa=Event+Orchestrator . . . , (Accessed on Sep. 15, 2009).
AltaVista Advanced Search Results. “Event Orchestrator”. Http://www.altavista.com/web/results?ltag=ody&pg=ag&aqmode=aqa=Event+Orchestrator . . . , (Accessed on Sep. 3, 2009).
Aura, Tuomas, “Scanning electronic documents for personally identifiable information”, Proceedings of the 5th ACM workshop on Privacy in electronic society. ACM, 2006.
Baecher, “The Nepenthes Platform: An Efficient Approach to collect Malware”, Springer-verlag Berlin Heidelberg, (2006), pp. 165-184.
Bayer, et al., “Dynamic Analysis of Malicious Code”, J Comput Virol, Springer-Verlag, France., (2006), pp. 67-77.
Boubalos, Chris , “extracting syslog data out of raw pcap dumps, seclists.org, Honeypots mailing list archives”, available at http://seclists.org/honeypots/2003/g2/319 (“Boubalos”), (Jun. 5, 2003).
Chaudet, C. , et al., “Optimal Positioning of Active and Passive Monitoring Devices”, International Conference on Emerging Networking Experiments and Technologies, Proceedings of the 2005 ACM Conference on Emerging Network Experiment and Technology, CoNEXT '05, Toulousse, France, (Oct. 2005), pp. 71-82.
Chen, P. M. and Noble, B. D., “When Virtual is Better Than Real, Department of Electrical Engineering and Computer Science”, University of Michigan (“Chen”) (2001).
Cisco, Configuring the Catalyst Switched Port Analyzer (SPAN) (“Cisco”), (1992).
Cohen, M.I. , “PyFlag—An advanced network forensic framework”, Digital investigation 5, Elsevier, (2008), pp. S112-S120.
Costa, M. , et al., “Vigilante: End-to-End Containment of Internet Worms”, SOSP '05, Association for Computing Machinery, Inc., Brighton U.K., (Oct. 23-26, 2005).
Crandall, J.R. , et al., “Minos:Control Data Attack Prevention Orthogonal to Memory Model”, 37th International Symposium on Microarchitecture, Portland, Oregon, (Dec. 2004).
Deutsch, P. , “Zlib compressed data format specification version 3.3” RFC 1950, (1996).
Distler, “Malware Analysis: An Introduction”, Sans Institute InfoSec Reading Room, SANS Institute, (2007).
Dunlap, George W., et al., “ReVirt: Enabling Intrusion Analysis through Virtual-Machine Logging and Replay”, Proceedings of the 5th Symposium on Operating Systems Design and Implementation, USENIX Association, (“Dunlap”), (Dec. 9, 2002).
Excerpt regarding First Printing Date for Merike Kaeo, Designing Network Security (“Kaeo”), (2005).
Filiol, Eric , et al., “Combinatorial Optimisation of Worm Propagation on an Unknown Network”, International Journal of Computer Science 2.2 (2007).
Gibler, Clint, et al. AndroidLeaks: automatically detecting potential privacy leaks in android applications on a large scale. Springer Berlin Heidelberg, 2012.
Goel, et al., Reconstructing System State for Intrusion Analysis, Apr. 2008 SIGOPS Operating Systems Review, vol. 42 Issue 3, pp. 21-28.
IEEE Xplore Digital Library Search Results for “detection of unknown computer worms”. Http://ieeexplore.ieee.org/searchresult.jsp?SortField=Score&SortOrder=desc&ResultC . . . , (Accessed on Aug. 28, 2009).
Kaeo, Merike , “Designing Network Security”, (“Kaeo”), (Nov. 2003).
Kim, H. , et al., “Autograph: Toward Automated, Distributed Worm Signature Detection”, Proceedings of the 13th Usenix Security Symposium (Security 2004), San Diego, (Aug. 2004), pp. 271-286.
King, Samuel T., et al., “Operating System Support for Virtual Machines”, (“King”) (2003).
Krasnyansky, Max , et al., Universal TUN/TAP driver, available at https://www.kernel.org/doc/Documentation/networking/tuntap.txt (2002) (“Krasnyansky”).
Kreibich, C. , et al., “Honeycomb-Creating Intrusion Detection Signatures Using Honeypots”, 2nd Workshop on Hot Topics in Networks (HotNets-11), Boston, USA, (2003).
Kristoff, J. , “Botnets, Detection and Mitigation: DNS-Based Techniques”, NU Security Day, (2005), 23 pages.
Liljenstam, Michael , et al., “Simulating Realistic Network Traffic for Worm Warning System Design and Testing”, Institute for Security Technology studies, Dartmouth College (“Liljenstam”), (Oct. 27, 2003).
Marchette, David J., “Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint”, (“Marchette”), (2001).
Margolis, P.E. , “Random House Webster's ‘Computer & Internet Dictionary 3rd Edition’”, ISBN 0375703519, (Dec. 1998).
Moore, D. , et al., “Internet Quarantine: Requirements for Containing Self-Propagating Code”, INFOCOM, vol. 3, (Mar. 30-Apr. 3, 2003), pp. 1901-1910.
Morales, Jose A., et al., “Analyzing and exploiting network behaviors of malware.”, Security and Privacy in Communication Networks. Springer Berlin Heidelberg, 2010. 20-34.
Natvig, Kurt , “SANDBOXII: Internet”, Virus Bulletin Conference, (“Natvig”), (Sep. 2002).
NetBIOS Working Group. Protocol Standard for a NetBIOS Service on a TCP/UDP transport: Concepts and Methods. STD 19, RFC 1001, Mar. 1987.
Newsome, J. , et al., “Dynamic Taint Analysis for Automatic Detection, Analysis, and Signature Generation of Exploits on Commodity Software”, In Proceedings of the 12th Annual Network and Distributed System Security, Symposium (NDSS '05), (Feb. 2005).
Newsome, J. , et al., “Polygraph: Automatically Generating Signatures for Polymorphic Worms”, In Proceedings of the IEEE Symposium on Security and Privacy, (May 2005).
Nojiri, D. , et al., “Cooperation Response Strategies for Large Scale Attack Mitigation”, DARPA Information Survivability Conference and Exposition, vol. 1, (Apr. 22-24, 2003), pp. 293-302.
Reiner Sailer, Enriquillo Valdez, Trent Jaeger, Ronald Perez, Leendert van Doorn, John Linwood Griffin, Stefan Berger, sHype: Secure Hypervisor Approach to Trusted Virtualized Systems (Feb. 2, 2005) (“Sailer”).
Silicon Defense, “Worm Containment in the Internal Network”, (Mar. 2003), pp. 1-25.
Singh, S. , et al., “Automated Worm Fingerprinting”, Proceedings of the ACM/USENIX Symposium on Operating System Design and Implementation, San Francisco, California, (Dec. 2004).
Spitzner, Lance, “Honeypots: Tracking Hackers”, (“Spitzner”), (Sep. 17, 2002).
The Sniffer's Guide to Raw Traffic, available at: yuba.stanford.edu/~casado/pcap/section1.html, (Jan. 6, 2014).
Thomas H. Ptacek, and Timothy N. Newsham , “Insertion, Evasion, and Denial of Service: Eluding Network Intrusion Detection”, Secure Networks, (“Ptacek”), (Jan. 1998).
U.S. Appl. No. 13/775,174, filed Feb. 23, 2013 Non-Final Office Action dated Feb. 20, 2015.
U.S. Appl. No. 13/775,174, filed Feb. 23, 2013 Notice of Allowance dated Aug. 11, 2015.
U.S. Appl. No. 14/949,770, filed Nov. 23, 2015 Advisory Action dated Oct. 5, 2017.
“Mining Specification of Malicious Behavior”—Jha et al, UCSB, Sep. 2007, https://www.cs.ucsb.edu/~chris/research/doc/esec07_mining.pdf.
Abdullah, et al., Visualizing Network Data for Intrusion Detection, 2005 IEEE Workshop on Information Assurance and Security, pp. 100-108.
Adetoye, Adedayo , et al., “Network Intrusion Detection & Response System”, (“Adetoye”), (Sep. 2003).
Apostolopoulos, George; Hassapis, Constantinos; “V-eM: A cluster of Virtual Machines for Robust, Detailed, and High-Performance Network Emulation”, 14th IEEE International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems, Sep. 11-14, 2006, pp. 117-126.
Cisco “Intrusion Prevention for the Cisco ASA 5500-x Series” Data Sheet (2012).
Didier Stevens, “Malicious PDF Documents Explained”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 9, No. 1, Jan. 1, 2011, pp. 80-82, XP011329453, ISSN: 1540-7993, DOI: 10.1109/MSP.2011.14.
FireEye Malware Analysis & Exchange Network, Malware Protection System, FireEye Inc., 2010.
FireEye Malware Analysis, Modern Malware Forensics, FireEye Inc., 2010.
FireEye v.6.0 Security Target, pp. 1-35, Version 1.1, FireEye Inc., May 2011.
Gregg Keizer: “Microsoft's HoneyMonkeys Show Patching Windows Works”, Aug. 8, 2005, XP055143386, Retrieved from the Internet: URL:http://www.informationweek.com/microsofts-honeymonkeys-show-patching-windows-works/d/d-id/1035069? [retrieved on Jun. 1, 2016].
Heng Yin et al, Panorama: Capturing System-Wide Information Flow for Malware Detection and Analysis, Research Showcase © CMU, Carnegie Mellon University, 2007.
Hiroshi Shinotsuka, Malware Authors Using New Techniques to Evade Automated Threat Analysis Systems, Oct. 26, 2012, http://www.symantec.com/connect/blogs/, pp. 1-4.
Idika et al., A-Survey-of-Malware-Detection-Techniques, Feb. 2, 2007, Department of Computer Science, Purdue University.
Isohara, Takamasa, Keisuke Takemori, and Ayumu Kubota. “Kernel-based behavior analysis for android malware detection.” Computational intelligence and Security (CIS), 2011 Seventh International Conference on. IEEE, 2011.
Kevin A Roundy et al: “Hybrid Analysis and Control of Malware”, Sep. 15, 2010, Recent Advances in Intrusion Detection, Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 317-338, XP019150454 ISBN:978-3-642-15511-6.
Khaled Salah et al: “Using Cloud Computing to Implement a Security Overlay Network”, Security & Privacy, IEEE, IEEE Service Center, Los Alamitos, CA, US, vol. 11, No. 1, Jan. 1, 2013 (Jan. 1, 2013).
Lastline Labs, The Threat of Evasive Malware, Feb. 25, 2013, Lastline Labs, pp. 1-8.
Li et al., A VMM-Based System Call Interposition Framework for Program Monitoring, Dec. 2010, IEEE 16th International Conference on Parallel and Distributed Systems, pp. 706-711.
Lindorfer, Martina, Clemens Kolbitsch, and Paolo Milani Comparetti. “Detecting environment-sensitive malware.” Recent Advances in Intrusion Detection. Springer Berlin Heidelberg, 2011.
Mori, Detecting Unknown Computer Viruses, 2004, Springer-Verlag Berlin Heidelberg.
Oberheide et al., CloudAV: N-Version Antivirus in the Network Cloud, 17th USENIX Security Symposium USENIX Security '08 Jul. 28-Aug. 1, 2008 San Jose, CA.
Vladimir Getov: “Security as a Service in Smart Clouds—Opportunities and Concerns”, Computer Software and Applications Conference (COMPSAC), 2012 IEEE 36th Annual, IEEE, Jul. 16, 2012 (Jul. 16, 2012).
Wahid et al., Characterising the Evolution in Scanning Activity of Suspicious Hosts, Oct. 2009, Third International Conference on Network and System Security, pp. 344-350.
Yuhei Kawakoya et al: “Memory behavior-based automatic malware unpacking in stealth debugging environment”, Malicious and Unwanted Software (Malware), 2010 5th International Conference on, IEEE, Piscataway, NJ, USA, Oct. 19, 2010, pp. 39-46, XP031833827, ISBN:978-1-4244-8-9353-1.
Zhang et al., The Effects of Threading, Infection Time, and Multiple-Attacker Collaboration on Malware Propagation, Sep. 2009, IEEE 28th International Symposium on Reliable Distributed Systems, pp. 73-82.
Continuations (2)
Number Date Country
Parent 14949770 Nov 2015 US
Child 16030759 US
Parent 13775174 Feb 2013 US
Child 14949770 US