The present invention relates generally to computer security and networks, and particularly to associating user identifiers in event logs with a user entity and generating a user entity profile based on events in the logs.
In many computers and network systems, multiple layers of security apparatus and software are deployed in order to detect and repel the ever-growing range of security threats. At the most basic level, computers use anti-virus software to prevent malicious software from running on the computer. At the network level, intrusion detection and prevention systems analyze and control network traffic to detect and prevent malware from spreading through the network.
The description above is presented as a general overview of related art in this field and should not be construed as an admission that any of the information it contains constitutes prior art against the present patent application.
There is provided, in accordance with an embodiment of the present invention, a method for protecting a computer system, including identifying, by a processor, multiple user identifiers associated with a single user entity, detecting a first event carried out using a first one of the user identifiers, detecting a second event carried out using a second one of the user identifiers that is different from the first one of the user identifiers, and in response to a combination of the first and the second events, issuing an alert.
In some embodiments, identifying the multiple user identifiers associated with the single user entity includes collecting a set of events including the first and the second events, extracting respective user identifiers from the events in the set, mapping the extracted user identifiers to respective accounts, and associating the accounts with respective user entities, wherein the single user entity includes one of the multiple user entities.
In a first embodiment, mapping a given extracted user identifier to a given account includes normalizing the given user identifier to a specific format, wherein the given account includes the normalized user identifier.
In a second embodiment, the single user entity is associated with one or more accounts.
In a third embodiment, multiple user identifiers map to a given account for the single user entity.
In additional embodiments, detecting the first event includes detecting the first event on a first networked entity, and wherein detecting the second event includes detecting the second event on a second networked entity different from the first networked entity.
In further embodiments, detecting the first event includes detecting multiple first events during a first time period, and the method further includes generating a profile in response to the multiple first events, wherein detecting a second event includes detecting one or more second events in a second time period subsequent to the first time period, and wherein the combination of the first and the second events includes detecting that the one or more second events are not in accordance with the profile.
In supplemental embodiments, the first event includes a time-based status of the single user entity, and wherein the second event is not in accordance with the time-based status.
There is also provided, in accordance with an embodiment of the present invention, an apparatus for protecting a computer network, including a network interface card (NIC), and at least one processor configured to identify multiple user identifiers associated with a single user entity, to detect a first event carried out using a first one of the user identifiers, to detect a second event carried out using a second one of the user identifiers that is different from the first one of the user identifiers, and in response to a combination of the first and the second events, to issue an alert.
There is additionally provided, in accordance with an embodiment of the present invention, a computer software product for protecting a computing system, the product including a non-transitory computer-readable medium, in which program instructions are stored, which instructions, when read by a computer, cause the computer to identify multiple user identifiers associated with a single user entity, to detect a first event carried out using a first one of the user identifiers, to detect a second event carried out using a second one of the user identifiers that is different from the first one of the user identifiers, and in response to a combination of the first and the second events, to issue an alert.
The disclosure is herein described, by way of example only, with reference to the accompanying drawings, wherein:
Networked entities that communicate over a computer network typically store logs that record events on the networked entities. While these logs can include identifiers for the events, user entities (e.g., employees of an organization) may use multiple accounts (e.g., email accounts) when accessing data on the network, and each account may use multiple identifiers when accessing data on the network. Therefore, it can be difficult to detect suspicious/malicious activity performed by a given user entity using different accounts and different user identifiers on the network.
Embodiments of the present invention provide methods and systems for protecting a computer system by identifying multiple user identifiers associated with a single user entity. Upon detecting a first event carried out using a first one of the user identifiers and detecting a second event carried out using a second one of the user identifiers that is different from the first one of the user identifiers, an alert can be issued in response to a combination of the first and the second events. In some embodiments, the first event is collected from a first log on a first networked entity and the second event is collected from a second log on a second networked entity different from the first networked entity.
In one embodiment, multiple events can be collected from multiple event logs on networked entities coupled to a computer network. Identifiers can be extracted from the events, the identifiers can be normalized so as to map the events to accounts, and a subset of the accounts can be associated with the single user entity. A user entity profile can then be generated based on the events associated with the single user entity. Using this embodiment, systems implementing embodiments of the present invention can detect and flag suspicious activity if any subsequent events associated with the single user entity are determined not to be in accordance with the user entity profile.
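The collect-extract-normalize-map-associate flow described in this embodiment can be sketched as follows in Python. This is a minimal illustration only: the event structure, the lowercasing normalization rule, and the account-to-entity mapping are assumptions made for the example, not part of any embodiment described herein.

```python
from collections import defaultdict

def normalize(raw_id):
    # Illustrative normalization rule: trim whitespace and lowercase.
    return raw_id.strip().lower()

def build_profiles(event_logs, account_to_entity):
    """Collect events from multiple logs, map each extracted identifier
    to an account, associate accounts with user entities, and group the
    events per entity to form a simple per-entity profile."""
    profiles = defaultdict(list)
    for log in event_logs:
        for event in log:
            for raw_id in event["identifiers"]:
                account = normalize(raw_id)
                entity = account_to_entity.get(account)
                if entity is not None:
                    profiles[entity].append(event["message"])
    return profiles

# Two logs from two networked entities, using two different identifiers
# that resolve to the same user entity.
logs = [
    [{"identifiers": ["Company\\jdoe"], "message": "login"}],
    [{"identifiers": ["John.Doe@company.com"], "message": "send email"}],
]
mapping = {"company\\jdoe": "John Doe", "john.doe@company.com": "John Doe"}
print(build_profiles(logs, mapping)["John Doe"])  # ['login', 'send email']
```

Grouping events from both logs under one entity is what later allows a deviation check against the combined profile rather than against each identifier in isolation.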
Account database server 29 may comprise a domain database management system (DBMS) application 31 and a domain database 37. Account database 33 comprises a set of account database records 35 that are described in the description referencing
Computing facility 20 may also comprise an Internet gateway 34, which couples computing facility 20 to a public network 36 such as the Internet. To protect computing devices 28, computing facility 20 may also comprise a firewall 38 that is coupled to LAN 32 and controls, based on predetermined security rules, data traffic between computing devices 28 and a data cloud 40 comprising one or more cloud servers 42.
As described supra, security server 22 can be configured to generate user entity profiles 24 based on activity recorded by a plurality of networked entities in respective event logs 26. While the configuration in
In the configuration shown in
In embodiments described herein, security server 22 can be configured to extract user identifiers (IDs) from logs 26, normalize the user IDs and associate the normalized user IDs with user entities (i.e., individual people such as employees). In some embodiments, HR server 30 stores an HR database 46 that stores information for each user entity. In some embodiments, HR database 46 comprises a set of records 47 that have a one-to-one correspondence with user entities (i.e., employees) of an organization.
Security server 22 comprises a processor 48, a memory 50 and a network interface card (NIC) 51 that couples the security server to LAN 32. In some embodiments, processor 48 can combine logs 26 into an aggregated event log 52. Event logs 26 and 52 are described respectively in the descriptions referencing
While in embodiments described herein, processor 48 collects events from event logs 26A-26D and stores them to aggregated event log 52, aggregating events from other types of event logs 26 into the aggregated event log is considered to be within the spirit and scope of the present invention. Examples of information that can be stored by one or more additional event logs 26 include, but are not limited to:
In the configuration shown in
In some embodiments, tasks described herein such as extracting user IDs from event logs 26, normalizing the user IDs, associating the normalized user IDs with user entities, aggregating logs 26 into aggregated event log 52, and generating user entity profiles 24 may be split among multiple computer systems 22, 28 and 30 within computing facility 20 or external to the computing facility (e.g., cloud servers 42). In additional embodiments, the functionality of some or all of computing devices 28, security server 22, account database server 29 and HR server 30 may be deployed in computing facility 20 and/or public network 36 as physical computing devices, virtual machines or containers.
In some embodiments, client computers 28 have respective host names 56 that can be used to identify each of the client computers.
Processor 48 comprises one or more general-purpose central processing units (CPUs) or special-purpose embedded processors, which are programmed in software or firmware to carry out the functions described herein. This software may be downloaded to security server 22 in electronic form, over a network, for example. Additionally or alternatively, the software may be stored on tangible, non-transitory computer-readable media, such as optical, magnetic, or electronic memory media. Further additionally or alternatively, at least some of the functions of processor 48 may be carried out by hard-wired or programmable digital logic circuits.
Examples of memory 50 include dynamic random-access memories, non-volatile random-access memories, hard disk drives and solid-state disk drives.
In the example shown in
Each event message 66 (i.e., referencing a given event) can have one or more user identifiers 68 (i.e., participants in the corresponding event). In one example, if a given event message corresponds to an event comprising a user entity sending an email, then the given event message 66 comprises a single identifier (ID) 68. In another example, if a given event message corresponds to an event comprising a first account associated with a first user entity granting one or more system permissions to a second account associated with a second user entity, then the given event message may comprise two identifiers 68.
In embodiments of the present invention, there are multiple user entities 67 (i.e., individual physical users) that operate computing devices using one or more respective accounts 69. As described hereinbelow, processor 48 can map each identifier 68 to a respective account 69, and then associate each account 69 with a respective user entity 67. Accounts 69 are described in the description referencing
In some embodiments, processor 48 can retrieve event log entries 60 from all the event logs (e.g., event logs 26A-26D), and store event information in the retrieved event log entries to aggregated event log 52. As described hereinbelow, processor 48 can use the information stored in aggregated event log 52 to map events to user entities.
Each aggregated event log entry 70 comprises an event ID 72, a source 74, a date 76, a time 78, an event message 80 and an identifier information record 82. Upon creating a new aggregated log entry 70 for a corresponding event log entry 60, processor 48 can:
In some embodiments, processor 48 can extract one or more user identifiers 68 from event messages 80, normalize the user IDs and associate the normalized user IDs with user entities. In the configuration shown in
Upon creating the new aggregated log entry (i.e., as described supra), processor 48 can identify a number (i.e., one or more) identifiers 68 in event message 80, add the identified number of identifier information records 82 to the new aggregated log entry so that each identifier 68 has a corresponding identifier information record 82, and populate each given identifier information record as follows:
In examples described hereinbelow, a given user entity 67 named “John Doe” works for a company “Company”, has multiple mapped accounts 88, each referenced by one or more identifiers 84.
Examples of identifier types 86 include, but are not limited to:
In some embodiments, account database 33 may comprise Directory Sync Service™ (DSS™), produced by Palo Alto Networks, Inc., and endpoint agents 44 may comprise XDR™. The XDR™ endpoint agent may interact with DSS™ to retrieve mappings between identifiers 68 and accounts 69.
For example, relationships between identifiers 68 and accounts 69 can be maintained by a directory services application (not shown) such as Active Directory™ (produced by Microsoft Corporation, Redmond, Washington, USA) that performs operations such as authenticating and authorizing all users and computers in a Windows™ domain type network, assigning and enforcing security policies for all computers, and installing or updating software. In this example, account DBMS 31 can query Active Directory™ to retrieve mappings between identifiers 68 and accounts 69 that comprise domain accounts.
User entity ID 100 comprises a unique identifier for a given user entity 67. In some embodiments, processor 48 can create a set of user entity records 54 that have a one-to-one correspondence with HR database records 47, and store a unique identifier to each user entity ID 100 in the set. Therefore, each given user entity (i.e., employee) 67 has a corresponding user entity record 54. User entity IDs 100 may also be referred to herein as user entities 100.
User entity profile 24 comprises a user profile indicating expected activity of the corresponding user entity. As described in the description referencing
Each status information record 104 comprises a start date 110, an end date 112 and a status 114. Each given status 114 spans a time period starting with start date 110 and ending with end date 112. In some embodiments, start date 110 and end date 112 may also include time (e.g., 13:30 on 12/11/22).
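The time-span semantics of these status information records can be illustrated with the following Python sketch; the field names and the "vacation" status value are hypothetical stand-ins for the record elements described above.

```python
from datetime import datetime

def status_at(status_records, when):
    """Return the status whose [start, end] span contains `when`, if any.
    Mirrors a record with a start date, an end date and a status value."""
    for rec in status_records:
        if rec["start"] <= when <= rec["end"]:
            return rec["status"]
    return None

records = [
    {"start": datetime(2022, 11, 1), "end": datetime(2022, 11, 14),
     "status": "vacation"},
]
# Dates may carry a time component, e.g. 13:30 on 12/11/22.
print(status_at(records, datetime(2022, 11, 12, 13, 30)))  # vacation
```

An event timestamped outside every recorded span simply yields no status, which is the case a later comparison step can treat as "not in accordance with" the time-based status.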
Examples of statuses 114 include, but are not limited to:
Each user entity ID 100 typically uses one or more email accounts. In the configuration shown in
Each account information record 106 can store information such as a unique account ID 116, an account name 118 (i.e., an email address such as "john.doe@company.com" or "john.doe@gmail.com") and an account type 120. In embodiments herein, account ID 116 may also be referred to as account 116.
Examples of account types 120 include, but are not limited to:
In embodiments of the present invention, processor 48 extracts identifiers 84 from event log entries 60 and normalizes the extracted identifiers so as to identify respective mapped accounts 88. For a given user entity 100 in the corresponding user entity record 54, processor 48 can store, in identifier-account mapping records 108, current mappings between the extracted identifiers and the associated accounts (i.e., both for the given user entity). Each identifier-account mapping record 108 in a given user entity record 54 (i.e., for a corresponding user entity 100) can store information such as:
In step 130, processor 48 initializes user entity records 54. In some embodiments as described supra, each user entity record 54 corresponds to a given HR database record 47 and a corresponding user entity 100. Additionally, when initializing user entity records 54, processor 48 can initialize user entity profiles 24 as well.
In step 132, processor 48 identifies event logs 26.
In step 134, processor 48 selects an unmapped event log entry 60 in a given event log 26. In embodiments herein, unmapped event log entries 60 comprise any of the event log entries not processed by steps 134-136 as described hereinbelow.
In step 136, processor 48 retrieves the selected event log entry. Upon retrieving the selected log entry, processor 48 can add a new aggregated log entry 70 to aggregated event log 52, and populate, in the new aggregated log entry, event ID 72, source 74, date 76, time 78 and event message 80 using embodiments described hereinabove.
In step 138, processor 48 identifies one or more identifiers 68 in event message 80 and stores the identified identifiers 68 to one or more extracted identifiers 84 (i.e., in one or more respective identifier information records 82).
In step 140, processor 48 normalizes the one or more extracted identifiers 84 to one or more specified formats so as to map each of the extracted identifiers to a respective account 116. In some embodiments, each account type 120 may have a corresponding specified format. Using the examples of account types described supra:
In some embodiments, the format for a given event is based on the source (e.g., the event log from which processor 48 retrieved the event log entry corresponding to the given event, the event type, or the field in the log entry corresponding to the given event) or on the content of the log entry corresponding to the given event. For example:
In some embodiments, there may be mappings from one or more extracted identifiers 84 (corresponding to respective identifiers 68) to a single account 116 (corresponding to a given account 69). For example:
In some embodiments, processor 48 can query account database records 35 so as to map the extracted identifiers to respective accounts 116.
Upon performing each mapping of a given extracted identifier 84, processor 48 stores the mapped account (ID) 116 to mapped account 88 in the identifier information record 82 storing the given extracted identifier. If any given mapping detected in step 140 is not already stored to user entity records 54, processor 48 can add a new identifier-account mapping record 108 in the user entity record storing the mapped account, and populate identifier 122, identifier type 124 and associated account ID 126 accordingly.
In a first normalization embodiment, processor 48 can normalize a given extracted identifier 84 by string manipulation (i.e., processor 48 stores the extracted identifiers as text strings). In this embodiment, processor 48 can normalize extracted identifiers 84 to enable correlations and queries. For example, processor 48 can use string manipulation to normalize both
In a second normalization embodiment, processor 48 can normalize a given extracted identifier 84 by using domain knowledge. In this embodiment, special identifiers can indicate the type and scope of the account (e.g., at the host or main levels) mapped to the given identifier. In the following examples, processor 48 can use domain knowledge to:
Domain knowledge enables processor 48 to differentiate between accounts that are typically managed differently in Active Directory™ and Kerberos realms, as well as in various data cloud environments.
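The first two normalization embodiments can be illustrated with a minimal Python sketch. The specific rewrite rules shown here (a NetBIOS-style `DOMAIN\user` form, a Kerberos-style `user@REALM` principal, an e-mail address) and the canonical `domain/user` output form are assumptions for illustration, not the formats of any particular embodiment.

```python
def normalize_identifier(raw_id):
    """Sketch of string-manipulation plus domain-knowledge normalization:
    different identifier shapes are case-folded and rewritten into one
    canonical form so that they map to the same account."""
    s = raw_id.strip().lower()
    if "\\" in s:                        # NetBIOS form, e.g. COMPANY\jdoe
        domain, user = s.split("\\", 1)
        return f"{domain}/{user}"
    if "@" in s:
        user, realm = s.split("@", 1)
        if "." in realm:                 # e-mail style: keep as an address
            return s
        return f"{realm}/{user}"         # Kerberos-like, e.g. jdoe@COMPANY
    return s

print(normalize_identifier("COMPANY\\jdoe"))         # company/jdoe
print(normalize_identifier("jdoe@COMPANY"))          # company/jdoe
print(normalize_identifier("John.Doe@Company.com"))  # john.doe@company.com
```

The point of the sketch is that two syntactically different identifiers ("COMPANY\jdoe" and "jdoe@COMPANY") normalize to the same account string, which is what later enables correlation across event logs.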
In a third normalization embodiment, processor 48 can normalize a given extracted identifier 84 by using prior learned knowledge. In this embodiment, processor 48 can use learned roles and Directory Synchronization Service (DSS™) to determine the account for a given extracted identifier 84. In the following examples, processor 48 can use prior learned knowledge as follows:
Returning to the flow diagram, in step 142, for each given mapped account 88, processor 48 associates a given user entity 100 with a given mapped account 88. In some embodiments, each user entity 100 may be associated with one or more accounts 116. For example, as described supra, the mapped accounts may comprise “Company/jdoe”, “host123/jdoe”, “john.doe@company.com” and “john.doe@gmail.com”. All these mapped accounts 88 may be associated with a given user entity named “John Doe”.
In a first association embodiment, processor 48 can use information stored in HR database 46 and/or account database 33 so as to associate a given account 69 with a given user entity 67. For example, if processor 48 uses account database 33 to map a given identifier 68 to a given account 69 "john.doe@gmail.com", and identifies a given user entity 67 named "John Doe" in HR database 46, then the processor can associate the given account with the given user entity as they have the same name.
In a second association embodiment, processor 48 can use heuristics to associate the given user entity with the given mapped account. For example, if “john.doe@gmail[.]com” matches DSS display name “John Doe” then they likely refer to the same user entity 100.
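A heuristic of this kind can be sketched as follows; the tokenization rule (splitting the address's local part on dots, underscores and hyphens and comparing it to the display-name words) is an assumption made for illustration.

```python
import re

def likely_same_entity(email, display_name):
    """Heuristic match between an e-mail address and a display name,
    e.g. 'john.doe@gmail.com' vs. 'John Doe' (illustrative rule only)."""
    local = email.split("@", 1)[0].lower()
    tokens = re.split(r"[._\-]+", local)          # 'john.doe' -> ['john', 'doe']
    name_tokens = [t.lower() for t in display_name.split()]
    return tokens == name_tokens

print(likely_same_entity("john.doe@gmail.com", "John Doe"))  # True
print(likely_same_entity("jsmith@gmail.com", "John Doe"))    # False
```

A production system would typically treat such a match as evidence to be combined with other signals rather than as a definitive association.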
In a third association embodiment, processor 48 can use profiling and attribution to associate the given user entity with the given mapped account. In one profiling example, processor 48 can determine that the computing device having the host name “host123” is mostly used by a single user entity 100 “company\jdoe”. In a second profiling example, processor 48 can determine that the account “john.doe@gmail[.]com” always originates log entries 60 from the computing device having the host name “host_123”.
In a first attribution example, processor 48 can determine that the computing device having the host name “host_123” is a personal endpoint used by the user entity “jdoe”. In a second attribution example, processor 48 can determine that “john.doe@gmail[.]com” is the personal email of the user entity “jdoe”. In a third attribution example, processor 48 can determine that the user entity “jdoe” likely has access to the account “host_123\Administrator”.
Returning to the flow diagram, in step 144, processor 48 identifies one or more of the user entities that participated in the event corresponding to the selected log entry.
In step 146, processor 48 updates, with the event indicated by the event message in the selected log entry, the user entity profile for each of the user entities identified in step 144. In some embodiments, processor 48 can update user entity profiles 24 with the event indicated in the selected log entry only if the event was within a specified time period (e.g., the last 30 days).
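The windowed profile update in this step can be sketched as follows; modeling the profile as simple per-event-kind counts, and the 30-day window, are illustrative choices (the 30-day figure is the example given above).

```python
from datetime import datetime, timedelta

def update_profile(profile, event, now, window_days=30):
    """Add an event to the entity profile only if it occurred within
    the profiling window (e.g., the last 30 days)."""
    if now - event["timestamp"] <= timedelta(days=window_days):
        profile[event["kind"]] = profile.get(event["kind"], 0) + 1

profile = {}
now = datetime(2022, 12, 11)
update_profile(profile, {"kind": "login",
                         "timestamp": datetime(2022, 12, 1)}, now)   # counted
update_profile(profile, {"kind": "login",
                         "timestamp": datetime(2022, 9, 1)}, now)    # too old
print(profile)  # {'login': 1}
```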
In step 148, if there are any unmapped log entries 60, then the method continues with step 132. The method ends when there are no unmapped log entries 60.
Once processor 48 creates profiles 24, the processor can use the profiles to detect a single user entity 100 using multiple identifiers 122 to perform malicious activity in computing facility 20. For example, processor 48 can:
While each of these individual events may seem legitimate, embodiments of the present invention enable correlating these three events to a single user entity 100 “John Doe”. Correlating multiple events having multiple identifiers 122 enables processor 48 to detect a suspicious sequence of events that are tied to a single user entity 100.
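Correlating events recorded under different identifiers into one per-entity timeline can be sketched as follows; the identifiers, actions and identifier-to-entity mapping are illustrative.

```python
def correlate(events, id_to_entity):
    """Group events by the user entity behind each identifier, so that a
    sequence spread over several identifiers becomes a single timeline."""
    timeline = {}
    for e in sorted(events, key=lambda e: e["time"]):
        entity = id_to_entity.get(e["id"])
        timeline.setdefault(entity, []).append(e["action"])
    return timeline

mapping = {"company\\jdoe": "John Doe",
           "host123\\jdoe": "John Doe",
           "john.doe@gmail.com": "John Doe"}
events = [
    {"time": 1, "id": "company\\jdoe", "action": "download report"},
    {"time": 2, "id": "host123\\jdoe", "action": "copy to local disk"},
    {"time": 3, "id": "john.doe@gmail.com", "action": "mail attachment"},
]
print(correlate(events, mapping)["John Doe"])
```

Each event on its own may look legitimate; only the merged timeline reveals the suspicious sequence attributable to one user entity.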
In step 150, at a time subsequent to generating profiles 24 as described in the description referencing
In step 152, processor 48 associates each of the events in the event messages in the additional event log entries with respective user entities 100, using embodiments described in the description referencing steps 140-142 in
In step 154, processor 48 updates status information records 104 with any updates to HR database 46 and updates user entity profiles 24 accordingly. For example, the user entity “John Doe” may be on vacation.
Processor 48 then selects an unselected user entity 100.
In step 156, processor 48 compares the additional events for the selected user entity to user entity profile 24 of the selected user entity.
In step 158, processor 48 determines, based on the user entity profile, whether or not the additional events comprise suspicious activity. In some embodiments, each user profile 24 can include information from status records 104 for the corresponding user entity 100. For example, if a given status 114 for a given user entity 100 indicates that the given user entity is retired, and processor 48 detects events associated with the user entity subsequent to the retirement, then the processor can classify those events as suspicious since the events are not in accordance with the retirement status in the user entity profile.
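The retirement example above can be sketched as the following check; the record fields and the "retired" status label are hypothetical stand-ins for the status information described earlier.

```python
from datetime import date

def is_suspicious(event_date, status_records):
    """Flag an event that conflicts with the entity's time-based status,
    e.g. any activity dated after a 'retired' status begins."""
    for rec in status_records:
        if rec["status"] == "retired" and event_date >= rec["start"]:
            return True
    return False

records = [{"status": "retired", "start": date(2022, 6, 1)}]
print(is_suspicious(date(2022, 7, 15), records))  # True: activity after retirement
print(is_suspicious(date(2022, 1, 15), records))  # False: activity before retirement
```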
If the additional events comprise suspicious activity, then in step 160, processor 48 issues an alert for the selected user entity. In one embodiment, the suspicious activity may combine a first event in a first given event log entry 60 that processor 48 used to generate the user entity profile, and a second event in a second given event log entry 60 that processor 48 collected in step 150. In this embodiment, the first and the second given event log entries are mapped to different identifiers 122 associated with the same user entity 100.
To issue the alert, processor 48 can perform operations such as transmitting a message to a system administrator (not shown) or restricting access to any of the accounts associated with the selected user entity.
In step 162, processor 48 updates the user entity profile of the selected user entity with the additional events associated with the selected user entity.
In step 164, if there are any unselected user entities 100 (i.e., in step 156), then the method continues with step 156. If there are no unselected user entities 100, then the method ends.
Returning to step 158, if processor 48 did not detect, based on the user entity profile, any suspicious activity in the additional events, then the method continues with step 162.
It will be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
20120331553 | Aziz et al. | Dec 2012 | A1 |
20130031600 | Luna et al. | Jan 2013 | A1 |
20130061045 | Kiefer et al. | Mar 2013 | A1 |
20130083700 | Sndhu et al. | Apr 2013 | A1 |
20130097706 | Titonis et al. | Apr 2013 | A1 |
20130111211 | Winslow et al. | May 2013 | A1 |
20130031037 | Brandt et al. | Jul 2013 | A1 |
20130196549 | Sorani | Aug 2013 | A1 |
20130298237 | Smith | Nov 2013 | A1 |
20130298243 | Kumar et al. | Nov 2013 | A1 |
20130333041 | Christodorescu et al. | Dec 2013 | A1 |
20140010367 | Wang | Jan 2014 | A1 |
20140013434 | Ranum et al. | Jan 2014 | A1 |
20140165207 | Engel et al. | Jun 2014 | A1 |
20140198669 | Brown et al. | Jul 2014 | A1 |
20140201776 | Minemura et al. | Jul 2014 | A1 |
20140230059 | Wang | Aug 2014 | A1 |
20140325643 | Bart et al. | Oct 2014 | A1 |
20150026810 | Friedrichs et al. | Jan 2015 | A1 |
20150040219 | Garraway et al. | Feb 2015 | A1 |
20150047032 | Hannis et al. | Feb 2015 | A1 |
20150071308 | Webb, III et al. | Mar 2015 | A1 |
20150121461 | Dulkin et al. | Apr 2015 | A1 |
20150156270 | Teraoka et al. | Jun 2015 | A1 |
20150180883 | Aktas et al. | Jun 2015 | A1 |
20150195300 | Adjaoute | Jul 2015 | A1 |
20150207694 | Inches et al. | Jul 2015 | A1 |
20150264069 | Beauchesne et al. | Sep 2015 | A1 |
20150295903 | Yi et al. | Oct 2015 | A1 |
20150304346 | Kim | Oct 2015 | A1 |
20150341380 | Heo et al. | Nov 2015 | A1 |
20150341389 | Kurakami | Nov 2015 | A1 |
20150356451 | Gupta et al. | Dec 2015 | A1 |
20160021141 | Liu et al. | Jan 2016 | A1 |
20160119292 | Kaseda et al. | Apr 2016 | A1 |
20160127390 | Lai et al. | May 2016 | A1 |
20160142746 | Schuberth | May 2016 | A1 |
20160191918 | Lai et al. | Jun 2016 | A1 |
20160234167 | Engel et al. | Aug 2016 | A1 |
20160247163 | Donsky et al. | Aug 2016 | A1 |
20160315954 | Peterson et al. | Oct 2016 | A1 |
20160323299 | Huston, III | Nov 2016 | A1 |
20160359895 | Chiu et al. | Dec 2016 | A1 |
20170007128 | Takano | Jan 2017 | A1 |
20170026387 | Vissamsetty et al. | Jan 2017 | A1 |
20170026395 | Mumcuoglu et al. | Jan 2017 | A1 |
20170054744 | Mumcuoglu et al. | Feb 2017 | A1 |
20170063921 | Fridman et al. | Mar 2017 | A1 |
20170078312 | Yamada et al. | Mar 2017 | A1 |
20170111376 | Friedlander et al. | Apr 2017 | A1 |
20170171229 | Arzi et al. | Jun 2017 | A1 |
20170262633 | Miserendino et al. | Sep 2017 | A1 |
20170289178 | Roundy et al. | Oct 2017 | A1 |
20170294112 | Kushnir | Oct 2017 | A1 |
20170374090 | McGrew et al. | Dec 2017 | A1 |
20180004948 | Martin et al. | Jan 2018 | A1 |
20180007013 | Wang | Jan 2018 | A1 |
20180048662 | Jang et al. | Feb 2018 | A1 |
20180077189 | Doppke et al. | Mar 2018 | A1 |
20180288081 | Yermakov | Oct 2018 | A1 |
20180332064 | Harris et al. | Nov 2018 | A1 |
20180365416 | Monastyrsky et al. | Dec 2018 | A1 |
20180373820 | Knezevic et al. | Dec 2018 | A1 |
20190036978 | Shulman-Peleg et al. | Jan 2019 | A1 |
20190044963 | Rajasekharan et al. | Feb 2019 | A1 |
20190044965 | Pilkington | Feb 2019 | A1 |
20190068620 | Avrahami et al. | Feb 2019 | A1 |
20190075344 | Brown | Mar 2019 | A1 |
20190207966 | Vashisht et al. | Jul 2019 | A1 |
20190297097 | Gong et al. | Sep 2019 | A1 |
20190319981 | Meshi et al. | Oct 2019 | A1 |
20190334931 | Arlitt et al. | Oct 2019 | A1 |
20200007566 | Wu | Jan 2020 | A1 |
20200082296 | Fly et al. | Mar 2020 | A1 |
20200136889 | Chen | Apr 2020 | A1 |
20200145435 | Chiu et al. | May 2020 | A1 |
20200162252 | Davis | May 2020 | A1 |
20200162494 | Rostami-Hesarsorkh | May 2020 | A1 |
20200195673 | Lee | Jun 2020 | A1 |
20200244658 | Meshi et al. | Jul 2020 | A1 |
20200244675 | Meshi et al. | Jul 2020 | A1 |
20200244676 | Amit et al. | Jul 2020 | A1 |
20200244683 | Meshi et al. | Jul 2020 | A1 |
20200244684 | Meshi et al. | Jul 2020 | A1 |
20200274894 | Argoeti et al. | Aug 2020 | A1 |
20200285737 | Kraus et al. | Sep 2020 | A1 |
20200293917 | Wang et al. | Sep 2020 | A1 |
20200327221 | Street | Oct 2020 | A1 |
20200327225 | Nguyen et al. | Oct 2020 | A1 |
20200342230 | Tsai | Oct 2020 | A1 |
20200374301 | Manevich et al. | Nov 2020 | A1 |
20210004458 | Edwards et al. | Jan 2021 | A1 |
20210176261 | Yavo et al. | Jun 2021 | A1 |
20210182387 | Zhu et al. | Jun 2021 | A1 |
20210209228 | Maor | Jul 2021 | A1 |
20210224676 | Arzani et al. | Jul 2021 | A1 |
20220129551 | Collier et al. | Apr 2022 | A1 |
20220138856 | Ahlstrom | May 2022 | A1 |
20220217156 | Wahbo | Jul 2022 | A1 |
20230171235 | Chhibber et al. | Jun 2023 | A1 |
Number | Date | Country |
---|---|---|
3041875 | Nov 2019 | CA |
103561048 | Feb 2014 | CN |
0952521 | Oct 1999 | EP |
2056559 | May 2009 | EP |
03083660 | Oct 2003 | WO |
Entry |
---|
U.S. Appl. No. 17/571,558 Office Action dated Jun. 26, 2023. |
Palo Alto Networks, Inc., “CORTEX XSOAR—Redefining Security Orchestration and Automation,” Product Information, pp. 1-2, year 2020. |
Light Cyber Ltd, “LightCyber Magna”, pp. 1-3, year 2011. |
TIER-3 Pty Ltd, “Huntsman Protector 360”, Brochure, pp. 1-2, Apr. 1, 2010. |
TIER-3 Pty Ltd, “Huntsman 5.7 The Power of 2”, Brochure, pp. 1-2, Oct. 8, 2012. |
Bilge et al., “Disclosure: Detecting Botnet Command and Control Servers Through Large-Scale NetFlow Analysis”, ACSAC, pp. 1-10, Dec. 3-7, 2012. |
Blum et al., “Combining Labeled and Unlabeled Data with Co-Training”, Carnegie Mellon University, Research Showcase @ CMU, Computer Science Department, pp. 1-11, Jul. 1998. |
Felegyhazi et al., “On the Potential of Proactive Domain Blacklisting”, LEET'10 Proceedings of the 3rd USENIX Conference on Large-scale exploits and emergent threats, pp. 1-8, San Jose, USA, Apr. 27, 2010. |
Frosch, “Mining DNS-related Data for Suspicious Features”, Ruhr Universitat Bochum, Master's Thesis, pp. 1-88, Dec. 23, 2011. |
Bilge et al., “Exposure: Finding Malicious Domains Using Passive DNS Analysis”, NDSS Symposium, pp. 1-17, Feb. 6-9, 2011. |
Stone-Gross et al., “FIRE: Finding Rogue Networks”, Annual Conference on Computer Security Applications (ACSAC'09), pp. 1-10, Dec. 7-11, 2009. |
Markowitz, N., “Bullet Proof Hosting: A Theoretical Model”, Security Week, pp. 1-5, Jun. 29, 2010, downloaded from http://www.infosecisland.com/blogview/4487-Bullet-Proof-Hosting-A-Theoretical-Model.html. |
Konte et al., “ASwatch: An AS Reputation System to Expose Bulletproof Hosting ASes”, SIGCOMM, pp. 625-638, Aug. 17-21, 2015. |
Markowitz, N., “Patterns of Use and Abuse with IP Addresses”, Security Week, pp. 1-4, Jul. 10, 2010, downloaded from http://infosecisland.com/blogview/5068-Patterns-of-Use-and-Abuse-with-IP-Addresses.html. |
Wei et al., “Identifying New Spam Domains by Hosting IPs: Improving Domain Blacklisting”, Department of Computer and Information Sciences, University of Alabama at Birmingham, USA, pp. 1-8, Dec. 8, 2010. |
Goncharov, M., “Criminal Hideouts for Lease: Bulletproof Hosting Services”, Forward-Looking Threat Research (FTR) Team, A TrendLabsSM Research Paper, pp. 1-28, Jul. 3, 2015. |
Xu, “Correlation Analysis of Intrusion Alerts,” Dissertation in Computer Science submitted to the Graduate Faculty, North Carolina State University, pp. 1-206, year 2006. |
U.S. Appl. No. 17/038,285 Office Action dated Mar. 21, 2022. |
International Application # PCT/IB2022/059544 Search Report dated Jan. 20, 2023. |
International Application # PCT/IB2022/060920 Search Report dated Feb. 7, 2023. |
EP Application # 19832439.4 Office Action dated Mar. 1, 2023. |
U.S. Appl. No. 17/175,720 Office Action dated Mar. 20, 2023. |
International Application # PCT/IB2022/061926 Search Report dated Mar. 27, 2023. |
U.S. Appl. No. 17/700,579 Office Action dated Mar. 23, 2023. |
U.S. Appl. No. 17/464,716 Office Action dated Apr. 14, 2023. |
U.S. Appl. No. 17/464,709 Office Action dated Apr. 14, 2023. |
U.S. Appl. No. 17/175,720 Office Action dated Nov. 7, 2022. |
U.S. Appl. No. 17/506,713 Office Action dated Nov. 8, 2022. |
Brownlee et al., “Traffic Flow Measurement: Architecture,” Request for Comments 2722, Network Working Group, pp. 1-48, Oct. 1999. |
“PA-3250 Next Generation Firewall,” PA-3200 Series, Datasheet, Palo Alto Networks, Inc., Santa Clara, CA, USA, pp. 1-4, year 2021. |
“What is PCI DSS?” Palo Alto Networks, Cyberpedia, pp. 1-5, year 2021, as downloaded from https://www.paloaltonetworks.com/cyberpedia/what-is-a-pci-dss. |
Wikipedia, “Active Directory,” pp. 1-14, last edited Oct. 2021. |
International Application # PCT/IB2021/058621 Search Report dated Dec. 14, 2021. |
Steimberg et al., U.S. Appl. No. 17/038,285, filed Sep. 30, 2020. |
Niksun, “Network Intrusion Forensic System (NIFS) for Intrusion Detection and Advanced Post Incident Forensics”, Whitepaper, pp. 1-12, Feb. 15, 2010. |
Shulman, A., “Top Ten Database Security Threats How to Mitigate the Most Significant Database Vulnerabilities”, White Paper, pp. 1-14, year 2006. |
Asrigo et al., “Using VMM-based sensors to monitor honeypots,” Proceedings of the 2nd International Conference on Virtual Execution Environments, pp. 13-23, Jun. 14, 2006. |
Bhuyan et al., “Surveying Port Scans and Their Detection Methodologies”, Computer Journal, vol. 54, No. 10, pp. 1565-1581, Apr. 20, 2011. |
Skormin, “Anomaly-Based Intrusion Detection Systems Utilizing System Call Data”, Watson School of Engineering at Binghamton University, pp. 1-82, Mar. 1, 2012. |
Palo Alto Networks, “Cortex XDR”, datasheet, pp. 1-7, year 2020. |
Palo Alto Networks, “WildFire”, datasheet, pp. 1-6, year 2020. |
Barford et al., “Characteristics of Network Traffic Flow Anomalies,” Proceedings of the 1st ACM SIGCOMM Workshop on Internet Measurement, pp. 69-73, year 2001. |
U.S. Appl. No. 17/700,579 Office Action dated Oct. 13, 2023. |
AU Application # 2021351215 Office Action dated Nov. 28, 2023. |
U.S. Appl. No. 17/676,275 Office Action dated Feb. 29, 2024. |
Number | Date | Country | |
---|---|---|---|
20230117268 A1 | Apr 2023 | US |