Network and computer security are of paramount concern in the industry today. It seems as if a week does not go by without some news about a network system being compromised. Moreover, this is not just a private-industry problem, as governmental agencies experience security breaches with as much frequency as the private sector.
Companies are always adjusting network security systems and techniques to stay ahead of ever-changing external and internal breach attempts. In addition, governmental regulations concerning privacy and access to company information and assets create a tremendous amount of overhead, which any security adjustments have to account for.
Still further, an average company may have thousands of authorized user accounts providing varying levels of access to that company's assets. Companies have to manage each of these accounts to protect against unauthorized access and to ensure that governmental compliance is maintained.
Many companies perform access reviews on their network accounts to enforce a principle of least privilege, which means that users are granted access only to the resources that they need and to which they have been granted legitimate access. These reviews may need to be performed based on: any changes made to existing network security techniques, government compliance requirements, specific reported security violations, normal internal auditing, etc.
However, even for an average- or small-sized company, the burden of performing access reviews can be a tremendous undertaking for staff and network computing resources because of the number of accounts and access permissions/restrictions embedded in the security systems.
Various embodiments of the invention provide methods and a system for identifying and performing security access reviews. In an embodiment, a method for processing a security access review is presented.
Specifically, in an embodiment, security permissions are obtained for principal accounts associated with principals. A security permissions profile is identified and a value for each principal account is generated based on the security permissions for that principal account and the security permission profile. Finally, select principal accounts are separated out for security access review based on the generated values for the select principal accounts.
A “resource” includes: a user, service, system, a hardware device, a virtual device, directory, data store, groups of users, files, combinations and/or collections of these things, etc. A “principal” is a specific type of resource, such as an automated service or user that at one time or another is an actor on another principal or another type of resource. A designation as to what is a resource and what is a principal can change depending upon the context of any given network transaction. Thus, if one resource attempts to access another resource, the actor of the transaction may be viewed as a principal. Resources can acquire and be associated with unique identities to identify unique resources during network transactions.
An “identity” is something that is formulated from one or more identifiers and secrets that provide a statement of roles and/or permissions that the identity has in relation to resources. An “identifier” is information, which may be private and permits an identity to be formed, and some portions of an identifier may be public information, such as a user identifier, name, etc. Some examples of identifiers include social security number (SSN), user identifier and password pair, account number, retina scan, fingerprint, face scan, Media Access Control (MAC) address, Internet Protocol (IP) address, device serial number, etc.
A “processing environment” defines a set of cooperating computing resources, such as machines (processor and memory-enabled devices), storage, software libraries, software systems, etc. that form a logical computing infrastructure. A “logical computing infrastructure” means that computing resources can be geographically distributed across a network, such as the Internet. So, one computing resource at network site X can be logically combined with another computing resource at network site Y to form a logical processing environment. Moreover, a processing environment can be layered on top of a hardware set of resources (hardware processors, storage, memory, etc.) as a Virtual Machine (VM) or a virtual processing environment.
The phrases “processing environment,” “cloud processing environment,” “hardware processing environment,” and the terms “cloud” and “VM” may be used interchangeably and synonymously herein.
Moreover, it is noted that a “cloud” refers to a logical and/or physical processing environment as discussed above.
A “service” as used herein is an application or software module that is implemented in a non-transitory computer-readable storage medium or in hardware memory as executable instructions that are executed by one or more hardware processors within one or more different processing environments. The executable instructions are programmed into memory and executed by the hardware processors. A “service” can also be a collection of cooperating sub-services; such a collection is referred to as a “system.”
A single service can execute as multiple different instances of a same service over a network.
Various embodiments of this invention can be implemented as enhancements within existing network architectures and network-enabled devices.
Also, any software presented herein is implemented in (and resides within) hardware machines, such as hardware processor(s) or hardware processor-enabled devices (having hardware processors). These machines are configured and programmed to specifically perform the processing of the methods and system presented herein. Moreover, the methods and system are implemented and reside within a non-transitory computer-readable storage medium or memory as executable instructions that are processed on the machines (processors) configured to perform the methods.
Of course, the embodiments of the invention can be implemented in a variety of architectural platforms, devices, operating and server systems, and/or applications. Any particular architectural layout or implementation presented herein is provided for purposes of illustration and comprehension of particular embodiments only and is not intended to limit other embodiments of the invention presented herein and below.
It is within this context that embodiments of the invention are now discussed within the context of the
Each depicted user in the diagram includes a shade of grey. The shades of grey to black are intended to visually illustrate similarities or differences between user account attributes.
“Account attributes” includes security information and/or enterprise information assigned to a particular user. For example, account attributes can include access permissions with respect to resources (as defined above), such as: read (view only), no access, write access (viewing and modifying (delete, change, create)). The account attributes can also include enterprise information, such as: employee id, employee name, department, management level, job title, groups assigned to within the enterprise, roles within the enterprise, etc. The account attributes can be assigned in any enterprise combination for each given user account.
The circle that encircles three users is intended to illustrate similarities with those user accounts with one another as opposed to the remaining user accounts.
It is to be noted, throughout this discussion, that the term “user” can also include an automated resource, such as a service or a program, because accounts can be established within an enterprise for automated programs as well, and such accounts may include their own account attributes. In this manner, the discussion provided herein can include automated accounts that are depicted as user accounts and that represent a valid user (an automated resource) within the enterprise.
Moreover, the account attributes can be assembled from multiple different locations as needed, such that an employee identity can be used from the account attributes to obtain and acquire the security settings (permissions) as additional account attributes.
The security permissions can describe any of the following: 1) actions that the user can take within an application (a type of resource), such as running a report, etc.; 2) items that the user may possess or may need to possess (such as an identity badge, etc.); and 3) resources that the user can access (such as a building, a server, a specific service, a specific directory, etc.).
The security permissions are granted either directly or as the result of a specific user account attribute (such as department, job code, job title, etc.). The combination of all of a given user's security permissions determines his/her access.
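For illustration only, the following minimal sketch shows one way the account attributes and security permissions described above might be represented in code; the field names, permission strings, and values are hypothetical and are not part of any particular security system.

```python
from dataclasses import dataclass, field


@dataclass
class PrincipalAccount:
    """Hypothetical record combining enterprise account attributes and security permissions."""
    account_id: str
    employee_name: str
    department: str
    job_title: str
    # Security permissions granted directly or via attributes such as
    # department, job code, or job title.
    permissions: set = field(default_factory=set)


# Example: a manager with permissions granted directly and by job title.
alice = PrincipalAccount(
    account_id="u1001",
    employee_name="Alice Example",
    department="Finance",
    job_title="Manager",
    permissions={"payroll:view", "reports:run", "server-room:badge"},
)
print(alice.job_title, alice.permissions)
```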
Access reviews are processed to certify that users have only the level of access that they need to do their jobs within the enterprise.
For purposes of the discussion presented herein, the following assumptions are made: 1) the security system includes identities for principals (users or an automated service) that are collected from a variety of identity sources; 2) the security system includes security permissions, collected from application sources, for the collected identities; 3) the security permissions of an identity vary based on direct assignments, job codes, job titles, department assignments, etc.; and 4) identities within the security system have some security permissions in common based on either direct assignments, job codes, job titles, departments, etc.
The
The similarity in security permissions is calculated by processing statistical algorithms, such as the Jaccard Index or Sorensen Index.
The first formula depicted in the
The second formula depicted in the
For example, given a set A={1, 2, 3} and a set B={2, 3, 4}, the intersection set |A ∩ B|={2, 3} and the union |A ∪ B|={1, 2, 3, 4}. The Jaccard and Sorensen indexes are calculated as:
J(A, B)=2/4=0.5 (Jaccard Index); and
S(A, B)=2(2)/6=0.667 (Sorensen Index).
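A minimal sketch of these two calculations, assuming plain Python sets (the function names are illustrative only):

```python
def jaccard_index(a: set, b: set) -> float:
    """J(A, B) = |A intersect B| / |A union B|."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0


def sorensen_index(a: set, b: set) -> float:
    """S(A, B) = 2|A intersect B| / (|A| + |B|)."""
    total = len(a) + len(b)
    return 2 * len(a & b) / total if total else 1.0


A = {1, 2, 3}
B = {2, 3, 4}
print(jaccard_index(A, B))   # 0.5
print(sorensen_index(A, B))  # 0.666...
```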
The similarity index can be calculated on either: 1) a security permission profile and/or 2) a group of principals.
The security permission profile is a selected list of security permissions or security permissions derived from a given selected principal (user or automated service).
A similarity index is generated for each principal in the sample set based on the similarity of that principal's security permissions (account attributes) to the security permission profile (selected or derived from a particular principal's account attributes). These similarity indexes show how similar each of the principals is with respect to the security permission profile, and the similarity indexes can be processed to bulk certify all principals within a selected similarity range (set as a predefined value by a reviewer). This reduces the number of principal accounts that require a manual detailed review, leaving only outliers for detailed review.
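For illustration, the following sketch shows one way such profile-based bulk certification might look; the profile, the permission strings, and the 0.8 lower bound are hypothetical choices, not values prescribed by the approaches herein.

```python
def jaccard_index(a: set, b: set) -> float:
    # Same calculation as in the earlier sketch.
    union = a | b
    return len(a & b) / len(union) if union else 1.0


# Hypothetical security permission profile and principal accounts.
profile = {"payroll:view", "reports:run", "email:send"}
principals = {
    "u1001": {"payroll:view", "reports:run", "email:send"},
    "u1002": {"payroll:view", "reports:run"},
    "u1003": {"server-room:badge", "email:send"},
}

threshold = 0.8  # reviewer-defined similarity range (expressed here as a lower bound)
auto_certified, outliers = [], []
for account, permissions in principals.items():
    index = jaccard_index(permissions, profile)
    (auto_certified if index >= threshold else outliers).append((account, index))

print("auto-certified:", auto_certified)          # e.g. [('u1001', 1.0)]
print("outliers for detailed review:", outliers)  # everything below the threshold
```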
The
To further illustrate, consider a group of 25 principals, each principal assigned some subset of 45 different security permissions along with 2 permission profiles. The two permission profiles (in this example) are as follows.
The similarity indexes of the 25 example users (principals) using the Jaccard Index calculation (as identified in the
The
The
Here, the similarity index is calculated for each user in the sample set to every other user in the group based on a specific account attribute (such as job code, job title, etc.).
A similarity index value is calculated for each user to every other user in the sample set. For example, the sample set may be all users with the title or job code of manager. The resulting similarity matrix would indicate how similar a user is to every other user in the sample set.
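A small sketch of how such a pairwise similarity matrix might be assembled for a sample set sharing a job code; the account identifiers and permission strings are placeholders.

```python
from itertools import combinations


def jaccard_index(a: set, b: set) -> float:
    # Same calculation as in the earlier sketch.
    union = a | b
    return len(a & b) / len(union) if union else 1.0


# Hypothetical sample set: accounts sharing the job code "manager".
sample_set = {
    "mgr01": {"payroll:view", "reports:run"},
    "mgr02": {"payroll:view", "reports:run", "email:send"},
    "mgr03": {"server-room:badge"},
}

# matrix[a][b] holds the pairwise similarity index for accounts a and b.
matrix = {account: {account: 1.0} for account in sample_set}
for a, b in combinations(sample_set, 2):
    value = jaccard_index(sample_set[a], sample_set[b])
    matrix[a][b] = value
    matrix[b][a] = value

print(matrix["mgr01"]["mgr02"])  # 0.666..., fairly similar managers
print(matrix["mgr01"]["mgr03"])  # 0.0, a potential outlier within the group
```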
The table that follows is an example of what a similarity matrix may look like for a sample set of eleven users that have the same job description along with a graph (the
Many times within an organization there are certain security permissions that are considered more critical or higher risk than others. For example, a security permission granting someone access to view the company payroll or access to a room that holds critical computer systems is higher risk than granting someone access to the break room.
In some embodiments, to account for designated critical assets, a weighting factor can be processed within a statistical analysis. This allows permissions with a greater weight to have a greater impact on the calculated similarity index values.
Consider the original example where the Jaccard Index is for the set A={1, 2, 3} and the set B={2, 3, 4}.
J(A, B)=2/4=0.5 (calculated Jaccard Index value).
Now assume that {2} has a weight factor of 5 because it has a greater risk than the other set items. The new weighted sets become: A={1, 2, 2, 2, 2, 2, 3} and B={2, 2, 2, 2, 2, 3, 4}. The new weighted intersection set |A ∩ B|={2, 2, 2, 2, 2, 3} and |A ∪ B|={1, 2, 2, 2, 2, 2, 3, 4}. For the weight factor, consider each element {2} a derivative of the original, so that the number of intersecting elements increases from 2 to 6 and the total number of elements increases from 4 to 8. The new weighted Jaccard Index value is calculated as follows:
J(A, B)=6/8=0.75.
So, weighting element {2} by a factor of 5 caused the Jaccard Index value for the Sets A and B to increase from 0.5 to 0.75 (50% increase).
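One possible way to express this weighting scheme in code is with multisets (Python Counter objects), repeating a weighted element as in the worked example above; the weight map shown is illustrative only.

```python
from collections import Counter


def weighted_jaccard(a: set, b: set, weights: dict) -> float:
    """Jaccard index over multisets in which each element is repeated by its weight."""
    wa = Counter({x: weights.get(x, 1) for x in a})
    wb = Counter({x: weights.get(x, 1) for x in b})
    intersection = sum((wa & wb).values())  # element-wise minimum counts
    union = sum((wa | wb).values())         # element-wise maximum counts
    return intersection / union if union else 1.0


A, B = {1, 2, 3}, {2, 3, 4}
print(weighted_jaccard(A, B, weights={}))      # 0.5, the unweighted value
print(weighted_jaccard(A, B, weights={2: 5}))  # 0.75, element 2 weighted by a factor of 5
```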
It is now apparent how calculations over principal accounts (users or automated resources) can be made to generate similarity index values which, when compared against a predefined range, permit automatic certification of principal accounts during security access reviews. Weighted calculation of similarity index values can also be used to account for the sensitivity of certain organizational security permissions.
It is also to be noted that although the Jaccard Index value calculation was presented for purposes of illustration, other statistical algorithms can be processed as well without departing from the novel security access reviews presented herein and below, such as, but not limited to Sorensen Index, Dice's Coefficient, overlap coefficient, and the like.
The architectural processing environment 100 includes: an access review manager 110, security account management service(s) 120, and identity provider(s) 130.
It is noted that although the components are illustrated independently, this is done for illustration only. That is, the components can all reside and process on the same hardware device and within the same processing environment; or, the components can reside and process on different hardware devices and within different processing environments.
The access review manager 110 includes a backend interface for interacting with the identity provider(s) 130 and the security account management service(s) 120. The access review manager 110 also includes a user-facing interface for interacting and reporting results for automated actions or manual actions.
Initially, an access review is identified as being needed within an organization. This can be done manually or based on some security event detected within a security system of the organization that triggers an access review of principal (user or automated service) accounts (an Application Programming Interface (API) provides automated detection of the security event and triggers the access review manager to initiate an access review).
The user-facing interface permits a security analyst to define a sample set of principal accounts. In an embodiment, the sample set may include all principal accounts. In an embodiment, the sample set is less than all principal accounts. In an embodiment, the sample set is a statistical sample of all principal accounts. In an embodiment, the sample set is a statistical sample from a defined type of principal account.
The access review manager 110 interacts with the security account management services 120 to access account data source(s) 121 and obtain account attributes for each principal account identified in the sample set. The account attributes can be any of the previously-noted attributes (in the FIGS. 1A-1G). The access review manager 110 also interacts with the identity providers 130 and obtains security permissions 131 for each principal account identified in the sample set. The security permissions can be any of the previously-noted security permissions (in the
In an embodiment, the user-facing interface of the access review manager 110 presents the account attributes and security permissions to the security analyst for the security analyst to select specific security permissions and account attributes that the analyst defines as a security profile for the sample set.
In an embodiment, the user-facing interface of the access review manager 110 presents pre-defined security profiles for selection by the security analyst as the security profile.
In an embodiment, the user-facing interface of the access review manager 110 allows the security analyst to select a designated principal account from which the security profile is derived based on that principal's security permissions.
Once the security profile is defined (based on the security analyst's actions with the user-facing interface of the access review manager 110), the access review manager 110 initiates the similarity index generator 111.
The similarity index generator 111 produces a similarity index value for each principal account in the sample set. In an embodiment, the similarity index generator 111 processes a Jaccard Index algorithm to produce the similarity index values (as shown in
In an embodiment, the user-facing interface of the access review manager 110 permits the security analyst to select the statistical algorithm that the similarity index generator 111 processes against the principal accounts for generating the similarity index values.
The user-facing interface of the access review manager 110 also permits the security analyst to define a threshold range or select a predefined threshold range from a list of ranges. The security analyst can also define whether similarity index values within the range are to be auto-certified accounts 112 for the access review or outliers 113 for more detailed, and perhaps, manual review. That is, an outlier may be defined through the user-facing interface by the security analyst as falling within the threshold range or falling outside the threshold range.
The access review manager 110 then presents the similarity indexes (generated by the similarity index generator 111) in a variety of manners, which the security analyst can customize. Some of these were presented above as tables, graphs, etc. Listings or reports can be automatically generated as well as auto-certified accounts 112 and outliers for access review 113.
Moreover, the user-facing interface permits the security analyst to request that the similarity index values be generated by the similarity index generator 111 for each principal within a defined group of principals based on one or more specific attributes for the group (discussed above).
Still further, the similarity index generator 111 can be configured through the user-facing interface by the security analyst to process weighted similarity index values (as discussed above).
In an embodiment, the access review manager 110 and the similarity index generator 111 process automatically. For example, when a security event or a scheduled time is reached, a sample set is statistically generated and a predefined security permissions profile obtained. The similarity index values for the sample set are then compared against a pre-defined range and the sample set is separated into the auto-certified accounts 112 and the outliers 113. The auto-certified accounts 112 can be automatically sent through an API to the organization's security system and principal accounts associated therewith flagged as having been certified. Concurrently, the outliers 113 are noted to the security system, which may be configured to temporarily freeze access to the outliers 113 and report the outliers to security personnel for manual review. That is, the entire access review can be done in an automated fashion or it can be done interactively with input from the security analyst (as discussed above).
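As a rough sketch of that automated flow (not the access review manager 110 itself): a sample set and a predefined profile go in, and auto-certified accounts and outliers come out. The helper names, sample data, and threshold range below are assumptions made for illustration.

```python
def jaccard_index(a: set, b: set) -> float:
    # Same calculation as in the earlier sketch.
    union = a | b
    return len(a & b) / len(union) if union else 1.0


def run_automated_access_review(principals, profile, lower, upper):
    """Split a sample set into auto-certified accounts and outliers for review.

    principals maps account ids to permission sets, profile is the predefined
    security permissions profile, and [lower, upper] is the predefined range.
    """
    auto_certified, outliers = [], []
    for account, permissions in principals.items():
        index = jaccard_index(permissions, profile)
        (auto_certified if lower <= index <= upper else outliers).append(account)
    return auto_certified, outliers


# Hypothetical usage: certified accounts would be flagged in the security
# system, while outliers would be frozen and reported for manual review.
certified, flagged = run_automated_access_review(
    principals={"u1001": {"payroll:view"}, "u1002": {"payroll:view", "admin:all"}},
    profile={"payroll:view"},
    lower=0.8,
    upper=1.0,
)
print("certified:", certified)  # ['u1001']
print("flagged:", flagged)      # ['u1002']
```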
Moreover, because access reviews can be processor-intensive, the approaches discussed herein improve processor throughput and reduce the time it takes to perform access reviews for compliance and internal security. This is achieved because outlier identification from the sample set substantially reduces the number of accounts that have to be reviewed. The approaches herein are also configurable and customizable based on the needs of an organization's security system and compliance issues.
The embodiments discussed above and other embodiments are now discussed with reference to the
In an embodiment, access review manager performs the processing discussed above with reference to the
At 210, the access review manager obtains security permissions for principal accounts associated with principals. In an embodiment, the principals are end-users. In an embodiment, the principals are automated services. In an embodiment, the principals are a combination of end-users and automated services. The accounts are for access to an organization's resources (physical equipment, hardware and software, data storage, network devices, etc.).
According to an embodiment, at 211, the access review manager identifies the principal accounts as a statistical sample set from all existing principal accounts within the organization.
In an embodiment of 211 and at 212, the access review manager interacts with at least one identity provider to obtain the security permissions for the principal accounts once the sample set is identified. In an embodiment, the identity provider is the identity provider(s) 130.
At 220, the access review manager identifies a security permissions profile.
In an embodiment, at 221 the access review manager receives the security permissions profile as a security analyst-defined set of security permissions selected by the security analysts from the security permissions associated with the principal accounts.
In an embodiment, at 222, the access review manager derives the security permissions profile from assigned security permissions for a particular principal account.
In an embodiment of 222 and at 223, the access review manager receives a selection from a security analyst for the particular principal account that is selected from the principal accounts by the security analyst.
At 230, the access review manager generates a value for each principal account based on the security permissions of each principal account and the security permissions profile. That is, a similarity index value is calculated for and assigned to each principal account to identify the similarity of each principal account to the security permissions profile. This can be done in any of the manners discussed above with the
According to an embodiment, at 231, the access review manager processes a Jaccard Index value calculation against the security permissions for each principal account and the security permissions defined in the security permissions profile.
In an embodiment of 231, the access review manager processes a weighted Jaccard Index value calculation that weights one or more specific security permissions. This was discussed above with the
At 240, the access review manager separates out select principal accounts for security access review based on the generated similarity index values for the principal accounts.
According to an embodiment, at 241, the access review manager compares each similarity index value against a predefined range, and the access review manager identifies the select principal accounts for the access review as having similarity index values that fall outside the predefined range.
In an embodiment of 241 and at 242, the access review manager certifies the remaining principal accounts as having passed the security access review based on the remaining principal accounts having similarity index values that fall within the predefined range.
In an embodiment of 242 and at 243, the access review manager receives the predefined range from a security analyst.
In an embodiment, at 244, the access review manager notifies a security system service of the security access review that is being performed on the selected principal accounts. The security system service may elect to temporarily suspend access to these accounts being reviewed; although this does not have to be the case.
The processing of the access review manager reflects the processing discussed above for determining similarity based on a security permissions profile. The
In an embodiment, the security reviewer performs any of the processing discussed above in the
In an embodiment, the security reviewer is the method 200 of the
At 310, the security reviewer obtains security permissions for a select group of principal accounts associated with principals. Again, the principals can be end-users, automated services, or a combination thereof. The accounts are associated with access to an organization's resources.
According to an embodiment, at 311, the security reviewer identifies the group based on at least one common attribute shared by the principal accounts, such as job title, job code, job description, etc.
In an embodiment of 311 and at 312, the security reviewer receives the account attribute from a security analyst.
At 320, the security reviewer generates a similarity matrix for the group. Each cell in the similarity matrix has a similarity index value representing a similarity relationship value between a unique pair of the principal accounts based on that pair's security permissions. That is, each principal account is assigned similarity values for each remaining account, and the relationship of the entire group is depicted in the similarity matrix. This was also discussed and presented above with the discussion of the
In an embodiment, at 321, the security reviewer calculates each value based on the security permissions present in each pair of the principal accounts.
In an embodiment of 321 and at 322, the security reviewer receives a selection for a particular statistical algorithm that calculates each value. The selection is received from a security analyst. The particular statistical algorithm can be any of the ones mentioned above with the
In an embodiment, each similarity index value is produced using a weighting factor, such as what was discussed above with the
At 330, the security reviewer determines whether select principal accounts from the group are to be designated for security access review based on the similarity index values that populate the similarity matrix.
In an embodiment, at 331, the security reviewer presents the similarity matrix as an interactive graph for interaction by a security analyst, such as the interactive graph presented in the
In an embodiment of 331 and at 332, the security reviewer receives an interaction with the interactive graph from the security analyst. The interaction defines the select principal accounts for the security access review and/or defines other principal accounts as certified as having passed the security access review.
In an embodiment, the security access review system 400 implements, inter alia, the processing depicted in the
The security access review system 400 includes a processor 401 and an access review manager 402.
In an embodiment, the processor 401 is part of a server.
In an embodiment, the processor 401 is part of a cloud processing environment.
The access review manager 402 is configured and adapted to: 1) execute on the processor 401, 2) determine similarities between security permissions of principal accounts associated with principals, and 3) identify select principal accounts for a security access review.
In an embodiment, the access review manager 402 is further configured, in 2), to determine the similarities based on: a) a security permissions profile (as discussed in the method 200 above), or b) unique similarities between pairs of the principal accounts (as discussed in the method 300 above).
In an embodiment, the similarities are produced as similarity index values using any of the above-mentioned statistical algorithms. In an embodiment, the similarities are produced based on weighted security permissions as discussed above with the
In an embodiment, the access review manager 402 is one or more of: the access review manager 110, the processing discussed in the
The above description is illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of embodiments should therefore be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.