In an ever-shrinking world of digital data, organizations have data centers and users spread across the globe. Under data privacy regulations such as the General Data Protection Regulation (GDPR), an organization may risk non-compliance by inadvertently providing otherwise legitimate ways for personal data to reach geographic locations that the regulation prohibits. There are few easy ways to detect and prevent accidental non-compliance other than announcing do's and don'ts to stakeholders and hoping the stakeholders comply, or discovering non-compliance only after the fact. The problem is multiplied when multiple content sources are scattered across various locations around the globe: sensitive data may then have simple and direct ways of ending up in prohibited locations, and may also travel from one location to another, traversing prohibited locations and/or being transferred to prohibited locations.
As will be described in greater detail below, the present disclosure describes various systems and methods for identifying possible leakage paths of sensitive information.
In one embodiment, a method for identifying possible leakage paths of sensitive information may include (i) discovering, at a computing device comprising at least one processor, an original set of users having permission to read the sensitive information at an originating storage device in an originating location via an original set of information transfer paths and (ii) performing a security action. The security action may include (A) determining an additional set of information transfer paths having information transfer paths other than the information transfer paths already discovered, via which the original set of users can write the sensitive information and (B) identifying an additional set of users having permission to read the sensitive information via the additional set of information transfer paths.
In an example, the security action may further include (i) ascertaining another additional set of information transfer paths having information transfer paths other than the information transfer paths already determined, via which at least one of (A) the original set of users can write the sensitive information and/or (B) other identified additional users can write the sensitive information, (ii) finding another additional set of users having permission to read the sensitive information via the another additional set of information transfer paths, and (iii) repeating the ascertaining and finding steps until no additional information transfer paths are identified.
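For purposes of illustration only, the iterative discovery described above might be sketched as follows; the permission-lookup helpers (readers, write_paths) are hypothetical placeholders rather than part of any disclosed embodiment:

```python
def discover_paths_and_users(original_paths, readers, write_paths):
    """Minimal sketch (assumed helpers): expand the sets of information
    transfer paths and users until no additional paths are identified."""
    known_paths = set(original_paths)
    # Original set of users: users with permission to read the sensitive
    # information via the original set of information transfer paths.
    known_users = {user for path in known_paths for user in readers(path)}
    while True:
        # Additional paths: paths other than those already discovered, via
        # which any known user can write the sensitive information.
        new_paths = {p for u in known_users for p in write_paths(u)} - known_paths
        if not new_paths:
            break  # no additional information transfer paths were identified
        known_paths |= new_paths
        # Additional users: users with permission to read the sensitive
        # information via the newly found paths.
        known_users |= {u for p in new_paths for u in readers(p)}
    return known_paths, known_users
```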
In some examples, the security action may further include (i) comparing geographic locations of information transfer paths in at least one additional set of information transfer paths to a list of prohibited geographic locations and (ii) flagging a specific information transfer path in the at least one additional set of information transfer paths when the specific information transfer path connects to a prohibited geographic location. In an embodiment, the security action may further include preventing transfer of a specific file including the sensitive information via the flagged specific information transfer path. In some embodiments, the security action may further include preventing transfer of the sensitive information via the flagged specific information transfer path.
In an example, the security action may further include (i) comparing geographic locations of users in at least one additional set of users to a list of prohibited geographic locations and (ii) flagging a specific user in the at least one additional set of users when the specific user is in a prohibited location. In some examples, the security action may further include preventing access to a specific file including the sensitive information by the flagged specific user. In an embodiment, the security action may further include preventing access to the sensitive information by the flagged specific user. In some embodiments, the security action may further include changing an information access permission of the flagged specific user.
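As one hedged illustration of the comparisons described above (assuming hypothetical path_location and user_location lookups and a caller-supplied prohibited-locations list), the flagging step might be sketched as:

```python
def flag_prohibited(paths, users, path_location, user_location, prohibited_locations):
    """Minimal sketch: flag information transfer paths that connect to, and
    users located in, prohibited geographic locations (assumed lookups)."""
    flagged_paths = {p for p in paths if path_location(p) in prohibited_locations}
    flagged_users = {u for u in users if user_location(u) in prohibited_locations}
    return flagged_paths, flagged_users
```

A security action could then, for example, prevent transfers over the flagged paths or change the access permissions of the flagged users.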
In one example, a system for identifying possible leakage paths of sensitive information may include several modules stored in a memory, including (i) a discovering module, stored in the memory, that discovers an original set of users having permission to read the sensitive information at an originating storage device in an originating location via an original set of information transfer paths and (ii) a performing module, stored in the memory, that performs a security action. In an example, the security action may include (i) determining an additional set of information transfer paths having information transfer paths other than the information transfer paths already discovered, via which the original set of users can write the sensitive information and (ii) identifying an additional set of users having permission to read the sensitive information via the additional set of information transfer paths. In an example, the system may also include at least one physical processor that executes the discovering module and the performing module.
In some examples, the above-described method may be encoded as computer-readable instructions on a non-transitory computer-readable medium. For example, a computer-readable medium may include one or more computer-executable instructions that, when executed by at least one processor of a computing device, may cause the computing device to (i) discover, at the computing device, an original set of users having permission to read the sensitive information at an originating storage device in an originating location via an original set of information transfer paths and (ii) perform a security action. In some examples, the security action may include (i) determining an additional set of information transfer paths having information transfer paths other than the information transfer paths already discovered, via which the original set of users can write the sensitive information and (ii) identifying an additional set of users having permission to read the sensitive information via the additional set of information transfer paths.
Features from any of the embodiments described herein may be used in combination with one another in accordance with the general principles described herein. These and other embodiments, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.
The accompanying drawings illustrate a number of example embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the example embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the example embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
The present disclosure is generally directed to systems and methods for identifying possible leakage paths of sensitive information.
In some examples, provided systems and methods may determine, based at least in part on permissions of origin of sensitive data, when there is potential for sensitive data to be transferred to prohibited countries. In some examples, provided systems and methods may determine when there are information transfer paths via which sensitive data may reach prohibited locations due to inappropriately assigned user permissions across information transfer paths, such as read permissions, write permissions, and/or access permissions.
In some embodiments, provided systems and methods may analyze information describing who can access specific sensitive data, information describing what data is sensitive, information describing geographic locations of computing devices (e.g., servers), and information describing prohibited locations (e.g., from data protection regulatory authorities), to map possible paths by which the sensitive data may reach the prohibited locations (e.g., while using a customer's legitimate infrastructure). In some examples, provided systems and methods may automatically mitigate non-compliance risk. In some examples, provided systems and methods may automatically reduce leakage of sensitive data to unauthorized devices in prohibited geographic locations and/or by unauthorized devices in prohibited geographic locations. In some examples, provided systems and methods may alert users about non-compliance risk. In some examples, alerting the users may help the users avoid substantial fines and aid the users in demonstrating that they are taking every measure technologically available to remain compliant with data privacy regulations.
By doing so, the systems and methods described herein may improve the overall functionality of computing devices by automatically performing preemptive security actions to identify, prevent, and/or mitigate data leakage, thus enabling a higher level of protection for sensitive information. For example, the provided techniques may advantageously improve the functionality of computing devices by improving data protection services and/or software. Also, in some examples, the systems and methods described herein may advantageously improve the functionality of computing devices by automatically saving power, saving time, better managing information storage devices, and/or better managing network bandwidth utilization.
The following provides, with reference to
In certain embodiments, one or more of modules 102 in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
Example system 100 in
First computing device 202 generally represents any type or form of computing device capable of reading computer-executable instructions. In some examples, first computing device 202 may represent a computer running security software, such as data leakage prevention software. In some examples, security software may include a processor-readable medium storing computer-readable instructions that when executed cause a processor in a computing device to perform a security action. Additional examples of first computing device 202 include, without limitation, laptops, tablets, desktops, servers, cellular phones, Personal Digital Assistants (PDAs), multimedia players, embedded systems, wearable devices (e.g., smart watches, smart glasses, etc.), smart vehicles, so-called Internet-of-Things devices (e.g., smart appliances, etc.), gaming consoles, variations or combinations of one or more of the same, or any other suitable computing device. In some examples, first computing device 202 may be located in an approved geographic location per data privacy regulations.
Network 204 generally represents any medium or architecture capable of facilitating communication or data transfer. In one example, network 204 may facilitate communication between first computing device 202 and server 206. In this example, network 204 may facilitate communication or data transfer using wireless and/or wired connections. Examples of network 204 include, without limitation, an intranet, a Wide Area Network (WAN), a Local Area Network (LAN), a Personal Area Network (PAN), the Internet, Power Line Communications (PLC), a cellular network (e.g., a Global System for Mobile Communications (GSM) network), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable network. In some examples, network 204 may provide at least one information transfer path.
Server 206 generally represents any type or form of computing device capable of reading computer-executable instructions. In some examples, server 206 may represent a computer running security software, such as data leakage prevention software. Additional examples of server 206 include, without limitation, storage servers, database servers, application servers, and/or web servers configured to run certain software applications and/or provide various storage, database, and/or web services. Although illustrated as a single entity in
Second computing device 208 generally represents any type or form of computing device capable of reading computer-executable instructions. In some examples, second computing device 208 may represent a computer running security software, such as data leakage prevention software. Additional examples of second computing device 208 include, without limitation, laptops, tablets, desktops, servers, cellular phones, Personal Digital Assistants (PDAs), multimedia players, embedded systems, wearable devices (e.g., smart watches, smart glasses, etc.), smart vehicles, smart packaging (e.g., active or intelligent packaging), gaming consoles, Internet-of-Things devices (e.g., smart appliances, etc.), variations or combinations of one or more of the same, and/or any other suitable computing device. In some examples, second computing device 208 may be located in a prohibited geographic location per data privacy regulations.
Many other devices or subsystems may be connected to system 100 in
The term “computer-readable medium,” as used herein, generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media include, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
As illustrated in
The term “sensitive data,” as used herein, generally refers to valuable information, the uncontrolled dissemination of which may cause harm and/or losses to people, governments, and/or businesses. Examples of sensitive information include, without limitation, personally identifiable information (PII). In some embodiments, sensitive information may include identification (ID) numbers, social security numbers, account numbers in conjunction with names, emails, addresses, phone numbers, financial information, health care information, business strategies, classified government information, law enforcement information, the like, or combinations thereof. In some examples, the terms “sensitive data” and “sensitive information” may be interchangeable.
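For illustration only, a classification engine might identify such content with simple pattern rules; the patterns below are hypothetical, simplified examples and are not part of the disclosure:

```python
import re

# Hypothetical, simplified classification rules for identifying sensitive data.
CLASSIFICATION_RULES = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),             # social security numbers
    "email":  re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),       # email addresses
    "phone":  re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),   # simple phone-number pattern
}

def classify(text):
    """Return the names of the classification rules matched by the text."""
    return {name for name, rule in CLASSIFICATION_RULES.items() if rule.search(text)}

# Matches the 'us_ssn' and 'email' rules (set ordering may vary when printed).
print(classify("Contact jane@example.com, SSN 123-45-6789"))
```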
The term “information transfer path,” as used herein, generally refers to a physical connection between computing devices by which digital electronic information may be transferred. Examples of information transfer paths may include, without limitation, physical electrical devices such as routers, network interface cards, fiber optic cables, Ethernet cables, multiplexers, and/or other devices configured to transfer digital electronic information.
As illustrated in
In some examples, security action 126 may include blocking access to devices (e.g., storage devices, memories, network devices, servers, network interface devices, information transfer paths, etc.), allowing limited access to devices, allowing read-only access to devices, encrypting information, and/or other acts limiting access to devices. In some examples, security action 126 may be performed automatically. In some embodiments, security action 126 may attempt to identify and/or ameliorate potential security risks. In some examples, security action 126 may include blocking access to and/or access by executing processes. In additional examples, security action 126 may include displaying, on user displays, warnings indicating that processes may be potentially dangerous.
In some examples, security actions may include displaying, on user displays (e.g., devices of first computing device 202, server 206, and/or second computing device 208 in
In an embodiment, security actions may include sending, to first computing device 202, server 206, and/or second computing device 208 in
In some examples, security actions may include prophylactic measures taken to safeguard electronic information. Prophylactic measures may include acts undertaken to prevent, detect, and/or mitigate vulnerabilities of electronic information, to implement data loss prevention policies (e.g., preventing and/or mitigating privacy leakage), and/or to thwart malicious activities targeting electronic information on electronic computing devices.
As illustrated in
As illustrated in
In some examples, security actions (e.g., security action 126) may further include (i) ascertaining more additional sets of information transfer paths having information transfer paths other than the information transfer paths already determined, via which at least one of (A) original sets of users can write the sensitive information and/or (B) other identified additional users can write the sensitive information, (ii) finding more additional sets of users having permission to read the sensitive information via the more additional sets of information transfer paths, and (iii) repeating the ascertaining and finding steps until no further additional information transfer paths are identified.
In some embodiments, security actions (e.g., security action 126) may further include (i) comparing geographic locations of information transfer paths in at least one additional set of information transfer paths to lists of prohibited geographic locations and (ii) flagging specific information transfer paths in the at least one additional set of information transfer paths when the specific information transfer paths connect to prohibited geographic locations.
In some embodiments, security actions may further include preventing transfer of the sensitive information via at least one of the flagged specific information transfer paths. In an embodiment, security actions may further include preventing transfer of specific files (e.g., that include the sensitive information) via the flagged specific information transfer paths.
In some examples, security actions (e.g., security action 126) may further include (i) comparing geographic locations of users in at least one additional set of users to lists of prohibited geographic locations and (ii) flagging specific users in the at least one additional set of users when the specific users are in prohibited locations.
In an embodiment, security actions may further include preventing access to the sensitive information by the flagged specific users. In some embodiments, security actions may further include preventing access to specific files by the flagged specific users (e.g., such as those specific files including the sensitive information). In some embodiments, security actions may further include changing information access permissions of the flagged specific users.
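A minimal sketch of such enforcement, assuming hypothetical is_sensitive, flagged_paths, and flagged_users inputs produced elsewhere, might look like the following; real enforcement points would reside in the data loss prevention software itself:

```python
def may_transfer(file, path, is_sensitive, flagged_paths):
    """Prevent transfer of files including sensitive information via flagged
    information transfer paths (illustrative sketch only)."""
    return not (is_sensitive(file) and path in flagged_paths)

def may_access(file, user, is_sensitive, flagged_users):
    """Prevent access to files including sensitive information by flagged
    users (illustrative sketch only)."""
    return not (is_sensitive(file) and user in flagged_users)

def restrict_permissions(user, permissions):
    """Change a flagged user's information access permissions, e.g., revoke
    read and write access (one possible, purely illustrative policy)."""
    permissions[user] = {"read": False, "write": False}
    return permissions
```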
As detailed above, the steps outlined in computer-implemented method 300 in
Provided below is a non-limiting example implementation of computer-implemented method 300 for identifying possible leakage paths of sensitive information. The steps may be performed by any suitable computer-executable code and/or computing system, including system 100 in
In some examples, the following information may be input to this method: an originating geographic location to which sensitive data is to be confined, classification rules for use by a classification engine to identify sensitive data, geographic locations of users, and/or information describing other content sources (e.g., computing devices) in a data estate.
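For illustration, these inputs might be grouped into a simple structure such as the following sketch; the field names are hypothetical, and the prohibited-locations list is included because it is used in Step 6 below:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LeakageAnalysisInputs:
    """Hypothetical grouping of the inputs described above."""
    originating_location: str                                       # location to which sensitive data is confined
    classification_rules: List[str] = field(default_factory=list)   # rules for the classification engine
    user_locations: Dict[str, str] = field(default_factory=dict)    # user -> geographic location
    content_sources: List[str] = field(default_factory=list)        # other content sources in the data estate
    prohibited_locations: List[str] = field(default_factory=list)   # locations prohibited by regulation
```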
Step 1 may include identifying users who have permissions at the originating location and identifying users who have permissions for other content sources. In some examples, stored information may be identified as sensitive data.
Step 2 may include designating the path(s) at the originating location as a set DataSet1 (DataSet1 includes paths P11, P12, . . . , P1n), where the sensitive data is located and each site is respectively numbered (e.g., “1”, “2”, etc.). Then, determine a set of users UserSet1 (U11, U12, . . . , U1m) that has read permission to data set DataSet1. For example, UserSet1=Find(User Set having Read permission) where dataset=DataSet1.
Step 3 may include determining the paths, other than those in DataSet1, to which this user set UserSet1 has write permission. For example, DataSet2 (paths P21, P22, . . . , P2n)=Find(paths where UserSet1 has write permission), excluding the paths already in DataSet1 (P11, P12, . . . , P1n).
Step 4 may include determining a set of users UserSet2 (U21, U22, . . . , U2m) that has read permission to DataSet2. For example, UserSet2=Find(User Set having Read permission) where dataset=DataSet2.
Step 5 may include repeating Step 2 to Step 4 until DataSetN=DataSet(N−1). The relevant user set will be UserSetN.
Step 6 may include comparing the locations of paths in DataSetN to a prohibited locations list and, when locations of paths in DataSetN are found in the prohibited locations list, flagging those paths. In some examples, the method may include comparing the geographical locations of users in UserSetN to the prohibited locations list and, when geographical locations of users in UserSetN are found in the prohibited locations list, flagging those users. Flagged users and/or flagged paths may provide a basis for revoking and/or changing access permissions of users and/or paths to prevent accidental transfer of sensitive data to prohibited locations.
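A non-limiting, purely illustrative rendering of Steps 1 through 6 is sketched below; all of the permission maps, locations, user names, and the prohibited-location code "XX" are invented for this example and would, in practice, be obtained from the environment being analyzed:

```python
# Invented example data (Step 1: users and permissions at each content source).
read_permissions = {                      # path -> users with read permission
    "EU-server:/hr":  {"alice", "bob"},
    "EU-server:/tmp": {"bob", "carol"},
    "XX-server:/in":  {"dave"},
}
write_permissions = {                     # user -> paths the user can write to
    "alice": {"EU-server:/hr"},
    "bob":   {"EU-server:/tmp"},
    "carol": {"XX-server:/in"},
    "dave":  set(),
}
path_locations = {"EU-server:/hr": "DE", "EU-server:/tmp": "DE", "XX-server:/in": "XX"}
user_locations = {"alice": "DE", "bob": "DE", "carol": "FR", "dave": "XX"}
prohibited = {"XX"}

# Step 2: DataSet1 (paths at the originating location) and UserSet1.
data_set = {"EU-server:/hr"}
user_set = {u for p in data_set for u in read_permissions[p]}

while True:
    # Step 3: paths outside the current data set that the user set can write to.
    new_paths = {p for u in user_set for p in write_permissions[u]} - data_set
    if not new_paths:                     # Step 5: DataSetN == DataSet(N-1)
        break
    data_set |= new_paths
    # Step 4: users with read permission to the newly added paths.
    user_set |= {u for p in new_paths for u in read_permissions[p]}

# Step 6: flag paths and users whose locations appear on the prohibited list.
flagged_paths = {p for p in data_set if path_locations[p] in prohibited}
flagged_users = {u for u in user_set if user_locations[u] in prohibited}
print(flagged_paths)   # {'XX-server:/in'}
print(flagged_users)   # {'dave'}
```

In this toy example, write access granted to one user exposes a path in a prohibited location, so that path and the user who can read it are flagged, which may provide a basis for revoking or changing permissions as described above.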
While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered example in nature since many other architectures can be implemented to achieve the same functionality.
In some examples, all or a portion of example system 100 in
In various embodiments, all or a portion of example system 100 in
According to various embodiments, all or a portion of example system 100 in
In some examples, all or a portion of example system 100 in
The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various example methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
While various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these example embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using modules that perform certain tasks. These modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these modules may configure a computing system to perform one or more of the example embodiments disclosed herein.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the example embodiments disclosed herein. This example description is not intended to be exhaustive or to be limited to any precise form disclosed. Many modifications and variations are possible without departing from the spirit and scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to the appended claims and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” In some examples, the singular may portend the plural. Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
This application is a continuation-in-part of PCT Application No. PCT/US2019/025801 filed Apr. 4, 2019, which claims the benefit of U.S. Provisional Application No. 62/653,541, filed Apr. 5, 2018, the disclosures of each of which are incorporated, in their entireties, by this reference.