Users who join a social network are often asked to select various privacy options. These options can include different privacy levels for information, with the levels depending on the user's social association with other users of the social network. For example, certain photographs can be made available only to family members of the user. Other photographs can be made available to friends, or possibly to acquaintances of their friends and the like. These privacy choices allow the user to carefully control the exposure of their information on the social network.
However, third party applications tied to the social network may or may not adhere to the privacy settings selected by the user. The user typically assumes, without verification, that the third party application will follow their settings from the social network. This is often not the case, and the user unknowingly allows their private information to be exposed. For example, in “A Haskell and Information Flow Control Approach to Safe Execution of Untrusted Web Applications,” Deian Stefan, talk at Stanford University, Apr. 11, 2011 (http://forum.stanford.edu/events/2011slides/security/2011securityStefan.pdf; http://forum.stanford.edu/events/2011deianstefaninfo.php), the author observed that a privacy mismatch occurs when social media applications, such as Facebook applications, are installed, and proposed a solution to force a Facebook application to respect privacy settings. However, the author does not provide a means to detect the mismatch systematically for any social network.
An auditing means is used to detect whether a privacy mismatch occurs between a social network's privacy settings and a third party application, permitting the social network, if so desired, to take action to bring the application into compliance with its privacy rules. In one instance, a system is constructed for a social network that shows the privacy mismatch between what the user believes is private according to the privacy settings they selected and what can actually be collected about them, for example, by an application installed by a friend and/or a friend of a friend and/or anyone.
The above presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key and/or critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
To the accomplishment of the foregoing and related ends, certain illustrative aspects of embodiments are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the subject matter can be employed, and the subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the subject matter can become apparent from the following detailed description when considered in conjunction with the drawings.
The subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. It can be evident, however, that subject matter embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the embodiments.
Currently, there is a lack of information on the data that a social network application can access when the user clicks an application install button. The install button does more than just install an application; it also grants permissions to access additional user data beyond the basic information mentioned in the installation message shown to the user. Thus, the user has incomplete knowledge of which pieces of their information are being accessed by the application. The install button may also grant the application access to information about the people the user is connected to in a network setting.
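As a non-limiting illustration, the following sketch expresses the gap described above as a simple set difference between the permissions disclosed in the installation message and the permissions actually granted; the permission names and the DISCLOSED/GRANTED sets are hypothetical placeholders, not taken from any real platform.

```python
# Hypothetical illustration of the gap described above: the permissions
# mentioned in the installation message versus the permissions actually
# granted when the install button is clicked. All permission names and
# both sets are assumed placeholders.

DISCLOSED = {"basic_info"}  # what the installation message mentions
GRANTED = {"basic_info", "photos", "friend_list", "email"}  # what is actually granted

undisclosed = GRANTED - DISCLOSED  # access granted beyond the message shown
print("Granted beyond the installation message:", sorted(undisclosed))
# -> Granted beyond the installation message: ['email', 'friend_list', 'photos']
```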
To prevent this type of inadvertent loss of privacy, a social network privacy auditor is constructed which shows the mismatch between a social network user's privacy settings and the actual data that can be collected about that user, with or without their knowledge or consent. If a user marks parts of their data and/or profile with different levels of privacy, the privacy auditor can show which data has an actual level of privacy that is lower (less secure) than the level indicated in the user's privacy settings. Some social networks make application developers sign a document stating that they will respect a user's privacy, will not access data they are not supposed to access, and will not share such data with another party. However, these social networks do not have any system to enforce these rules by checking whether an application complies with the social network platform policies about privacy and warning the developer if it does not (see generally, Facebook Platform Policies, http://developers.facebook.com/policy/). The privacy auditor is a means to audit an application's compliance with the user's privacy settings and the platform terms and policies, and action can then be taken to enforce compliance if so desired.
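For example purposes only, the following sketch illustrates the level comparison described above: privacy levels are assumed to be ordered from most restrictive to least, and a mismatch is flagged whenever a piece of data's actual level of privacy is lower (less secure) than the level in the user's settings. The level names, field names, and actual exposures shown here are hypothetical; a real auditor would derive the actual exposures by probing the application, as described later.

```python
# Hypothetical sketch of the level comparison: a mismatch is flagged when a
# field's actual exposure is less secure than the user's setting. The level
# names, fields, and actual_exposure values are assumed for illustration.

LEVELS = {"only_me": 0, "friends": 1, "friends_of_friends": 2, "anyone": 3}

user_settings = {"photos": "friends", "birthday": "only_me", "posts": "anyone"}
actual_exposure = {"photos": "anyone", "birthday": "only_me", "posts": "anyone"}

for field, setting in user_settings.items():
    actual = actual_exposure[field]
    if LEVELS[actual] > LEVELS[setting]:  # actual level is lower (less secure)
        print(f"mismatch: {field!r} set to {setting!r} but exposed to {actual!r}")
# -> mismatch: 'photos' set to 'friends' but exposed to 'anyone'
```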
The privacy auditor can show mismatches between privacy settings, for example, separate privacy settings for a user's friends, friends of friends and/or anyone. These types of settings are used only as an example; the privacy auditor can be constructed based on any type of relationship between users of a social network (e.g., immediate family, cousins, aunts, uncles, classmates of various institutions, etc.) and is not intended to be limiting in any manner. In one instance, a basic algorithm uses the social network privacy settings of a primary user. These can initially be default values provided by the social network and/or values provided directly and/or indirectly by the user of the social network. The associations can be construed as degrees of social association between a primary user and other users and the like. The higher the degree, the less value a user places on that association (the user does not trust the association as much as a lower-numbered degree of association). However, one skilled in the art can appreciate that the degree numbering can be reversed as well, so that the higher the degree, the more value a user places on the association. For example purposes, the former definition will be used.
In this example, another user of the social network installs an application associated with the social network. If this user is a direct friend of the primary user, a 1st degree of association is established by the privacy auditor. When the application is installed by a friend of a friend, a 2nd degree of association is established. When the application is installed by, for example, anyone, a 3rd degree (or more) of association is established. The privacy auditor then tests and creates comparative data to illustrate mismatches between the social network privacy settings of the primary user and other users with various degrees of association.
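As a non-limiting sketch of the degree-of-association bookkeeping described in the preceding two paragraphs (using the former definition, in which a higher degree denotes a less trusted association), the following example computes a 1st, 2nd, or 3rd degree of association from a toy social graph; the graph, user names, and function name are hypothetical.

```python
# Hypothetical sketch of the degree-of-association bookkeeping, using the
# convention above in which a higher degree denotes a less trusted
# association. The toy graph, user names, and function are illustrative.

def degree_of_association(primary_user, installing_user, graph):
    """Return 1 for a direct friend, 2 for a friend of a friend, else 3."""
    friends = graph[primary_user]
    if installing_user in friends:
        return 1
    if any(installing_user in graph[friend] for friend in friends):
        return 2
    return 3

# Toy social graph: each user maps to the set of their direct friends.
graph = {"alice": {"bob"}, "bob": {"alice", "carol"}, "carol": {"bob"}, "dave": set()}

print(degree_of_association("alice", "bob", graph))    # 1: direct friend
print(degree_of_association("alice", "carol", graph))  # 2: friend of a friend
print(degree_of_association("alice", "dave", graph))   # 3: anyone
```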
If the applications 114 can retrieve data that the user has restricted based on a degree of association, the primary user 102 and/or the social network and/or the application is warned/notified 118 through a user interface (UI) and/or via other communication means (e.g., email, text message, cell call, etc.).
The data retrieved by the privacy data testers is then compared to the data authorized to be accessible according to the privacy settings of the social network 210. Any discrepancies are noted. The differences between the two sets of data are then displayed 212, ending the flow. One skilled in the art can appreciate that the data does not have to be displayed but can also be sent to the social network, the primary user, and/or the offending entities by other means (e.g., email notification, direct notification over a network, etc.). Once communicated, the social network can take action to further limit privacy violations by the offending entity if so desired. This can include disrupting the offending entity's operations, warning the user, and/or other types of actions, such as monetary fines levied on the owner of an offending application and the like.
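As a non-limiting, end-to-end sketch of the flow described above, the following example compares the data retrieved by test applications installed at each degree of association against the data authorized by the primary user's settings and reports any discrepancies; the retrieve_via_test_app callable is a hypothetical stand-in for the actual probing step performed by the privacy data testers.

```python
# Hypothetical end-to-end sketch of the flow above: a test application is
# installed at each degree of association, the data it retrieves is compared
# against the data authorized by the primary user's settings, and any
# discrepancies are reported.

def audit(max_allowed_degree, retrieve_via_test_app, degrees=(1, 2, 3)):
    """Return {degree: fields retrieved beyond the user's settings}."""
    discrepancies = {}
    for degree in degrees:
        retrieved = retrieve_via_test_app(degree)  # what the test app actually got
        authorized = {f for f, d in max_allowed_degree.items() if degree <= d}
        leaked = retrieved - authorized
        if leaked:
            discrepancies[degree] = leaked
    return discrepancies

# Example: photos are restricted to direct friends (degree 1), but the test
# application retrieved them regardless of who installed it.
max_allowed_degree = {"photos": 1, "posts": 3, "birthday": 0}
probe = lambda degree: {"photos", "posts"}  # simulated tester results
print(audit(max_allowed_degree, probe))
# -> {2: {'photos'}, 3: {'photos'}}
```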
The privacy auditor provides the ability to see which parts of the user data are actually private and which pieces of information are leaking through applications. If a rogue application tries to access user information in violation of the terms and conditions of privacy, the social network can alert the user and take action against the application.
What has been described above includes examples of the embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the embodiments, but one of ordinary skill in the art can recognize that many further combinations and permutations of the embodiments are possible. Accordingly, the subject matter is intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US12/68106 | 12/6/2012 | WO | 00