Systems and methods of automated compliance with data privacy laws

Information

  • Patent Grant
  • Patent Number
    10,430,608
  • Date Filed
    Wednesday, April 30, 2014
  • Date Issued
    Tuesday, October 1, 2019
Abstract
The technology disclosed relates to automated compliance with data privacy laws of varying jurisdictions. In particular, it relates to constructing trust filters that automatically restrict collection, use, processing, transfer, or consumption of any person-related data that do not meet the data privacy regulations of the applicable jurisdictions. The trust filters are constructed dependent on associating person-related data entities with trust objects that track person-related data sources.
Description
BACKGROUND

The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also correspond to implementations of the claimed technology.


The use by companies of information about individuals is subject to a complex array of data protection laws. Companies that create, collect, process, store, or consume personal information have to comply with numerous data privacy laws and regulations to avoid loss of customer support, regulatory investigations, and substantial fines. Furthermore, these data privacy laws and regulations differ from country to country, which further complicates compliance. In many jurisdictions, class action lawsuits are becoming the norm for data breaches involving significant numbers of affected individuals.





BRIEF DESCRIPTION OF THE DRAWINGS

The included drawings are for illustrative purposes and serve only to provide examples of possible structures and process operations for one or more implementations of this disclosure. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of this disclosure. A more complete understanding of the subject matter may be derived by referring to the detailed description and claims when considered in conjunction with the following figures, wherein like reference numbers refer to similar elements throughout the figures.



FIG. 1 shows an example environment of automated compliance with data privacy laws.



FIG. 2 is one implementation of person-related data.



FIG. 3 shows one implementation of a trust object linked to a person-related data entity.



FIG. 4 illustrates details of trust metadata held by a trust object.



FIG. 5 depicts one implementation of acceptable values of trust metadata in accordance with data privacy regulations specified in a jurisdiction.



FIG. 6 is one implementation of a trust filter interface that is constructed for automatically complying with data privacy laws.



FIG. 7 shows a flowchart of one implementation of automated compliance with data privacy laws.



FIG. 8 is a block diagram of an example computer system for automatically complying with data privacy laws.





DETAILED DESCRIPTION

The following detailed description is made with reference to the figures. Sample implementations are described to illustrate the technology disclosed, not to limit its scope, which is defined by the claims. Those of ordinary skill in the art will recognize a variety of equivalent variations on the description that follows.


The technology disclosed relates to automated compliance with data privacy laws by using computer-implemented systems. The technology disclosed can be implemented in the context of any computer-implemented system including a database system, a multi-tenant environment, or the like. Moreover, this technology can be implemented using two or more separate and distinct computer-implemented systems that cooperate and communicate with one another. This technology can be implemented in numerous ways, including as a process, a method, an apparatus, a system, a device, a computer readable medium such as a computer readable storage medium that stores computer readable instructions or computer program code, or as a computer program product comprising a computer usable medium having a computer readable program code embodied therein.


As used herein, the “identification” of an item of information does not necessarily require the direct specification of that item of information. Information can be “identified” in a field by simply referring to the actual information through one or more layers of indirection, or by identifying one or more items of different information which are together sufficient to determine the actual item of information. In addition, the term “specify” is used herein to mean the same as “identify.”


As used herein, a given signal, event or value is “dependent on” a predecessor signal, event or value if the predecessor signal, event or value influenced the given signal, event or value. If there is an intervening processing element, step or time period, the given signal, event or value can still be “dependent on” the predecessor signal, event or value. If the intervening processing element or step combines more than one signal, event or value, the signal output of the processing element or step is considered “dependent on” each of the signal, event or value inputs. If the given signal, event or value is the same as the predecessor signal, event or value, this is merely a degenerate case in which the given signal, event or value is still considered to be “dependent on” the predecessor signal, event or value. “Responsiveness” of a given signal, event or value upon another signal, event or value is defined similarly.


Introduction


Data privacy laws vary dramatically from country to country. Some countries have enacted comprehensive laws, while others have few or no rules in place. For companies that do business around the world, the issue of privacy has indisputably become an international one, as countries throughout the world are increasingly active in enacting data privacy laws. Given the number and complexity of data privacy laws and regulations worldwide, and the severe penalties for violating them, companies are striving to prevent the improper disclosure or use of personal information of data subjects.


Laws governing data privacy protection are complicated, diverse, and jurisdiction specific. For instance, in the United States a complex patchwork of state and federal laws covers data privacy, including the Federal Trade Commission Act, the Gramm-Leach-Bliley Act, and the Health Insurance Portability and Accountability Act of 1996. The European Union has a comprehensive data protection directive that requires compliance by all 27 member states. However, the directive allows significant variations among the member states, and approaches and enforcement among EU members have not been consistent.


Further, several Latin American countries have recently enacted or are drafting comprehensive legislative frameworks to protect private information. Throughout the Middle East, which previously had no data protection law, there is an emerging need and governments are responding. Meanwhile, China has sparse data protection law, and only a few countries in Africa, such as Tunisia and Mauritius, have adopted comprehensive privacy laws.


The technology disclosed addresses compliance with data privacy laws applicable to varying jurisdictions. It tracks various person-related sources that are used to assemble person-related data by associating person-related data entities with trust objects. Person-related data are data relating to living individuals, referred to as “data subjects,” who can be identified from those data or from those data together with other information that is in or is likely to come into the possession of the entity that decides what the data will be used for, the “data providers.” Person-related data also include opinions and indications about intentions of the data subject.


The trust object holds trust metadata, including name of the person-related data source, interface category of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations applicable to the origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data. This metadata is further described below, in the discussion of FIG. 4.


When the data provider receives a request from a tenant for the person-related data, the technology disclosed constructs a filter that sets acceptable values in accordance with the data privacy regulations applicable in the jurisdiction where the tenant intends to further use, process, or consume the person-related data. The filter is then automatically applied to the person-related data to restrict transfer of any person-related data that does not meet the data privacy regulations applicable to that jurisdiction.


For instance, if a tenant purchases person-related data of data subjects from Canada and further uses it in the United States, the technology disclosed can be used to identify trust metadata that are not compliant with both Canada's and the United States' data privacy regulations. Once the non-compliant trust metadata or data sources are identified, the technology disclosed automatically filters them to prevent their further use, thus ensuring compliance with the regulations of the applicable jurisdictions.


In one example, if Canadian regulations require data subjects to give express consent for using their personal information, the technology disclosed can automatically filter out all data subjects who have provided only implied or opt-out consent. Additionally, if United States regulations do not permit the use of personal information collected in bulk from application programming interfaces (APIs), then the technology disclosed can automatically ensure that person-related data collected from APIs is not used, processed, or consumed in the United States.
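The sketch below illustrates this kind of two-jurisdiction filtering in Python. It is only a schematic: the record fields consent_type and interface_category, the sample records, and the rule encoding are assumptions for illustration and do not come from the patent itself.

    # Illustrative sketch only: keep only records that satisfy both the Canadian
    # rule (express consent) and the U.S. rule (no bulk API collection) from the
    # example above. Field names and values are hypothetical.
    records = [
        {"subject": "A", "consent_type": "express", "interface_category": "social_network"},
        {"subject": "B", "consent_type": "implied", "interface_category": "social_network"},
        {"subject": "C", "consent_type": "express", "interface_category": "api"},
    ]

    def compliant(record):
        # Canada (in this example): only express consent is acceptable.
        if record["consent_type"] != "express":
            return False
        # United States (in this example): bulk API-collected data may not be used.
        if record["interface_category"] == "api":
            return False
        return True

    usable = [r for r in records if compliant(r)]
    print(usable)  # only subject "A" survives the filter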


Automated Compliance Environment



FIG. 1 shows an example environment 100 of automated compliance with data privacy laws. FIG. 1 includes a person-related database 102, trust database 108, user computing device 122, trust filter 124, network(s) 115, and tracker 128. In other implementations, environment 100 may not have the same elements or components as those listed above and/or may have other/different elements or components instead of, or in addition to, those listed above, such as an entity database, social database, or jurisdiction database. The different elements or components can be combined into single software modules and multiple software modules can run on the same hardware.


In some implementations, network(s) 115 can be any one or any combination of Local Area Network (LAN), Wide Area Network (WAN), WiFi, telephone network, wireless network, point-to-point network, star network, token ring network, hub network, peer-to-peer connections like Bluetooth, Near Field Communication (NFC), Z-Wave, ZigBee, or other appropriate configuration of data networks, including the Internet.


In some implementations, tracker 128 can be an engine of varying types including a workstation, server, computing cluster, blade server, server farm, or any other data processing system or computing device. Tracker 128 can be communicably coupled to the databases via a different network connection. For example, it can be coupled via the network(s) 115 (e.g., the Internet) or via a direct network link.


In some implementations, datastores can store information from one or more tenants into tables of a common database image to form a multi-tenant database system (MTS). A database image can include one or more database objects. In other implementations, the databases can be relational database management systems (RDBMSs), object oriented database management systems (OODBMSs), distributed file systems (DFS), no-schema databases, or any other data storing systems or computing devices. In some implementations, user computing device 122 can be a personal computer, laptop computer, tablet computer, smartphone, personal digital assistant (PDA), digital image capture device, and the like. In one implementation, an application or service like trust filter 124 runs on user computing device 122.


Person-related database 102 specifies different entities (persons and organizations) such as contacts, accounts, opportunities, and/or leads and further provides business information related to the respective entities. Examples of business information can include names, addresses, job titles, number of employees, industry types, territories, market segments, contact information, employer information, stock prices, SIC codes, and NAICS codes. In one implementation, person-related database 102 can store web or database profiles of the users and organizations as a system of interlinked hypertext documents that can be accessed via the network 115 (e.g., the Internet). In another implementation, person-related database 102 can also include standard profile information about persons and organizations. This standard profile information can be extracted from company websites, business registration sources such as Jigsaw, Hoovers, or D&B, business intelligence sources such as Yelp or Yellow Pages, and social networking websites like Chatter, Facebook, Twitter, or LinkedIn.


Regarding different types of person-related data sources or “interface categories,” the interface categories specify whether a person-related data source is access controlled or publicly available on the Internet or a social network. Examples of access controlled application programming interfaces (APIs) can include Yahoo Boss, Facebook Open Graph, or Twitter Firehose. Public Internet includes first hand websites, blogs, search aggregators, or social media aggregators. Facebook, Twitter, LinkedIn, or Klout qualify as examples of social networking sites.


In one implementation, access controlled APIs like Yahoo Boss, Facebook Open Graph, and Twitter Firehose can provide real-time search data aggregated from numerous social media sources such as LinkedIn, Yahoo, Facebook, and Twitter. APIs can initialize sorting, processing and normalization of data. Public Internet can provide data from public sources such as first hand websites, blogs, web search aggregators, and social media aggregators. Social networking sites can provide data directly from social media sources such as Twitter, Facebook, LinkedIn, and Klout.


In one implementation, tracker 128 spiders different person-related data sources to retrieve person-related data, including web data associated with the business-to-business contacts. In some implementations, tracker 128 can extract a list of contacts from a master database and search those contacts on the different person-related data sources in order to determine if social or web content associated with the contacts exists within those sources. If the person-related data sources provide positive matches to any of the contacts, tracker 128 can store the retrieved social or web content in person-related database 102, according to one implementation.


In another implementation, tracker 128 assembles social media content from the different types of person-related data sources. Social media content can include information about social media sources, social accounts, social personas, social profiles, social handles, digital business cards, images, or contact information of users, which can be stored in person-related database 102.


Tracker 128 automates compliance with data privacy laws by tracking the different person-related data sources and associating data entities that hold person-related data with trust objects that track the different sources. In one implementation, it constructs filters that set acceptable values for trust metadata, in accordance with data privacy regulations of a particular jurisdiction.
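As a rough illustration of this bookkeeping, the following Python sketch stores a trust object alongside every person-related data entity at ingestion time. All class, function, and field names here are invented for illustration and are not taken from the patent.

    import uuid
    from dataclasses import dataclass, field

    @dataclass
    class TrustObject:
        source_name: str          # e.g. "Twitter"
        interface_category: str   # "api", "public_internet", or "social_network"
        origin: str               # jurisdiction governing the source, e.g. "Canada"
        consent_type: str         # "express", "implied", or "opt-out"
        regulations: list = field(default_factory=list)  # regulations applicable to the origin
        purpose: str = ""         # purpose of assembling the data
        classification: str = ""  # e.g. "personal" or "business"

    person_related_db = {}  # person-related data entities keyed by entity id
    trust_db = {}           # trust objects keyed by the same entity id

    def ingest(entity, trust):
        """Store a person-related data entity together with the trust object tracking its source."""
        entity_id = str(uuid.uuid4())
        person_related_db[entity_id] = entity
        trust_db[entity_id] = trust
        return entity_id

    # Example usage with hypothetical values:
    contact_id = ingest({"first_name": "Ada"},
                        TrustObject("Twitter", "social_network", "Canada", "express"))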


Person-Related Data



FIG. 2 shows an example schema of person-related data 200. This and other data structure descriptions that are expressed in terms of objects can also be implemented as tables that store multiple records or object types. Reference to objects is for convenience of explanation and not as a limitation on the data structure implementation. FIG. 2 shows a profile object 216 linked to event object 208, feed object 218, connection object 228, group object 224, and photo object 204. Photo object 204 is further linked to photo album object 202 and photo tag object 214. In other implementations, person-related data 200 may not have the same objects, tables, fields or entries as those listed above and/or may have other/different objects, tables, fields or entries instead of, or in addition to, those listed above such as a work object, education object, or contact information object.


Profile object 216 provides primary information that identifies a user and includes different fields that store biographic information about a user such as first name, last name, sex, birthday, work history, interests, and the like. The profile object 216 is further linked to other objects that provide supplementary information about the user. For instance, profile object 216 is linked to an event object 208 that stores information related to events subscribed to, checked in to, or attended by the user. In one implementation, profile object 216 is linked to a feed object 218 that specifies different feed items such as posts, comments, replies, mentions, etc. posted by the user or on the user's profile.


In another implementation, profile object 216 is linked to a connection object 228 that provides information about other persons in the social network of the user. In one implementation, profile object 216 is linked to a group object 224 that identifies the groups the user is part of. In yet another implementation, profile object 216 is linked to a photo object 204 that identifies an image, which is uploaded, posted, or selected by the user. The photo object 204 is further linked to a photo album object 202 that categorizes the image and to a photo tag object 214 that describes the image.


In some implementations, person-related data 200 can have one or more of the following variables with certain attributes: USER_ID being CHAR (15 BYTE), IMAGES_ID being CHAR (15 BYTE), EVENT_ID being CHAR (15 BYTE), GROUP_ID being CHAR (15 BYTE), CONNECTION_ID being CHAR (15 BYTE), FEED_ITEM_ID being CHAR (15 BYTE), CREATED_BY being CHAR (15 BYTE), CREATED_DATE being DATE, and DELETED being CHAR (1 BYTE).
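Interpreted as a data structure rather than as raw table columns, the schema above can be pictured as follows. This Python rendering is only a sketch: the field meanings follow the description above, while the class name and defaults are assumptions for illustration.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class PersonRelatedRecord:
        # Each CHAR(15 BYTE) identifier from the schema is represented here as a string key.
        user_id: str          # USER_ID: key of the profile object
        images_id: str        # IMAGES_ID: key of the photo object
        event_id: str         # EVENT_ID: key of the event object
        group_id: str         # GROUP_ID: key of the group object
        connection_id: str    # CONNECTION_ID: key of the connection object
        feed_item_id: str     # FEED_ITEM_ID: key of the feed object
        created_by: str       # CREATED_BY
        created_date: date    # CREATED_DATE
        deleted: str = "N"    # DELETED: single-character flag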


Trust Object and Trust Metadata



FIG. 3 shows one implementation of a trust object 315 linked to a person-related data entity 216. This and other data structure descriptions that are expressed in terms of objects can also be implemented as tables that store multiple records or object types. Reference to objects is for convenience of explanation and not as a limitation on the data structure implementation. Trust object 315 holds trust metadata that include names of person-related data sources from which person-related data 200 is assembled by tracker 128, interface categories of the person-related data sources, origins of the person-related data sources, consent-types given by subjects of the person-related data 200, data privacy regulations applicable to the origins, different purposes of assembling the person-related data 200, and classifications of the person-related data 200. In other implementations, association 300 may not have the same objects, tables, fields, or entries as those listed above and/or may have other/different objects, tables, fields or entries instead of, or in addition to, those listed above such as opt-in law object, unsubscribe life object, or message type object.


Some implementations can include trust objects being linked to individual entity fields 216A-K of the person-related data entity 216. In such an implementation, person-related data sources that provide data for populating entity fields 216A-K are tracked by associating the individual entity fields 216A-K with respective trust objects that track the sources. Other implementations can include the different objects, tables, fields or entries of person-related data 200 being associated with different trust objects that track the respective person-related data sources which populate those objects, tables, fields or entries.


In yet another implementation, association 300 can have one or more of the following variables with certain attributes: LINK_ID being CHAR (15 BYTE), DATA_ENTITY_ID being CHAR (15 BYTE), TRUST_ITEM_ID being CHAR (15 BYTE), CREATED_BY being CHAR (15 BYTE), CREATED_DATE being DATE, and DELETED being CHAR (1 BYTE).
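One way to read the association variables above is as a link table that resolves a data entity (or entity field) to the trust object tracking its source. The sketch below is purely illustrative: the record keys mirror the variables just listed, while the lookup helper and sample values are assumptions.

    # Illustrative association records mirroring LINK_ID, DATA_ENTITY_ID, and TRUST_ITEM_ID.
    associations = [
        {"LINK_ID": "L001", "DATA_ENTITY_ID": "E216", "TRUST_ITEM_ID": "T315", "DELETED": "N"},
    ]

    def trust_object_for(entity_id, links, trust_db):
        """Return the trust object linked to a person-related data entity, if any."""
        for link in links:
            if link["DATA_ENTITY_ID"] == entity_id and link["DELETED"] == "N":
                return trust_db.get(link["TRUST_ITEM_ID"])
        return None

    # Example: trust_object_for("E216", associations, {"T315": {"source_name": "Twitter"}})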



FIG. 4 illustrates details of trust metadata 400 held by trust object 315. This and other data structure descriptions that are expressed in terms of objects can also be implemented as tables that store multiple records or object types. Reference to objects is for convenience of explanation and not as a limitation on the data structure implementation. FIG. 4 shows trust object 315 linked to a name object 402, interface category object 408, jurisdiction object 412, consent-type object 418, regulation object 422, purpose object 428, and classification object 445. In other implementations, trust metadata 400 may not have the same objects, tables, fields or entries as those listed above and/or may have other/different objects, tables, fields or entries instead of, or in addition to, those listed above such as a work object, education object, or contact information object.


Name object 402 includes names of the person-related data sources from which person-related data is assembled. In the example shown in FIG. 4, name object 402 identifies Twitter and Facebook as the person-related data sources that provide the person-related data. Interface category object 408 specifies the type of the person-related data sources included in the name object 402. The sources can be of various types, including: APIs like Yahoo Boss, Facebook Open Graph, or Twitter Firehose; public Internet platforms such as first hand websites, blogs, search aggregators, or social media aggregators; or social networking sites like Facebook, Twitter, LinkedIn, Klout, and the like.


Jurisdiction object 412 identifies different origins of the person-related data sources. In one implementation, it specifies geographic locations (countries, states, etc.) of the person-related data sources. In another implementation, it includes applicable jurisdictions (United States law, European Union law, etc.) of the person-related data sources.


Consent-type object 418 records the type of consent associated with a data subject of person-related data 200. When an organization collects personal information from an individual, most privacy legislation requires that the individual's consent be given before the organization can collect, use, or disclose that information. In some implementations, there are three different types of consent an organization can obtain: explicit consent, implicit consent, and opt-out consent. Explicit consent refers to clear and documentable consent. In one implementation, an explicit consent specifies the particular types of data, the specific purposes for which they can be used, and/or the countries to which they can be disclosed. An example of providing explicit consent is signing a consent form that outlines why an organization would like to collect, use, or disclose an individual's personal information.


Implied consent is consent that is not expressly granted by an individual, but rather is derived from the individual's specific actions and the circumstances that unequivocally demonstrate the individual's consent. In one example, implied consent can be inferred when an individual voluntarily provides personal information for an organization to collect, use, or disclose for purposes that would be considered obvious at the time, or when the personal information is used in a way that clearly benefits the individual and the organization's expectations are reasonable.


When an individual is given the option to decline consent but does not clearly do so, the consent is referred to as opt-out consent. In other words, opt-out consent is obtained when an individual is offered an option to opt out and does not exercise it. For example, when purchasing a product online, an individual can be presented with a checkbox and asked to uncheck the box (opt out) if the individual would not like his or her personal information to be shared with affiliates for marketing purposes.
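The three consent types can be modeled as an enumeration. The sketch below, including the idea of ranking consent types by strength, is an illustrative assumption and not something the patent prescribes.

    from enum import Enum

    class ConsentType(Enum):
        EXPLICIT = "explicit"   # clear, documentable consent (e.g., a signed consent form)
        IMPLIED = "implied"     # inferred from the individual's actions and circumstances
        OPT_OUT = "opt_out"     # offered an opt-out and did not decline consent

    # Assumed ordering: explicit > implied > opt-out in strength of consent.
    _STRENGTH = {ConsentType.EXPLICIT: 3, ConsentType.IMPLIED: 2, ConsentType.OPT_OUT: 1}

    def meets_consent_requirement(given, required):
        """True if the consent actually given is at least as strong as the consent required."""
        return _STRENGTH[given] >= _STRENGTH[required]

    # Example: implied consent does not satisfy a jurisdiction requiring explicit consent.
    assert not meets_consent_requirement(ConsentType.IMPLIED, ConsentType.EXPLICIT)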


Regulation object 422 identifies regulations applicable to a particular data collection, processing, or consumption venue. For instance, if person-related data of a data subject is collected from Canada, then regulation object 422 can record the Spam Act as the regulation that must be complied with. In another instance, if the same data is processed or consumed in the United States, regulation object 422 can identify CAN-SPAM as the applicable regulation. In some implementations, regulation object 422 can use flags or tags to uniquely identify the respective regulations applicable to a collection venue, processing venue, and consumption venue of person-related data of a data subject.
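A compact way to picture the regulation object is as a mapping from venue to the regulation flagged for it, using the Canada/United States example above. The dictionary layout and the helper function below are illustrative assumptions, not the patent's schema.

    from typing import Optional

    # Illustrative regulation object: one flagged regulation per venue kind and jurisdiction.
    regulation_object = {
        "collection_venue":  {"Canada": "Spam Act"},
        "processing_venue":  {"United States": "CAN-SPAM"},
        "consumption_venue": {"United States": "CAN-SPAM"},
    }

    def applicable_regulation(venue_kind: str, jurisdiction: str) -> Optional[str]:
        """Look up the regulation flagged for a given venue kind and jurisdiction."""
        return regulation_object.get(venue_kind, {}).get(jurisdiction)

    # Example: applicable_regulation("processing_venue", "United States") -> "CAN-SPAM"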


Purpose object 426 specifies the purpose for which person-related data of a data subject is collected. In one example, if a data subject provided his contact information to receive marketing calls related to dental service, then purpose object 426 identifies “dental service” as the purpose of the person-related data so as to prevent its use for any other purposes. In another implementation, if a data subject consented to use of his person-related data for a particular type of electoral campaigning, such as presidential election campaigns, purpose object 426 restricts use of that person-related data to presidential election campaigns and prevents solicitation of the user for other types of electoral campaigns such as Congressional elections.


Person-related data has different levels of sensitivity, corresponding to the personal data type, and different laws cover protection of and access to particular personal data types. In one implementation, classification object 445 classifies person-related data as personal data and business data. Personal data includes personal information such as an individual's name, social security number, driver's license number, personal e-mail, state identification number, financial account number, credit card number, electronically stored biometric information, protected health information, and the like. Business data specifies an individual's employer, work address, job title, work e-mail, department, or industry. In another implementation, classification object 445 classifies person-related data as restricted data, controlled data, and public data with different levels of access controls.


In yet another implementation, trust metadata 400 can have one or more of the following variables with certain attributes: METADATA_ENTITY_ID being CHAR (15 BYTE), METADATA_ENTITY_TOTAL_ID being CHAR (15 BYTE), SOURCE_TYPE_ID being CHAR (15 BYTE), LAW_ID being CHAR (15 BYTE), ACCESS_CONTROL_ID being CHAR (15 BYTE), RESTRICTION_TYPE_ID being CHAR (15 BYTE), CREATED_BY being CHAR (15 BYTE), CREATED_DATE being DATE, and DELETED being CHAR (1 BYTE).



FIG. 5 depicts one implementation of a functional data structure that reflects values of trust metadata 500 that comply with the data privacy regulations of a jurisdiction. Multiple data structures reflect the compliance requirements of multiple jurisdictions. A program function interprets the data to set filters according to an applicable jurisdiction. The data filters control transfer or use of data and can be used to implement legal compliance as described herein. Reference in this description to objects and fields is for convenience of explanation and not as a limitation on the data structure implementation.



FIG. 5 shows a regulation object 516 linked to an engagement preference type object 518 and an API object 528. In other implementations, acceptable values 500 may not have the same objects, tables, fields or entries as those listed above and/or may have other/different objects, tables, fields or entries instead of, or in addition to, those listed. In some implementations, regulation objects 516 can be grouped under state, country or other jurisdiction objects, which are not shown.


In one implementation, a filter is constructed that sets acceptable values 500 in accordance with the data privacy regulations specified in a particular jurisdiction. For instance, if a data provider wants a set of person-related data to comply with United States jurisdiction, then the person-related data set should have field values consistent with those required by the applicable CAN-SPAM Act. As shown in FIG. 5, engagement preferences required by the CAN-SPAM Act can be met by linking CAN-SPAM regulation object 516 to an engagement preferences type object 518. Engagement preferences type object 518 can specify an engagement preference of type “Not-Opt-Out-For-Display” such that displaying person-related data of data subjects requires that the data subjects have not opted out, and thus necessitates providing a valid and simple opt-out mechanism or an operative unsubscribe facility. In another example, an engagement preference of type “Opt-In-Out-For-Display” can require an explicit opt-in of the data subjects. Similarly, API object 528 can identify the different person-related data sources that follow and enforce the CAN-SPAM Act. In other implementations, acceptable values that identify specific opt-in/opt-out mediums like email (Not-Opt-Out-For-Email/Opt-In-For-Email) or specific solicitation purposes like campaigns (Opt-Out-For-Campaigns/Opt-In-For-Campaigns) can be set.
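The following sketch shows how such per-jurisdiction acceptable values might be interpreted to build a filter predicate. The dictionary layout, field names, and source names are illustrative assumptions rather than the patent's schema.

    # Illustrative per-jurisdiction acceptable values, keyed by jurisdiction name.
    ACCEPTABLE_VALUES = {
        "United States": {
            "regulation": "CAN-SPAM",
            "engagement_preference": "Not-Opt-Out-For-Display",
            "allowed_sources": {"source_a", "source_b"},  # sources assumed to enforce the regulation
        },
    }

    def build_filter(jurisdiction):
        """Build a predicate that admits only records meeting the jurisdiction's acceptable values."""
        rules = ACCEPTABLE_VALUES[jurisdiction]
        def passes(record):
            return (not record.get("opted_out", False)            # Not-Opt-Out-For-Display
                    and record.get("source") in rules["allowed_sources"])
        return passes

    us_filter = build_filter("United States")
    print(us_filter({"source": "source_a", "opted_out": False}))  # True
    print(us_filter({"source": "source_c", "opted_out": False}))  # False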


Trust Filter Interface



FIG. 6 illustrates one implementation of generating for display a trust filter interface 124 that can be used to automatically comply with data privacy laws. Trust filter interface 124 can take one of a number of forms, including user interfaces, dashboard interfaces, engagement consoles, and other interfaces, such as mobile interfaces, tablet interfaces, summary interfaces, or wearable interfaces. In some implementations, it can be hosted on a web-based or cloud-based privacy management application running on a computing device such as a personal computer, laptop computer, mobile device, and/or any other hand-held computing device. It can also be hosted on a non-social local application running in an on-premise environment. In one implementation, trust filter interface 124 can be accessed from a browser running on a computing device. The browser can be Chrome, Internet Explorer, Firefox, Safari, and the like. In other implementations, trust filter interface 124 can run as an engagement console on a computer desktop application.


In other implementations, trust filter 600 can be presented in online social networks such as Salesforce's Chatter, Facebook, Twitter, LinkedIn, and the like. FIG. 6 also shows an interface category tab 610, source type tab 620, jurisdiction tab 630, regulation tab 640, consent tab 650, purpose tab 660, and classification tab 670. In other implementations, user interface 600 may not have the same widgets or screen objects as those listed above and/or may have other/different widgets or screen objects instead of, or in addition to, those listed above such as language tab, subscribe life tab, or message type tab.


In particular, trust filter interface 124 can be used by tenant personnel such as privacy engineers or privacy administrators to specify the privacy requirements or regulations with which they want the instant person-related data to comply. In one implementation, interface category tab 610 can be used to select the type of person-related data source from which person-related data is collected. Further, source type tab 620 can be used to specify the particular source from which person-related data is assembled. Then, a collection point, processing point, usage point, or consumption point of the data can be selected via jurisdiction tab 630. In some implementations, the jurisdiction can be automatically selected based on the type or name of the person-related data source. Also, regulations applicable to the selected collection point, processing point, usage point, or consumption point can be identified using regulation tab 640. In other implementations, the one or more applicable regulations can be automatically selected based on the collection point, processing point, usage point, or consumption point of the person-related data source.


Consent tab 650 can be used to select the type of consent that is mandated by the regulations applicable to the selected collection point, processing point, usage point, or consumption point. Similarly, purpose tab 660 can specify the purposes to which the subsequent processing, usage, or consumption of the person-related data must be restricted. Additionally, further processing, usage, or consumption of the person-related data can be limited to only its personal or business data types through the classification tab 670.


Flowchart of Automatically Complying with Data Privacy Laws



FIG. 7 is a flowchart 700 of one implementation of automated compliance with data privacy laws. Flowchart 700 can be implemented at least partially with a database system, e.g., by one or more processors configured to receive or retrieve information, process the information, store results, and transmit the results. Other implementations may perform the actions in different orders and/or with different, fewer or additional actions than those illustrated in FIG. 7. Multiple actions can be combined in some implementations. For convenience, this flowchart is described with reference to the system that carries out a method. The system is not necessarily part of the method.


At action 710, a person-related data source is tracked by associating a data entity that holds person-related data with a trust object that tracks the source. The trust object holds trust metadata, including name of the person-related data source, interface category of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations applicable to the origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data.


At action 720, a tenant request for the person-related data is received. The tenant request identifies a jurisdiction for subsequently processing, using, or consuming the person-related data. In one implementation, a tenant can be a customer, customer department, business or legal organization, and/or any other entity that acquires person-related data for sales, marketing, campaigning, or other customer engagement purposes from a primary data provider who owns crowd-sourced data repositories or knowledge bases.


At action 730, a filter is constructed that sets acceptable values, in accordance with the data privacy regulations specified in the jurisdiction, for the name of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data.


At action 740, the filter is automatically applied to the person-related data to restrict transfer of any person-related data that do not meet the data privacy regulations. In one implementation, transfer of the filtered person-related data into the jurisdiction used to construct the filter is automatically authorized. In another implementation, access to the filtered person-related data is automatically authorized for tenant personnel stationed in the jurisdiction used to construct the filter.
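Actions 710 through 740 can be pictured end to end as a single request handler. The sketch below is schematic: all names are invented, and the compliance test is delegated to a caller-supplied function standing in for whatever per-jurisdiction rules apply.

    def handle_tenant_request(request, person_related_db, trust_db, compliant_with):
        """Return only the person-related data that may be transferred to the requesting tenant."""
        jurisdiction = request["jurisdiction"]      # action 720: jurisdiction of further use
        def trust_filter(entity_id):                # action 730: construct the filter
            trust = trust_db.get(entity_id)         # trust object associated at action 710
            return trust is not None and compliant_with(trust, jurisdiction)
        # Action 740: apply the filter; non-compliant data is never transferred.
        return {eid: data for eid, data in person_related_db.items() if trust_filter(eid)}

    # Example with a hypothetical compliance test requiring express consent for "Canada":
    # handle_tenant_request({"jurisdiction": "Canada"}, db, trust_db,
    #                       lambda trust, j: trust["consent_type"] == "express")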


Computer System



FIG. 8 is a block diagram of an example computer system 800 for automatically complying with data privacy laws. Computer system 810 typically includes at least one processor 814 that communicates with a number of peripheral devices via bus subsystem 812. These peripheral devices can include a storage subsystem 824 including, for example, memory devices and a file storage subsystem, user interface input devices 822, user interface output devices 820, and a network interface subsystem 816. The input and output devices allow user interaction with computer system 810. Network interface subsystem 816 provides an interface to outside networks, including an interface to corresponding interface devices in other computer systems.


User interface input devices 822 can include a keyboard; pointing devices such as a mouse, trackball, touchpad, or graphics tablet; a scanner; a touch screen incorporated into the display; audio input devices such as voice recognition systems and microphones; and other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and ways to input information into computer system 810.


User interface output devices 820 can include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices. The display subsystem can include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image. The display subsystem can also provide a non-visual display such as audio output devices. In general, use of the term “output device” is intended to include all possible types of devices and ways to output information from computer system 810 to the user or to another machine or computer system.


Storage subsystem 824 stores programming and data constructs that provide the functionality of some or all of the modules and methods described herein. These software modules are generally executed by processor 814 alone or in combination with other processors.


Memory 826 used in the storage subsystem can include a number of memories including a main random access memory (RAM) 830 for storage of instructions and data during program execution and a read only memory (ROM) 832 in which fixed instructions are stored. A file storage subsystem 826 can provide persistent storage for program and data files, and can include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges. The modules implementing the functionality of certain implementations can be stored by file storage subsystem 826 in the storage subsystem 824, or in other machines accessible by the processor.


Bus subsystem 812 provides a mechanism for letting the different components and subsystems of computer system 810 communicate with each other as intended. Although bus subsystem 812 is shown schematically as a single bus, alternative implementations of the bus subsystem can use multiple busses.


Computer system 810 can be of varying types including a workstation, server, computing cluster, blade server, server farm, or any other data processing system or computing device. Due to the ever-changing nature of computers and networks, the description of computer system 810 depicted in FIG. 8 is intended only as one example. Many other configurations of computer system 810 are possible having more or fewer components than the computer system depicted in FIG. 8.


Particular Implementations


In one implementation, a method is described from the perspective of a server receiving messages from user software. The method includes tracking a person-related data source by associating a data entity that holds person-related data with a trust object that tracks the source. The trust object holds trust metadata, including name of the person-related data source, interface category of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations applicable to the origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data.


The method also includes receiving a tenant request for the person-related data, wherein the tenant request identifies at least one jurisdiction for subsequently using the person-related data. It includes constructing a filter that sets acceptable values, in accordance with the data privacy regulations specified in the jurisdiction, for the name of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data. It further includes automatically applying the filter to the person-related data to restrict transfer of any person-related data that do not meet the data privacy regulations.


The method described can also be presented from the perspective of a mobile device and user software interacting with a server. From the mobile device perspective, the method includes tracking a person-related data source by associating a data entity that holds person-related data with a trust object that tracks the source. The trust object holds trust metadata, including name of the person-related data source, interface category of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations applicable to the origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data.


The method also includes receiving a tenant request for the person-related data, wherein the tenant request identifies at least one jurisdiction for subsequently using the person-related data. The method relies on the server to construct a filter that sets acceptable values, in accordance with the data privacy regulations specified in the jurisdiction, for the name of the person-related data source, origin of the person-related data source, consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data. It further includes automatically applying the filter to the person-related data to restrict transfer of any person-related data that do not meet the data privacy regulations.


This method and other implementations of the technology disclosed can include one or more of the following features and/or features described in connection with additional methods disclosed. In the interest of conciseness, the combinations of features disclosed in this application are not individually enumerated and are not repeated with each base set of features. The reader will understand how features identified in this section can readily be combined with sets of base features identified as implementations such as automated compliance environment, trust object, trust metadata, or trust filter.


The interface categories of person-related data sources include access controlled APIs, public Internet, and social networking sites. The origin of the person-related data source identifies at least one geographic location of the source. Consent-types of subjects of person-related data include at least express consent, implied consent, and opt-out consent. The classification of the person-related data includes at least personal data and business data.


The method also includes automatically authorizing transfer of the filtered person-related data into the jurisdiction used to construct the filter. It further includes automatically authorizing access to the filtered person-related data by tenant personnel stationed in the jurisdiction used to construct the filter.


Other implementations include a non-transitory computer readable storage medium storing instructions executable by a processor to perform any of the methods described above. Yet other implementations include a system including memory and one or more processors operable to execute instructions, stored in the memory, to perform any of the methods described above.


While the present technology is disclosed by reference to the preferred implementations and examples detailed above, it is to be understood that these examples are intended in an illustrative rather than in a limiting sense. It is contemplated that modifications and combinations will readily occur to those skilled in the art, which modifications and combinations will be within the spirit of the technology and the scope of the following claims.

Claims
  • 1. A method of a server restricting transfer of private data that do not meet data privacy regulations, the method including:
    tracking a person-related data source by associating a data entity that holds person-related data with a trust object that tracks the person-related data source, wherein the person-related data includes private data;
    wherein the trust object holds trust metadata, including: name of the person-related data source, interface category of the person-related data source, physical origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations that control access to the private data and are set for a legal jurisdiction that governs the physical origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data;
    representing the data privacy regulations set for the legal jurisdiction that governs the physical origin of the person-related data in an access control object that specifies, by the legal jurisdiction, access control based on the name of the person-related data source, the physical origin of the person-related data source, the consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data;
    receiving, from a client computer, a tenant request for the person-related data, wherein the tenant request identifies at least one legal jurisdiction where the person-related data will be subsequently used;
    constructing a filter that implements access control, in accordance with the access control object representing the data privacy regulations specified in the legal jurisdiction that governs the physical origin of the person-related data, for the name of the person-related data source, the physical origin of the person-related data source, the consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data; and
    automatically applying the filter to the person-related data requested by the client computer to restrict transfer of any private data, from the server to the client computer, that do not meet the data privacy regulations specified in the legal jurisdiction that governs the physical origin of the person-related data, and to restrict transfer of any private data from the server to the client computer that do not meet the data privacy regulations of the at least one legal jurisdiction identified in the tenant request.
  • 2. The method of claim 1, wherein interface categories of person-related data sources include access controlled APIs, public Internet, and social networking sites.
  • 3. The method of claim 1, wherein the physical origin of the person-related data source identifies at least one geographic location of the person-related data source.
  • 4. The method of claim 1, wherein consent-types of subjects of person-related data include at least express consent, implied consent, and opt-out consent.
  • 5. The method of claim 1, wherein the classification of the person-related data includes at least personal data and business data.
  • 6. The method of claim 1, further including automatically authorizing transfer of filtered person-related data into the legal jurisdiction used to construct the filter.
  • 7. The method of claim 1, further including automatically authorizing access to filtered person-related data by tenant personnel stationed in the legal jurisdiction used to construct the filter.
  • 8. A system, including: a server including a processor and a computer readable storage medium storing computer instructions configured to cause the processor to:
    track a person-related data source by associating a data entity that holds person-related data with a trust object that tracks the person-related data source, wherein the person-related data includes private data;
    wherein the trust object holds trust metadata, including: name of the person-related data source, interface category of the person-related data source, physical origin of the person-related data source, consent-type given by subject of the person-related data, data privacy regulations that control access to the private data and are set for a legal jurisdiction that governs the physical origin, at least one purpose of assembling the person-related data, and at least one classification of the person-related data;
    represent the data privacy regulations set for the legal jurisdiction that governs the physical origin of the person-related data in an access control object that specifies, by the legal jurisdiction, access control based on the name of the person-related data source, the physical origin of the person-related data source, the consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data;
    receive, from a client computer, a tenant request for the person-related data, wherein the tenant request identifies at least one legal jurisdiction where the person-related data will be subsequently used;
    construct a filter that implements access control, in accordance with the access control object representing the data privacy regulations specified in the legal jurisdiction that governs the physical origin of the person-related data, for the name of the person-related data source, the physical origin of the person-related data source, the consent-type given by subject of the person-related data, the purpose of assembling the person-related data, and the classification of the person-related data; and
    automatically apply the filter to the person-related data requested by the client computer to restrict transfer of any private data, from the server to the client computer, that do not meet the data privacy regulations specified in the legal jurisdiction that governs the physical origin of the person-related data, and to restrict transfer of any private data from the server to the client computer that do not meet the data privacy regulations of the at least one legal jurisdiction identified in the tenant request.
  • 9. The system of claim 8, wherein interface categories of person-related data sources include access controlled APIs, public Internet, and social networking sites.
  • 10. The system of claim 8, wherein the physical origin of the person-related data source identifies at least one geographic location of the person-related data source.
  • 11. The system of claim 8, wherein consent-types of subjects of person-related data include at least express consent, implied consent, and opt-out consent.
  • 12. The system of claim 8, wherein the classification of the person-related data includes at least personal data and business data.
  • 13. The system of claim 8, further including automatically authorizing transfer of filtered person-related data into the legal jurisdiction used to construct the filter.
  • 14. The system of claim 8, further including automatically authorizing access to filtered person-related data by tenant personnel stationed in the legal jurisdiction used to construct the filter.
  • 15. The method of claim 1, wherein the tenant request is received from a tenant subject to the at least one legal jurisdiction for subsequently using the person-related data, and the subject of the person-related data is subject to the legal jurisdiction that governs the physical origin of the person-related data.
RELATED APPLICATION

This application claims the benefit of U.S. Provisional Patent Application No. 61/835,225, entitled “Systems and Methods for Managing Social Data as Per Third-Party Preferences,” filed on Jun. 14, 2013. The provisional application is hereby incorporated by reference for all purposes.

Related Publications (1)
Number Date Country
20140373182 A1 Dec 2014 US
Provisional Applications (1)
Number Date Country
61835225 Jun 2013 US