This disclosure relates to, among other things, data processing systems and methods for retrieving data regarding a plurality of privacy campaigns, using that data to assess a relative risk associated with the respective data privacy campaigns, providing an audit schedule for each campaign, measuring privacy requirement compliance across a plurality of privacy campaigns, determining respective privacy maturity ratings for one or more groups within an organization, and processing the relevant data.
In recent years, privacy and security policies and related operations have become increasingly important. Breaches in security, leading to the unauthorized access of personal data (which may include sensitive personal data), have become more frequent among companies and other organizations of all sizes. Such personal data may include, but is not limited to, personally identifiable information (PII), which may be information that directly (or indirectly) identifies an individual or entity. Examples of PII include names, addresses, dates of birth, social security numbers, and biometric identifiers such as a person's fingerprints or picture. Other personal data may include, for example, customers' Internet browsing habits, purchase history, or even their preferences (e.g., likes and dislikes, as provided or obtained through social media). While not all personal data may be sensitive, in the wrong hands this kind of information may have a negative impact on the individuals or entities whose sensitive personal data is collected, for example, by exposing them to identity theft and embarrassment. Not only do such breaches have the potential to expose individuals to malicious wrongdoing, but the fallout from such breaches may also result in damage to reputation, potential liability, and costly remedial action for the organizations that collected the information and that were under an obligation to maintain its confidentiality and security. These breaches may result in not only financial loss, but also loss of credibility, confidence, and trust from individuals, stakeholders, and the public.
In order to reduce the risks associated with obtaining, storing, and using personal data, private companies and other organizations have begun to train their employees to properly handle personal data. However, such training efforts may be implemented inconsistently, which may, in turn, result in the inconsistent implementation of proper procedures for handling personal data. Accordingly, there is a need for improved systems and methods for evaluating the privacy maturity of particular individuals and/or groups of individuals within an organization.
A computer-implemented data processing method for measuring the compliance of a particular business unit within an organization with one or more privacy requirements, according to particular embodiments, comprises: (1) receiving, by one or more processors, a request to measure a privacy maturity of the particular business unit; and (2) in response to receiving the request, retrieving, by one or more processors, from a privacy compliance system, an electronic record comprising information associated with one or more privacy impact assessments submitted by the particular business unit, wherein the privacy compliance system digitally stores the electronic record associated with the one or more privacy impact assessments and the electronic record comprises: (a) one or more types of personal data collected as part of one or more privacy campaigns for which the privacy impact assessment was performed; (b) a subject from which the personal data was collected; (c) a storage location of the personal data; and (d) one or more access permissions associated with the personal data.
In any embodiment described herein, the method may further comprise: (1) determining, by one or more processors, one or more identified issues with the one or more privacy impact assessments; (2) determining, by one or more processors, based at least in part on the one or more privacy impact assessments, information associated with privacy campaign data collected as part of the one or more privacy campaigns; (3) receiving, by one or more processors, training data associated with one or more individuals associated with the particular business unit; (4) generating, by one or more processors, a privacy maturity report for the particular business unit based at least in part on the one or more identified issues, the information associated with the privacy campaign data, and the training data; and (5) displaying, by one or more processors, the privacy maturity report on a display screen associated with a computing device.
A computer-implemented data processing method for measuring a particular organization's compliance with one or more requirements associated with one or more pieces of computer code originating from the particular organization, in some embodiments, comprises: (1) determining, by one or more processors, for each of the one or more pieces of computer code, one or more respective storage locations; (2) electronically obtaining, by one or more processors, each of the one or more pieces of computer code based on the one or more respective storage locations; (3) automatically electronically analyzing each of the one or more pieces of computer code to determine one or more privacy-related attributes of each of the one or more pieces of computer code, each of the privacy-related attributes indicating one or more types of privacy campaign data that the computer code collects or accesses; (4) retrieving, by one or more processors, for at least one individual associated with the organization, privacy training data comprising an amount of privacy training received by the at least one individual; (5) determining, by one or more processors, based at least in part on the one or more types of privacy campaign data that the computer code collects or accesses and the privacy training data, a privacy maturity score for the particular organization; and (6) displaying, by one or more processors, the privacy maturity score on a display screen associated with a computing device.
A computer-implemented data processing method for measuring a privacy maturity of a sub-group within an organization, according to particular embodiments, comprises: (1) determining, by one or more processors, a number of issues identified by one or more privacy impact assessments performed on each of a plurality of privacy campaigns undertaken by the sub-group; and (2) determining, by one or more processors, from a privacy compliance system, information associated with privacy campaign data collected as part of each of the plurality of privacy campaigns, wherein the privacy compliance system digitally stores an electronic record associated with each of the plurality of privacy campaigns and the electronic record comprises: (a) one or more types of personal data collected as part of each of the plurality of privacy campaigns; (b) a subject from which the personal data was collected; (c) a storage location of the personal data; and (d) one or more access permissions associated with the personal data. In further embodiments, the method comprises: (1) receiving, by one or more processors, training data associated with one or more individuals associated with the particular sub-group; (2) determining, by one or more processors, a privacy maturity for the particular sub-group based at least in part on the number of issues identified by the one or more privacy impact assessments, the information associated with privacy campaign data, and the training data; and (3) displaying, by one or more processors, the privacy maturity on a display screen associated with a computing device.
Various embodiments of a system and method for privacy compliance measurement are described below. In the course of this description, reference will be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Various embodiments now will be described more fully hereinafter with reference to the accompanying drawings. It should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
Overview
A privacy compliance measurement system, according to various embodiments, is configured to determine compliance with one or more privacy compliance requirements by an organization or sub-group of the organization (e.g., one or more business groups or units within the organization). In particular embodiments, the system is configured to determine compliance with the one or more privacy compliance requirements based on, for example: (1) a frequency of risks or issues identified with Privacy Impact Assessments (PIAs) performed or completed by the one or more business units; (2) a relative training level of members of the one or more business units with regard to privacy related matters; (3) a breadth and amount of personal data collected by the one or more business units; and/or (4) any other suitable information related to the one or more business units' collection and storage of personal data.
In particular embodiments, the system is configured to determine a privacy maturity score (e.g., privacy maturity level) for a particular business unit that may, for example, be based at least in part on: (1) a percentage of PIAs associated with the particular business unit that the system identifies as having a high, medium, or low number of issues; (2) a total number of PIAs performed by the particular business unit; (3) a number of privacy campaigns initiated by the particular business unit; (4) an amount of personal data collected and stored by the particular business unit; (5) a number of individuals within the particular business unit who have received formal privacy training; (6) a number of individuals within the particular business unit who have received a privacy-related professional certification; and/or (7) any other suitable factor. In various embodiments, the system is configured to enable a user to customize the one or more factors upon which the privacy maturity score is determined.
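The user-customizable combination of factors described above can be illustrated with a minimal sketch. The factor names, the normalization to a 0.0-1.0 range, the weights, and the weighted-average formula below are all illustrative assumptions for this sketch, not the claimed implementation:

```python
# Hypothetical sketch of a configurable privacy maturity score.
# Factor names, weights, and the scoring formula are assumptions.

def privacy_maturity_score(metrics, weights):
    """Combine normalized business-unit metrics (each 0.0-1.0, higher
    meaning more mature) into a single 0-100 score using the
    user-supplied weights for each factor."""
    total_weight = sum(weights.values())
    weighted = sum(metrics[name] * w for name, w in weights.items())
    return round(100 * weighted / total_weight, 1)

# Example inputs (assumed values for a hypothetical business unit):
metrics = {
    "low_issue_pia_fraction": 0.8,    # fraction of PIAs with few issues
    "formal_training_fraction": 0.5,  # members with formal privacy training
    "certified_fraction": 0.25,       # members holding a privacy certification
}
weights = {
    "low_issue_pia_fraction": 2.0,
    "formal_training_fraction": 1.0,
    "certified_fraction": 1.0,
}
```

Because the weights are supplied by the caller, a user may emphasize or de-emphasize any factor, mirroring the customization described above.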
In particular embodiments, the privacy compliance measurement system may be implemented in the context of any suitable privacy compliance system that is configured to ensure compliance with one or more legal or industry standards related to the collection and storage of private information. In particular embodiments, a particular organization or sub-group may initiate a privacy campaign as part of its business activities.
In various embodiments, a privacy campaign may include any undertaking by a particular organization (e.g., such as a project or other activity) that includes the collection, entry, and/or storage (e.g., in memory) of any privacy information or personal data associated with one or more individuals. This personal data may include, for example, for an individual: (1) name; (2) address; (3) telephone number; (4) e-mail address; (5) social security number; (6) information associated with one or more credit accounts (e.g., credit card numbers); (7) banking information; (8) location data; (9) internet search history; (10) account data; and (11) any other suitable personal information discussed herein.
As generally discussed above, a particular organization may be required to implement operational policies and processes to comply with one or more legal requirements in handling such personal data. A particular organization may further take steps to comply with one or more industry best practices. In particular embodiments, these operational policies and processes may include, for example: (1) storing personal data in a suitable location; (2) limiting access to the personal data to only suitable individuals or entities within the organization or external to the organization; (3) limiting a length of time for which the data will be stored; and (4) any other suitable policy to ensure compliance with any legal or industry guidelines. In particular embodiments, the legal or industry guidelines may vary based at least in part on, for example: (1) the type of data being stored; (2) an amount of data; (3) whether the data is encrypted; (4) etc.
For example, a particular organization's privacy compliance system may store information related to a plurality of privacy campaigns that the particular organization has undertaken. Each particular privacy campaign may include the receipt or entry and subsequent storage of personal data associated with one or more individuals as part of the privacy campaign. An exemplary privacy campaign may, for example, include the collection and storage of the organization's employees' names, contact information, banking information, and social security numbers for use by the organization's accounting department for payroll purposes.
In particular embodiments, the system is configured to rate the privacy maturity of a particular organization or sub-group's execution of privacy campaigns in general. This may include, for example, rating the ability or likelihood of these organizations or sub-groups to comply with the legal and industry standards when initiating new privacy campaigns and participating in existing privacy campaigns. By rating the privacy maturity of a particular organization or sub-group, the system may enable privacy officers, administrators, or the system to identify those organizations or sub-groups whose privacy campaigns may require additional auditing or modification to ensure compliance with any legal or industry guidelines.
Exemplary Technical Platforms
As will be appreciated by one skilled in the relevant field, the present invention may be, for example, embodied as a computer system, a method, or a computer program product. Accordingly, various embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, particular embodiments may take the form of a computer program product stored on a computer-readable storage medium having computer-readable instructions (e.g., software) embodied in the storage medium. Various embodiments may take the form of web-implemented computer software. Any suitable computer-readable storage medium may be utilized including, for example, hard disks, compact disks, DVDs, optical storage devices, and/or magnetic storage devices.
Various embodiments are described below with reference to block diagrams and flowchart illustrations of methods, apparatuses (e.g., systems), and computer program products. It should be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by a computer executing computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of mechanisms for performing the specified functions, combinations of steps for performing the specified functions, and program instructions for performing the specified functions. It should also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and other hardware executing appropriate computer instructions.
Example System Architecture
As may be understood from
The one or more computer networks 115 may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a public switched telephone network (PSTN), or any other type of network. The communication link between the Privacy Compliance Measurement Server 110 and the one or more databases 140 may be, for example, implemented via a Local Area Network (LAN) or via the Internet.
In particular embodiments, the computer 200 may be connected (e.g., networked) to other computers in a LAN, an intranet, an extranet, and/or the Internet. As noted above, the computer 200 may operate in the capacity of a server or a client computer in a client-server network environment, or as a peer computer in a peer-to-peer (or distributed) network environment. The computer 200 may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, a switch or bridge, or any other computer capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computer. Further, while only a single computer is illustrated, the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
An exemplary computer 200 includes a processing device 202, a main memory 204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), static memory 206 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 218, which communicate with each other via a bus 232.
The processing device 202 represents one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. More particularly, the processing device 202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device 202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 202 may be configured to execute processing logic 226 for performing various operations and steps discussed herein.
The computer 200 may further include a network interface device 208. The computer 200 also may include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a mouse), and a signal generation device 216 (e.g., a speaker).
The data storage device 218 may include a non-transitory computer-accessible storage medium 230 (also known as a non-transitory computer-readable storage medium or a non-transitory computer-readable medium) on which is stored one or more sets of instructions (e.g., software instructions 222) embodying any one or more of the methodologies or functions described herein. The software instructions 222 may also reside, completely or at least partially, within main memory 204 and/or within processing device 202 during execution thereof by computer 200—main memory 204 and processing device 202 also constituting computer-accessible storage media. The software instructions 222 may further be transmitted or received over a network 115 via network interface device 208.
While the computer-accessible storage medium 230 is shown in an exemplary embodiment to be a single medium, the term “computer-accessible storage medium”, and similar terms, such as “non-transitory computer-readable medium”, should be understood to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. Such terms should also be understood to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the computer and that cause the computer to perform any one or more of the methodologies of the present invention. Such terms should accordingly be understood to include, but not be limited to, solid-state memories, optical and magnetic media, etc.
Exemplary System Platform
Various embodiments of a privacy compliance measurement system may be implemented in the context of any suitable privacy compliance system. For example, the privacy compliance measurement system may be implemented to determine the privacy maturity of a particular organization or sub-group of the organization related to the collection and storage of personal data by those organizations or sub-groups. The system may, for example, be configured to measure the privacy maturity of the organization based on an overall handling of privacy campaigns by the organization. Various aspects of the system's functionality may be executed by certain system modules, including a Privacy Compliance Measurement Module 300 and Privacy Campaign Modification Module 400. These modules are discussed in greater detail below. Although these modules are presented as a series of steps, it should be understood in light of this disclosure that various embodiments of the various modules described herein may perform the steps described below in an order other than in which they are presented. In other embodiments, any module described herein may omit certain steps described below. In still other embodiments, any module described herein may perform steps in addition to those described.
Privacy Compliance Measurement Module
In particular embodiments, a Privacy Compliance Measurement Module 300 is configured to measure the privacy maturity of a particular organization or sub-group within the organization. As described above, the privacy maturity may include the organization's or sub-group's aptitude for adhering to one or more privacy compliance requirements.
Turning to
In particular embodiments, the privacy maturity rating (e.g., a numerical score or a text score, such as “excellent” or “poor”) of the organization or sub-group may include a rating related to the organization or sub-group's compliance with the industry best practices and/or legal requirements related to the handling of personal data. In various embodiments, the privacy maturity rating may comprise a relative rating on a particular scale (e.g., from 0-10, from 0-100, a percentage rating, etc.). In some embodiments, the privacy maturity rating may be further based on a likelihood of continued compliance with such legal and industry requirements, which the system may determine, for example, based on one or more attributes of the organization and/or its individual members. In particular embodiments, the privacy maturity rating may be based at least in part on, for example: (1) one or more issues or risks identified in privacy assessments submitted by the particular business unit; (2) a size and type of personal data used by the business unit; (3) a training level on various privacy issues of one or more members of the business unit; and/or (4) any other suitable factor. In further embodiments, the privacy maturity rating may be accompanied by a report that includes a number of identified issues in one or more privacy impact assessments submitted by the organization or sub-group.
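One simple way to realize a text score on a 0-100 scale, as contemplated above, is a banded mapping. The band boundaries and label names below are assumptions chosen for illustration only:

```python
# Illustrative mapping from a 0-100 privacy maturity score to a text
# rating; the band boundaries are assumptions for this sketch.

def maturity_label(score):
    """Return a text rating for a numeric privacy maturity score."""
    if not 0 <= score <= 100:
        raise ValueError("score must be on the 0-100 scale")
    if score >= 90:
        return "excellent"
    if score >= 70:
        return "good"
    if score >= 40:
        return "fair"
    return "poor"
```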
Continuing to Step 320, the system retrieves information associated with one or more privacy impact assessments submitted by the organization or sub-group of the organization. In particular embodiments, the system may retrieve the information associated with the one or more privacy impact assessments in response to the request to measure the privacy maturity of the particular organization or sub-group received at Step 310. As may be understood in light of this disclosure, when initiating a new privacy campaign (e.g., any undertaking by a particular organization or sub-group that includes the collection, entry, and/or storage of any privacy information or personal data associated with one or more individuals), a particular organization may complete or perform a privacy impact assessment for the privacy campaign. An organization or sub-group may further complete or perform a privacy impact assessment for an existing privacy campaign.
In various embodiments, the privacy campaign may be associated with an electronic data structure comprising privacy campaign data. In particular embodiments, the privacy campaign data comprises a description of the privacy campaign, one or more types of personal data related to the campaign, a subject from which the personal data is collected as part of the privacy campaign, a storage location of the personal data (e.g., including a physical location of physical memory on which the personal data is stored), one or more access permissions associated with the personal data, and/or any other suitable data associated with the privacy campaign.
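The electronic data structure described above might be modeled as a simple record type. The class and field names below are illustrative assumptions, not the disclosed schema:

```python
from dataclasses import dataclass, field

# One possible shape for the privacy campaign electronic record;
# class and field names are illustrative assumptions.
@dataclass
class PrivacyCampaignRecord:
    description: str            # description of the privacy campaign
    personal_data_types: list   # e.g., ["name", "banking information"]
    data_subject: str           # the subject the data is collected from
    storage_location: str       # where the personal data is stored
    access_permissions: list = field(default_factory=list)
```

A payroll campaign like the one in the example above could then be instantiated with the employees' data types, storage location, and accounting-department access permissions.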
An exemplary privacy campaign, project, or other activity may include, for example: (1) the implementation of new hardware and/or software for storing and accessing personal data; (2) the implementation of a data sharing initiative where two or more organizations seek to pool and/or link one or more sets of personal data; (3) implantation of a proposal to identify people in a particular group or demographic and initiate a course of action; (4) using existing personal data for a new purpose; and/or (5) the consolidation of information held by separate parts of a particular organization. In still other embodiments, the particular privacy campaign, project or other activity may include any other privacy campaign, project, or other activity discussed herein, or any other suitable privacy campaign, project, or activity.
During a privacy impact assessment for a particular privacy campaign, a privacy impact assessment system may ask one or more users (e.g., one or more individuals associated with the particular organization or sub-group that is undertaking the privacy campaign) a series of privacy impact assessment questions regarding the particular privacy campaign and then store the answers to these questions in the system's memory, or in memory of another system, such as a third-party computer system.
Such privacy impact assessment questions may include questions regarding, for example: (1) what type of data is to be collected as part of the campaign; (2) who the data is to be collected from; (3) where the data is to be stored; (4) who will have access to the data; (5) how long the data will be kept before being deleted from the system's memory or archived; and/or (6) any other relevant information regarding the campaign. In various embodiments, a privacy impact assessment system may determine a relative risk and/or potential issues with a particular privacy campaign as it relates to the collection and storage of personal data. For example, the system may be configured to identify a privacy campaign as being “High” risk, “Medium” risk, or “Low” risk based at least in part on answers submitted to the questions listed above. For example, a privacy impact assessment that revealed that credit card numbers would be stored without encryption for a privacy campaign would likely cause the system to determine that the privacy campaign was high risk.
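The rule-based triage described above can be sketched as follows. The answer keys, the sensitive-data list, and the thresholds are illustrative assumptions rather than the disclosed algorithm:

```python
# Hedged sketch of rule-based risk triage from PIA answers; the answer
# keys, sensitive-data set, and thresholds are assumptions.

def classify_campaign_risk(answers):
    """Return 'High', 'Medium', or 'Low' from a dict of PIA answers."""
    sensitive = {"credit card number", "social security number", "biometric"}
    collects_sensitive = bool(sensitive & set(answers.get("data_types", [])))
    # Unencrypted sensitive data (e.g., raw credit card numbers) is
    # treated as high risk, matching the example in the text.
    if collects_sensitive and not answers.get("encrypted", False):
        return "High"
    # Sensitive-but-encrypted data, or unusually broad access, is medium.
    if collects_sensitive or answers.get("access_count", 0) > 50:
        return "Medium"
    return "Low"

# The unencrypted credit-card example from the text:
classify_campaign_risk({"data_types": ["credit card number"],
                        "encrypted": False})  # → "High"
```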
Continuing at Step 330, the system is configured to determine one or more identified risks with the one or more submitted privacy impact assessments. As may be understood in light of this disclosure, the system may be configured to determine one or more identified risks based on, for example: (1) one or more risks or issues identified as part of a review of a particular privacy impact assessment or privacy campaign performed by one or more third party regulators; (2) one or more issues or risks identified as part of a particular privacy impact assessment prior to initiation of a particular privacy campaign (e.g., by one or more other members of the organization or sub-group, by a system configured for identifying such issues or risks, etc.); (3) one or more issues or risks identified for an existing privacy campaign; and/or (4) etc.
In various embodiments, the one or more issues or risks may include, for example, that: (1) unnecessary personal data is to be collected as part of the privacy campaign; (2) personal data is to be stored in a manner that fails to meet one or more particular legal requirements or best industry practices; (3) personal data is to be stored in a non-suitable location; (4) access to the personal data will be available to too many individuals or entities within the organization or external to the organization; and/or (5) any other potential issue or risk that may arise or that may have been identified based on a proposed collection and storage of personal data that makes up part of the privacy campaign.
Continuing to Step 340, the system determines, based at least in part on the one or more privacy impact assessments submitted by the organization or sub-group, information associated with privacy campaign data collected by the organization or sub-group. In various embodiments, the information associated with privacy campaign data collected by the organization or sub-group may include, for example: (1) a total number of privacy impact assessments performed or completed by the organization or sub-group; (2) a number of privacy campaigns undertaken or currently in effect that were initiated by the organization or sub-group; (3) an amount of personal data collected as part of those privacy campaigns; (4) a type of the personal data collected; (5) a volume of personal data transferred by the organization or sub-group (e.g., both within the organization or sub-group and externally to third parties, other sub-groups within the organization, etc.); and/or (6) any other suitable information related to privacy campaign data collected by the organization or sub-group or the organization or sub-group's other privacy campaign activities.
In various embodiments, the system is configured to substantially automatically track an amount of data received as part of any particular privacy campaign associated with the organization or sub-group as well as a volume of personal data transferred by the organization or sub-group. The system may, for example, track and store, in memory, a running total of privacy campaign data collected on behalf of a particular organization or subgroup. In such embodiments, the system may be configured to retrieve such information for use in determining the privacy maturity of the subgroup without having to determine the amount on-the-fly. The system may, for example, continuously track personal data collection and transfer in substantially real-time. In this way, the system may be configured to conserve processing power that would otherwise be required to determine a total amount of personal data collected and/or transferred at the time that the system is measuring privacy maturity. The system may be configured to store information related to the amount and volume of personal data collected and transferred in any suitable database. In particular embodiments, the system is further configured to associate, in memory, the collected and/or transferred data with a particular privacy campaign for which it was collected and/or transferred.
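The running-total bookkeeping described above, where amounts are accumulated as collection and transfer events arrive so that the maturity measurement can read a stored total rather than recompute it on-the-fly, might look like the following sketch. The class and method names are illustrative assumptions:

```python
from collections import defaultdict

# Sketch of per-campaign running totals of personal data collected and
# transferred; names and units are illustrative assumptions.
class CampaignDataLedger:
    def __init__(self):
        self.collected = defaultdict(int)    # e.g., records per campaign
        self.transferred = defaultdict(int)

    def record_collection(self, campaign_id, amount):
        # Called as data arrives, keeping the total current in real time.
        self.collected[campaign_id] += amount

    def record_transfer(self, campaign_id, amount):
        self.transferred[campaign_id] += amount

    def totals(self, campaign_id):
        """Read the stored totals without any recomputation."""
        return self.collected[campaign_id], self.transferred[campaign_id]
```

Because each event updates the stored total at the time it occurs, the maturity measurement step is a constant-time lookup, consistent with the processing-power rationale above.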
Continuing to Step 350, the system receives training data associated with one or more individuals associated with the organization or sub-group. In particular embodiments, the training data may include, for example, training data associated with any suitable member of the organization or sub-group. In various embodiments, the training data comprises training data associated with one or more privacy officers within the sub-group (e.g., one or more individuals tasked with overseeing one or more privacy campaigns undertaken by the organization or sub-group). In other embodiments, the training data comprises training data associated with any individual that has at least partially completed or performed a privacy impact assessment for the organization or sub-group. In still other embodiments, the training data includes training data for every member of a particular organization or sub-group (e.g., all employees within a Marketing business unit that have access to personal data collected as part of one or more privacy campaigns).
In various embodiments, the system is configured to retrieve the training data from memory. In such embodiments, the system is configured to maintain a database (e.g., in memory) of training data for each of a plurality of employees and other individuals associated with a particular organization or sub-group (e.g., business unit). In various embodiments, the system is configured to receive input, by any suitable individual, of updated training data for particular individuals associated with the organization or sub-group. For example, the system may receive a listing of one or more individuals who have received a particular privacy certification within the organization or sub-group.
In any embodiment described herein, the training data may include, for example: (1) a number or percentage of individuals within the organization or sub-group who have completed one or more privacy trainings (e.g., viewed one or more privacy videos, attended one or more privacy training sessions, etc.); (2) a number or percentage of individuals within the organization or sub-group who have completed one or more privacy-related quizzes or tests indicating knowledge of proper procedures relating to the collection and storage of personal data; (3) a number or percentage of individuals within the organization or sub-group who have attended one or more privacy events (e.g., internal privacy events within the organization or external privacy events put on by one or more third parties) such as a privacy conference or seminar; (4) a number or percentage of individuals within the organization or sub-group that are members of the International Association of Privacy Professionals (IAPP) or another privacy professional association; (5) a number or percentage of individuals within the organization or sub-group that hold one or more certifications related to privacy (e.g., CIPP certification, CIPT certification, CIPM certification, etc.), for example, through one or more privacy professional organizations (e.g., IAPP); (6) a number or percentage of individuals within the organization or sub-group that have received formal privacy training; (7) a number or percentage of individuals within the organization or sub-group that utilize one or more available mobile training modules or communication portals as part of a privacy campaign; and/or (8) any other suitable training data related to particular individuals' aptitude and training for following legal requirements and best industry practices related to the collection and storage of personal data and other privacy information.
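A minimal sketch of how the percentage-based training metrics above might be aggregated from per-member records; the record field names are hypothetical assumptions:

```python
def training_metrics(members):
    """Compute percentage-based training metrics from hypothetical
    per-member records (dicts with boolean fields)."""
    n = len(members)
    if n == 0:
        return {}
    fields = ["completed_training", "passed_quiz", "attended_event",
              "iapp_member", "holds_certification"]
    # Missing fields count as False (member has not completed that item).
    return {f: 100.0 * sum(1 for m in members if m.get(f)) / n for f in fields}

team = [
    {"completed_training": True, "passed_quiz": True, "iapp_member": False},
    {"completed_training": True, "passed_quiz": False, "iapp_member": True},
]
print(training_metrics(team)["completed_training"])  # 100.0
print(training_metrics(team)["passed_quiz"])         # 50.0
```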
In particular embodiments, as may be understood in light of this disclosure, the training data associated with particular individuals may impact a particular organization's privacy maturity in that a level of knowledge of the individuals involved with initiating and maintaining privacy campaigns for the organization may affect the organization's effectiveness and consistency in complying with legal and industry privacy requirements. For example, an individual with more training and experience may be less likely to mistakenly assign a particular type of personal data to be stored in an improper location than an individual with no training or experience.
Continuing to Step 360, the system generates and displays a privacy maturity report for the organization or sub-group based at least in part on the one or more identified issues or risks, the information associated with the privacy campaign data, and the training data. In various embodiments, the system is configured to display the privacy maturity report on a computing device via a suitable graphical user interface. In various embodiments, the privacy maturity report may be calculated using a suitable formula. In such embodiments, the system is configured to weigh the factors discussed above in any suitable manner. For example, the system may place higher weight on previous issues found in submitted privacy assessments by a particular organization when determining privacy maturity. In still other embodiments, the system may be configured to give a higher weight to a total amount of personal data handled by the organization (e.g., because, for example, an organization that has experience handling a large amount of personal data may be better equipped to follow legal and industry guidelines).
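The weighted combination described above may be sketched, for illustration, as follows; the factor names, scores, and weights are illustrative assumptions, not values from the disclosure:

```python
def privacy_maturity_score(factors, weights):
    """Weighted combination of normalized factor scores (each 0-100).
    Weights need not sum to 1; they are normalized here."""
    total_weight = sum(weights[k] for k in factors)
    return sum(factors[k] * weights[k] for k in factors) / total_weight

# Hypothetical factor scores and weights: prior assessment issues are
# weighted most heavily, as the text suggests.
factors = {"assessment_issues": 40.0, "data_volume": 80.0, "training": 90.0}
weights = {"assessment_issues": 0.5, "data_volume": 0.3, "training": 0.2}
score = privacy_maturity_score(factors, weights)
print(round(score, 1))  # 62.0
```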
In various embodiments, the system is configured to assign a privacy maturity score to the particular organization or sub-group. The privacy maturity score may enable users of the privacy compliance measurement system to compare the privacy maturity of one or more organizations based on their respective scores. In this way, for example, a particular organization may be able to determine which business groups within the organization require, for example: (1) additional training in privacy matters; (2) additional oversight of their associated privacy campaigns; (3) etc.
Although the Privacy Compliance Measurement Module 300 is described above in reference to generating a privacy maturity report in response to a request to measure privacy maturity, it should be understood that any embodiment of the system described herein may measure privacy maturity of particular business groups or other sub-groups within an organization substantially automatically. For example, in any embodiment described herein, the system may be configured to substantially automatically measure a particular group's privacy maturity according to a particular schedule (e.g., weekly, monthly, quarterly, annually, every certain number of years, and/or according to any other suitable review schedule). In particular embodiments, the system is configured to substantially automatically measure a particular group's privacy maturity based at least in part on a type of privacy campaign, project, or other activity that the group undertakes that relates to the collection and storage of personal data.
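A schedule-driven measurement trigger of the kind described above might, for example, be sketched as a simple due-date check; the period lengths in days are illustrative assumptions:

```python
from datetime import date, timedelta

def measurement_due(last_measured, schedule, today=None):
    """Return True if a scheduled privacy maturity measurement is due,
    given the date of the last measurement and a named review schedule."""
    periods = {"weekly": 7, "monthly": 30, "quarterly": 91, "annually": 365}
    today = today or date.today()
    return today - last_measured >= timedelta(days=periods[schedule])

print(measurement_due(date(2024, 1, 1), "monthly", today=date(2024, 2, 15)))   # True
print(measurement_due(date(2024, 1, 1), "annually", today=date(2024, 2, 15)))  # False
```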
Privacy Assessment Modification Module
In particular embodiments, a Privacy Assessment Modification Module 400 is configured to modify one or more aspects related to one or more privacy campaigns of a particular sub-group within an organization based at least in part on the sub-group's privacy maturity. For example, the system may, in various embodiments, initiate stricter review standards or oversight for those sub-groups with relatively low privacy maturity scores. In another example, the system may automatically generate or distribute electronic training materials to members of a sub-group with a low privacy maturity score.
In particular embodiments, modifying one or more aspects of one or more privacy campaigns of various sub-groups within an organization may conserve resources related to reviewing and analyzing privacy impact assessments and privacy campaigns for the organization as a whole. Turning to
Continuing to Step 420, the system modifies one or more privacy campaigns related to at least one of the one or more sub-groups based at least in part on the one or more privacy maturity reports. In particular embodiments, the system is configured to substantially automatically modify any suitable privacy assessment aspect related to the at least one of the one or more sub-groups. For example, a particular privacy campaign initiated by the at least one sub-group may include a particular privacy audit schedule (e.g., weekly, monthly, annually, etc.). In various embodiments, the privacy audit may be substantially automatically performed by a system according to the schedule in order to review the personal data collection and storage procedures utilized by the particular privacy campaign.
In this example, the system may be configured to modify the audit schedule based on the one or more privacy maturity reports by increasing the frequency of the privacy audits in response to determining that the at least one sub-group has a lower privacy maturity score than other sub-groups within the organization. In this way, the system may allocate limited resources (e.g., computing resources) to auditing and assessing those privacy campaigns that have been initiated and are maintained and run by sub-groups within the organization that are more likely to have an issue or run afoul of one or more legal requirements or best industry practices relating to personal data collection and storage.
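One possible sketch of the audit-frequency adjustment described above, assuming a simple policy of halving the audit interval for groups scoring below the peer average (the halving rule and weekly floor are assumptions, not stated in the disclosure):

```python
def adjusted_audit_interval_days(base_interval_days, group_score, peer_scores):
    """Shorten the audit interval for sub-groups scoring below the
    average privacy maturity of their peers."""
    avg = sum(peer_scores) / len(peer_scores)
    if group_score < avg:
        # Audit twice as often, but no more often than weekly.
        return max(7, base_interval_days // 2)
    return base_interval_days

print(adjusted_audit_interval_days(90, group_score=55, peer_scores=[70, 80, 90]))  # 45
print(adjusted_audit_interval_days(90, group_score=85, peer_scores=[70, 80, 90]))  # 90
```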
In particular embodiments, the system is configured to substantially automatically trigger a review of at least one aspect of at least one privacy campaign associated with at least one sub-group within the organization based on the one or more privacy maturity reports. For example, the system may determine, based on the one or more privacy maturity reports, that the at least one sub-group has a large number of associated privacy campaigns and a high privacy maturity score. However, the system may further determine that the at least one sub-group, in its privacy campaigns, has a large crossover of personal data stored as part of the various privacy campaigns, and often utilizes data storage techniques that exceed legal and industry requirements. In such embodiments, the system may substantially automatically modify one or more aspects of the privacy campaigns to meet but not exceed a particular legal or industry standard. In such embodiments, the system may, for example, be configured to: (1) limit redundancy of stored data (e.g., which may conserve memory) across privacy campaigns that collect similar data; (2) eliminate unnecessary data permission limitations; and/or (3) take any other action which may limit privacy campaign data recall times, storage size, transfer time, etc.
Exemplary User Experience
In exemplary embodiments of a privacy compliance measurement system, a user may access a privacy compliance system, for example: (1) to initiate a new privacy campaign; (2) to perform or complete a privacy impact assessment; (3) to review one or more privacy maturity reports; (4) to provide one or more metrics to customize a determination of privacy maturity; and/or (5) to take any other action related to the privacy compliance system. For example, a user that is part of a particular business group within an organization (e.g., an IT group) may access the system to initiate a privacy impact assessment that the system may later use as part of a determination regarding a privacy maturity of the particular business group.
The one or more GUIs may enable the individual to, for example, provide information such as: (1) a description of the campaign; (2) the personal data to be collected as part of the campaign; (3) who the personal data relates to; (4) where the personal data is to be stored; and (5) who will have access to the indicated personal data, etc. Various embodiments of a system for implementing and auditing a privacy campaign are described in U.S. patent application Ser. No. 15/169,643, filed May 31, 2016 entitled “Data Processing Systems and Methods for Operationalizing Privacy Compliance and Assessing the Risk of Various Respective Privacy Campaigns”, which is hereby incorporated by reference herein in its entirety. In particular embodiments, the system is further configured to provide access to a privacy compliance measurement system via one or more GUIs that enable the user to view and compare privacy maturity data for one or more business groups within an organization. These exemplary screen displays and user experiences according to particular embodiments are described more fully below.
A.
As shown in
At any point, a user assigned as the owner may also assign others the task of selecting or answering any question related to the campaign. The user may also enter one or more tag words associated with the campaign in the Tags field 830. After entry, the tag words may be used to search for campaigns, or used to filter for campaigns (for example, under Filters 845). The user may assign a due date for completing the campaign entry, and turn reminders for the campaign on or off. The user may save and continue, or assign and close.
In example embodiments, some of the fields may be filled in by a user, with suggest-as-you-type display of possible field entries (e.g., Business Group field 815), and/or may include the ability for the user to select items from a drop-down selector (e.g., drop-down selectors 840a, 840b, 840c). The system may also allow some fields to stay hidden or unmodifiable to certain designated viewers or categories of users. For example, the purpose behind a campaign may be hidden from anyone who is not the chief privacy officer of the company, or the retention schedule may be configured so that it cannot be modified by anyone outside of the organization's legal department.
In various embodiments, when initiating a new privacy campaign, project, or other activity (e.g., or modifying an existing one), the user associated with the organization may set a Due Date 835 that corresponds to a date by which the privacy campaign needs to be approved by a third-party regulator (e.g., such that the campaign may be approved prior to launching the campaign externally and/or beginning to collect data as part of the campaign). In various embodiments, the system may limit the proximity of a requested Due Date 835 to a current date based on a current availability of third-party regulators and/or whether the user has requested expedited review of the particular privacy campaign.
B.
Moving to
In this example, if John selects the hyperlink Privacy Portal 910, he is able to access the system, which displays a landing page 915. The landing page 915 displays a Getting Started section 920 to familiarize new owners with the system, and also displays an “About This Data Flow” section 930 showing overview information for the campaign. As may be understood in light of this disclosure, in response to John Doe accessing the Privacy Portal 910 for the particular privacy campaign, the system may collect, receive, or otherwise retrieve training data associated with John Doe. In various embodiments, because John Doe may be contributing to the privacy campaign by providing information about various aspects of the privacy campaign, John Doe's training data may become relevant to the privacy maturity of the particular campaign as well as to the organization responsible for the campaign.
C.
As shown in
For example, in
D.
As shown in the example of
E.
In various embodiments, the system also allows the user to select whether the destination settings are applicable to all the personal data of the campaign, or just select data (and if so, which data). As shown in
In particular embodiments, the system is configured to prompt the user to provide additional information when indicating where particular sensitive information is to be stored as part of the particular privacy campaign. For example, where the user is part of a business group with a relatively low maturity score, the system may be configured to prompt the user to provide additional information regarding where, how, and how long personal data will be stored as part of the privacy campaign. In some embodiments, the system may automatically generate recommendations to store the personal data in a location other than a location initially entered by the user.
F.
G.
After new campaigns have been added, for example using the exemplary processes explained in regard to
Still referring to
The inventory page 1500 may also display the status of each campaign, as indicated in column heading Status 1515. Exemplary statuses may include “Pending Review,” which means the campaign has not yet been approved; “Approved,” meaning the personal data associated with that campaign has been approved; “Audit Needed,” which may indicate that a privacy audit of the personal data associated with the campaign is needed; and “Action Required,” meaning that one or more individuals associated with the campaign must take some kind of action related to the campaign (e.g., completing missing information, responding to an outstanding message, etc.). In certain embodiments, the approval status of the various campaigns relates to approval by one or more third-party regulators as described herein.
The inventory page 1500 of
The inventory page 1500 of
On the inventory page 1500, the Access heading 1530 may show the number of transfers that the personal data associated with a campaign has undergone. This may, for example, indicate how many times the data has been accessed by one or more authorized individuals or systems.
The column with the heading Audit 1535 shows the status of any privacy audits associated with the campaign. Privacy audits may be pending, in which case an audit has been initiated but has yet to be completed. The audit column may also show, for the associated campaign, how many days have passed since a privacy audit was last conducted for that campaign (e.g., 140 days, 360 days). If no audit for a campaign is currently required, an "OK" or some other type of indication of compliance (e.g., a "thumbs up" indicia) may be displayed for that campaign's audit status. The audit status, in various embodiments, may refer to whether the privacy campaign has been audited by a third-party regulator or other regulator as required by law or industry practice or guidelines. As discussed above, in any embodiment described herein, the system may be configured to substantially automatically adjust an audit schedule for one or more privacy campaigns associated with a particular organization based at least in part on that organization's privacy maturity.
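The audit-status indicia described above might be derived, for example, as follows; the status strings mirror the examples in the text, but the function and its parameters are otherwise illustrative:

```python
def audit_status(days_since_audit, interval_days, pending=False):
    """Map a campaign's audit state to a display indicia:
    an in-progress audit, an overdue (or never performed) audit,
    or a compliant campaign."""
    if pending:
        return "Pending"
    if days_since_audit is None or days_since_audit >= interval_days:
        return "Audit Needed"
    return "OK"

print(audit_status(140, interval_days=365))                # OK
print(audit_status(400, interval_days=365))                # Audit Needed
print(audit_status(10, interval_days=90, pending=True))    # Pending
```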
The example inventory page 1500 may comprise a filter tool, indicated by Filters 1545, to display only the campaigns having certain information associated with them. For example, as shown in
From example inventory page 1500, a user may also add a campaign by selecting (i.e., clicking on) Add Data Flow 1555. Once this selection has been made, the system initiates a routine (e.g., a wizard) to guide the user in a phase-by-phase manner through the process of creating a new campaign. An example of the multi-phase GUIs in which campaign data associated with the added privacy campaign may be input and associated with the privacy campaign record is described in
From the example inventory page 1500, a user may view the information associated with each campaign in more detail, or edit the information associated with each campaign. To do this, the user may, for example, click on or select the name of the campaign (i.e., click on Internet Usage History 1510). As another example, the user may select a button displayed on the screen indicating that the campaign data is editable (e.g., edit button 1560).
H.
As shown in
In particular embodiments, the privacy maturity report overview 1600 includes an indication of whether the system has taken action relating to a particular business group's privacy maturity, via an “Action Taken?” 1620 indicia. As may be understood from this figure and this disclosure, the system may be configured to substantially automatically modify one or more aspects of one or more particular privacy campaigns based at least in part on the privacy maturity of a particular business group associated with the one or more privacy campaigns. In such embodiments, the system is configured to notify one or more individuals, via the privacy maturity report overview 1600 interface, that action has been taken. The one or more individuals may then, for example, request more detail about the privacy maturity report and the action taken by selecting a suitable “More Detail” indicia 1625.
I.
Various embodiments of the privacy compliance measurement systems described herein may include features in addition to those described above. Exemplary alternative embodiments are described below.
Automatic Implementation of Privacy Campaign, Project, or Other Activity for Business Groups with High Privacy Maturity
In embodiments in which a privacy campaign (e.g., or project or other activity) requires third-party, privacy office, or other approval prior to implementation, the system may be configured to substantially automatically implement the privacy campaign in response to determining that the privacy maturity score of the business group initiating the privacy campaign exceeds a particular threshold. For example, in response to determining that a business group has a privacy maturity score above a certain threshold amount, the system may be configured to automatically initiate a new privacy campaign for the business group by beginning to collect the personal data as directed by the campaign. In this way, the system may be configured to conserve computing and other resources by avoiding a full audit of a privacy campaign prior to initiation for those business groups that are unlikely to initiate a privacy campaign that includes one or more issues or risks (e.g., because the business group has a relatively high privacy maturity).
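The threshold-based routing described above may be sketched as follows; the threshold value of 75 is an illustrative assumption:

```python
def approval_route(maturity_score, threshold=75.0):
    """Route a new privacy campaign: auto-implement for groups whose
    privacy maturity score exceeds the threshold, otherwise require a
    full review/audit before implementation."""
    return "auto-implement" if maturity_score > threshold else "full-review"

print(approval_route(82.0))  # auto-implement
print(approval_route(60.0))  # full-review
```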
Automatic Modification and Flagging of One or More Privacy Campaigns in Response to Determination that a Particular Business Group has Low Privacy Maturity
In particular embodiments, such as those described above, the system may determine that a particular business group or other sub-group of an organization has a privacy maturity score below a threshold amount. In such embodiments, the system may be configured to substantially automatically modify one or more privacy campaigns associated with the particular business group to, for example: (1) increase a level of encryption used on stored personal data; (2) further limit access to stored personal data; (3) decrease an amount of time for which personal data is stored; and/or (4) take any other suitable additional precaution with respect to the personal data collected and stored as part of the one or more privacy campaigns to reduce a likelihood that the campaign may run afoul of one or more legal or industry standards for the collection and storage of personal data.
In various embodiments, in addition to automatically modifying the one or more privacy campaigns, the system may automatically flag the one or more privacy campaigns for further review or audit. In such embodiments, the system may be configured to: (1) maintain the modifications until the audit or review occurs; (2) receive an indication that the audit or review has been completed and the original aspects of the one or more privacy campaigns were sufficient to meet or exceed relevant legal or industry standards; and (3) in response to receiving the indication, revert the modified aspects of the one or more privacy campaigns to their defaults.
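A sketch of the modify-flag-revert behavior described above; the settings keys, values, and class name are hypothetical:

```python
class CampaignSafeguards:
    """Tracks precautionary modifications applied to a flagged campaign so
    they can be reverted once review confirms the original settings sufficed."""

    def __init__(self, settings):
        self.settings = dict(settings)
        self._saved = None
        self.flagged_for_review = False

    def apply_precautions(self, overrides):
        self._saved = dict(self.settings)   # remember defaults for reversion
        self.settings.update(overrides)
        self.flagged_for_review = True      # maintain until audit/review occurs

    def review_completed(self, original_sufficient):
        if original_sufficient and self._saved is not None:
            self.settings = self._saved     # revert to the original aspects
        self._saved = None
        self.flagged_for_review = False

c = CampaignSafeguards({"encryption": "AES-128", "retention_days": 365})
c.apply_precautions({"encryption": "AES-256", "retention_days": 90})
c.review_completed(original_sufficient=True)
print(c.settings)  # {'encryption': 'AES-128', 'retention_days': 365}
```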
Privacy Maturity Determination Based on Public Record and Other Data Scans
In any embodiment of the system described herein, a particular organization may undertake one or more privacy campaigns, processing activities, or other activities that collect, store, and otherwise process data (e.g., personal data). These processing activities may be undertaken by particular groups within the particular organization (e.g., sub-groups, groups of individuals, etc.).
In light of the above, the system may use one or more techniques described herein, in combination with one or more additional suitable factors, to electronically calculate or otherwise determine a privacy maturity of a particular group of individuals within an organization (e.g., a business unit or other sub-group).
In various embodiments, the system may, for example, determine a privacy maturity of a particular group of individuals based on any technique described herein in addition to, for example: (1) the nature of the sensitive information collected as part of a processing activity undertaken by the group of individuals; (2) the location in which the information is stored as part of such a processing activity (e.g., as part of a piece of computer software published by the group of individuals); (3) the number of individuals who have access to the information collected and/or stored by such a processing activity; (4) the length of time that the data will be stored; (5) the individuals whose sensitive information will be stored; (6) the country of residence of the individuals whose sensitive information will be stored; and/or (7) any other suitable factor related to the collection, processing, and/or storage of data (e.g., personal data) by any processing activity undertaken by the group of individuals.
In particular embodiments, the system may, for example, be configured to calculate a privacy maturity score for a particular individual, group of individuals, department, etc. within an organization. The system may then, in various embodiments, use the privacy maturity score in order to calculate a risk rating or other risk score for a particular piece of software or other service initiated by the group of individuals.
In various embodiments, the system may, for example: (1) analyze one or more pieces of publicly available data associated with the one or more individuals that make up the group for which privacy maturity is being evaluated; and (2) calculate the privacy maturity score for the group (e.g., business unit) based on the analyzed one or more pieces of publicly available data. In particular embodiments, the system is configured to analyze one or more of the group's published software applications available to one or more customers to detect one or more privacy disclaimers associated with the published applications. The system may then, for example, be configured to use one or more text matching techniques to determine whether the one or more privacy disclaimers contain one or more pieces of language required by one or more prevailing industry or legal requirements related to data privacy. The system may, for example, be configured to assign a relatively high privacy maturity score to a group whose published software includes required privacy disclaimers, and to assign a relatively low privacy maturity score to a group whose software does not include such disclaimers.
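A minimal sketch of the text-matching check described above, assuming a hypothetical list of required disclaimer phrases:

```python
import re

REQUIRED_PHRASES = [          # hypothetical required disclaimer language
    "personal data",
    "right to erasure",
    "data protection officer",
]

def disclaimer_covers_requirements(disclaimer_text):
    """Simple text-matching check: does the disclaimer contain every
    required phrase (case-insensitive)?"""
    text = disclaimer_text.lower()
    return all(re.search(re.escape(p), text) for p in REQUIRED_PHRASES)

good = ("We process Personal Data lawfully; contact our Data Protection "
        "Officer to exercise your right to erasure.")
print(disclaimer_covers_requirements(good))        # True
print(disclaimer_covers_requirements("No info."))  # False
```

A production scan would likely use more robust matching (stemming, multilingual phrase lists), but the pass/fail result would feed the maturity score in the same way.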
In another example, the system may be configured to analyze one or more websites associated with the group (e.g., one or more websites that host one or more pieces of computer code or software made available by the group) for one or more privacy notices, one or more blog posts, one or more preference centers, and/or one or more control centers. The system may, for example, calculate the privacy maturity score based at least in part on a presence of one or more suitable privacy notices, one or more contents of one or more blog posts on the group site (e.g., whether the group site has one or more blog posts directed toward user privacy), a presence of one or more preference or control centers that enable visitors to the site to opt in or out of certain data collection policies (e.g., cookie policies, etc.), etc.
In particular other embodiments, the system may be configured to determine whether the particular group (e.g., or any of its members) holds one or more security certifications. The one or more security certifications may include, for example: (1) system and organization control (SOC); (2) International Organization for Standardization (ISO); (3) Health Insurance Portability and Accountability Act (HIPAA); (4) etc. In various embodiments, the system is configured to access one or more public databases of security certifications to determine whether the particular group or any suitable number of its individual members holds any particular certification. The system may then determine the privacy maturity score based on whether the group or its members hold one or more security certifications (e.g., the system may calculate a relatively higher score depending on one or more particular security certifications held by members of the group or business unit). The system may be further configured to scan a group website for an indication of the one or more security certifications.
In various embodiments, the system may be further configured to analyze one or more credit bureau databases, one or more government or industry certification body databases, one or more vendor membership databases, or other suitable databases to determine whether the particular group belongs to or is associated with one or more organizations that may indicate a particular awareness and attention to one or more privacy issues (e.g., one or more issues related to the collection, storage, and/or processing of personal data).
In still other embodiments, the system is configured to analyze one or more social networking sites (e.g., LinkedIn, Facebook, etc.) and/or one or more business-related job sites (e.g., one or more job-posting sites, one or more corporate websites, etc.). The system may, for example, use social networking and other data to identify one or more employee titles of the business unit, one or more job roles for one or more employees in the group, one or more job postings for the business unit (e.g., group), etc. The system may then analyze the one or more job titles, postings, listings, roles, etc. to determine whether the group has or is seeking one or more employees that have a role associated with data privacy or other privacy concerns. In this way, the system may determine whether the group is particularly focused on privacy or other related activities. The system may then calculate a privacy maturity score based on such a determination (e.g., a group that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy maturity score).
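The job-title analysis described above might, for example, reduce to a keyword scan over scraped titles; the keyword list is an illustrative assumption:

```python
PRIVACY_KEYWORDS = {"privacy", "data protection", "dpo", "cipp"}  # illustrative

def has_privacy_roles(job_titles):
    """Return True if any scraped job title or posting suggests a
    privacy-focused role within the group."""
    return any(kw in title.lower()
               for title in job_titles
               for kw in PRIVACY_KEYWORDS)

titles = ["Marketing Analyst", "Data Protection Officer", "Sales Lead"]
print(has_privacy_roles(titles))          # True
print(has_privacy_roles(["Accountant"]))  # False
```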
In particular embodiments, the system may be configured to calculate the privacy maturity score using one or more additional factors such as, for example: (1) public information associated with one or more events that the group or its members may be attending; (2) public information associated with one or more conferences that the group or its members have participated in or are planning to participate in; (3) etc. In some embodiments, the system may calculate a privacy maturity score based at least in part on one or more government relationships with the group. For example, the system may be configured to calculate a relatively high privacy maturity score for a group that has one or more contracts with one or more government entities (e.g., because an existence of such a contract may indicate that the group has passed one or more vetting requirements imposed by the one or more government entities).
In any embodiment described herein, the system may be configured to assign, identify, and/or determine a weighting factor for each of a plurality of factors used to determine a privacy maturity score for a particular entity, group, organization, or plurality of individuals. For example, when calculating the privacy maturity score, the system may assign a first weighting factor to whether the group has one or more suitable privacy notices posted on the group's website, a second weighting factor to whether the group has one or more particular security certifications, etc. The system may, for example, assign one or more weighting factors using any suitable technique described herein with relation to risk rating determination. In some embodiments, the system may be configured to receive the one or more weighting factors (e.g., from a user). In other embodiments, the system may be configured to determine the one or more weighting factors based at least in part on a type of the factor.
In any embodiment described herein, the system may be configured to determine an overall risk rating for a particular group (e.g., a particular piece of vendor software) based in part on the privacy maturity score. In other embodiments, the system may be configured to determine an overall risk rating for a particular group based on the privacy maturity score in combination with one or more additional factors (e.g., one or more additional risk factors described herein). In any such embodiment, the system may assign one or more weighting factors or relative risk ratings to each of the privacy maturity score and the other risk factors when calculating the overall risk rating. The system may then be configured to provide the overall risk rating for the group, software, and/or service for use in calculating a risk of undertaking any particular processing activity that the group may undertake (e.g., in any suitable manner described herein).
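The combination step above can be sketched as follows. This is one plausible blending scheme under stated assumptions (higher maturity implies lower risk; all quantities on a 0-100 scale); the weights, the inversion, and the function name are illustrative, not the specification's method.

```python
# Hypothetical blend of the privacy maturity score with other weighted risk
# factors into an overall risk rating (0-100; higher means riskier).
def overall_risk_rating(maturity_score, risk_factors, maturity_weight=0.4):
    """risk_factors is a list of (risk_value, weight) pairs.

    The maturity score is inverted (high maturity -> low risk) and blended
    with the weighted average of the remaining risk factors.
    """
    maturity_risk = 100.0 - maturity_score
    if risk_factors:
        other = sum(v * w for v, w in risk_factors) / sum(w for _, w in risk_factors)
    else:
        other = 0.0
        maturity_weight = 1.0  # no other factors: maturity drives the rating
    return maturity_weight * maturity_risk + (1 - maturity_weight) * other

# A fairly mature vendor (score 80) with two moderate additional risk factors:
rating = overall_risk_rating(80.0, [(60.0, 1.0), (30.0, 2.0)])  # 32.0
```

The resulting rating could then feed into the per-processing-activity risk calculations the specification references elsewhere.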
Although embodiments above are described in reference to various privacy compliance measurement systems, it should be understood that various aspects of the system described above may be applicable to other privacy-related systems, or to other types of systems, in general.
While this specification contains many specific embodiment details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products.
Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. While examples discussed above cover the use of various embodiments in the context of operationalizing privacy compliance and assessing risk of privacy campaigns, various embodiments may be used in any other suitable context. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for the purposes of limitation.
This application is a continuation of U.S. patent application Ser. No. 16/042,642, filed Jul. 23, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/619,251, filed Jun. 9, 2017, now U.S. Pat. No. 10,032,172, issued Jul. 24, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/254,901, filed Sep. 1, 2016, now U.S. Pat. No. 9,729,583, issued Aug. 8, 2017, which claims priority to U.S. Provisional Patent Application Ser. No. 62/360,123, filed Jul. 8, 2016; U.S. Provisional Patent Application Ser. No. 62/353,802, filed Jun. 23, 2016; and U.S. Provisional Patent Application Ser. No. 62/348,695, filed Jun. 10, 2016; the disclosures of all of the above-referenced patent applications are hereby incorporated herein by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
5193162 | Bordsen et al. | Mar 1993 | A |
5276735 | Boebert et al. | Jan 1994 | A |
5535393 | Reeve et al. | Jul 1996 | A |
5560005 | Hoover et al. | Sep 1996 | A |
6122627 | Carey et al. | Sep 2000 | A |
6148342 | Ho | Nov 2000 | A |
6253203 | Oflaherty et al. | Jun 2001 | B1 |
6272631 | Thomlinson et al. | Aug 2001 | B1 |
6275824 | Oflaherty et al. | Aug 2001 | B1 |
6374252 | Althoff et al. | Apr 2002 | B1 |
6442688 | Moses et al. | Aug 2002 | B1 |
6606744 | Mikurak | Aug 2003 | B1 |
6611812 | Hurtado et al. | Aug 2003 | B2 |
6625602 | Meredith et al. | Sep 2003 | B1 |
6662192 | Rebane | Dec 2003 | B1 |
6757888 | Knutson et al. | Jun 2004 | B1 |
6816944 | Peng | Nov 2004 | B2 |
6826693 | Yoshida et al. | Nov 2004 | B1 |
6904417 | Clayton et al. | Jun 2005 | B2 |
6925443 | Baggett, Jr. et al. | Aug 2005 | B1 |
6938041 | Brandow et al. | Aug 2005 | B1 |
6983221 | Tracy | Jan 2006 | B2 |
6985887 | Sunstein et al. | Jan 2006 | B1 |
6993495 | Smith, Jr. et al. | Jan 2006 | B2 |
7013290 | Ananian | Mar 2006 | B2 |
7017105 | Flanagin et al. | Mar 2006 | B2 |
7039654 | Eder | May 2006 | B1 |
7047517 | Brown et al. | May 2006 | B1 |
7051036 | Rosnow et al. | May 2006 | B2 |
7069427 | Adler et al. | Jun 2006 | B2 |
7127741 | Bandini et al. | Oct 2006 | B2 |
7139999 | Bowman-Amuah | Nov 2006 | B2 |
7171379 | Menninger et al. | Jan 2007 | B2 |
7181438 | Szabo | Feb 2007 | B1 |
7203929 | Vinodkrishnan et al. | Apr 2007 | B1 |
7213233 | Vinodkrishnan et al. | May 2007 | B1 |
7216340 | Vinodkrishnan et al. | May 2007 | B1 |
7223234 | Stupp et al. | May 2007 | B2 |
7234065 | Breslin et al. | Jun 2007 | B2 |
7251624 | Lee et al. | Jul 2007 | B1 |
7260830 | Sugimoto | Aug 2007 | B2 |
7287280 | Young | Oct 2007 | B2 |
7313575 | Carr et al. | Dec 2007 | B2 |
7313699 | Koga | Dec 2007 | B2 |
7353204 | Liu | Apr 2008 | B2 |
7370025 | Pandit | May 2008 | B1 |
7391854 | Salonen et al. | Jun 2008 | B2 |
7401235 | Mowers et al. | Jul 2008 | B2 |
7403942 | Bayliss | Jul 2008 | B1 |
7412402 | Cooper | Aug 2008 | B2 |
7478157 | Bohrer et al. | Jan 2009 | B2 |
7512987 | Williams | Mar 2009 | B2 |
7516882 | Cucinotta | Apr 2009 | B2 |
7523053 | Pudhukottai et al. | Apr 2009 | B2 |
7548968 | Bura et al. | Jun 2009 | B1 |
7552480 | Voss | Jun 2009 | B1 |
7584505 | Mondri et al. | Sep 2009 | B2 |
7590972 | Axelrod et al. | Sep 2009 | B2 |
7603356 | Schran et al. | Oct 2009 | B2 |
7606790 | Levy | Oct 2009 | B2 |
7613700 | Lobo et al. | Nov 2009 | B1 |
7620644 | Cote et al. | Nov 2009 | B2 |
7630874 | Fables et al. | Dec 2009 | B2 |
7630998 | Zhou et al. | Dec 2009 | B2 |
7636742 | Olavarrieta et al. | Dec 2009 | B1 |
7653592 | Flaxman et al. | Jan 2010 | B1 |
7665073 | Meijer et al. | Feb 2010 | B2 |
7668947 | Hutchinson et al. | Feb 2010 | B2 |
7673282 | Amaru et al. | Mar 2010 | B2 |
7685561 | Deem et al. | Mar 2010 | B2 |
7685577 | Pace et al. | Mar 2010 | B2 |
7693593 | Ishibashi et al. | Apr 2010 | B2 |
7707224 | Chastagnol et al. | Apr 2010 | B2 |
7716242 | Pae et al. | May 2010 | B2 |
7729940 | Harvey et al. | Jun 2010 | B2 |
7730142 | Levasseur et al. | Jun 2010 | B2 |
7752124 | Green et al. | Jul 2010 | B2 |
7756987 | Wang et al. | Jul 2010 | B2 |
7774745 | Fildebrandt et al. | Aug 2010 | B2 |
7788212 | Beckmann et al. | Aug 2010 | B2 |
7788632 | Kuester et al. | Aug 2010 | B2 |
7801758 | Gracie et al. | Sep 2010 | B2 |
7853468 | Callahan et al. | Dec 2010 | B2 |
7853470 | Sonnleithner et al. | Dec 2010 | B2 |
7870540 | Zare et al. | Jan 2011 | B2 |
7870608 | Shraim et al. | Jan 2011 | B2 |
7873541 | Klar et al. | Jan 2011 | B1 |
7877327 | Gwiazda et al. | Jan 2011 | B2 |
7877812 | Koved et al. | Jan 2011 | B2 |
7885841 | King | Feb 2011 | B2 |
7917963 | Goyal et al. | Mar 2011 | B2 |
7958494 | Chaar et al. | Jun 2011 | B2 |
7966310 | Sullivan et al. | Jun 2011 | B2 |
7966663 | Strickland et al. | Jun 2011 | B2 |
7991559 | Dzekunov et al. | Aug 2011 | B2 |
8019881 | Sandhu et al. | Sep 2011 | B2 |
8032721 | Murai | Oct 2011 | B2 |
8037409 | Jacob et al. | Oct 2011 | B2 |
8041913 | Wang | Oct 2011 | B2 |
8069161 | Bugir et al. | Nov 2011 | B2 |
8146074 | Ito et al. | Mar 2012 | B2 |
8150717 | Whitmore | Apr 2012 | B2 |
8156158 | Rolls et al. | Apr 2012 | B2 |
8176177 | Sussman et al. | May 2012 | B2 |
8176334 | Vainstein | May 2012 | B2 |
8180759 | Hamzy | May 2012 | B2 |
8239244 | Ginsberg et al. | Aug 2012 | B2 |
8250051 | Bugir et al. | Aug 2012 | B2 |
8286239 | Sutton | Oct 2012 | B1 |
8364713 | Pollard | Jan 2013 | B2 |
8381180 | Rostoker | Feb 2013 | B2 |
8418226 | Gardner | Apr 2013 | B2 |
8423954 | Ronen et al. | Apr 2013 | B2 |
8429597 | Prigge | Apr 2013 | B2 |
8429758 | Chen et al. | Apr 2013 | B2 |
8438644 | Watters et al. | May 2013 | B2 |
8494894 | Jaster et al. | Jul 2013 | B2 |
8504481 | Motahari et al. | Aug 2013 | B2 |
8516076 | Thomas | Aug 2013 | B2 |
8578036 | Holfelder et al. | Nov 2013 | B1 |
8578166 | De Monseignat et al. | Nov 2013 | B2 |
8578481 | Rowley | Nov 2013 | B2 |
8583694 | Siegel et al. | Nov 2013 | B2 |
8589183 | Awaraji et al. | Nov 2013 | B2 |
8601591 | Krishnamurthy et al. | Dec 2013 | B2 |
8612420 | Sun et al. | Dec 2013 | B2 |
8612993 | Grant et al. | Dec 2013 | B2 |
8620952 | Bennett et al. | Dec 2013 | B2 |
8621637 | Al-Harbi et al. | Dec 2013 | B2 |
8627114 | Resch et al. | Jan 2014 | B2 |
8640110 | Kopp et al. | Jan 2014 | B2 |
8681984 | Lee et al. | Mar 2014 | B2 |
8683502 | Shkedi et al. | Mar 2014 | B2 |
8688601 | Jaiswal | Apr 2014 | B2 |
8706742 | Ravid et al. | Apr 2014 | B1 |
8712813 | King | Apr 2014 | B2 |
8744894 | Christiansen | Jun 2014 | B2 |
8763071 | Sinha et al. | Jun 2014 | B2 |
8767947 | Ristock et al. | Jul 2014 | B1 |
8805707 | Schumann, Jr. et al. | Aug 2014 | B2 |
8805925 | Price et al. | Aug 2014 | B2 |
8812342 | Barcelo et al. | Aug 2014 | B2 |
8819253 | Simeloff et al. | Aug 2014 | B2 |
8819617 | Koenig et al. | Aug 2014 | B1 |
8826446 | Liu et al. | Sep 2014 | B1 |
8839232 | Taylor et al. | Sep 2014 | B2 |
8843487 | McGraw et al. | Sep 2014 | B2 |
8893286 | Oliver | Nov 2014 | B1 |
8914263 | Shimada et al. | Dec 2014 | B2 |
8914299 | Pesci-Anderson | Dec 2014 | B2 |
8914342 | Kalaboukis et al. | Dec 2014 | B2 |
8918392 | Brooker et al. | Dec 2014 | B1 |
8918632 | Sartor | Dec 2014 | B1 |
8930896 | Wiggins | Jan 2015 | B1 |
8935266 | Wu | Jan 2015 | B2 |
8935804 | Clark et al. | Jan 2015 | B1 |
8943076 | Stewart et al. | Jan 2015 | B2 |
8959584 | Piliouras | Feb 2015 | B2 |
8966575 | McQuay et al. | Feb 2015 | B2 |
8977234 | Chava | Mar 2015 | B2 |
8983972 | Kriebel et al. | Mar 2015 | B2 |
8990933 | Magdalin | Mar 2015 | B1 |
8997213 | Papakipos et al. | Mar 2015 | B2 |
9003295 | Baschy | Apr 2015 | B2 |
9043217 | Cashman et al. | May 2015 | B2 |
9047463 | Porras | Jun 2015 | B2 |
9047582 | Hutchinson et al. | Jun 2015 | B2 |
9069940 | Hars | Jun 2015 | B2 |
9092796 | Eversoll et al. | Jul 2015 | B2 |
9094434 | Williams et al. | Jul 2015 | B2 |
9098515 | Richter et al. | Aug 2015 | B2 |
9100778 | Stogaitis et al. | Aug 2015 | B2 |
9111295 | Tietzen et al. | Aug 2015 | B2 |
9129311 | Schoen et al. | Sep 2015 | B2 |
9135261 | Maunder et al. | Sep 2015 | B2 |
9152820 | Pauley, Jr. et al. | Oct 2015 | B1 |
9158655 | Wadhwani et al. | Oct 2015 | B2 |
9172706 | Krishnamurthy et al. | Oct 2015 | B2 |
9178901 | Xue et al. | Nov 2015 | B2 |
9202085 | Mawdsley et al. | Dec 2015 | B2 |
9215252 | Smith et al. | Dec 2015 | B2 |
9232040 | Barash et al. | Jan 2016 | B2 |
9235476 | McHugh et al. | Jan 2016 | B2 |
9241259 | Daniela et al. | Jan 2016 | B2 |
9245126 | Christodorescu et al. | Jan 2016 | B2 |
9286282 | Ling, III et al. | Mar 2016 | B2 |
9288118 | Pattan | Mar 2016 | B1 |
9317715 | Schuette et al. | Apr 2016 | B2 |
9336332 | Davis et al. | May 2016 | B2 |
9336400 | Milman et al. | May 2016 | B2 |
9338188 | Ahn | May 2016 | B1 |
9344424 | Tenenboym et al. | May 2016 | B2 |
9348802 | Massand | May 2016 | B2 |
9355157 | Mohammed et al. | May 2016 | B2 |
9356961 | Todd | May 2016 | B1 |
9369488 | Woods et al. | Jun 2016 | B2 |
9384199 | Thereska et al. | Jul 2016 | B2 |
9384357 | Patil et al. | Jul 2016 | B2 |
9386104 | Adams et al. | Jul 2016 | B2 |
9396332 | Abrams et al. | Jul 2016 | B2 |
9401900 | Levasseur et al. | Jul 2016 | B2 |
9411982 | Dippenaar et al. | Aug 2016 | B1 |
9424021 | Zamir | Aug 2016 | B2 |
9462009 | Kolman et al. | Oct 2016 | B1 |
9465800 | Lacey | Oct 2016 | B2 |
9477523 | Warman et al. | Oct 2016 | B1 |
9477660 | Scott et al. | Oct 2016 | B2 |
9477942 | Adachi et al. | Oct 2016 | B2 |
9483659 | Bao et al. | Nov 2016 | B2 |
9507960 | Bell et al. | Nov 2016 | B2 |
9521166 | Wilson | Dec 2016 | B2 |
9549047 | Fredinburg et al. | Jan 2017 | B1 |
9552395 | Bayer et al. | Jan 2017 | B2 |
9558497 | Carvalho | Jan 2017 | B2 |
9571509 | Satish et al. | Feb 2017 | B1 |
9602529 | Jones et al. | Mar 2017 | B2 |
9619661 | Finkelstein | Apr 2017 | B1 |
9621357 | Williams et al. | Apr 2017 | B2 |
9621566 | Gupta et al. | Apr 2017 | B2 |
9646095 | Gottlieb et al. | May 2017 | B1 |
9648036 | Seiver et al. | May 2017 | B2 |
9652314 | Mahiddini | May 2017 | B2 |
9654541 | Kapczynski et al. | May 2017 | B1 |
9665722 | Nagasundaram et al. | May 2017 | B2 |
9672053 | Tang et al. | Jun 2017 | B2 |
9691090 | Barday | Jun 2017 | B1 |
9721078 | Cornick et al. | Aug 2017 | B2 |
9721108 | Krishnamurthy et al. | Aug 2017 | B2 |
9729583 | Barday | Aug 2017 | B1 |
9740985 | Byron et al. | Aug 2017 | B2 |
9740987 | Dolan | Aug 2017 | B2 |
9749408 | Subramani et al. | Aug 2017 | B2 |
9760620 | Nachnani et al. | Sep 2017 | B2 |
9760697 | Walker | Sep 2017 | B1 |
9762553 | Ford et al. | Sep 2017 | B2 |
9767309 | Patel et al. | Sep 2017 | B1 |
9785795 | Grondin et al. | Oct 2017 | B2 |
9798749 | Saner | Oct 2017 | B2 |
9800605 | Baikalov | Oct 2017 | B2 |
9804928 | Davis et al. | Oct 2017 | B2 |
9811532 | Parkison et al. | Nov 2017 | B2 |
9817850 | Dubbels et al. | Nov 2017 | B2 |
9817978 | Marsh et al. | Nov 2017 | B2 |
9838407 | Oprea et al. | Dec 2017 | B1 |
9838839 | Vudali et al. | Dec 2017 | B2 |
9842349 | Sawczuk et al. | Dec 2017 | B2 |
9852150 | Sharpe et al. | Dec 2017 | B2 |
9877138 | Franklin | Jan 2018 | B1 |
9882935 | Barday | Jan 2018 | B2 |
9892441 | Barday | Feb 2018 | B2 |
9892442 | Barday | Feb 2018 | B2 |
9892443 | Barday | Feb 2018 | B2 |
9892444 | Barday | Feb 2018 | B2 |
9898769 | Barday | Feb 2018 | B2 |
9912625 | Mutha et al. | Mar 2018 | B2 |
9916703 | Douillard et al. | Mar 2018 | B2 |
9923927 | McClintock et al. | Mar 2018 | B1 |
9953189 | Cook et al. | Apr 2018 | B2 |
9961070 | Tang | May 2018 | B2 |
9983936 | Dornemann et al. | May 2018 | B2 |
9992213 | Sinnema | Jun 2018 | B2 |
10001975 | Bharthulwar | Jun 2018 | B2 |
10002064 | Muske | Jun 2018 | B2 |
10013577 | Beaumont et al. | Jul 2018 | B1 |
10015164 | Hamburg et al. | Jul 2018 | B2 |
10019339 | Von Hanxleden et al. | Jul 2018 | B2 |
10025804 | Vranyes et al. | Jul 2018 | B2 |
10044761 | Ducatel et al. | Aug 2018 | B2 |
10055426 | Arasan et al. | Aug 2018 | B2 |
10061847 | Mohammed et al. | Aug 2018 | B2 |
10073924 | Karp et al. | Sep 2018 | B2 |
10075451 | Hall et al. | Sep 2018 | B1 |
10102533 | Barday | Oct 2018 | B2 |
10122760 | Terrill et al. | Nov 2018 | B2 |
10158676 | Barday | Dec 2018 | B2 |
10165011 | Barday | Dec 2018 | B2 |
10181043 | Pauley, Jr. | Jan 2019 | B1 |
10181051 | Barday et al. | Jan 2019 | B2 |
10250594 | Chathoth | Apr 2019 | B2 |
10284604 | Barday et al. | May 2019 | B2 |
10289867 | Barday et al. | May 2019 | B2 |
20020042687 | Tracy | Apr 2002 | A1 |
20020069035 | Tracy | Jun 2002 | A1 |
20020161594 | Bryan et al. | Oct 2002 | A1 |
20030041250 | Proudler | Feb 2003 | A1 |
20030097451 | Bjorksten et al. | May 2003 | A1 |
20030097661 | Li et al. | May 2003 | A1 |
20030115142 | Brickell et al. | Jun 2003 | A1 |
20030131093 | Aschen et al. | Jul 2003 | A1 |
20030163728 | Shaw | Aug 2003 | A1 |
20040010709 | Baudoin | Jan 2004 | A1 |
20040088235 | Ziekle et al. | May 2004 | A1 |
20040186912 | Harlow et al. | Sep 2004 | A1 |
20040193907 | Patanella | Sep 2004 | A1 |
20050022198 | Olapurath et al. | Jan 2005 | A1 |
20050033616 | Vavul et al. | Feb 2005 | A1 |
20050114343 | Wesinger, Jr. et al. | May 2005 | A1 |
20050144066 | Cope et al. | Jun 2005 | A1 |
20050197884 | Mullen, Jr. | Sep 2005 | A1 |
20060031078 | Pizzinger et al. | Feb 2006 | A1 |
20060075122 | Lindskog et al. | Apr 2006 | A1 |
20060149730 | Curtis | Jul 2006 | A1 |
20070027715 | Gropper et al. | Feb 2007 | A1 |
20070130101 | Anderson et al. | Jun 2007 | A1 |
20070157311 | Meier et al. | Jul 2007 | A1 |
20070179793 | Bagchi et al. | Aug 2007 | A1 |
20070180490 | Renzi et al. | Aug 2007 | A1 |
20070266420 | Hawkins et al. | Nov 2007 | A1 |
20070283171 | Breslin et al. | Dec 2007 | A1 |
20080015927 | Ramirez | Jan 2008 | A1 |
20080028435 | Strickland et al. | Jan 2008 | A1 |
20080047016 | Spoonamore | Feb 2008 | A1 |
20080120699 | Spear | May 2008 | A1 |
20080270203 | Holmes et al. | Oct 2008 | A1 |
20080282320 | Denovo et al. | Nov 2008 | A1 |
20080288271 | Faust | Nov 2008 | A1 |
20090037975 | Ishikawa et al. | Feb 2009 | A1 |
20090182818 | Krywaniuk | Jul 2009 | A1 |
20090204452 | Iskandar et al. | Aug 2009 | A1 |
20090216610 | Chorny | Aug 2009 | A1 |
20090249076 | Reed et al. | Oct 2009 | A1 |
20090254511 | Yeap | Oct 2009 | A1 |
20090303237 | Liu | Dec 2009 | A1 |
20100100398 | Auker et al. | Apr 2010 | A1 |
20100114634 | Christiansen | May 2010 | A1 |
20100121773 | Currier | May 2010 | A1 |
20100192201 | Shimoni et al. | Jul 2010 | A1 |
20100205057 | Hook et al. | Aug 2010 | A1 |
20100228786 | Török | Sep 2010 | A1 |
20100235915 | Memon et al. | Sep 2010 | A1 |
20100268628 | Pitkow et al. | Oct 2010 | A1 |
20100281313 | White et al. | Nov 2010 | A1 |
20100333012 | Adachi et al. | Dec 2010 | A1 |
20110010202 | Neale | Jan 2011 | A1 |
20110137696 | Meyer et al. | Jun 2011 | A1 |
20110231896 | Tovar | Sep 2011 | A1 |
20120084349 | Lee et al. | Apr 2012 | A1 |
20120102543 | Kohli et al. | Apr 2012 | A1 |
20120110674 | Belani et al. | May 2012 | A1 |
20120116923 | Irving et al. | May 2012 | A1 |
20120143650 | Crowley et al. | Jun 2012 | A1 |
20120144499 | Tan et al. | Jun 2012 | A1 |
20120259752 | Agee | Oct 2012 | A1 |
20130018954 | Cheng | Jan 2013 | A1 |
20130085801 | Sharpe et al. | Apr 2013 | A1 |
20130103485 | Postrel | Apr 2013 | A1 |
20130111323 | Taghaddos et al. | May 2013 | A1 |
20130218829 | Martinez | Aug 2013 | A1 |
20130311224 | Heroux et al. | Nov 2013 | A1 |
20130326112 | Park et al. | Dec 2013 | A1 |
20130332362 | Ciurea | Dec 2013 | A1 |
20130340086 | Blom | Dec 2013 | A1 |
20140006616 | Aad et al. | Jan 2014 | A1 |
20140012833 | Humprecht | Jan 2014 | A1 |
20140019561 | Belity et al. | Jan 2014 | A1 |
20140032259 | Lafever et al. | Jan 2014 | A1 |
20140032265 | Paprocki | Jan 2014 | A1 |
20140040134 | Ciurea | Feb 2014 | A1 |
20140040161 | Berlin | Feb 2014 | A1 |
20140047551 | Nagasundaram | Feb 2014 | A1 |
20140052463 | Cashman et al. | Feb 2014 | A1 |
20140074645 | Ingram | Mar 2014 | A1 |
20140089027 | Brown | Mar 2014 | A1 |
20140089039 | McClellan | Mar 2014 | A1 |
20140143011 | Mudugu et al. | May 2014 | A1 |
20140208418 | Libin | Jul 2014 | A1 |
20140244309 | Francois | Aug 2014 | A1 |
20140244325 | Cartwright | Aug 2014 | A1 |
20140244399 | Orduna et al. | Aug 2014 | A1 |
20140278663 | Samuel et al. | Sep 2014 | A1 |
20140283027 | Orona et al. | Sep 2014 | A1 |
20140283106 | Stahura et al. | Sep 2014 | A1 |
20140288971 | Whibbs, III | Sep 2014 | A1 |
20140289862 | Gorfein et al. | Sep 2014 | A1 |
20140337466 | Li et al. | Nov 2014 | A1 |
20140344015 | Puértolas-Montañés et al. | Nov 2014 | A1 |
20150019530 | Felch | Jan 2015 | A1 |
20150066577 | Christiansen et al. | Mar 2015 | A1 |
20150106867 | Liang | Apr 2015 | A1 |
20150106948 | Holman et al. | Apr 2015 | A1 |
20150106949 | Holman et al. | Apr 2015 | A1 |
20150169318 | Nash | Jun 2015 | A1 |
20150178740 | Borawski et al. | Jun 2015 | A1 |
20150207819 | Sartor | Jul 2015 | A1 |
20150229664 | Hawthorn et al. | Aug 2015 | A1 |
20150235050 | Wouhaybi | Aug 2015 | A1 |
20150242778 | Wilcox et al. | Aug 2015 | A1 |
20150254597 | Jahagirdar | Sep 2015 | A1 |
20150261887 | Joukov | Sep 2015 | A1 |
20150269384 | Holman et al. | Sep 2015 | A1 |
20150310575 | Shelton | Oct 2015 | A1 |
20150356362 | Demos | Dec 2015 | A1 |
20150379430 | Dirac et al. | Dec 2015 | A1 |
20160026394 | Goto | Jan 2016 | A1 |
20160034918 | Bjelajac et al. | Feb 2016 | A1 |
20160048700 | Stransky-Heilkron | Feb 2016 | A1 |
20160050213 | Storr | Feb 2016 | A1 |
20160063523 | Nistor et al. | Mar 2016 | A1 |
20160063567 | Srivastava | Mar 2016 | A1 |
20160071112 | Unser | Mar 2016 | A1 |
20160099963 | Mahaffey et al. | Apr 2016 | A1 |
20160103963 | Mishra | Apr 2016 | A1 |
20160125751 | Barker et al. | May 2016 | A1 |
20160142445 | Sartor | May 2016 | A1 |
20160148143 | Anderson et al. | May 2016 | A1 |
20160162269 | Pogorelik et al. | Jun 2016 | A1 |
20160164915 | Cook | Jun 2016 | A1 |
20160188450 | Appusamy et al. | Jun 2016 | A1 |
20160226905 | Baikalov | Aug 2016 | A1 |
20160234319 | Griffin | Aug 2016 | A1 |
20160262163 | Gonzalez Garrido et al. | Sep 2016 | A1 |
20160321748 | Mahatma et al. | Nov 2016 | A1 |
20160330237 | Edlabadkar | Nov 2016 | A1 |
20160342811 | Whitcomb et al. | Nov 2016 | A1 |
20160364736 | Maugans, III | Dec 2016 | A1 |
20160370954 | Burningham et al. | Dec 2016 | A1 |
20160381064 | Chan et al. | Dec 2016 | A1 |
20160381560 | Margaliot | Dec 2016 | A1 |
20170004055 | Horan et al. | Jan 2017 | A1 |
20170111395 | Sartor | Apr 2017 | A1 |
20170115864 | Thomas et al. | Apr 2017 | A1 |
20170124570 | Nidamanuri et al. | May 2017 | A1 |
20170140174 | Lacey | May 2017 | A1 |
20170142158 | Laoutaris et al. | May 2017 | A1 |
20170161520 | Lockhart, III et al. | Jun 2017 | A1 |
20170171235 | Mulchandani et al. | Jun 2017 | A1 |
20170177324 | Frank et al. | Jun 2017 | A1 |
20170180505 | Shaw et al. | Jun 2017 | A1 |
20170193624 | Tsai | Jul 2017 | A1 |
20170201518 | Holmqvist et al. | Jul 2017 | A1 |
20170206707 | Guay et al. | Jul 2017 | A1 |
20170208084 | Steelman et al. | Jul 2017 | A1 |
20170220964 | Datta | Aug 2017 | A1 |
20170249710 | Guillama et al. | Aug 2017 | A1 |
20170270318 | Ritchie | Sep 2017 | A1 |
20170278117 | Wallace et al. | Sep 2017 | A1 |
20170286719 | Krishnamurthy et al. | Oct 2017 | A1 |
20170287031 | Barday | Oct 2017 | A1 |
20170308875 | O'Regan et al. | Oct 2017 | A1 |
20170330197 | DiMaggio et al. | Nov 2017 | A1 |
20170357982 | Barday | Dec 2017 | A1 |
20180063174 | Grill et al. | Mar 2018 | A1 |
20180063190 | Wright et al. | Mar 2018 | A1 |
20180083843 | Sambandam | Mar 2018 | A1 |
20180091476 | Jakobsson et al. | Mar 2018 | A1 |
20180165637 | Romero et al. | Jun 2018 | A1 |
20180182009 | Barday | Jun 2018 | A1 |
20180198614 | Neumann | Jul 2018 | A1 |
20180248914 | Sartor | Aug 2018 | A1 |
20180285887 | Maung | Oct 2018 | A1 |
20180307859 | LaFever | Oct 2018 | A1 |
20180374030 | Barday et al. | Dec 2018 | A1 |
Number | Date | Country |
---|---|---|
1394698 | Mar 2004 | EP |
2031540 | Mar 2009 | EP |
2001033430 | May 2001 | WO |
2005008411 | Jan 2005 | WO |
2007002412 | Jan 2007 | WO |
2012174659 | Dec 2012 | WO |
2015116905 | Aug 2015 | WO |
Entry |
---|
Berezovskiy et al, “A framework for dynamic data source identification and orchestration on the Web”, ACM, pp. 1-8 (Year: 2010). |
McGrath et al, “Digital Library Technology for Locating and Accessing Scientific Data”, ACM, pp. 188-194 (Year: 1999). |
Mudepalli et al, “An efficient data retrieval approach using blowfish encryption on cloud CipherText Retrieval in Cloud Computing” IEEE, pp. 267-271 (Year: 2017). |
Notice of Allowance, dated Jul. 10, 2019, from corresponding U.S. Appl. No. 16/237,083. |
Notice of Allowance, dated Jul. 10, 2019, from corresponding U.S. Appl. No. 16/403,358. |
Notice of Allowance, dated Jul. 12, 2019, from corresponding U.S. Appl. No. 16/278,121. |
Office Action, dated Jun. 27, 2019, from corresponding U.S. Appl. No. 16/404,405. |
Salim et al, “Data Retrieval and Security using Lightweight Directory Access Protocol”, IEEE, pp. 685-688 (Year: 2009). |
Stern, Joanna, “iPhone Privacy Is Broken . . . and Apps Are to Blame”, The Wall Street Journal, wsj.com, May 31, 2019. |
Notice of Allowance, dated Jul. 17, 2019, from corresponding U.S. Appl. No. 16/055,961. |
Office Action, dated Jul. 18, 2019, from corresponding U.S. Appl. No. 16/410,762. |
International Search Report, dated Oct. 12, 2018, from corresponding International Application No. PCT/US2018/044046. |
International Search Report, dated Oct. 16, 2018, from corresponding International Application No. PCT/US2018/045243. |
International Search Report, dated Oct. 18, 2018, from corresponding International Application No. PCT/US2018/045249. |
International Search Report, dated Oct. 20, 2017, from corresponding International Application No. PCT/US2017/036917. |
International Search Report, dated Oct. 3, 2017, from corresponding International Application No. PCT/US2017/036912. |
International Search Report, dated Sep. 1, 2017, from corresponding International Application No. PCT/US2017/036896. |
International Search Report, dated Sep. 12, 2018, from corresponding International Application No. PCT/US2018/037504. |
Invitation to Pay Additional Search Fees, dated Aug. 10, 2017, from corresponding International Application No. PCT/US2017/036912. |
Invitation to Pay Additional Search Fees, dated Aug. 10, 2017, from corresponding International Application No. PCT/US2017/036917. |
Invitation to Pay Additional Search Fees, dated Aug. 24, 2017, from corresponding International Application No. PCT/US2017/036888. |
Invitation to Pay Additional Search Fees, dated Jan. 18, 2019, from corresponding International Application No. PCT/US2018/055736. |
Invitation to Pay Additional Search Fees, dated Jan. 7, 2019, from corresponding International Application No. PCT/US2018/055773. |
Invitation to Pay Additional Search Fees, dated Jan. 8, 2019, from corresponding International Application No. PCT/US2018/055774. |
Invitation to Pay Additional Search Fees, dated Oct. 23, 2018, from corresponding International Application No. PCT/US2018/045296. |
Korba, Larry et al.; “Private Data Discovery for Privacy Compliance in Collaborative Environments”; Cooperative Design, Visualization, and Engineering; Springer Berlin Heidelberg; Sep. 21, 2008; pp. 142-150. |
Krol, Kat, et al, Control versus Effort in Privacy Warnings for Webforms, ACM, Oct. 24, 2016, pp. 13-23. |
Lamb et al, “Role-Based Access Control for Data Service Integration”, ACM, pp. 3-11 (Year: 2006). |
Li, Ninghui, et al, t-Closeness: Privacy Beyond k-Anonymity and l-Diversity, IEEE, 2014, pp. 106-115. |
Liu, Kun, et al, A Framework for Computing the Privacy Scores of Users in Online Social Networks, ACM Transactions on Knowledge Discovery from Data, vol. 5, No. 1, Article 6, Dec. 2010, 30 pages. |
Maret et al, “Multimedia Information Interchange: Web Forms Meet Data Servers”, IEEE, pp. 499-505 (Year: 1999). |
Newman, “Email Archive Overviews using Subject Indexes”, ACM, pp. 652-653, 2002 (Year: 2002). |
Notice of Allowance, dated Apr. 12, 2017, from corresponding U.S. Appl. No. 15/256,419. |
Notice of Allowance, dated Apr. 2, 2019, from corresponding U.S. Appl. No. 16/160,577. |
Notice of Allowance, dated Apr. 25, 2018, from corresponding U.S. Appl. No. 15/883,041. |
Notice of Allowance, dated Aug. 14, 2018, from corresponding U.S. Appl. No. 15/989,416. |
Notice of Allowance, dated Aug. 18, 2017, from corresponding U.S. Appl. No. 15/619,455. |
Notice of Allowance, dated Aug. 24, 2018, from corresponding U.S. Appl. No. 15/619,479. |
Notice of Allowance, dated Aug. 30, 2018, from corresponding U.S. Appl. No. 15/996,208. |
Notice of Allowance, dated Aug. 9, 2018, from corresponding U.S. Appl. No. 15/882,989. |
Notice of Allowance, dated Dec. 10, 2018, from corresponding U.S. Appl. No. 16/105,602. |
Notice of Allowance, dated Dec. 12, 2017, from corresponding U.S. Appl. No. 15/169,643. |
Notice of Allowance, dated Dec. 12, 2017, from corresponding U.S. Appl. No. 15/619,212. |
Notice of Allowance, dated Dec. 12, 2017, from corresponding U.S. Appl. No. 15/619,382. |
Notice of Allowance, dated Dec. 31, 2018, from corresponding U.S. Appl. No. 16/159,634. |
Notice of Allowance, dated Dec. 5, 2017, from corresponding U.S. Appl. No. 15/633,703. |
Notice of Allowance, dated Dec. 6, 2017, from corresponding U.S. Appl. No. 15/619,451. |
Notice of Allowance, dated Dec. 6, 2017, from corresponding U.S. Appl. No. 15/619,459. |
Notice of Allowance, dated Feb. 13, 2019, from corresponding U.S. Appl. No. 16/041,563. |
Notice of Allowance, dated Feb. 14, 2019, from corresponding U.S. Appl. No. 16/226,272. |
Notice of Allowance, dated Feb. 19, 2019, from corresponding U.S. Appl. No. 16/159,632. |
Notice of Allowance, dated Feb. 27, 2019, from corresponding U.S. Appl. No. 16/041,468. |
Notice of Allowance, dated Feb. 27, 2019, from corresponding U.S. Appl. No. 16/226,290. |
Notice of Allowance, dated Jan. 18, 2018, from corresponding U.S. Appl. No. 15/619,478. |
Notice of Allowance, dated Jan. 18, 2019 from corresponding U.S. Appl. No. 16/159,635. |
Notice of Allowance, dated Jan. 23, 2018, from corresponding U.S. Appl. No. 15/619,251. |
Notice of Allowance, dated Jan. 26, 2018, from corresponding U.S. Appl. No. 15/619,469. |
Notice of Allowance, dated Jun. 19, 2018, from corresponding U.S. Appl. No. 15/894,890. |
Notice of Allowance, dated Jun. 27, 2018, from corresponding U.S. Appl. No. 15/882,989. |
Notice of Allowance, dated Jun. 6, 2018, from corresponding U.S. Appl. No. 15/875,570. |
Notice of Allowance, dated Mar. 1, 2018, from corresponding U.S. Appl. No. 15/853,674. |
Notice of Allowance, dated Jul. 23, 2019, from corresponding U.S. Appl. No. 16/220,978. |
Office Action, dated Jul. 23, 2019, from corresponding U.S. Appl. No. 16/436,616. |
Notice of Allowance, dated Jul. 26, 2019, from corresponding U.S. Appl. No. 16/409,673. |
Notice of Allowance, dated Jul. 31, 2019, from corresponding U.S. Appl. No. 16/221,153. |
Acar, Gunes, et al, The Web Never Forgets, Computer and Communications Security, ACM, Nov. 3, 2014, pp. 674-689. |
Aghasian, Erfan, et al, Scoring Users' Privacy Disclosure Across Multiple Online Social Networks, IEEE Access, Multidisciplinary Rapid Review Open Access Journal, Jul. 31, 2017, vol. 5, 2017. |
Agosti et al, “Access and Exchange of Hierarchically Structured Resources on the Web with the NESTOR Framework”, IEEE, pp. 659-662 (Year: 2009). |
Antunes et al, “Preserving Digital Data in Heterogeneous Environments”, ACM, pp. 345-348, 2009 (Year: 2009). |
Avepoint, AvePoint Privacy Impact Assessment 1: User Guide, Cumulative Update 2, Revision E, Feb. 2015, AvePoint, Inc. |
Byun, Ji-Won, Elisa Bertino, and Ninghui Li. “Purpose based access control of complex data for privacy protection.” Proceedings of the tenth ACM symposium on Access control models and technologies. ACM, 2005. (Year: 2005). |
Decision Regarding Institution of Post-Grant Review in Case PGR2018-00056 for U.S. Pat. No. 9,691,090 B1, Oct. 11, 2018. |
Enck, William, et al, TaintDroid: An Information-Flow Tracking System for Realtime Privacy Monitoring on Smartphones, ACM Transactions on Computer Systems, vol. 32, No. 2, Article 5, Jun. 2014, p. 5:1-5:29. |
Falahrastegar, Marjan, et al, Tracking Personal Identifiers Across the Web, Medical Image Computing and Computer-Assisted Intervention—Miccai 2015, 18th International Conference, Oct. 5, 2015, Munich, Germany. |
Final Office Action, dated Jan. 17, 2018, from corresponding U.S. Appl. No. 15/619,278.
Final Office Action, dated Jan. 23, 2018, from corresponding U.S. Appl. No. 15/619,479.
Final Office Action, dated Mar. 5, 2019, from corresponding U.S. Appl. No. 16/055,961.
Final Office Action, dated Nov. 29, 2017, from corresponding U.S. Appl. No. 15/619,237.
Francis, Andre, Business Mathematics and Statistics, South-Western Cengage Learning, 2008, Sixth Edition.
Frikken, Keith B., et al, Yet Another Privacy Metric for Publishing Micro-data, Miami University, Oct. 27, 2008, pp. 117-121.
Fung et al, “Discover Information and Knowledge from Websites using an Integrated Summarization and Visualization Framework”, IEEE, pp. 232-235 (Year: 2010).
Ghiglieri, Marco et al.; Personal DLP for Facebook, 2014 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops); IEEE; Mar. 24, 2014; pp. 629-634.
Hacigümüs, Hakan, et al, Executing SQL over Encrypted Data in the Database-Service-Provider Model, ACM, Jun. 4, 2002, pp. 216-227.
Huner et al, “Towards a Maturity Model for Corporate Data Quality Management”, ACM, pp. 231-238, 2009 (Year: 2009).
Hunton & Williams LLP, The Role of Risk Management in Data Protection, Privacy Risk Framework and the Risk-based Approach to Privacy, Centre for Information Policy Leadership, Workshop II, Nov. 23, 2014.
IAPP, Daily Dashboard, PIA Tool Stocked With New Templates for DPI, Infosec, International Association of Privacy Professionals, Apr. 22, 2014.
International Search Report, dated Aug. 15, 2017, from corresponding International Application No. PCT/US2017/036919.
International Search Report, dated Aug. 21, 2017, from corresponding International Application No. PCT/US2017/036914.
International Search Report, dated Aug. 29, 2017, from corresponding International Application No. PCT/US2017/036898.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036889.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036890.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036893.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036901.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036913.
International Search Report, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036920.
International Search Report, dated Dec. 14, 2018, from corresponding International Application No. PCT/US2018/045296.
International Search Report, dated Jan. 14, 2019, from corresponding International Application No. PCT/US2018/046949.
International Search Report, dated Jan. 7, 2019, from corresponding International Application No. PCT/US2018/055772.
International Search Report, dated Jun. 21, 2017, from corresponding International Application No. PCT/US2017/025600.
International Search Report, dated Jun. 6, 2017, from corresponding International Application No. PCT/US2017/025605.
International Search Report, dated Jun. 6, 2017, from corresponding International Application No. PCT/US2017/025611.
International Search Report, dated Mar. 14, 2019, from corresponding International Application No. PCT/US2018/055736.
International Search Report, dated Mar. 4, 2019, from corresponding International Application No. PCT/US2018/055773.
International Search Report, dated Mar. 4, 2019, from corresponding International Application No. PCT/US2018/055774.
International Search Report, dated Nov. 19, 2018, from corresponding International Application No. PCT/US2018/046939.
International Search Report, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043975.
International Search Report, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043976.
International Search Report, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043977.
International Search Report, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/044026.
International Search Report, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/045240.
International Search Report, dated Oct. 12, 2017, from corresponding International Application No. PCT/US2017/036888.
Notice of Allowance, dated Mar. 1, 2019, from corresponding U.S. Appl. No. 16/059,911.
Notice of Allowance, dated Mar. 13, 2019, from corresponding U.S. Appl. No. 16/055,083.
Notice of Allowance, dated Mar. 14, 2019, from corresponding U.S. Appl. No. 16/055,944.
Notice of Allowance, dated Mar. 2, 2018, from corresponding U.S. Appl. No. 15/858,802.
Notice of Allowance, dated Mar. 25, 2019, from corresponding U.S. Appl. No. 16/054,780.
Notice of Allowance, dated Mar. 27, 2019, from corresponding U.S. Appl. No. 16/226,280.
Notice of Allowance, dated Mar. 29, 2019, from corresponding U.S. Appl. No. 16/055,998.
Notice of Allowance, dated May 21, 2018, from corresponding U.S. Appl. No. 15/896,790.
Notice of Allowance, dated May 5, 2017, from corresponding U.S. Appl. No. 15/254,901.
Notice of Allowance, dated Nov. 2, 2018, from corresponding U.S. Appl. No. 16/054,762.
Notice of Allowance, dated Nov. 7, 2017, from corresponding U.S. Appl. No. 15/671,073.
Notice of Allowance, dated Nov. 8, 2018, from corresponding U.S. Appl. No. 16/042,642.
Notice of Allowance, dated Oct. 17, 2018, from corresponding U.S. Appl. No. 15/896,790.
Notice of Allowance, dated Oct. 17, 2018, from corresponding U.S. Appl. No. 16/054,672.
Notice of Allowance, dated Sep. 13, 2018, from corresponding U.S. Appl. No. 15/894,809.
Notice of Allowance, dated Sep. 13, 2018, from corresponding U.S. Appl. No. 15/894,890.
Notice of Allowance, dated Sep. 18, 2018, from corresponding U.S. Appl. No. 15/894,819.
Notice of Allowance, dated Sep. 18, 2018, from corresponding U.S. Appl. No. 16/041,545.
Notice of Allowance, dated Sep. 27, 2017, from corresponding U.S. Appl. No. 15/626,052.
Notice of Allowance, dated Sep. 28, 2018, from corresponding U.S. Appl. No. 16/041,520.
Notice of Allowance, dated Sep. 4, 2018, from corresponding U.S. Appl. No. 15/883,041.
Notice of Filing Date for Petition for Post-Grant Review of related U.S. Pat. No. 9,691,090 dated Apr. 12, 2018.
Office Action, dated Apr. 18, 2018, from corresponding U.S. Appl. No. 15/894,819.
Office Action, dated Aug. 23, 2017, from corresponding U.S. Appl. No. 15/626,052.
Office Action, dated Aug. 24, 2017, from corresponding U.S. Appl. No. 15/169,643.
Office Action, dated Aug. 24, 2017, from corresponding U.S. Appl. No. 15/619,451.
Office Action, dated Aug. 29, 2017, from corresponding U.S. Appl. No. 15/619,237.
Office Action, dated Aug. 30, 2017, from corresponding U.S. Appl. No. 15/619,212.
Office Action, dated Aug. 30, 2017, from corresponding U.S. Appl. No. 15/619,382.
Office Action, dated Dec. 14, 2018, from corresponding U.S. Appl. No. 16/104,393.
Office Action, dated Dec. 15, 2016, from corresponding U.S. Appl. No. 15/256,419.
Office Action, dated Dec. 3, 2018, from corresponding U.S. Appl. No. 16/055,998.
Office Action, dated Dec. 31, 2018, from corresponding U.S. Appl. No. 16/160,577.
Office Action, dated Feb. 15, 2019, from corresponding U.S. Appl. No. 16/220,899.
Office Action, dated Feb. 26, 2019, from corresponding U.S. Appl. No. 16/228,250.
Office Action, dated Jan. 18, 2019, from corresponding U.S. Appl. No. 16/055,984.
Office Action, dated Jan. 4, 2019, from corresponding U.S. Appl. No. 16/159,566.
Office Action, dated Jan. 4, 2019, from corresponding U.S. Appl. No. 16/159,628.
Office Action, dated Jul. 21, 2017, from corresponding U.S. Appl. No. 15/256,430.
Office Action, dated Mar. 11, 2019, from corresponding U.S. Appl. No. 16/220,978.
Office Action, dated Mar. 12, 2019, from corresponding U.S. Appl. No. 16/221,153.
Office Action, dated Mar. 25, 2019, from corresponding U.S. Appl. No. 16/278,121.
Office Action, dated Mar. 27, 2019, from corresponding U.S. Appl. No. 16/278,120.
Office Action, dated Mar. 30, 2018, from corresponding U.S. Appl. No. 15/894,890.
Office Action, dated Mar. 30, 2018, from corresponding U.S. Appl. No. 15/896,790.
Office Action, dated Mar. 4, 2019, from corresponding U.S. Appl. No. 16/237,083.
Office Action, dated May 16, 2018, from corresponding U.S. Appl. No. 15/882,989.
Office Action, dated May 2, 2018, from corresponding U.S. Appl. No. 15/894,809.
Office Action, dated Nov. 1, 2017, from corresponding U.S. Appl. No. 15/169,658.
Office Action, dated Nov. 15, 2018, from corresponding U.S. Appl. No. 16/059,911.
Written Opinion of the International Searching Authority, dated Nov. 19, 2018, from corresponding International Application No. PCT/US2018/046939.
Written Opinion of the International Searching Authority, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043975.
Written Opinion of the International Searching Authority, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043976.
Written Opinion of the International Searching Authority, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/043977.
Written Opinion of the International Searching Authority, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/044026.
Written Opinion of the International Searching Authority, dated Oct. 11, 2018, from corresponding International Application No. PCT/US2018/045240.
Written Opinion of the International Searching Authority, dated Oct. 12, 2017, from corresponding International Application No. PCT/US2017/036888.
Written Opinion of the International Searching Authority, dated Oct. 12, 2018, from corresponding International Application No. PCT/US2018/044046.
Written Opinion of the International Searching Authority, dated Oct. 16, 2018, from corresponding International Application No. PCT/US2018/045243.
Written Opinion of the International Searching Authority, dated Oct. 18, 2018, from corresponding International Application No. PCT/US2018/045249.
Written Opinion of the International Searching Authority, dated Oct. 20, 2017, from corresponding International Application No. PCT/US2017/036917.
Written Opinion of the International Searching Authority, dated Oct. 3, 2017, from corresponding International Application No. PCT/US2017/036912.
Written Opinion of the International Searching Authority, dated Sep. 1, 2017, from corresponding International Application No. PCT/US2017/036896.
Written Opinion of the International Searching Authority, dated Sep. 12, 2018, from corresponding International Application No. PCT/US2018/037504.
www.truste.com, Internet Archive Wayback Machine, www.archive.org, Feb. 7, 2015.
Yu, “Using Data from Social Media Websites to Inspire the Design of Assistive Technology”, ACM, pp. 1-2 (Year: 2016).
Zhang et al, “Dynamic Topic Modeling for Monitoring Market Competition from Online Text and Image Data”, ACM, pp. 1425-1434 (Year: 2015).
Bhargav-Spantzel et al., Receipt Management—Transaction History based Trust Establishment, 2007, ACM, pp. 82-91.
Notice of Allowance, dated Apr. 8, 2019, from corresponding U.S. Appl. No. 16/228,250.
Office Action, dated Apr. 5, 2019, from corresponding U.S. Appl. No. 16/278,119.
Restriction Requirement, dated Apr. 10, 2019, from corresponding U.S. Appl. No. 16/277,715.
Notice of Allowance, dated May 28, 2019, from corresponding U.S. Appl. No. 16/277,568.
Office Action, dated May 17, 2019, from corresponding U.S. Appl. No. 16/277,539.
Office Action, dated May 2, 2019, from corresponding U.S. Appl. No. 16/104,628.
Dimou et al, “Machine-Interpretable Dataset and Service Descriptions for Heterogeneous Data Access and Retrieval”, ACM, pp. 145-152 (Year: 2015).
Dunkel et al, “Data Organization and Access for Efficient Data Mining”, IEEE, pp. 522-529 (Year: 1999).
Office Action, dated Apr. 22, 2019, from corresponding U.S. Appl. No. 16/241,710.
Restriction Requirement, dated Apr. 24, 2019, from corresponding U.S. Appl. No. 16/278,122.
Joel Reardon et al., Secure Data Deletion from Persistent Media, ACM, Nov. 4, 2013, retrieved online on Jun. 13, 2019, pp. 271-283. Retrieved from the Internet: URL: http://delivery.acm.org/10.1145/2520000/2516699/p271-reardon.pdf? (Year: 2013).
Notice of Allowance, dated Jun. 12, 2019, from corresponding U.S. Appl. No. 16/278,123.
Notice of Allowance, dated Jun. 18, 2019, from corresponding U.S. Appl. No. 16/410,566.
Notice of Allowance, dated Jun. 19, 2019, from corresponding U.S. Appl. No. 16/042,673.
Notice of Allowance, dated Jun. 19, 2019, from corresponding U.S. Appl. No. 16/055,984.
Notice of Allowance, dated Jun. 21, 2019, from corresponding U.S. Appl. No. 16/404,439.
Notice of Allowance, dated Jun. 4, 2019, from corresponding U.S. Appl. No. 16/159,566.
Notice of Allowance, dated Jun. 5, 2019, from corresponding U.S. Appl. No. 16/220,899.
Notice of Allowance, dated Jun. 5, 2019, from corresponding U.S. Appl. No. 16/357,260.
Notice of Allowance, dated Jun. 6, 2019, from corresponding U.S. Appl. No. 16/159,628.
Office Action, dated Jun. 24, 2019, from corresponding U.S. Appl. No. 16/410,336.
Tuomas Aura et al., Scanning Electronic Documents for Personally Identifiable Information, ACM, Oct. 30, 2006, retrieved online on Jun. 13, 2019, pp. 41-49. Retrieved from the Internet: URL: http://delivery.acm.org/10.1145/1180000/1179608/p41-aura.pdf? (Year: 2006).
Abdullah et al, “The Mapping Process of Unstructured Data to the Structured Data”, ACM, pp. 151-155 (Year: 2013).
Bhuvaneswaran et al, “Redundant Parallel Data Transfer Schemes for the Grid Environment”, ACM, pp. 18 (Year: 2006).
Chowdhury et al, “A System Architecture for Subject-Centric Data Sharing”, ACM, pp. 1-10 (Year: 2018).
Popescu-Zeletin, “The Data Access and Transfer Support in a Local Heterogeneous Network (HMINET)”, IEEE, pp. 147-152 (Year: 1979).
Yin et al, “Multibank Memory Optimization for Parallel Data Access in Multiple Data Arrays”, ACM, pp. 1-8 (Year: 2016).
Yiu et al, “Outsourced Similarity Search on Metric Data Assets”, IEEE, pp. 338-352 (Year: 2012).
Zhang et al, “Data Transfer Performance Issues for a Web Services Interface to Synchrotron Experiments”, ACM, pp. 59-65 (Year: 2007).
Office Action, dated Nov. 23, 2018, from corresponding U.S. Appl. No. 16/042,673.
Office Action, dated Oct. 10, 2018, from corresponding U.S. Appl. No. 16/041,563.
Office Action, dated Oct. 10, 2018, from corresponding U.S. Appl. No. 16/055,083.
Office Action, dated Oct. 10, 2018, from corresponding U.S. Appl. No. 16/055,944.
Office Action, dated Oct. 15, 2018, from corresponding U.S. Appl. No. 16/054,780.
Office Action, dated Oct. 23, 2018, from corresponding U.S. Appl. No. 16/055,961.
Office Action, dated Oct. 26, 2018, from corresponding U.S. Appl. No. 16/041,468.
Office Action, dated Sep. 1, 2017, from corresponding U.S. Appl. No. 15/619,459.
Office Action, dated Sep. 11, 2017, from corresponding U.S. Appl. No. 15/619,375.
Office Action, dated Sep. 11, 2017, from corresponding U.S. Appl. No. 15/619,478.
Office Action, dated Sep. 19, 2017, from corresponding U.S. Appl. No. 15/671,073.
Office Action, dated Sep. 22, 2017, from corresponding U.S. Appl. No. 15/619,278.
Office Action, dated Sep. 5, 2017, from corresponding U.S. Appl. No. 15/619,469.
Office Action, dated Sep. 6, 2017, from corresponding U.S. Appl. No. 15/619,479.
Office Action, dated Sep. 7, 2017, from corresponding U.S. Appl. No. 15/633,703.
Office Action, dated Sep. 8, 2017, from corresponding U.S. Appl. No. 15/619,251.
Olenski, Steve, For Consumers, Data is a Matter of Trust, CMO Network, Apr. 18, 2016, https://www.forbes.com/sites/steveolenski/2016/04/18/for-consumers-data-is-a-matter-of-trust/#2e48496278b3.
Petition for Post-Grant Review of related U.S. Pat. No. 9,691,090 dated Mar. 27, 2018.
Petrie et al, “The Relationship between Accessibility and Usability of Websites”, ACM, pp. 397-406 (Year: 2007).
Pfeifle, Sam, The Privacy Advisor, IAPP and AvePoint Launch New Free PIA Tool, International Association of Privacy Professionals, Mar. 5, 2014.
Pfeifle, Sam, The Privacy Advisor, IAPP Heads to Singapore with APIA Template in Tow, International Association of Privacy Professionals, https://iapp.org/news/a/iapp-heads-to-singapore-with-apia-template_in_tow/, Mar. 28, 2014, pp. 1-3.
Restriction Requirement, dated Dec. 31, 2018, from corresponding U.S. Appl. No. 15/169,668.
Restriction Requirement, dated Jan. 18, 2017, from corresponding U.S. Appl. No. 15/256,430.
Restriction Requirement, dated Jul. 28, 2017, from corresponding U.S. Appl. No. 15/169,658.
Restriction Requirement, dated Nov. 21, 2016, from corresponding U.S. Appl. No. 15/254,901.
Restriction Requirement, dated Oct. 17, 2018, from corresponding U.S. Appl. No. 16/055,984.
Schwartz, Edward J., et al, 2010 IEEE Symposium on Security and Privacy: All You Ever Wanted to Know About Dynamic Analysis and Forward Symbolic Execution (but might have been afraid to ask), Carnegie Mellon University, IEEE Computer Society, 2010, pp. 317-331.
Srivastava, Agrima, et al, Measuring Privacy Leaks in Online Social Networks, International Conference on Advances in Computing, Communications and Informatics (ICACCI), 2013.
Symantec, Symantec Data Loss Prevention—Discover, monitor, and protect confidential data; 2008; Symantec Corporation; http://www.mssuk.com/images/Symantec%2014552315_IRC_BR_DLP_03.09_sngl.pdf.
The Cookie Collective, Optanon Cookie Policy Generator, The Cookie Collective, Year 2016, http://web.archive.org/web/20160324062743/https:/optanon.com/.
TRUSTe Announces General Availability of Assessment Manager for Enterprises to Streamline Data Privacy Management with Automation, PRNewswire, Mar. 4, 2015.
Weaver et al, “Understanding Information Preview in Mobile Email Processing”, ACM, pp. 303-312, 2011 (Year: 2011).
Written Opinion of the International Searching Authority, dated Jun. 6, 2017, from corresponding International Application No. PCT/US2017/025611.
Written Opinion of the International Searching Authority, dated Aug. 15, 2017, from corresponding International Application No. PCT/US2017/036919.
Written Opinion of the International Searching Authority, dated Aug. 21, 2017, from corresponding International Application No. PCT/US2017/036914.
Written Opinion of the International Searching Authority, dated Aug. 29, 2017, from corresponding International Application No. PCT/US2017/036898.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036889.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036890.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036893.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036901.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036913.
Written Opinion of the International Searching Authority, dated Aug. 8, 2017, from corresponding International Application No. PCT/US2017/036920.
Written Opinion of the International Searching Authority, dated Dec. 14, 2018, from corresponding International Application No. PCT/US2018/045296.
Written Opinion of the International Searching Authority, dated Jan. 14, 2019, from corresponding International Application No. PCT/US2018/046949.
Written Opinion of the International Searching Authority, dated Jan. 7, 2019, from corresponding International Application No. PCT/US2018/055772.
Written Opinion of the International Searching Authority, dated Jun. 21, 2017, from corresponding International Application No. PCT/US2017/025600.
Written Opinion of the International Searching Authority, dated Jun. 6, 2017, from corresponding International Application No. PCT/US2017/025605.
Written Opinion of the International Searching Authority, dated Mar. 14, 2019, from corresponding International Application No. PCT/US2018/055736.
Written Opinion of the International Searching Authority, dated Mar. 4, 2019, from corresponding International Application No. PCT/US2018/055773.
Written Opinion of the International Searching Authority, dated Mar. 4, 2019, from corresponding International Application No. PCT/US2018/055774.
Number | Date | Country
---|---|---
20190220623 A1 | Jul. 2019 | US
Number | Date | Country
---|---|---
62/360,123 | Jul. 2016 | US
62/353,802 | Jun. 2016 | US
62/348,695 | Jun. 2016 | US
 | Number | Date | Country
---|---|---|---
Parent | 16/042,642 | Jul. 2018 | US
Child | 16/363,454 | | US
 | Number | Date | Country
---|---|---|---
Parent | 15/619,251 | Jun. 2017 | US
Child | 16/042,642 | | US
Parent | 15/254,901 | Sep. 2016 | US
Child | 15/619,251 | | US