AUTOMATIC ADJUSTMENT OF A DISPLAY TO OBSCURE DATA

Information

  • Publication Number
    20180239931
  • Date Filed
    February 05, 2018
  • Date Published
    August 23, 2018
Abstract
Methods, systems, and computer program products are disclosed for automatically adjusting a display to obscure application data. In an example, a computer-implemented method may include collecting eye data from a user, receiving the eye data collected from the user, analyzing the eye data, determining that eyesight of the user is on a display based on the eye data, providing data on the display to the user when the eyesight of the user is determined to be on the display, determining that the eyesight of the user is off the display, obscuring the data on the display in response to determining that the eyesight of the user is off the display, and removing the obscuring applied to the data on the display when the eyesight of the user returns to the display.
Description
TECHNICAL FIELD

The present disclosure generally relates to computer systems and, more particularly, to protecting sensitive data displayed by a computing device.


BACKGROUND

The use of computing devices has become widespread throughout society. For example, many users carry one or more devices, such as smart phones, smart watches, laptops, and tablets at any given time. Further, computing devices have become increasingly important in the personal lives of users. For example, users can communicate with family and friends, purchase goods and services, and access various personal and work-related data virtually anywhere using smart phones, smart watches, and other portable devices.


In a related trend, corporate workspaces have become increasingly smaller and collaborative. For example, users may share a common workspace lacking walls, dividers, and other protections to safeguard privacy. As such, many users access personal and confidential data both in public and at work with little or no privacy. As a result, sensitive data may be unintentionally displayed to an untrusted or unauthorized individual, for example, when a user performs activities using a device or temporarily turns their attention elsewhere.





BRIEF DESCRIPTION OF THE DRAWINGS

Various examples of the present disclosure will be understood more fully from the detailed description given below and from the accompanying drawings of various examples of the disclosure. In the drawings, like reference numbers may indicate identical or functionally similar elements. The drawing in which an element first appears is generally indicated by the left-most digit in the corresponding reference number.



FIG. 1 is a block diagram illustrating a system architecture, in accordance with various examples of the present disclosure.



FIG. 2 is a flow diagram for automatically adjusting a display to obscure application data, according to an example of the present disclosure.



FIG. 3 is a flow diagram for automatically adjusting multiple displays to obscure application data, according to an example of the present disclosure.



FIG. 4 is a block diagram of an exemplary computer system that may perform one or more of the operations described herein.





DETAILED DESCRIPTION

Systems, methods, and computer program products are disclosed for automatic adjustment of a display to obscure application data.


In an example, a user operates a computing device in a location with little privacy or protection to prevent others from viewing the user's data. To protect the user's data, a data obscuring system automatically obscures sensitive data displayed by the user's computing device.


In an example, a data obscuring system receives and tracks eye data, head data, and/or facial data of a user as the user operates a computing device. The data obscuring system then analyzes the data collected from the user to determine whether eyesight of the user is on a display of the computing device. The data obscuring system presents the data on the display of the computing device when it determines the user's eyesight is on the display. However, the data obscuring system automatically adjusts presentation of the display to obscure the data when it determines that the user's eyesight is not focused on the display. The data obscuring system removes obscuring applied to the display when the user's eyesight returns to the display and continually adjusts the display accordingly throughout the remainder of the user's session.
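For illustration only, the receive/analyze/adjust cycle just described might be sketched as follows. The EyeSample and DisplayState types and both functions are hypothetical stand-ins for the data receiver, sight analyzer, and display adjuster modules, under the simplifying assumption that eye data arrives as gaze points in display coordinates; this is a sketch, not the disclosed implementation.

    interface EyeSample { x: number; y: number; timestampMs: number; }

    interface DisplayState {
      left: number; top: number; width: number; height: number;
      obscured: boolean;
    }

    // Sight analysis reduced to a point-in-rectangle test for brevity.
    function gazeOnDisplay(s: EyeSample, d: DisplayState): boolean {
      return s.x >= d.left && s.x <= d.left + d.width &&
             s.y >= d.top && s.y <= d.top + d.height;
    }

    // Called for each new eye sample: obscure when the gaze leaves the
    // display, and remove the obscuring when the gaze returns.
    function adjustDisplay(s: EyeSample, d: DisplayState): void {
      if (!gazeOnDisplay(s, d)) {
        d.obscured = true;   // e.g., blur, minimize, or blank the display
      } else {
        d.obscured = false;  // restore the normal presentation
      }
    }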


Accordingly, aspects of the present disclosure provide novel features to prevent unauthorized and inadvertent disclosure of sensitive data on one or more displays of a computer system.



FIG. 1 illustrates an exemplary system architecture 100 in which examples of the present disclosure may be implemented. System architecture 100 includes server machine(s) 110, data store(s) 116, and client machines 102A-102N connected to one or more network(s) 104. Network(s) 104 may be public networks (e.g., the Internet), private networks (e.g., local area networks (LANs) or wide area networks (WANs)), or any combination thereof. In an example, network(s) 104 may include the Internet and/or one or more intranets, wired networks, wireless networks, and/or other appropriate types of communication networks. In one example, network(s) 104 may comprise wireless telecommunications networks (e.g., cellular phone networks) adapted to communicate with other communication networks, such as the Internet.


Data store 116 is persistent storage capable of storing various types of data, such as text, audio, image, and video content. In some examples, data store 116 might be a network-attached file server, while in other examples data store 116 might be some other type of persistent storage such as an object-oriented database, a relational database, and so forth.


Client machines 102A-102N may be personal computers (PC), laptops, mobile phones, tablet computers, server computers, wearable computing devices, or any other type of computing device. Client machines 102A-102N may run an operating system (OS) that manages hardware and software of the client machines 102A-102N. A browser (not shown) may run on the client machines 102A-102N (e.g., on the OS of the client machines). The browser may be a web browser that can access content and services provided by a web server 120 of server machine 110. Other types of computer programs and computer scripts also may run on client machines 102A-102N.


Client machine 102A includes one or more camera(s) 106, one or more display(s) 108, and data obscuring system 130. Data obscuring system 130 includes data collector module 140, data receiver module 150, user identifier module 160, sight analyzer module 170, and display adjuster module 180. In an example, a data obscuring system 130, 130A, 130B, 130C, 130D may include one or more of a data collector module 140, a data receiver module 150, a user identifier module 160, a sight analyzer module 170, and a display adjuster module 180. In some examples, functionality associated with data collector module 140, data receiver module 150, user identifier module 160, sight analyzer module 170, and display adjuster module 180 may be combined, divided, and organized in various arrangements on one or more computing devices.


In an example, client machine 102A is coupled to one or more camera(s) 106, is coupled to one or more display(s) 108, and includes data obscuring system 130. For example, client machine 102A may be directly or wirelessly (e.g., via Bluetooth) connected to one or more camera(s) 106 and one or more display(s) 108. Camera(s) 106 may include one or more external cameras (e.g., attached webcams) or one or more cameras that are embedded, incorporated, or built into a computing device (e.g., personal computer, laptop, smartphone, smart glasses, various types of wearable computing devices, etc.).


A camera generally describes an optical instrument that records images or other visual data for storage and/or transmission to other locations. In an example, one or more cameras 106 perform eye tracking to measure eye activity and/or eye positioning of a user operating client machine 102A. For example, eye tracking data may be collected using a remote or head-mounted eye tracker coupled to client machine 102A.


In an example, an eye tracking device includes a camera 106 and a light source (e.g., infrared or other light) directed onto one or more eyes of a user. A camera 106 may track the reflection of the light source and ascertainable ocular features of an eye, such as a pupil. The eye data of the user then may be used to extrapolate rotation of a user's eye and the direction of the user's line of sight or gaze.
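The light-source-and-pupil technique described above is commonly known as pupil-center/corneal-reflection (PCCR) tracking. A minimal sketch follows, assuming a precomputed per-axis affine calibration that maps the pupil-glint vector to a point of regard on the screen; the Calibration shape and the function names are illustrative assumptions, not part of the disclosure.

    interface Point { x: number; y: number; }

    // Per-axis affine calibration mapping the pupil-glint vector to
    // screen coordinates: screenX = ax*vx + bx*vy + cx (and likewise
    // for Y). A real tracker fits these terms from a calibration pass
    // over known on-screen targets.
    interface Calibration {
      ax: number; bx: number; cx: number;
      ay: number; by: number; cy: number;
    }

    function pointOfRegard(pupil: Point, glint: Point, cal: Calibration): Point {
      // The glint stays roughly fixed while the pupil center moves with
      // eye rotation, so their difference is a usable gaze feature.
      const vx = pupil.x - glint.x;
      const vy = pupil.y - glint.y;
      return {
        x: cal.ax * vx + cal.bx * vy + cal.cx,
        y: cal.ay * vx + cal.by * vy + cal.cy,
      };
    }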


A display 108 generally refers to an output device for presenting various types of information to a user in a digital format. For example, various types of two-dimensional displays include television sets, computer monitors, head-mounted displays, broadcast reference monitors, medical monitors, etc. Three-dimensional displays may include stereoscopic projections, laser displays, holographic displays, tactile electronic displays, etc.


In an example, data collector module 140 of data obscuring system 130 tracks and collects eye movement data and other eye data from a user. In one example, data collector module 140 collects biometric eye data identifying a user. For example, data collector module 140 may collect iris scan data, pupil scan data, retinal scan data, and other types of eye data from a user. In some examples, data collector module 140 collects head and facial data for a user. For example, data collector module 140 may collect data about head and/or facial positioning of a user. Data collector module 140 also may collect facial recognition data from a user, for example, to identify the user in a current or future session.


In an example, data receiver module 150 of data obscuring system 130 receives data collected from a user or from a device worn by the user. For example, data receiver module 150 may receive eye data collected from a user by a camera 106 associated with an eye tracking device. In some examples, data receiver module 150 receives various types of eye data from a user, including biometric eye data identifying the user. In addition, data receiver module 150 may receive data about the positioning of a user's head and/or face in relation to one or more displays.


In an example, user identifier module 160 of data obscuring system 130 determines whether to allow a user of a client machine 102A to view data presented on one or more displays 108A, 108B. For example, user identifier module 160 may determine whether to log in a user on client machine 102A, to lock or unlock client machine 102A, to present data on a display 108 to a user, or to obscure data presented on a display 108 based on eye data and/or facial data collected from the user.


In an example, user identifier module 160 automatically locks client machine 102A or logs a user off client machine 102A when eye data is not received from the user within a predetermined amount of time. In some examples, user identifier module 160 automatically adjusts a user profile on client machine 102A in response to identifying a user based on eye data and/or facial data. For example, user identifier module 160 may adjust the user profile used on client machine 102A based on parental settings or security classifications associated with the user detected via eye data and/or facial recognition data.


In an example, user identifier module 160 authenticates a user seeking to use client machine 102A and/or to view data presented or ready for presentation on a display 108 based on eye data and/or facial data. For example, user identifier module 160 may compare data points, signatures, or patterns of eye data and/or facial data of a user to a database of trusted and/or untrusted user profiles when authenticating or authorizing a user.
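As a rough sketch of this comparison step, assume eye data has been reduced to a numeric feature vector and is matched by cosine similarity against stored profiles; real iris or retinal matching uses specialized encodings and distance measures (for example, Hamming distance over iris codes), so the representation and threshold below are purely illustrative assumptions.

    type Signature = number[];

    interface UserProfile { userId: string; trusted: boolean; signature: Signature; }

    function cosineSimilarity(a: Signature, b: Signature): number {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Returns the matching trusted profile, or null when no profile
    // clears the threshold (in which case the system might lock the
    // machine or obscure the display).
    function authenticate(
      sample: Signature,
      profiles: UserProfile[],
      threshold = 0.95,
    ): UserProfile | null {
      for (const p of profiles) {
        if (p.trusted && cosineSimilarity(sample, p.signature) >= threshold) {
          return p;
        }
      }
      return null;
    }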


In an example, sight analyzer module 170 of data obscuring system 130 analyzes eye data and/or facial data from a user to determine whether the user's visual orientation, visual perspective, direction of sight, gaze, or line of sight is on one or more displays 108A, 108B. For example, sight analyzer module 170 may detect various changes and perform various calculations involving eye data and/or facial data of a user to determine whether eyesight of the user is on one or more displays 108A, 108B.


In an example, sight analyzer module 170 determines whether eyesight of a user is directed towards or aligned with a display 108 based on a calculated line of sight for the user in reference to the position and size of the display 108. Sight analyzer module 170 also may determine whether eyesight of a user is focused on an application on a display 108 based on characteristics of the display 108 and positioning of the application on the display.
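One way to realize such a line-of-sight test, assuming a flat display lying in the plane z = 0 and an eye position and unit gaze direction expressed in the same coordinate frame, is to intersect the gaze ray with the display plane and test the hit point against the display bounds and any application window of interest. The types, names, and coordinate conventions below are illustrative assumptions.

    interface Vec3 { x: number; y: number; z: number; }
    interface Rect { left: number; top: number; width: number; height: number; }

    // Intersects the gaze ray with the display plane (z = 0). eyePos is
    // the eye position in display coordinates (z > 0 in front of the
    // screen); gazeDir is a unit direction, with z < 0 pointing toward
    // the screen. Returns null when the user looks away from the plane.
    function gazeHitPoint(eyePos: Vec3, gazeDir: Vec3): { x: number; y: number } | null {
      if (gazeDir.z >= 0) return null;
      const t = -eyePos.z / gazeDir.z; // ray parameter at the z = 0 plane
      return { x: eyePos.x + t * gazeDir.x, y: eyePos.y + t * gazeDir.y };
    }

    // The same containment test serves for the whole display (its
    // position and size) and for an application window within it.
    function inRect(p: { x: number; y: number }, r: Rect): boolean {
      return p.x >= r.left && p.x <= r.left + r.width &&
             p.y >= r.top && p.y <= r.top + r.height;
    }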


In an example, display adjuster module 180 of data obscuring system 130 automatically adjusts presentation of data on one or more displays 108A, 108B based on whether eyesight of a user is determined to be on a respective display 108A, 108B. In one example, display adjuster module 180 automatically presents data on a display 108 to a user when eyesight of the user is determined to be on the display 108A. In addition, display adjuster module 180 automatically obscures presentation of the data on the display 108A when eyesight of the user is determined to be off the display 108A (e.g., when the user is not looking at the display 108A).


Client machine 102N includes camera 106A, display 108A, display 108B, data obscuring system 130B, data obscuring system 130C, data obscuring system 130D, operating system 192, data obscuring application 194, application(s) 196, and application(s) 198.


In an example, a data obscuring system 130B, 130C, 130D uses camera 106A to perform eye tracking and to collect eye data from a user. For example, camera 106A may track a user's eye movement, collect eye position data from the user, collect eye movement data from the user, and/or collect biometric eye data from the user of client machine 102N. In an example, biometric eye data may include optical data that identifies the user. For example, biometric eye data may include iris scan data, pupil scan data, retinal scan data, etc.


In an example, a data obscuring system 130B, 130C, 130D automatically adjusts one or more displays 108A, 108B based on eye data, facial data, head data and/or other data collected from a user. For example, a data obscuring system 130B, 130C, 130D may present data on one or more displays 108A, 108B to a user when it is determined that the eyesight of the user is on one or more of the respective displays 108A, 108B. Eyesight of a user generally refers to a user's visual orientation, visual perspective, direction of sight, gaze, or line of sight as it pertains to a hypothetical or real target, such as a display 108A, 108B. A user's eyesight may be determined by analyzing eye data and/or other types of data collected from a user.


In an example, a data obscuring system 130B, 130C, 130D automatically adjusts one or more displays 108A, 108B to obscure data based on eye data collected from a user. For example, a data obscuring system 130B, 130C, 130D may obscure data presented on one or more displays 108A, 108B when it is determined that eyesight of the user is off a respective display 108A, 108B.


In an example, a data obscuring system 130B, 130C, 130D presents data on a first display 108A and obscures data on a second display 108B in a dual display configuration when the user looks at the first display and not the second display. Data obscuring system 130B, 130C, 130D then automatically obscures presentation of the data on the first display 108A and removes obscuring of the data on the second display 108B when it is determined that the user's eyesight has moved from the first display 108A to the second display 108B in the dual monitor configuration.


Obscuring generally refers to one or more actions that remove data from being presented on a display 108A, 108B and/or render data presented on a display 108A, 108B as unreadable or indecipherable. For example, obscuring data may be performed by minimizing an application, closing an application, hiding data, moving an application to the background, redacting data, blurring all or part of an application or display, running a screensaver, displaying a protective image over the data, turning off a display, and/or distorting or rendering material presented on a display as unidentifiable or not viewable.
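A sketch of dispatching among these obscuring actions appears below. The WindowSystem interface is a hypothetical placeholder for platform-specific window and display APIs; a real implementation would call the operating system's own facilities.

    type ObscureAction =
      | "minimize" | "hide" | "background" | "redact"
      | "blur" | "screensaver" | "coverImage" | "displayOff";

    interface WindowSystem {
      minimize(windowId: number): void;
      hide(windowId: number): void;
      sendToBackground(windowId: number): void;
      blur(windowId: number, radiusPx: number): void;
      startScreensaver(): void;
      showCoverImage(displayId: number): void;
      powerOffDisplay(displayId: number): void;
    }

    function obscure(
      ws: WindowSystem, action: ObscureAction, windowId: number, displayId: number,
    ): void {
      switch (action) {
        case "minimize":    ws.minimize(windowId); break;
        case "hide":        ws.hide(windowId); break;
        case "background":  ws.sendToBackground(windowId); break;
        case "blur":        ws.blur(windowId, 12); break;
        case "screensaver": ws.startScreensaver(); break;
        case "coverImage":  ws.showCoverImage(displayId); break;
        case "displayOff":  ws.powerOffDisplay(displayId); break;
        case "redact":      ws.blur(windowId, 24); break; // stand-in for true redaction
      }
    }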


In an example, a data obscuring system 130B is provided as part of an operating system 192. For example, a data obscuring system 130B provided with operating system 192 may allow a user or computer system administrator to select or predetermine which of one or more software applications 196, 198 running on the operating system 192 to obscure when eyesight of the user moves off a display 108A, 108B.


For example, a software application 196, 198 developer and/or a computer system administrator may register or identify a software application 196, 198 to be obscured. In addition, a user may register or tag application 196, 198 windows, uniquely identified by application window IDs, to be obscured by a data obscuring system 130B, 130C. In some examples, a user may tag a software application 196, 198 or window using a keyboard shortcut. For example, one type of keyboard shortcut may tag a software application 196, 198 or window (e.g., a web browser window) to be obscured for the remainder of a session. Another type of keyboard shortcut may allow a user to select obscuring of a software application 196, 198 or window for current and future sessions until otherwise indicated.
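The tagging behavior described above might be kept in a small registry keyed by application window ID, with two scopes mirroring the two keyboard shortcuts: "session" tags last for the current session only, while "persistent" tags carry into future sessions. The names and storage shape in this sketch are assumptions for illustration.

    type TagScope = "session" | "persistent";

    class ObscureRegistry {
      private tags = new Map<number, TagScope>(); // windowId -> scope

      tag(windowId: number, scope: TagScope): void {
        this.tags.set(windowId, scope);
      }

      untag(windowId: number): void {
        this.tags.delete(windowId);
      }

      shouldObscure(windowId: number): boolean {
        return this.tags.has(windowId);
      }

      // At session end, session-scoped tags are dropped while
      // persistent tags survive (e.g., after being written to disk).
      endSession(): void {
        for (const [id, scope] of this.tags) {
          if (scope === "session") this.tags.delete(id);
        }
      }
    }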


In an example, a data obscuring system 130C is provided as part of data obscuring application 194. For example, a data obscuring application 194 installed and running on an operating system 192 may allow a user or computer system administrator to select or predetermine which of one or more software applications 196, 198 to obscure when eyesight of the user is off a display 108A, 108B. In one example, a data obscuring application 194 provides an application programming interface (API) that allows other software applications 196, 198 to perform obscuring of displayed data.
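One plausible shape for such an API, sketched with hypothetical names beyond the disclosure's general idea of an obscuring API, is a registry of obscure/reveal callbacks that the data obscuring application invokes as the user's gaze state changes.

    interface ObscuringApi {
      register(appId: string, onObscure: () => void, onReveal: () => void): void;
      unregister(appId: string): void;
    }

    class SimpleObscuringApi implements ObscuringApi {
      private handlers = new Map<string, { onObscure: () => void; onReveal: () => void }>();

      register(appId: string, onObscure: () => void, onReveal: () => void): void {
        this.handlers.set(appId, { onObscure, onReveal });
      }

      unregister(appId: string): void {
        this.handlers.delete(appId);
      }

      // Invoked by the sight analyzer whenever the gaze state changes;
      // each registered application applies its own obscuring effect.
      notify(gazeOnDisplay: boolean): void {
        for (const h of this.handlers.values()) {
          if (gazeOnDisplay) {
            h.onReveal();
          } else {
            h.onObscure();
          }
        }
      }
    }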


In an example, a data obscuring system 130D is provided as part of one or more software application(s) 196. For example, a software developer may include data obscuring system 130D functionality within a software application 196 to allow a user or a computer system administrator to select or predetermine whether an application 196 or specific sensitive/private data within the application 196 is to be obscured when eyesight of the user is off the display 108A, 108B.


In some examples, data obscuring system 130D functionality is predetermined and not configurable by a user or computer system administrator. For example, a software application 196 may be designed by an application developer to provide configurable or non-configurable obscuring for various application data and graphical user interface controls (e.g., labels, buttons, menus, lists, drop downs, windows, tabs, dialogues, panels, tables, graphics, etc.).


In an example, various software application(s) 198 do not have or use a data obscuring system (e.g., data obscuring system 130). In some examples, software application(s) 198 without a data obscuring system may be obscured or have application 198 data obscured, for example, by an operating system 192 data obscuring system 130B and/or a data obscuring system 130C of a data obscuring application 194.


Server machines 110 may include one or more web server(s) 120. Web server(s) 120 may provide text, audio, and video images from data store(s) 116 to client machines 102A-102N. Web server(s) 120 also may provide web-based application services, business logic, and/or updates to client machines 102A-102N. Client machines 102A-102N may locate, access, and consume various forms of content and services from web server(s) 120 using applications, such as a web browser, web servers, application servers, computer programs, etc. Web server(s) 120 also may receive text, audio, video, and image content from clients 102A-102N, which may be saved in data store(s) 116 for purposes that may include preservation and distribution of content.


In an example, a web server 120 is coupled to one or more application servers (not shown) that provide application 114 services, data, and/or APIs to client machines 102A-102N. In one example, web server(s) 120 may provide clients 102A-102N with access to one or more application 114 services associated with a server-based data obscuring system 130A. Such functionality also may be provided, for example, as part of one or more different web applications, standalone applications, systems, plug-ins, web browser extensions, and application programming interfaces (APIs). In some examples, plug-ins and extensions also may be referred to, individually or collectively, as “add-ons.”


In an example, some client machines 102A-102N may include applications associated with a service provided by server machine 110. In one example, one or more device types (e.g., smart phones, smart televisions, tablet computers, wearable devices, smart home computer systems, etc.) may use applications to access content provided by, to issue commands to, and/or to receive content from server machine(s) 110 without visiting or using web pages.


In an example, functions performed by server machine(s) 110 and/or web server(s) 120 also may be performed by the client machines 102A-102N, in whole or in part. In addition, the functionality attributed to a particular component may be performed by different or multiple components operating together. Further, server machine(s) 110 may be accessed as a service provided to other systems or devices via appropriate application programming interfaces (APIs), and thus are not limited to use with websites.



FIG. 2 is a flow diagram for automatically adjusting a display to obscure application data, according to an example of the present disclosure. The method 200 may be performed by processing logic that may comprise hardware (circuitry, dedicated logic, programmable logic, microcode, etc.), software (such as instructions run on a general purpose computer system, dedicated machine, or processing device), firmware, or a combination thereof.


Method 200 begins at block 202 when data receiver module 150 of data obscuring system 130 receives eye data from a user. In an example, data receiver module 150 receives eye data from one or more cameras 106 associated with an eye tracking device. For example, data receiver module 150 may receive eye position, eye movement, and other eye tracking data from an eye tracking device. Data receiver module 150 also may receive biometric eye data, such as iris scan data, pupil scan data, and retinal scan data, that identifies a user. In some examples, data receiver module 150 receives head position data, facial position data, and/or facial recognition data collected from a user.


At block 204, sight analyzer module 170 of data obscuring system 130 analyzes the eye data received from the user to determine whether eyesight of the user is on a display 108. In an example, sight analyzer module 170 determines whether eyesight of a user is directed towards or aligned with a display 108 based on a calculated line of sight for the user in reference to the position and size of the display 108. Sight analyzer module 170 also may determine whether eyesight of a user is on an application presented within a display 108 area based on characteristics of the display 108 area and positioning of the application within the display area.


In an example, sight analyzer module 170 detects a change in eyesight of a user. For example, sight analyzer module 170 may detect a change in eyesight of a user based on updated eye data received from the user. In one example, sight analyzer module 170 performs eye data and eyesight analysis to determine and/or re-determine whether the eyesight of the user is on one or more displays 108A, 108B.


At block 206, display adjuster module 180 of data obscuring system 130 obscures data on the display in response to determining that the eyesight of the user is off the display 108. In an example, display adjuster module 180 obscures some or all of the data presented on a display 108 in response to detecting that eyesight of the user is not focused on the display 108.


In an example, display adjuster module 180 obscures presentation of data on a display 108 by obscuring an entire display 108. Display adjuster module 180 also may obscure one or more applications 196, 198 presented on the display 108 and/or one or more areas of data presented on the display, for example, the areas comprising sensitive, private, and/or confidential data provided by applications 196, 198. In some examples, display adjuster module 180 obscures data presented on a display in real-time or substantially in real-time (e.g., under a second, under a tenth of a second, under one hundredth of a second, etc.).


In an example, display adjuster module 180 obscures web browser software applications and data presented on a display 108 in response to determining that eyesight of a user is off the display 108. For example, a web browser may include its own data obscuring system 130D or use a data obscuring application 194 available on a client machine 102A to obscure web browser windows, tabs, data, etc.


In an example, a web browser plug-in obscures web browser applications and data. For example, a web browser plug-in may receive data that it uses to determine whether to obscure web browser applications and data. In one example, an application developer may code a web application using custom tags and/or scripts that mark web application data as sensitive, indicating to a web browser or a web browser plug-in that the web application data is to be obscured. In some examples, a web browser and/or a web browser plug-in also may dynamically inject code into a webpage, for example during rendering of the webpage, to perform the obscuring (e.g., custom HTML, dynamic HTML, JavaScript, etc.).
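A content-script sketch of this approach follows, assuming sensitive elements are marked with a hypothetical data-sensitive attribute and blurred via dynamically injected CSS; the DOM APIs used are standard browser APIs, while the attribute name and wiring are illustrative assumptions.

    // Runs as a content script. Elements the page author marked with a
    // (hypothetical) data-sensitive attribute are blurred by injecting
    // a style rule; clearing the rule restores the content.
    const STYLE_ID = "gaze-obscure-style";

    function setObscured(obscured: boolean): void {
      let style = document.getElementById(STYLE_ID) as HTMLStyleElement | null;
      if (!style) {
        style = document.createElement("style");
        style.id = STYLE_ID;
        document.head.appendChild(style);
      }
      style.textContent = obscured
        ? "[data-sensitive] { filter: blur(10px); user-select: none; }"
        : "";
    }

An eye tracking component (not shown) would call setObscured(true) when the user's gaze leaves the display and setObscured(false) when it returns.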


In some examples, data obscuring system 130 uses other types of detected information in addition to eye data and eyesight analysis. For example, data obscuring system 130 may present data on a display to a user without performing obscuring when detecting the sole presence of the user, when detecting presence of trusted users, and/or when detecting absence of untrusted users in an area, space (e.g., room), or within viewing distance of a display 108. Thus, a display 108 or content presented on a display 108 may remain visible (no obscuring) when there is a low or nonexistent risk of an unwanted (e.g., unknown, untrusted, unauthorized) user viewing a display 108 or presented data.


In some examples, data obscuring system 130 may perform obscuring when detecting that an individual (e.g., a known individual, an unknown individual, a trusted individual, an untrusted individual, etc.) other than a user has entered a location, is about to enter a location, is within viewing distance of a display 108, or is about to be within viewing distance of a display 108. In some examples, data obscuring system 130 determines whether a detected individual other than a user is authorized to view one or more portions of content presented on a display 108 and may obscure one or more parts of the content that the detected individual is unauthorized to view (e.g., sensitive data, private data, classified data, etc.). On the other hand, when the individual is authorized to view the content (e.g., a co-worker or boss viewing a presentation), the content may remain visible (i.e., with no obscuring applied, “unobscured”) even when eyesight of the user is detected as being off a display 108 (e.g., when the user looks away, at another display, or at paper notes).
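Combining gaze state with bystander detection as just described might be decided as in the following sketch; the Viewer and ContentRegion types and the trust model are illustrative assumptions rather than structures defined by the disclosure.

    interface Viewer { id: string; trusted: boolean; withinViewingDistance: boolean; }
    interface ContentRegion { regionId: string; sensitive: boolean; }

    // Returns the IDs of content regions to obscure, given the user's
    // gaze state and the set of detected viewers.
    function regionsToObscure(
      userGazeOnDisplay: boolean,
      viewers: Viewer[],
      regions: ContentRegion[],
    ): string[] {
      const untrustedNearby = viewers.some(
        (v) => !v.trusted && v.withinViewingDistance,
      );

      // Low risk: only the user and/or trusted viewers are present, so
      // nothing is obscured, even if the user's gaze leaves the display.
      if (!untrustedNearby) return [];

      // An untrusted viewer is (or is about to be) within viewing
      // distance: obscure sensitive regions, and obscure everything
      // else as well if the user is not watching the display either.
      return regions
        .filter((r) => r.sensitive || !userGazeOnDisplay)
        .map((r) => r.regionId);
    }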



FIG. 3 is a flow diagram for automatically adjusting multiple displays to obscure application data, according to an example of the present disclosure. The method 300 may be performed by processing logic that may comprise hardware (circuitry, dedicated logic, programmable logic, microcode, etc.), software (such as instructions run on a general purpose computer system, dedicated machine, or processing device), firmware, or a combination thereof.


Method 300 begins at block 302 when data receiver module 150 of data obscuring system 130 receives eye data collected from a user. In an example, data receiver module 150 receives eye data from one or more cameras 106. For example, data receiver module 150 may receive eye position data, eye movement data, and other eye tracking data from an eye tracking device or other type of computing device. Data receiver module 150 also may receive biometric eye data (e.g., iris data, pupil data, retinal data) identifying a user. In some examples, data receiver module 150 receives head position data, facial position data, and/or facial recognition data collected from a user.


At block 304, sight analyzer module 170 of data obscuring system 130 determines that eyesight of a user is on a first display 108A. For example, sight analyzer module 170 may use one or more of eye data, head data, facial data, or other types of data collected from a user to determine that the user's focus is on the first display 108A. In one example, sight analyzer module 170 determines that eyesight of the user is on the first display 108A based on eye data of the user in relation to a size and position of the first display 108A.


At block 306, display adjuster module 180 of data obscuring system 130 presents data on the first display 108A to a user. In an example, display adjuster module 180 allows the regular, non-obscured presentation of data on the first display 108A when the eyesight of the user is determined to be directed towards or focused on the first display 108A. In one example, display adjuster module 180 restores presentation of data on the first display 108A by removing obscuring previously applied to the first display 108A. For example, display adjuster module 180 may remove obscuring from the first display 108A when the eyesight of the user returns to the first display 108A after being directed towards another location.


At block 308, sight analyzer module 170 of data obscuring system 130 determines that eyesight of the user is off a second display 108B. In an example, sight analyzer module 170 determines that the direction or focus of the user's eyesight is at a location that is not on the second display 108B. For example, sight analyzer module 170 may determine that the eyesight of the user is off the second display 108B based on eye data, head data, and/or facial data collected from the user in relation to size and position data associated with the second display 108B.


At block 310, display adjuster module 180 of data obscuring system 130 obscures data on the second display 108B in response to determining that the eyesight of the user is off the second display 108B. In an example, display adjuster module 180 adjusts presentation of the data on the second display 108B by obscuring the data.


For example, display adjuster module 180 may remove the data from the second display 108B or alter presentation of the data on the second display 108B by making the data unreadable or indecipherable. In some examples, data obscuring system 130 may obscure data by minimizing an application, closing an application, hiding data, moving an application to the background, redacting data, blurring all or part of an application or display, running a screensaver, displaying a protective image on a display over the data, turning off a display, and/or distorting or rendering material presented on a display as unidentifiable or not viewable.


At block 312, sight analyzer module 170 of data obscuring system 130 detects a change in the eye data of the user. In an example, sight analyzer module 170 compares updated eye data from a user to previous eye data received from the user to detect the change. Sight analyzer module 170 also may detect a change in one or more other types of user data, such as head data, facial data, etc. In one example, sight analyzer module 170 calculates or recalculates a user's line of sight in response to detecting a change in eye data, head data, and/or facial data of a user.


At block 314, display adjuster module 180 of data obscuring system 130 automatically obscures the data on the first display 108A and removes the obscuring from the second display 108B in response to determining that the eyesight of the user transitioned from the first display 108A to the second display 108B. In an example, display adjuster module 180 dynamically applies obscuring to and removes the obscuring from one or more displays 108A, 108B based on whether the user's eyesight is directed toward a respective display 108A, 108B. In one example, display adjuster module 180 obscures the presentation of data on both the first display 108A and the second display 108B when the user's eyesight is off both displays.
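The per-display behavior of method 300 can be summarized in a short sketch that obscures every display the user's gaze is not on, and therefore both displays when the gaze is off both. The types below are illustrative and assume gaze points in a shared desktop coordinate space.

    interface Bounds { left: number; top: number; width: number; height: number; }
    interface DisplayState { id: string; bounds: Bounds; obscured: boolean; }
    interface Gaze { x: number; y: number; }

    // Obscures every display the user's gaze is not on; when the gaze
    // is off both displays (gaze is null or outside all bounds), both
    // displays end up obscured, matching the behavior described above.
    function updateDisplays(gaze: Gaze | null, displays: DisplayState[]): void {
      for (const d of displays) {
        const onThisDisplay =
          gaze !== null &&
          gaze.x >= d.bounds.left &&
          gaze.x <= d.bounds.left + d.bounds.width &&
          gaze.y >= d.bounds.top &&
          gaze.y <= d.bounds.top + d.bounds.height;
        d.obscured = !onThisDisplay;
      }
    }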



FIG. 4 illustrates a diagram of a machine in the exemplary form of a computer system 400, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In other examples, the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet. The machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a wearable computing device, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The exemplary computer system 400 includes a processing device (processor) 402, a main memory 404 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), or Rambus DRAM (RDRAM), etc.), a static memory 406 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 418, which communicate with each other via a bus 430.


Processor 402 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processor 402 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processor 402 also may be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processor 402 is configured to execute instructions 422 for performing the operations and steps discussed herein.


The computer system 400 also may include a network interface device 408. The computer system 400 may further include a video display unit 410 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 412 (e.g., a keyboard), a cursor control device 414 (e.g., a mouse), and a signal generation device 416 (e.g., a speaker).


The data storage device 418 may include a computer-readable storage medium 428 on which is stored one or more sets of instructions 422 (e.g., software computer instructions) embodying any one or more of the methodologies or functions described herein. The instructions 422 also may reside, completely or at least partially, within the main memory 404 and/or within the processor 402 during execution thereof by the computer system 400, the main memory 404 and the processor 402 also constituting computer-readable storage media. The instructions 422 may be transmitted or received over a network 420 via the network interface device 408.


In one example, the instructions 422 include instructions for one or more modules of a data obscuring system (e.g., data obscuring system 130 of FIG. 1) and/or a software library containing methods that call an automated data obscuring system 130. While the computer-readable storage medium 428 (machine-readable storage medium) is shown as an example to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” also may include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.


Numerous details are set forth in the foregoing description. However, it will be apparent to one of ordinary skill in the art having the benefit of this disclosure that the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present disclosure.


Some portions of the detailed description have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. Here, an algorithm is generally conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.


It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “computing,” “capturing,” “determining,” “obscuring,” “providing,” “receiving,” “processing,” or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (e.g., electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.


Certain examples of the present disclosure also relate to an apparatus for performing the operations herein. This apparatus may be constructed for the intended purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.


It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other examples will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure therefore should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims
  • 1. (canceled)
  • 2. A system, comprising: a non-transitory memory; and one or more hardware processors coupled to the non-transitory memory and configured to read instructions from the non-transitory memory to cause the system to perform operations comprising: receiving presentation data for presenting content on a display; obtaining eye data associated with a user viewing the display; determining a first threat level based on analyzing the eye data associated with the user; altering the presentation data to obscure at least a portion of the content based on the first threat level before rendering the altered presentation data on the display; and rendering the altered presentation data on the display.
  • 3. The system of claim 2, wherein the presentation data comprises programming code.
  • 4. The system of claim 2, wherein the presentation data comprises web data.
  • 5. The system of claim 2, wherein analyzing the eye data comprises: determining a gaze of the user is off the display based on the eye data, wherein the first threat level is determined based on the determining the gaze of the user is off the display.
  • 6. The system of claim 2, wherein the first threat level indicates that the user is untrusted.
  • 7. The system of claim 6, wherein analyzing the eye data comprises: deriving an eye movement pattern based on the eye data; and determining the user is untrusted based on the derived eye movement pattern.
  • 8. The system of claim 7, wherein analyzing the eye data further comprises comparing the eye movement pattern against a set of trusted eye movement patterns.
  • 9. The system of claim 2, wherein the operations further comprise: receiving updated eye data associated with the user; determining a second threat level based on analyzing the updated eye data associated with the user; reverting to the presentation data based on the second threat level; and rendering the reverted presentation data on the display.
  • 10. The system of claim 2, wherein the operations further comprise: obtaining facial data associated with the user viewing the display, wherein the first threat level is determined further based on the obtained facial data.
  • 11. The system of claim 2, further comprising a camera configured to capture the eye data.
  • 12. The system of claim 2, wherein the display is a first display, wherein the presentation data is first presentation data for presenting first content on the first display, and wherein the operations further comprise: receiving second presentation data for presenting second content on a second display; determining a gaze of the user is on the second display based on analyzing the eye data; and rendering the second presentation data for display on the second display.
  • 13. The system of claim 12, wherein the operations further comprise: receiving updated eye data; in response to determining the gaze of the user has moved from the second display to the first display based on the updated eye data, (i) altering the second presentation data to obfuscate at least a portion of the second content and (ii) reverting to the first presentation data; rendering the altered second presentation data for display on the second display; and rendering the reverted first presentation data for display on the first display.
  • 14. A method, comprising: receiving, by one or more hardware processors, presentation data for presenting content on a display; obtaining, by the one or more hardware processors, eye data associated with a user viewing the display; analyzing, by the one or more hardware processors, the eye data to determine a gaze of the user is off the display; in response to determining the gaze of the user is off the display, altering, by the one or more hardware processors, the presentation data to obscure at least a portion of the content before rendering the altered presentation data on the display; and rendering, by the one or more hardware processors, the altered presentation data on the display.
  • 15. The method of claim 14, wherein the presentation data comprises programming code.
  • 16. The method of claim 14, wherein the presentation data comprises web data.
  • 17. The method of claim 14, further comprising: deriving an eye movement pattern based on the eye data; determining the user is untrusted based on the derived eye movement pattern; and maintaining the rendering of the altered presentation data after determining the gaze of the user has moved to the display.
  • 18. The method of claim 17, wherein determining the user is untrusted comprises comparing the derived eye movement pattern against a set of trusted eye movement patterns.
  • 19. The method of claim 14, further comprising: receiving updated eye data associated with the user; analyzing the updated eye data to determine the gaze of the user has moved to the display; reverting to the presentation data based on determining the gaze of the user has moved to the display; and rendering the reverted presentation data on the display.
  • 20. A non-transitory machine-readable medium having stored thereon machine-readable instructions executable to cause a machine to perform operations comprising: receiving presentation data for presenting content on a display; obtaining eye data associated with a user viewing the display; determining a first threat level based on analyzing the eye data associated with the user; altering the presentation data to obscure at least a portion of the content based on the first threat level before rendering the altered presentation data on the display; and rendering the altered presentation data on the display.
  • 21. The non-transitory machine-readable medium of claim 20, wherein the display is a first display, wherein the presentation data is first presentation data for presenting first content on the first display, and wherein the operations further comprise: receiving second presentation data for presenting second content on a second display; determining the gaze of the user is on the second display based on analyzing the eye data; and rendering the second presentation data for display on the second display.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of and claims priority to U.S. patent application Ser. No. 14/584,335 filed Dec. 29, 2014, which is incorporated herein by reference in its entirety.

Continuations (1)
  Relation   Number     Date       Country
  Parent     14584335   Dec 2014   US
  Child      15889160              US