Data stored on electronic devices in the consumer, commercial, and industrial sectors is often associated with varying levels of confidentiality and sensitivity. A user accessing or inputting such private data may need to display the data on an electronic display, such as that of a desktop computer, laptop computer, or mobile device, while maintaining the required level of privacy.
Various embodiments described below provide for improving display privacy and power management by adjusting the angles, brightness, and focus areas of active and inactive screen areas or windows (herein either “screen areas” or “windows”) on an electronic display, and/or powering down all or parts of the display. The changes to the angles, brightness, focus areas, and power states may be determined based on, for example, the eye gaze of an authorized or “primary” user and a privacy or power management level based on the authorized user's location, the location of an unauthorized or “secondary” user, or the absence of a gaze toward the display from the primary user.
Generally, a user of an electronic device such as a desktop computer, laptop computer, tablet, mobile device, retail point-of-sale device, or other device (hereinafter “device”) may require a heightened level of privacy when inputting or accessing certain information. Privacy screens that are physically applied to the device obscure the display at all times, making it difficult to read, and unnecessarily obscure the display at times when a heightened level of privacy is not required, such as when the user is alone or in a private location.
For example, a user in an office environment may have several windows or screen areas on a display at any given time. In the office environment, the user may not have a need to obscure any of the windows, and may have a “privacy mode” turned off. However, if the user were in another environment, such as on a plane, train, or generally in a public space, the user may wish to obscure certain parts of the screen, and in particular the parts of the screen that the user is not looking at, as determined by the user's eye gaze. In some examples, the inactive parts of the screen may be obscured by changing the angles, brightness, and focus areas of windows or screen areas, thereby providing for increased privacy with minimal negative effect on the usability or workflow of the user.
As another example, in addition to having a privacy mode turned on or off, the user may wish to have varying levels of privacy modes based on location. In the office example, the user may wish to slightly adjust the angles and brightness of screen areas if a secondary user is detected near the display, e.g., by 20%, while in a mobile environment, the user may wish for the adjustments to the angles and brightness to be more pronounced, e.g., by 40%, as users in a mobile environment are less likely to be trusted or authorized users. In some examples, the percentages may be adjusted relative to ambient lighting in a room or environment.
Moreover, in some examples, the primary user may wish to adjust the angles, brightness, and focus areas of the screen based on the location or distance of a secondary user. For example, if a secondary user is directly over the primary user's shoulder, the primary user may wish to apply the 40% adjustment examples discussed above, while if the secondary user is several feet away, the primary user may wish for lower adjustments, e.g., 20%, to be applied, with the adjustments dynamically changing based on the location of secondary users.
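The graded adjustments described above may be sketched as a short routine. This is an illustrative sketch only: the environment labels, the 20% and 40% levels, and the 1.5 m threshold standing in for “several feet away” are assumptions drawn from the examples, not required values.

```python
from typing import Optional

def privacy_adjustment(environment: str,
                       secondary_user_distance_m: Optional[float]) -> float:
    """Return a fractional adjustment (0.0-1.0) to angle/brightness."""
    if secondary_user_distance_m is None:
        return 0.0  # no secondary user detected: privacy mode effectively off
    # Mobile/public environments warrant stronger adjustments than an office.
    base = {"office": 0.20, "mobile": 0.40}.get(environment, 0.40)
    # A distant onlooker warrants a milder adjustment than one directly
    # over the primary user's shoulder.
    if secondary_user_distance_m > 1.5:
        base /= 2
    return base
```

A caller could then scale the current brightness by `1 - privacy_adjustment(...)`, recomputing the value as the secondary user's location changes.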
In other examples, for either privacy or power management reasons, the brightness levels for active and inactive screen areas may adjust based on whether the primary user's eye gaze is present or absent.
In some examples, sensor 106 may detect the authorized user's eye gaze and determine which window is active, i.e., which window or screen area the user is looking at on the screen. In such examples, the alterations to angle, brightness, and focus area may be less pronounced for the active window.
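Assuming the sensor reports the eye gaze as (x, y) screen coordinates, determining which window is active can be sketched as a simple hit test; the window geometry representation used here is a hypothetical one, not part of the description above.

```python
def active_window(gaze_xy, windows):
    """Return the id of the window containing the gaze point, or None.

    `windows` maps a window id to (left, top, width, height) in screen
    coordinates; a gaze falling outside all windows yields None.
    """
    gx, gy = gaze_xy
    for win_id, (left, top, width, height) in windows.items():
        if left <= gx < left + width and top <= gy < top + height:
            return win_id
    return None  # gaze is outside all windows (or absent)
```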
The angle of window shift may also be relative to the location or distance of the secondary user.
In various examples, the location of display 104, or of a device housing display 104 such as a laptop or mobile device, may be determined either by manual user input or by automated detection, such as determining the user's location based on GPS, a connected WiFi network, a token, or another device or method.
In block 504, a user or users may be detected by a sensor, camera, or other component on a display, such as sensor 106 described above. In block 506, the primary user and primary user eye gaze may be detected.
In block 508, the active screen area and inactive screen area may be determined based on the primary user's eye gaze.
In block 510, a first angle shift for the active screen area and a second angle shift for the inactive screen area may be determined based on the display location. For example, in a home environment, an angle of 10 degrees may be calculated for active windows, and an angle of 20 degrees may be calculated for inactive windows. In contrast, in a public environment, an angle of 20 degrees may be calculated for active windows, and an angle of 30 degrees may be calculated for inactive windows.
In block 512, a first brightness level for the active screen area and a second brightness level for the inactive screen area may be determined based on the display location. For example, in a home environment, a brightness reduction of 15 percent may be calculated for active windows, and a brightness reduction of 30 percent may be calculated for inactive windows. In contrast, in a public environment, a brightness reduction of 20 percent may be calculated for active windows, and a brightness reduction of 40 percent may be calculated for inactive windows.
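The example values of blocks 510 and 512 can be collected into a single lookup keyed by display location. The location labels and the fallback to the more protective profile are assumptions for illustration.

```python
# (active angle deg, inactive angle deg, active dim %, inactive dim %)
LOCATION_SETTINGS = {
    "home":   (10, 20, 15, 30),
    "public": (20, 30, 20, 40),
}

def screen_area_settings(location: str):
    """Return the angle shifts and brightness reductions for a location."""
    # Unknown locations fall back to the most protective profile.
    return LOCATION_SETTINGS.get(location, LOCATION_SETTINGS["public"])
```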
In block 514, a focus area or boundaries for a focus area may be calculated, such as in the example of window 110C.
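The description does not fix how the focus-area boundaries are computed; one plausible sketch is a fixed-size rectangle centered on the gaze point and clamped to the window's bounds. The function name, the tuple layout, and the half-width/half-height defaults are all assumptions.

```python
def focus_area(gaze_xy, window, half_w=200, half_h=150):
    """Return (left, top, right, bottom) of a focus rectangle.

    `window` is (left, top, right, bottom) in screen coordinates; the
    rectangle is centered on the gaze point and clamped to the window.
    """
    gx, gy = gaze_xy
    wl, wt, wr, wb = window
    return (max(wl, gx - half_w), max(wt, gy - half_h),
            min(wr, gx + half_w), min(wb, gy + half_h))
```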
In block 516, the first angle, first brightness level, and focus area boundaries may be applied to the active screen area, and in block 518, the second angle and second brightness level may be applied to inactive screen areas. The application of display changes may be carried out through, e.g., instructions from a processor to a display driver.
In block 602, a user or users may be detected by a sensor, camera, or other component on a display, such as sensor 106 described above. In block 604, the primary user and primary user eye gaze may be detected.
In block 606, a secondary user or users may be determined and a location of the secondary user or a distance between the secondary user and display 104 may be calculated.
In block 608, the active screen area and inactive screen area may be determined based on the primary user's eye gaze.
In block 610, a first angle shift for the active screen area and a second angle shift for the inactive screen area may be determined based on location or distance of the secondary user. For example, if the secondary user is several feet away from the primary user, an angle of 10 degrees may be calculated for active windows, and an angle of 20 degrees may be calculated for inactive windows. In contrast, if the secondary user is over the shoulder of the primary user, an angle of 20 degrees may be calculated for active windows, and an angle of 30 degrees may be calculated for inactive windows.
In another example, if the secondary user is located to the left of the primary user, the angle shift may be to the right, to further reduce the likelihood that the secondary user falls within a usable field of view of the display. In contrast, if the secondary user is to the right of the primary user, the angle shift may be to the left.
In block 612, a first brightness level for the active screen area and a second brightness level for the inactive screen area may be determined based on the location or distance of the secondary user to the display 104. For example, if the secondary user is several feet away from the primary user, a brightness reduction of 15 percent may be calculated for active windows, and a brightness reduction of 30 percent may be calculated for inactive windows. In contrast, if the secondary user is over the shoulder of the primary user, a brightness reduction of 20 percent may be calculated for active windows, and a brightness reduction of 40 percent may be calculated for inactive windows.
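Blocks 610 and 612, together with the directional shift described above, might be sketched as follows; the 0.5 m “over the shoulder” cutoff and the shape of the returned tuples are illustrative assumptions.

```python
def secondary_user_response(distance_m: float, side: str):
    """Return (angle shifts, brightness reductions, shift direction).

    Angle shifts and brightness reductions are (active, inactive) pairs;
    the shift direction moves the usable field of view away from the
    secondary user's side ("left" or "right").
    """
    if distance_m < 0.5:           # directly over the shoulder
        angles, dims = (20, 30), (20, 40)
    else:                          # several feet away
        angles, dims = (10, 20), (15, 30)
    direction = "right" if side == "left" else "left"
    return angles, dims, direction
```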
In block 614, a focus area or boundaries for a focus area may be calculated, such as in the example of window 110C.
In block 616, the first angle, first brightness level, and focus area boundaries may be applied to the active screen area, and in block 618, the second angle and second brightness levels may be applied to inactive screen areas. The application of display changes may be carried out through, e.g., instructions from a processor to a display driver.
In block 708, as discussed above, an active screen area and inactive screen areas are determined based on the primary user's eye gaze.
In block 710, a power saving interval is fetched. As discussed above, for example, the display may be configured to shut off the entire display 5 seconds after the absence of a user eye gaze is detected. In another example, the display may be configured to shut off all areas of the display 10 seconds after the absence of a user eye gaze is detected except for the last active screen area or window, and to shut that last active screen area or window off 20 seconds later if the user's eye gaze is not restored to the screen.
In block 712, a decision is made as to whether the primary user's eye gaze is present. If the eye gaze of the primary user remains present, flow proceeds to block 718 and the display is not adjusted.
If the primary user's eye gaze is not present, flow proceeds to block 714 where an adjusted brightness level for the active screen area is calculated. In some examples, the active screen area may be kept at full brightness, or dimmed, or shut off, or some combination thereof based on a progression of intervals using the methods described above.
In block 716, the adjusted brightness level is applied to the active screen area and the inactive screen areas are turned off. Flow may return to block 702 or 712 such that the system continues to monitor for users and eye gazes.
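The power-down progression of blocks 710 through 716 can be sketched as a function of the time since the primary user's eye gaze was last detected. The 5, 10, and 20 second intervals follow the examples in block 710; the function signature and the boolean return pair are assumptions.

```python
def display_state(seconds_without_gaze: float, keep_last_active: bool = True):
    """Return (active_area_on, inactive_areas_on) after a loss of eye gaze.

    With keep_last_active=False the whole display shuts off after 5 s;
    otherwise inactive areas shut off after 10 s and the last active
    window 20 s later (30 s total), per the examples above.
    """
    if seconds_without_gaze <= 0:
        return (True, True)                    # gaze present: no adjustment
    if not keep_last_active:
        on = seconds_without_gaze < 5          # whole display off after 5 s
        return (on, on)
    if seconds_without_gaze < 10:
        return (True, True)
    if seconds_without_gaze < 30:              # last active window lingers
        return (True, False)
    return (False, False)
```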
In an example, device 800 comprises a processing resource such as a processor or CPU 802; a non-transitory computer-readable storage medium 804; a display 806; a memory 808; a camera or other sensor 810; and an ambient light sensor 812. In some examples, device 800 may also comprise a memory resource such as RAM, ROM, or Flash memory; a disk drive such as a hard disk drive or a solid-state disk drive; an operating system; and a network interface such as a Local Area Network (LAN) card, a wireless 802.11x LAN card, a 3G or 4G mobile WAN card, or a WiMax WAN card. Each of these components may be operatively coupled to a bus.
Some or all of the operations set forth in the figures may be contained as a utility, program, or subprogram in any desired computer readable storage medium, or embedded on hardware. The computer readable medium may be any suitable medium that participates in providing instructions to the processing resource 802 for execution. For example, the computer readable medium may be non-volatile media, such as an optical or a magnetic disk, or volatile media, such as memory. The computer readable medium may also store other machine-readable instructions, including instructions downloaded from a network or the internet.
In addition, the operations may be embodied by machine-readable instructions. For example, they may exist as machine-readable instructions in source code, object code, executable code, or other formats.
Device 800 may comprise, for example, a computer readable medium that may comprise instructions 814 to receive, from a sensor, detection data associated with at least one user of a display; determine a primary user and a primary user eye gaze; determine an active screen area and an inactive screen area based on the primary user eye gaze; fetch a power-saving interval; and in response to the absence of the primary user eye gaze, calculate an adjusted brightness level for the active screen area and apply the adjusted brightness level to the active screen area, and power off the inactive screen area when the power-saving interval is satisfied.
The computer-readable medium may also store an operating system such as Microsoft Windows, Mac OS, Unix, or Linux; network applications such as network interfaces and/or cloud interfaces; and a cloud broker service, monitoring tool, or metrics tool, for example. The operating system may be multi-user, multiprocessing, multitasking, and/or multithreading. The operating system may also perform basic tasks such as recognizing input from input devices, such as a keyboard or a keypad; sending output to a display; keeping track of files and directories on a medium; controlling peripheral devices, such as drives, printers, or image capture devices; and/or managing traffic on a bus. The network applications may include various components for establishing and maintaining network connections, such as machine-readable instructions for implementing communication protocols including, but not limited to, TCP/IP, HTTP, Ethernet, USB, and FireWire.
In certain examples, some or all of the processes performed herein may be integrated into the operating system. In certain examples, the processes may be at least partially implemented in digital electronic circuitry, in computer hardware, in machine readable instructions (such as firmware and/or software), or in any combination thereof.
The above discussion is meant to be illustrative of the principles and various embodiments of the present disclosure. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2015/013999 | 1/30/2015 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2016/122671 | 8/4/2016 | WO | A |
Number | Date | Country | |
---|---|---|---|
20170329399 A1 | Nov 2017 | US |