A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
The present disclosure relates to power saving techniques that can be used with foveated content, such as dynamically foveated content. Foveation refers to a technique in which some aspect of an image (e.g., an amount of detail, image quality, coloration, or brightness) is varied across displayed content based at least in part on a fixation point, such as a point or area within the content itself, a point or region of the content on which one or more eyes of a user are focused, or movement of the one or more eyes of the user. For example, the brightness level in various portions of the image can be varied depending on the fixation point. Indeed, in regions of the electronic display some distance beyond the fixation point, which are more likely to appear in a person's peripheral vision, the brightness may be lowered. In this way, foveation can reduce an amount of power used to display the content on the electronic display without being noticeable to the person viewing the electronic display.
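The brightness variation described above can be sketched in code. The following is an illustrative example, not part of the disclosure: the region coordinates, radii, and scale factors are hypothetical values chosen to show how brightness might fall off with distance from a fixation point.

```python
def brightness_scale(region_center, fixation_point, inner_radius=0.2, outer_radius=0.6):
    """Return a brightness multiplier in [0.5, 1.0] for a display region.

    Regions near the fixation point keep full brightness; regions far
    enough away to fall in peripheral vision are dimmed to save power.
    Coordinates are normalized display coordinates in [0, 1].
    """
    dx = region_center[0] - fixation_point[0]
    dy = region_center[1] - fixation_point[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= inner_radius:
        return 1.0  # foveal region: full brightness
    if distance >= outer_radius:
        return 0.5  # peripheral region: dimmed
    # Linear ramp between the inner and outer radii.
    t = (distance - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.5 * t
```

A region at the fixation point returns 1.0, while a region in the far periphery returns 0.5, reducing display power without being noticeable to the viewer.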
In static foveation, various areas of an electronic display having different brightness levels each have a fixed size and location on the electronic display for each frame of content displayed to the user. In dynamic foveation, the various areas at different brightness levels may change between two or more images based at least in part on the gaze of the viewer. For example, as the eyes of the user move across the electronic display from a top left corner to a bottom right corner, the high brightness level portion of the electronic display also moves from the top left corner to the bottom right corner of the display. For content that uses multiple images, such as videos and video games, the content may be presented to the viewer by displaying the images in rapid succession. The high brightness and lower brightness portions of the electronic display in which the content is displayed may change between frames.
For dynamic foveation, an eye tracking system is used to determine a focal point of the eyes of the user on the electronic display. That is, a continuous input from the eye tracking system is provided to a foveation system and used to determine the size and location of the high brightness level area on the electronic display. If the eye tracking system detects movement of the gaze of the user, the foveation system may cause display artifacts to become visible or perceptible to the user, which may negatively affect the experience of the user. The artifacts may include low luminance levels at the focal point of the eyes of the user, intermittent switching between high luminance levels and low luminance levels due to sudden movement of the foveated areas of the display, and flashing resulting from sudden luminance level changes at various areas of the display. Foveation errors (e.g., temporal flashing) on the electronic display may be visible to the user and may deteriorate the experience of the user looking at the electronic display.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings described below.
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
The electronic device 10 shown in
Although the image processing circuitry 30 is shown as a component within the processor core complex 12, the image processing circuitry 30 may represent any suitable hardware and/or software that may occur between the initial creation of the image data and its preparation for display on the electronic display 18. Thus, the image processing circuitry 30 may be located wholly or partly in the processor core complex 12, wholly or partly as a separate component between the processor core complex 12 and the electronic display 18, or wholly or partly as a component of the electronic display 18.
The various components of the electronic device 10 may include hardware elements (including circuitry), software elements (including machine-executable instructions stored on a tangible, non-transitory medium, such as the local memory 14 or the storage device 16), or a combination of both hardware and software elements. It should be noted that
The processor core complex 12 may perform a variety of operations of the electronic device 10, such as generating image data to be displayed on the electronic display 18 and performing dynamic foveation of the content to be displayed on the electronic display 18. The processor core complex 12 may include any suitable data processing circuitry to perform these operations, such as one or more microprocessors, one or more application-specific integrated circuits (ASICs), or one or more programmable logic devices (PLDs). In some cases, the processor core complex 12 may execute programs or instructions (e.g., an operating system or application) stored on a suitable storage apparatus, such as the local memory 14 and/or the storage device 16.
The memory 14 and the storage device 16 may also store data to be processed by the processor core complex 12. For example, the memory 14 and/or the storage device 16 may include random access memory (RAM), read only memory (ROM), rewritable non-volatile memory such as flash memory, hard drives, optical discs, or the like.
The electronic display 18 may be a self-emissive display, such as an organic light emitting diode (OLED) display, an LED display, or a μLED display, or it may be a liquid crystal display (LCD) illuminated by a backlight. In some embodiments, the electronic display 18 may include a touch screen, which may allow users to interact with a user interface of the electronic device 10. Additionally, the electronic display 18 may show foveated content.
The electronic display 18 may display various types of content. For example, the content may include a graphical user interface (GUI) for an operating system or an application interface, still images, video, or any combination thereof. The processor core complex 12 may supply or modify at least some of the content to be displayed.
The input structures 22 of the electronic device 10 may enable a user to interact with the electronic device 10 (e.g., pressing a button or icon to increase or decrease a volume level). The I/O interface 24 and the network interface 26 may enable the electronic device 10 to interface with various other electronic devices. The power source 29 may include any suitable source of power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter.
The network interface 26 may include, for example, interfaces for a personal area network (PAN), such as a Bluetooth network, a local area network (LAN) or wireless local area network (WLAN), such as an 802.11x Wi-Fi network, and/or a wide area network (WAN), such as a cellular network. The network interface 26 may also include interfaces for, for example, broadband fixed wireless access networks (WiMAX), mobile broadband wireless networks (mobile WiMAX), digital subscriber lines (e.g., ADSL, VDSL), digital video broadcasting-terrestrial (DVB-T) and its extension DVB Handheld (DVB-H), ultra-wideband (UWB), alternating current (AC) power lines, and so forth.
The eye tracker 32 may measure positions and movement of one or both eyes of a person viewing the electronic display 18 of the electronic device 10. For instance, the eye tracker 32 may be a camera that records the movement of a viewer's eye(s) as the viewer looks at the electronic display 18. However, any of several different techniques may be employed to track a viewer's eye movements. For example, different types of infrared/near infrared eye tracking techniques such as bright-pupil tracking and dark-pupil tracking may be used. In these types of eye tracking, infrared or near infrared light is reflected off of one or both of the eyes of the viewer to create corneal reflections.
A vector between the center of the pupil of the eye and the corneal reflections may be used to determine a point on the electronic display 18 at which the viewer is looking. Moreover, as discussed below, varying portions of the electronic display 18 may be used to show content in relatively higher and lower luminance level portions based at least in part on the point of the electronic display 18 at which the viewer is looking.
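The mapping from the pupil-to-reflection vector to a point on the display can be sketched as follows. This is a hypothetical simplification: real eye trackers apply a per-user calibration procedure, and the affine coefficients and offsets below are made-up stand-ins for that calibration.

```python
def gaze_point(pupil_center, corneal_reflection,
               calib=((800.0, 0.0), (0.0, 600.0)), offset=(400.0, 300.0)):
    """Estimate the on-screen gaze point (in pixels) from the vector between
    the pupil center and the corneal reflection, both in camera coordinates.

    The calib matrix and offset are illustrative placeholders for values
    that would be obtained from a calibration procedure.
    """
    vx = pupil_center[0] - corneal_reflection[0]
    vy = pupil_center[1] - corneal_reflection[1]
    x = calib[0][0] * vx + calib[0][1] * vy + offset[0]
    y = calib[1][0] * vx + calib[1][1] * vy + offset[1]
    return (x, y)
```

The resulting screen point can then be used to place the relatively higher luminance level portion of the display at the location the viewer is looking.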
As will be described in more detail herein, the image processing circuitry 30 may perform particular image processing adjustments to counteract artifacts that may be observed when the eye tracker 32 tracks eye movement during foveation. For example, foveated areas rendered on the electronic display 18 may be dynamically adjusted (e.g., by size and/or position).
As discussed above, the electronic device 10 may be a computer, a portable electronic device, a wearable electronic device, or other type of electronic device. Example computers may include generally portable computers (such as laptop, notebook, and tablet computers) as well as computers that are generally used in one place (such as conventional desktop computers, workstations, and/or servers). In certain embodiments, the electronic device 10 in the form of a computer may be a model of a MacBook®, MacBook® Pro, MacBook Air®, iMac®, Mac® mini, or Mac Pro® available from Apple Inc. of Cupertino, Calif.
By way of example, the electronic device 10 depicted in
The user input structures 22, in combination with the electronic display 18, may allow a user to control the handheld device 10B. For example, the input structures 22 may activate or deactivate the handheld device 10B, navigate a user interface to a home screen or a user-configurable application screen, and/or activate a voice-recognition feature of the handheld device 10B. Other input structures 22 may provide volume control, or toggle between vibrate and ring modes. The input structures 22 may also include a microphone to obtain a voice of the user for various voice-related features, and a speaker to enable audio playback and/or certain capabilities of the handheld device 10B. The input structures 22 may also include a headphone input to provide a connection to external speakers and/or headphones.
The electronic display 18 of the wearable electronic device 10E may be visible to a user when the electronic device 10E is worn by the user. Additionally, while the user is wearing the wearable electronic device 10E, an eye tracker (not shown) of the wearable electronic device 10E may track the movement of one or both of the eyes of the user. In some instances, the handheld device 10B discussed with respect to
The electronic display 18 of the electronic device 10 may show images or frames of content such as photographs, videos, and video games in a foveated manner. Foveation refers to a technique in which an amount of detail, resolution, image quality, or brightness is varied across an image based at least in part on a fixation point, such as a point or area within the image itself, a point or region of the image on which a viewer's eyes are focused, or based at least in part on the gaze movement of the viewer's eyes. More specifically, the brightness can be varied by using different luminance levels in various portions of an image. For instance, in a first portion of the electronic display 18, one luminance level may be used to display one portion of an image, while a lower or higher luminance level may be used for a second portion of the image on the electronic display 18. The second portion of the electronic display 18 may be in a different area of the display 18 than the first area or may be located within the first area.
In some embodiments, the change in brightness or luminance level may be a gradual (i.e., smooth) transition from a central portion having a high luminance level to a peripheral edge of the foveated area. That is, the foveated region may have a central portion with a high luminance level, and a luminance level of an outer portion of the foveated region may gradually decrease from an edge of the central portion to an edge of the outer portion.
To reiterate, the adjusted luminance levels of the areas 64, 66, and 68 are relative to the defined luminance levels of the areas 64, 66, and 68, respectively. The defined luminance thus may change depending on the content of the image data. The medium luminance level area 66 may have a lower luminance level than a defined luminance level of the same area. Similarly, the luminance level of the lower luminance level area 68 may be lower than the defined luminance level of the same area. Finally, the luminance level of the higher luminance level area 64 may be the same as, lower than, or even higher than the defined luminance level of the same area. In certain embodiments, the adjusted luminance level of an area further from the centerpoint 62 may be adjusted more (e.g., further reduced) than an adjusted luminance level of an area closer to the centerpoint 62. Additionally or alternatively, the adjusted luminance level of an area further from the centerpoint 62 may be adjusted less (e.g., reduced to a lesser extent) than an adjusted luminance level of an area closer to the centerpoint 62.
As one example, an adjusted luminance level of the lower luminance level area 68 may be between forty and sixty percent of a defined luminance level of an original image brightness associated with the area 68. That is, the adjusted luminance level may be between forty and sixty percent of the defined luminance level (e.g., sixty percent of the maximum luminance level) of the display, as described in the example above. An adjusted luminance level of the medium luminance level area 66 may be between sixty and eighty percent of a defined luminance level of an original image brightness associated with the area 66, and a luminance level of the higher luminance level area 64 may be between eighty and one hundred percent of a defined luminance level of an original image brightness associated with the area 64. As illustrated in
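The luminance bands in the example above can be sketched as a simple lookup. The choice of the midpoint of each range as the scale factor is an assumption made for illustration; the disclosure only specifies the ranges.

```python
def adjusted_luminance(defined_level, area):
    """Scale a defined luminance level for a foveated area.

    'high'   : 80-100% of the defined level (midpoint, 90%, used here)
    'medium' : 60-80%  of the defined level (midpoint, 70%, used here)
    'low'    : 40-60%  of the defined level (midpoint, 50%, used here)
    """
    scale = {"high": 0.9, "medium": 0.7, "low": 0.5}[area]
    return defined_level * scale
```

For a defined luminance level of 100 nits, the lower luminance level area would be driven at about 50 nits under this assumed midpoint choice.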
As described above, electronic displays such as the electronic display 18 may also use dynamic foveation. In dynamic foveation, the areas of the electronic display 18 at which the various luminance levels are used may change between two or more images based at least in part on the focal point of the eyes of the user. As an example, content that uses multiple images, such as videos and video games, may be presented to viewers by displaying the images in rapid succession. The portions of the electronic display 18 in which the content is displayed with a relatively high luminance level and a relatively low luminance level may change, for instance, based at least in part on data collected by the eye tracker 32 which indicates a focal point on the electronic display 18 of the eyes of the user.
The frames 74 and 86 are in different locations on the electronic display 18 based at least in part on a focal point of the eyes of the user. During a transition from the first frame 74 to the second frame 86 (or when the focal point of the eyes of the user moves from the first location 72 of the first frame 74 to the second location 84 of the second frame 86), the higher luminance level area 76 and medium luminance level area 78 are moved from near a bottom left corner of the electronic display 18 to a top right corner of the electronic display 18.
A foveation system may reduce power consumption by turning off circuit components of a display panel in one or more foveated areas. For example, a foveation system may receive an indication of a gaze from a gaze tracker and may determine corresponding portions of an electronic display which may be operated by a reduced number of circuit components.
When activated, luminance of a display pixel may be adjusted by amplified image data received via data lines 100. The source drivers may generate the amplified image data by receiving the image data and amplifying the voltage of the image data. The source drivers may then supply the amplified image data to the activated pixels. Based on the received amplified image data, the display pixels may adjust a corresponding luminance using electrical power supplied from the power source 29. In some embodiments, the electronic display 18 includes a first source driver amplifier 92, a second source driver amplifier 94, a third source driver amplifier 96, and any number of inactive source driver amplifiers 98. The first source driver amplifier 92 may be associated with any number of rows and/or columns of pixels in the low luminance level area 102. The foveation system may receive an indication of a gaze from a gaze tracker and determine that the area 102 corresponds to a low luminance level area. As a result, the foveation system may turn off any number of source driver amplifiers 98 associated with the low luminance level area 102 and may connect the first source driver amplifier 92 to data lines previously supplied with image data by the now inactive source driver amplifiers 98. For example, the first source driver amplifier 92 may be connected to four data lines 100 and may supply amplified image data to display pixels associated with the four data lines 100.
The second source driver amplifier 94 may be associated with any number of rows and/or columns of pixels in the medium luminance level area 104. The foveation system may receive an indication of a gaze from a gaze tracker and determine that the area 104 corresponds to a medium luminance level area. As a result, the foveation system may turn off any number of source driver amplifiers 98 associated with the medium luminance level area 104 and may connect the second source driver amplifier 94 to data lines previously supplied with image data by the now inactive source driver amplifiers 98. For example, the second source driver amplifier 94 may be connected to two data lines 100 and may supply amplified image data to display pixels associated with the two data lines 100. The third source driver amplifier 96 may be connected to a single row and/or column of pixels in the high luminance level area 106. The foveation system may receive an indication of a gaze from a gaze tracker and determine that the area 106 corresponds to a high luminance level area. As a result, the foveation system may leave on all source driver amplifiers in the high luminance level area 106. For example, the third source driver amplifier 96 may be connected to a single data line 100 and may supply amplified image data to display pixels associated with the single data line 100. While the source driver amplifiers in
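The amplifier-sharing scheme described above can be summarized numerically. This is a hypothetical sketch: it assumes one amplifier per data line before sharing is applied and uses the 4:1, 2:1, and 1:1 ratios from the example in the text.

```python
# Data lines driven by one source driver amplifier in each foveated area,
# following the example ratios in the text.
LINES_PER_AMP = {"low": 4, "medium": 2, "high": 1}

def active_amplifiers(num_data_lines, area):
    """Return (amplifiers left on, amplifiers turned off) for a foveated
    area, assuming one amplifier per data line before sharing."""
    per_amp = LINES_PER_AMP[area]
    on = -(-num_data_lines // per_amp)  # ceiling division
    return on, num_data_lines - on
```

For eight data lines in a low luminance level area, two amplifiers remain on and six can be powered off, which is the source of the power savings.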
A foveation system may receive an indication of movement associated with a gaze from a gaze tracker and may adjust one or more foveated areas, as described above. If the eye tracking system detects movement of the gaze of the user, the foveation system may cause display artifacts to become visible or perceptible to the user, which may negatively affect the experience of the user. The artifacts may include low luminance levels at the focal point of the eyes of the user, intermittent switching between high luminance levels and low luminance levels due to sudden movement of the foveated areas of the display, and flashing resulting from sudden luminance level changes at various areas of the display. To prevent artifacts from being visible and deteriorating an experience of the user, techniques described herein provide compensation to image data.
The control signal 114 may be a bit string for determining an operational mode for any number of source drivers. For example, when the control signal 114 carries a 01 bit string, the first switch 126 may close, which connects two rows of pixels of the display 18 to a single source driver, such as the source driver 124, as described above with respect to
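Decoding such a control signal can be sketched as a lookup from bit string to sharing mode. Only the 01 code is given in the text; the other codes below are assumptions made for illustration.

```python
def decode_control_signal(bits):
    """Map a control-signal bit string to the number of data lines (or rows
    of pixels) connected to a single source driver.

    Only the "01" mapping follows the example in the text; the remaining
    codes are hypothetical.
    """
    modes = {
        "00": 1,  # normal operation: one line per source driver
        "01": 2,  # two lines share one source driver
        "10": 4,  # four lines share one source driver
        "11": 8,  # eight lines share one source driver
    }
    return modes[bits]
```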
In certain embodiments, the decode block 142 may receive the control signal 114 and may control operation of the set of switches to determine which source driver(s) to connect to the display 18 based on the control signal 114. For example, the decode block 142 may determine at least one of the source drivers is defective (e.g., inoperative) and may bypass the defective source driver.
In certain embodiments, the decode block 152 may receive the control signal 114 and may control operation of the set of switches to determine which source driver(s) to connect to the display 18 based on the control signal 114. For example, the decode block 152 may determine at least one of the source drivers is defective and may bypass the defective source driver.
In certain embodiments, the feedback circuit 172 may include a filter component 170. The filter component 170 may prevent the compensated reference voltage from changing too quickly in response to the compensation voltage. For example, the filter component 170 may include a filter having a one kilohertz cutoff frequency and may filter out high frequency (e.g., above one kilohertz) spikes in the compensation voltage.
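The filtering behavior described above can be sketched with a discrete first-order low-pass filter. The sample rate and filter order are assumptions for illustration; the actual filter component 170 may be implemented differently (e.g., as an analog RC filter).

```python
import math

def low_pass(samples, cutoff_hz=1000.0, sample_rate_hz=100000.0):
    """Attenuate components of the compensation voltage above the cutoff.

    A discrete first-order low-pass filter: alpha is derived from the RC
    time constant of a one kilohertz cutoff and the assumed sample period.
    """
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate_hz
    alpha = dt / (rc + dt)
    out = []
    prev = samples[0]
    for s in samples:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out
```

A slowly varying compensation voltage passes through essentially unchanged, while a brief high-frequency spike is strongly attenuated, preventing the compensated reference voltage from changing too quickly.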
The graph 192 corresponds to a far end area of the display panel. The graph 192 illustrates a line 194 indicating an uncompensated reference voltage corresponding to a display panel without the sensing and compensation circuit 160. As illustrated, the line 194 drops from a first voltage value (e.g., between two to five volts) to a second voltage value (e.g., between two to five volts) in response to the changing emission current depicted in graph 180. As such, artifacts may be visible or may be perceived by the user, which may negatively affect the experience of the user. The line 196 indicates a compensated reference voltage corresponding to a display panel with a sensing and compensation circuit 160 having a sensing component 168 resistance equal to the routing resistance 166. As shown, the line 196 has a smaller voltage drop (e.g., within one percent, within one tenth of a percent, within one hundredth of a percent, and so forth) and deviates less than the uncompensated reference voltage in line 194 in response to the changing emission current depicted in graph 180. The line 198 indicates a compensated reference voltage corresponding to a display panel with a sensing and compensation circuit 160 having a sensing component 168 resistance equal to a sum of the routing resistance 166 and an associated resistance of the display panel. As shown, the line 198 stays relatively close to the initial value (e.g., within one percent, within one tenth of a percent, within one hundredth of a percent, and so forth) in response to the changing emission current depicted in graph 180. As such, the compensated reference voltage delivered to the display panel may reduce and/or may eliminate artifacts from being visible and deteriorating an experience of a user.
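The compensation principle behind these graphs can be sketched as a small model: if the sensing resistance matches the routing resistance, the reference is boosted by the same voltage that the routing later drops, so the voltage delivered at the far end of the panel stays near its target as emission current changes. All resistance, current, and voltage values below are made up for illustration.

```python
def delivered_voltage(target_v, emission_current_a, routing_ohms, sensing_ohms):
    """Reference voltage seen at the far end of the panel.

    The compensation circuit boosts the reference by I * R_sense; the
    routing then drops I * R_route on the way to the panel.
    """
    compensated_ref = target_v + emission_current_a * sensing_ohms
    return compensated_ref - emission_current_a * routing_ohms
```

With the sensing resistance equal to the routing resistance, the delivered voltage equals the target regardless of emission current, matching the behavior of line 196; with no sensing resistance, the delivered voltage sags with current, matching line 194.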
As may be appreciated, though the current embodiments refer to movement of the foveated areas toward the center of the display, movement of the foveated area toward other portions of the display could be performed in other embodiments. For example, based upon contextual (e.g., saliency) information of the images displayed on the display, it may be more likely that the focus of the eyes of the user will be at another part of the display (e.g., a more salient area of the display). A salient area of the display may be considered an area of interest based at least in part on the image content. The focal point of the eyes of the user may be drawn to the salient area of the display based at least in part on the content.
When a likely focus area is known, it may be prudent to default movement of the foveated areas toward that portion of the display rather than the center of the display. Thus, in an example where the images displayed have dynamic movement only in the upper right corner (i.e., other portions of the images in the display are still—this may be referred to as “saliency by the effect of movement”), the likely focal area may be the area where dynamic movement is being rendered. Accordingly, in this example the movement of the foveated areas may be toward the upper right corner (i.e., toward the dynamic movement being rendered).
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function]. . . ” or “step for [perform]ing [a function]. . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
It is well understood that the use of personally identifiable information should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy of users. In particular, personally identifiable information data should be managed and handled so as to minimize risks of unintentional or unauthorized access or use, and the nature of authorized use should be clearly indicated to users.
This application is a non-provisional application claiming priority to U.S. Provisional Application No. 63/083,704, entitled “FOVEATED DRIVING FOR POWER SAVING,” filed Sep. 25, 2020, which is hereby incorporated by reference in its entirety for all purposes.
Number | Name | Date | Kind
---|---|---|---
20080024483 | Ishii | Jan 2008 | A1
20120081347 | Kim | Apr 2012 | A1
20180075798 | Nho et al. | Mar 2018 | A1
20180075811 | Wacyk et al. | Mar 2018 | A1
20180182329 | Yu | Jun 2018 | A1
20190237021 | Peng | Aug 2019 | A1
20190287450 | Urabe | Sep 2019 | A1
20200111422 | Park et al. | Apr 2020 | A1
20200357875 | Wang | Nov 2020 | A1
Number | Date | Country
---|---|---
63083704 | Sep 2020 | US