This relates generally to displays, and, more particularly, to foveated displays.
Electronic devices often include displays. Particularly when high resolution images are being displayed for a viewer, it may be burdensome to display images at full resolution across an entire display. Foveation techniques involve displaying only critical portions of an image at full resolution and can help reduce the burdens on a display system. If care is not taken, however, display driver circuitry will be overly complex, bandwidth requirements will be excessive, and display quality will not be satisfactory.
An electronic device may have a display and a gaze tracking system. The electronic device may display images on the display that have a higher resolution in a portion of the display that overlaps a gaze location than other portions of the display. The gaze location may be updated in real time based on information from the gaze tracking system. As a user views different portions of the display, a graphics processing unit in the device may be used to dynamically produce high resolution image data in an area that overlaps the updated gaze location.
Timing controller circuitry and column driver circuitry may be used to display images on an array of pixels in the display. The timing controller circuitry may receive image data from the graphics processing unit and may provide image data to the column driver circuitry. The timing controller circuitry and column driver circuitry may include interpolation and filter circuitry. The interpolation and filter circuitry may be used to perform interpolation operations such as nearest neighbor interpolation and may be used to apply two-dimensional spatial filters to low resolution image data.
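As a point of reference, the following software sketch models the kind of processing that interpolation and filter circuitry of this type may perform on low resolution image data: nearest neighbor expansion followed by a small two-dimensional box filter. The array sizes, the 8× expansion factor, and the 3×3 filter are illustrative assumptions rather than values taken from the text.

```python
import numpy as np

def nearest_neighbor_upscale(low_res, scale):
    """Replicate each low resolution pixel into a scale x scale block."""
    return np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)

def box_filter_2d(image, size):
    """Apply a size x size box (averaging) filter with edge padding."""
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (size * size)

# Small illustrative frame: a 64 x 64 low resolution tile expanded 8x and smoothed.
low_res = np.random.rand(64, 64)
expanded = nearest_neighbor_upscale(low_res, 8)   # 512 x 512
smoothed = box_filter_2d(expanded, 3)
```

A box filter of this kind is separable into two one-dimensional passes, which is consistent with arrangements (discussed further below) in which one one-dimensional filtering operation is performed in the timing controller integrated circuit and the other in the column driver integrated circuit.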
Display driver circuitry may be configured to load high resolution data from the graphics processing unit into selected portions of a display. The display driver circuitry may include low and high resolution image data buffers, configurable column driver circuitry, and configurable row driver circuitry.
Display driver circuitry may enable and disable data loading to blocks of pixels in the pixel array. Block enable transistors may be included in the pixels. The display driver circuitry may control the block enable transistors to allow selected blocks of pixels to be loaded with high resolution image data.
An illustrative electronic device with a display is shown in
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators, cameras, sensors, light-emitting diodes and other status indicators, data ports, etc. A user can control the operation of device 10 by supplying commands through input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be mounted in a housing for a computer, cellular telephone, or other device, may be mounted within a head-mounted display chassis (e.g., device 10 may be configured to be worn on the head of a user), may be mounted on a wall or on a stand, may be a projector, or may be any other suitable type of display.
Control circuitry 16 may be used to run software on device 10 such as operating system code and applications. During operation of device 10, the software running on control circuitry 16 may display images on display 14. For example, data source 20 may supply graphics processing unit 22 with information on three-dimensional images to be displayed on display 14. Data source 20 may, for example, be part of a computer game or other application running on control circuitry 16 that supplies output to graphics processing unit 22 in the form of three-dimensional coordinates. Graphics processing unit 22 may perform rendering operations that map the three-dimensional coordinates from data source 20 onto a two-dimensional plane for presentation as two-dimensional images on display 14. Other types of image content may be displayed, if desired.
Display 14 may be an organic light-emitting diode display, a liquid crystal display, a liquid crystal-on-silicon display, a projector display such as a microelectromechanical systems (MEMS) display (sometimes referred to as a digital light processing display), a display formed from an array of micro-light-emitting diodes (e.g., light-emitting diodes formed from discrete crystalline semiconductor dies), or any other suitable type of display.
As shown in
Timing controller integrated circuit 28 and/or column driver integrated circuit 30 may include interpolation and filtering circuitry such as circuitry 42 in circuitry 28 and/or circuitry 44 in circuitry 30. This circuitry may ease the processing burden on graphics processing unit 22 and may thereby help to reduce the bandwidth requirements for the data links in device 10 such as links 34 and/or 36. In particular, the inclusion of interpolation and filtering circuitry in the display driver circuitry of display 14 may allow graphics processing unit 22 to only render portions of a displayed image at full resolution. Other portions of the image may be rendered at low and/or intermediate level(s) of resolution. Because graphics processing unit 22 need not render entire images at full resolution, the bandwidth involved in transmitting data between graphics processing unit 22 and circuit 28 (e.g., over a serial link) may be reduced.
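As a rough numerical sketch of these savings (the panel size, foveal region size, and peripheral sampling density below are assumptions for illustration, not values taken from this passage), suppose graphics processing unit 22 renders a 1k×1k foveal region at native resolution and samples the remainder of an 8k×8k frame once per 8×8 block of pixels:

```python
# Hypothetical panel and foveal-region sizes, for illustration only.
panel = 8192 * 8192                          # total pixels in the frame
foveal = 1024 * 1024                         # pixels rendered at native resolution
periphery_samples = (panel - foveal) // 64   # one sample per 8x8 block elsewhere

rendered = foveal + periphery_samples
print(rendered / panel)                      # ~0.03: roughly 3% of a full-resolution render
```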
Consider, as an example, the illustrative display of
As shown in
An illustrative filtering arrangement is shown in
As this example demonstrates, foveated rendering (e.g., limiting the native resolution rendering performed by graphics processing unit 22 to an area directly in a user's line of sight such as region 50 of
Display 14 may be used in an augmented reality or virtual reality environment. As a result, it may be desirable for display 14 to be able to cover a wide field of view (e.g., 145°) and exhibit low latency (e.g., less than 5 ms or other suitable amount). With one illustrative arrangement, display 14 of
During operation, the location of the user's gaze (location 74) may be tracked dynamically using eye tracking (e.g., gaze detection system 70). The highest acuity area of a human eye may span about 5 degrees, whereas the field of view encompassed by display 14 may be 145°. Based on gaze location information, device 10 can update the location of region 50 dynamically.
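A minimal sketch of how gaze location information might be mapped to an updated position for region 50 is shown below, assuming an 8k×8k panel spanning the 145° field of view, a 1k×1k high resolution region, and a simple linear angle-to-pixel mapping (a real system would account for the display optics):

```python
def foveal_window(gaze_x_deg, gaze_y_deg,
                  panel_px=8192, fov_deg=145.0, region_px=1024):
    """Map a gaze direction (degrees from display center) to the top-left
    corner of the high resolution region, clamped to the panel edges.

    The linear angle-to-pixel mapping and the sizes are illustrative
    assumptions."""
    px_per_deg = panel_px / fov_deg
    cx = panel_px / 2 + gaze_x_deg * px_per_deg
    cy = panel_px / 2 + gaze_y_deg * px_per_deg
    x0 = int(min(max(cx - region_px / 2, 0), panel_px - region_px))
    y0 = int(min(max(cy - region_px / 2, 0), panel_px - region_px))
    return x0, y0

# Example: gaze 10 degrees right of center and 5 degrees above center.
print(foveal_window(10.0, -5.0))
```

At roughly 8192 / 145 ≈ 56 pixels per degree under these assumptions, the approximately 5 degree high acuity span corresponds to only a few hundred pixels, so a high resolution window on the order of 1k×1k comfortably covers it.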
Consider, as an example, a scenario in which display 14 displays images in regions with two different resolutions (rather than the illustrative three different resolutions of
Display 14 may include frame buffer circuitry. With one illustrative configuration, the display driver circuitry for display 14 includes a single 8k×8k frame buffer and only a subset of the frame buffer (corresponding to high resolution area 50) is updated with high resolution data from graphics processing unit 22. The entire frame buffer can be read into the display at 720 Hz (e.g., for a color sequential display). This would reduce data bandwidth from graphics processing unit 22 by a factor of 64.
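The factor-of-64 figure follows from the relative areas involved, assuming the high resolution update region is 1k×1k (an assumption consistent with the buffer sizes discussed below):

```python
frame_buffer = 8192 * 8192          # single full-resolution frame buffer (pixels)
hr_update = 1024 * 1024             # subset updated with high resolution data
print(frame_buffer // hr_update)    # 64: reduction in GPU-to-display update data
```

The 720 Hz readout of the buffer is unaffected by this reduction, since the entire frame buffer is scanned out into the display regardless of how much of it was freshly updated.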
With another illustrative configuration, display 14 has multiple frame buffers. This may reduce the amount of circuit resources needed for buffering. The multiple frame buffers (or frame buffer regions) of display 14 may each be associated with a different resolution. For example, the display driver circuitry may include two 1k×1k frame buffers. A low resolution frame buffer (LRFB) may be used to buffer data for low resolution area 54 of display 14 and a high resolution frame buffer (HRFB) may be used to buffer data for high resolution area 50 of display 14. This approach may be used to reduce both data bandwidth from graphics processing unit 22 and frame buffer area.
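Using the buffer sizes given in this example, the storage savings of the dual-buffer approach relative to a single 8k×8k buffer can be estimated (counting pixel locations only and ignoring per-pixel bit depth):

```python
single_buffer = 8192 * 8192          # one full-resolution frame buffer
dual_buffers = 2 * (1024 * 1024)     # one 1k x 1k LRFB plus one 1k x 1k HRFB
print(single_buffer / dual_buffers)  # 32: approximate reduction in buffer storage
```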
A diagram of an illustrative display (e.g., a liquid crystal on silicon display or other display) that includes multiple frame buffers is shown in
With a configuration of the type shown in
Display 14 may be driven using a dual frame architecture or an interleaved architecture. System latency is affected by the time consumed by eye tracking and by graphics processing unit operations. System latency is also affected by the time consumed with loading image data (e.g., data for the current and next frames). A timing diagram showing how display 14 may be operated using an illustrative dual frame architecture is shown in
Illustrative gate driver circuitry 32 that can be selectively configured to load data with different resolutions is shown in
As shown in
A bitwise AND operation may be performed between the mask data latch and the expanded low resolution data latch. A bitwise OR operation may then be performed on the output of the AND gates and the high resolution data latch. The output of the OR gates in the display driver circuitry may be loaded into column data latch CDL. This loaded digital image data may then be converted to analog data (analog data signals D) and loaded into the current row of pixels 26 of pixel array 24 by digital-to-analog converter circuitry.
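A minimal software model of this merge is sketched below; the 16-column row, the 8-bit values, and the particular high resolution column range are illustrative assumptions, and in the actual display driver circuitry the operations are performed by parallel AND and OR gates rather than array operations.

```python
import numpy as np

def merge_row(expanded_low_res, high_res, mask):
    """Model of forming the column data latch (CDL) contents for one row:
    bitwise AND of the mask with the expanded low resolution data, then
    bitwise OR with the high resolution data."""
    return (expanded_low_res & mask) | high_res

# Illustrative 16-column row in which columns 4..7 receive high resolution data.
expanded_lr = np.full(16, 0xAA, dtype=np.uint8)   # expanded low resolution data latch
high_res = np.zeros(16, dtype=np.uint8)           # high resolution data latch (zero elsewhere)
high_res[4:8] = [0x11, 0x22, 0x33, 0x44]
mask = np.full(16, 0xFF, dtype=np.uint8)          # mask data latch: pass low resolution data
mask[4:8] = 0x00                                  # block low res data under the high res region
cdl = merge_row(expanded_lr, high_res, mask)      # values loaded into the column data latch
```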
The functions of
Display 14 may allow data to be updated in blocks. Display 14 may, for example, be an organic light-emitting diode display, a display having an array of light-emitting diodes formed from crystalline semiconductor die, or other display that has pixels 26 with block enable circuitry to enable the pixels in a block to be loaded together.
Consider, as an example, display 14 of
Pixels 26 may be grouped in blocks 96 of adjacent pixels 26 (e.g., blocks of n×n pixels, where n is 2-500, 200-400, 100-500, more than 200, less than 600, or other suitable number). Sets of blocks (e.g., sets of 2-25 blocks, sets that each contain 4-9 blocks, more than 2 blocks, or fewer than 50 blocks) or individual blocks may be supplied with high resolution data while remaining blocks 96 are supplied with low resolution data. In the example of
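A sketch of how the display driver circuitry might choose the set of blocks 96 to load with high resolution data is shown below; the block-grid dimensions and the 3×3 set size (nine blocks, within the 4-9 block range mentioned above) are illustrative assumptions.

```python
def select_high_res_blocks(gaze_block_x, gaze_block_y,
                           blocks_x, blocks_y, set_size=3):
    """Return (column, row) indices of the square set of blocks centered on
    the block containing the gaze location, clamped to the block grid."""
    half = set_size // 2
    x0 = min(max(gaze_block_x - half, 0), blocks_x - set_size)
    y0 = min(max(gaze_block_y - half, 0), blocks_y - set_size)
    return [(x, y) for y in range(y0, y0 + set_size)
                   for x in range(x0, x0 + set_size)]

# Example: a 32 x 32 grid of blocks with the gaze falling in block (10, 7).
high_res_blocks = select_high_res_blocks(10, 7, 32, 32)
```

Blocks in the returned set would have their block enable transistors turned on for high resolution loading, while the remaining blocks receive low resolution data.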
Blocks of pixels 26 can be updated relatively quickly and can support fast frame rates. Undesirable visible artifacts such as motion blur effects can be minimized by driving pixels 26 with a low duty cycle (e.g., 2 ms) and high frame rate (e.g., 90 Hz). Block-wise refreshing schemes may support this type of operation. The inclusion of block enable transistors into pixels 26 may also allow for selective high frame rate updating. For example, the entire video bandwidth of display 14 may be temporarily dedicated to refreshing pixel array 24 at low resolution whenever gaze detection system 70 detects that a user's gaze is rapidly changing (e.g., by disabling high resolution loading). As another example, display 14 may be configured to produce multiple light fields each associated with a different respective focal plane. This may be accomplished using multiple stacked transparent displays at different distances from a user's eyes, using tunable lenses that tune to different focal lengths at different times (when different image data is being displayed), using electrically adjustable beam steering equipment in combination with diffractive optics, etc. In a depth-fused multi-focal-plane display, peripheral blocks 96 may be refreshed with a relatively low rate when a user's gaze is steady while foveal blocks (blocks in the user's direct line of sight) can be refreshed at higher frequencies (e.g., in synchronization with lens tuning changes in a tunable lens system).
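The refresh policy described in this paragraph can be summarized as a simple mode selection driven by gaze behavior. The sketch below is a hypothetical model of such a policy; the gaze-velocity threshold and the specific refresh rates are invented placeholder values (only the 90 Hz frame rate appears in the text above).

```python
def choose_refresh_mode(gaze_speed_deg_per_s, saccade_threshold=100.0):
    """Pick a block refresh policy from gaze behavior (illustrative values only).

    During rapid gaze changes, high resolution loading is disabled so the full
    video bandwidth can refresh the whole array at low resolution; when the
    gaze is steady, foveal blocks are refreshed faster than peripheral blocks
    (e.g., in step with focal plane changes in a tunable lens system)."""
    if gaze_speed_deg_per_s > saccade_threshold:
        return {"high_res_enabled": False,
                "foveal_rate_hz": 90, "peripheral_rate_hz": 90}
    return {"high_res_enabled": True,
            "foveal_rate_hz": 360, "peripheral_rate_hz": 90}

print(choose_refresh_mode(20.0))
```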
In
In
In accordance with an embodiment, an electronic device is provided that includes a graphics processing unit that supplies image data with a first resolution and image data with a second resolution that is higher than the first resolution, and a display that includes a pixel array having rows and columns of pixels, data lines associated with the columns of pixels, gate lines associated with the rows of pixels, gate line driver circuitry coupled to the gate lines, a timing controller integrated circuit that receives the image data from the graphics processing unit, and a column driver integrated circuit that receives the image data from the timing controller integrated circuit and that loads the image data into the pixel array, where at least one of the timing controller integrated circuit and the column driver integrated circuit includes interpolation and filter circuitry that performs interpolation and filtering on the image data with the first resolution.
In accordance with another embodiment, the interpolation and filter circuitry forms part of the timing controller integrated circuit and is configured to perform a nearest neighbor interpolation on the image data of the first resolution.
In accordance with another embodiment, the interpolation and filter circuitry forms part of the column driver integrated circuit and is configured to perform a nearest neighbor interpolation on the image data of the first resolution.
In accordance with another embodiment, the interpolation and filter circuitry forms part of the timing controller integrated circuit and is configured to perform box filtering on the image data of the first resolution.
In accordance with another embodiment, the interpolation and filter circuitry forms part of the column driver integrated circuit and is configured to perform box filtering on the image data of the first resolution.
In accordance with another embodiment, the interpolation and filtering circuitry includes a first interpolation and filtering circuit in the timing controller integrated circuit, and a second interpolation and filtering circuit in the column driver integrated circuit.
In accordance with another embodiment, the first interpolation and filtering circuit is configured to perform nearest neighbor interpolation on the image data of the first resolution for a first dimension of the pixel array and the second interpolation and filtering circuit is configured to perform nearest neighbor interpolation on the image data of the first resolution for a second dimension of the pixel array that is orthogonal to the first dimension.
In accordance with another embodiment, the first interpolation and filtering circuit is configured to perform box filtering on the image data of the first resolution and the second interpolation and filtering circuit is configured to perform box filtering on the image data of the first resolution.
In accordance with another embodiment, the first interpolation and filtering circuit is configured to perform a first one-dimensional spatial filtering operation of a two-dimensional spatial filter on the image data of the first resolution and the second interpolation and filtering circuit is configured to perform a second one-dimensional spatial filtering operation of the two-dimensional spatial filter on the image data of the first resolution.
In accordance with another embodiment, the first and second interpolation and filtering circuits are further configured to perform nearest neighbor interpolation operations on the image data of the first resolution.
In accordance with another embodiment, the electronic device includes a gaze tracking system that supplies information on a gaze location and the graphics processing unit is configured to produce the image data with the second resolution for a portion of the pixel array that overlaps the gaze location.
In accordance with another embodiment, the first and second interpolation and filtering circuits are configured to perform filtering on the image data with the first resolution without performing filtering on the image data with the second resolution.
In accordance with an embodiment, an electronic device is provided that includes an array of pixels, a gaze detection system that is configured to supply information on a gaze location, a graphics processing unit that is configured to provide image data for the array of pixels at a first resolution and that is configured to provide image data for a portion of the array of pixels that overlaps the gaze location at a second resolution that is higher than the first resolution, at least first and second frame buffers, where the first frame buffer is configured to receive the image data from the graphics processing unit at the first resolution and the second frame buffer is configured to receive the image data from the graphics processing unit at the second resolution, and circuitry configured to load the image data with the first resolution into the array of pixels from the first frame buffer and to load the image data with the second resolution into the portion of the array of pixels that overlaps the gaze location from the second frame buffer.
In accordance with another embodiment, the array of pixels and the first and second frame buffers are formed on a liquid-crystal-on-silicon display.
In accordance with another embodiment, the circuitry that is configured to load the image data includes row driver circuitry that is configured to assert signals on gate lines individually for portions of the pixel array that include the portion of the array of pixels that overlaps the gaze location, and to assert a common gate line signal on a set of multiple adjacent gate lines in rows of the pixel array that do not include the portion of the array of pixels that overlaps the gaze location.
In accordance with another embodiment, the circuitry that is configured to load the image data includes column driver circuitry that includes a first latch configured to receive the image data with the first resolution and includes a second latch configured to receive the image data with the second resolution.
In accordance with an embodiment, an electronic device is provided that includes a pixel array having rows and columns of pixels, data lines associated with the columns of pixels, gate lines associated with the rows of pixels, display driver circuitry coupled to the data lines and gate lines, and a gaze detection system that is configured to supply information on a gaze location, where each pixel in the pixel array has a pixel circuit with a switching transistor and has a block enable transistor coupled to the switching transistor, and where the display driver circuitry is configured to turn on the block enable transistors in at least one block of the pixels based on the gaze location.
In accordance with another embodiment, each block enable transistor has a source-drain terminal coupled to a respective one of the data lines and has a gate controlled by a block enable line.
In accordance with another embodiment, the display driver circuitry is configured to turn on the block enable transistors in a set of the blocks based on the gaze location.
In accordance with another embodiment, the display driver circuitry is configured to receive image data with a first resolution, receive image data with a second resolution that is higher than the first resolution, load the image data with the second resolution into the set of blocks, and load the image data with the first resolution into blocks of the pixel array other than the set of blocks.
The foregoing is merely illustrative and various modifications can be made by those skilled in the art without departing from the scope and spirit of the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims priority to provisional patent application No. 62/375,633, filed on Aug. 16, 2016, which is hereby incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US17/47023 | 8/15/2017 | WO | 00

Number | Date | Country
---|---|---
62375633 | Aug 2016 | US