This relates generally to electronic devices, and, more particularly, to electronic devices with displays.
Electronic devices often include displays. In some cases, displays may include lenticular lenses that enable the display to provide three-dimensional content to the viewer. The lenticular lenses may be formed over an array of pixels such as organic light-emitting diode pixels or liquid crystal display pixels.
An electronic device may include a lenticular display. The lenticular display may have a lenticular lens film formed over an array of pixels. A plurality of lenticular lenses may extend across the length of the display. The lenticular lenses may be configured to enable stereoscopic viewing of the display such that a viewer perceives three-dimensional images.
The display may have a number of independently controllable viewing zones. Each viewing zone displays a respective two-dimensional image. Each eye of the viewer may receive a different one of the two-dimensional images, resulting in a perceived three-dimensional image.
Crosstalk between viewing zones, together with disparity between the images received from different viewing zones, may result in disparity-caused shifts in the images perceived by a viewer of the lenticular display. To mitigate these disparity-caused shifts, compensation circuitry may be included in the display pipeline circuitry.
The display pipeline circuitry may provide brightness values to display driver circuitry that then controls the lenticular display to display images. The compensation circuitry may compensate the brightness values provided to the display driver circuitry for disparity-caused shifts. The compensation circuitry may include stored disparity-caused shift calibration information that is used for the compensation. The stored disparity-caused shift calibration information may be a polynomial function that outputs a magnitude of disparity-caused shift for a given pixel location. The compensation circuitry may be incorporated into UV mapping circuitry, view mapping circuitry, or content rendering circuitry in the display pipeline.
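As a minimal sketch of how such stored calibration information might be applied, the following code evaluates a stored third-order polynomial to obtain the disparity-caused shift at each pixel location and resamples a row of brightness values accordingly. The coefficient values, function names, and resampling approach are illustrative assumptions rather than a description of any particular implementation.

```python
import numpy as np

# Hypothetical stored calibration: disparity-caused shift (in pixels) as a
# third-order polynomial of horizontal pixel position x. Coefficients are
# ordered c3, c2, c1, c0 and are example values only.
CALIBRATION_COEFFS = (1.2e-9, -4.0e-6, 3.0e-3, 0.0)

def disparity_shift(x):
    """Evaluate the stored polynomial to get the predicted shift at position x."""
    c3, c2, c1, c0 = CALIBRATION_COEFFS
    return c3 * x**3 + c2 * x**2 + c1 * x + c0

def compensate_row(row):
    """Resample one row of brightness values against the predicted shift so
    that, after the display introduces the disparity-caused shift, content
    appears where it was intended."""
    x = np.arange(row.size, dtype=float)
    return np.interp(x + disparity_shift(x), x, row)

row = np.linspace(0.0, 1.0, 1000)     # one row of brightness values
compensated = compensate_row(row)     # values handed on toward the display driver
```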
An illustrative electronic device of the type that may be provided with a display is shown in
As shown in
To support communications between device 10 and external equipment, control circuitry 16 may communicate using communications circuitry 21. Circuitry 21 may include antennas, radio-frequency transceiver circuitry, and other wireless communications circuitry and/or wired communications circuitry. Circuitry 21, which may sometimes be referred to as control circuitry and/or control and communications circuitry, may support bidirectional wireless communications between device 10 and external equipment over a wireless link (e.g., circuitry 21 may include radio-frequency transceiver circuitry such as wireless local area network transceiver circuitry configured to support communications over a wireless local area network link, near-field communications transceiver circuitry configured to support communications over a near-field communications link, cellular telephone transceiver circuitry configured to support communications over a cellular telephone link, or transceiver circuitry configured to support communications over any other suitable wired or wireless communications link). Wireless communications may, for example, be supported over a Bluetooth® link, a WiFi® link, a 60 GHz link or other millimeter wave link, a cellular telephone link, or other wireless communications link. Device 10 may, if desired, include power circuits for transmitting and/or receiving wired and/or wireless power and may include batteries or other energy storage devices. For example, device 10 may include a coil and rectifier to receive wireless power that is provided to circuitry in device 10.
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators, cameras, sensors, light-emitting diodes and other status indicators, data ports, and other electrical components. A user can control the operation of device 10 by supplying commands through input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be a touch screen display that includes a touch sensor for gathering touch input from a user or display 14 may be insensitive to touch. A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements.
Some electronic devices may include two displays. In one possible arrangement, a first display may be positioned on one side of the device and a second display may be positioned on a second, opposing side of the device. The first and second displays therefore may have a back-to-back arrangement. One or both of the displays may be curved.
Sensors in input-output devices 12 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors (e.g., a two-dimensional capacitive touch sensor integrated into display 14, a two-dimensional capacitive touch sensor overlapping display 14, and/or a touch sensor that forms a button, trackpad, or other input device not associated with a display), and other sensors. If desired, sensors in input-output devices 12 may include optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, optical touch sensors, optical proximity sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, temperature sensors, sensors for measuring three-dimensional non-contact gestures (“air gestures”), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors, radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, and/or other sensors.
Control circuitry 16 may be used to run software on device 10 such as operating system code and applications. During operation of device 10, the software running on control circuitry 16 may display images on display 14 using an array of pixels in display 14.
Display 14 may be an organic light-emitting diode display, a liquid crystal display, an electrophoretic display, an electrowetting display, a plasma display, a microelectromechanical systems display, a display having a pixel array formed from crystalline semiconductor light-emitting diode dies (sometimes referred to as microLEDs), and/or other display. Configurations in which display 14 is an organic light-emitting diode display are sometimes described herein as an example.
Display 14 may have a rectangular shape (i.e., display 14 may have a rectangular footprint and a rectangular peripheral edge that runs around the rectangular footprint) or may have other suitable shapes. Display 14 may be planar or may have a curved profile.
Device 10 may include cameras and other components that form part of eye and/or head tracking system 18. The camera(s) or other components of system 18 may face an expected location for a viewer and may track the viewer's eyes and/or head (e.g., images and other information captured by system 18 may be analyzed by control circuitry 16 to determine the location of the viewer's eyes and/or head). This head-location information obtained by system 18 may be used to determine the appropriate direction with which display content from display 14 should be directed. Eye and/or head tracking system 18 may include any desired number/combination of infrared and/or visible light detectors. Eye and/or head tracking system 18 may optionally include light emitters to illuminate the scene. Eye and/or head tracking system 18 may include a light detection and ranging (lidar) sensor, a time-of-flight (ToF) sensor, an accelerometer (e.g., to detect the orientation of electronic device 10), a camera, or a combination of two or more of these components. Including sensors such as a light detection and ranging (lidar) sensor, a time-of-flight (ToF) sensor, or an accelerometer may improve acquisition speeds when tracking eye/head position of the viewer.
A top view of a portion of display 14 is shown in
Display driver circuitry may be used to control the operation of pixels 22. The display driver circuitry may be formed from integrated circuits, thin-film transistor circuits, or other suitable circuitry. Display driver circuitry 30 of
To display the images on display pixels 22, display driver circuitry 30 may supply image data to data lines D while issuing clock signals and other control signals to supporting display driver circuitry such as gate driver circuitry 34 over path 38. If desired, circuitry 30 may also supply clock signals and other control signals to gate driver circuitry on an opposing edge of display 14.
Gate driver circuitry 34 (sometimes referred to as horizontal control line control circuitry) may be implemented as part of an integrated circuit and/or may be implemented using thin-film transistor circuitry. Horizontal control lines G in display 14 may carry gate line signals (scan line signals), emission enable control signals, and other horizontal control signals for controlling the pixels of each row. There may be any suitable number of horizontal control signals per row of pixels 22 (e.g., one or more, two or more, three or more, four or more, etc.).
Display 14 may sometimes be a stereoscopic display that is configured to display three-dimensional content for a viewer. Stereoscopic displays are capable of displaying multiple two-dimensional images that are viewed from slightly different angles. When viewed together, the combination of the two-dimensional images creates the illusion of a three-dimensional image for the viewer. For example, a viewer's left eye may receive a first two-dimensional image and a viewer's right eye may receive a second, different two-dimensional image. The viewer perceives these two different two-dimensional images as a single three-dimensional image.
There are numerous ways to implement a stereoscopic display. Display 14 (sometimes referred to as stereoscopic display 14, lenticular display 14, three-dimensional display 14, etc.) may be a lenticular display that uses lenticular lenses (e.g., elongated lenses that extend along parallel axes), may be a parallax barrier display that uses parallax barriers (e.g., an opaque layer with precisely spaced slits to create a sense of depth through parallax), may be a volumetric display, or may be any other desired type of stereoscopic display. Configurations in which display 14 is a lenticular display are sometimes described herein as an example.
As shown in
The lenses 46 of the lenticular lens film cover the pixels of display 14. An example is shown in
Consider the example of display 14 being viewed by a viewer with a first eye (e.g., a right eye) 48-1 and a second eye (e.g., a left eye) 48-2. Light from pixel 22-1 is directed by the lenticular lens film in direction 40-1 towards left eye 48-2, light from pixel 22-2 is directed by the lenticular lens film in direction 40-2 towards right eye 48-1, light from pixel 22-3 is directed by the lenticular lens film in direction 40-3 towards left eye 48-2, light from pixel 22-4 is directed by the lenticular lens film in direction 40-4 towards right eye 48-1, light from pixel 22-5 is directed by the lenticular lens film in direction 40-5 towards left eye 48-2, and light from pixel 22-6 is directed by the lenticular lens film in direction 40-6 towards right eye 48-1. In this way, the viewer's right eye 48-1 receives images from pixels 22-2, 22-4, and 22-6, whereas left eye 48-2 receives images from pixels 22-1, 22-3, and 22-5. Pixels 22-2, 22-4, and 22-6 may be used to display a slightly different image than pixels 22-1, 22-3, and 22-5. Consequently, the viewer may perceive the received images as a single three-dimensional image.
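A simplified sketch of this interleaving is shown below, assuming only the two views and the alternating pixel-to-eye assignment described above; an actual panel would use many more views and a measured mapping from sub-pixels to viewing directions.

```python
import numpy as np

def interleave_two_views(left_img, right_img):
    """Build a panel frame in which pixel columns 22-1, 22-3, 22-5, ... carry
    the left-eye image and columns 22-2, 22-4, 22-6, ... carry the right-eye
    image (0-indexed even/odd columns in the arrays below)."""
    assert left_img.shape == right_img.shape
    panel = np.empty_like(left_img)
    panel[:, 0::2] = left_img[:, 0::2]    # directed toward left eye 48-2
    panel[:, 1::2] = right_img[:, 1::2]   # directed toward right eye 48-1
    return panel

left = np.zeros((4, 8))    # left-eye image (placeholder values)
right = np.ones((4, 8))    # slightly different right-eye image
frame = interleave_two_views(left, right)
```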
Pixels of the same color may be covered by a respective lenticular lens 46. In one example, pixels 22-1 and 22-2 may be red pixels that emit red light, pixels 22-3 and 22-4 may be green pixels that emit green light, and pixels 22-5 and 22-6 may be blue pixels that emit blue light. This example is merely illustrative. In general, each lenticular lens may cover any desired number of pixels each having any desired color. The lenticular lens may cover a plurality of pixels having the same color, may cover a plurality of pixels each having different colors, may cover a plurality of pixels with some pixels being the same color and some pixels being different colors, etc.
Display 14 may be viewed by both a first viewer with a right eye 48-1 and a left eye 48-2 and a second viewer with a right eye 48-3 and a left eye 48-4. Light from pixel 22-1 is directed by the lenticular lens film in direction 40-1 towards left eye 48-4, light from pixel 22-2 is directed by the lenticular lens film in direction 40-2 towards right eye 48-3, light from pixel 22-3 is directed by the lenticular lens film in direction 40-3 towards left eye 48-2, light from pixel 22-4 is directed by the lenticular lens film in direction 40-4 towards right eye 48-1, light from pixel 22-5 is directed by the lenticular lens film in direction 40-5 towards left eye 48-4, light from pixel 22-6 is directed by the lenticular lens film in direction 40-6 towards right eye 48-3, light from pixel 22-7 is directed by the lenticular lens film in direction 40-7 towards left eye 48-2, light from pixel 22-8 is directed by the lenticular lens film in direction 40-8 towards right eye 48-1, light from pixel 22-9 is directed by the lenticular lens film in direction 40-9 towards left eye 48-4, light from pixel 22-10 is directed by the lenticular lens film in direction 40-10 towards right eye 48-3, light from pixel 22-11 is directed by the lenticular lens film in direction 40-11 towards left eye 48-2, and light from pixel 22-12 is directed by the lenticular lens film in direction 40-12 towards right eye 48-1. In this way, the first viewer's right eye 48-1 receives images from pixels 22-4, 22-8, and 22-12, whereas left eye 48-2 receives images from pixels 22-3, 22-7, and 22-11. Pixels 22-4, 22-8, and 22-12 may be used to display a slightly different image than pixels 22-3, 22-7, and 22-11. Consequently, the first viewer may perceive the received images as a single three-dimensional image. Similarly, the second viewer's right eye 48-3 receives images from pixels 22-2, 22-6, and 22-10, whereas left eye 48-4 receives images from pixels 22-1, 22-5, and 22-9. Pixels 22-2, 22-6, and 22-10 may be used to display a slightly different image than pixels 22-1, 22-5, and 22-9. Consequently, the second viewer may perceive the received images as a single three-dimensional image.
Pixels of the same color may be covered by a respective lenticular lens 46. In one example, pixels 22-1, 22-2, 22-3, and 22-4 may be red pixels that emit red light, pixels 22-5, 22-6, 22-7, and 22-8 may be green pixels that emit green light, and pixels 22-9, 22-10, 22-11, and 22-12 may be blue pixels that emit blue light. This example is merely illustrative. The display may be used to present the same three-dimensional image to both viewers or may present different three-dimensional images to different viewers. In some cases, control circuitry in the electronic device 10 may use eye and/or head tracking system 18 to track the position of one or more viewers and display images on the display based on the detected position of the one or more viewers.
It should be understood that the lenticular lens shapes and directional arrows of
The X-axis may be considered the horizontal axis for the display whereas the Y-axis may be considered the vertical axis for the display. As shown in
The example herein of the display having 14 independently controllable zones is merely illustrative. In general, the display may have any desired number of independently controllable zones (e.g., more than 2, more than 6, more than 10, more than 12, more than 16, more than 20, more than 30, more than 40, less than 40, between 10 and 30, between 12 and 25, etc.).
Each zone is capable of displaying a unique image to the viewer. The sub-pixels on display 14 may be divided into groups, with each group of sub-pixels capable of displaying an image for a particular zone. For example, a first subset of sub-pixels in display 14 is used to display an image (e.g., a two-dimensional image) for zone 1, a second subset of sub-pixels in display 14 is used to display an image for zone 2, a third subset of sub-pixels in display 14 is used to display an image for zone 3, etc. In other words, the sub-pixels in display 14 may be divided into 14 groups, with each group associated with a corresponding zone (sometimes referred to as viewing zone) and capable of displaying a unique image for that zone. The sub-pixel groups may also themselves be referred to as zones.
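The following sketch illustrates one way such sub-pixel groups might be represented, assuming 14 zones and a simple cyclic assignment of pixel columns to zones; a real display would instead rely on a calibrated view map.

```python
NUM_ZONES = 14

def group_columns_by_zone(num_columns, num_zones=NUM_ZONES):
    """Divide pixel columns into zone groups. The cyclic assignment here is
    purely illustrative; an actual display would use a calibrated view map
    relating physical sub-pixels to viewing zones."""
    groups = {zone: [] for zone in range(1, num_zones + 1)}
    for col in range(num_columns):
        groups[(col % num_zones) + 1].append(col)
    return groups

zone_groups = group_columns_by_zone(num_columns=2520)
# zone_groups[1] lists the columns whose sub-pixels display the image for zone 1.
```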
Control circuitry 16 may control display 14 to display desired images in each viewing zone. There is much flexibility in how the display provides images to the different viewing zones. Display 14 may display entirely different content in different zones of the display. For example, an image of a first object (e.g., a cube) is displayed for zone 1, an image of a second, different object (e.g., a pyramid) is displayed for zone 2, an image of a third, different object (e.g., a cylinder) is displayed for zone 3, etc. This type of scheme may be used to allow different viewers to view entirely different scenes from the same display. However, in practice there may be crosstalk between the viewing zones. As an example, content intended for zone 3 may not be contained entirely within viewing zone 3 and may leak into viewing zones 2 and 4.
Therefore, in another possible use-case, display 14 may display a similar image for each viewing zone, with slight adjustments for perspective between each zone. This may be referred to as displaying the same content at different perspectives, with each image corresponding to a unique perspective of the same content. Consider an example in which the display is used to display a three-dimensional cube. The same content (e.g., the cube) may be displayed in all of the different zones of the display. However, the image of the cube provided to each viewing zone may account for the viewing angle associated with that particular zone. In zone 1, for example, the viewing cone may be at a −10° angle relative to the surface normal of the display (along the horizontal direction). Therefore, the image of the cube displayed for zone 1 may be from the perspective of a −10° angle relative to the surface normal of the cube (as in
There are many possible variations for how display 14 displays content for the viewing zones. In general, each viewing zone may be provided with any desired image based on the application of the electronic device. Different zones may provide different images of the same content at different perspectives, different zones may provide different images of different content, etc.
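As a rough illustration of the same-content-different-perspectives approach, the sketch below assigns each viewing zone an assumed viewing angle. The −10° endpoint for zone 1 comes from the example above; the +10° endpoint and the even angular spacing are assumptions made only for illustration.

```python
NUM_ZONES = 14

def zone_view_angles(num_zones=NUM_ZONES, min_angle=-10.0, max_angle=10.0):
    """Return an assumed viewing angle (in degrees from the display surface
    normal) for each zone, spacing the zones evenly across the range."""
    step = (max_angle - min_angle) / (num_zones - 1)
    return {zone: min_angle + (zone - 1) * step for zone in range(1, num_zones + 1)}

angles = zone_view_angles()
# angles[1] == -10.0; the image rendered for each zone (e.g., of the cube) would
# use that zone's angle so the perspective changes smoothly from zone to zone.
```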
In one possible scenario, display 14 may display images for all of the viewing zones at the same time. However, this requires emitting light with all of the sub-pixels in the display in order to generate images for each viewing zone. To reduce power consumption in the display, one or more of the zones may be disabled based on information from the eye and/or head tracking system 18.
Eye and/or head tracking system 18 (sometimes referred to as viewer tracking system 18, head tracking system 18, or tracking system 18) may use one or more cameras such as camera 54 to capture images of the area in front of display 14 where a viewer is expected to be present. The example of eye and/or head tracking system 18 including a camera 54 is merely illustrative. Eye and/or head tracking system 18 may include a light detection and ranging (lidar) sensor, a time-of-flight (ToF) sensor, an accelerometer (e.g., to detect the orientation of electronic device 10), a camera, or a combination of two or more of these components. Including sensors such as a light detection and ranging (lidar) sensor, a time-of-flight (ToF) sensor, or an accelerometer may improve acquisition speeds when tracking eye/head position of the viewer. The tracking system may use information gathered by the sensors (e.g., sensor data) to identify a position of the viewer relative to the viewing zones. In other words, the tracking system may be used to determine which viewing zone(s) the viewer is occupying. Each eye of the viewer may be associated with a different viewing zone (in order to allow three-dimensional content to be perceived by the viewer from the display). Based on the captured images, tracking system 18 may identify a first viewing zone associated with a left eye of the viewer and a second viewing zone associated with a right eye of the viewer. Tracking system 18 may use one camera, two cameras, three cameras, more than three cameras, etc. to obtain information on the position of the viewer(s). The cameras in the tracking system may capture visible light and/or infrared light images.
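One hedged sketch of how a tracked eye position might be converted into a viewing-zone index is shown below. The assumed 20° total angular span, the equal-slice quantization, and the millimeter coordinates are illustrative assumptions and are not taken from the description above.

```python
import math

NUM_ZONES = 14
ZONE_SPAN_DEG = 20.0   # assumed total horizontal angular span covered by all zones

def eye_position_to_zone(eye_x_mm, eye_z_mm):
    """Map a tracked eye position (horizontal offset from the display center
    and distance from the display, in millimeters) to a zone index 1..NUM_ZONES."""
    angle = math.degrees(math.atan2(eye_x_mm, eye_z_mm))
    # Clamp to the covered angular range, then quantize into equal slices.
    angle = max(-ZONE_SPAN_DEG / 2, min(ZONE_SPAN_DEG / 2, angle))
    fraction = (angle + ZONE_SPAN_DEG / 2) / ZONE_SPAN_DEG
    return min(NUM_ZONES, int(fraction * NUM_ZONES) + 1)

left_eye_zone = eye_position_to_zone(-30.0, 500.0)
right_eye_zone = eye_position_to_zone(33.0, 500.0)   # the two eyes typically occupy different zones
```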
Control circuitry 16 may use information from tracking system 18 to selectively disable unoccupied viewing zones. Disabling unoccupied viewing zones conserves power within the electronic device. Control circuitry 16 may receive various types of information from tracking system 18 regarding the position of the viewer. Control circuitry 16 may receive raw data from head tracking system 18 and process the data to determine the position of a viewer, may receive position coordinates from head tracking system 18, may receive an identification of one or more occupied viewing zones from head tracking system 18, etc. If head tracking system 18 includes processing circuitry configured to process data from the one or more cameras to determine the viewer position, this portion of the head tracking system may also be considered control circuitry (e.g., control circuitry 16). Control circuitry 16 may include a graphics processing unit (GPU) that generates image data to be displayed on display 14. The GPU may generate image data based on the viewer position information.
In general, electronic device 10 includes one or more cameras 54 for capturing images of an environment around the display (e.g., an area in front of the display where viewers are expected to be located). Control circuitry within the electronic device uses the images from the one or more cameras to identify which viewing zones are occupied by the viewer. The control circuitry then controls the display accordingly based on the occupied viewing zones.
The viewing zones occupied by the viewer may always display images (e.g., be turned on) to ensure images are presented to the viewer. One or more viewing zones adjacent to the occupied viewing zones may also be turned on to ensure low latency if the user changes viewing zones. One or more unoccupied viewing zones may be turned off to conserve power in the device. This example is merely illustrative, and all of the viewing zones may remain on if desired.
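A minimal sketch of this zone-enabling policy (occupied zones and their immediate neighbors on, remaining zones off) might look as follows; the set-based representation is merely illustrative.

```python
NUM_ZONES = 14

def zones_to_enable(occupied_zones, num_zones=NUM_ZONES):
    """Keep the occupied zones and their immediate neighbors on; remaining
    zones may be turned off to conserve power (keeping all zones on remains a
    valid fallback)."""
    enabled = set()
    for zone in occupied_zones:
        for candidate in (zone - 1, zone, zone + 1):
            if 1 <= candidate <= num_zones:
                enabled.add(candidate)
    return enabled

# Example: viewer's eyes tracked in zones 6 and 7 -> zones 5, 6, 7, and 8 stay on.
print(sorted(zones_to_enable({6, 7})))
```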
Ideally, a viewer's eye in a given viewing zone would only receive light from display pixels assigned to that viewing zone. However, in practice, there may be crosstalk between adjacent viewing zones. In other words, a viewer's eye in viewing zone 6 in
Display 14 may be configured to display images that, when perceived by the viewer, appear present on virtual plane 66 behind the display. Consider an example where display 14 displays images so that the viewer perceives a dot 60 (sometimes referred to as virtual dot 60) in virtual plane 66. The dot may be displayed on display 14 at point 62-1 to be viewable at viewing location 58-1. The dot may be displayed on display 14 at point 62-2 to be viewable at viewing location 58-2. The dot may be displayed on display 14 at point 62-3 to be viewable at viewing location 58-3.
There is a first disparity 64-1 between points 62-2 and 62-1. In other words, the point at which the dot is displayed is shifted on the display by disparity 64-1 so that points 62-1 and 62-2 appear at the same location on virtual plane 66. There is a second disparity 64-2 between points 62-2 and 62-3. In other words, the point at which the dot is displayed is shifted on the display by disparity 64-2 so that points 62-2 and 62-3 appear at the same location on virtual plane 66.
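The relationship between the separation of the viewing locations, the depth of virtual plane 66 behind the display, and the resulting disparity can be sketched with similar triangles, as below. The viewing distance, virtual-plane depth, and viewing-location spacing used here are assumed example numbers that do not appear in this description.

```python
def on_display_position(eye_x, view_dist, dot_x, dot_depth):
    """Where a virtual dot located dot_depth behind the display plane must be
    drawn on the display (at z = 0) so that an eye at horizontal position
    eye_x and distance view_dist in front of the display sees it at dot_x."""
    t = view_dist / (view_dist + dot_depth)   # similar triangles
    return eye_x + t * (dot_x - eye_x)

# Assumed example numbers (millimeters): 500 mm viewing distance, dot 100 mm
# behind the display, viewing locations 60 mm apart.
p1 = on_display_position(eye_x=-60.0, view_dist=500.0, dot_x=0.0, dot_depth=100.0)
p2 = on_display_position(eye_x=0.0, view_dist=500.0, dot_x=0.0, dot_depth=100.0)
disparity = p2 - p1   # 60 * 100 / (500 + 100) = 10 mm in this example
```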
In the example of
Viewing location 58-2 in
Viewing location 58-3 in
Consider the example where a viewer's eye is positioned at viewing location 58-2 (e.g., in zone B). When only the pixels in zone B are turned on (e.g., and zones A and C are turned off), an image as shown by ‘zone B’ in
The convex curvature and lenticular arrangement of display 14 cause a disparity along the X-axis between the perceived images from zones A and B and from zones B and C. In the example of
As shown in
As shown in
To reiterate, zones A, B, and C in
As shown in
It should be noted that the example in
Additionally, the example of the shift of the perceived images in
To mitigate unintended shifts in perceived images caused by disparity, compensation of the displayed images may be performed. To compensate for disparity-caused shifts, the magnitude of the disparity-caused shift may be measured across the display. Because the display panel and the corresponding disparity-caused shifts are symmetrical, the effect of disparity may be measured for only one half of the display. The effect of disparity measured for the first half of the display may then be assumed to also apply to the second half of the display.
Next, the camera may capture images while multiple zones are turned on. For example, both the zone aligned with location 58-2 and adjacent zones aligned with adjacent viewing areas (e.g., zones A, B, and C in
The images captured by the camera may be used to measure the shift of each edge across the display. For evaluation purposes, during the measurements for compensation, the rectangles (bars) displayed in
The difference between the position of the edges along the X-axis when only one zone is turned on (as in the upper portion of
In
A trend curve 104 may be fit to the data points 102 in the graph of
The polynomial curve fit to the data points in
The example of using a third order polynomial curve for trend curve 104 is merely illustrative. If desired, the trend curve may be a polynomial curve of a different order (e.g., second order, fourth order, fifth order, sixth order, greater than sixth order, etc.) or an entirely different type of curve.
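A rough sketch of this curve-fitting step is given below using a least-squares polynomial fit; the edge positions and shift values are placeholder numbers rather than measured calibration data. Because the panel is symmetrical, the fitted curve for one half of the display may be mirrored onto the other half.

```python
import numpy as np

# Placeholder calibration data: horizontal position of each measured edge and
# the shift (in pixels) of that edge when adjacent zones are on versus when
# only the aligned zone is on.
edge_positions = np.array([100.0, 400.0, 700.0, 1000.0, 1300.0, 1600.0, 1900.0])
measured_shifts = np.array([0.1, 0.6, 1.4, 2.2, 3.4, 4.9, 6.8])

# Fit a third-order polynomial trend curve to the (position, shift) points.
coeffs = np.polyfit(edge_positions, measured_shifts, deg=3)
trend = np.poly1d(coeffs)

# The trend curve then predicts the disparity-caused shift at any pixel position.
print(trend(1500.0))
```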
The trend curve obtained during the calibration measurements of
The difference between the position of the edges along the X-axis when only one zone is turned on (as in the upper portion of
In
The display pipeline circuitry may render a plurality of two-dimensional images of target content, with each two-dimensional image corresponding to a different view of the target content. In one example, the target content may be based on a two-dimensional (2D) image and a three-dimensional (3D) image. The two-dimensional image and the three-dimensional image may optionally be captured by a respective two-dimensional image sensor and three-dimensional image sensor in electronic device 10. This example is merely illustrative. The content may be rendered based on two-dimensional/three-dimensional images from other sources (e.g., from sensors on another device, computer-generated images, etc.). In some cases, the content may be rendered based on the viewer position detected by eye and/or head tracking system 18.
The images generated by display pipeline circuitry 110 may be compensated based on various factors. For example, the images may be compensated based on a brightness setting for the device, ambient light levels, disparity-caused shift calibration information (e.g., the trend curve from
As shown in
Content rendering circuitry 122 may render content to be displayed on display 14. As previously discussed, there is flexibility in the type of content that is displayed in each of the viewing zones of display 14. However, herein an illustrative example will be described where the viewing zones are used to display images of the same content at different perspectives (views). In other words, each subset of the pixel array associated with a given viewing zone displays a different view of the same content. As a viewer changes viewing zones, the appearance of the content gradually changes to simulate looking at a real-world object.
Content rendering circuitry 122 may render content for the plurality of views based on a received two-dimensional (2D) image and a three-dimensional (3D) image (e.g., from respective sensors in device 10). The two-dimensional image and three-dimensional image may be images of the same content. In other words, the two-dimensional image may provide color/brightness information for given content while the three-dimensional image provides a depth map associated with the given content. The two-dimensional image only has color/brightness information for one view of the given content. Content rendering circuitry 122 may optionally include a machine learning model.
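As a heavily simplified stand-in for this rendering step (and not the machine learning model that circuitry 122 may optionally include), the sketch below generates candidate views by shifting each pixel of a color image horizontally in proportion to its depth; occlusion handling and hole filling are ignored, and all names and array shapes are illustrative.

```python
import numpy as np

def render_view(color, depth, view_offset):
    """Render one candidate view by shifting each pixel of an (H, W) color
    image horizontally in proportion to its depth value. view_offset scales
    the per-pixel shift for a given viewing zone. Occlusions and holes are
    ignored in this simplified sketch."""
    h, w = color.shape
    out = np.zeros_like(color)
    cols = np.arange(w)
    for row in range(h):
        shifted = np.clip(cols + np.round(view_offset * depth[row]).astype(int), 0, w - 1)
        out[row, shifted] = color[row, cols]
    return out

color = np.random.rand(4, 16)   # stand-in for the 2D image (brightness only)
depth = np.random.rand(4, 16)   # stand-in for the depth map from the 3D image
views = [render_view(color, depth, offset) for offset in np.linspace(-1.0, 1.0, 14)]
```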
Content rendering circuitry 122 may use 3D modeling circuitry 132 to generate a three-dimensional model of the content intended to be displayed (based on received three-dimensional images). Cylindrical warping adjustment circuitry 134 may perform adjustments on the three-dimensional model that correct for the convex curvature of display 14. Ultimately, content rendering circuitry 122 outputs information to UV map 124 (sometimes referred to as UV mapping circuitry 124 or mapping circuitry 124). A UV map (sometimes referred to as simply a map) is a flat representation of the surface of the three-dimensional content that is displayed on display 14. In other words, the UV map includes depth information (u, v) that represents the texture of the content ultimately displayed on display 14. UV mapping circuitry 124 may generate the final UV map based on information from cylindrical warping adjustment circuitry 134, as one example.
As shown in
Content rendering circuitry 122 may also include texture generation circuitry 136 that generates color and brightness information for display 14. The color and brightness information generated by texture generation circuitry 136 may be based on the 2D image and/or 3D image received by content rendering circuitry 122. The texture generation circuitry 136 may output a single two-dimensional image (that is intended to be displayed on display 14) or a plurality of two-dimensional images, with each 2D image corresponding to a respective viewing zone of the display. In other words, a first 2D image is displayed in the first viewing zone, a second 2D image is displayed in the second viewing zone, etc.
Pixel mapping circuitry 128 may receive the color and brightness information from texture generation circuitry 136, the UV map from UV mapping circuitry 124, and the view map from view mapping circuitry 126. Based on the received information, pixel mapping circuitry 128 outputs pixel brightness values for each pixel in the display. Image processing circuitry 130 may perform optional adjustments (e.g., color compensation, border masking, burn-in compensation, panel response correction, dithering, etc.) to the output pixel brightness values that are ultimately provided to display driver circuitry 30 and displayed on pixel array 112.
As an example, the pixel mapping circuitry may identify a first subset of pixels in the pixel array that is visible at viewing zone 1 (e.g., using the view map). The pixel mapping circuitry then uses the UV map and the received color and brightness information to map a first two-dimensional image to the first subset of pixels. Once displayed, the first two-dimensional image is viewable at viewing zone 1. The pixel mapping circuitry may then identify a second subset of pixels in the pixel array that is visible at viewing zone 2 (e.g., using the view map). The pixel mapping circuitry then uses the UV map and the received color and brightness information to map a second two-dimensional image to the second subset of pixels. Once displayed, the second two-dimensional image is viewable at viewing zone 2. This type of pixel mapping is repeated for every view included in the display. Once complete, pixel mapping circuitry 128 outputs pixel data for each pixel in the pixel array. The pixel data includes a blend of independent, two-dimensional images (with different views of the same content).
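A reduced sketch of this pixel-mapping step is shown below. It omits the UV-map texture lookup and simply copies each physical pixel from the per-view image corresponding to that pixel's zone in the view map; the cyclic view map and the array names are illustrative assumptions.

```python
import numpy as np

def map_pixels(view_map, view_images):
    """Blend per-view two-dimensional images into a single panel frame.
    view_map is an (H, W) array giving, for each physical pixel, the index of
    the viewing zone that pixel serves; view_images is a list of (H, W)
    images, one per viewing zone."""
    panel = np.zeros_like(view_images[0])
    for zone_index, image in enumerate(view_images):
        mask = view_map == zone_index
        panel[mask] = image[mask]   # each pixel takes its value from the view it serves
    return panel

num_zones, h, w = 14, 6, 20
view_map = np.tile(np.arange(w) % num_zones, (h, 1))            # illustrative cyclic view map
view_images = [np.full((h, w), float(z)) for z in range(num_zones)]
frame = map_pixels(view_map, view_images)
```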
It should be understood that the subset of pixels used to display each view may be non-contiguous. For example, the subset of pixels for each view may include a plurality of discrete vertical pixel strips. These discrete sections of pixels may be separated by pixels that are used to display other views to the viewer.
Texture generation circuitry 136 may update the content provided to pixel mapping circuitry 128 on a frame-by-frame basis. UV mapping circuitry 124 may also intermittently update the UV map (e.g., based on a new 3D image received by content rendering circuitry 122). However, the UV map is not updated every frame.
Display pipeline circuitry 110 may include disparity-caused shift compensation circuitry 138 that is used to compensate for disparity-caused shifts along the X-axis of the display (as discussed in connection with
The disparity-caused shift compensation circuitry 138 may include disparity-caused shift calibration information that is used to compensate for disparity-caused shift in the display. The disparity-caused shift calibration information may be, for example, data generated during calibration operations of the type shown and discussed in connection with
Consider the example where disparity-caused shift compensation circuitry 138 is incorporated in UV mapping circuitry 124. The UV mapping circuitry 124 may receive 3D model information from content rendering circuitry 122 (as previously discussed). UV mapping circuitry 124 generates a UV map that includes depth information (u, v) that represents the texture of the content for display 14. The UV map is a flat (2D) representation of the surface of the three-dimensional content that is displayed on display 14. Disparity-caused shift compensation circuitry 138 may compensate the 2D representation of the surface of the 3D content to mitigate disparity-caused shift in the display. The disparity-caused shift compensation circuitry 138 may apply the polynomial curve that represents disparity-caused shift to the UV map. This pre-distorts the UV map based on the measured disparity-caused shift. Later, the distortions (e.g., shifts) caused by disparity will be applied to the pre-distorted UV map, resulting in an undistorted perceived final image.
Consider the example where disparity-caused shift compensation circuitry 138 is incorporated in view mapping circuitry 126. The view mapping circuitry 126 may have a display calibration map (e.g., as determined during calibration operations) that indicates which physical pixels on the display panel correspond to which viewing zones. Disparity-caused shift compensation circuitry 138 may compensate the display calibration map (view map) to mitigate disparity-caused shift in the display. The disparity-caused shift compensation circuitry 138 may apply the polynomial curve that represents disparity-caused shift to the display calibration map. This pre-distorts the view map based on the measured disparity-caused shift. Later, the distortions (e.g., shifts) caused by disparity will be applied to the pre-distorted view map, resulting in an undistorted perceived final image.
Consider the example where disparity-caused shift compensation circuitry 138 is incorporated in texture generation circuitry 136. Disparity-caused shift compensation circuitry 138 may compensate the color and brightness information output by texture generation circuitry 136 to mitigate disparity-caused shift in the display. The disparity-caused shift compensation circuitry 138 may apply the polynomial curve that represents disparity-caused shift to the color and brightness information. This pre-distorts the color and brightness information based on the measured disparity-caused shift. Later, the distortions (e.g., shifts) caused by disparity will be applied to the pre-distorted color and brightness information, resulting in an undistorted perceived final image.
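The pre-distortion step described in the three examples above might be sketched as follows for a generic two-dimensional map (the UV map's u channel, the view map, or a texture channel); the trend-curve coefficients and the row-wise resampling are illustrative assumptions.

```python
import numpy as np

def predistort(map_2d, shift_poly):
    """Pre-distort a two-dimensional map (e.g., the UV map's u channel, the
    view map, or a texture channel) by resampling each row against the
    predicted disparity-caused shift, so that the shift later introduced by
    the display cancels out."""
    h, w = map_2d.shape
    x = np.arange(w, dtype=float)
    shifted_x = x + shift_poly(x)        # where disparity will move each column
    out = np.empty_like(map_2d, dtype=float)
    for row in range(h):
        out[row] = np.interp(shifted_x, x, map_2d[row])
    return out

# Example with an assumed trend curve of the kind produced by the calibration fit.
shift_poly = np.poly1d([1.0e-9, -2.0e-6, 2.0e-3, 0.0])
u_channel = np.tile(np.linspace(0.0, 1.0, 1024), (8, 1))
u_predistorted = predistort(u_channel, shift_poly)
```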
Display pipeline circuitry 110 in
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims priority to U.S. provisional patent application No. 63/296,417, filed Jan. 4, 2022, which is hereby incorporated by reference herein in its entirety.