The present application claims the benefit of and priority to Japanese Patent Application Number 2019-137636, filed on Jul. 26, 2019, which is hereby incorporated by reference in its entirety.
The disclosed technology generally relates to a display driver, a display module, and a method for driving a display panel.
A display panel may be configured in a zigzag pixel arrangement in which rows of pixels in adjacent horizontal lines are offset from each other. Meanwhile, a display panel, especially one of a large size, may be driven with a plurality of display drivers. In some cases, the plurality of display drivers may be adapted to a zigzag pixel arrangement.
This summary is provided to introduce in a simplified form a selection of concepts that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to limit the scope of the claimed subject matter.
In one or more embodiments, a display driver is provided. The display driver includes interface circuitry, image data processing circuitry, and drive circuitry. The interface circuitry is configured to receive first frame image data for a first frame image. The image data processing circuitry includes a buffer memory configured to store at least part of the first frame image data. The image data processing circuitry is configured to supply, based on the at least part of the first frame image data stored in the buffer memory, first display data defined for a first display area of a plurality of display areas of a display panel having a zigzag pixel arrangement. The drive circuitry is configured to drive a display element in the first display area based on the first display data.
In one or more embodiments, a display module is provided. The display module includes a display panel and a plurality of display drivers. The display panel has a zigzag pixel arrangement and includes a plurality of display areas. The plurality of display drivers is configured to drive the plurality of display areas, respectively. A first display driver of the plurality of display drivers includes first interface circuitry, first image data processing circuitry, and first drive circuitry. The first interface circuitry is configured to receive first frame image data for a first frame image. The first image data processing circuitry is configured to extract first image area image data defined for a first image area of the first frame image and first boundary image data from the first frame image data. The first boundary image data includes pixel data defined for pixels located in a portion of a second image area adjacent to the first image area of the first frame image, the portion of the second image area being in contact with a boundary between the first image area and the second image area. The first image data processing circuitry is further configured to supply first display data based on the first image area image data and the first boundary image data. The first drive circuitry is configured to drive a display element in a first display area of the plurality of display areas based on the first display data.
In one or more embodiments, a method for driving a display panel is provided.
The method includes: receiving, by a first display driver, first frame image data for a first frame image; and extracting, by the first display driver, first image area image data and first boundary image data from the first frame image data. The first image area image data is defined for a first image area of the first frame image. The first boundary image data includes pixel data defined for pixels located in a portion of a second image area adjacent to the first image area of the first frame image, the portion being in contact with the first image area. The method further includes generating, based on the first image area image data and the first boundary image data, first display data defined for a first display area of a plurality of display areas of a display panel having a zigzag pixel arrangement, and driving, by the first display driver, a display element in the first display area based on the first display data.
Other aspects of the embodiments will be apparent from the following description and the appended claims.
So that the manner in which the above-recited features of the present disclosure can be understood in detail, a more particular description of the disclosure, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only exemplary embodiments and are therefore not to be considered limiting of the inventive scope, as the disclosure may admit to other equally effective embodiments.
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized in other embodiments without specific recitation. Suffixes may be attached to reference numerals to distinguish identical elements from each other. The drawings referred to herein should not be understood as being drawn to scale unless specifically noted. Also, the drawings are often simplified, and details or components are omitted for clarity of presentation and explanation. The drawings and discussion serve to explain the principles discussed below, where like designations denote like elements.
The following detailed description is merely exemplary in nature and is not intended to limit the disclosure or the application and uses of the disclosure. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background, summary, or the following detailed description.
The present disclosure provides various schemes for driving a display panel configured in a zigzag pixel arrangement (which may be hereinafter simply referred to as a zigzag display panel) with a plurality of display drivers (e.g., a plurality of display driver integrated circuit (DDIC) chips, a plurality of touch and display driver integration (TDDI) chips, and other devices configured to drive a display panel). In a zigzag display panel, rows of pixels in adjacent horizontal lines may be shifted from each other. In driving a zigzag display panel with a plurality of display drivers, image inconsistency may occur at a boundary between adjacent display areas that are driven by different display drivers.
To address the inconsistency at boundaries, a display driver may be configured to receive frame image data for a frame image and generate display data for a corresponding display area of a plurality of display areas of the display panel. In one implementation, the display driver may be configured to extract, from the frame image data, part of the frame image data for a corresponding image area as well as boundary image data. The boundary image data may include pixel data for pixels located in a portion of an image area of the frame image adjacent to the corresponding image area, the portion being in contact with the corresponding image area. The display driver may be configured to generate the display data based on the part of the frame image data for the corresponding image area and the boundary image data.
The display panel 1 may be segmented into a plurality of display areas 3 such that the number of the display areas 3 is identical to the number of the display drivers 2. In the illustrated embodiment, the number of the display drivers 2 is two, and the two display drivers 2 have the same configuration. The display areas 3 include a left area 31 and a right area 32, which are arrayed in the horizontal direction, indicated by the x axis of the illustrated xy coordinate system.
The display drivers 2 include a left chip 21 configured to drive display elements disposed in the left area 31 of the display panel 1 and a right chip 22 configured to drive display elements disposed in the right area 32.
The left chip 21 and right chip 22 may be configured to support multidrop communication with a host 4 via a bus 5. In various embodiments, frame image data for the entirety of a frame image may be sent to both the left chip 21 and right chip 22 using the multidrop communication. The frame image data may include pixel data for respective pixels of the frame image. In one implementation, pixel data for each pixel may include grayscale values of respective colors (e.g., red, green, and blue). The left chip 21 is configured to drive display elements disposed in the left area 31 based on the frame image data received from the host 4, and the right chip 22 is configured to drive display elements disposed in the right area 32 based on the frame image data.
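As a non-limiting illustration of the pixel data layout described above, one frame's worth of data might be modeled as follows (the type names are hypothetical and introduced only for illustration):

```python
from dataclasses import dataclass

@dataclass
class PixelData:
    """Pixel data for one pixel: a grayscale value per color component."""
    r: int  # red grayscale value (e.g., 0-255)
    g: int  # green grayscale value
    b: int  # blue grayscale value

# Frame image data as sent over the multidrop bus to both chips:
# pixel data for every pixel of the frame image, line by line.
FrameImageData = list[list[PixelData]]
```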
The R subpixels 7R, the G subpixels 7G, and the B subpixels 7B may include display elements configured to display red, green, and blue, respectively. In embodiments where the display panel 1 includes an organic light emitting diode (OLED) display panel, each display element may include a light emitting element, a select transistor, and a storage capacitor. In embodiments where the display panel 1 includes a liquid crystal display (LCD) panel, each display element may include a pixel electrode, a select transistor, and a storage capacitor. Each pixel 6 may additionally include one or more other subpixels 7 configured to display colors other than red, green, and blue.
In one or more embodiments, the display panel 1 is configured in a zigzag pixel arrangement in which the pixels 6 in adjacent horizontal lines are shifted from each other in the horizontal direction.
The shift amount and/or direction of the pixels 6 may be variously modified in other embodiments.
In embodiments where the display panel 1 is configured in a zigzag pixel arrangement, a subpixel 7 located near the boundary 1a in the left area 31 may be driven based on pixel data for a pixel in the right half image area of the original frame image.
In one or more embodiments, a subpixel 7 located near the boundary 1a in the right area 32 may correspondingly be driven based on pixel data for a pixel in the left half image area of the original frame image.
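As a non-limiting illustration of why this happens, the following sketch assumes a zigzag arrangement in which every other horizontal line is shifted by one pixel; the shift model is an assumption for illustration only, not the disclosed arrangement:

```python
HALF = 1920  # pixels per half image area, assuming a 3840-pixel frame width

def source_column(line: int, panel_column: int) -> int:
    """Map a panel column on a given horizontal line to the frame-image
    column whose pixel data drives it, under the assumed one-pixel shift
    of every other horizontal line."""
    shift = 1 if line % 2 else 0
    return panel_column + shift

# On a shifted line, the last panel column of the left area 31 draws its
# pixel data from column 1920, which lies in the right half image area:
assert source_column(1, HALF - 1) == HALF
```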
The left chip 21 and the right chip 22 each include interface circuitry 11, image data processing circuitry 12, and drive circuitry 13.
The interface circuitry 11 may be configured to receive frame image data 31 from the host 4 and forward the same to the image data processing circuitry 12. In one implementation, communications between the display drivers 2 and the host 4 may be achieved through low voltage differential signaling (LVDS), and the interface circuitry 11 may include an LVDS interface. In one or more embodiments, the frame image data 31 received by the interface circuitry 11 and forwarded to the image data processing circuitry 12 during a vertical sync period may include pixel data for all the pixels of one frame image. In other embodiments, the interface circuitry 11 may be configured to process frame image data received from the host 4 and use the processed frame image data as the frame image data 31 to be forwarded to the image data processing circuitry 12.
In one or more embodiments, the frame image data 31 includes left image data 32 and right image data 33. The left image data 32 may correspond to the left half image area of the frame image and include pixel data for pixels in the left half image area, where the pixel data may include grayscale values of the respective colors (e.g., red, green, and blue). The right image data 33 may correspond to the right half image area of the frame image and include grayscale values of the respective colors of pixels in the right half image area.
Left image data 32 for one horizontal line may include pixel data for a number of pixels equal to half the horizontal resolution of the frame image. In embodiments where the horizontal resolution of the frame image is 3840 pixels, left image data 32 for one horizontal line may include pixel data for 1920 pixels. Right image data 33 for one horizontal line may correspondingly include pixel data for a number of pixels equal to half the horizontal resolution of the frame image, that is, pixel data for 1920 pixels in such embodiments.
In one or more embodiments, the image data processing circuitry 12 is configured to generate, based on the frame image data 31 received from the interface circuitry 11, display data 34 used by the drive circuitry 13 to drive the display panel 1. In the following, the display data 34 generated in the left chip 21 may be denoted as display data 341, and the display data 34 generated in the right chip 22 as display data 342.
The drive circuitry 13 of the left chip 21 is configured to drive the display elements in the left area 31 of the display panel 1 in response to the display data 341 received from the image data processing circuitry 12, and the drive circuitry 13 of the right chip 22 is configured to drive the display elements in the right area 32 of the display panel 1 in response to the display data 342 received from the image data processing circuitry 12.
The image data processing circuitry 12 may include a line memory (LM) 21, a buffer memory (BM) 22, an image processing (IP) core 23, IP control circuitry 24, and a line latch 25.
The line memory 21 may be configured to store the frame image data 31 received from the interface circuitry 11 for one horizontal line. In embodiments where the horizontal resolution of the original frame image is 3840 pixels, the line memory 21 may have a capacity to store pixel data for 3840 pixels.
The buffer memory 22 is configured to sequentially receive and store the frame image data 31 from the line memory 21. The buffer memory 22 may be configured to store the frame image data 31 for multiple horizontal lines.
In one implementation, each of the left chip 21 and the right chip 22 may include a touch controller (not illustrated) for proximity sensing to sense an approach or contact of an input object to a touch panel. In such embodiments, the number of horizontal lines for which the buffer memory 22 is configured to store the frame image data 31 may be selected to provide sufficient time for the touch controller to achieve the proximity sensing in each vertical sync period.
The image processing IP core 23 is configured to process the frame image data 31 received from the buffer memory 22 to generate processed image data 35. In the following, the processed image data 35 generated in the left chip 21 may be denoted as processed image data 351, and the processed image data 35 generated in the right chip 22 as processed image data 352.
In one or more embodiments, the processed image data 351 generated by the image processing IP core 23 of the left chip 21 include processed left image data 36 and processed right boundary image data 37. The processed left image data 36 may be generated based on the left image data 32 of the frame image data 31. The processed left image data 36 may be generated by applying desired image processing to the left image data 32. In other embodiments, the left image data 32 extracted from the frame image data 31 may be used as the processed left image data 36 without modification. The processed right boundary image data 37 may be generated based on pixel data of the right image data 33 for the pixels located in a portion of the right half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area. In one embodiment, the processed right boundary image data 37 may be generated by extracting, from the right image data 33, pixel data for pixels in the portion of the right half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area, and applying image processing to the extracted pixel data. In other embodiments, the above-described pixel data extracted from the right image data 33 may be used as the processed right boundary image data 37 without modification.
In one or more embodiments, the processed image data 352 generated by the image processing IP core 23 of the right chip 22 includes processed right image data 38 and processed left boundary image data 39. The processed right image data 38 may be generated based on the right image data 33 of the frame image data 31. The processed right image data 38 may be generated by applying desired image processing to the right image data 33. In other embodiments, the right image data 33 extracted from the frame image data 31 may be used as the processed right image data 38 without modification. The processed left boundary image data 39 may be generated based on pixel data of the left image data 32 for the pixels located in a portion of the left half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area. In one embodiment, the processed left boundary image data 39 may be generated by extracting, from the left image data 32, pixel data for pixels in the portion of the left half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area, and applying image processing to the extracted pixel data. In other embodiments, the above-described pixel data extracted from the left image data 32 may be used as the processed left boundary image data 39 without modification.
The line latches 25 may be configured to store the processed image data 35 for one horizontal line. In one implementation, the line latch 25 of the left chip 21 is configured to store the processed image data 351, and the line latch 25 of the right chip 22 is configured to store the processed image data 352. The line latches 25 are adapted for data transfer to the drive circuitry 13.
In one or more embodiments, data sorting is performed during the data transfer from the line latches 25 to the drive circuitry 13 to generate and supply display data 34 to the drive circuitry 13. The data sorting may be performed in accordance with the arrangement of the pixels 6 of the display panel 1. In one or more embodiments, part of the processed image data 351 stored in the line latch 25 of the left chip 21 and used to drive the display elements in the left area 31 is selected in accordance with the arrangement of the pixels 6 of the display panel 1, and the selected part of the processed image data 351 is transferred to the drive circuitry 13. In one implementation, the part of the processed image data 351 thus transferred to the drive circuitry 13 is used as the display data 341. In one or more embodiments, part of the processed image data 352 stored in the line latch 25 of the right chip 22 is correspondingly selected in accordance with the arrangement of the pixels 6 of the display panel 1, and the selected part of the processed image data 352 is transferred to the drive circuitry 13. In one implementation, the part of the processed image data 352 thus transferred to the drive circuitry 13 is used as the display data 342.
In one or more embodiments, the processed image data 351 generated in the left chip 21 includes the processed right boundary image data 37 for all the horizontal lines of the frame image, and the processed image data 352 generated in the right chip 22 includes the processed left boundary image data 39 for all the horizontal lines of the frame image. This enables generating the display data 341 and 342 adaptively to various arrangements of the pixels 6 of the display panel 1 by modifying the data sorting performed during the data transfer from the line latches 25 to the drive circuitry 13.
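As a non-limiting sketch of such data sorting, the following assumes the one-pixel-per-alternate-line shift used in the earlier illustration; the selection rule is an assumption, since the actual rule depends on the panel's pixel arrangement:

```python
def sort_for_left_chip(processed_left: list, right_boundary: list,
                       line: int) -> list:
    """Assemble one horizontal line of display data 341 in the left chip.

    Sketch only: on unshifted lines the left-area display elements use the
    processed left image data as-is; on shifted lines the first column is
    dropped and one processed right boundary pixel is appended, so the
    subpixels nearest the boundary 1a get data from the right half image.
    """
    if line % 2 == 0:
        return list(processed_left)
    return list(processed_left[1:]) + list(right_boundary[:1])
```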
In one or more embodiments, frame image data 31 received by the interface circuitry 11 during each vertical sync period include pixel data for all the pixels of one frame image, and the data extraction circuitry 41 is configured to extract pixel data to be stored in the line memory 42 and the buffer memory 43 from the frame image data 31 received from the interface circuitry 11. The extracted pixel data may be forwarded to the line memory 42.
The data extraction circuitry 41 of the left chip 21 is configured to extract left image data 32 and right boundary image data 51 from the frame image data 31 received from the interface circuitry 11. The left image data 32 may correspond to the left half image area of the frame image and include grayscale values of the respective colors (e.g., red, green, and blue) of pixels in the left half image area. The right boundary image data 51 may include pixel data for pixels located in a portion of the right half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area. The extracted left image data 32 and the right boundary image data 51 may be forwarded to the line memory 42 of the left chip 21.
The data extraction circuitry 41 of the right chip 22 is configured to extract right image data 33 and left boundary image data 52 from the frame image data 31 received from the interface circuitry 11. The right image data 33 may correspond to the right half image area of the frame image and include grayscale values of the respective colors of pixels in the right half image area. The left boundary image data 52 may include pixel data for pixels located in a portion of the left half image area of the frame image, the portion being in contact with the boundary between the left half image area and the right half image area. The extracted right image data 33 and the left boundary image data 52 may be forwarded to the line memory 42 of the right chip 22.
The right and left boundary image data 51 and 52 for one horizontal line may include pixel data for a number of pixels, the number being determined in accordance with image processing performed in the image processing IP cores 44. In one or more embodiments, the image processing IP cores 44 are each configured to perform image processing in units of blocks each consisting of a pixels located in the same horizontal line, where a is a natural number of two or more, and the right and left boundary image data 51 and 52 for one horizontal line may each include pixel data for the a pixels of one block.
The line memory 42 of the left chip 21 is configured to sequentially store the left image data 32 and the right boundary image data 51 received from the corresponding data extraction circuitry 41 and sequentially forward the same to the corresponding buffer memory 43. The line memory 42 of the right chip 22 is configured to sequentially store the right image data 33 and the left boundary image data 52 received from the corresponding data extraction circuitry 41 and sequentially forward the same to the corresponding buffer memory 43.
In one or more embodiments, the data extraction circuitry 41 of the left chip 21 is configured to extract pixel data #1 to #1920 as the left image data 32 and further extract pixel data #1921 to #1928 as the right boundary image data 51. The illustrated embodiment corresponds to the case where the image processing IP core 44 is configured to perform image processing in units of blocks each consisting of eight pixels located in the same horizontal line. The extracted left image data 32 and right boundary image data 51 may be forwarded to and stored in the line memory 42. The left image data 32 and the right boundary image data 51 stored in the line memory 42 may be forwarded to the buffer memory 43 in the next horizontal sync period.
In one or more embodiments, the data extraction circuitry 41 of the right chip 22 is configured to extract pixel data #1913 to #1920 as the left boundary image data 52 and further extract pixel data #1921 to #3840 as the right image data 33. The extracted left boundary image data 52 and right image data 33 may be forwarded to and stored in the line memory 42. The left boundary image data 52 and the right image data 33 stored in the line memory 42 may be forwarded to the buffer memory 43 in the next horizontal sync period.
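A non-limiting sketch of this extraction, using the pixel numbering above (with zero-based list indices, so pixel #1 is index 0) and the eight-pixel block size:

```python
BLOCK = 8    # block size a of the image processing IP core 44 (here, 8)
HALF = 1920  # half of the 3840-pixel horizontal resolution

def extract_for_left_chip(line: list) -> tuple:
    """Left chip 21: left image data 32 (pixel data #1-#1920) and right
    boundary image data 51 (pixel data #1921-#1928, one block)."""
    left_image_32 = line[0:HALF]
    right_boundary_51 = line[HALF:HALF + BLOCK]
    return left_image_32, right_boundary_51

def extract_for_right_chip(line: list) -> tuple:
    """Right chip 22: left boundary image data 52 (pixel data #1913-#1920,
    one block) and right image data 33 (pixel data #1921-#3840)."""
    left_boundary_52 = line[HALF - BLOCK:HALF]
    right_image_33 = line[HALF:2 * HALF]
    return left_boundary_52, right_image_33
```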
The operation of the data extraction circuitry 41 described above enables driving the display panel 1 configured in the zigzag pixel arrangement, while contributing to a reduction in the required capacities of the line memories 42 and the buffer memories 43.
The image processing IP core 44 of the left chip 21 may be configured to receive the left image data 32 and right boundary image data 51 and generate the processed image data 351 based on the received left image data 32 and right boundary image data 51. The processed image data 351 may include processed left image data 36 and processed right boundary image data 37, which may be generated based on the left image data 32 and the right boundary image data 51, respectively. In some embodiments, the processed left image data 36 and the processed right boundary image data 37 are generated by applying desired image processing to the left image data 32 and the right boundary image data 51, respectively. In other embodiments, the left image data 32 and the right boundary image data 51 may be used as the processed left image data 36 and the processed right boundary image data 37 without modification.
The image processing IP core 44 of the right chip 22 may be configured to receive the right image data 33 and left boundary image data 52 and generate the processed image data 352 based on the received right image data 33 and left boundary image data 52. The processed image data 352 may include processed right image data 38 and processed left boundary image data 39. The processed right image data 38 may be generated based on the right image data 33 of the frame image data 31. In some embodiments, the processed right image data 38 is generated by applying desired image processing to the right image data 33; in other embodiments, the right image data 33 may be used as the processed right image data 38 without modification. The processed left boundary image data 39 may be generated based on the left boundary image data 52. In some embodiments, the processed left boundary image data 39 may be generated by applying image processing to the left boundary image data 52; in other embodiments, the left boundary image data 52 may be used as the processed left boundary image data 39 without modification.
In one or more embodiments, the image processing IP cores 44 of the left and right chips 21 and 22 are configured to exchange control data used for the image processing. The image processing IP core 44 of the left chip 21 may be configured to calculate a feature value of the left image area of the frame image (e.g., the average picture level (APL) of the left image area) based on the left image data 32 and send the calculated feature value to the image processing IP core 44 of the right chip 22. The image processing IP core 44 of the right chip 22 may be configured to calculate a feature value of the right image area of the frame image (e.g., the average picture level (APL) of the right image area) based on the right image data 33 and send the calculated feature value to the image processing IP core 44 of the left chip 21. The image processing IP core 44 of the left chip 21 may be configured to calculate a feature value of the entire frame image based on the feature value calculated by itself and the feature value calculated by the right chip 22 and perform the image processing based on the calculated feature value of the entire frame image. The image processing IP core 44 of the right chip 22 may be configured to calculate a feature value of the entire frame image based on the feature value calculated by itself and the feature value calculated by the left chip 21 and perform the image processing based on the calculated feature value of the entire frame image. This operation enables the image processing IP cores 44 of both the left and right chips 21 and 22 to perform the image processing based on the feature value of the entire frame image (e.g., the APL of the entire frame image).
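As a non-limiting sketch of this exchange, assuming the APL is a plain average over pixels, each chip may combine its locally computed feature value with the one received from the other chip as follows (names are illustrative):

```python
def combine_apl(local_apl: float, local_pixels: int,
                remote_apl: float, remote_pixels: int) -> float:
    """Combine the per-image-area average picture levels exchanged between
    the left and right chips into the APL of the entire frame image.
    Sketch only: assumes APL is an unweighted mean of pixel levels."""
    total = local_pixels + remote_pixels
    return (local_apl * local_pixels + remote_apl * remote_pixels) / total

# Each chip runs the same calculation with the roles of "local" and
# "remote" swapped; with equal halves the result is the plain mean:
frame_apl = combine_apl(100.0, 1920, 120.0, 1920)  # 110.0
```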
In one implementation, the processed image data 351 and 352 are subjected to data transfer similar to that in the embodiment described above, including the data sorting performed during the transfer to the drive circuitry 13.
In various embodiments, a display driver 2 is configured to operate as the left chip 21 described above when placed in a left operation mode, and as the right chip 22 when placed in a right operation mode.
In step 701, first and second display drivers 2 (e.g., the left chip 21 and the right chip 22) receive first frame image data for a first frame image (e.g., the frame image data 31). In step 702, the first display driver 2 (e.g., the left chip 21) extracts first image area image data and first boundary image data from the first frame image data. The first image area image data includes pixel data for pixels in a first image area (e.g., the left half image area) of the first frame image. In embodiments where the first image area is the left half image area, the first image area image data may be or may include the left image data 32. The first boundary image data includes pixel data for boundary pixels located in a first portion of a second image area (e.g., the right half image area) of the first frame image, where the second image area is adjacent to the first image area and the first portion is in contact with the boundary between the first image area and the second image area. In embodiments where the second image area is the right half image area, the first boundary image data may be or may include the right boundary image data 51.
In step 703, the second display driver 2 (e.g., the right chip 22) extracts second image area image data and second boundary image data from the first frame image data. The second image area image data includes pixel data for pixels in the second image area (e.g., the right half image area) of the first frame image. In embodiments where the second image area is the right half image area, the second image area image data may be or may include the right image data 33. The second boundary image data includes pixel data for boundary pixels located in a second portion of the first image area (e.g., the left half image area) of the first frame image, where the second portion is in contact with the boundary between the first image area and the second image area. In embodiments where the first image area is the left half image area, the second boundary image data may be or may include the left boundary image data 52.
In step 704, the first display driver 2 generates first display data (e.g., the display data 341) based on the first image area image data and the first boundary image data. The first display driver 2 may generate processed first image area data (e.g., the processed left image data 36) and processed first boundary image data (e.g., the processed right boundary image data 37) by applying image processing to the first image area image data and the first boundary image data, respectively. The first display driver 2 may further generate the first display data based on the processed first image area data and the processed first boundary image data. The generation of the first display data may include data sorting or selection of the processed first image area data and the processed first boundary image data for each horizontal line.
In step 705, the second display driver 2 generates second display data (e.g., the display data 342) based on the second image area image data and the second boundary image data. The second display driver 2 may generate processed second image area data (e.g., the processed right image data 38) and processed second boundary image data (e.g., the processed left boundary image data 39) by applying image processing to the second image area image data and the second boundary image data, respectively. The second display driver 2 may further generate the second display data based on the processed second image area data and the processed second boundary image data. The generation of the second display data may include data sorting or selection of the processed second image area data and the processed second boundary image data for each horizontal line.
In step 706, the first display driver 2 drives display elements in a first display area (e.g., the left area 31) of the display panel 1 based on the first display data. In step 707, the second display driver 2 drives display elements in a second display area (e.g., the right area 32) of the display panel 1 based on the second display data.
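The per-driver flow of steps 701 through 707 may be summarized by the following non-limiting sketch, in which extract, process, sort, and drive are hypothetical stand-ins for the data extraction circuitry, the image processing IP core, the line-latch data sorting, and the drive circuitry of one display driver:

```python
def drive_frame(frame, extract, process, sort, drive):
    """One display driver's pass over a frame (sketch of steps 701-707)."""
    area_data, boundary_data = extract(frame)       # steps 702/703
    processed = process(area_data, boundary_data)   # image processing
    display_data = sort(processed)                  # steps 704/705
    drive(display_data)                             # steps 706/707
```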
In one or more embodiments, each display driver 2 is operable in a selected one of the left operation mode, the right operation mode, and an independent operation mode in which a single display driver 2 drives a display panel alone.
In one or more embodiments, when the display driver 2 is placed in the independent operation mode, the number of pixels for which pixel data are stored in the buffer memory 43 per horizontal line is reduced compared to the case when the display driver 2 is placed in the left operation mode or the right operation mode. In some embodiments, the number of horizontal lines for which pixel data are stored in the buffer memory 43 is increased when the display driver 2 is placed in the independent operation mode. This operation is useful, for example, when a touch controller (not illustrated) is integrated in the display driver 2. Storing pixel data for an increased number of horizontal lines in the buffer memory 43 is useful for providing sufficient time for achieving proximity sensing by the touch controller in each vertical sync period.
In one implementation, when the display driver 2 is placed in the left operation mode, left image data 32 and right boundary image data 51 for p horizontal lines may be stored in the buffer memory 43, where p is a natural number of two or more. In embodiments where the left image data 32 for one horizontal line includes pixel data for 1920 pixels and the right boundary image data 51 for one horizontal line includes pixel data for eight pixels, the number of pixels for which the buffer memory 43 stores pixel data per horizontal line is 1928 in the left operation mode.
When the display driver 2 is placed in the right operation mode, right image data 33 and left boundary image data 52 for p horizontal lines may be stored in the buffer memory 43. In embodiments where the right image data 33 for one horizontal line includes pixel data for 1920 pixels and the left boundary image data 52 for one horizontal line includes pixel data for eight pixels, the number of pixels for which the buffer memory 43 stores pixel data per horizontal line is 1928 also in the right operation mode.
When the display driver 2 is placed in the independent operation mode, frame image data 53 for q horizontal lines may be stored in the buffer memory 43, where q is a natural number larger than p.
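The following non-limiting sketch illustrates the trade-off under assumed numbers; the independent-mode line width of 1080 pixels is a hypothetical example of the reduced per-line pixel count, not a value from the disclosure:

```python
P_LINES = 10          # assumed p: lines buffered in the left/right modes
LINE_LR = 1920 + 8    # pixels per line in left/right modes (image + boundary)
LINE_IND = 1080       # hypothetical, narrower line width in independent mode

CAPACITY = P_LINES * LINE_LR  # fixed buffer memory 43 capacity, in pixels

# With fewer pixels stored per horizontal line in the independent mode,
# the same capacity holds more lines (q > p), leaving the integrated touch
# controller more time for proximity sensing in each vertical sync period.
q_lines = CAPACITY // LINE_IND
assert q_lines > P_LINES  # 19280 // 1080 = 17 > 10
```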
The left chip 21, the right chip 22, and the middle chip 23 may have the same configuration. Each display driver 2 may be configured to operate as the left chip 21, the right chip 22, and the middle chip 23, when placed in a left operation mode, a right operation mode, and a middle operation mode, respectively.
In one or more embodiments, frame image data 61 received by the interface circuitry 11 of each display driver 2 during each vertical sync period includes pixel data for all the pixels of one frame image, and the data extraction circuitry 41A is configured to extract pixel data to be stored in the line memory 42 and the buffer memory 43 from the frame image data 61 received from the interface circuitry 11. The extracted pixel data may be forwarded to the line memory 42.
The frame image data 61 may include left image data 62 (which may be also referred to as first image area image data), right image data 63 (which may be also referred to as second image area image data), and middle image data 64 (which may be also referred to as third image area image data). The left image data 62 may be associated with or defined for the left image area (which may be also referred to as first image area) of the frame image and include pixel data for respective pixels in the left image area. The right image data 63 may be associated with or defined for a right image area (which may be also referred to as second image area) of the frame image and include grayscale values of respective colors of respective pixels in the right image area. The middle image data 64 may be associated with or defined for a middle image area (which may be also referred to as third image area) of the frame image and include grayscale values of respective colors of respective pixels in the middle image area.
Left image data 62 for one horizontal line may include pixel data for a number of pixels, the number being one-third of the horizontal resolution of the frame image. In one implementation, the horizontal resolution of the frame image is 3840 pixels, and the left image data 62 for one horizontal line includes pixel data for 1280 pixels. Correspondingly, right image data 63 and middle image data 64 for one horizontal line may each include pixel data for a number of pixels, the number being one-third of the horizontal resolution of the frame image. In one implementation, the right image data 63 and the middle image data 64 for one horizontal line include pixel data for 1280 pixels.
The data extraction circuitry 41A of the left chip 21 may be configured to extract the left image data 62 and first right boundary image data 65 from the frame image data 61 received from the interface circuitry 11. The first right boundary image data 65 may include pixel data for pixels located in a portion of the middle image area of the frame image, the portion being adjacent to the left image area. The left image data 62 and first right boundary image data 65 thus extracted may be forwarded to the line memory 42 in the left chip 21.
The data extraction circuitry 41A of the right chip 22 may be configured to extract the right image data 63 and first left boundary image data 66 from the frame image data 61 received from the interface circuitry 11. The first left boundary image data 66 may include pixel data for pixels located in a portion of the middle image area of the frame image, the portion being adjacent to the right image area. The right image data 63 and first left boundary image data 66 thus extracted may be forwarded to the line memory 42 in the right chip 22.
The data extraction circuitry 41A of the middle chip 23 may be configured to extract the middle image data 64, second left boundary image data 67, and second right boundary image data 68 from the frame image data 61 received from the interface circuitry 11. The second left boundary image data 67 may include pixel data for pixels located in a portion of the left image area of the frame image, the portion being adjacent to the middle image area. The second right boundary image data 68 may include pixel data for pixels located in a portion of the right image area of the frame image, the portion being adjacent to the middle image area. The middle image data 64, second left boundary image data 67 and second right boundary image data 68 thus extracted may be forwarded to the line memory 42 in the middle chip 23.
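A non-limiting sketch of this three-driver extraction, assuming zero-based indices, 1280-pixel image areas, and the eight-pixel boundary block size used earlier:

```python
THIRD = 1280  # one-third of the 3840-pixel horizontal resolution
BLOCK = 8     # assumed boundary width: one image-processing block

def extract_three_chip(line: list, chip: str) -> dict:
    """Per-chip extraction of one horizontal line in the three-driver
    configuration; the middle chip 23 keeps boundary pixel data on both
    sides of the middle image area."""
    if chip == "left":
        return {"left_image_62": line[:THIRD],
                "first_right_boundary_65": line[THIRD:THIRD + BLOCK]}
    if chip == "middle":
        return {"second_left_boundary_67": line[THIRD - BLOCK:THIRD],
                "middle_image_64": line[THIRD:2 * THIRD],
                "second_right_boundary_68": line[2 * THIRD:2 * THIRD + BLOCK]}
    return {"first_left_boundary_66": line[2 * THIRD - BLOCK:2 * THIRD],
            "right_image_63": line[2 * THIRD:3 * THIRD]}
```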
Such operation of the data extraction circuitry 41A enables driving the display panel 1 configured in the zigzag pixel arrangement, while contributing to a reduction in the required capacities of the line memories 42 and the buffer memories 43.
The first right boundary image data 65, the first left boundary image data 66, the second left boundary image data 67, and the second right boundary image data 68 for one horizontal line may each include pixel data for a number of pixels, the number being determined based on image processing performed in the image processing IP cores 44. In one or more embodiments, the image processing IP cores 44 are each configured to perform image processing in units of blocks each consisting of a pixels located in the same horizontal line, where a is a natural number of two or more, and the first right boundary image data 65, the first left boundary image data 66, the second left boundary image data 67, and the second right boundary image data 68 for one horizontal line may each include pixel data for the a pixels of one block.
The line memory 42 of the left chip 21 is configured to sequentially store the left image data 62 and the first right boundary image data 65 received from the corresponding data extraction circuitry 41A and sequentially forward the same to the corresponding buffer memory 43. The line memory 42 of the right chip 22 is configured to sequentially store the right image data 63 and the first left boundary image data 66 received from the corresponding data extraction circuitry 41A and sequentially forward the same to the corresponding buffer memory 43. The line memory 42 of the middle chip 23 is configured to sequentially store the middle image data 64, the second left boundary image data 67, and the second right boundary image data 68 received from the corresponding data extraction circuitry 41A and sequentially forward the same to the corresponding buffer memory 43.
The image processing IP core 44 of the left chip 21 may be configured to generate processed image data 691 by applying desired processing to the left image data 62 and the first right boundary image data 65 received from the corresponding buffer memory 43. The processed image data 691 may include processed left image data 71 and first processed right boundary image data 72. In one implementation, the image processing IP core 44 of the left chip 21 may be configured to generate the processed left image data 71 and the first processed right boundary image data 72 by applying desired image processing to the left image data 62 and the first right boundary image data 65, respectively. The processed image data 691 thus generated may be forwarded to the line latch 46 of the left chip 21.
The image processing IP core 44 of the right chip 22 may be configured to generate processed image data 692 by applying desired processing to the right image data 63 and the first left boundary image data 66 received from the corresponding buffer memory 43. The processed image data 692 may include processed right image data 73 and first processed left boundary image data 74. In one implementation, the image processing IP core 44 of the right chip 22 may be configured to generate the processed right image data 73 and the first processed left boundary image data 74 by applying desired image processing to the right image data 63 and the first left boundary image data 66, respectively. The processed image data 692 thus generated may be forwarded to the line latch 46 in the right chip 22.
The image processing IP core 44 of the middle chip 23 may be configured to generate processed image data 693 by applying desired processing to the middle image data 64, the second left boundary image data 67, and the second right boundary image data 68 received from the buffer memory 43. The processed image data 693 may include processed middle image data 75, second processed left boundary image data 76, and second processed right boundary image data 77. In one implementation, the image processing IP core 44 of the middle chip 23 may be configured to generate the processed middle image data 75, the second processed left boundary image data 76, and the second processed right boundary image data 77 by applying desired image processing to the middle image data 64, the second left boundary image data 67, and the second right boundary image data 68, respectively. The processed image data 693 thus generated may be forwarded to the line latch 46 of the middle chip 23.
In one or more embodiments, the line latch 46 of each display driver 2 is adapted to data transfer to the corresponding drive circuitry 13. In one or more embodiments, data sorting is performed during the data transfer from the line latch 46 to the drive circuitry 13 to thereby supply display data 70 to the drive circuitry 13. The data sorting may be performed in accordance with the arrangement of the pixels 6 in the display panel 1.
In one implementation, display data used to drive the display elements in the left area 31 may be selected from the processed image data 691 stored in the line latch 46 of the left chip 21 and transferred to the corresponding drive circuitry 13. The data transferred to the drive circuitry 13 of the left chip 21 may be used as the display data 701.
Correspondingly, display data used to drive the display elements in the right area 32 may be selected from the processed image data 692 stored in the line latch 46 of the right chip 22 and transferred to the corresponding drive circuitry 13. The data transferred to the drive circuitry 13 of the right chip 22 may be used as the display data 702.
Further, display data used to drive the display elements in the middle area 33 may be selected from the processed image data 693 stored in the line latch 46 of the middle chip 23 and transferred to the corresponding drive circuitry 13. The data transferred to the drive circuitry 13 of the middle chip 23 may be used as the display data 703.
In one or more embodiments, the drive circuitry 13 of the left chip 21 is configured to drive the display elements in the left area 31 of the display panel 1 based on the display data 701; the drive circuitry 13 of the right chip 22 is configured to drive the display elements in the right area 32 of the display panel 1 based on the display data 702; and the drive circuitry 13 of the middle chip 23 is configured to drive the display elements in the middle area 33 of the display panel 1 based on the display data 703.
In other embodiments, the display panel 1 may be segmented into M display areas 3 and driven with M display drivers 2, where M is a natural number of three or more. In one implementation, the display driver 2 that drives the leftmost one of the M display areas 3 may be configured to operate similarly to the left chip 21 described above, the display driver 2 that drives the rightmost one of the M display areas 3 may be configured to operate similarly to the right chip 22, and the display drivers 2 that drive the remaining display areas 3 may be configured to operate similarly to the middle chip 23.
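As shown in the non-limiting sketch below, the extraction generalizes to M drivers: each chip stores its own image area plus one boundary block on each side that adjoins another image area (zero-based indices and the eight-pixel block size are assumptions carried over from the earlier examples):

```python
def extract_for_chip(line: list, i: int, m: int, block: int = 8) -> list:
    """Columns of one horizontal line stored by chip i (0-based) of m chips:
    the chip's own image area plus one block of boundary pixel data on each
    side that adjoins another image area."""
    width = len(line) // m                  # columns per image area
    start, end = i * width, (i + 1) * width
    lo = start - block if i > 0 else start  # left boundary block, if any
    hi = end + block if i < m - 1 else end  # right boundary block, if any
    return line[lo:hi]

# For m = 3 and a 3840-pixel line, this reproduces the three-chip case:
line = list(range(3840))
assert len(extract_for_chip(line, 0, 3)) == 1280 + 8      # leftmost chip
assert len(extract_for_chip(line, 1, 3)) == 8 + 1280 + 8  # middle chip
assert len(extract_for_chip(line, 2, 3)) == 8 + 1280      # rightmost chip
```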
While many embodiments have been described, those skilled in the art, having the benefit of this disclosure, will appreciate that other embodiments can be devised which do not depart from the scope of the disclosure. Accordingly, the scope of the invention should be limited only by the attached claims.