This application claims priority to Korean Patent Application No. 10-2023-0023254, filed on Feb. 22, 2023, and all the benefits accruing therefrom under 35 USC § 119, the content of which in its entirety is herein incorporated by reference.
The disclosure relates to a display device, and more particularly to a display device that compensates for color crosstalk, and to a method of operating the display device.
A pixel of a display device, such as an organic light emitting diode (OLED) display device, may include sub-pixels that emit light of different colors, for example, a red sub-pixel, a green sub-pixel and a blue sub-pixel. In the display device, it is desirable that a white luminance when all of the red, green, and blue sub-pixels emit light is equal to a sum of a red luminance when the red sub-pixel emits light, a green luminance when the green sub-pixel emits light and a blue luminance when the blue sub-pixel emits light. However, a voltage (e.g., an initialization voltage) applied to a first sub-pixel may be changed when a second sub-pixel adjacent to the first sub-pixel emits light, and thus a luminance of the first sub-pixel may be changed. Accordingly, a color crosstalk phenomenon in which the white luminance is different from the sum of the red, green, and blue luminances may occur.
In an embodiment, a display device capable of compensating for color crosstalk is provided.
In an embodiment, a method of operating a display device capable of compensating for color crosstalk is provided.
According to an embodiment, there is provided a display device including a display panel having a plurality of pixels, each of the plurality of pixels including a first color sub-pixel, a second color sub-pixel and a third color sub-pixel, and a panel driver configured to drive the display panel based on input image data. The panel driver stores a gamma curve characteristic to which a color crosstalk offset of the display panel is applied, detects mixed color pixels among the plurality of pixels by analyzing the input image data, determines a compensation coefficient for each of the mixed color pixels based on the input image data for each of the mixed color pixels, and compensates a luminance of each of the mixed color pixels based on the color crosstalk offset and the compensation coefficient.
In an embodiment, color crosstalk values of the display panel may be calculated at respective gray levels, and the color crosstalk offset may be determined based on an average of the color crosstalk values.
In an embodiment, the color crosstalk value at each gray level may be determined based on an equation “CCT=(LW−LR−LG−LB)/LW*100”, where CCT denotes the color crosstalk value, LW denotes a white luminance of the display panel when all of the first, second and third color sub-pixels emit light, LR denotes a first color luminance of the display panel when the first color sub-pixel emits light, LG denotes a second color luminance of the display panel when the second color sub-pixel emits light, and LB denotes a third color luminance of the display panel when the third color sub-pixel emits light.
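By way of a non-limiting illustration (not part of the embodiments), the color crosstalk value defined by the above equation could be computed as in the following Python sketch; the function name and the example luminance values are merely hypothetical.

```python
def color_crosstalk_value(lw: float, lr: float, lg: float, lb: float) -> float:
    """CCT = (LW - LR - LG - LB) / LW * 100, expressed in percent.

    lw: white luminance with all three sub-pixels emitting light
    lr, lg, lb: luminances with only the first, second, or third color sub-pixel emitting light
    """
    if lw == 0:
        raise ValueError("white luminance LW must be non-zero")
    return (lw - lr - lg - lb) / lw * 100.0


# Hypothetical measurement: the white luminance is about 3% lower than the sum of the
# individual color luminances, so the crosstalk value is about -3%.
print(color_crosstalk_value(lw=97.0, lr=22.0, lg=68.0, lb=10.0))  # ~ -3.09
```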
In an embodiment, the color crosstalk offset may correspond to a difference between a reference color crosstalk value and the average of the color crosstalk values.
In an embodiment, color crosstalk values of the display panel may be calculated at gray levels greater than or equal to a reference gray level, and the color crosstalk offset may be determined based on an average of the color crosstalk values at the gray levels that are greater than or equal to the reference gray level.
In an embodiment, the entire gray levels may be divided into gray regions, an average of color crosstalk values of the display panel may be calculated in each of the gray regions, and the color crosstalk offset may be determined based on the average of the color crosstalk values in each of the gray regions.
In an embodiment, when the color crosstalk offset has a negative value, the gamma curve characteristic may decrease by an amount of the color crosstalk offset from a reference gamma curve characteristic. When the color crosstalk offset has a positive value, the gamma curve characteristic may increase by an amount of the color crosstalk offset from the reference gamma curve characteristic.
In an embodiment, the input image data for each of the plurality of pixels may include first color sub-pixel data for the first color sub-pixel, second color sub-pixel data for the second color sub-pixel, and third color sub-pixel data for the third color sub-pixel. The panel driver may determine a first pixel among the plurality of pixels as the mixed color pixel when at least two of the first color sub-pixel data, the second color sub-pixel data and the third color sub-pixel data for the first pixel represent a non-zero gray level.
In an embodiment, the panel driver may normalize the first, second and third color sub-pixel data for the mixed color pixel such that a maximum sub-pixel data among the first, second and third color sub-pixel data for the mixed color pixel has a value of 1, and may determine an average of the normalized first, second and third color sub-pixel data as the compensation coefficient of the mixed color pixel.
In an embodiment, the panel driver may normalize the first, second and third color sub-pixel data for the mixed color pixel such that a maximum sub-pixel data among the first, second and third color sub-pixel data for the mixed color pixel has a value of 1, may calculate a weighted average of the first, second and third color sub-pixel data by applying weights to the normalized first, second and third color sub-pixel data, and may determine the weighted average of the first, second and third color sub-pixel data as the compensation coefficient of the mixed color pixel.
In an embodiment, when the color crosstalk offset has a negative value, the panel driver may increase the luminance of each of the mixed color pixels by a product of an amount of the color crosstalk offset and the compensation coefficient. When the color crosstalk offset has a positive value, the panel driver may decrease the luminance of each of the mixed color pixels by a product of an amount of the color crosstalk offset and the compensation coefficient.
In an embodiment, the panel driver may include a gamma characteristic storage configured to store the gamma curve characteristic to which the color crosstalk offset is applied, a controller configured to receive the input image data, to detect the mixed color pixels among the plurality of pixels, to determine the compensation coefficient of each of the mixed color pixels, and to generate output image data based on the input image data, the color crosstalk offset and the compensation coefficient, and a data driver configured to provide data signals to the plurality of pixels based on the output image data.
In an embodiment, the controller may search for an original luminance corresponding to an input gray level represented by the input image data for each of the mixed color pixels in the gamma curve characteristic, may determine a target luminance by adjusting the original luminance by a product of an amount of the color crosstalk offset and the compensation coefficient, and may generate the output image data representing an output gray level for each of the mixed color pixels by searching for the output gray level corresponding to the target luminance in the gamma curve characteristic.
In an embodiment, the gamma characteristic storage may include a plurality of input gray-output gray lookup tables respectively corresponding to a plurality of corrected gamma curve characteristics that are compensated based on a plurality of compensation coefficients.
In an embodiment, the controller may select an input gray-output gray lookup table corresponding to the compensation coefficient of each of the mixed color pixels from among the plurality of input gray-output gray lookup tables, may determine an output gray level corresponding to an input gray level represented by the input image data for each of the mixed color pixels using the selected input gray-output gray lookup table, and may generate the output image data representing the output gray level for each of the mixed color pixels.
In an embodiment, the gamma characteristic storage may include a plurality of input gray-output gray lookup tables respectively corresponding to a plurality of reference corrected gamma curve characteristics that are compensated based on a plurality of reference compensation coefficients.
In an embodiment, the controller may select two input gray-output gray lookup tables corresponding to two reference compensation coefficients adjacent to the compensation coefficient of each of the mixed color pixels from among the plurality of input gray-output gray lookup tables, may determine two output gray levels corresponding to an input gray level represented by the input image data for each of the mixed color pixels in the selected two input gray-output gray lookup tables, and may generate the output image data representing an interpolated output gray level for each of the mixed color pixels by interpolating the two output gray levels.
According to an embodiment, there is provided a method of operating a display device including a plurality of pixels. Each of the plurality of pixels includes a first color sub-pixel, a second color sub-pixel and a third color sub-pixel. In the method, a gamma curve characteristic to which a color crosstalk offset of a display panel of the display device is applied is stored, input image data are received, mixed color pixels are detected among the plurality of pixels by analyzing the input image data, a compensation coefficient of each of the mixed color pixels is determined based on the input image data for each of the mixed color pixels, and the luminance of each of the mixed color pixels is compensated based on the color crosstalk offset and the compensation coefficient.
In an embodiment, to compensate the luminance of each of the mixed color pixels, an original luminance corresponding to an input gray level represented by the input image data for each of the mixed color pixels may be searched for in the gamma curve characteristic, a target luminance may be determined by adjusting the original luminance by a product of an amount of the color crosstalk offset and the compensation coefficient, an output gray level corresponding to the target luminance may be searched for in the gamma curve characteristic, and the display panel may be driven based on output image data representing the output gray level for each of the mixed color pixels.
In an embodiment, to store the gamma curve characteristic to which the color crosstalk offset is applied, a plurality of input gray-output gray lookup tables respectively corresponding to a plurality of corrected gamma curve characteristics that are compensated based on a plurality of compensation coefficients may be stored. To compensate the luminance of each of the mixed color pixels, an input gray-output gray lookup table corresponding to the compensation coefficient of each of the mixed color pixels may be selected from among the plurality of input gray-output gray lookup tables, an output gray level corresponding to an input gray level represented by the input image data for each of the mixed color pixels may be determined using the selected input gray-output gray lookup table, and the display panel may be driven based on output image data representing the output gray level for each of the mixed color pixels.
As described above, in a display device and a method of operating the display device according to an embodiment, a gamma curve characteristic to which a color crosstalk offset of a display panel is applied may be stored, mixed color pixels may be detected by analyzing input image data, a compensation coefficient for each of the mixed color pixels may be determined based on the input image data for each of the mixed color pixels, and the luminance of each of the mixed color pixels may be compensated based on the color crosstalk offset and the compensation coefficient. Accordingly, a color crosstalk of the display device may be compensated.
Illustrative, non-limiting embodiments will be more clearly understood from the following detailed description in conjunction with the accompanying drawings. The above and other features of embodiments will become apparent by describing in detail embodiments thereof with reference to the accompanying drawings.
Hereinafter, embodiments of the invention will be explained in detail with reference to the accompanying drawings. This invention may, however, be embodied in many different forms, and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art, where like reference numerals refer to like elements throughout.
Referring to
The display panel 110 may include a plurality of data lines, a plurality of scan lines, and the plurality of pixels PX that are connected to the plurality of data lines and the plurality of scan lines. Each pixel PX may include a first color sub-pixel RSP, a second color sub-pixel GSP, and a third color sub-pixel BSP that emit light of different colors. In an embodiment, each pixel PX may include a red sub-pixel RSP emitting red light, a green sub-pixel GSP emitting green light, and a blue sub-pixel BSP emitting blue light. Further, in an embodiment, each of the first, second and third color sub-pixels RSP, GSP and BSP, respectively, may include at least two transistors, at least one capacitor and a light emitting element, and the display panel 110 may be a light emitting display panel. For example, the light emitting element may be an organic light emitting diode (OLED), and the display panel 110 may be an OLED display panel. In other examples, the light emitting element may be a nano light emitting diode (NED), a quantum dot (QD) light emitting diode, a micro light emitting diode, an inorganic light emitting diode, and/or any other suitable light emitting element. However, the display panel 110 is not limited to the light emitting display panel and may be any suitable display panel.
In an embodiment, the gamma characteristic storage 130 may store a gamma curve characteristic (e.g., a gamma curve characteristic 320 illustrated in
In an embodiment, color crosstalk values of the display panel 110 may be calculated at the entire gray levels (e.g., a 0-gray level to a 255-gray level), and the color crosstalk offset may be determined based on an average of the color crosstalk values. In other embodiments, color crosstalk values of the display panel 110 may be calculated at gray levels greater than or equal to a reference gray level (e.g., a 10-gray level), and the color crosstalk offset may be determined based on an average of the color crosstalk values at the gray levels greater than or equal to the reference gray level. In still other embodiments, the entire gray levels may be divided into gray regions each including two or more consecutive gray levels, an average of color crosstalk values of the display panel may be calculated in each of the gray regions, and the color crosstalk offset may be determined based on the average of the color crosstalk values in each of the gray regions. Here, applying the color crosstalk offset to the gamma curve characteristic may mean that the gamma curve characteristic decreases by an amount (or an absolute value) of the color crosstalk offset from a reference gamma curve characteristic (e.g., a gamma curve characteristic having a gamma value of about 2.2) when the color crosstalk offset has a negative value, and may mean that the gamma curve characteristic increases by the amount of the color crosstalk offset from the reference gamma curve characteristic when the color crosstalk offset has a positive value.
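A non-limiting Python sketch of how such an offset could be derived from per-gray-level crosstalk values, and of how it could be applied to a reference gamma curve, is given below. The 0-255 gray range, the reference gray level of 10, the gamma value of 2.2, the maximum luminance of 500 nits, and the reading of the offset as a uniform percentage shift of the curve are illustrative assumptions rather than requirements of the embodiments.

```python
import numpy as np

def crosstalk_offset(cct_values: np.ndarray, ref_gray: int | None = None) -> float:
    """Average per-gray-level crosstalk values (in percent) into one offset.

    cct_values: one crosstalk value per gray level, e.g. for gray levels 0..255.
    ref_gray:   if given, only gray levels >= ref_gray (e.g. 10) are averaged.
    """
    values = cct_values if ref_gray is None else cct_values[ref_gray:]
    return float(np.mean(values))

def crosstalk_offsets_per_region(cct_values: np.ndarray, region_size: int = 64) -> list[float]:
    """One offset per gray region of consecutive gray levels."""
    return [float(np.mean(cct_values[i:i + region_size]))
            for i in range(0, len(cct_values), region_size)]

def offset_gamma_curve(cct_offset_pct: float, gamma: float = 2.2,
                       max_luminance: float = 500.0, num_grays: int = 256) -> np.ndarray:
    """Reference gamma curve shifted down (negative offset) or up (positive offset)
    by |offset| percent, i.e. the curve to which the crosstalk offset is applied."""
    grays = np.arange(num_grays)
    reference = max_luminance * (grays / (num_grays - 1)) ** gamma
    return reference * (1.0 + cct_offset_pct / 100.0)
```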
In an embodiment, the scan driver 140 may generate the scan signals SS based on a scan control signal SCTRL received from the controller 160, and may sequentially provide the scan signals SS to the plurality of pixels PX on a row basis. In an embodiment, the scan control signal SCTRL may include a start signal and a clock signal, but is not limited thereto. In an embodiment, the scan driver 140 may be integrated or formed in the display panel 110. In other embodiments, the scan driver 140 may be implemented as one or more integrated circuits.
In an embodiment, the data driver 150 may generate the data signals DS based on output image data ODAT and a data control signal DCTRL received from the controller 160, and may provide the data signals DS to the plurality of pixels PX. In an embodiment, the data control signal DCTRL may include an output data enable signal, a horizontal start signal and/or a load signal, but is not limited thereto. In an embodiment, the data driver 150 may be implemented as one or more integrated circuits. In other embodiments, the data driver 150 and the controller 160 may be implemented with a single integrated circuit, and the single integrated circuit may be referred to as a timing controller embedded data driver (TED) integrated circuit.
In an embodiment, the controller 160 (e.g., a timing controller (TCON)) may receive the input image data IDAT and a control signal CTRL from an external host device (e.g., a graphics processing unit (GPU), an application processor (AP) or a graphics card). In an embodiment, the control signal CTRL may include a vertical synchronization signal, a horizontal synchronization signal, an input data enable signal, a master clock signal, etc., but is not limited thereto. The controller 160 may generate the output image data ODAT, the data control signal DCTRL and the scan control signal SCTRL based on the input image data IDAT and the control signal CTRL. The controller 160 may control an operation of the scan driver 140 by providing the scan control signal SCTRL to the scan driver 140 and may control an operation of the data driver 150 by providing the output image data ODAT and the data control signal DCTRL to the data driver 150.
In the display device 100 according to an embodiment, the controller 160 of the panel driver 120 may detect mixed color pixels among the plurality of pixels PX by analyzing the input image data IDAT. For example, the input image data IDAT for each pixel PX may include first color sub-pixel data for the first color sub-pixel RSP, second color sub-pixel data for the second color sub-pixel GSP and third color sub-pixel data for the third color sub-pixel BSP, and the controller 160 may determine a pixel PX for which at least two of the first, second and third color sub-pixel data represent a non-zero gray level as the mixed color pixel.
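A minimal sketch of this mixed color pixel test, assuming 8-bit sub-pixel data, could look as follows; it is illustrative only and not part of the embodiments.

```python
def is_mixed_color_pixel(r: int, g: int, b: int) -> bool:
    """A pixel is a mixed color pixel when at least two of its sub-pixel data
    represent a non-zero gray level."""
    return sum(value > 0 for value in (r, g, b)) >= 2


print(is_mixed_color_pixel(255, 128, 0))  # True: two non-zero sub-pixels
print(is_mixed_color_pixel(255, 0, 0))    # False: single color pixel
```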
Further, in an embodiment, the controller 160 of the panel driver 120 may determine a compensation coefficient for each mixed color pixel based on the input image data IDAT for each mixed color pixel. In an embodiment, the controller 160 may normalize the first, second and third color sub-pixel data for the mixed color pixel such that a maximum sub-pixel data among the first, second and third color sub-pixel data has a value of 1, and may determine an average of the normalized first, second and third color sub-pixel data as the compensation coefficient of the mixed color pixel. In other embodiments, the controller 160 may normalize the first, second and third color sub-pixel data for the mixed color pixel, may calculate a weighted average of the first, second and third color sub-pixel data by applying weights (e.g., weights of 2:7:1) to the normalized first, second and third color sub-pixel data, and may determine the weighted average of the first, second and third color sub-pixel data as the compensation coefficient of the mixed color pixel.
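The compensation coefficient described above could be computed as in the following non-limiting sketch; the 2:7:1 weights mirror the example in the text, while the function name and the sample sub-pixel data are hypothetical.

```python
def compensation_coefficient(r: int, g: int, b: int,
                             weights: tuple[float, float, float] | None = None) -> float:
    """Normalize the sub-pixel data so that the maximum equals 1, then take the
    (optionally weighted) average of the normalized values."""
    max_val = max(r, g, b)
    if max_val == 0:
        return 0.0
    norm = (r / max_val, g / max_val, b / max_val)
    if weights is None:
        return sum(norm) / 3.0
    return sum(w * n for w, n in zip(weights, norm)) / sum(weights)


# Hypothetical mixed color pixel (255, 128, 0): normalized to (1, ~0.5, 0).
print(compensation_coefficient(255, 128, 0))                     # ~0.50 (simple average)
print(compensation_coefficient(255, 128, 0, weights=(2, 7, 1)))  # ~0.55 (2:7:1 weighted average)
```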
In an embodiment, the panel driver 120 may compensate the luminance of each mixed color pixel based on the color crosstalk offset and the compensation coefficient. In an embodiment, the panel driver 120 may adjust the luminance of each mixed color pixel by a product of an amount (or an absolute value) of the color crosstalk offset and the compensation coefficient. For example, the panel driver 120 may increase the luminance of each mixed color pixel by the product of the amount of the color crosstalk offset and the compensation coefficient when the color crosstalk offset has a negative value, and may decrease the luminance of each mixed color pixel by the product of the amount of the color crosstalk offset and the compensation coefficient when the color crosstalk offset has a positive value.
As described above, in the display device 100 according to an embodiment, the gamma curve characteristic to which the color crosstalk offset of the display panel 110 is applied may be stored, the mixed color pixels may be detected by analyzing the input image data IDAT, the compensation coefficient of each mixed color pixel may be determined based on the input image data IDAT for each mixed color pixel, and the luminance of each mixed color pixel may be compensated based on the color crosstalk offset and the compensation coefficient. Thus, a single color pixel among the plurality of pixels PX may emit light with an original luminance corresponding to the gamma curve characteristic to which the color crosstalk offset is applied, and the mixed color pixel among the plurality of pixels PX may emit light with a luminance that is compensated from the original luminance based on the color crosstalk offset and the compensation coefficient. Accordingly, in the display device 100 according to an embodiment, a white luminance when all of the first, second and third color sub-pixels RSP, GSP, and BSP emit light may be substantially equal to a sum of a first color luminance when the first color sub-pixel RSP emits light, a second color luminance when the second color sub-pixel GSP emits light and a third color luminance when the third color sub-pixel BSP emits light, and a color crosstalk of the display device 100 may be compensated for.
In an embodiment,
Referring to
As illustrated in
Further, in an embodiment, the color crosstalk offset CCT_OFS may be determined based on an average of the color crosstalk values 310 and 360 in the entire gray levels (e.g., a 0-gray level 0G to a 255-gray level 255G). In other embodiments, as illustrated in
In still other embodiments, the entire gray levels (e.g., the 0-gray level 0G to the 255-gray level 255G) may be divided into gray regions, an average of color crosstalk values 310 and 360 of the display panel 110 may be calculated in each of the gray regions, and the color crosstalk offset CCT_OFS may be determined based on the average of the color crosstalk values 310 and 360 in each of the gray regions.
Further, in an embodiment, when the color crosstalk offset CCT_OFS has a negative value (e.g., about -3%) as illustrated in
In an embodiment, a controller 160 of the panel driver 120 may receive input image data IDAT from an external host device (S220), and may detect mixed color pixels among a plurality of pixels PX of the display panel 110 by analyzing the input image data IDAT (S230). In an embodiment, as illustrated in
Further, in an embodiment, the controller 160 of the panel driver 120 may determine a compensation coefficient of each mixed color pixel based on the input image data IDAT for each mixed color pixel (S240).
In an embodiment, as illustrated in
In an embodiment, as illustrated in
In an embodiment, the panel driver 120 may compensate the luminance of each mixed color pixel based on the color crosstalk offset CCT_OFS and the compensation coefficient (S250). In an embodiment, the panel driver 120 may adjust the luminance of each mixed color pixel by a product of an amount of the color crosstalk offset CCT_OFS and the compensation coefficient.
In an embodiment, when the color crosstalk offset CCT_OFS has the negative value (e.g., about -3%) as illustrated in
Further, in an embodiment, when the color crosstalk offset CCT_OFS has the positive value (e.g., about +3%) as illustrated in
In an embodiment, to compensate the luminance of each mixed color pixel (S250), the controller 160 may search for an original luminance corresponding to an input gray level represented by the input image data IDAT for the mixed color pixel in the gamma curve characteristic 320 and 370 stored in the gamma characteristic storage 130 (S260), may determine a target luminance by adjusting the original luminance by a product of an amount of the color crosstalk offset CCT_OFS and the compensation coefficient CC (S270), and may search for an output gray level corresponding to the target luminance in the gamma curve characteristic 320 and 370 (S280). Further, the controller 160 may generate output image data ODAT representing the output gray level for the mixed color pixel, and a data driver 150 of the panel driver 120 may provide a data signal DS corresponding to the output image data ODAT to the mixed color pixel. Thus, the display panel 110 may be driven based on the output image data ODAT representing the output gray level for the mixed color pixel (S290).
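A non-limiting Python sketch of this luminance-based compensation (S260 to S290) is shown below. The gamma curve array, its construction, and the reading of "adjusting by a product of the offset amount and the coefficient" as a percentage change of the original luminance are assumptions made for illustration only.

```python
import numpy as np

def compensate_gray_level(input_gray: int, comp_coeff: float,
                          cct_offset_pct: float, gamma_curve: np.ndarray) -> int:
    """Map an input gray level of a mixed color pixel to a compensated output gray level.

    gamma_curve: luminance per gray level of the stored (offset-applied) gamma curve.
    """
    original = gamma_curve[input_gray]                    # S260: original luminance
    adjust_pct = abs(cct_offset_pct) * comp_coeff         # |offset| x coefficient
    if cct_offset_pct < 0:                                # S270: target luminance
        target = original * (1.0 + adjust_pct / 100.0)    #   raise it for a negative offset
    else:
        target = original * (1.0 - adjust_pct / 100.0)    #   lower it for a positive offset
    return int(np.argmin(np.abs(gamma_curve - target)))   # S280: closest output gray level


# Hypothetical example: an offset of about -3% and a coefficient of about 0.5 raise the
# luminance of the mixed color pixel by about 1.5%.
curve = 500.0 * (np.arange(256) / 255.0) ** 2.2 * 0.97    # offset-applied curve (-3%)
print(compensate_gray_level(128, comp_coeff=0.5, cct_offset_pct=-3.0, gamma_curve=curve))
```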
For example, in an embodiment, when the color crosstalk offset CCT_OFS has the negative value (e.g., about -3%) and the luminance compensation amount LCA for a mixed color pixel is about +1.5%, as illustrated in
Further, in an embodiment, when the color crosstalk offset CCT_OFS has the positive value (e.g., about +3%) and the luminance compensation amount LCA for a mixed color pixel is about −1.5%, as illustrated in
Accordingly, in the method of operating the display device 100 according to an embodiment, a color crosstalk compensation operation described above may be performed, and an image quality of the display device 100 may be improved. For example, as shown in
In an embodiment, a method of
Referring to
For example, in an embodiment, when a color crosstalk offset CCT_OFS is about -3% as illustrated in
In an embodiment, a controller 160 of the panel driver 120 may receive input image data IDAT from an external host device (S420), and may detect mixed color pixels among a plurality of pixels PX of a display panel 110 by analyzing the input image data IDAT (S430). The controller 160 of the panel driver 120 may determine a compensation coefficient for each mixed color pixel based on the input image data IDAT for each mixed color pixel (S440). The panel driver 120 may compensate the luminance of each mixed color pixel based on the color crosstalk offset CCT_OFS and the compensation coefficient (S450).
In an embodiment, to compensate the luminance of each mixed color pixel (S450), the controller 160 may select an input gray-output gray lookup table 134a corresponding to the compensation coefficient CC of the mixed color pixel from among the plurality of input gray-output gray lookup tables 131a, 132a, 133a, . . . , 134a, . . . , 135a (S460). For example, when the color crosstalk offset CCT_OFS is about -3% and the compensation coefficient CC is about 0.5, the controller 160 may select the input gray-output gray lookup table 134a corresponding to the corrected gamma curve characteristic 323 on which the color crosstalk compensation operation of about 1.5% is performed. Further, the controller 160 may determine an output gray level OGRAY corresponding to an input gray level IGRAY represented by the input image data IDAT for the mixed color pixel using the selected input gray-output gray lookup table 134a (S470). The controller 160 may generate output image data ODAT representing the output gray level with respect to the mixed color pixel, and a data driver 150 of the panel driver 120 may provide the mixed color pixel with a data signal DS corresponding to the output image data ODAT. That is, the display panel 110 may be driven based on the output image data ODAT representing the output gray level OGRAY for the mixed color pixel (S480).
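A non-limiting sketch of this lookup-table-based compensation (S460 to S480) follows; keying the tables by their compensation coefficients, selecting the closest key, and the hypothetical table contents are assumptions for illustration.

```python
def compensate_with_lut(input_gray: int, comp_coeff: float,
                        luts: dict[float, list[int]]) -> int:
    """Select the input gray-output gray lookup table whose compensation coefficient
    corresponds to (here: is closest to) the pixel's coefficient, then map the gray level."""
    selected_coeff = min(luts, key=lambda c: abs(c - comp_coeff))  # S460: select the LUT
    return luts[selected_coeff][input_gray]                        # S470: output gray level


# Hypothetical tables: an identity table for coefficient 0 and a slightly raised table
# for coefficient 0.5 (e.g. built offline from the corrected gamma curve characteristics).
luts = {0.0: list(range(256)), 0.5: [min(g + 1, 255) for g in range(256)]}
print(compensate_with_lut(128, comp_coeff=0.5, luts=luts))  # -> 129
```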
In an embodiment, a method of
Referring to
In an embodiment, for example, the plurality of reference compensation coefficients may be about 0, about 0.33, about 0.66 and about 1, but are not limited thereto. Further, when a color crosstalk offset CCT_OFS is about -3% as illustrated in
In an embodiment, a controller 160 of the panel driver 120 may receive input image data IDAT from an external host device (S520), and may detect mixed color pixels among a plurality of pixels PX of a display panel 110 by analyzing the input image data IDAT (S530). The controller 160 of the panel driver 120 may determine a compensation coefficient for each mixed color pixel based on the input image data IDAT for each mixed color pixel (S540). The panel driver 120 may compensate the luminance of each mixed color pixel based on the color crosstalk offset CCT_OFS and the compensation coefficient (S550).
In an embodiment, to compensate the luminance of each mixed color pixel (S550), the controller 160 may select two input gray-output gray lookup tables 132b and 133b corresponding to two reference compensation coefficients adjacent to the compensation coefficient CC of the mixed color pixel from among the plurality of input gray-output gray lookup tables 131b, 132b, 133b and 134b (S560). For example, when the color crosstalk offset CCT_OFS is about -3% and the compensation coefficient CC is about 0.5, the controller 160 may select the second input gray-output gray lookup table 132b corresponding to the reference corrected gamma curve characteristic 325 on which the color crosstalk compensation operation of about 1% is performed with the reference compensation coefficient of about 0.33, and the third input gray-output gray lookup table 133b corresponding to the reference corrected gamma curve characteristic 326 on which the color crosstalk compensation operation of about 2% is performed with the reference compensation coefficient of about 0.66. Further, the controller 160 may determine two output gray levels OGRAY1 and OGRAY2 corresponding to an input gray level IGRAY represented by the input image data IDAT for the mixed color pixel in the selected two input gray-output gray lookup tables 132b and 133b (S570). For example, the controller 160 may determine a first output gray level OGRAY1 corresponding to the input gray level IGRAY by using the second input gray-output gray lookup table 132b, and may determine the second output gray level OGRAY2 corresponding to the input gray level IGRAY by using the third input gray-output gray lookup table 133b. The controller 160 may calculate an interpolated output gray level OGRAY by interpolating the first and second output gray levels OGRAY1 and OGRAY2 (S580). Further, the controller 160 may generate output image data ODAT representing the interpolated output gray level OGRAY with respect to the mixed color pixel, and a data driver 150 of the panel driver 120 may provide the mixed color pixel with a data signal DS corresponding to the output image data ODAT. That is, the display panel 110 may be driven based on the output image data ODAT representing the interpolated output gray level OGRAY for the mixed color pixel (S590).
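A non-limiting sketch of this interpolating variant (S560 to S580) is given below; keying the reference tables by their reference compensation coefficients, linear interpolation of the two output gray levels, and the hypothetical table contents are illustrative assumptions.

```python
def compensate_with_interpolated_luts(input_gray: int, comp_coeff: float,
                                      ref_luts: dict[float, list[int]]) -> int:
    """Select the two reference lookup tables whose reference compensation coefficients
    bracket the pixel's coefficient (S560), look up the two output gray levels (S570),
    and interpolate them linearly (S580)."""
    coeffs = sorted(ref_luts)
    lower = max((c for c in coeffs if c <= comp_coeff), default=coeffs[0])
    upper = min((c for c in coeffs if c >= comp_coeff), default=coeffs[-1])
    out_low, out_high = ref_luts[lower][input_gray], ref_luts[upper][input_gray]
    if upper == lower:
        return out_low
    t = (comp_coeff - lower) / (upper - lower)
    return round(out_low + t * (out_high - out_low))


# Hypothetical reference tables for coefficients 0, 0.33, 0.66 and 1; a coefficient of
# about 0.5 is interpolated from the 0.33 and 0.66 tables.
ref_luts = {0.0: list(range(256)),
            0.33: [min(g + 1, 255) for g in range(256)],
            0.66: [min(g + 2, 255) for g in range(256)],
            1.0: [min(g + 3, 255) for g in range(256)]}
print(compensate_with_interpolated_luts(128, comp_coeff=0.5, ref_luts=ref_luts))  # -> 130
```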
In an embodiment and referring to
In an embodiment, the processor 1110 may perform various computing functions or tasks. The processor 1110 may be an application processor (AP), a micro-processor, a central processing unit (CPU), etc. The processor 1110 may be coupled to other components via an address bus, a control bus, a data bus, etc. Further, in an embodiment, the processor 1110 may be further coupled to an extended bus such as a peripheral component interconnection (PCI) bus.
In an embodiment, the memory device 1120 may store data for operations of the electronic device 1100. For example, the memory device 1120 may include at least one non-volatile memory device such as an erasable programmable read-only memory (EPROM) device, an electrically erasable programmable read-only memory (EEPROM) device, a flash memory device, a phase change random access memory (PRAM) device, a resistance random access memory (RRAM) device, a nano floating gate memory (NFGM) device, a polymer random access memory (PoRAM) device, a magnetic random access memory (MRAM) device, a ferroelectric random access memory (FRAM) device, etc., and/or at least one volatile memory device such as a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, a mobile dynamic random access memory (mobile DRAM) device, etc.
In an embodiment, the storage device 1130 may be a solid state drive (SSD) device, a hard disk drive (HDD) device, a CD-ROM device, etc. The I/O device 1140 may be an input device such as a keyboard, a keypad, a mouse, a touch screen, etc., and/or an output device such as a printer, a speaker, etc. The power supply 1150 may supply power for operations of the electronic device 1100. The display device 1160 may be coupled to other components through the buses and/or other communication links.
In an embodiment, in the display device 1160, a gamma curve characteristic to which a color crosstalk offset of a display panel is applied may be stored, mixed color pixels may be detected by analyzing input image data, a compensation coefficient for each of the mixed color pixels may be determined based on the input image data for each of the mixed color pixels, and the luminance of each of the mixed color pixels may be compensated based on the color crosstalk offset and the compensation coefficient. Accordingly, a color crosstalk of the display device 1160 may be compensated.
In an embodiment, the invention may be applied to any electronic device 1100 including the display device 1160. For example, the invention may be applied to a television (TV), a digital TV, a 3D TV, a smart phone, a wearable electronic device, a tablet computer, a mobile phone, a personal computer (PC), a home appliance, a laptop computer, a personal digital assistant (PDA), a portable multimedia player (PMP), a digital camera, a music player, a portable game console, a navigation device, etc.
In an embodiment, an electronic device 2101 may output various information via a display module 2140 in an operating system. When a processor 2110 executes an application stored in a memory 2120, the display module 2140 may provide application information to a user via a display panel 2141.
In an embodiment, the processor 2110 may obtain an external input via an input module 2130 and/or a sensor module 2161 and may execute an application corresponding to the external input. For example, when the user selects a camera icon displayed on the display panel 2141, the processor 2110 may obtain a user input via an input sensor 2161-2 and may activate a camera module 2171. The processor 2110 may transfer image data corresponding to an image captured by the camera module 2171 to the display module 2140. The display module 2140 may display an image corresponding to the captured image via the display panel 2141.
In an embodiment, and as another example, when personal information authentication is executed in the display module 2140, a fingerprint sensor 2161-1 may obtain input fingerprint information as input data. The processor 2110 may compare the input data obtained by the fingerprint sensor 2161-1 with authentication data stored in the memory 2120, and may execute an application according to the comparison result. The display module 2140 may display information executed according to application logic via the display panel 2141.
In an embodiment, and as still another example, when a music streaming icon displayed on the display module 2140 is selected, the processor 2110 may obtain a user input via the input sensor 2161-2 and may activate a music streaming application stored in the memory 2120. When a music execution command is input in the music streaming application, the processor 2110 may activate a sound output module 2163 to provide sound information corresponding to the music execution command to the user.
In the above, in an embodiment, an operation of the electronic device 2101 has been briefly described. Hereinafter, a configuration of the electronic device 2101 will be described in detail. Some components of the electronic device 2101 described below may be integrated and provided as one component, and/or one component may be provided separately as two or more components.
In an embodiment and referring to
In an embodiment, the processor 2110 may execute software to control at least one other component (e.g., a hardware or software component) of the electronic device 2101 coupled with the processor 2110, and may perform various data processing or computation. According to an embodiment, as at least part of the data processing or computation, the processor 2110 may store a command and/or data received from another component (e.g., the input module 2130, the sensor module 2161 or a communication module 2173) in a volatile memory 2121, may process the command or the data stored in the volatile memory 2121, and/or may store resulting data in a non-volatile memory 2122.
In an embodiment, the processor 2110 may include a main processor 2111 and an auxiliary processor 2112. The main processor 2111 may include one or more of a central processing unit (CPU) 2111-1 and/or an application processor (AP). The main processor 2111 may further include any one or more of a graphics processing unit (GPU) 2111-2, a communication processor (CP), and an image signal processor (ISP). The main processor 2111 may further include a neural processing unit (NPU) 2111-3. The NPU 2111-3 may be a processor specialized in processing an artificial intelligence model, and/or the artificial intelligence model may be generated through machine learning. The artificial intelligence model may include a plurality of artificial neural network layers. The artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-network and/or a combination of two or more thereof, but is not limited thereto. The artificial intelligence model may, additionally or alternatively, include a software structure other than a hardware structure. At least two of the above-described processing units and processors may be implemented as an integrated component (e.g., a single chip), or respective processing units and processors may be implemented as independent components (e.g., a plurality of chips).
In an embodiment, the auxiliary processor 2112 may include a controller. The controller may include an interface conversion circuit and a timing control circuit. The controller may receive an image signal from the main processor 2111, may convert a data format of the image signal to meet interface specifications with the display module 2140, and may output image data. The controller may output various control signals required for driving the display module 2140.
In an embodiment, the auxiliary processor 2112 may further include a data conversion circuit 2112-2, a gamma correction circuit 2112-3, a rendering circuit 2112-4, and/or the like. The data conversion circuit 2112-2 may receive image data from the controller. The data conversion circuit 2112-2 may compensate for the image data such that an image is displayed with a desired luminance according to characteristics of the electronic device 2101 and/or the user's setting, and/or may convert the image data to reduce power consumption and/or to eliminate an afterimage. The gamma correction circuit 2112-3 may convert image data and/or a gamma reference voltage so that an image displayed on the electronic device 2101 has desired gamma characteristics. The rendering circuit 2112-4 may receive image data from the controller, and may render the image data in consideration of a pixel arrangement of the display panel 2141 in the electronic device 2101. At least one of the data conversion circuit 2112-2, the gamma correction circuit 2112-3 and the rendering circuit 2112-4 may be integrated in another component (e.g., the main processor 2111 or the controller). At least one of the data conversion circuit 2112-2, the gamma correction circuit 2112-3 and the rendering circuit 2112-4 may be integrated in a data driver 2143 described below.
In an embodiment, the memory 2120 may store various data used by at least one component (e.g., the processor 2110 or the sensor module 2161) of the electronic device 2101. The various data may include, for example, input data and/or output data for a command related thereto. The memory 2120 may include at least one of the volatile memory 2121 and the non-volatile memory 2122.
In an embodiment, the input module 2130 may receive a command and/or data to be used by the components (e.g., the processor 2110, the sensor module 2161, and/or the sound output module 2163) of the electronic device 2101 from the outside of the electronic device 2101 (e.g., the user or the external electronic device 2102).
In an embodiment, the input module 2130 may include a first input module 2131 for receiving a command and/or data from the user, and a second input module 2132 for receiving a command and/or data from the external electronic device 2102. The first input module 2131 may include a microphone, a mouse, a keyboard, a key (e.g., a button) and/or a pen (e.g., a passive pen or an active pen). The second input module 2132 may support a designated protocol capable of connecting the electronic device 2101 to the external electronic device 2102 by wire and/or wirelessly. In some embodiments, the second input module 2132 may include a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface and/or an audio interface. The second input module 2132 may include a connector that may physically connect the electronic device 2101 to the external electronic device 2102. For example, the second input module 2132 may include an HDMI connector, a USB connector, an SD card connector and/or an audio connector (e.g., a headphone connector).
In an embodiment, the display module 2140 may visually provide information to the user. The display module 2140 may include the display panel 2141, a scan driver 2142 and the data driver 2143. The display module 2140 may further include a window, a chassis and/or a bracket for protecting the display panel 2141.
In an embodiment, the display panel 2141 may include a liquid crystal display panel, an organic light emitting display panel and/or an inorganic light emitting display panel, but the type of the display panel 2141 is not limited thereto. The display panel 2141 may be a rigid type display panel, and/or a flexible type display panel capable of being rolled and/or folded. The display module 2140 may further include a supporter, a bracket and/or a heat dissipation member that supports the display panel 2141.
In an embodiment, the scan driver 2142 may be mounted on the display panel 2141 as a driving chip. Alternatively, the scan driver 2142 may be integrated into the display panel 2141. For example, the scan driver 2142 may include an amorphous silicon TFT gate driver circuit (ASG), a low temperature polycrystalline silicon (LTPS) TFT gate driver circuit and/or an oxide semiconductor TFT gate driver circuit (OSG) embedded in the display panel 2141. The scan driver 2142 may receive a control signal from the controller and may output scan signals to the display panel 2141 in response to the control signal.
In an embodiment, the display panel 2141 may further include an emission driver. The emission driver may output an emission control signal to the display panel 2141 in response to a control signal received from the controller. The emission driver may be formed separately from the scan driver 2142, or may be integrated into the scan driver 2142.
In an embodiment, the data driver 2143 may receive a control signal from the controller, may convert image data into analog voltages (e.g., data voltages) in response to the control signal, and then may output the data voltages to the display panel 2141.
In an embodiment, the data driver 2143 may be incorporated into other components (e.g., the controller). Further, the functions of the interface conversion circuit and/or the timing control circuit of the controller described above may be integrated into the data driver 2143.
In an embodiment, the display module 2140 may further include the emission driver, a voltage generator circuit, or the like. The voltage generator circuit may output various voltages used to drive the display panel 2141.
In an embodiment, the power management module 2150 may supply power to the components of the electronic device 2101. The power management module 2150 may include a battery that charges a power supply voltage. The battery may include a primary cell which is not rechargeable, a secondary cell which is rechargeable, and/or a fuel cell. The power management module 2150 may include a power management integrated circuit (PMIC). The PMIC may supply optimal power to each of the modules described above and modules described below. The power management module 2150 may include a wireless power transmission/reception member electrically connected to the battery. The wireless power transmission/reception member may include a plurality of antenna radiators in the form of coils.
In an embodiment, the electronic device 2101 may further include the internal module 2160 and the external module 2170. The internal module 2160 may include the sensor module 2161, the antenna module 2162 and the sound output module 2163. The external module 2170 may include the camera module 2171, a light module 2172 and the communication module 2173.
In an embodiment, the sensor module 2161 may detect an input by the user's body and/or an input by the pen of the first input module 2131, and may generate an electrical signal and/or data value corresponding to the input. The sensor module 2161 may include at least one of the fingerprint sensor 2161-1, the input sensor 2161-2 and a digitizer 2161-3.
In an embodiment, the fingerprint sensor 2161-1 may generate a data value corresponding to the user's fingerprint. The fingerprint sensor 2161-1 may include any one of an optical type fingerprint sensor and a capacitive type fingerprint sensor.
In an embodiment, the input sensor 2161-2 may generate a data value corresponding to coordinate information of the user's body input and/or the pen input. The input sensor 2161-2 may convert a capacitance change caused by the input into the data value. The input sensor 2161-2 may detect the input by the passive pen, and/or may transmit/receive data to/from the active pen.
In an embodiment, the input sensor 2161-2 may measure a bio-signal, such as blood pressure, moisture and/or body fat. For example, when a portion of the body of the user touches a sensor layer and/or a sensing panel, and does not move for a certain period of time, the input sensor 2161-2 may output information desired by the user to the display module 2140 by detecting the bio-signal based on a change in electric field due to the portion of the body.
In an embodiment, the digitizer 2161-3 may generate a data value corresponding to coordinate information of the input by the pen. The digitizer 2161-3 may convert an amount of an electromagnetic change caused by the input into the data value. The digitizer 2161-3 may detect the input by the passive pen, and/or may transmit/receive data to/from the active pen.
In an embodiment, at least one of the fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be implemented as a sensor layer formed on the display panel 2141 through a continuous process. The fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be disposed above the display panel 2141, and/or at least one of the fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be disposed below the display panel 2141.
In an embodiment, two or more of the fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be integrated into one sensing panel through the same process. When integrated into one sensing panel, the sensing panel may be disposed between the display panel 2141 and a window disposed above the display panel 2141. In an embodiment, the sensing panel may be disposed on the window, but the location of the sensing panel is not limited thereto.
In an embodiment, at least one of the fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be embedded in the display panel 2141. In other words, at least one of the fingerprint sensor 2161-1, the input sensor 2161-2 and the digitizer 2161-3 may be simultaneously formed through a process of forming elements (e.g., light emitting elements, transistors, etc.) included in the display panel 2141.
In addition, in an embodiment, the sensor module 2161 may generate an electrical signal and/or a data value corresponding to an internal state and/or an external state of the electronic device 2101. The sensor module 2161 may further include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor and/or an illuminance sensor.
In an embodiment, the antenna module 2162 may include one or more antennas for transmitting and/or receiving a signal and/or power to and/or from the outside. In an embodiment, the communication module 2173 may transmit and/or receive a signal to and/or from the external electronic device 2102 through an antenna suitable for a communication method. An antenna pattern of the antenna module 2162 may be integrated into one component (e.g., the display panel 2141) of the display module 2140 or the input sensor 2161-2.
In an embodiment, the sound output module 2163 may output sound signals to the outside of the electronic device 2101. The sound output module 2163 may include, for example, a speaker and/or a receiver. The speaker may be used for general purposes, such as playing multimedia and/or playing recordings. The receiver may be used for receiving incoming calls. In an embodiment, the receiver may be implemented separately from, or as a part of, the speaker. A sound output pattern of the sound output module 2163 may be integrated into the display module 2140.
In an embodiment, the camera module 2171 may capture a still image and/or a moving image. In an embodiment, the camera module 2171 may include one or more lenses, an image sensor and/or an image signal processor. The camera module 2171 may further include an infrared camera capable of measuring the presence and/or absence of the user, the user's location and/or the user's line of sight.
In an embodiment, the light module 2172 may provide light. The light module 2172 may include a light emitting diode and/or a xenon lamp. The light module 2172 may operate in conjunction with the camera module 2171, and/or may operate independently of the camera module 2171.
In an embodiment, the communication module 2173 may support establishing a wired and/or wireless communication channel between the electronic device 2101 and the external electronic device 2102 and performing communication via the established communication channel. The communication module 2173 may include a wireless communication module (e.g., a cellular communication module, a short-range wireless communication module or a global navigation satellite system (GNSS) communication module) and/or a wired communication module (e.g., a local area network (LAN) communication module or a power line communication (PLC) module). The communication module 2173 may communicate with the external electronic device 2102 via a short-range communication network (e.g., Bluetooth™, wireless-fidelity (Wi-Fi) direct, or infrared data association (IrDA)) and/or a long-range communication network (e.g., a cellular network, the Internet or a computer network (e.g., a LAN or a wide area network (WAN))). These various types of communication modules 2173 may be implemented as a single chip, and/or may be implemented as multi-chips separate from each other.
In an embodiment, the input module 2130, the sensor module 2161, the camera module 2171, and the like may be used to control an operation of the display module 2140 in conjunction with the processor 2110.
In an embodiment, the processor 2110 may output a command and/or data to the display module 2140, the sound output module 2163, the camera module 2171 and/or the light module 2172 based on input data received from the input module 2130. For example, the processor 2110 may generate image data corresponding to input data applied through a mouse and/or an active pen, and may output the image data to the display module 2140. Alternatively, the processor 2110 may generate command data corresponding to the input data, and may output the command data to the camera module 2171 and/or the light module 2172. When no input data is received from the input module 2130 for a certain period of time, the processor 2110 may switch an operation mode of the electronic device 2101 to a low power mode and/or a sleep mode, thereby reducing power consumption of the electronic device 2101.
In an embodiment, the processor 2110 may output a command and/or data to the display module 2140, the sound output module 2163, the camera module 2171 and/or the light module 2172 based on sensing data received from the sensor module 2161. For example, the processor 2110 may compare authentication data applied by the fingerprint sensor 2161-1 with authentication data stored in the memory 2120, and then may execute an application according to the comparison result. The processor 2110 may execute a command and/or output corresponding image data to the display module 2140 based on the sensing data sensed by the input sensor 2161-2 and/or the digitizer 2161-3. In a case where the sensor module 2161 includes a temperature sensor, the processor 2110 may receive temperature data from the sensor module 2161, and may further perform luminance correction on the image data based on the temperature data.
In an embodiment, the processor 2110 may receive measurement data about the presence and/or absence of the user, the location of the user and/or the user's line of sight from the camera module 2171. The processor 2110 may further perform luminance correction on the image data based on the measurement data. For example, after the processor 2110 determines the presence or absence of the user based on the input from the camera module 2171, the data conversion circuit 2112-2 and/or the gamma correction circuit 2112-3 may perform the luminance correction on the image data, and the processor 2110 may provide the luminance-corrected image data to the display module 2140.
In an embodiment, at least some of the above-described components may be coupled mutually and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), mobile industry processor interface (MIPI) and/or ultra-path interconnect (UPI)). The processor 2110 may communicate with the display module 2140 via an agreed interface. Further, any one of the above-described communication methods may be used between the processor 2110 and the display module 2140, but the communication method between the processor 2110 and the display module 2140 is not limited to the above-described communication method.
In an embodiment, the electronic device 2101 according to various embodiments described above may be various types of devices. For example, the electronic device 2101 may include at least one of a portable communication device (e.g., a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device and/or a home appliance. However, the electronic device 2101 according to an embodiment is not limited to the above-described devices.
The foregoing is illustrative of embodiments and is not to be construed as limiting thereof. Although a few embodiments have been described, those skilled in the art will readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of the invention. Accordingly, all such modifications are intended to be included within the scope of the invention as defined in the claims. Therefore, it is to be understood that the foregoing is illustrative of various embodiments and is not to be construed as limited to the specific embodiments disclosed, and that modifications to the disclosed embodiments, as well as other embodiments, are intended to be included within the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
10-2023-0023254 | Feb 2023 | KR | national |