This application claims priority to Korean Patent Application No. 10-2023-0062428, filed on May 15, 2023, and all the benefits accruing therefrom under 35 U.S.C. § 119, the content of which in its entirety is herein incorporated by reference.
Embodiments of the invention relate to a display device, and more particularly, to a display device which delays the appearance of an afterimage, a method of driving the display device, and an electronic apparatus including the display device.
With the development of information technologies, the importance of a display device that is a connection medium between a user and information has increased. Accordingly, display devices such as a liquid crystal display device, an organic light emitting display device, or the like are increasingly used. The organic light emitting display device displays an image using an organic light emitting diode that generates light by recombination of electrons and holes. The organic light emitting display device has a relatively high response speed, and is driven with relatively low power consumption.
When the organic light emitting display device displays the same image for a long period of time, an afterimage (or stain) may be visible in an image being displayed by the organic light emitting display device due to burn-in of the organic light emitting diode included in the organic light emitting display device. Specifically, when the organic light emitting display device displays a split image including different images with a boundary extending in a vertical direction in between, an afterimage (or stain) in the form of a line extending in the vertical direction may be visible.
Embodiments provide a display device which may delay an occurrence of an afterimage (or stain).
Embodiments provide a method of driving a display device which may delay an occurrence of an afterimage (or stain).
Embodiments provide an electronic apparatus including a display device which may delay an occurrence of an afterimage (or stain).
A display device according to an embodiment may include a driving controller which generates output image data based on input image data, a data driver which generates data voltages based on the output image data, and a display panel which displays an image based on the data voltages. The driving controller may generate a plurality of edge values in a first direction for a plurality of blocks dividing the display panel based on a plurality of luminance values for the plurality of blocks, determine a boundary area in the first direction based on a continuity of edge blocks in a second direction crossing the first direction, wherein the edge blocks are determined based on the plurality of edge values among the plurality of blocks, and generate the output image data based on the input image data and compensation data for decreasing a luminance of the image corresponding to the boundary area.
In an embodiment, each of the plurality of blocks may include a plurality of pixels.
In an embodiment, a luminance value for a block among the plurality of blocks may be an average of a plurality of luminance values corresponding to a plurality of grayscale values of the input image data for a plurality of pixels included in the block.
In an embodiment, the driving controller may include a luminance generator which generates the plurality of luminance values based on the input image data, an edge generator which generates the plurality of edge values based on the plurality of luminance values, a determiner which determines the boundary area based on the plurality of edge values, a compensation data generator which generates the compensation data based on the boundary area, and a data compensator which generates the output image data based on the input image data and the compensation data.
In an embodiment, the edge generator may generate the plurality of edge values by filtering the plurality of luminance values in the first direction using a high pass filter.
In an embodiment, the determiner may include a count generator which generates a plurality of count values for the plurality of blocks based on the plurality of edge values, and a boundary determiner which determines the edge blocks having a maximum count value among the plurality of blocks, and determines edge block columns based on the continuity of the edge blocks in the second direction as the boundary area.
In an embodiment, the count generator may increase a count value for a block among the plurality of blocks when an edge value for the block is greater than a threshold value.
In an embodiment, the count generator may maintain the count value for the block when the count value for the block is equal to the maximum count value.
In an embodiment, the count generator may decrease the count value for the block when the edge value for the block is less than or equal to the threshold value.
In an embodiment, the count generator may maintain the count value for the block when the count value for the block is equal to a minimum count value.
In an embodiment, the driving controller may further include a memory which stores the plurality of count values.
In an embodiment, the boundary determiner may calculate a number of the edge blocks included in each of a plurality of block columns extending in the second direction, and may determine block columns in which the number of the edge blocks is greater than a threshold number among the plurality of block columns as the edge block columns.
In an embodiment, the threshold number may be a half of a number of the blocks included in each of the plurality of block columns.
In an embodiment, the luminance generator may generate the plurality of luminance values for each frame period. The edge generator may generate the plurality of edge values for the frame period. The count generator may generate the plurality of count values for the frame period.
In an embodiment, the boundary determiner may determine the edge blocks and the edge block columns for a plurality of frame periods.
In an embodiment, the compensation data may include first compensation data. The compensation data generator may generate the first compensation data for gradually decreasing the luminance values for pixels included in the boundary area and a peripheral area located adjacent to the boundary area in the first direction along the first direction toward a boundary line extending in the second direction in the boundary area.
In an embodiment, the compensation data may further include second compensation data. The compensation data generator may generate the second compensation data for uniformly decreasing the luminance values for the pixels included in the display panel.
In an embodiment, a method of driving a display device including a display panel which displays an image may include generating a plurality of luminance values for a plurality of blocks dividing the display panel based on input image data, generating a plurality of edge values in a first direction for the plurality of blocks based on the plurality of luminance values, determining a boundary area in the first direction based on a continuity of edge blocks in a second direction crossing the first direction, where the edge blocks may be determined based on the plurality of edge values among the plurality of blocks, generating compensation data for decreasing a luminance of an image corresponding to the boundary area, and generating output image data based on the input image data and the compensation data.
In an embodiment, determining the boundary area may include generating a plurality of count values for the plurality of blocks based on the plurality of edge values, determining the edge blocks having a maximum count value among the plurality of blocks, and determining edge block columns determined based on the continuity of the edge blocks in the second direction as the boundary area.
In an embodiment, determining the edge block columns as the boundary area may include calculating a number of the edge blocks included in each of a plurality of block columns extending in the second direction, and determining block columns in which the number of the edge blocks is greater than a threshold number among the plurality of block columns as the edge block columns.
In an embodiment, the threshold number may be a half of a number of the blocks included in each of the plurality of block columns.
In an embodiment, the compensation data may include first compensation data. Generating the compensation data may include generating the first compensation data for gradually decreasing the luminance values for pixels included in the boundary area and a peripheral area located adjacent to the boundary area in the first direction along the first direction toward a boundary line extending in the second direction in the boundary area.
In an embodiment, the compensation data may further include second compensation data. Generating the compensation data may further include generating the second compensation data for uniformly decreasing the luminance values for the pixels included in the display panel.
An electronic apparatus according to embodiments may include a main processor which generates an image signal, a coprocessor which generates output image data by converting input image data corresponding to the image signal, and a display panel which displays an image based on the output image data. The coprocessor may generate a plurality of edge values in a first direction for a plurality of blocks based on a plurality of luminance values for the plurality of blocks dividing the display panel, determine a boundary area in the first direction based on a continuity of edge blocks in a second direction crossing the first direction, where the edge blocks may be determined based on the plurality of edge values among the plurality of blocks, and generate the output image data based on the input image data and compensation data for decreasing a luminance of the image corresponding to the boundary area.
In an embodiment, the coprocessor may include a luminance generator which generates the plurality of luminance values based on the input image data, an edge generator which generates the plurality of edge values based on the plurality of luminance values, a determiner which determines the boundary area based on the plurality of edge values, a compensation data generator which generates the compensation data based on the boundary area, and a data compensator which generates the output image data based on the input image data and the compensation data.
A display device according to embodiments may include a driving controller which generates output image data based on input image data, a data driver which generates data voltages based on the output image data, and a display panel which displays an image based on the data voltages. When the display panel displays a first image and a second image with a boundary line extending in a second direction crossing a first direction in between and grayscale values of the input image data corresponding to the first image are equal, a measured luminance of a first portion of the first image located adjacent to the boundary line may be lower than a measured luminance of a second portion of the first image located farther than the first portion in the first direction from the boundary line.
In an embodiment, the first direction may be parallel to a long side of the display panel. The second direction may be parallel to a short side of the display panel.
According to embodiments, in the display device, the method of driving the display device, and the electronic apparatus including the display device, the boundary area of the image may be determined based on the input image data, the compensation data for decreasing the luminance of the image corresponding to the boundary area may be generated, and the input image data may be compensated based on the compensation data, so that the luminance of the boundary area of the image may decrease. Accordingly, the occurrence of the afterimage (or stain) in the boundary area of the image may be delayed, and a lifetime of the display device may increase.
Illustrative, non-limiting embodiments will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings.
Hereinafter, a display device, a method of driving a display device, and an electronic apparatus according to embodiments will be described in more detail with reference to the accompanying drawings. The same or similar reference numerals will be used for the same elements in the accompanying drawings. This invention may, however, be embodied in many different forms, and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
It will be understood that when an element (or a region, a layer, a portion, or the like) is referred to as being related to another such as being “on”, “connected to” or “coupled to” another element, it may be directly disposed on, connected or coupled to the other element, or intervening elements may be disposed therebetween.
Like reference numerals or symbols refer to like elements throughout. In the drawings, the thickness, the ratio, and the size of the element are exaggerated for effective description of the technical contents. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the scope of the inventive concept. Similarly, a second element, component, region, layer or section may be termed a first element, component, region, layer or section. As used herein, the singular forms, “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Also, terms of “below”, “on lower side”, “above”, “on upper side”, or the like may be used to describe the relationships of the elements illustrated in the drawings. These terms have relative concepts and are described on the basis of the directions indicated in the drawings.
It will be further understood that the terms “comprise”, “include”, and/or “have”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, components, and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, being “disposed directly on” may mean that there is no additional layer, film, region, plate, or the like between a part and another part such as a layer, a film, a region, a plate, or the like. For example, being “disposed directly on” may mean that two layers or two members are disposed without using an additional member such as an adhesive member, therebetween.
“About” or “approximately” as used herein is inclusive of the stated value and means within an acceptable range of deviation for the particular value as determined by one of ordinary skill in the art, considering the measurement in question and the error associated with measurement of the particular quantity (i.e., the limitations of the measurement system). For example, “about” can mean within one or more standard deviations, or within ±30%, 20%, 10%, or 5% of the stated value.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In an embodiment and referring to
In an embodiment, the display panel 110 may include pixels PX. The pixels PX may display an image based on gate signals GS and data voltages VDAT.
In an embodiment, the display panel 110 may be divided into a plurality of blocks BL. The blocks BL may be arranged in a first direction DR1 and a second direction DR2 crossing the first direction DR1. Each of the blocks BL may include a plurality of pixels PX.
In an embodiment, the gate driver 120 may provide the gate signals GS to the display panel 110. The gate driver 120 may generate the gate signals GS based on a gate control signal GCS. The gate control signal GCS may include a gate start signal, a gate clock signal, etc.
In an embodiment, the data driver 130 may provide the data voltages VDAT to the display panel 110. The data driver 130 may generate the data voltages VDAT based on output image data OID and a data control signal DCS. The output image data OID may include grayscale values for the pixels PX. The data control signal DCS may include a data clock signal, a horizontal start signal, a load signal, etc.
In an embodiment, the driving controller 140 may control a driving (or operation) of the gate driver 120 and a driving (or operation) of the data driver 130. The driving controller 140 may generate the output image data OID, the gate control signal GCS, and the data control signal DCS based on input image data IID and a control signal CS. The input image data IID may include grayscale values for the pixels PX. The control signal CS may include a vertical synchronization signal, a horizontal synchronization signal, a master clock signal, a data enable signal, etc.
In an embodiment and referring to
In an embodiment, the first transistor T1 may include a gate electrode connected to a first node N1, a first electrode receiving the first power voltage ELVDD, and a second electrode connected to a second node N2. The first transistor T1 may generate a driving current based on a voltage between the first node N1 and the second node N2. The first transistor T1 may be referred to as a driving transistor.
In an embodiment, the second transistor T2 may include a gate electrode receiving the scan signal SC, a first electrode receiving the data voltage VDAT, and a second electrode connected to the first node N1. The second transistor T2 may provide the data voltage VDAT to the first node N1 in response to the scan signal SC. The second transistor T2 may be referred to as a switching transistor or a write transistor.
In an embodiment, the third transistor T3 may include a gate electrode receiving the sensing signal SS, a first electrode receiving the initialization voltage VINT, and a second electrode connected to the second node N2. The third transistor T3 may provide the initialization voltage VINT to the second node N2 in response to the sensing signal SS. The third transistor T3 may be referred to as an initialization transistor or a sensing transistor.
In an embodiment, the storage capacitor CST may include a first electrode connected to the first node N1 and a second electrode connected to the second node N2. The storage capacitor CST may store the voltage between the first node N1 and the second node N2.
In an embodiment, the light emitting diode EL may include a first electrode (or anode) connected to the second node N2 and a second electrode (or cathode) receiving the second power voltage ELVSS. The light emitting diode EL may emit light based on the driving current provided from the first transistor T1.
In an embodiment, the light emitting diode EL may be an organic light emitting diode. In another embodiment, the light emitting diode EL may be an inorganic light emitting diode or a quantum dot light emitting diode.
In an embodiment and referring to
In an embodiment, the luminance generator 141 may generate the luminance values LV for the blocks BL based on the input image data IID. The luminance generator 141 may generate the luminance values LV for each frame period.
In an embodiment, a luminance value LV for a block BL may be an average of luminance values corresponding to grayscale values of the input image data IID for pixels PX included in the block BL. In an embodiment, the luminance generator 141 may convert the grayscale values for the pixels PX included in the block BL into the luminance values for the pixels PX using a gamma curve (for example, a gamma curve with a gamma value of 2.2), and may calculate the average of the luminance values for the pixels PX as the luminance value LV for the block BL.
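The conversion described above can be sketched as follows. An 8-bit grayscale range is an assumption, the gamma value of 2.2 follows the example given above, and `block_luminance` is an illustrative name rather than anything from the specification:

```python
GAMMA = 2.2  # example gamma value from the description above

def block_luminance(grayscale_values):
    """Convert the 8-bit grayscale values of the pixels in one block BL
    to relative luminance via a gamma curve and return their average,
    which serves as the luminance value LV for the block."""
    luminances = [(g / 255.0) ** GAMMA for g in grayscale_values]
    return sum(luminances) / len(luminances)

print(block_luminance([255, 255, 255, 255]))  # all-white block -> 1.0
print(block_luminance([0, 0, 0, 0]))          # all-black block -> 0.0
```

In practice the same computation would run once per block per frame period, as described above.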
In an embodiment, the edge generator 142 may generate the edge values EV in the first direction DR1 for the blocks BL based on the luminance values LV for the blocks BL. When a difference between luminance values LV for blocks BL located adjacent in the first direction DR1 is large, edge values EV for the blocks BL located adjacent in the first direction DR1 may be large. When the difference between the luminance values LV for the blocks BL located adjacent in the first direction DR1 is small, the edge values EV for the blocks BL located adjacent in the first direction DR1 may be small. The edge generator 142 may generate the edge values EV for each frame period.
In an embodiment, the edge generator 142 may generate the edge values EV for the blocks BL by filtering the luminance values LV for the blocks BL in the first direction DR1 using a high pass filter. In an embodiment, the high pass filter may be [−1, 2, −1].
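A minimal sketch of the filtering step follows, using the [-1, 2, -1] kernel named above. Taking the absolute value of the filter response (so that a large luminance difference on either side yields a large edge value) and clamping at the row borders are assumptions for illustration; the specification only states that the kernel is applied along the first direction:

```python
def edge_values(row_luminance):
    """Filter one row of block luminance values LV with the high pass
    kernel [-1, 2, -1] along the first direction DR1 and return the
    edge values EV; the first and last blocks reuse their nearest
    neighbor (clamped border)."""
    n = len(row_luminance)
    ev = []
    for i in range(n):
        left = row_luminance[max(i - 1, 0)]
        right = row_luminance[min(i + 1, n - 1)]
        ev.append(abs(2 * row_luminance[i] - left - right))
    return ev

# A step in luminance between adjacent blocks produces large edge
# values at the two blocks straddling the step:
print(edge_values([0.0, 0.0, 0.0, 1.0, 1.0, 1.0]))
```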
In an embodiment, the determiner 143 may determine the boundary area BA in the first direction DR1 based on the continuity of the edge blocks in the second direction DR2, which is determined based on the edge values EV for the blocks BL. The boundary area BA may extend in the second direction DR2. In an embodiment, the determiner 143 may include a count generator 143-1 and a boundary determiner 143-2.
In an embodiment, the count generator 143-1 may generate count values CV for the blocks BL based on the edge values EV for the blocks BL. The count generator 143-1 may generate the count values CV for each frame period.
In an embodiment, the count generator 143-1 may increase the count value CV for the block BL when the edge value EV for the block BL is greater than a threshold value. The threshold value may be a value that serves as a reference for determining the size of the edge value EV for the block BL. In an embodiment, the count generator 143-1 may add 1 to the count value CV for the block BL when the edge value EV for the block BL is greater than the threshold value.
In an embodiment, the count generator 143-1 may maintain the count value CV for the block BL when the count value CV for the block BL is equal to a maximum count value. Accordingly, an upper limit of the count value CV for the block BL may be the maximum count value, and the count value CV for the block BL may be prevented from excessively increasing.
In an embodiment, the count generator 143-1 may decrease the count value CV for the block BL when the edge value EV for the block BL is less than or equal to the threshold value. In an embodiment, the count generator 143-1 may add −1 to the count value CV for the block BL when the edge value EV for the block BL is less than or equal to the threshold value.
In an embodiment, the count generator 143-1 may maintain the count value CV for the block BL when the count value CV for the block BL is equal to a minimum count value. Accordingly, a lower limit of the count value CV for the block BL may be the minimum count value, and the count value CV for the block BL may be prevented from excessively decreasing.
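The per-frame count update described above can be sketched as a single clamped step. The numeric values of the maximum count value, the minimum count value, and the threshold value are assumptions for illustration; the specification only requires that such limits and a threshold exist:

```python
CV_MIN, CV_MAX = 0, 15   # assumed minimum and maximum count values
THRESHOLD = 0.1          # assumed threshold value for the edge value EV

def update_count(count, edge_value):
    """Update the count value CV for one block BL for one frame period:
    increase by 1 while the edge value EV exceeds the threshold value,
    decrease by 1 otherwise, clamped to [CV_MIN, CV_MAX] so the count
    neither excessively increases nor excessively decreases."""
    if edge_value > THRESHOLD:
        return min(count + 1, CV_MAX)
    return max(count - 1, CV_MIN)
```

Applied every frame period, the count value saturates at `CV_MAX` only for blocks whose edge value stays above the threshold over many frames.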
In an embodiment, the memory 144 may store the count values CV for the blocks BL. The memory 144 may store the count values CV for each frame period. The memory 144 may not store the luminance values LV or the edge values EV for the blocks BL.
In an embodiment, the boundary determiner 143-2 may determine the edge blocks and edge block columns based on the count values CV for the blocks BL. The boundary determiner 143-2 may periodically determine the edge blocks and the edge block columns for a plurality of frame periods.
In an embodiment, the boundary determiner 143-2 may determine the blocks BL having the maximum count value among the blocks BL as the edge blocks. Accordingly, the blocks BL in which the edge value EV is greater than the threshold value over a plurality of frame periods may be determined as the edge blocks. The boundary determiner 143-2 may determine the edge block columns based on the continuity of the edge blocks in the second direction DR2 as the boundary area BA. To determine the continuity of the edge blocks in the second direction DR2, the boundary determiner 143-2 may calculate the number of edge blocks included in each of block columns extending in the second direction DR2, and may determine the block columns in which the number of edge blocks is greater than a threshold number among the block columns as the edge block columns. The threshold number may be a reference for determining the continuity of the edge blocks in the second direction DR2.
In an embodiment, the threshold number may be a half of the number of blocks BL included in each of the block columns. In this case, when the number of edge blocks included in the block column is greater than a half of the number of blocks BL included in the block column, the block column may be determined as the edge block column included in the boundary area BA.
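The column determination described above can be sketched as follows. The `counts` layout (a 2-D list indexed `[row][column]`) and the function names are assumptions; the saturated-count criterion for edge blocks and the half-the-column threshold follow the description:

```python
def edge_block_columns(counts, cv_max):
    """Determine the edge block columns from per-block count values CV.
    A block BL is an edge block when its count value has reached the
    maximum count value cv_max; a block column is an edge block column
    when the number of its edge blocks exceeds half the number of
    blocks in the column."""
    rows = len(counts)
    cols = len(counts[0])
    threshold_number = rows // 2
    result = []
    for c in range(cols):
        n_edge = sum(1 for r in range(rows) if counts[r][c] == cv_max)
        if n_edge > threshold_number:
            result.append(c)
    return result
```

Contiguous edge block columns returned here (e.g., the eighth and ninth columns in the example below) together form the boundary area BA.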
In an embodiment, the compensation data generator 146 may generate compensation data CD for decreasing the luminance of the image based on the boundary area BA.
In an embodiment, the data compensator 147 may generate the output image data OID based on the input image data IID and the compensation data CD.
Hereinafter, determination of the boundary area BA based on the input image data IID will be described with reference to
In an embodiment and referring to
In an embodiment and referring to
In an embodiment and as illustrated in
In an embodiment and referring to
In an embodiment, when the input image data IID corresponding to the input image IMG_IN illustrated in
For example, in an embodiment and as illustrated in
In an embodiment, the boundary determiner 143-2 may determine the edge block columns based on the continuity of the edge blocks BL_EG in the second direction DR2. In an embodiment, the boundary determiner 143-2 may determine block columns in which the number of edge blocks BL_EG is greater than a threshold number among the block columns C1-C16 to be the edge block columns. For example, the threshold number may be 9, which is a half of the number of blocks included in each of the block columns C1-C16, and the boundary determiner 143-2 may determine the eighth and ninth block columns C8 and C9 in which the number of edge blocks BL_EG is greater than 9 to be the edge block columns. The boundary determiner 143-2 may determine the edge block columns C8 and C9 to be the boundary area BA.
Hereinafter, the generation of the compensation data CD based on the boundary area BA will be described with reference to
In an embodiment and referring to
In an embodiment, the peripheral area PA may include two block columns disposed adjacent to the boundary area BA in the first direction DR1. For example, as illustrated in
In an embodiment, the block compensation data CD_BL may include gain values for the blocks BL. The gain value for each of the blocks BL included in the boundary area BA may be a first gain value G1, the gain value for each of the blocks BL included in the peripheral area PA may be a second gain value G2, and the gain value for each of the blocks BL included in a non-boundary area NBA excluding the boundary area BA and the peripheral area PA may be 1. The second gain value G2 may be greater than 0 and less than 1. The first gain value G1 may be greater than 0 and less than the second gain value G2.
In an embodiment and referring to
In an embodiment, the first compensation data CD1 may include gain values for the pixels PX. The compensation data generator 146 may generate the gain values for the pixels PX included in the first compensation data CD1 based on the gain values for the blocks BL included in the block compensation data CD_BL. The gain value for each of the pixels PX located in the non-boundary area NBA may be 1. The gain values for the pixels PX located in the peripheral area PA may gradually decrease from 1 to the second gain value G2 along the first direction DR1 from a boundary between the non-boundary area NBA and the peripheral area PA toward a boundary between the peripheral area PA and the boundary area BA. The gain values for the pixels PX located in the boundary area BA may gradually decrease from the second gain value G2 to the first gain value G1 along the first direction DR1 from the boundary between the peripheral area PA and the boundary area BA toward the boundary line BDL.
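The per-pixel gain profile described above can be sketched as a piecewise ramp. Linear interpolation is an assumption (the specification only states that the gains gradually decrease), and the widths and gain values are illustrative parameters:

```python
def pixel_gain(distance, ba_width, pa_width, g1, g2):
    """Gain value of the first compensation data CD1 for a pixel PX at
    `distance` pixels (along the first direction DR1) from the boundary
    line BDL: 1 in the non-boundary area NBA, ramping 1 -> G2 across
    the peripheral area PA and G2 -> G1 across the boundary area BA
    toward the boundary line."""
    if distance >= ba_width + pa_width:
        return 1.0                              # non-boundary area NBA
    if distance >= ba_width:                    # peripheral area PA: 1 -> G2
        t = (distance - ba_width) / pa_width    # 0 at the BA/PA boundary
        return g2 + t * (1.0 - g2)
    t = distance / ba_width                     # boundary area BA: G2 -> G1
    return g1 + t * (g2 - g1)
```

Multiplying each pixel's grayscale-derived luminance by this gain yields the gradual darkening toward the boundary line BDL described above.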
In an embodiment and referring to
In an embodiment, when the input image IMG_IN corresponding to the input image data IID is displayed for a long time without compensation of the input image data IID, an afterimage (or stain) corresponding to content displayed by the input image IMG_IN may be recognized from the display panel 110. Specifically, as illustrated in
In an embodiment, when the input image IMG_IN including the first image IMG1 and the second image IMG2 that display different contents with the boundary line BDL extending in the second direction DR2 in between is displayed for a long time, the driving controller 140 may determine the boundary area BA based on the input image data IID, may generate the compensation data CD for gradually decreasing the luminance values for the pixels PX included in the boundary area BA and peripheral area PA along the first direction DR1 toward the boundary line BDL, and may generate the output image data OID based on the input image data IID and the compensation data CD, so that the output image IMG_OUT1 in which the luminance gradually decreases in the first direction DR1 toward the boundary line BDL near the boundary line BDL may be displayed as illustrated in
Hereinafter, the generation of the compensation data CD based on the boundary area BA according to another embodiment will be described with reference to
In an embodiment and referring to
In an embodiment, the second compensation data CD2 may include gain values for the pixels PX. The gain value for each of the pixels PX included in the display panel 110 may be a third gain value G3. The third gain value G3 may be greater than 0 and less than 1. In an embodiment, the third gain value G3 may be greater than the second gain value G2.
In an embodiment, when the gain values of the first compensation data CD1 are multiplied by the gain values of the second compensation data CD2, the gain value for each of the pixels PX located in the non-boundary area NBA may be the third gain value G3. The gain values for the pixels PX located in the peripheral area PA may gradually decrease from the third gain value G3 to a product (G2×G3) of the second gain value G2 and the third gain value G3 along the first direction DR1 from the boundary between the non-boundary area NBA and the peripheral area PA toward the boundary between the peripheral area PA and the boundary area BA, and the gain values for the pixels PX located in the boundary area BA may gradually decrease from the product (G2×G3) of the second gain value G2 and the third gain value G3 to a product (G1×G3) of the first gain value G1 and the third gain value G3 along the first direction DR1 from the boundary between the peripheral area PA and the boundary area BA toward the boundary line BDL.
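As a worked illustration of the multiplication described above, with hypothetical gain values (the specification does not fix any numeric values), the effective gain at each position is the product of the first and second compensation gains:

```python
G1, G2, G3 = 0.5, 0.8, 0.9   # hypothetical gain values for illustration

def effective_gain(cd1_gain):
    """The second compensation data CD2 applies the uniform gain G3 to
    every pixel PX, so the effective gain is the product of the CD1
    gain value and G3."""
    return cd1_gain * G3

print(effective_gain(1.0))   # non-boundary area NBA: G3 = 0.9
print(effective_gain(G2))    # PA/BA boundary: G2 x G3, approximately 0.72
print(effective_gain(G1))    # at the boundary line BDL: G1 x G3, approximately 0.45
```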
In an embodiment and referring to
In an embodiment and as illustrated in
Hereinafter, the generation of the compensation data CD based on the boundary area BA according to another embodiment will be described with reference to
In an embodiment and referring to
In an embodiment, the block compensation data CD_BL may include gain values for the blocks BL. The gain value for each of the blocks BL included in the boundary area BA may be the first gain value G1, the gain value for each of the blocks BL included in the first peripheral area PA1 may be a fourth gain value G4, the gain value for each of the blocks BL included in the second peripheral area PA2 may be the second gain value G2, and the gain value for each of the blocks BL included in the non-boundary area NBA may be 1. The fourth gain value G4 may be greater than 0 and less than 1. The fourth gain value G4 may be greater than or less than the second gain value G2.
In an embodiment and referring to
In an embodiment, the first compensation data CD1 may include gain values for the pixels PX. The compensation data generator 146 may generate the gain values for the pixels PX included in the first compensation data CD1 based on the gain values for the blocks BL included in the block compensation data CD_BL. The gain value for each of the pixels PX located in the non-boundary area NBA may be 1, the gain values for the pixels PX located in the first peripheral area PA1 may gradually decrease from 1 to the fourth gain value G4 along the first direction DR1 from the boundary between the non-boundary area NBA and the first peripheral area PA1 toward a boundary between the first peripheral area PA1 and the boundary area BA, the gain values for the pixels PX located in the second peripheral area PA2 may gradually decrease from 1 to the second gain value G2 along the first direction DR1 from the boundary between the non-boundary area NBA and the second peripheral area PA2 toward a boundary between the second peripheral area PA2 and the boundary area BA, the gain values for the pixels PX located between the boundary between the first peripheral area PA1 and the boundary area BA and the boundary line BDL may gradually decrease from the fourth gain value G4 to the first gain value G1 along the first direction DR1 from the boundary between the first peripheral area PA1 and the boundary area BA toward the boundary line BDL, and the gain values for the pixels PX located between the boundary between the second peripheral area PA2 and the boundary area BA and the boundary line BDL may gradually decrease from the second gain value G2 to the first gain value G1 along the first direction DR1 from the boundary between the second peripheral area PA2 and the boundary area BA toward the boundary line BDL.
In an embodiment, as illustrated in
In an embodiment and referring to
In an embodiment, when the display panel 110 displays the first image IMG1 and the second image IMG2 with the boundary line BDL extending in the second direction DR2 in between for a long period of time and the grayscale values of the input image data IID corresponding to the first image IMG1 are equal, since the first portion P1 of the first image IMG1 is located closer to the boundary line BDL in the first direction DR1 than the second portion P2 of the first image IMG1, the gain values of the compensation data CD multiplied by the grayscale values of the input image data IID corresponding to the first portion P1 of the first image IMG1 may be less than the gain values of the compensation data CD multiplied by the grayscale values of the input image data IID corresponding to the second portion P2 of the first image IMG1. Accordingly, the grayscale values of the output image data OID corresponding to the first portion P1 of the first image IMG1 may be less than the grayscale values of the output image data OID corresponding to the second portion P2 of the first image IMG1, and the measured luminance of the first portion P1 of the first image IMG1 may be lower than the measured luminance of the second portion P2 of the first image IMG1.
In an embodiment, the first direction DR1 may be parallel to a long side SD1 of the display panel 110, and the second direction DR2 may be parallel to a short side SD2 of the display panel 110. When the display panel 110 has a rectangular planar shape with opposite long sides SD1 facing each other and opposite short sides SD2 facing each other, the user may use the display device 100 by dividing the display device 100 into two areas with the boundary line BDL extending parallel to the short side SD2 in between.
In an embodiment and referring to
In an embodiment, the edge generator 142 of the driving controller 140 may generate the edge values EV in the first direction DR1 for the blocks BL based on the luminance values LV for the blocks BL (S120). The edge generator 142 may generate the edge values EV for each frame period.
In an embodiment, the edge generator 142 may generate the edge values EV for the blocks BL by filtering the luminance values LV for the blocks BL in the first direction DR1 using a high pass filter. In an embodiment, the high pass filter may be [−1, 2, −1].
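The [−1, 2, −1] high pass filtering along the first direction DR1 can be sketched as follows (the list representation of one block row and the zero edge values at the two ends, where the filter lacks a neighbor, are assumptions for illustration):

```python
def edge_values(luminance_row):
    """Apply the [-1, 2, -1] high pass filter to the luminance values LV
    of one row of blocks BL along the first direction DR1.

    Edge values at the two ends are set to 0, since the filter needs both
    neighbors (this boundary handling is an assumption).
    """
    n = len(luminance_row)
    ev = [0] * n
    for i in range(1, n - 1):
        ev[i] = (-luminance_row[i - 1]
                 + 2 * luminance_row[i]
                 - luminance_row[i + 1])
    return ev

# A flat region yields 0; a luminance step between two images yields
# nonzero edge values at the blocks adjacent to the step.
print(edge_values([10, 10, 10, 50, 50, 50]))  # → [0, 0, -40, 40, 0, 0]
```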
In an embodiment, the determiner 143 of the driving controller 140 may determine the boundary area BA in the first direction DR1 based on the continuity of the edge blocks in the second direction DR2 which is determined based on the edge values EV for the blocks BL (S130).
In an embodiment and referring to
In an embodiment and referring to
In an embodiment, the count generator 143-1 may maintain the count value CV for the block BL when the edge value EV for the block BL is greater than the threshold value and the count value CV for the block BL is equal to the maximum count value (S131-2).
In an embodiment, the count generator 143-1 may decrease the count value CV for the block BL when the edge value EV for the block BL is less than or equal to the threshold value and the count value CV for the block BL is greater than the minimum count value (S131-3). In an embodiment, the count generator 143-1 may add −1 to the count value CV for the block BL when the edge value EV for the block BL is less than or equal to the threshold value and the count value CV for the block BL is greater than the minimum count value.
In an embodiment, the count generator 143-1 may maintain the count value CV for the block BL when the edge value EV for the block BL is less than or equal to the threshold value and the count value CV for the block BL is equal to the minimum count value (S131-2).
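Taken together, the steps above describe a saturating counter per block: the count value CV rises while the edge value EV exceeds the threshold value and falls otherwise, clamped between the minimum and maximum count values. A minimal sketch (the +1 increment and the numeric bounds are assumptions chosen to mirror the −1 decrement described above):

```python
def update_count(count, edge_value, threshold, min_count=0, max_count=15):
    """One update of a block's count value CV as a saturating counter.

    The -1 decrement is stated in the text; the symmetric +1 increment and
    the default min/max bounds are assumptions for illustration.
    """
    if edge_value > threshold:
        return min(count + 1, max_count)   # increase, saturate at maximum
    return max(count - 1, min_count)       # decrease, saturate at minimum
```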
In an embodiment and referring to
In an embodiment and referring to
In an embodiment, the threshold number may be half of the number of blocks BL included in each of the block columns. In this case, when the number of edge blocks included in a block column is greater than half of the number of blocks BL included in the block column, the block column may be determined as an edge block column included in the boundary area BA.
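A sketch of this column test (the Boolean grid marking which blocks were determined to be edge blocks is an assumed representation):

```python
def edge_block_columns(edge_flags):
    """edge_flags[r][c] is True when the block BL at row r, column c was
    determined to be an edge block. A column is an edge block column when
    more than half of its blocks are edge blocks (the threshold number
    described above).
    """
    rows = len(edge_flags)
    cols = len(edge_flags[0])
    result = []
    for c in range(cols):
        n_edges = sum(1 for r in range(rows) if edge_flags[r][c])
        result.append(n_edges > rows / 2)
    return result

# Column 1 has 2 of 3 edge blocks (> half), column 0 has only 1:
print(edge_block_columns([[False, True],
                          [False, True],
                          [True,  False]]))  # → [False, True]
```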
In an embodiment and referring to
In an embodiment and referring to
In an embodiment, the first compensation data CD1 may include gain values for the pixels PX. The gain value for each of the pixels PX located in the non-boundary area NBA may be 1, the gain values for the pixels PX located in the peripheral area PA may gradually decrease from 1 to the second gain value G2 along the first direction DR1 from the boundary between the non-boundary area NBA and the peripheral area PA toward the boundary between the peripheral area PA and the boundary area BA, and the gain values for the pixels PX located in the boundary area BA may gradually decrease from the second gain value G2 to the first gain value G1 along the first direction DR1 from the boundary between the peripheral area PA and the boundary area BA toward the boundary line BDL.
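The gain profile described above is piecewise linear along the first direction DR1 on one side of the boundary line BDL. A minimal sketch (the area widths in pixels and the linear interpolation are assumptions):

```python
def gain_profile(n_nba, n_pa, n_ba, g1, g2):
    """Per-pixel gains of the first compensation data CD1 along DR1 toward
    the boundary line BDL, for one side of the boundary.

    - non-boundary area NBA (n_nba pixels): gain 1
    - peripheral area PA (n_pa pixels): linear ramp from 1 down to G2
    - boundary area BA (n_ba pixels): linear ramp from G2 down to G1
    """
    gains = [1.0] * n_nba
    for i in range(1, n_pa + 1):           # ramp 1 -> G2 across the PA
        gains.append(1.0 + (g2 - 1.0) * i / n_pa)
    for i in range(1, n_ba + 1):           # ramp G2 -> G1 across the BA
        gains.append(g2 + (g1 - g2) * i / n_ba)
    return gains

# With G1 = 0.5 and G2 = 0.8, gains decrease monotonically toward BDL:
print(gain_profile(2, 2, 2, g1=0.5, g2=0.8))
```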
In an embodiment, the data compensator 147 may generate the output image data OID based on the input image data IID and the first compensation data CD1 (S151). In an embodiment, the data compensator 147 may calculate the grayscale values of the output image data OID by multiplying the grayscale values of the input image data IID by the gain values of the first compensation data CD1.
In an embodiment and referring to
In an embodiment, the second compensation data CD2 may include gain values for the pixels PX. The gain value for each of the pixels PX included in the display panel 110 may be the third gain value G3.
In an embodiment, the data compensator 147 may generate the output image data OID based on the input image data IID, the first compensation data CD1, and the second compensation data CD2 (S152). In an embodiment, the data compensator 147 may calculate the grayscale values of the output image data OID by multiplying the grayscale values of the input image data IID by the gain values of the first compensation data CD1 and the gain values of the second compensation data CD2.
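The multiplication in steps S151 and S152 can be sketched as a single per-pixel operation (rounding the result back to an integer grayscale code is an assumption):

```python
def compensate(grayscale, cd1_gain, cd2_gain=1.0):
    """Grayscale value of the output image data OID = grayscale value of
    the input image data IID multiplied by the CD1 gain and, when the
    second compensation data is used (S152), the CD2 gain.
    """
    return round(grayscale * cd1_gain * cd2_gain)

# Grayscale 200 attenuated by a CD1 gain of 0.9 and a CD2 gain of 0.95:
print(compensate(200, 0.9, 0.95))  # → 171
```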
Referring to
In an embodiment and referring to
In an embodiment, as illustrated in
In an embodiment, the processor 1010 may obtain an external input through an input module 1030 or a sensor module 1061, and may execute an application corresponding to the external input. For example, when the user selects a camera icon displayed on the display panel 1041, the processor 1010 may obtain a user input through an input sensor 1061-2, and may activate a camera module 1071. The processor 1010 may transmit image data corresponding to a captured image acquired through the camera module 1071 to the display module 1040. The display module 1040 may display an image corresponding to the captured image through the display panel 1041. Some of the components of the electronic apparatus 1000 may be integrated and provided as one component, or one component may be divided into two or more components.
In an embodiment, the electronic apparatus 1000 may communicate with an external electronic apparatus 1002 through a network (e.g., a short-range wireless communication network or a long-range wireless communication network). In an embodiment, the electronic apparatus 1000 may include the processor 1010, the memory 1020, the input module 1030, the display module 1040, a power module 1050, an internal module 1060, and an external module 1070. In an embodiment, the electronic apparatus 1000 may omit at least one of the above-described components, or one or more other components may be added. In an embodiment, some of the above-described components (e.g., a sensor module 1061, an antenna module 1062, or a sound output module 1063) may be integrated into another component (e.g., the display module 1040).
In an embodiment, the processor 1010 may execute software to control at least one other component (e.g., hardware or software component) of the electronic apparatus 1000 connected to the processor 1010, and may perform various data processing or calculation. In an embodiment, as at least part of data processing or calculation, the processor 1010 may store commands or data received from another component (e.g., the input module 1030, the sensor module 1061, or a communication module 1073) in a volatile memory 1021, may process the commands or data stored in the volatile memory 1021, and may store resultant data in a non-volatile memory 1022.
In an embodiment, the processor 1010 may include a main processor 1011 and a coprocessor 1012. The main processor 1011 may include one or more of a central processing unit (CPU) 1011-1 or an application processor (AP). The main processor 1011 may further include one or more of a graphics processing unit (GPU) 1011-2, a communication processor (CP), and an image signal processor (ISP). At least two of the above-described processing units and processors may be implemented as an integrated component (e.g., a single chip), or each may be implemented as an independent component (e.g., a plurality of chips).
In an embodiment, the coprocessor 1012 may include a controller 1012-1. The controller 1012-1 may include an interface conversion circuit and a timing control circuit. The controller 1012-1 may receive an image signal from the main processor 1011, may convert the data format of the image signal to suit the interface specification of the display module 1040, and may output image data. The controller 1012-1 may output various control signals necessary for driving the display module 1040.
In an embodiment, the coprocessor 1012 may further include a data conversion circuit 1012-2, a gamma compensation circuit 1012-3, a rendering circuit 1012-4, etc. The data conversion circuit 1012-2 may receive the image data from the controller 1012-1, and may compensate the image data such that the image is displayed at a desired brightness according to the characteristics of the electronic apparatus 1000 or the user's settings or may convert the image data to reduce power consumption or compensate for afterimages. The data conversion circuit 1012-2 may include at least one of the luminance generator 141, the edge generator 142, the determiner 143, the compensation data generator 146, and the data compensator 147 in
In an embodiment, the memory 1020 may store various data used by at least one component of the electronic apparatus 1000 (e.g., the processor 1010 or the sensor module 1061) and input data or output data for commands related thereto. The memory 1020 may include at least one of the volatile memory 1021 and the non-volatile memory 1022. The memory 1020 may include the memory 144 in
In an embodiment, the input module 1030 may receive commands or data to be used in components of the electronic apparatus 1000 (e.g., the processor 1010, the sensor module 1061, or the sound output module 1063) from the outside of the electronic apparatus 1000 (e.g., the user or the external electronic apparatus 1002).
In an embodiment, the input module 1030 may include a first input module 1031 through which commands or data are input from the user, and a second input module 1032 through which commands or data are input from the external electronic apparatus 1002. The first input module 1031 may include a microphone, a mouse, a keyboard, a key (e.g., a button), or a pen (e.g., a passive pen or an active pen). The second input module 1032 may support a designated protocol for connecting to the external electronic apparatus 1002 by wire or wirelessly. In an embodiment, the second input module 1032 may include a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface. The second input module 1032 may include a connector that can be physically connected to the external electronic apparatus 1002, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
In an embodiment, the display module 1040 may provide visual information to the user. The display module 1040 may include the display panel 1041, a scan driver 1042, and the data driver 1043. The display module 1040 may further include a window, a chassis, and a bracket to protect the display panel 1041. The display module 1040 may correspond to the display device 100 in
In an embodiment, the power module 1050 may supply power to components of the electronic apparatus 1000. The power module 1050 may include a battery that stores power. The battery may include a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell. The power module 1050 may include a power management integrated circuit (PMIC). The PMIC may supply optimized power to each of the above-described modules and the modules described below. The power module 1050 may include a wireless power transmission/reception member electrically connected to the battery. The wireless power transmission/reception member may include a plurality of coil-shaped antenna radiators.
In an embodiment, the electronic apparatus 1000 may further include the internal module 1060 and the external module 1070. The internal module 1060 may include the sensor module 1061, the antenna module 1062, and the sound output module 1063. The external module 1070 may include the camera module 1071, a light module 1072, and a communication module 1073.
In an embodiment, the sensor module 1061 may detect an input by the user's body or an input by the pen among the first input module 1031, and may generate an electrical signal or a data value corresponding to the input. The sensor module 1061 may include at least one of a fingerprint sensor 1061-1, an input sensor 1061-2, and a digitizer 1061-3.
In an embodiment, the processor 1010 may output commands or data to the display module 1040, the sound output module 1063, the camera module 1071, or the light module 1072 based on the input data received from the input module 1030. For example, the processor 1010 may generate image data in response to input data applied through the mouse or the active pen and output the image data to the display module 1040, or may generate command data in response to the input data to output the command data to the camera module 1071 or the light module 1072. When no input data is received from the input module 1030 for a certain period of time, the processor 1010 may switch an operation mode of the electronic apparatus 1000 to a low-power mode or a sleep mode to reduce power consumption of the electronic apparatus 1000.
In an embodiment, the processor 1010 may output commands or data to the display module 1040, the sound output module 1063, the camera module 1071, or the light module 1072 based on sensing data received from the sensor module 1061. For example, the processor 1010 may compare authentication data acquired through the fingerprint sensor 1061-1 with authentication data stored in the memory 1020, and then may execute an application according to the comparison result. The processor 1010 may execute a command or may output corresponding image data to the display module 1040 based on sensing data detected by the input sensor 1061-2 or the digitizer 1061-3. When the sensor module 1061 includes a temperature sensor, the processor 1010 may receive temperature data measured by the sensor module 1061, and may further perform luminance correction for the image data or the like based on the temperature data.
In an embodiment, the display device may be applied to display devices included in a computer, a notebook computer, a mobile phone, a smart phone, a smart pad, a portable multimedia player (PMP), a personal digital assistant (PDA), an MP3 player, or the like.
Although the display devices, the methods of driving the display devices, and the electronic apparatuses according to the embodiments have been described with reference to the drawings, the illustrated embodiments are examples, and may be modified and changed by a person having ordinary knowledge in the relevant technical field without departing from the scope of the invention.
The foregoing is illustrative of embodiments and is not to be construed as limiting thereof. Although a few embodiments have been described, those skilled in the art will readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of the invention. Accordingly, all such modifications are intended to be included within the scope of the invention. Therefore, it is to be understood that the foregoing is illustrative of various embodiments and is not to be construed as limited to the specific embodiments disclosed, and that modifications to the disclosed embodiments, as well as other embodiments, are intended to be included within the scope of the invention. Moreover, the embodiments or parts of the embodiments may be combined in whole or in part without departing from the scope of the invention.
Number | Date | Country | Kind
---|---|---|---
10-2023-0062428 | May 2023 | KR | national