The present application relates to display technology, and more particularly to a display driving method and device, and a display device.
With the development of technology, the resolution of display panels has gradually improved, currently reaching up to 8K (with a resolution of 7680×4320). With the size of the display panel fixed, increasing the resolution reduces the aperture ratio and thus the light transmittance of the display panel. Therefore, display panels using an 8-domain pixel electrode structure to improve the viewing angle cannot be applied in high-resolution products because of the loss of light transmittance; instead, a 4-domain pixel electrode structure is used. However, display panels with the 4-domain pixel electrode structure suffer from a degraded viewing angle, and therefore require viewing angle compensation to improve their viewing angle performance.
In viewing angle compensation, a common approach is to use a plurality of sub-pixels to form a grayscale pixel group, which includes a high-grayscale sub-pixel and a low-grayscale sub-pixel. With such grayscale pixel groups, the display effect at oblique viewing angles can be improved. In an existing sub-pixel array structure for viewing angle compensation, each column of sub-pixels is provided with a data line, and the sub-pixels in each column are connected to the same data line. In some viewing angle compensation approaches, in order to reduce flicker on the screen, adjacent data lines are set to have the same polarity. The adjacent data lines therefore carry repeated polarities such as “positive, positive” and “negative, negative”. Because the voltage drops of the coupling capacitance on the adjacent data lines in these two cases cannot cancel each other out, there is a high risk of crosstalk in the column direction.
The present application provides a display driving method and device of a display device, and a display device, so as to alleviate the crosstalk risk caused by the voltage drops of the coupling capacitors on adjacent data lines failing to cancel each other out.
The present application provides a display driving method for a display device, the display device including:
the display driving method including the following steps:
Optionally, in some embodiments of the present application, before the step of obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color according to the first to-be-processed chroma dataset and the second to-be-processed chroma dataset, the method further includes:
Optionally, in some embodiments of the present application, the step of obtaining the first initial chroma dataset and the second initial chroma dataset of the preset scene in the preprocessed images with respect to the preset color includes:
Optionally, in some embodiments of the present application, the step of creating the Gaussian model for the preset color according to the first initial chroma dataset and the second initial chroma dataset includes:
Optionally, in some embodiments of the present application, the step of obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color from the Gaussian model, according to the first to-be-processed chroma dataset and the second to-be-processed chroma dataset includes:
Optionally, in some embodiments of the present application, there are multiple preset scenes and multiple preset colors,
Optionally, in some embodiments of the present application, the sub-pixels of adjacent rows of each grayscale pixel group include high-grayscale sub-pixels and low-grayscale sub-pixels.
Optionally, in some embodiments of the present application, the sub-pixels of each grayscale pixel group along a row direction are arranged in an alternate manner with high grayscale and low grayscale.
Optionally, in some embodiments of the present application, the sub-pixels of each grayscale pixel group along adjacent rows include first row-sub-pixels and second row-sub-pixels, and the first row-sub-pixels are low-grayscale sub-pixels, and the second row-sub-pixels are high-grayscale sub-pixels.
Optionally, in some embodiments of the present application, each grayscale pixel group includes sub-pixels in a 2×6 matrix, and an arrangement of the sub-pixels of each grayscale pixel group along a row direction is as follows: “high grayscale, low grayscale, high grayscale, low grayscale, high grayscale and low grayscale”.
Correspondingly, the present application provides a display driving device for a display device, the display device including:
the display driving device including:
Correspondingly, the present application further provides a display device, including a processor, a storage and a computer program stored in the storage and executable on the processor, wherein the processor executes the computer program to implement steps of a display driving method for the display device,
the display device including:
the display driving method including the following steps:
Optionally, in some embodiments of the present application, before the step of obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color according to the first to-be-processed chroma dataset and the second to-be-processed chroma dataset, the method further includes:
Optionally, in some embodiments of the present application, the step of obtaining the first initial chroma dataset and the second initial chroma dataset of the preset scene in the preprocessed images with respect to the preset color includes:
Optionally, in some embodiments of the present application, the step of creating the Gaussian model for the preset color according to the first initial chroma dataset and the second initial chroma dataset includes:
Optionally, in some embodiments of the present application, the step of obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color from the Gaussian model, according to the first to-be-processed chroma dataset and the second to-be-processed chroma dataset includes:
Optionally, in some embodiments of the present application, there are multiple preset scenes and multiple preset colors,
Optionally, in some embodiments of the present application, the sub-pixels of adjacent rows of each grayscale pixel group include high-grayscale sub-pixels and low-grayscale sub-pixels.
Optionally, in some embodiments of the present application, the sub-pixels of each grayscale pixel group along a row direction are arranged in an alternate manner with high grayscale and low grayscale.
Optionally, in some embodiments of the present application, the sub-pixels of each grayscale pixel group along adjacent rows include first row-sub-pixels and second row-sub-pixels, and the first row-sub-pixels are low-grayscale sub-pixels, and the second row-sub-pixels are high-grayscale sub-pixels.
The present application provides a display driving method and device, and a display device. The display driving method includes the following steps: obtaining a first to-be-processed chroma dataset and a second to-be-processed chroma dataset of a preset scene in a to-be-displayed image with respect to a preset color; obtaining a Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color according to the first to-be-processed chroma dataset and the second to-be-processed chroma dataset; when the Gaussian probability is greater than or equal to a predetermined threshold, setting polarities of the sub-pixels in adjacent columns of each grayscale pixel group to be opposite to each other, setting the polarities of the sub-pixels adjacent to the grayscale pixel group in a row direction to be symmetrical to the polarities of the sub-pixels of the grayscale pixel group, and then displaying the to-be-displayed image; and when the Gaussian probability is less than the predetermined threshold, setting the polarities of the sub-pixels in adjacent columns to be opposite to each other, and then displaying the to-be-displayed image. By obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color, comparing the Gaussian probability with the predetermined threshold, and determining whether to perform the viewing angle compensation on the to-be-displayed image based on the comparison result, the present application reduces the risk of crosstalk caused by the voltage drops of the coupling capacitance on the adjacent data lines failing to cancel each other out.
For explaining the technical solutions used in the embodiments of the present application more clearly, the appended figures to be used in describing the embodiments will be briefly introduced in the following. Obviously, the appended figures described below are only some of the embodiments of the present application, and those of ordinary skill in the art can further obtain other figures according to these figures without making any inventive effort.
The technical solutions in the embodiments of the present application are clearly and completely described below with reference to the appended drawings of the embodiments of the present application. Obviously, the described embodiments are merely a part of the embodiments of the present application and not all of them. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without making any inventive effort are within the scope of the present application.
In the description of the present application, it is to be understood that terms such as “center”, “longitudinal”, “lateral”, “length”, “width”, “thickness”, “upper”, “lower”, “front”, “rear”, “left”, “right”, “vertical”, “horizontal”, “top”, “bottom”, “inner”, “outer”, and the like indicate orientations or positional relationships based on those shown in the drawings, and are used only to facilitate and simplify the description of the present application; they are not intended to indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the present disclosure. In addition, the terms “first” and “second” are used for descriptive purposes only and should not be taken to indicate or imply relative importance, or to implicitly indicate the number of the indicated technical features. Thus, a feature defined with “first” or “second” may explicitly or implicitly include one or more of such features. In the description of the present application, “a plurality” means two or more unless explicitly defined otherwise.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is provided to enable those of ordinary skill in the art to implement and utilize the present application, and details are provided below for the purpose of explanation. It should be understood that those of ordinary skill in the art will recognize that the present application is also achievable without these specific details. In other examples, well-known structures and processes are not described in detail so as not to obscure the description of the present application with unnecessary details. Therefore, the present application is not intended to be limited to the illustrated embodiments, but is to be accorded the widest scope consistent with the principles and features disclosed herein. Unless otherwise specified, orientations such as parallel or perpendicular referred to in this application are not parallel or perpendicular in a strict sense, as long as the corresponding structure can achieve its purpose.
The present application provides a display driving method and device, and a display device, which will be described in detail below. It should be noted that the order in which the following embodiments are described is not intended to indicate an order of preference among the embodiments.
Please refer to
The display driving method includes the following steps:
Please refer to
Please refer to
The predetermined threshold may be set based on practical image quality requirements of the display device. Taking an 8K resolution (e.g., 7680×4320) as an example, the predetermined threshold may range from 4727808 to 525472. Specifically, the predetermined threshold may be set to 4976640.
Therefore, by obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color, comparing the Gaussian probability with the predetermined threshold, and determining whether to perform the viewing angle compensation on the to-be-displayed image based on the comparison result, the present application reduces the risk of crosstalk caused by the voltage drops of the coupling capacitance on the adjacent data lines 20 failing to cancel each other out.
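The threshold comparison described above can be sketched as follows (an illustrative Python sketch only, not part of the claimed embodiments; the function name and mode labels are hypothetical):

```python
def choose_polarity_mode(gaussian_probability, threshold):
    # At or above the predetermined threshold, viewing angle compensation
    # is applied: opposite polarities in adjacent columns of each grayscale
    # pixel group, plus row-symmetric polarities around each group.
    # Below the threshold, only opposite polarities in adjacent columns.
    if gaussian_probability >= threshold:
        return "compensated"           # label is illustrative
    return "column_inversion_only"     # label is illustrative
```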
Please refer to
Further, each grayscale pixel group 30 includes pixel units, each pixel unit includes a first sub-pixel 11, a second sub-pixel 12 and a third sub-pixel 13, and the sub-pixels 10 of each grayscale pixel group 30 along the row direction include a plurality of pixel units arranged in order. The first sub-pixel 11 is a red sub-pixel, the second sub-pixel 12 is a green sub-pixel, and the third sub-pixel 13 is a blue sub-pixel.
Specifically, in some embodiments, the sub-pixels 10 of each grayscale pixel group 30 along the row direction are arranged in an alternate manner with high grayscale and low grayscale. For example, each grayscale pixel group 30 includes sub-pixels 10 in a 2×6 matrix. The arrangement of the sub-pixels 10 of each grayscale pixel group 30 along the row direction may be as follows: “high grayscale, low grayscale, high grayscale, low grayscale, high grayscale and low grayscale”, or “low grayscale, high grayscale, low grayscale, high grayscale, low grayscale and high grayscale”.
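For illustration only (the function and its parameters are hypothetical, not taken from the embodiments), the alternating high/low arrangement of a 2×6 grayscale pixel group can be generated as:

```python
def grayscale_pattern(rows=2, cols=6, start_high=True):
    # Generate the alternating high/low grayscale arrangement of a
    # grayscale pixel group along the row direction; start_high selects
    # between the two example arrangements described above.
    pattern = []
    for _ in range(rows):
        row = ["high" if (c % 2 == 0) == start_high else "low"
               for c in range(cols)]
        pattern.append(row)
    return pattern
```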
Please refer to
Please refer to
Please refer to
Please refer to
Specifically, in the embodiments of the present application, related data in the to-be-displayed image are processed by the created Gaussian model to obtain the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color. In creating the Gaussian model, a plurality of preprocessed images can be selected from a relevant database, and these preprocessed images contain scenes corresponding to the preset scene. The types and the number of preset scenes can be set according to actual needs. In the image processing method in the embodiments of the present application, in creating the Gaussian model, the preset scene can be a portrait, blue sky, grass, food, an animal, any other natural scenery, a building, and etc. The preset color is a corresponding color in each scene. For example, in the case of a portrait scene, the preset color can be a skin color; in the case of a blue sky scene, the preset color can be a blue color; in the case of a grass scene, the preset color can be a green color.
In the embodiments of the present application, for example, the colors sensitive to human eyes and corresponding scenes are selected for illustrating the creation of the Gaussian model. In the embodiments of the present application, for the portrait, blue sky and grass selected as three preset scenes, their corresponding preset colors are skin color, blue and green, respectively. The following are illustrated by the afore-mentioned three preset scenes and their corresponding preset colors.
A plurality of first preprocessed images containing a preset portrait scene, a plurality of second preprocessed images containing a blue sky preset scene and a plurality of third preprocessed images containing a grass preset scene are selected from a database. The number of preprocessed images containing each of preset scenes may be set depending on various situations.
For any of the first preprocessed images, skin color data are extracted from the first preprocessed image, where a conventional extraction approach may be utilized for the extracting. After obtaining the skin color data of the first preprocessed image, the skin color data can be decomposed in Ycbcr space to obtain luminance data, first initial chroma data and second initial chroma data related to the skin color data.
For the skin color data of a plurality of first preprocessed images, a luminance dataset, a first initial chroma dataset and a second initial chroma dataset related to the skin color data can be obtained. The decomposition of the skin color data can be processed in the Ycbcr space, or in HSB color space. Similarly, the decomposition performed on other preset scenes with respect to other preset colors can be processed in the Ycbcr space, or in other color spaces. The following are illustrated by the processing in the Ycbcr space.
Specifically, the skin color data of the first preprocessed image can be processed using the following formulas:
yskin(i)=(R*0.2567+G*0.5041+B*0.0979)+16 (1)

cbskin(i)=(-R*0.1482-G*0.2909+B*0.4391)+128 (2)

crskin(i)=(R*0.4392-G*0.3678-B*0.0714)+128 (3)
where R, G and B are a red component, a green component and a blue component of the skin color data, respectively, yskin(i) is the luminance data of the skin color data, cbskin(i) is the first initial chroma data of the skin color data, and crskin(i) is the second initial chroma data of the skin color data.
The same processing as above is performed on a plurality of first preprocessed images to obtain a plurality of luminance data yskin to form a luminance dataset, to obtain a plurality of first initial chroma data cbskin to form a first initial chroma dataset cbskin(1), cbskin(2) . . . cbskin(i) . . . , and to obtain a plurality of second initial chroma data crskin to form a second initial chroma dataset crskin(1), crskin(2) . . . crskin(i) . . . .
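As an illustrative sketch only (the function name is hypothetical, and the negative signs on the chroma terms follow the standard BT.601 convention, which is an assumption about the intended formulas), the decomposition of formulas (1)–(3) can be expressed as:

```python
def rgb_to_ycbcr(r, g, b):
    # Decompose an RGB sample into luminance (y) and the first and
    # second chroma components (cb, cr), per formulas (1)-(3).
    y = 0.2567 * r + 0.5041 * g + 0.0979 * b + 16
    cb = -0.1482 * r - 0.2909 * g + 0.4391 * b + 128
    cr = 0.4392 * r - 0.3678 * g - 0.0714 * b + 128
    return y, cb, cr
```

For a neutral gray input the chroma components reduce to the 128 offset, which is a quick sanity check on the coefficients.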
The mean value of the first initial chroma dataset is calculated to obtain a first chroma mean value μskin1 of the skin color data, and the variance askin of the first initial chroma dataset with respect to the first chroma mean value μskin1 is calculated. Similarly, the mean value of the second initial chroma dataset is calculated to obtain a second chroma mean value μskin2 of the skin color data, and the variance dskin of the second initial chroma dataset with respect to the second chroma mean value μskin2 is calculated.
From the variances askin and dskin, the first initial chroma dataset cbskin(1), cbskin(2) . . . cbskin(i) . . . , and the second initial chroma dataset crskin(1), crskin(2) . . . crskin(i) . . . , the covariance matrix cov(cbskin,crskin) of the first initial chroma data and the second initial chroma data of the portrait scene with respect to the skin color is obtained, which can be expressed as follows:

cov(cbskin,crskin)=Σskin=[askin bskin; cskin dskin] (4)
where cbskin(i) is the first initial chroma data of any of the first preprocessed images, crskin(i) is the second initial chroma data of any of the first preprocessed images, μskin1 is the first chroma mean value of the skin color data of the plurality of first preprocessed images, μskin2 is the second chroma mean value of the skin color data of the plurality of first preprocessed images, askin is the variance of the skin color data of the first preprocessed images for each first initial chroma data with respect to the afore-mentioned first chroma mean value μskin1, dskin is the variance of the skin color data of the first preprocessed images for each second initial chroma data with respect to the afore-mentioned second chroma mean value μskin2, and bskin and cskin are the correlations between the first initial chroma dataset and the skin color of the first preprocessed images and between the second initial chroma dataset and the skin color of the first preprocessed images.
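The computation of the chroma means, variances and covariance matrix described above can be sketched as follows (an illustrative Python sketch; the function name and return convention are assumptions):

```python
import numpy as np

def chroma_covariance(cb_data, cr_data):
    # Build the 2x2 covariance matrix of the first (cb) and second (cr)
    # initial chroma datasets, together with the chroma mean values.
    cb = np.asarray(cb_data, dtype=float)
    cr = np.asarray(cr_data, dtype=float)
    mu1, mu2 = cb.mean(), cr.mean()           # first/second chroma means
    a = ((cb - mu1) ** 2).mean()              # variance of cb
    d = ((cr - mu2) ** 2).mean()              # variance of cr
    b = c = ((cb - mu1) * (cr - mu2)).mean()  # cross-correlation terms
    return np.array([[a, b], [c, d]]), mu1, mu2
```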
From formula (4), the inverse matrix cov−1(cbskin,crskin), also denoted Σskin−1, and the determinant |Σskin| of cov(cbskin,crskin) can be obtained. From the above parameters, a Gaussian model for the portrait scene with respect to the skin color can be created, which can be expressed as follows:

gaussskin(cbi,cri)=A*exp(−(1/2)*(x−μskin)T*Σskin−1*(x−μskin))/(2π*|Σskin|1/2), where x=(cbi,cri)T and μskin=(μskin1,μskin2)T (5)
where A is an amplitude of the Gaussian model in a domain [0, 1], gaussskin(cbi,cri) is the initial probability obtained from the Gaussian model for the portrait scene with respect to the skin color, askin is the variance of the skin color data of the first preprocessed images for each first initial chroma data with respect to the afore-mentioned first chroma mean value μskin1, dskin is the variance of the skin color data of the first preprocessed images for each second initial chroma data with respect to the afore-mentioned second chroma mean value μskin2, cbi is a first chroma variable related to the skin color, cri is a second chroma variable related to the skin color, Σskin−1 is the inverse matrix of cov(cbskin,crskin), and |Σskin| is the determinant of cov(cbskin,crskin).
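A bivariate Gaussian of the kind described above can be evaluated as follows (a sketch assuming a standard bivariate normal density scaled by the amplitude A; the exact normalization used in the embodiments may differ):

```python
import numpy as np

def gaussian_probability(cb_i, cr_i, mu, cov, amplitude=1.0):
    # Evaluate a bivariate Gaussian at the chroma pair (cb_i, cr_i).
    # mu = (mu1, mu2) are the chroma mean values; the inverse and
    # determinant of the 2x2 covariance matrix cov play the roles of
    # Sigma^-1 and |Sigma| in the model.
    d = np.array([cb_i - mu[0], cr_i - mu[1]])
    cov_inv = np.linalg.inv(cov)
    det = np.linalg.det(cov)
    return amplitude * np.exp(-0.5 * d @ cov_inv @ d) / (2 * np.pi * np.sqrt(det))
```

At the chroma mean with an identity covariance, the density reduces to 1/(2π), which serves as a quick check of the normalization.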
Likewise, for any of the second preprocessed images, blue color data are extracted from the second preprocessed image. After obtaining the blue color data of the second preprocessed image, the blue color data can be decomposed in Ycbcr space to obtain luminance data, first initial chroma data and second initial chroma data related to the blue color data. For the blue color data of a plurality of second preprocessed images, a luminance dataset, a first initial chroma dataset and a second initial chroma dataset related to the blue color data can be obtained.
Specifically, the blue color data of the second preprocessed image can be processed using the following formulas:
ysky(i)=(R*0.2567+G*0.5041+B*0.0979)+16 (6)

cbsky(i)=(-R*0.1482-G*0.2909+B*0.4391)+128 (7)

crsky(i)=(R*0.4392-G*0.3678-B*0.0714)+128 (8)
where R, G and B are a red component, a green component and a blue component of the blue color data, respectively, ysky(i) is the luminance data of the blue color data, cbsky(i) is the first initial chroma data of the blue color data, and crsky(i) is the second initial chroma data of the blue color data.
From the above data of the second preprocessed images, the covariance matrix cov(cbsky,crsky) of the first initial chroma data and the second initial chroma data of the blue sky scene with respect to the blue color can be obtained, and a corresponding Gaussian model can be created, which can be expressed as follows:

cov(cbsky,crsky)=Σsky=[asky bsky; csky dsky] (9)

gausssky(cbi,cri)=A*exp(−(1/2)*(x−μsky)T*Σsky−1*(x−μsky))/(2π*|Σsky|1/2), where x=(cbi,cri)T and μsky=(μsky1,μsky2)T (10)
In above formulas (9) and (10), cbsky(i) is the first initial chroma data of any of the second preprocessed images, crsky(i) is the second initial chroma data of any of the second preprocessed images, μsky1 is a first chroma mean value of the blue color data of the plurality of second preprocessed images, μsky2 is a second chroma mean value of the blue color data of the plurality of second preprocessed images, asky is the variance of the blue color data of the second preprocessed images for each first initial chroma data with respect to the afore-mentioned first chroma mean value μsky1, dsky is the variance of the blue color data of the second preprocessed images for each second initial chroma data with respect to the afore-mentioned second chroma mean value μsky2, bsky and csky are the correlations between the first initial chroma dataset and the blue color of the second preprocessed images and between the second initial chroma dataset and the blue color of the second preprocessed images; A is an amplitude of the Gaussian model in a domain [0, 1], gausssky(cbi,cri) is the initial probability obtained from the Gaussian model for the blue sky scene with respect to the blue color, cbi is a first chroma variable related to the blue color, cri is a second chroma variable related to the blue color, Σsky−1 is the inverse matrix of cov(cbsky,crsky), and |Σsky| is the determinant of cov(cbsky,crsky).
For any of the third preprocessed images, green color data are extracted from the third preprocessed image. After obtaining the green color data of the third preprocessed image, the green color data can be decomposed in Ycbcr space to obtain luminance data, first initial chroma data and second initial chroma data related to the green color data. For the green color data of a plurality of third preprocessed images, a luminance dataset, a first initial chroma dataset and a second initial chroma dataset related to the green color data can be obtained.
Specifically, the green color data of the third preprocessed image can be processed using the following formulas:
ygrass(i)=(R*0.2567+G*0.5041+B*0.0979)+16 (11)

cbgrass(i)=(-R*0.1482-G*0.2909+B*0.4391)+128 (12)

crgrass(i)=(R*0.4392-G*0.3678-B*0.0714)+128 (13)
where R, G and B are a red component, a green component and a blue component of the green color data, respectively, ygrass(i) is the luminance data of the green color data, cbgrass(i) is the first initial chroma data of the green color data, and crgrass(i) is the second initial chroma data of the green color data.
From the above data of the third preprocessed images, the covariance matrix cov(cbgrass,crgrass) of the first initial chroma data and the second initial chroma data of the grass scene with respect to the green color can be obtained, and a corresponding Gaussian model can be created, which can be expressed as follows:

cov(cbgrass,crgrass)=Σgrass=[agrass bgrass; cgrass dgrass] (14)

gaussgrass(cbi,cri)=A*exp(−(1/2)*(x−μgrass)T*Σgrass−1*(x−μgrass))/(2π*|Σgrass|1/2), where x=(cbi,cri)T and μgrass=(μgrass1,μgrass2)T (15)
In above formulas (14) and (15), cbgrass(i) is the first initial chroma data of any of the third preprocessed images, crgrass(i) is the second initial chroma data of any of the third preprocessed images, μgrass1 is a first chroma mean value of the green color data of the plurality of third preprocessed images, μgrass2 is a second chroma mean value of the green color data of the plurality of third preprocessed images, agrass is the variance of the green color data of the third preprocessed images for each first initial chroma data with respect to the afore-mentioned first chroma mean value μgrass1, dgrass is the variance of the green color data of the third preprocessed images for each second initial chroma data with respect to the afore-mentioned second chroma mean value μgrass2, bgrass and cgrass are the correlations between the first initial chroma dataset and the green color of the third preprocessed images and between the second initial chroma dataset and the green color of the third preprocessed images; A is an amplitude of the Gaussian model in a domain [0, 1], gaussgrass(cbi,cri) is the initial probability obtained from the Gaussian model for the grass scene with respect to the green color, cbi is a first chroma variable related to the green color, cri is a second chroma variable related to the green color, Σgrass−1 is the inverse matrix of cov(cbgrass,crgrass), and |Σgrass| is the determinant of cov(cbgrass,crgrass).
When the Gaussian model is built using the above method, the types and the number of preset scenes can be set according to actual needs, and the types and numbers of preset colors can be set accordingly. A Gaussian model can be created separately for each preset scene with respect to each of the preset colors. The composition of the Gaussian model can thus be flexibly adjusted based on application scenarios, customer needs, image quality requirements, etc. In addition, parameters such as the amplitude of the Gaussian model, the mean values for the preset colors in the preprocessed images, and the relevant covariance matrices can be adjusted on demand, or based on precision or other considerations, giving the method great practicality and versatility.
Specifically, in processing the to-be-displayed image, the data of the preset color in the to-be-displayed image are extracted first. For example, when a portrait scene, a blue sky scene and a grass scene in the to-be-displayed image are to be processed, skin color data, blue color data and green color data in the to-be-displayed image are extracted respectively and decomposed in the Ycbcr space, so as to obtain the first to-be-processed chroma dataset and the second to-be-processed chroma dataset related to the skin color, the first to-be-processed chroma dataset and the second to-be-processed chroma dataset related to the blue color, and the first to-be-processed chroma dataset and the second to-be-processed chroma dataset related to the green color.
After the first to-be-processed chroma dataset and the second to-be-processed chroma dataset of each preset color are obtained, chroma data of the first to-be-processed chroma dataset and the second to-be-processed chroma dataset with respect to each preset color can be inputted into the Gaussian model for a corresponding preset color to obtain an initial probability map associated with the preset color.
For example, the chroma data in the first to-be-processed chroma dataset and the second to-be-processed chroma dataset of the skin color are inputted to the afore-mentioned formula (5) so as to obtain an initial probability map associated with the skin color in the to-be-displayed image. Likewise, the chroma data in the first to-be-processed chroma dataset and the second to-be-processed chroma dataset of the blue color are inputted to the afore-mentioned formula (10) so as to obtain an initial probability map associated with the blue color in the to-be-displayed image. The chroma data in the first to-be-processed chroma dataset and the second to-be-processed chroma dataset of the green color are inputted to the afore-mentioned formula (15) so as to obtain an initial probability map associated with the green color in the to-be-displayed image.
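Applying a Gaussian model pixel-wise over the to-be-processed chroma data to obtain an initial probability map can be sketched as follows (illustrative only; the vectorised form and the names are assumptions, not taken from the specification):

```python
import numpy as np

def initial_probability_map(cb_map, cr_map, mu, cov, amplitude=1.0):
    # Evaluate the Gaussian model over per-pixel chroma maps, producing
    # an initial probability map of the same shape as the input maps.
    cov_inv = np.linalg.inv(cov)
    det = np.linalg.det(cov)
    dcb = np.asarray(cb_map, dtype=float) - mu[0]
    dcr = np.asarray(cr_map, dtype=float) - mu[1]
    # Quadratic form (x - mu)^T Sigma^-1 (x - mu), expanded per pixel.
    quad = (cov_inv[0, 0] * dcb**2
            + (cov_inv[0, 1] + cov_inv[1, 0]) * dcb * dcr
            + cov_inv[1, 1] * dcr**2)
    return amplitude * np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(det))
```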
Please refer to
Further, in some embodiments, there are multiple preset scenes and multiple preset colors.
For any one of the preset scenes, the initial probability of the preset color of the preset scene is corrected using the correlation coefficient.
The Gaussian probability of the multiple preset colors of the multiple preset scenes in the to-be-displayed image is obtained by calculating a sum of the initial probabilities, corrected using the correlation coefficients, of the multiple preset colors of the multiple preset scenes in the to-be-displayed image.
Specifically, in processing the to-be-displayed image, whether the to-be-displayed image contains the preset scene may be determined, and a coefficient value for a correlation to the preset scene is assigned based on a result of the determination. The correlation coefficient of a corresponding preset scene is corrected according to whether the to-be-displayed image contains a preset scene, and thus the Gaussian probability of the preset color of the preset scene obtained according to the Gaussian model can be adjusted. This can not only improve the efficiency of processing the to-be-displayed image but also improve the accuracy of color detection, thereby avoiding false detection of colors in other scenes that are similar to the preset color of the preset scene. In addition, since only the preset color of the preset scene in the to-be-displayed image is processed, this can effectively reduce a perception of image boxes and increase image quality at the time the image is outputted. A conventional approach may be utilized to determine whether the to-be-displayed image contains the preset scene.
Specifically, when multiple preset scenes and corresponding preset colors are set, the overall Gaussian probability of the preset colors of the multiple preset scenes can be obtained from a sum of the initial probabilities corrected by the correlation coefficients of the preset scenes, as given by the following formula:
gauss(cb, cr) = α*gauss_skin(cb_i, cr_i) + β*gauss_sky(cb_i, cr_i) + γ*gauss_grass(cb_i, cr_i)   (16)
In this formula, gauss(cb, cr) is the Gaussian probability of the preset colors of the preset scenes in the to-be-displayed image; α is the correlation coefficient of the portrait scene in the to-be-displayed image, and gauss_skin(cb_i, cr_i) is the initial probability related to the skin color obtained using the Gaussian model; β is the correlation coefficient of the blue sky scene in the to-be-displayed image, and gauss_sky(cb_i, cr_i) is the initial probability related to the blue color obtained using the Gaussian model; γ is the correlation coefficient of the grass scene in the to-be-displayed image, and gauss_grass(cb_i, cr_i) is the initial probability related to the green color obtained using the Gaussian model.
When there is no corresponding preset scene in the to-be-displayed image, the corresponding correlation coefficient can be assigned a value of 0, so that its product with the initial probability of the preset color of the preset scene obtained by fitting with the Gaussian model is 0, thereby avoiding false detection of similar colors in the to-be-displayed image.
For example, when the to-be-displayed image contains a portrait scene, the correlation coefficient α associated with the portrait scene is assigned a value of 1; when there is no portrait scene in the to-be-displayed image, the correlation coefficient α associated with the portrait scene is assigned a value of 0. Likewise, when the to-be-displayed image contains a blue sky scene, the correlation coefficient β associated with the blue sky scene is assigned a value of 1; when there is no blue sky scene in the to-be-displayed image, the correlation coefficient β associated with the blue sky scene is assigned a value of 0. When the to-be-displayed image contains a grass scene, the correlation coefficient γ associated with the grass scene is assigned a value of 1; when there is no grass scene in the to-be-displayed image, the correlation coefficient γ associated with the grass scene is assigned a value of 0.
For example, when the to-be-displayed image contains a portrait scene but does not contain a blue sky scene or a grass scene, the resulting Gaussian probability is gauss(cb, cr) = gauss_skin(cb_i, cr_i) after the color data of the to-be-displayed image is fitted using the Gaussian model with a correction using the correlation coefficients of the corresponding preset scenes, in which the Gaussian fitting probability associated with the blue sky scene and the grass scene is 0. When the to-be-displayed image contains a portrait scene and a blue sky scene and does not contain a grass scene, the resulting Gaussian probability is gauss(cb, cr) = gauss_skin(cb_i, cr_i) + gauss_sky(cb_i, cr_i) after the color data of the to-be-displayed image is fitted using the Gaussian model with a correction using the correlation coefficients of the corresponding preset scenes. When the to-be-displayed image contains a portrait scene, a blue sky scene and a grass scene at the same time, the resulting Gaussian probability is gauss(cb, cr) = gauss_skin(cb_i, cr_i) + gauss_sky(cb_i, cr_i) + gauss_grass(cb_i, cr_i) after the color data of the to-be-displayed image is fitted using the Gaussian model with a correction using the correlation coefficients of the corresponding preset scenes. Please refer to portions (a) and (b) in
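The combination of formula (16) with the 0/1 correlation coefficients of the examples above can be sketched as follows; the function name, the dictionary names and the probability values are hypothetical, chosen only to illustrate the weighting.

```python
def combined_gaussian_probability(initial_probs, scene_flags):
    """Formula (16): sum the per-color initial probabilities,
    each weighted by the correlation coefficient of its scene.

    initial_probs : dict mapping scene -> initial probability of its
                    preset color (gauss_skin, gauss_sky, gauss_grass).
    scene_flags   : dict mapping scene -> 1 if the to-be-displayed
                    image contains that scene, else 0.
    """
    return sum(scene_flags[s] * p for s, p in initial_probs.items())

# Image containing a portrait and a blue sky, but no grass:
probs = {"portrait": 0.8, "sky": 0.6, "grass": 0.5}
flags = {"portrait": 1, "sky": 1, "grass": 0}
gauss = combined_gaussian_probability(probs, flags)  # 0.8 + 0.6 = 1.4
```

Setting a scene's flag to 0 zeroes out its term exactly as described above, so a green region in an image without a grass scene contributes nothing to the combined probability.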
Please refer to
The display driving device includes:
By obtaining the Gaussian probability of the preset scene in the to-be-displayed image with respect to the preset color, comparing the Gaussian probability with the predetermined threshold, and determining whether to perform the viewing angle compensation on the to-be-displayed image based on the comparison result, the present application reduces the risk of crosstalk caused by the fact that the voltage drops of the coupling capacitance on the adjacent data lines 20 cannot cancel each other out.
The embodiments of the present application further provide a display device 100, including a processor, a storage and a computer program stored in the storage and executable on the processor, wherein the processor executes the computer program to realize the steps of the afore-described display driving method.
Specifically, in the embodiments of the present application, the processor may be a Central Processing Unit (CPU). The processor may also be any other general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or any other programmable logic device, a discrete gate, a transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Further, it should be understood that the storage in the embodiments of the present application may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM),
an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as a Static RAM (SRAM), a Dynamic RAM (DRAM), a Synchronous DRAM (SDRAM), a Double Data Rate SDRAM (DDR SDRAM), an Enhanced SDRAM (ESDRAM), a Synchlink DRAM (SLDRAM) and a Direct Rambus RAM (DRRAM).
The display driving method and device, and the display device 100 provided in the embodiments of the present application are described in detail above. The principle and implementation of the present application are described herein through specific examples. The description about the embodiments of the present application is merely provided to help understanding the method and core ideas of the present application. In addition, persons of ordinary skill in the art can make variations and modifications to the present application in terms of the specific implementations and application scopes according to the ideas of the present application. Therefore, the content of specification shall not be construed as a limit to the present application.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2022110677406.8 | Jun 2022 | CN | national |

| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/CN2022/102580 | 6/30/2022 | WO | |