The present disclosure relates to a display device, a display method, and a computer program, and more particularly, to a display device, a display method, and a computer program that are suitably applied when a stereoscopic three-dimensional image is displayed.
There are display devices that allow a viewer to perceive an image displayed on a screen as a stereoscopic three-dimensional image. In order for the viewer to perceive the image as a stereoscopic three-dimensional image, it is necessary to display the image on the screen using a display method different from a general display method. One example is a method of allowing the viewer to perceive a stereoscopic image by changing the polarization state between an image for a right eye and an image for a left eye (for example, see Japanese Unexamined Patent Application Publication No. 10-63199). By changing the polarization state between the image for the right eye and the image for the left eye, and by wearing glasses whose polarization state differs between the left and right sides so that the image for the right eye is viewed by the right eye and the image for the left eye is viewed by the left eye, the viewer may perceive the image displayed on the screen as a stereoscopic three-dimensional image.
In order for the viewer to perceive the image as a stereoscopic three-dimensional image, the image for the right eye and the image for the left eye are usually captured with two cameras and the captured images are displayed on the display device. Further, when the three-dimensional image is captured using two cameras, it is necessary to unify the settings of the two cameras, such as the type of lens, the diaphragm, and the characteristics of the image pick-up device, so as to create left and right images without a luminance difference or a color difference.
However, when the settings of the two cameras differ and a luminance difference or a color difference arises between the two captured images, flickering may be seen in a display device that presents the three-dimensional image by alternately displaying the left and right images, so that image quality is deteriorated and visibility, or the like, is adversely affected.
In order to prevent the flickering, a method of synchronizing the focuses, the diaphragms, and the gains of the image pick-up devices between two cameras has been disclosed (for example, see Japanese Unexamined Patent Application Publication No. 8-242468), but the method has a problem in that a dedicated camera increases the necessary costs. Further, analyzing three-dimensional images that are actually broadcast shows that the luminances of the left and right images may differ from each other and that the luminances or the contrasts of the two cameras may not have been adjusted. For example, there may be an image having a difference of about 4% in the average luminance of the entire screen.
It is therefore desirable to suppress the occurrence of flickering when displaying a three-dimensional image by correcting a difference between the image for the right eye and the image for the left eye in a case in which such a difference occurs.
According to an embodiment of the present disclosure, there is provided a display device including: a first measurement unit measuring information on luminance of a first image signal to output a first measurement result; a second measurement unit measuring information on a luminance of a second image signal to output a second measurement result; a comparator comparing the first measurement result with the second measurement result to output differential data; a correction amount determination unit determining a correction amount for the first image signal and/or the second image signal based on the differential data; and a correction unit correcting the luminance of the first image signal and/or the second image signal based on the correction amount.
The first measurement unit and the second measurement unit may measure the information on colors of the first image signal and the second image signal to output the first measurement result and the second measurement result.
The first measurement unit and the second measurement unit may divide the first image signal and the second image signal into a plurality of areas to perform the measurement on each area.
The correction amount determination unit may determine the correction amount for only the area in which the first measurement result and the second measurement result are a predetermined threshold value or more.
The correction amount determination unit may determine the correction amount for only an area of a central portion in the plurality of areas. Further, the correction amount determination unit may determine the correction amount for only the area in which the first measurement result and the second measurement result are a predetermined threshold value or more.
The comparator may output a difference square sum of the first measurement result and the second measurement result as the differential data.
The correction amount determination unit may determine the correction amount in response to the contents of the image displayed by the first image signal and the second image signal.
The display device may further include a display unit displaying the three-dimensional image based on the corrected first image signal and second image signal.
The first measurement unit and the second measurement unit may apply weighting to information on a black side of the information on the measured luminance to output the first measurement result and the second measurement result.
According to an embodiment of the present disclosure, there is provided a display method including: measuring information on luminance of a first image signal to output a first measurement result; measuring information on a luminance of a second image signal to output a second measurement result; comparing the first measurement result with the second measurement result to output differential data; determining a correction amount for the first image signal and/or the second image signal based on the differential data; and correcting the luminance of the first image signal and/or the second image signal based on the correction amount.
According to an embodiment of the present disclosure, there is provided a computer program allowing a computer to execute: measuring information on luminance of a first image signal to output a first measurement result; measuring information on a luminance of a second image signal to output a second measurement result; comparing the first measurement result with the second measurement result to output differential data; determining a correction amount for the first image signal and/or the second image signal based on the differential data; and correcting the luminance of the first image signal and/or the second image signal based on the correction amount.
As set forth above, the embodiment of the present disclosure measures the information on the luminance of the first image signal, measures the information on the luminance of the second image signal, compares the first measurement result and the second measurement result to output the differential data, determines the correction amount for the first image signal and/or the second image signal based on the differential data, and corrects the luminance of the first image signal and/or the second image signal based on the correction amount.
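For illustration only, the overall flow summarized above can be pictured with the following minimal sketch in Python (using NumPy). The function names, the use of the mean luminance as the measured information, and the simple splitting of the difference between both images are assumptions made for the sketch and are not the specific implementation of the embodiment.

```python
import numpy as np

def measure(image):
    """Measurement unit: return information on the luminance of one image signal."""
    return float(np.mean(image))          # here: the average luminance of the frame

def compare(first_result, second_result):
    """Comparator: output differential data between the two measurement results."""
    return second_result - first_result

def determine_correction(diff):
    """Correction amount determination unit: derive a correction amount from the differential data."""
    return diff / 2.0                     # split the difference between both images (assumption)

def correct(image, amount):
    """Correction unit: apply the correction amount to the luminance of the image signal."""
    return np.clip(image + amount, 0, 255)

# Example: the second (right-eye) image is slightly brighter than the first (left-eye) image.
left = np.full((4, 4), 100.0)
right = np.full((4, 4), 104.0)
diff = compare(measure(left), measure(right))         # +4
amount = determine_correction(diff)                   # +2
left_corrected = correct(left, +amount)               # brighten the left image
right_corrected = correct(right, -amount)             # darken the right image
print(measure(left_corrected), measure(right_corrected))   # 102.0 102.0
```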
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Further, in the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and a repetitive description thereof will be omitted.
Further, a description will be made in the following order.
[1-1. Configuration of Display Device According to Embodiment of Present Disclosure]
[1-2. Functional Configuration of Display Device According to Embodiment of Present Disclosure]
[1-3. Configuration of Image Signal Controller]
[1-4. Configuration of Comparator]
[1-5. Image Correction Method]
[2-1. Configuration of Image Signal Controller]
[2-2. Image Correction Method]
[3-1. Configuration of Image Signal Controller]
Hereinafter, a configuration of a display device according to an embodiment of the present disclosure will be described. First, an appearance of the display device according to the embodiment of the present disclosure will be described.
The display device 100 illustrated in
Although the configuration of the image display unit 110 is described in detail below, briefly describing it here, the image display unit 110 is configured to include a light source, a liquid crystal panel, and a pair of polarizers with the liquid crystal panel interposed therebetween. Light from the light source becomes light polarized in a predetermined direction by passing through the liquid crystal panel and the polarizers.
The shutter glasses 200 are configured to include an image transmitting unit 212 for a right eye and an image transmitting unit 214 for a left eye including, for example, a liquid crystal shutter. The shutter glasses 200 execute an opening and closing operation of the image transmitting unit 212 for the right eye and the image transmitting unit 214 for the left eye in response to a signal transmitted from the display device 100. The observer may see light emitted from the image display unit 110 through the image transmitting unit 212 for the right eye and the image transmitting unit 214 for the left eye of the shutter glasses 200 to perceive the image displayed on the image display unit 110 as the stereoscopic image.
Meanwhile, when a general image is displayed on the image display unit 110, the observer may see the light emitted from the image display unit 110 as it is and perceive it as a general image.
Further, although
As described above, the appearance of the display device 100 according to the embodiment of the present disclosure was described. Next, a functional configuration of the display device 100 according to the embodiment of the present disclosure will be described.
As shown in
The image display unit 110 displays the image as described above and, when a signal is applied from the outside, displays the image in response to the applied signal. The image display unit 110 is configured to include a display panel 112, a gate driver 113, a data driver 114, and a backlight 115.
The display panel 112 displays the image in response to the application of the signal from the outside. The display panel 112 displays the image by sequentially scanning a plurality of scanning lines. In the display panel 112, liquid crystal molecules having a predetermined alignment state are sealed between transparent plates such as glass plates. A driving method of the display panel 112 may be a TN (twisted nematic) method, a VA (vertical alignment) method, or an IPS (in-plane switching) method.
In the following description, although the driving method of the display panel 112 is described as the TN method unless particularly mentioned, it goes without saying that the embodiment of the present disclosure is not limited to the above example. Further, the display panel 112 according to the embodiment of the present disclosure is a display panel that may perform the rewriting of the screen at a high-speed frame rate (for example, 240 Hz). The embodiment of the present disclosure may alternately display the image for the right eye and the image for the left eye on the display panel 112 at a predetermined timing to allow the observer to perceive the images as the stereoscopic image.
The gate driver 113 is a driver for driving gate bus lines (not shown) of the display panel 112. The gate driver 113 receives a signal from the timing controller 140 and outputs a signal to the gate bus lines in response to the received signal.
The data driver 114 is a driver that generates a signal to be applied to data lines (not shown) of the display panel 112. The data driver 114 receives a signal from the timing controller 140 and generates and outputs the signal to be applied to the data lines in response to the received signal.
The backlight 115 is installed at the innermost position of the image display unit 110 when viewed from the observer side. When the image is displayed on the image display unit 110, unpolarized white light from the backlight 115 is emitted toward the display panel 112 positioned on the observer side. As the backlight 115, for example, a light emitting diode or a cold cathode tube may be used. Further, although
When the image signal controller 120 receives an image signal from the outside, it performs a variety of signal processing on the received image signal so that the signal becomes suitable for being displayed as the three-dimensional image on the image display unit 110, and outputs the processed signal. The image signal subjected to the signal processing in the image signal controller 120 is transmitted to the timing controller 140. Further, when the signal processing is performed in the image signal controller 120, a predetermined signal is transmitted to the shutter controller 130 in response to the signal processing. The signal processing in the image signal controller 120 may include the following example.
When the image signal (the image signal for the right eye) for displaying the image for the right eye on the image display unit 110 and the image signal (the image signal for the left eye) for displaying the image for the left eye on the image display unit 110 are transmitted to the image signal controller 120, the image signal controller 120 generates the image signal for the three-dimensional image from the two image signals. In the embodiment of the present disclosure, the image signal controller 120 generates the image signal to be displayed on the display panel 112 in the order of the image for the right eye→the image for the right eye→the image for the left eye→the image for the left eye→the image for the right eye→the image for the right eye→ . . . from the image signal for the right eye and the image signal for the left eye that are input.
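For illustration, the frame interleaving described above can be sketched as follows; representing frames as list elements and simply repeating each eye's frame twice are assumptions made only to show the output order.

```python
def interleave_stereo(right_frames, left_frames):
    """Produce the display order R, R, L, L, R, R, ... from the two input signals."""
    sequence = []
    for r, l in zip(right_frames, left_frames):
        sequence.extend([r, r, l, l])   # each eye's image is displayed twice in a row
    return sequence

print(interleave_stereo(["R0", "R1"], ["L0", "L1"]))
# ['R0', 'R0', 'L0', 'L0', 'R1', 'R1', 'L1', 'L1']
```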
Further, when a color difference occurs between the image for the right eye and the image for the left eye, the image signal controller 120 performs color correction processing that unifies the colors by removing the color difference. The configuration of the image signal controller 120 and the color correction processing will be described below.
The shutter controller 130 receives the predetermined signal generated in response to the signal processing in the image signal controller 120 and generates a shutter control signal controlling the shutter operation of the shutter glasses 200 in response to that signal. The shutter glasses 200 perform the opening and closing operation of the image transmitting unit 212 for the right eye and the image transmitting unit 214 for the left eye based on the shutter control signal generated in the shutter controller 130 and emitted from the infrared emitter 150.
The timing controller 140 generates a pulse signal used for the operation of the gate driver 113 and the data driver 114 in response to the signal transmitted from the image signal controller 120. When the timing controller 140 generates the pulse signal and the gate driver 113 and the data driver 114 receive the generated pulse signal, the image corresponding to the signal transmitted from the image signal controller 120 is displayed on the display panel 112.
Further, the timing controller 140 performs the predetermined signal processing when generating the pulse signal used for the operation of the gate driver 113 and the data driver 114. The timing controller 140 is an example of a driving compensator of the embodiment of the present disclosure. Crosstalk may be improved for a period in which the shutters of the shutter glasses 200 are opened by the predetermined signal processing in the timing controller 140. The predetermined signal processing in the timing controller 140 will be described below in detail.
As described above, the functional configuration of the display device 100 according to the embodiment of the present disclosure was described with reference to
As shown in
The left eye image measurement unit 121a measures a color difference (Cb and Cr) average, color difference (Cb and Cr) dispersion, and Hue histogram of the image signal for the left eye. The left eye image measurement unit 121a transmits the information on the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram, which are measured, to the comparator 122. In addition, the image signal (original image signal) for the left eye that is used for the measurement is transmitted to the left eye image correction unit 124a from the left eye image measurement unit 121a.
The right eye image measurement unit 121b measures the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram of the image signal for the right eye, similar to the left eye image measurement unit 121a. The right eye image measurement unit 121b transmits the information on the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram, which are measured, to the comparator 122. Further, the image signal (original image signal) for the right eye that is used for the measurement is transmitted to the right eye image correction unit 124b from the right eye image measurement unit 121b.
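A per-frame measurement of this kind might look like the following sketch in Python (NumPy). Deriving the Hue angle from the Cb and Cr components with arctan2, the 8-bit chroma offset of 128, and the 32-bin histogram are assumptions made only to show the shape of the measurement result.

```python
import numpy as np

def measure_color(cb, cr, bins=32):
    """Measure the color difference (Cb and Cr) average, the color difference dispersion,
    and a Hue histogram of one image signal. cb and cr are 2-D arrays of chroma samples."""
    result = {
        "cb_avg": float(np.mean(cb)),
        "cr_avg": float(np.mean(cr)),
        "cb_disp": float(np.var(cb)),
        "cr_disp": float(np.var(cr)),
    }
    hue = np.degrees(np.arctan2(cr - 128.0, cb - 128.0)) % 360.0   # hue angle in the chroma plane
    result["hue_hist"], _ = np.histogram(hue, bins=bins, range=(0.0, 360.0))
    return result
```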
The comparator 122 compares the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the left eye image measurement unit 121a with the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the right eye image measurement unit 121b to generate the differential data between the image signal for the left eye and the image signal for the right eye. The differential data generated in the comparator 122 is transmitted to the correction amount determination unit 123.
The correction amount determination unit 123 determines the correction amount using the differential data transmitted from the comparator 122, that is, the data generated by comparing the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the left eye image measurement unit 121a with the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the right eye image measurement unit 121b. When determining the correction amount, the correction amount determination unit 123 may calculate the correction amount from the differential data, may determine the correction amount by referring to a lookup table using the differential data, or may determine the correction amount by other methods. The information on the correction amount determined by the correction amount determination unit 123 is transmitted to the left eye image correction unit 124a and the right eye image correction unit 124b.
The correction amount determination unit 123 may obtain the correction amount, for example, from the measurement result of the entire image, or may obtain it by dividing the image into a plurality of blocks and weighting the value of a specific block. When the correction amount is obtained by dividing the image into the plurality of blocks, attention is paid to a background portion, which usually has a small left-right difference, in consideration of the fact that the illumination of the object of interest within the image differs between the left and right sides. The correction amount determination unit 123 determines the correction amount so that the left and right differences of the background area become small, on the assumption that the difference of the background area represents the left and right difference of the entire image. Whether or not an area is the background area is determined by using the luminance dispersion: in the image, an area having a small dispersion, or an area having a dispersion smaller than a threshold value, may be regarded as the background area. The determination of the background area may also use the luminance data of the image.
As described above, the left eye image measurement unit 121a (or the right eye image measurement unit 121b) obtains the luminance dispersion and the color difference dispersion by dividing the image into blocks, and the correction amount determination unit 123 may exclude the blocks having a value less than a predetermined threshold value from the calculation of the correction amount and perform the calculation of the correction amount only on the blocks having a value equal to or more than the predetermined threshold value.
For example, when the block having the luminance dispersion less than 3000 is excluded from the object of the correction amount calculation, in the above Table 1, a fourth block, a fifth block, a seventh block, a twelfth block, a twenty-second block, and a twenty-fifth block become the blocks which are excluded from the object of the correction amount calculation.
Further, for example, when the block having the color difference (Cb) dispersion less than 20 is excluded from the object of the correction amount calculation, in the above Table 1, a first block to a fifth block, a seventh block, a twelfth block, a seventeenth block, a twentieth block, a twenty-second block, and a twenty-fifth block become the blocks which are excluded from the object of the correction amount calculation.
Further, for example, when the block having the color difference (Cr) dispersion less than 20 is excluded from the object of the correction amount calculation, in the above Table 1, a first block to a seventh block, a twelfth block, a seventeenth block, and a twenty-fifth block become the blocks which are excluded from the object of the correction amount calculation.
Further, when both the luminance dispersion and the color difference dispersion are obtained, the blocks in which any one of the luminance dispersion and the color difference (Cb and Cr) dispersions is less than its threshold value may be excluded from the object of the correction amount calculation, or the blocks in which all of the luminance dispersion and the color difference (Cb and Cr) dispersions are less than their threshold values may be excluded from the object of the correction amount calculation.
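As an illustration of this block selection, the following sketch divides a frame into 5 x 5 blocks (matching the twenty-five blocks of the example above) and keeps only the blocks whose luminance dispersion and color difference dispersions reach the threshold values of 3000 and 20 used in the example; the grid size and the interface are otherwise assumptions.

```python
import numpy as np

def blocks(image, grid=(5, 5)):
    """Split a 2-D array into grid[0] x grid[1] blocks (any remainder rows/columns are dropped)."""
    h, w = image.shape
    bh, bw = h // grid[0], w // grid[1]
    for i in range(grid[0]):
        for j in range(grid[1]):
            yield i * grid[1] + j, image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]

def selected_blocks(y, cb, cr, y_thresh=3000.0, c_thresh=20.0):
    """Return the indices of blocks whose luminance and Cb/Cr dispersions are all at or above
    the thresholds; only these blocks enter the correction amount calculation."""
    keep = []
    for (idx, yb), (_, cbb), (_, crb) in zip(blocks(y), blocks(cb), blocks(cr)):
        if np.var(yb) >= y_thresh and np.var(cbb) >= c_thresh and np.var(crb) >= c_thresh:
            keep.append(idx)
    return keep
```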
Various methods may be adopted for the calculation of the correction amount in the correction amount determination unit 123. For example, the correction amount may be determined so as to uniformly apply a bias to each pixel, or the coefficients of a gamma curve may be adjusted in order to obtain a correction amount in response to the color difference and the Hue of each pixel. Further, for example, when the method referring to a look-up table is used, correction amounts for the color difference and the Hue are held in the table, and the correction amount for the color difference and the Hue may be an amount obtained by multiplying an entry of the table by a predetermined gain.
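The two approaches mentioned above, a uniform bias (or gamma adjustment) and a gain-multiplied look-up table, can be sketched as follows; the table contents, the gain of 0.5, and the nearest-entry lookup are placeholder assumptions.

```python
import numpy as np

def apply_bias(channel, bias):
    """Uniformly apply a bias (the determined correction amount) to every pixel."""
    return np.clip(channel + bias, 0, 255)

def apply_gamma(channel, gamma):
    """Adjust the gamma curve coefficient instead of adding a bias."""
    normalized = np.clip(channel, 0, 255) / 255.0
    return (normalized ** gamma) * 255.0

# Hypothetical look-up table mapping a color difference (Cb) gap to a base correction amount,
# which is then multiplied by a predetermined gain.
CB_LUT = [(-10.0, +4.0), (0.0, 0.0), (+10.0, -4.0)]

def correction_from_lut(cb_diff, gain=0.5):
    """Pick the nearest table entry and multiply it by the predetermined gain."""
    base = min(CB_LUT, key=lambda entry: abs(entry[0] - cb_diff))[1]
    return base * gain
```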
The left eye image correction unit 124a performs the color correction processing on the image for the left eye based on the correction amount determined by the correction amount determination unit 123. Similarly, the right eye image correction unit 124b performs the color correction processing on the image for the right eye based on the correction amount determined by the correction amount determination unit 123. Further, since it may be very difficult to fully match the colors of the image for the left eye and the image for the right eye, in the embodiment of the present disclosure, the color correction processing is performed in the left eye image correction unit 124a and the right eye image correction unit 124b so that the difference between the image for the left eye and the image for the right eye is smaller than the threshold value.
In the display device 100 according to the embodiment of the present disclosure, when comparison of the image for the left eye and the image for the right eye shows a color difference between the two images, either one of the image for the left eye and the image for the right eye may be adopted as a reference and the other image may be corrected so as to match the colors of the reference image, or both of the images may be corrected so as to form an intermediate color between the image for the left eye and the image for the right eye.
As described above, the configuration of the image signal controller 120 according to the embodiment of the present disclosure was described with reference to
The difference square sum calculator 126 compares the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the left eye image measurement unit 121a with the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram that are measured by the right eye image measurement unit 121b to calculate the difference square sum therebetween. The difference square sum calculated by the difference square sum calculator 126 is transmitted to the correction amount determination unit 123 as the differential data.
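The differential data output by the difference square sum calculator 126 can be illustrated with the short sketch below; representing each measurement result as a flat numeric vector (for example, the concatenated averages, dispersions, and histogram bins) is an assumption for the illustration.

```python
import numpy as np

def difference_square_sum(left_result, right_result):
    """Sum of squared differences between two measurement result vectors."""
    l = np.asarray(left_result, dtype=float)
    r = np.asarray(right_result, dtype=float)
    return float(np.sum((l - r) ** 2))

print(difference_square_sum([120.0, 131.0], [118.0, 134.0]))   # 2**2 + 3**2 = 13.0
```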
Next, the image correction method by the display device 100 according to the embodiment of the present disclosure will be described.
In the display device 100 according to the embodiment of the present disclosure, in order to perform the correction so that the color of the image for the right eye matches the color of the image for the left eye, the left eye image measurement unit 121a and the right eye image measurement unit 121b measure the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram of the image for the left eye and the image for the right eye, respectively (step S101).
When the left eye image measurement unit 121a and the right eye image measurement unit 121b measure the color difference (Cb and Cr) average, the color difference (Cb and Cr) dispersion, and the Hue histogram of the image for the left eye and the image for the right eye, respectively, the comparator 122 receives the measurement values from the left eye image measurement unit 121a and the right eye image measurement unit 121b to calculate the differential data of the measurement values (step S102). The differential data may be obtained by simply calculating the differences between the color difference (Cb and Cr) averages, the color difference (Cb and Cr) dispersions, and the Hue histograms of the image for the left eye and the image for the right eye, or may be obtained by calculating the difference square sum thereof.
When the differential data of the measurement values are calculated in the comparator 122, the correction amount for the image for the left eye or the image for the right eye is determined in the correction amount determination unit 123 based on the differential data calculated by the comparator 122 (step S103). Further, as described above, when the correction amount is determined, the correction amount may be obtained from the measurement results of the entire image or may be obtained by dividing the image into the plurality of blocks and weighting the value of a specific block. Further, as described above, when the correction amount is determined in the correction amount determination unit 123, the correction amount may be determined by uniformly applying a bias to each pixel, or the coefficients of the gamma curve may be adjusted in order to obtain the correction amount in response to the color difference and the Hue of each pixel. Further, for example, when the correction amount determination unit 123 uses the method referring to the look-up table, the correction amount for the color difference and the Hue is held in the table and the correction amount for the color difference and the Hue may be an amount obtained by multiplying an entry of the table by a predetermined gain.
When the correction amount for the image for the left eye or the image for the right eye is determined in the correction amount determination unit 123, the left eye image correction unit 124a and the right eye image correction unit 124b perform the color correction processing on the image for the left eye or the image for the right eye based on the correction amount determined by the correction amount determination unit 123 (step S104). As described above, in the embodiment of the present disclosure, when comparison of the image for the left eye and the image for the right eye shows a color difference between the two images, either one of the image for the left eye and the image for the right eye may be adopted as a reference and the other image may be corrected so as to match the colors of the reference image, or both of the images may be corrected so as to form an intermediate color between the image for the left eye and the image for the right eye.
As described above, the image correction method by the display device 100 according to the embodiment of the present disclosure was described with reference to
First, similarly to the processing shown in
When the differential data of the measurement value are calculated in the comparator 122, subsequently, it is determined in the correction amount determination unit 123 whether the value of the calculated differential data is equal to or larger than a predetermined threshold value or not (step S113). If it is determined that the value of the calculated differential data is the predetermined threshold value or more, the correction amount determination unit 123 determines the correction amount for the image for the left eye or the image for the right eye based on the differential data calculated by the comparator 122 (step S114).
In this case, when the correction amount for the image for the left eye or the image for the right eye is determined in the correction amount determination unit 123, the left eye image correction unit 124a and the right eye image correction unit 124b perform the color correction processing on the image for the left eye or the image for the right eye based on the correction amount determined by the correction amount determination unit 123 (step S115). When the color correction processing is performed in the left eye image correction unit 124a and the right eye image correction unit 124b, the process returns to the above step S112, and the comparator 122 again calculates the differential data from the color difference average, the color difference dispersion, and the Hue histogram measured for the image for the left eye and the image for the right eye.
Meanwhile, at step S113, if the value of the differential data calculated by the comparator 122 is less than the predetermined threshold value, the process ends in this state.
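The loop of steps S111 to S115 can be pictured with the sketch below; using the gap between the mean values as the differential data and splitting the correction equally between both images are assumptions made only to show the repeat-until-below-threshold structure.

```python
import numpy as np

def iterative_correction(left, right, threshold=1.0, max_passes=8):
    """Repeat measure -> compare -> correct until the differential data falls below the threshold."""
    left = left.astype(float)
    right = right.astype(float)
    for _ in range(max_passes):
        diff = np.mean(right) - np.mean(left)   # steps S111 and S112: measure and compare
        if abs(diff) < threshold:               # step S113: below the threshold, so stop
            break
        left += diff / 2.0                      # steps S114 and S115: determine the correction
        right -= diff / 2.0                     # amount and correct both images
    return left, right
```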
As described above, when the correction processing is performed multiple times with reference to
By correcting the image for the right eye and the image for the left eye as described above, it is not necessary to control and synchronize the cameras when capturing the three-dimensional image, an improvement of the image quality may be expected due to the reduction in the flickering between the left and right images, and the display device may generate an image that is easily viewed stereoscopically owing to the reduced flickering. Further, by dividing the image into the plurality of blocks to calculate the correction amount, the color of the object of interest may be maintained in the image when the user performs the stereoscopic viewing.
In addition, in the above description, although the color difference average, the color difference dispersion, and the Hue histogram of the image for the left eye and the image for the right eye are measured to calculate the differential data of the measurement results, the occurrence of flickering during stereoscopic viewing may also be suppressed by measuring only the luminance histogram of the image for the left eye and the image for the right eye. In the following description, as a modified example of the embodiment of the present disclosure, a display device that suppresses the occurrence of the flickering by measuring the luminance histogram of the image for the left eye and the image for the right eye and calculating the differential data will be described.
As shown in
The left eye image measurement unit 221a measures the luminance average, the luminance dispersion, and the luminance histogram of the image signal for the left eye. The information on the luminance average, the luminance dispersion, and the luminance histogram that are measured by the left eye image measurement unit 221a is transmitted to the comparator 222. Further, the image signal (original image signal) for the left eye that is used for the measurement is transmitted to the left eye image correction unit 224a from the left eye image measurement unit 221a.
The right eye image measurement unit 221b measures the luminance average, the luminance dispersion, and the luminance histogram of the image signal for the right eye, similarly to the left eye image measurement unit 221a. The information on the luminance average, the luminance dispersion, and the luminance histogram that are measured by the right eye image measurement unit 221b is transmitted to the comparator 222. Further, the image signal (original image signal) for the right eye that is used for the measurement is transmitted to the right eye image correction unit 224b from the right eye image measurement unit 221b.
The comparator 222 compares the luminance average, the luminance dispersion, and the luminance histogram that are measured by the left eye image measurement unit 221a with the luminance average, the luminance dispersion, and the luminance histogram that are measured by the right eye image measurement unit 221b to generate the differential data between the image signal for the left eye and the image signal for the right eye. The differential data generated in the comparator 222 is transmitted to the correction amount determination unit 223.
The correction amount determination unit 223 determines the correction amount using the differential data transmitted from the comparator 222, that is, the data generated by comparing the luminance average, the luminance dispersion, and the luminance histogram that are measured by the left eye image measurement unit 221a with the luminance average, the luminance dispersion, and the luminance histogram that are measured by the right eye image measurement unit 221b. When determining the correction amount, the correction amount determination unit 223 may calculate the correction amount from the differential data, may determine the correction amount by referring to a lookup table using the differential data, or may determine the correction amount by other methods. The information on the correction amount determined by the correction amount determination unit 223 is transmitted to the left eye image correction unit 224a and the right eye image correction unit 224b.
The correction amount determination unit 223 may obtain the correction amount, for example, from the measurement result of the entire image, or may obtain it by dividing the image into a plurality of blocks and weighting the value of a specific block. When the correction amount is obtained by dividing the image into the plurality of blocks, attention is paid to a background portion, which usually has a small left-right difference, in consideration of the fact that the illumination of the object of interest within the image differs between the left and right sides. The correction amount determination unit 223 determines the correction amount so that the left and right differences of the background area become small, on the assumption that the difference of the background area represents the left and right difference of the entire image. Whether or not an area is the background area is determined by using the luminance dispersion: in the image, an area having a small dispersion, or an area having a dispersion smaller than a threshold value, may be regarded as the background area. The determination of the background area may also use the luminance data of the image.
In the present modified example, the left eye image measurement unit 221a and the right eye image measurement unit 221b divide the image into the plurality of blocks and obtain the luminance dispersion, and the correction amount determination unit 223 may exclude the blocks having a value less than the predetermined threshold value from the calculation of the correction amount and perform the calculation of the correction amount only on the blocks having a value equal to or more than the predetermined threshold value, as shown in
For example, when the block having the luminance dispersion less than 3000 is excluded from the object of the correction amount calculation, in the above Table 1, a fourth block, a fifth block, a seventh block, a twelfth block, a twenty-second block, and a twenty-fifth block become the blocks which are excluded from the object of the correction amount calculation.
Further, similarly to the above-mentioned correction amount determination unit 123, various methods may be adopted for the calculation of the correction amount in the correction amount determination unit 223. For example, the correction amount may be determined so as to uniformly apply a bias to each pixel, or the coefficients of the gamma curve may be controlled in order to obtain a correction amount in response to the luminance of each pixel. Further, for example, when the method referring to a look-up table is used, correction amounts for the luminance are held in the table, and the correction amount for the luminance may be an amount obtained by multiplying an entry of the table by a predetermined gain.
The left eye image correction unit 224a performs the luminance gain control processing on the image for the left eye based on the correction amount determined by the correction amount determination unit 223. Similarly, the right eye image correction unit 224b performs the luminance gain control processing on the image for the right eye based on the correction amount determined by the correction amount determination unit 223. Further, since it may be very difficult to fully match the luminances of the image for the left eye and the image for the right eye, in the present modified example, the luminance gain control processing is performed in the left eye image correction unit 224a and the right eye image correction unit 224b so that the difference between the image for the left eye and the image for the right eye is smaller than the threshold value.
In the modified example of the embodiment of the present disclosure, when comparison of the image for the left eye and the image for the right eye shows a luminance difference between the two images, either one of the image for the left eye and the image for the right eye may be adopted as a reference and the other image may be corrected so as to match the luminance of the reference image, or both of the images may be corrected so as to form an intermediate luminance between the image for the left eye and the image for the right eye.
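Correcting toward one image taken as a reference, or toward the intermediate luminance of both, can be sketched as follows; scaling by the ratio of the average luminances is an assumed form of the luminance gain control and not the specific processing of the modified example.

```python
import numpy as np

def match_luminance(left, right, mode="intermediate"):
    """Bring the average luminances of the two images together by a simple gain."""
    l_avg, r_avg = float(np.mean(left)), float(np.mean(right))
    if mode == "left_reference":            # match the right image to the left image
        return left, right * (l_avg / r_avg)
    if mode == "right_reference":           # match the left image to the right image
        return left * (r_avg / l_avg), right
    target = (l_avg + r_avg) / 2.0          # meet at the intermediate luminance
    return left * (target / l_avg), right * (target / r_avg)
```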
As described above, the configuration of an image signal controller 220 that is a modified example of the image signal controller 120 according to the embodiment of the present disclosure was described. In addition, similarly to the above-mentioned comparator 122, when generating the differential data, the comparator 222 in
Next, the image correction method by an image signal controller 220 that is a modified example of the image signal controller 120 according to the embodiment of the present disclosure will be described.
In the image signal controller 220 according to the present modified example, in order to perform the correction so that the luminance of the image for the right eye matches that of the image for the left eye, the left eye image measurement unit 221a and the right eye image measurement unit 221b first measure the luminance average, the luminance dispersion, and the luminance histogram of the image for the left eye and the image for the right eye, respectively (step S201).
When the left eye image measurement unit 221a and the right eye image measurement unit 221b measure the luminance average, the luminance dispersion, and the luminance histogram of the image for the left eye and the image for the right eye, respectively, the differential data of the measurement values is calculated in the comparator 222 (step S202). The differential data may be obtained by simply calculating the differences between the luminance averages, the luminance dispersions, and the luminance histograms of the image for the left eye and the image for the right eye, or may be obtained by calculating the difference square sum of both of the images.
When the differential data of the measurement values are calculated in the comparator 222, the correction amount for the image for the left eye or the image for the right eye is determined in the correction amount determination unit 223 based on the differential data calculated by the comparator 222 (step S203). Further, as described above, when the correction amount is determined, the correction amount may be obtained from the measurement results of the entire image or may be obtained by dividing the image into the plurality of blocks and applying weighting to the value of a specific block. Further, as described above, when the correction amount is determined in the correction amount determination unit 223, the correction amount may be determined by uniformly applying a bias to each pixel, or the coefficients of the gamma curve may be controlled in order to obtain the correction amount in response to the luminance of each pixel. Further, for example, when the correction amount determination unit 223 uses the method referring to the look-up table, the correction amount for the luminance is held in the table and the correction amount for the luminance may be an amount obtained by multiplying an entry of the table by a predetermined gain.
When the correction amount for the image for the left eye or the image for the right eye is determined in the correction amount determination unit 223, the left eye image correction unit 224a and the right eye image correction unit 224b perform the luminance correction processing on the image for the left eye or the image for the right eye based on the correction amount determined by the correction amount determination unit 223 (step S204). As described above, in the present modified example, when comparison of the image for the left eye and the image for the right eye shows a luminance difference between the two images, either one of the image for the left eye and the image for the right eye may be adopted as a reference and the other image may be corrected so as to match the luminance of the reference image, or both of the images may be corrected so as to form an intermediate luminance between the image for the left eye and the image for the right eye.
As described above, the image correction method by the image signal controller 220 according to the present modified example was described with reference to
As such, even though there is the luminance difference between the image for the left eye and the image for the right eye, both of the images may be corrected to have the same brightness by measuring the luminance average, the luminance dispersion, and the luminance histogram of the image for the left eye and the image for the right eye, calculating the differential data of the measurement result, and obtaining the correction amount of the luminance for the image for the left eye and the image for the right eye based on the differential data.
By correcting the image for the right eye and the image for the left eye as described above, it is not necessary to control and synchronize the cameras when capturing the three-dimensional image, an improvement of the image quality may be expected due to the reduction in the flickering between the left and right images, and the display device may generate an image that is easily viewed stereoscopically owing to the reduced flickering. Further, by dividing the image into the plurality of blocks to calculate the correction amount, the brightness of the object of interest may be maintained in the image when the user performs the stereoscopic viewing.
Further, although the embodiment of the present disclosure and the modified example thereof describe the display device 100 providing the stereoscopic view to the viewer by the shutter glasses 200, the present disclosure is not limited thereto. Similarly, it goes without saying that the present disclosure may also be applied to the display device providing the stereoscopic view to the viewer without using the shutter glasses 200.
The APL measurement unit 321 measures an average value of the input image signals. In the following, the description proceeds on the assumption that the average of the luminance values is calculated. The APL measurement unit 321 corresponds to the left eye image measurement unit 121a and the right eye image measurement unit 121b of the image signal controller 120 in
The luminance average value from the APL measurement unit 321 is supplied to the APL holding unit 323 and the calculator 324. The APL holding unit 323 holds the luminance average value measured from the image signal of the frame one frame earlier than the luminance average value (output from the APL measurement unit 321) that is input to the calculator 324. In other words, the APL holding unit 323 has a function of performing delay processing so as to supply, to the calculator 324, the luminance average value one frame prior to the value supplied by the APL measurement unit 321.
The calculator 324 is supplied with the luminance average value from the APL measurement unit 321 and the luminance average value from the APL holding unit 323. As described above, the APL measurement unit 321 is alternately input with the image signal of the image for the left eye and the image signal of the image for the right eye. Therefore, the luminance average value measured from the image signal of the image for the left eye and the luminance average value measured from the image signal of the image for the right eye are alternately output from the APL measurement unit 321.
Therefore, when the luminance average value of the image for the left eye is output from the APL measurement unit 321, the APL holding unit 323 is in a state in which the luminance average value of the image for the right eye prior to one frame is held. In this case, the calculator 324 is supplied with the luminance average value of the image for the left eye from the APL measurement unit 321 and is supplied with the luminance average value of the image for the right eye from the APL holding unit 323. Further, when the luminance average value of the image for the right eye is output from the APL measurement unit 321, the APL holding unit 323 is in a state in which the luminance average value of the image for the left eye prior to one frame is held. In this case, the calculator 324 is supplied with the luminance average value of the image for the right eye from the APL measurement unit 321 and is supplied with the luminance average value of the image for the left eye from the APL holding unit 323.
As described above, the calculator 324 is supplied with the luminance average value of the image for the left eye and the luminance average value of the image for the right eye. The calculator 324 subtracts the luminance average value of one side from the luminance average value of the other side and outputs the difference value to the gain correction unit 325. In the following, the description proceeds on the assumption that the luminance average value of the image for the left eye is subtracted from the luminance average value of the image for the right eye.
The luminance average value from the APL measurement unit 321 is input to a terminal a of the calculator 324 and the luminance average value from the APL holding unit 323 is input to a terminal b of the calculator 324. In this case, when the luminance average value of the image for the right eye is input to the terminal a, the terminal a becomes positive (+). In this case, since the terminal b is input with the luminance average value of the image for the left eye, the terminal b becomes negative (−). Further, when the luminance average value of the image for the left eye is input to the terminal a, the terminal a becomes negative (−). In this case, since the terminal b is input with the luminance average value of the image for the right eye, the terminal b becomes positive (+). In this manner, by assigning the signs, the luminance average value of the image for the right eye is always treated as positive and that of the image for the left eye as negative, so that the luminance average value of the image for the left eye is subtracted from the luminance average value of the image for the right eye to calculate the difference value.
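The sign handling of the calculator 324 can be illustrated by the sketch below; the boolean argument standing in for the flag described later is an assumption of the sketch.

```python
def frame_difference(current_apl, held_apl, current_is_right):
    """Calculator 324 (sketch): always compute (right-eye APL) - (left-eye APL),
    whichever of the two eyes the current frame belongs to."""
    if current_is_right:
        return current_apl - held_apl    # current frame = right (+), held frame = left (-)
    return held_apl - current_apl        # current frame = left (-), held frame = right (+)
```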
The gain correction unit 325 calculates the value of the corrected gain (correction amount) from the input difference value. In this case, the correction method of the gain of the gain correction unit 325 will be described. The gain correction unit 325 corrects the gain based on, for example, the gain correction curve shown in
A horizontal axis of the gain correction curve shown in
When the difference value becomes the first threshold value or less, the correction amount is increased as a linear function (in this case, increased in a negative direction) and when the difference value exceeds a constant value, the correction amount also becomes a constant value (−lr limit). Similarly, when the difference value becomes the second threshold value or more, the correction amount is increased as a linear function (in this case, increased in a positive direction) and when the difference value exceeds a constant value, the correction amount also becomes a constant value (lr limit).
The reason for keeping the correction amount constant when the difference value exceeds the constant value is to allow for a rapid change in, for example, the luminance value caused by, for example, a scene change. If the luminance value changes rapidly due to a scene change, the difference value also becomes large. In this situation, if the correction amount were allowed to grow with the size of the difference value, a large correction would be applied even though the rapid change of the luminance value is correct, resulting in an incorrect correction. By making the correction amount constant for difference values at or above the constant value, this situation does not occur.
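The gain correction curve described above, zero inside the two thresholds, linear outside them, and clipped at the lr limit, might be written as follows; the threshold, slope, and limit values are placeholders.

```python
def gain_correction(diff, neg_threshold=-2.0, pos_threshold=2.0, slope=0.5, lr_limit=4.0):
    """Correction amount as a function of the difference value: 0 between the thresholds,
    a linear function outside them, and held at +/- lr_limit for large differences
    (for example, a scene change) so that an excessive correction is not applied."""
    if neg_threshold <= diff <= pos_threshold:
        return 0.0
    if diff < neg_threshold:
        return max(slope * (diff - neg_threshold), -lr_limit)
    return min(slope * (diff - pos_threshold), lr_limit)
```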
The gain correction unit 325 (
The correction amount from the gain correction unit 325 is supplied to the filter 327. The filter 327 may be configured as, for example, an infinite impulse response (IIR) filter. The filter 327 is installed to absorb the rapid change. For example, when the correction amount from the gain correction unit 325 is rapidly changed, for example, when the correction amount is changed from the negative correction amount to the positive correction amount, it is considered that the rapid change in the luminance may be caused even in the corrected image. The filter 327 is installed so as not to cause the above-mentioned rapid change and therefore, any filter having the above-mentioned function may be applied as the filter 327.
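A first-order IIR filter of the kind mentioned could be as simple as the following sketch; the smoothing coefficient is an assumed value.

```python
class IIRFilter:
    """First-order IIR (infinite impulse response) low-pass filter that absorbs
    rapid changes of the correction amount from one frame to the next."""
    def __init__(self, alpha=0.25):
        self.alpha = alpha          # smoothing coefficient (assumed value)
        self.state = 0.0

    def step(self, value):
        self.state += self.alpha * (value - self.state)
        return self.state
```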
The amplifier 328 amplifies the output from the filter 327 at a predetermined magnification. For example, the amplifier 328 may amplify the input correction amount at a magnification of ½. Alternatively, the gain correction unit 325 may apply the ½ magnification in advance, and the amplifier 328 may output the correction amount without amplifying it.
The amplifier 328 performs the amplification and, if necessary, inverts the sign of the correction amount. In detail, a positive sign is applied when the image signal of the image for the right eye is input, and a negative sign is applied when the image signal of the image for the left eye is input. Therefore, in this case, the amplifier 328 multiplies by +½ when the image signal of the image for the right eye is input and by −½ when the image signal of the image for the left eye is input.
As described above, the calculator 324 and the amplifier 328 switch the sign according to whether the image signal input to the APL measurement unit 321 is the image for the right eye or the image for the left eye, and process the image signal accordingly. For this reason, a flag indicating whether the image signal input to the APL measurement unit 321 is the image for the right eye or the image for the left eye is input to the calculator 324 and the amplifier 328, and a flag generator 326 generating the flag is provided in the image signal controller 320 shown in
The flag generator 326 is input with, for example, a V synchronization signal. The flag generator 326 determines from the input V synchronization signal whether the image signal is the image for the right eye or the image for the left eye and generates the flag. Further, the flag generator 326 is configured to set the flag when the image signal is the image for the right eye and to clear the flag when the image signal is the image for the left eye. With this configuration, the calculator 324 and the amplifier 328 determine whether the image signal is the image for the right eye by checking whether the flag from the flag generator 326 is set.
Further, although the description here uses a flag to indicate whether the image signal is the image for the right eye or the image for the left eye, a system that conveys this information to the calculator 324 and the amplifier 328 in a form other than a flag may be employed.
The correction amount from the amplifier 328 is supplied to the luminance controller 322. The luminance controller 322 is supplied with the image signal and the correction amount input to the image signal controller 320. The luminance controller 322 performs the correction on the image, that is, the supplied image signal on the basis of the correction amount and outputs the corrected image signal to the image display unit 110. In this case, the image signal of which the luminance value is corrected is output.
In addition, in this case, although the luminance is described by way of example, even when, for example, the color difference other than the luminance is corrected, the correction may be processed by the image signal controller 320 of the configuration shown in
Next, the correspondence relationship between the image signal input to the image signal controller 320 and the output image signal will be described with reference to
At time t1, when the image signal L1 of the image for the left eye is input to the APL measurement unit 321, the APL measurement unit 321 calculates a luminance average value APL-L1. At time t1, the image signal L1 is also input to the luminance controller 322. Further, at time t1, the luminance average value APL-R0 calculated by the APL measurement unit 321 at time t0 is supplied to the APL holding unit 323 and held. At time t1, since no luminance average value is yet input to the calculator 324 from the APL holding unit 323, the luminance controller 322 sets the correction amount to 0 without calculating a correction amount. Therefore, at time t1, the image signal L1 input to the luminance controller 322 is output without change.
At time t2, when the image signal R2 of the image for the right eye is input to the APL measurement unit 321, the APL measurement unit 321 calculates a luminance average value APL-R2. At time t2, the image signal R2 is also input to the luminance controller 322. Further, at time t2, the luminance average value APL-L1 calculated by the APL measurement unit 321 at time t1 is supplied to and held in the APL holding unit 323, and at the same time the luminance average value APL-R0 held at time t1 is supplied to the calculator 324. Further, the calculator 324 is supplied with the luminance average value APL-L1 from the APL measurement unit 321.
At time t2, the calculator 324 subtracts the luminance average value APL-L1 from the luminance average value APL-R0 and outputs the difference value to the gain correction unit 325. At time t2, since there is still no output from the gain correction unit 325, the luminance controller 322 processes the image signal with the correction amount set to 0. Therefore, at time t2, the image signal R2 input to the luminance controller 322 is output without change.
At time t3, when an image signal L3 of the image for the left eye is input to the APL measurement unit 321, the APL measurement unit 321 calculates a luminance average value APL-L3. At time t3, the image signal L3 is also input to the luminance controller 322. Further, at time t3, the luminance average value APL-R2 calculated by the APL measurement unit 321 at time t2 is supplied to and held in the APL holding unit 323, and at the same time the luminance average value APL-L1 held at time t2 is supplied to the calculator 324. Further, the calculator 324 is supplied with the luminance average value APL-R2 from the APL measurement unit 321.
At time t3, the calculator 324 subtracts the luminance average value APL-L1 from the luminance average value APL-R2 and outputs the difference value to the gain correction unit 325. At time t3, the gain correction unit 325 calculates the correction amount from the input difference value and outputs it to the filter 327. The correction amount is processed by the filter 327 and the amplifier 328 in turn and is then supplied to the luminance controller 322. Here, the correction amount output from the amplifier 328 at time t3 is referred to as a correction amount Z1.
At time t3, the luminance controller 322 corrects the input image signal L3 with the correction amount Z1 and outputs the corrected image signal L3 (Z1). Here, the notation L3 (Z1) denotes the image signal L3 corrected with the correction amount Z1. As described above, the correction amount Z1 is a value calculated from the luminance average value APL-R2 and the luminance average value APL-L1. Thus, in the image signal controller 320, each image signal is corrected with a correction amount calculated from the image signals of one frame and two frames earlier.
At time t4, when an image signal R4 of the image for the right eye is input to the APL measurement unit 321, the APL measurement unit 321 calculates a luminance average value APL-R4. At time t4, the luminance average value APL-L3 calculated by the APL measurement unit 321 at time t3 is supplied to and held in the APL holding unit 323, and at the same time the luminance average value APL-R2 held at time t3 is supplied to the calculator 324. Further, the calculator 324 is supplied with the luminance average value APL-L3 from the APL measurement unit 321.
At time t4, the calculator 324 subtracts the luminance average value APL-L3 from the luminance average value APL-R2 and outputs the difference value to the gain correction unit 325. At time t4, a correction amount is output from the gain correction unit 325, is processed by the filter 327 and the amplifier 328 in turn, and is supplied to the luminance controller 322 as a correction amount Z2. At time t4, the luminance controller 322 corrects the input image signal R4 with the correction amount Z2 and outputs the corrected image signal R4 (Z2).
At time t5, when an image signal L5 of the image for the left eye is input to the APL measurement unit 321, the APL measurement unit 321 calculates a luminance average value APL-L5. At time t5, the luminance average value APL-R4 calculated by the APL measurement unit 321 at time t4 is supplied to and held in the APL holding unit 323, and at the same time the luminance average value APL-L3 held at time t4 is supplied to the calculator 324. Further, the calculator 324 is supplied with the luminance average value APL-R4 from the APL measurement unit 321.
At time t5, the calculator 324 subtracts the luminance average value APL-L3 from the luminance average value APL-R4 and outputs the difference value to the gain correction unit 325. At time t5, a correction amount is output from the gain correction unit 325, is processed by the filter 327 and the amplifier 328 in turn, and is supplied to the luminance controller 322 as a correction amount Z3. At time t5, the luminance controller 322 corrects the input image signal L5 with the correction amount Z3 and outputs the corrected image signal L5 (Z3).
The above-mentioned processing is repeated in the image signal controller 320, so that an image signal whose luminance value has been corrected is output. Since the image presented to the user is based on the corrected image signal, flickering, for example, can be prevented.
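Putting the blocks together, a toy end-to-end sketch of the correction loop is shown below (Python, hypothetical names; not the actual device). The gain correction unit 325 and the filter 327 are collapsed into a single scalar gain, the correction is assumed to be additive, and the signs are chosen so that each frame's average luminance is pulled toward that of the opposite eye by half of the measured difference; the actual device may fold the sign and gain factors into its blocks differently, and its gain and filter stages introduce one more frame of latency than this sketch shows.

```python
import numpy as np

class LuminanceBalancer:
    """Toy model of the APL measurement unit 321, APL holding unit 323,
    calculator 324, gain/filter/amplifier chain, and luminance controller
    322 described above (assumed simplifications)."""

    def __init__(self, gain: float = 1.0):
        self.gain = gain
        self.last_apl = {"L": None, "R": None}  # most recently measured APL per eye

    def process(self, frame_y: np.ndarray, eye: str) -> np.ndarray:
        # APL measurement unit 321: average picture level of the current frame.
        apl = float(frame_y.mean())

        other = "L" if eye == "R" else "R"
        correction = 0.0
        if self.last_apl[other] is not None and self.last_apl[eye] is not None:
            # Difference between the two most recently measured frames
            # (e.g. APL-R2 and APL-L1 when frame L3 is being processed),
            # with half of it assigned to the current eye.
            diff = self.last_apl[other] - self.last_apl[eye]
            correction = 0.5 * self.gain * diff

        # APL holding unit 323: keep this frame's APL for use on later frames.
        self.last_apl[eye] = apl

        # Luminance controller 322: apply the correction to the current frame.
        corrected = np.clip(frame_y.astype(np.float32) + correction, 0, 255)
        return corrected.astype(frame_y.dtype)


# Usage sketch: alternating right-eye and left-eye frames (R0, L1, R2, L3, ...).
balancer = LuminanceBalancer()
frames = [np.full((1080, 1920), level, dtype=np.uint8) for level in (120, 110, 120, 110)]
eyes = ["R", "L", "R", "L"]
outputs = [balancer.process(f, e) for f, e in zip(frames, eyes)]
```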
Incidentally, typical methods of providing the three-dimensional image to the user include a frame sequential method, a side-by-side method, and an over-and-under (that is, top-and-bottom) method. When the above-mentioned image signal controller 320 corresponds to the frame sequential method, the configuration shown in
The image signal controller 320′ shown in
The processing after the conversion into the frame sequential method by the frame sequential converter 351 is similar to that of the image signal controller 320 shown in
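As a rough sketch, such a conversion would split each packed 3D frame into separate left-eye and right-eye frames before the APL-based correction is applied; the function name, parameters, and the assumption that the left half (or top half) carries the left-eye image are hypothetical:

```python
import numpy as np

def to_frame_sequential(frame: np.ndarray, layout: str):
    """Sketch of what a frame sequential conversion might do: unpack a
    side-by-side or over-and-under frame into a (left, right) pair so the
    downstream correction can operate on alternating per-eye frames."""
    h, w = frame.shape[:2]
    if layout == "side_by_side":
        left, right = frame[:, : w // 2], frame[:, w // 2 :]
    elif layout == "over_under":
        left, right = frame[: h // 2, :], frame[h // 2 :, :]
    else:
        raise ValueError(f"unknown layout: {layout}")
    return left, right
```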
Incidentally, the human eye is more sensitive on the black side. Since the human eye reacts more sensitively to a change in luminance on the black side than to a change in luminance on the white side, the luminance values on the black side, rather than those on the white side, may be processed intensively.
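The actual weighting coefficients used in the embodiment are not reproduced here; the following is a rough sketch of a black-side-weighted APL measurement under an assumed, simple linear weighting:

```python
import numpy as np

def weighted_apl(frame_y: np.ndarray) -> float:
    """Sketch of an APL measurement that emphasizes the black side.
    Darker pixels receive larger weights (assumed weighting curve), so a
    left/right luminance mismatch in dark regions contributes more to the
    measured average than the same mismatch in bright regions."""
    y = frame_y.astype(np.float32)
    weights = 1.0 - y / 255.0          # 1.0 at black, 0.0 at white (assumed shape)
    weights = 0.25 + 0.75 * weights    # keep a nonzero weight for bright pixels
    return float((y * weights).sum() / weights.sum())
```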
For example, in the image signal controller 320 shown in
In the graph shown in
Further, when the weighting coefficients as shown in
As described above,
Further, although the embodiment describes the case of performing the processing using the weighting coefficients so that the APL measurement unit 321 (image signal controller 320) may intensively process the luminance of the black side, for example, the left eye image measurement unit 121a or the right eye image measurement unit 121b of the image signal controller 120 shown in
Further, in the embodiment of the present disclosure, the correction processing may be performed once, or may be performed multiple times until the difference becomes less than a predetermined threshold value.
Since correcting the image for the right eye and the image for the left eye using the above-mentioned luminance average value of the black side is also a correction that accords with the characteristics of the eye, a correction that prevents, for example, flickering from occurring can be performed.
The series of processes described in the embodiment of the present disclosure may be performed by dedicated hardware or by software. When the series of processes is performed by software, a recording medium recording a computer program may be stored in the display device 100, and the series of processes may be implemented by a CPU or other control device executing the computer program. Alternatively, the recording medium recording the computer program may be stored in a dedicated or general-purpose computer, and the series of processes may be implemented by a CPU or other control device executing the computer program.
Although the exemplary embodiment of the present disclosure has been described above with reference to the accompanying drawings, the present disclosure is not limited thereto. It is apparent that a person skilled in the art to which the present disclosure pertains can make various modifications and alterations without departing from the scope of the appended claims, and it should be understood that such modifications and alterations belong to the scope of the present disclosure.
For example, although the above embodiment may divide the image into a plurality of blocks and determine the correction amount using only the blocks in which the dispersion of the luminance or the color difference is equal to or greater than a predetermined threshold value, the embodiment of the present disclosure is not limited thereto. For example, the image may be divided into a plurality of blocks and the correction amount may be determined using the central blocks (in the above embodiment, for example, the seventh to ninth blocks, the twelfth to fourteenth blocks, and the seventeenth to nineteenth blocks) in which the left/right disparity is small. Further, after the blocks used to determine the correction amount are limited to the central blocks, the correction amount may be determined using only the blocks in which the dispersion of the luminance or the color difference is equal to or greater than the predetermined threshold value.
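As a rough sketch of such a block selection (the grid size and threshold are hypothetical, with a 5-by-5 division assumed to be consistent with the block numbering mentioned above):

```python
import numpy as np

def blocks_to_correct(frame_y: np.ndarray, grid=(5, 5), var_threshold=100.0):
    """Sketch: divide the frame into a grid of blocks and keep only those
    whose luminance dispersion (variance) is at or above the threshold.
    The correction amount would then be determined from these blocks only;
    the set could further be intersected with the central blocks where the
    left/right disparity is small."""
    rows, cols = grid
    h, w = frame_y.shape[:2]
    bh, bw = h // rows, w // cols
    selected = []
    for r in range(rows):
        for c in range(cols):
            block = frame_y[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            if float(block.var()) >= var_threshold:
                selected.append((r, c))
    return selected
```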
Further, for example, when the analysis of the image for the left eye and the image for the right eye shows that characters are included in the image, the correction amount may be determined in the correction amount determination units 123 and 223 so that the luminance or the color difference of the portion corresponding to the characters is matched. Further, for example, the correction amount may be determined in the correction amount determination units 123 and 223 according to an analysis of the image for the left eye and the image for the right eye and of the contents included in the image. For example, when the image includes a relatively high proportion of scenery, the correction amount may be determined in the correction amount determination units 123 and 223 so that the luminance or the color difference of the portion corresponding to the scenery is matched. Further, when the image includes relatively many people, the correction amount may be determined in the correction amount determination units 123 and 223 so that the luminance or the color difference of the portion corresponding to the people is matched.
Further, for example, when the analysis of the image for the left eye and the image for the right eye shows that the image is computer graphics, the correction amount determination units 123 and 223 may omit the calculation of the correction amount, for example, by deliberately not performing the correction.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-124997 filed in the Japan Patent Office on May 31, 2010, the entire contents of which are hereby incorporated by reference.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.