1. Field of the Invention
The present invention relates to an image processing technique and, more particularly, to image processing when a display device displays a moving image.
2. Description of the Related Art
Moving image display devices typified by TV sets can be classified into hold-type display devices and impulse-type display devices. A hold-type display device continues displaying a single image throughout one frame interval (1/60 sec when the frame rate is 60 Hz). Liquid crystal display devices and organic EL displays using TFTs are known hold-type display devices. On the other hand, an impulse-type display device displays an image only during the scanning interval within one frame interval, so the pixel luminances start falling immediately after scanning. A CRT (Cathode Ray Tube) and an FED (Field-Emission-type Display) are known impulse-type display devices.
A hold-type display device has a known problem in that a viewer readily perceives blurring of a moving object displayed on the screen (motion blur). To cope with this blurring, the hold-type display device raises the driving frequency of its display to shorten the hold time. For example, Japanese Patent Laid-Open No. 2006-184896 discloses a technique (to be referred to as driving distributing hereinafter) which generates two sub frames from one input frame, that is, a sub frame without a high frequency component and a sub frame containing an emphasized high frequency component, and alternately displays the two sub frames generated for each frame.
On the other hand, an impulse-type display device is more advantageous than a hold-type display device in moving image visibility. However, since the device emits light only instantaneously in each frame interval (1/60 sec when the frame rate is 60 Hz) and repeats light emission at a period of 1/60 sec, a problem of flickering may arise. Flickering is more noticeable on a larger screen, and therefore tends to become a serious problem given the recent trend toward display devices with larger screens. As a measure against flickering, the impulse-type display device adopts a technique of raising the driving frequency of its display.
However, the present inventor found through experiments that when the frame rate is raised by driving distributing, the luminance given by the sum of the waveforms of the distributed sub frames does not always match the luminance perceived through the integration effect of the human eye. More specifically, it was found that a uniform-luminance portion of a frame image sometimes looked as if its brightness changed upon driving distributing.
The present invention provides a higher-quality display image for a viewer when a display device displays a moving image.
According to one aspect of the present invention, an image processing apparatus comprises: an input unit configured to input image data including m frame images per unit time; a filtering unit configured to generate a high-frequency component emphasized frame image and a low-frequency component frame image from each frame image included in the input image data; a correction unit configured to correct a luminance of the low-frequency component frame image corresponding to each frame image at a predetermined ratio so that the image data is perceived at the same brightness as each of the frame images output as m frames per unit time; and an output unit configured to alternately output the high-frequency component emphasized frame image generated by the filtering unit and the low-frequency component frame image whose luminance has been corrected by the correction unit, as image data including 2m frame images per unit time.
According to another aspect of the present invention, a method of controlling an image processing apparatus comprises the steps of: inputting image data including m frame images per unit time; generating a high-frequency component emphasized frame image and a low-frequency component frame image from each frame image included in the input image data; correcting a luminance of the low-frequency component frame image corresponding to each frame image at a predetermined ratio so that the image data is perceived at the same brightness as each of the frame images output as m frames per unit time; and alternately outputting the high-frequency component emphasized frame image generated in the generating step and the low-frequency component frame image whose luminance has been corrected in the correcting step, as image data including 2m frame images per unit time.
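The flow from m input frames to 2m output sub frames can be pictured with a short sketch. The Python code below is only an illustrative outline and not the patented circuit: the box low-pass filter, the "frame + (frame − low)" emphasis, the 1.04 correction coefficient (the example value used later in the first embodiment), and all function names are assumptions made here for clarity.

```python
import numpy as np
from scipy.ndimage import uniform_filter  # simple box filter used as a stand-in low-pass filter


def split_into_subframes(frame, kernel_size=5, correction=1.04):
    """Split one input frame into a high-frequency-emphasized sub frame and a
    luminance-corrected low-frequency sub frame (illustrative sketch only)."""
    frame = frame.astype(np.float64)
    low = uniform_filter(frame, size=kernel_size)   # low-frequency component frame image
    high_emphasized = frame + (frame - low)         # emphasize the high-frequency component;
                                                    # the average of the two sub frames equals the
                                                    # original frame before correction
    low_corrected = low * correction                # correct the low-frequency sub frame's luminance
    return np.clip(high_emphasized, 0, 255), np.clip(low_corrected, 0, 255)


def double_frame_rate(frames_60hz):
    """Alternately output the two sub frames: m input frames become 2m output frames."""
    output = []
    for frame in frames_60hz:
        high_sub, low_sub = split_into_subframes(frame)
        output.append(high_sub)  # first sub frame
        output.append(low_sub)   # second sub frame
    return output
```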
According to the present invention, it is possible to provide a higher-quality display image for a viewer when a display device displays a moving image.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings. Note that the following embodiments are not intended to limit the scope of the invention, but are merely examples.
As the first embodiment of an image processing apparatus according to the present invention, an image processing apparatus 100 which outputs an image to a panel module 109 serving as a display device will be described. In the example explained below, two sub frames (sub frame images) are generated from each of a plurality of frame images contained in moving image data of 60 frames per sec (60 Hz), and a moving image of 120 frames per sec (120 Hz) is output. The present invention is also applicable to any other input frame rate or output frame rate. Note that in the following description, “frame frequency” indicates the number of frames displayed per sec in progressive scanning, or the number of fields displayed per sec in interlaced scanning.
<Technical Premise>
The display characteristics of the hold-type display device and impulse-type display device described above in “BACKGROUND” will be described in more detail.
Hold-Type Display Device
As shown on the left view of
Impulse-Type Display Device
As shown on the left view of
<Arrangement of Apparatus>
<Operation of Apparatus>
Evaluation Experiments
The present inventor conducted evaluation experiments using the circuit arrangement shown in
Note that in the image processing apparatus 100, the minimum value filter 102 is configured so that the same value as that of the pixel of interest is input to its entire input region (for example, a 5×5 pixel region); its output therefore equals the pixel of interest. The softening filter 103 is configured to use “1” as the coefficient for the pixel of interest and “0” as the coefficient for the other pixels, so its output also equals the input. The distribution ratio circuit 104 is configured to set the first sub frame to 100% and the second sub frame to 0% for the 60-Hz display patch, and to set the first sub frame to 50% and the second sub frame to 50% for the 120-Hz display patch. The luminance correction circuit 107 is configured not to perform luminance correction.
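With these settings both filters reduce to pass-throughs, so the experiment isolates the effect of the display frequency and the distribution ratio alone. The following is a minimal sketch of that configuration; the pixel value of 200 and the function names are arbitrary choices made here for illustration and do not reproduce the circuits in the patent figures.

```python
def minimum_value_filter_passthrough(pixel_of_interest):
    # Every tap of the 5x5 input region is fed the pixel of interest,
    # so the minimum over the region is the pixel of interest itself.
    return pixel_of_interest


def softening_filter_passthrough(pixel_of_interest):
    # Coefficient 1 for the pixel of interest and 0 for all other pixels:
    # the output equals the input.
    return pixel_of_interest


def distribute(value, ratio_first, ratio_second):
    # Distribution ratio circuit: split one frame value between two sub frames.
    return value * ratio_first, value * ratio_second


value = softening_filter_passthrough(minimum_value_filter_passthrough(200))
print(distribute(value, 1.0, 0.0))  # 60-Hz patch:  (200.0, 0.0)   - all light in the first sub frame
print(distribute(value, 0.5, 0.5))  # 120-Hz patch: (100.0, 100.0) - light split across both sub frames
```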
Referring to
Driving Distributing without Luminance Correction
That is, when the first sub frame (waveform 401) and the second sub frame (waveform 402) are alternately displayed, they are expected to be perceived as the waveform 403. Actually, however, the central portion looks dark, as indicated by the waveform 404. This is because the measured luminance (physical quantity) and the sensory luminance (psychological quantity) differ depending on the display frequency, as shown in
This will be explained in more detail with reference to
As described with reference to
Driving Distributing with Luminance Correction
Assume that the luminance correction circuit 107 performs luminance correction (sensory luminance correction) to compensate for this perceived decrease in luminance. An example will be described here in which the luminance correction circuit 107 performs +4% luminance correction (the luminance correction coefficient is 1.04), multiplying the luminance of the sub frame corresponding to the “second sub frame” by 1.04.
Note that the luminance correction circuit 107 makes the luminance of the waveform 602 slightly higher (+4%) than that of the waveform 402 indicated by the dotted line. The luminance obtained as a measured luminance (physical quantity) by combining the waveforms 401 and 602 is higher at the central portion, as indicated by the waveform 603. However, the waveform 604, represented as a sensory luminance (psychological quantity), looks slightly dark at the central portion due to the influence of the above-described luminance change. For this reason, the luminance correction and the influence on the sensory luminance cancel each other out, so that a waveform having uniform brightness like the original frame image can be obtained.
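As a concrete illustration of the +4% correction (correction coefficient 1.04) applied to the second sub frame, the snippet below multiplies a uniform patch by the coefficient. This is a sketch only; the clipping to the 8-bit range and the patch value are assumptions added here for the example.

```python
import numpy as np


def correct_second_subframe(low_subframe, coefficient=1.04):
    # Sensory-luminance correction of the second (low-frequency) sub frame.
    return np.clip(low_subframe.astype(np.float64) * coefficient, 0, 255)


second_sub = np.full((4, 4), 128.0)               # a uniform mid-gray patch
print(correct_second_subframe(second_sub)[0, 0])  # approximately 133.12: about 4% brighter than 128
```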
As described above, according to the first embodiment, it is possible to compensate for the decrease in image luminance caused by driving distributing while improving the display quality of a moving image on the display unit by driving distributing. This makes it possible to display a higher-quality moving image for the user.
Note that the above-described change in the sensory luminance depending on the display frequency can occur in both the hold-type display device and the impulse-type display device. Hence, the above-described image processing apparatus can obtain the same effect for both the hold-type display device and the impulse-type display device.
Note that although correction of a “luminance” has simply been described above, the processing may be performed on the luminance (Y) component of an image expressed by YCbCr components, or on the pixel value of each of the RGB colors (the luminance value of each color) of an RGB image.
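Either representation can carry the correction. The minimal sketch below assumes a channel-last array with Y stored in channel 0 of the YCbCr image and 8-bit value ranges; neither function reproduces any circuit described in the embodiments.

```python
import numpy as np


def correct_luminance_ycbcr(ycbcr_image, coefficient):
    # Correct only the Y (luminance) plane; Cb and Cr are left untouched.
    corrected = ycbcr_image.astype(np.float64)
    corrected[..., 0] = np.clip(corrected[..., 0] * coefficient, 0, 255)
    return corrected


def correct_luminance_rgb(rgb_image, coefficient):
    # Correct the pixel value (per-color luminance) of each of R, G, and B.
    return np.clip(rgb_image.astype(np.float64) * coefficient, 0, 255)
```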
A luminance correction circuit 2101 (second correction circuit in claims) performs luminance correction for the output from a subtraction processing circuit 106. Assume that the luminance correction circuit 2101 performs luminance correction (sensory luminance correction) to compensate for the perceived change in luminance. An example will be described here in which the luminance correction circuit 2101 performs −4% luminance correction (the luminance correction coefficient is 0.96), multiplying the luminance of the sub frame corresponding to the “first sub frame” by 0.96.
Note that the luminance correction circuit 2101 makes the luminance of the waveform 2201 slightly lower (−4%) than that of the waveform 401 indicated by the dotted line. The luminance obtained as a measured luminance (physical quantity) by combining the waveforms 2201 and 402 is higher at the central portion, as indicated by the waveform 2203. However, the sensory luminance (psychological quantity) looks slightly dark at the central portion due to the influence of the above-described luminance change. For this reason, the luminance correction and the influence on the sensory luminance cancel each other out, so that the waveform 2204 having uniform brightness like the original frame image can be obtained.
As described above, according to the second embodiment, it is possible to compensate for the decrease in image luminance caused by driving distributing while improving the display quality of a moving image on the display unit by driving distributing. This makes it possible to display a higher-quality moving image for the user.
(Modification)
Note that the above-described first and second embodiments may be combined. More specifically, two luminance correction circuits may be provided to perform luminance correction for both the first sub frame and the second sub frame. For example, assume that a 60-Hz display image darkened by 4% appears to have the same brightness as a 120-Hz display image. In this case, the luminance correction coefficient for the first sub frame is set to 0.98, and that for the second sub frame is set to 1.02.
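The split coefficients can be checked with a quick calculation: 1.02 / 0.98 is about 1.04, so the second sub frame is still boosted roughly 4% relative to the first, while the average of the two coefficients stays at 1.00. The sketch below is an illustration under those assumptions, using the example coefficient values given above rather than any values prescribed by the claims.

```python
import numpy as np


def correct_both_subframes(first_sub, second_sub, k_first=0.98, k_second=1.02):
    # Apply part of the correction to each sub frame: the ratio 1.02 / 0.98
    # (about 1.04) preserves the roughly 4% relative boost of the second sub
    # frame while keeping the average output level essentially unchanged.
    first = np.clip(first_sub.astype(np.float64) * k_first, 0, 255)
    second = np.clip(second_sub.astype(np.float64) * k_second, 0, 255)
    return first, second
```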
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method whose steps are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2009-243783, filed on Oct. 22, 2009, which is hereby incorporated by reference herein in its entirety.
Foreign Application Priority Data

| Number | Date | Country | Kind |
|---|---|---|---|
| 2009-243783 | Oct. 22, 2009 | JP | National |
Foreign Patent Documents

| Number | Date | Country |
|---|---|---|
| 2006-184896 | Jul. 2006 | JP |