1. Field of the Invention
The present invention relates to an image interpolation technology that generates a nonexistent video frame between two video frames by interpolation.
2. Description of the Related Art
A liquid crystal display (LCD), which is one type of display used in thin televisions, employs "hold display," which keeps holding the image of a previous video frame until the image data of the next video frame arrives, unlike the "impulse display" of devices such as a Cathode Ray Tube (CRT) or a plasma display, which flashes an image momentarily.
The hold display causes a phenomenon called "motion judder" on objects moving in the video, owing to a mismatch between the motion interpolation that results from the pursuit movement of the eyeball and the hold display, whose image position does not change. In addition, in One-Seg broadcasting, which has started in recent years, unnatural motion may be prominent because its frame rate (about 15 fps) is lower than that of current analog broadcasting.
In order to solve this problem, it is effective to generate an intermediate frame between video frames and show a motion-interpolated video.
In broadcasting, the video to be interpolated does not always occupy the entire screen. For example, non-picture areas (black bar areas/wallpaper areas) for adjusting the picture size may be provided, typically as letterbox bars and side panels.
In a case where the frame interpolation process is performed at a position close to a display device, not only video signals but also other display data, including, for example, data on a text area and a pictogram display area, may be processed as the input video. In this case, only the picture part may be enlarged by a user operation to fill the entire screen, and although there is a certain rule of screen design, the borders of the areas do not always remain constant over time.
Japanese Laid-open Patent Publication No. 6-276510 discusses a method, as a technology for controlling the motion estimation in a non-picture area, which limits the direction of search on the basis of the current coordinates data in order to prevent the motion estimation at a side area of the screen from referring to the area outside the screen.
Japanese Laid-open Patent Publication No. 2005-287049 discusses a method, as a technology for generating an interpolation picture in a non-picture area, which inhibits generation of new image data when a reference image determined on the basis of a motion vector exists at an unacceptable image position.
However, the aforesaid conventional image interpolation methods have problems as follows.
The interpolation method of Japanese Laid-open Patent Publication No. 6-276510 may not properly limit the direction of motion estimation unless the image area data that designates a screen size is given at an appropriate time. Nor does it discuss how switching of screen sizes is detected.
The interpolation method of Japanese Laid-open Patent Publication No. 2005-287049 assumes that the borders of the picture area are known and do not change. Thus, the method may not handle borders of a picture area that change over time.
Accordingly, it is an object of the present invention to provide a stable video by preventing deterioration of quality due to improper interpolation in a case where the video to which image interpolation is applied contains a non-picture area.
According to an aspect of the present invention, provided is an image interpolation apparatus for generating an interpolated video frame on the basis of a preceding video frame and a following video frame. The image interpolation apparatus includes a designation data storage, a first calculator, a second calculator, a determiner, a frame generator, and a controller. The designation data storage stores area data designating an image area within a screen area. The first calculator calculates a first feature value for the screen area. The second calculator calculates a second feature value for the image area in accordance with the area data stored in the designation data storage. The determiner determines whether the first feature value matches the second feature value. The frame generator generates the interpolated video frame including interpolated data generated on the basis of frame data of the preceding video frame and frame data of the following video frame. The controller controls the frame generator to generate the interpolated video frame including interpolated data for the image area and non-interpolated data for an area other than the image area within the screen area when a determination result by the determiner indicates a mismatch. The non-interpolated data may be the frame data of the preceding video frame or the frame data of the following video frame.
The first feature value may preferably be a value of an average motion vector.
The frame generator may include a first generator for generating the interpolated data for the image area, and a second generator for generating interpolated data for the area other than the image area within the screen area. In such a configuration, the controller may control the second generator to perform generation of the interpolated data when the determination result by the determiner indicates a match, and stop generation of the interpolated data when the determination result by the determiner indicates a mismatch.
With reference to the drawings, embodiments of the present invention will be discussed in detail below.
The designation data storage 101 stores area data for designating an image area at the screen center and outputs the area data to the feature value calculator 102 and the interpolated image generator 106. The motion estimator 105 performs block matching using blocks (rectangular areas) of a predefined size between two input video frames to obtain a motion vector in accordance with the matching result. The obtained motion vector is output to the feature value calculators 102 and 103 and the interpolated image generator 106. The feature value calculator 103 calculates a feature value for the entire screen, and the feature value calculator 102 calculates a feature value only within the designated area, in accordance with the given area data.
The interpolation determiner 104 determines whether the two feature values calculated by the feature value calculators 102 and 103 match, in order to determine whether interpolation at a side area of the screen outside the image area is valid. If the two feature values match, the interpolation is determined to be valid, and a control signal indicative of interpolation-ON is output to the interpolated image generator 106. If, on the other hand, the two feature values do not match, the interpolation is determined to be invalid, and a control signal indicative of interpolation-OFF is output to the interpolated image generator 106.
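The match test performed by the interpolation determiner can be sketched as follows. The vector representation and the tolerance threshold are assumptions for illustration, since the embodiment only specifies a match/mismatch decision:

```python
def decide_interpolation(mv_ave_all, mv_ave_sub, tol=0.5):
    """Return True (interpolation-ON) when the whole-screen average motion
    vector matches the designated-area average motion vector.

    mv_ave_all, mv_ave_sub: (vx, vy) vectors in pixels per frame.
    `tol` is an assumed tolerance; any match criterion would do here.
    """
    dx = mv_ave_all[0] - mv_ave_sub[0]
    dy = mv_ave_all[1] - mv_ave_sub[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol

# Horizontal scroll of 4 px/frame inside the picture area, but still side
# panels drag the whole-screen average down to 3 px/frame: mismatch -> OFF.
print(decide_interpolation((3.0, 0.0), (4.0, 0.0)))  # prints False
```

With identical feature values the function returns True, corresponding to the interpolation-ON control signal.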
The interpolated image generator 106 includes a center area interpolator 111 and a side area interpolator 112. The center area interpolator 111 generates an interpolated image within the image area on the basis of a motion vector obtained by the motion estimator 105. The side area interpolator 112 generates an interpolated image at a side area of the screen on the basis of a motion vector in accordance with the control signal output from the interpolation determiner 104. The side area interpolator 112 performs the interpolation process on the side area of the screen on the basis of the motion vector in response to an input of the control signal indicative of interpolation-ON, and stops the interpolation process on the side area of the screen in response to an input of the control signal indicative of interpolation-OFF.
Thus, an interpolated frame with interpolation on the entire screen is output in a case of interpolation-ON, and an interpolated frame with interpolation on the designated image area only is output in a case of interpolation-OFF.
Such an image interpolation process allows generation of an interpolated video that adapts to any change over time of the non-picture area at a side of the screen. Thus, deterioration of quality due to improper interpolation in a non-picture area may be prevented, yielding a stably interpolated video.
Next, operations by the image interpolation apparatus will be discussed in a case where images having side panels as shown in
The motion estimator 105 can obtain motion vectors only in an area (the searchable area) within the screen of an input video frame where motion estimation is available. For an area (the unsearchable area) on the fringe of the screen, where motion estimation is unavailable, the motion vector is obtained by spatial compensation employing the motion vector of an adjacent area.
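One simple form of the spatial compensation described above is to let each fringe block copy the vector of an adjacent block whose vector is already known. The exact fill rule is an assumption; the embodiment only requires that fringe vectors be derived from adjacent areas:

```python
def fill_fringe(mv):
    """mv: 2-D list (M rows x N cols) of motion vectors, with None at
    fringe blocks where block matching was unavailable.  Each None is
    replaced by the vector of an adjacent block that already has one
    (a minimal sketch of spatial compensation)."""
    M, N = len(mv), len(mv[0])
    for i in range(M):
        for j in range(N):
            if mv[i][j] is None:
                # Look right, left, down, up for a known neighbour.
                for di, dj in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < M and 0 <= nj < N and mv[ni][nj] is not None:
                        mv[i][j] = mv[ni][nj]
                        break
    return mv
```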
The feature value calculator 103 calculates a feature value, in a searchable area within the screen area shown in
The interpolation determiner 104 compares the two average speeds of horizontal scrolling. In this case, since the average speed of horizontal scrolling obtained by the feature value calculator 103 includes the speed in the black bar areas (that is, still areas) of the side panels, the two average speeds of horizontal scrolling do not match even for a horizontally scrolling video. From the comparison result, the image is determined to include side panels, and the control signal indicative of interpolation-OFF is output to the interpolated image generator 106.
One well-known method for motion estimation is the block matching method, which obtains a motion vector by calculating the similarity between blocks of a predefined size. Each of the grid squares in the searchable area indicates a unit block in the block matching method. The searchable area is divided into N×M blocks. BL(i,j) shown in
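A minimal sketch of such a block matching search follows, using the sum of absolute differences (SAD) as the similarity measure. The block size, search range, and SAD criterion are illustrative assumptions; the embodiment does not fix them:

```python
def block_match(prev, curr, bi, bj, bs=8, search=4):
    """Exhaustive block matching: find the displacement (dx, dy) that
    minimises the SAD between block (bi, bj) of `curr` and the shifted
    block in `prev`.  Frames are 2-D lists of luma values."""
    y0, x0 = bi * bs, bj * bs
    best, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = 0
            for y in range(bs):
                for x in range(bs):
                    py, px = y0 + y + dy, x0 + x + dx
                    if not (0 <= py < len(prev) and 0 <= px < len(prev[0])):
                        sad = None  # candidate falls outside the frame
                        break
                    sad += abs(curr[y0 + y][x0 + x] - prev[py][px])
                if sad is None:
                    break
            if sad is not None and (best is None or sad < best):
                best, best_mv = sad, (dx, dy)
    return best_mv
```

Candidates that would read outside the frame are skipped, which is why the fringe of the screen forms the unsearchable area discussed above.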
The feature value calculator 103 calculates a feature value within a feature value calculation area 605 including all blocks BL(i,j) (where i=0 to M-1, and j=0 to N-1). On the other hand, the feature value calculator 102 calculates a feature value within a feature value calculation area 606 including blocks BL(i,j) (where i=0 to M-1, and j=1 to N-2) between the borders 603 and 604.
Assume that mv(i,j) denotes a motion vector at the block BL(i,j). The average motion vector within a screen can be obtained by calculating an average value mvave(j) of M mv(i,j) in the vertical direction for each column and calculating an average value of the obtained multiple mvave(j). The feature value calculator 103 calculates an average motion vector mvaveall below, and the feature value calculator 102 calculates an average motion vector mvavesub below.
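The two-step averaging just described (a mean per column, then a mean over the column means) can be sketched as follows. The grid layout and the (vx, vy) vector representation are assumptions for illustration:

```python
def average_motion_vector(mv, j_lo, j_hi):
    """mv: 2-D list (M rows x N cols) of (vx, vy) block motion vectors.
    Averages the per-column means over columns j_lo..j_hi-1, so that
    j_lo=0, j_hi=N gives the whole-screen value (calculator 103) and
    j_lo=1, j_hi=N-1 the designated-area value (calculator 102)."""
    M = len(mv)
    cols = []
    for j in range(j_lo, j_hi):
        vx = sum(mv[i][j][0] for i in range(M)) / M  # column mean
        vy = sum(mv[i][j][1] for i in range(M)) / M
        cols.append((vx, vy))
    n = len(cols)
    return (sum(c[0] for c in cols) / n, sum(c[1] for c in cols) / n)

# Side-panel example: outer columns are still, inner columns scroll 4 px.
mv = [[(0, 0), (4, 0), (4, 0), (0, 0)],
      [(0, 0), (4, 0), (4, 0), (0, 0)]]
print(average_motion_vector(mv, 0, 4))  # whole screen: (2.0, 0.0)
print(average_motion_vector(mv, 1, 3))  # inner area:   (4.0, 0.0)
```

The letterbox case works the same way with the roles of rows and columns exchanged.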
The interpolation determiner 104 compares mvaveall and mvavesub. In a case of an image having side panels, mvaveall does not match mvavesub. Therefore, the outside areas of the borders 603 and 604 are determined to be non-picture areas, and the control signal indicative of interpolation-OFF is output.
The feature value calculator 103 calculates a feature value, in a searchable area within the screen area shown in
The interpolation determiner 104 compares the two average speeds of vertical scrolling. In this case, since the average speed of vertical scrolling obtained by the feature value calculator 103 includes the speed in the black bar areas (that is, still areas) of the letterbox bars, the two average speeds of vertical scrolling do not match even for a vertically scrolling video. From the comparison result, the image is determined to include letterbox bars, and the control signal indicative of interpolation-OFF is output to the interpolated image generator 106.
Assume that mv(i,j) denotes a motion vector at the block BL(i,j). The average motion vector within a screen can be obtained by calculating an average value mvave(i) of N mv(i,j) in the horizontal direction for each row and calculating the average value of the obtained multiple mvave(i). The feature value calculator 103 calculates an average motion vector mvaveall below, and the feature value calculator 102 calculates an average motion vector mvavesub below.
The interpolation determiner 104 compares mvaveall and mvavesub. In a case of an image having letterbox bars, mvaveall does not match mvavesub. Therefore, the upper end area and lower end area are determined to be non-picture areas, and the control signal indicative of interpolation-OFF is output.
Although interpolation-ON/OFF is controlled by using the two feature value calculators on one kind of image area in the embodiment discussed above, three or more feature value calculators may be used for control in a case where multiple kinds of image area are designated in advance.
In this case, one feature value calculator may calculate a feature value of the entire screen, and the other feature value calculators may calculate feature values of the respective image areas in accordance with the area data. Then, by comparing the obtained feature value of the entire screen with the obtained feature values of the image areas, the kind of image area and the validity of the interpolation on side areas of the screen are determined.
The delay 1602 delays successively input video frames 1611 and 1612 by a predefined period of time and outputs them. The image interpolation apparatus 1601 generates an interpolated frame 1613 from a video frame 1612 at a current time and a video frame 1611 at a preceding time output from the delay 1602. The switch 1603 alternately selects and outputs a video frame output from the delay 1602 and an interpolated frame output from the image interpolation apparatus 1601. In this manner, the video frame 1611, interpolated frame 1613 and video frame 1612 are output in order from the frame rate converter.
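The delay-and-switch arrangement above can be modelled by the following sketch, in which `interpolate` stands in for the image interpolation apparatus 1601 and the frames are toy scalar values; the function names are assumptions:

```python
def convert_frame_rate(frames, interpolate):
    """Double the frame rate by inserting one interpolated frame between
    every pair of consecutive input frames.  The delay and the alternating
    switch of the frame rate converter are modelled by the loop order."""
    out = []
    for prev, curr in zip(frames, frames[1:]):
        out.append(prev)                      # delayed original frame
        out.append(interpolate(prev, curr))   # interpolated middle frame
    out.append(frames[-1])
    return out

# Toy example: frames are scalars and interpolation is the midpoint.
print(convert_frame_rate([0, 10, 20], lambda a, b: (a + b) / 2))
# [0, 5.0, 10, 15.0, 20]
```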
The memory 1902 includes a ROM (read only memory) and a RAM (random access memory) and stores a program and data to be used for processing. The CPU 1901 uses the memory 1902 to execute a program for performing an image interpolation process and a frame rate conversion process.
In this case, an input video frame is stored in the memory 1902 as data to be processed, and a searched motion vector thereof is stored in the memory 1902 as data of the processing result. The designation data storage 101 corresponds to the memory 1902, and the feature value calculators 102 and 103, interpolation determiner 104, motion estimator 105, and interpolated image generator 106 correspond to the CPU 1901 executing respective processing in accordance with programs stored in the memory 1902.
The input device 1903 includes a keyboard or a pointing device, for example, and is used for inputting an instruction or data by an operator. The output device 1904 includes a display device, a printer, or a loudspeaker, for example, and is used for outputting an inquiry or a processing result to an operator.
The external storage device 1905 includes a magnetic disk device, an optical disk device, a magneto-optical disk device or a tape device, for example. The information processing apparatus may store a program and data in the external storage device 1905 in advance and load them to the memory 1902 for use as required.
The medium drive device 1906 drives a portable recording medium 1909 and accesses recorded contents. The portable recording medium 1909 is an arbitrary computer-readable recording medium including a memory card, a flexible disk, an optical disk and a magneto-optical disk. An operator may store a program and data in the portable recording medium 1909 in advance and load them to the memory 1902 for use as required.
The network connection device 1907 connects to a communication network such as a LAN (local area network) and performs data conversion involved in communication. The information processing apparatus receives a program and data from an external device via the network connection device 1907 and loads them to the memory 1902 for use as required.
As discussed above, even in a case where a non-picture area in a video changes over time, the validity of interpolation can be determined in a manner that adapts to the change. Therefore, deterioration of quality due to improper interpolation in a non-picture area may be prevented, yielding a stably interpolated video.
Also, even in a case where the frame interpolation process is performed at a position adjacent to a display device, external timing for changing the image area data is not required.
Number | Date | Country | Kind
---|---|---|---
2007-317531 | Dec 2007 | JP | national