The present invention is generally directed to image processing and, more particularly, to correcting rotation of video frames.
A user of a video camera may be moving, have unsteady hands or hold the camera at one or more different angles while shooting a video. As a result, a recorded video may include one or more frames of the same scene that are rotated at different angles with respect to a reference (e.g., horizontal or vertical). The rotated frames may make the recorded video difficult to view or edit.
A method and apparatus for correcting a rotation of a video frame are described. According to a method, an amount of the rotation of the video frame with respect to a reference is determined. The rotation of the video frame is corrected based at least in part on the determined amount of rotation of the video frame.
A more detailed understanding may be had from the following description, given by way of example in conjunction with the accompanying drawings wherein:
Video editing software may enable a user to manually correct undesired rotation of video frames (e.g., frame by frame). Such manual correction may be tedious, particularly if many frames are rotated at various angles. Accordingly, embodiments described below may provide for automatic rotation correction of video frames. Additionally, embodiments provide for scene change detection, which may enable a rotation correction function to distinguish between rotation of a frame and a change of scene, preventing frames from being misidentified as rotated when the scene changes.
The system 100 may also include a user interface 114 via which the user input 108 may be entered for consideration by the change integrator unit 110. The user interface 114 may be, for example, a graphical user interface, a manual user interface (including, for example, one or more buttons, switches, etc.) or a combination thereof. The user input 108 may include, for example, a number of degrees to rotate each frame in addition to any detected rotation. The number of degrees may be a custom number (e.g., 22 degrees, 37 degrees, etc.) or may be selected from a number of predetermined options (e.g., rotate each frame an additional 90 degrees, 180 degrees, 270 degrees, etc.) to compensate, for example, for a video captured in a landscape orientation. User input may also include enabling and disabling rotation correction functionality.
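A user-selected quarter-turn option (90, 180 or 270 degrees) can be applied to a frame without resampling. A minimal sketch in Python/NumPy, where `apply_quarter_turns` is a hypothetical helper and not part of the described system:

```python
import numpy as np

def apply_quarter_turns(frame, degrees):
    """Rotate a frame by a user-selected multiple of 90 degrees
    (e.g., to compensate for a video captured in landscape
    orientation). `frame` is a 2-D array of pixel values;
    arbitrary-angle rotation would require resampling and is
    not shown here."""
    if degrees % 90 != 0:
        raise ValueError("only multiples of 90 degrees supported here")
    # np.rot90 rotates counter-clockwise in 90-degree steps.
    return np.rot90(frame, k=(degrees // 90) % 4)
```

Custom angles (e.g., 22 or 37 degrees) would instead need an interpolating rotation such as an affine warp.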
The rotation detection unit 104 may be configured to obtain the video input 102. By way of non-limiting example, the rotation detection unit 104 may receive or retrieve a video frame of a video signal from a video production, capture, reproduction and/or storage device such as a video camera, video camera phone, DVD player, PC or storage unit or apparatus. An example video signal 200 is illustrated in
Referring back to
The scene change detection unit 106 may also obtain the video input 102. The scene change detection unit 106 may analyze the video input 102 and determine whether a scene has changed at a particular frame. The scene change detection unit 106 may use any scene change detection technique known in the art. By way of example, the scene change detection unit 106 may detect a scene change by performing a frame-to-frame comparison (e.g., using a histogram or edge detection approach). By way of another example, the scene change detection unit 106 may detect a scene change by analyzing a compressed video signal (e.g., analyzing motion vectors of a Moving Picture Experts Group (MPEG) signal).
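The histogram-based frame-to-frame comparison mentioned above can be sketched as follows. This is an illustrative implementation, not the one in the specification; the bin count and the decision threshold are assumed tuning parameters:

```python
import numpy as np

def is_scene_change(prev_frame, curr_frame, bins=32, threshold=0.5):
    """Flag a scene change when the grayscale histograms of two
    consecutive frames differ by more than `threshold`."""
    h_prev, _ = np.histogram(prev_frame, bins=bins, range=(0, 256))
    h_curr, _ = np.histogram(curr_frame, bins=bins, range=(0, 256))
    # Normalize so the distance does not depend on frame size.
    h_prev = h_prev / h_prev.sum()
    h_curr = h_curr / h_curr.sum()
    # L1 distance between normalized histograms lies in [0, 2].
    return np.abs(h_prev - h_curr).sum() > threshold
```

A histogram comparison is insensitive to rotation of the same scene (the pixel-value distribution barely changes), which is exactly why it can separate a scene cut from a rotated frame.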
The change integrator unit 110 may receive, for each frame, the amount of rotation from the rotation detection unit 104, any indication that the scene has changed at that frame, and any user input 108, and may determine from the received information whether and by how much to correct the rotation of the frame. If the change integrator unit 110 determines that correction of the rotation of a frame is necessary, the change integrator unit 110 may correct its rotation by the determined amount (e.g., corrective action) and provide the video output 112. In an embodiment, the change integrator unit 110 may compare the received amount of rotation of the frame with a threshold (e.g., 5 degrees) and determine not to correct the frame if the received amount of rotation is less than the threshold amount.
If the rotation function is enabled (step 300), a video frame may be obtained (step 302). With respect to the signal 200 of
If a video frame is obtained (step 302), steps 304, 306 and 308 may occur. Steps 304, 306 and 308 are illustrated in
An amount of rotation of the obtained video frame may be detected (step 304). As described with respect to
With respect to the example signal 200 of
In an embodiment, more than one region of the currently obtained frame may be compared with more than one region of the previously obtained frame (e.g., regions A2 and C2 of frame 204 may be compared with regions A1 and C1 of frame 202, respectively). Here, the rotation detection unit 104 of
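The multi-region embodiment above can be sketched by comparing the angular position of each matched region about the frame center and averaging the per-region estimates. `detect_rotation` is a hypothetical helper; locating the corresponding regions in consecutive frames (e.g., matching A2 to A1) is assumed to be done elsewhere:

```python
import math

def detect_rotation(prev_centers, curr_centers, frame_center):
    """Estimate frame rotation (in degrees) from the center points of
    matched regions in consecutive frames. Each region's angle about
    the frame center is compared and the per-region estimates are
    averaged, as in the multi-region embodiment."""
    cx, cy = frame_center
    estimates = []
    for (px, py), (qx, qy) in zip(prev_centers, curr_centers):
        a_prev = math.atan2(py - cy, px - cx)
        a_curr = math.atan2(qy - cy, qx - cx)
        diff = math.degrees(a_curr - a_prev)
        # Wrap the difference into (-180, 180] degrees.
        diff = (diff + 180.0) % 360.0 - 180.0
        estimates.append(diff)
    return sum(estimates) / len(estimates)
```

Averaging several regions makes the estimate more robust than relying on a single region, since an object that moved independently of the camera would otherwise skew the result.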
A scene change may be detected (step 306). With respect to the signal 200 of
Any user input that has been entered (e.g., the user input 108 of
An amount to correct the rotation of the obtained video frame (if any) may be determined (step 310). The amount of correction may equal, for example, the sum of the amount of rotation detected in step 304 and any additional amount of rotation correction entered as user input 108 and obtained in step 308. If a scene change is detected in step 306, the amount of correction applied to the obtained video frame may not include the amount of rotation detected in step 304. In an embodiment, if a scene change is detected in step 306, the amount of correction may equal the additional amount of correction entered as user input and obtained in step 308 (if any). This may prevent a frame in which a scene change occurs from being over- or under-rotated due to the rotation detection unit 104 identifying, for example, a region of the frame 210 that is similar to regions A1, A2 and A3 of frames 202, 204 and 206 (e.g., region B1) and erroneously detecting rotation of the frame based on the rotation of the identified region. If no amount of rotation is detected in step 304 and no additional amount of rotation correction has been entered, it may be determined that no correction should be made to the frame.
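The step 310 computation can be sketched as a small pure function. The threshold value and the assumption that detected rotation and user input share a sign convention are illustrative, not taken from the specification:

```python
def correction_for_frame(detected, user_extra, scene_changed, threshold=5.0):
    """Combine the rotation detected in step 304 with any additional
    user-entered rotation from step 308. On a scene change (step 306)
    the detected amount is discarded, so only the user-entered amount
    (if any) is applied; detections below the threshold are likewise
    ignored."""
    if scene_changed or abs(detected) < threshold:
        detected = 0.0
    return detected + user_extra
```

A return value of zero corresponds to the final case above, in which no correction is made to the frame.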
The rotation of the currently obtained frame may be corrected by the determined amount (if any) (step 312) and step 300 may be repeated. The example method may be repeated until the rotation function is disabled or no more video frames are able to be obtained (e.g., playback of a video has finished).
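The overall flow of steps 300 through 312 can be sketched as a loop over frames. Here `detect`, `scene_changed` and `rotate` are hypothetical stand-ins for the rotation detection unit 104, the scene change detection unit 106 and the corrective rotation, passed in as callables so the sketch is self-contained:

```python
def correct_video(frames, detect, scene_changed, rotate,
                  user_extra=0.0, threshold=5.0):
    """Sketch of steps 300-312: obtain each frame, detect its rotation
    against the previous frame, check for a scene change, fold in any
    user-entered rotation, and emit a corrected frame."""
    corrected = []
    prev = None
    for frame in frames:                                   # step 302
        detected = detect(prev, frame) if prev is not None else 0.0  # step 304
        if prev is not None and scene_changed(prev, frame):          # step 306
            detected = 0.0   # discard detection across a scene cut
        if abs(detected) < threshold:
            detected = 0.0
        amount = detected + user_extra                     # step 310
        corrected.append(rotate(frame, amount) if amount else frame)  # step 312
        prev = frame
    return corrected
```

In the described system the loop would also re-check whether the rotation function is still enabled (step 300) on every iteration; that check is omitted here for brevity.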
Examples of corrected frames are illustrated in
Although features and elements are described above in particular combinations, each feature or element can be used alone without the other features and elements or in various combinations with or without other features and elements. The methods or flow charts provided herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable storage medium for execution by a general purpose computer or a processor.
Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of processors, one or more processors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
Embodiments of the present invention may be represented as instructions and data stored in a computer-readable storage medium. For example, aspects of the present invention may be implemented using Verilog, which is a hardware description language (HDL). When processed, Verilog data instructions may generate other intermediary data (e.g., netlists, GDS data, or the like) that may be used to perform a manufacturing process implemented in a semiconductor fabrication facility. The manufacturing process may be adapted to manufacture and test semiconductor devices (e.g., processors) that embody various aspects of the present invention.
Published as US 20130136379 A1, May 2013.